How Do We Police Computer Algorithms That Collude?

The rules governing humans who collude to fix prices and reduce competition are well-established law. But what if two computer programs tasked with setting prices end up colluding based on their strategy choices?

Two law professors, Ariel Ezrachi of Oxford and Maurice E. Stucke of the University of Tennessee, have been taking a hard look at just what happens when computers are involved in pricing for goods and services (like, say, at Amazon or Uber). They specifically examined the potential for collusion, and the results are pretty shocking and pretty clearly illegal.

Computers can’t have secret back-room conversations to fix prices, but they can be programmed to predict how competing computers will behave. By anticipating other computer-based pricing decisions and responding according to solid mathematical principles, they can effectively cooperate with one another in advancing their own profit-maximizing interests. In human terms, we call this colluding.
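To make that mechanism concrete, here is a minimal, hypothetical sketch in Python. It is not the professors' model; the demand curve, cost, and price grid are made up for illustration. Each bot simply assumes its rival will match whatever price it sets, and under that shared assumption each one independently lands on the same high, joint-profit-maximizing price:

# A minimal, hypothetical sketch (not the authors' model): two independent
# pricing bots that each assume the rival will match their price next period.
# Under that shared assumption, each bot's "best" price is the joint
# profit-maximizing (i.e., collusive) price -- with no communication at all.

COST = 2.0                                            # per-unit cost, assumed identical
PRICES = [round(2 + 0.1 * i, 1) for i in range(81)]   # candidate prices 2.0 .. 10.0

def demand(p):
    """Toy linear market demand, split evenly when both sellers charge p."""
    return max(0.0, 100 - 8 * p)

def best_price_if_rival_matches():
    """Pick the price that maximizes my profit, assuming the rival matches it."""
    def my_profit(p):
        return (p - COST) * demand(p) / 2             # half the market at price p
    return max(PRICES, key=my_profit)

bot_a = best_price_if_rival_matches()
bot_b = best_price_if_rival_matches()
print(bot_a, bot_b)   # both bots independently choose the same supra-competitive price

Running it prints the same price for both bots, well above cost, even though neither ever exchanged a word with the other.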

“Computers may limit competition not only through agreement or concerted practice, but also through more subtle means. For example, this may be the case when similar computer algorithms promote a stable market environment in which they predict each other’s reaction and dominant strategy. Such a digitalized environment may be more predictable and controllable. Furthermore, it does not suffer from behavioral biases and is less susceptible to possible deterrent effects generated through antitrust enforcement,” the authors write.

The research exposes a huge problem: what happens when the law hasn’t caught up to the technology?

While there has been one prosecution for this type of collusion, the law is still way behind.

More disturbingly, it isn’t clear if the law can ever catch up.

In easy cases, a computer is just a tool used to help humans collude, which is theoretically easy to prosecute.

But what if the computer learns to collude on its own? What if two competing machines using the same theory end up fixing prices when that wasn’t specifically the intention? Can a machine even be prosecuted?

In this type of collusion, which the authors call Autonomous Machine Collusion, “the computer executes whichever strategy it deems optimal, based on learning and ongoing feedback collected from the market. Issues of liability, as we will discuss, raise challenging legal and ethical issues.”
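What that might look like, in a deliberately toy form: below is a hypothetical Python sketch of two pricing bots that learn by trial and error from market feedback alone. The demand numbers, price menu, and learning rule are invented for illustration, and whether such learners actually settle on high prices depends heavily on the details. The point is that no line of the code expresses an agreement; any coordination that emerged would be a product of the learning itself, which is precisely the liability puzzle.

# A hypothetical sketch of the "autonomous" scenario: each bot learns prices by
# trial and error from market feedback alone. Nothing here encodes an agreement,
# which is exactly what makes liability hard to assign if high prices emerge.

import random

COST = 2.0
PRICES = [3.0, 5.0, 7.0]          # a small, made-up menu of candidate prices

def market_share(my_price, rival_price):
    """Toy demand split: the cheaper seller takes most of the market."""
    if my_price < rival_price:
        return 0.7
    if my_price > rival_price:
        return 0.3
    return 0.5

class LearningPricer:
    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon                      # how often to experiment
        self.avg_profit = {p: 0.0 for p in PRICES}  # running profit estimate per price
        self.plays = {p: 0 for p in PRICES}

    def choose(self):
        if random.random() < self.epsilon:          # occasionally explore
            return random.choice(PRICES)
        return max(PRICES, key=lambda p: self.avg_profit[p])

    def observe(self, price, profit):
        """Fold in the profit the market fed back for the chosen price."""
        self.plays[price] += 1
        n = self.plays[price]
        self.avg_profit[price] += (profit - self.avg_profit[price]) / n

bot_a, bot_b = LearningPricer(), LearningPricer()
for _ in range(10_000):
    pa, pb = bot_a.choose(), bot_b.choose()
    units = 100 - 8 * min(pa, pb)                   # total units sold, toy model
    bot_a.observe(pa, (pa - COST) * units * market_share(pa, pb))
    bot_b.observe(pb, (pb - COST) * units * market_share(pb, pa))

print(bot_a.avg_profit, bot_b.avg_profit)           # what each bot "learned" about each price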

With the net effect being bad for consumers, there is an urgent need to fix the regulations and to have legal scholars with technical backgrounds examine the issues in detail. But it also takes politicians who understand technology, and on that front the future does not look bright. Many of our elected representatives don’t even use email, so it seems unlikely they will be able to construct the laws needed to deal with this predatory behavior.
