
Where angels fear to trade: algorithms and antitrust

March 2019  |  SPECIAL REPORT: MANAGING RISK

Financier Worldwide Magazine



Algorithms are at the core of many of today’s largest and fastest-growing sectors. Defined by the Organisation for Economic Co-operation and Development (OECD) as a “sequence of rules that should be performed in an exact order to carry out a certain task”, algorithms enable businesses to take millions of decisions in a matter of seconds by applying predetermined rules, such as which search results to show, which products to advertise, whether to buy or sell particular financial instruments, how to price insurance cover and even which candidates to hire.
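The OECD definition can be made concrete with a toy example: a fixed sequence of rules applied automatically to reach a decision. The sketch below is a hypothetical buy/sell rule invented for illustration; the thresholds and function name are not drawn from any real trading system.

```python
# A toy illustration of the OECD definition: a sequence of predetermined
# rules performed in an exact order to reach a decision. The 5 percent
# thresholds are invented for the example.

def trade_decision(price: float, moving_average: float) -> str:
    """Apply fixed rules, in order, to reach a buy/sell/hold decision."""
    if price < 0.95 * moving_average:
        return "buy"    # price well below trend
    if price > 1.05 * moving_average:
        return "sell"   # price well above trend
    return "hold"       # price within the band: do nothing

print(trade_decision(90.0, 100.0))   # prints "buy"
```

Applied at machine speed across thousands of instruments, a rule set no more complicated than this can generate millions of decisions per second.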

Yet the speed and prevalence of algorithmic decision making has led to concerns about transparency, the possibility of in-built biases and the amplification of errors. A ‘bad’ decision becomes much more dangerous if algorithms are programmed to repeat it on a large scale. Algorithms characterised by these and other shortcomings have been described as ‘weapons of math destruction’.

Antitrust agencies are starting to ask what might happen if algorithms are programmed to take anti-competitive, rather than pro-competitive, decisions. What if firms agree to fix prices by agreeing how their pricing algorithms will work? What if competitors decide to outsource their decisions to the same third-party algorithm? Will algorithmic decision making result in alignment between rivals even if their algorithms are designed independently? And how might these issues affect financial trading?

Using algorithms to cooperate

In 2013, the European Commission imposed fines of over €1.3bn on banks that were found to have participated in a cartel to manipulate euro interest rate derivatives. Traders from different banks discussed intended EURIBOR submissions and trading or pricing strategies in chat rooms and via instant messengers. The Commission found that the banks “colluded instead of competing with each other on the euro derivatives market”. Similar investigations are ongoing in relation to other financial markets, including foreign currency exchanges.

The way trading decisions are taken is changing rapidly. In 2017, the Financial Times reported that demand for data scientists was outpacing demand for fundamental or even quantitative analysts. It described the “growing view that the future of active investing will not be human fund managers judging a company by the firmness of the chief executive’s handshake but computer scientists tinkering with trading algorithms”. And in a 2018 report, it showed that high-frequency trading has consistently accounted for 50 to 60 percent of trading in US equities since 2008. The use of algorithms to replace human decisions is not unique to financial markets, however. For example, a number of recent studies suggest that computers may be better than doctors at diagnosing medical conditions from skin cancer to Alzheimer’s disease.

Coordination among data scientists and algorithm designers from rival institutions can raise competition concerns just as much as coordination among traditional traders. This is borne out by a recent investigation by the UK Competition and Markets Authority (CMA) of two sellers of licensed sports and entertainment merchandise, such as posters and frames, on Amazon. One seller – GB Eye – supplied merchandise to another seller, Trod, while also competing with Trod for sales direct to consumers. Trod complained to GB Eye that its low prices risked putting Trod out of business, which in the long term would also hurt GB Eye. Following a series of exchanges, the companies agreed not to undercut each other on price.

So far, so traditional, as far as price-fixing cartels go. What made the case unusual was the parties’ implementation of their agreement through automated re-pricing software to monitor and adjust their respective prices, thereby ensuring that neither party undercut the other. Employees of Trod and GB Eye stayed in contact to ensure that the pricing arrangement was working and to address any questions about the operation of the re-pricing algorithm. The CMA fined Trod £163,371 – around 1 percent of its total revenues – under a settlement agreement and Trod’s managing director was disqualified from managing any UK company for five years. GB Eye was exempt from sanction only because it blew the whistle on the arrangement.
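The mechanics of such re-pricing software can be sketched in a few lines. The following is a hypothetical illustration of a ‘do not undercut’ rule, not the parties’ actual code; the function name, parameters and prices are invented for the example.

```python
# Hypothetical sketch of the kind of re-pricing rule at issue in the CMA
# case: each seller's software watches the rival's listed price and adjusts
# its own so that neither undercuts the other. All values are illustrative.

def reprice(own_price: float, rival_price: float, floor: float) -> float:
    """Return a new price that never undercuts the rival (the collusive rule)."""
    if own_price < rival_price:
        # Raise own price to the rival's level rather than competing on price.
        return max(rival_price, floor)
    return max(own_price, floor)

print(reprice(9.50, 10.00, 8.00))   # seller raises its price to 10.0
```

Run on a loop against live listings, a rule like this enforces the agreement automatically: the moment one party’s price falls below the other’s, the software restores the coordinated level without any human intervention.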

Applied to the trading context, agreements to implement a common set of algorithms to coordinate on, say, when to buy or sell particular financial instruments or to try to influence market indices will likely be treated in the same way as any other agreement to coordinate on strategy. In other words, translating an agreement into the language of technical algorithms will not shield it from competition law.

Outsourcing algorithmic decisions

A less straightforward application of competition rules arises when firms engage third parties to build, design or consult on their trading algorithms, knowing that those third parties provide the same services to rivals. Suppose a consultant helps two clients implement the exact same algorithm, which results in them taking identical decisions and thereby coordinating their behaviour. Is that an antitrust concern?

Industry groups have already started addressing the regulatory risks involved in using third parties to develop trading algorithms. In December 2018, the Futures Industry Association published ‘Guidance for Firms Working with Third-Party Algorithmic Trading System Providers’. This guidance noted that where a firm outsourced its algorithmic trading environment, it had a responsibility “to include oversight of third-party provided algorithm trading systems”.

As a matter of principle, competition law does not object to rivals using the same suppliers – in many industries it is unavoidable. Concerns arise, though, where common suppliers or customers act as a focal point for exchanging competitively sensitive information between rivals or helping them coordinate prices, including when third parties act in a consultative capacity. This is known as a ‘hub and spoke cartel’ and third-party facilitators can themselves be held responsible. In 2015, the European Commission fined broking firm ICAP for facilitating a cartel in yen interest rate derivatives by serving as a communications channel between members of the cartel and disseminating misleading information about anticipated LIBOR rates to other banks.

At what point does using a third party to help design algorithms cross over from legitimate outsourcing to a hub and spoke cartel? This question was considered in a 2017 case concerning the designation of Uber as a transport service: if Uber drivers are, in principle, independent operators, is it lawful for them to defer their pricing decisions to the same algorithm? Advocate General Maciej Szpunar – an adviser to the Court of Justice – opined that “classifying Uber as a platform which groups together independent service providers may raise questions from the standpoint of competition law… the use by competitors of the same algorithm to calculate the price is not in itself unlawful, but might give rise to hub and spoke conspiracy concerns when the power of the platform increases”.

This formulation raises more questions than answers. More practical guidance emerges from the E-Turas case in 2016, which concerned 30 travel agencies in Lithuania that used the E-Turas platform to sell holiday bookings. The platform informed these agencies by email that its software would limit the maximum discounts offered on its site to 3 percent in order to ‘normalise competition’. The Lithuanian competition agency treated this as a hub and spoke price-fixing cartel. The Court of Justice confirmed that the platform’s message to travel agencies, combined with the technical restrictions it placed on discounts, could give rise to a legal presumption of cartel conduct. To avoid liability, it then fell to each travel agency to prove that it had either not received the platform’s message or had not read it.
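The technical restriction in E-Turas was simple: whatever discount an agency selected, the platform clamped it to the ceiling. A minimal sketch of such a cap, with invented names and structure, might look as follows.

```python
# Illustrative sketch of a platform-imposed discount cap of the kind applied
# in E-Turas: regardless of the discount an agency requests, the booking
# system silently clamps it to 3 percent. Names and values are hypothetical.

MAX_DISCOUNT = 0.03  # the 3 percent ceiling imposed platform-wide

def apply_discount(list_price: float, requested_discount: float) -> float:
    """Apply the agency's requested discount, capped at the platform maximum."""
    effective = min(requested_discount, MAX_DISCOUNT)
    return round(list_price * (1 - effective), 2)

print(apply_discount(1000.0, 0.10))  # agency asks for 10%, gets only 3%: 970.0
```

The significance of the case is that a single line of platform code, combined with a circular email, was enough to ground a presumption that every agency using the system was party to a cartel.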

Alignment of independent algorithms

What if two trading parties unilaterally design their algorithms to align with each other’s behaviour, without any direct or indirect contact between them? This is not merely a theoretical question. ‘Copy-trading’ and ‘mirror trading’ strategies exist as ways of imitating the strategies of successful traders. For non-experts investing small sums, these strategies offer a way to benefit from traders with greater expertise. Applied to institutional investors, though, interdependent algorithms could cause severe market swings.
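The mechanics of copy-trading are straightforward to sketch. The example below is a hypothetical follower account that replicates each trade a leader makes, scaled to its own capital; the account structure and scaling rule are invented for illustration.

```python
# Hypothetical sketch of copy-trading: a follower account automatically
# replicates each trade a leader account makes, scaled to the follower's
# capital. The trade representation and scaling rule are illustrative.

def mirror_trade(leader_trade: dict, follower_capital: float,
                 leader_capital: float) -> dict:
    """Scale the leader's trade to the follower's capital and copy it."""
    scale = follower_capital / leader_capital
    return {"instrument": leader_trade["instrument"],
            "side": leader_trade["side"],
            "quantity": leader_trade["quantity"] * scale}

trade = {"instrument": "XYZ", "side": "buy", "quantity": 1000}
print(mirror_trade(trade, 5_000.0, 100_000.0))  # follower buys 50 units
```

The concern at institutional scale follows directly from the design: when many accounts mirror the same leaders, their trades become mechanically correlated, and a large move by one participant is amplified across all of them at once.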

In 2015, the US Markets Editor of the Financial Times argued “it is conceivable that, unforeseen, two rival algos somehow undermine or amplify each other, ramping prices up or down at dizzying speeds. They could also lead to an invisible herding effect, with disastrous consequences should one or several investors have to liquidate their positions”. And in the words of The Washington Post’s business and economics columnist in 2018, much trading is “done automatically by computers, according to complex algorithms that focus on changes in market prices or indexes caused by the trading done by other computers. In this kind of robots vs. robots trading with its circular logic, fundamentals are irrelevant, the volumes are enormous and the holding periods are often a matter of minutes, or even seconds”.

Whatever its potentially harmful effects, competition law does not prohibit so-called ‘tacit coordination’, whereby firms in an oligopolistic, price-transparent market react unilaterally to each other’s decisions. This follows from a general principle that firms should be free to adapt themselves sensibly to rivals’ conduct.

Take the example of a town with two petrol stations. The owner of one station climbs her ladder and increases the price on her sign by £0.25 per litre. Seeing this, her rival does the same. This continues several times, producing price increases that are unrelated to the normal equilibrium of supply and demand.
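The petrol-station dynamic can be simulated in a few lines. The sketch below is purely illustrative: each station follows a unilateral ‘never price below the rival’ rule, with no communication between them, yet a single increase ratchets both prices upward in lockstep.

```python
# A minimal simulation of the petrol-station example: each station
# unilaterally matches any increase it observes at its rival. No agreement
# exists, yet prices move in lockstep. All values are illustrative.

def match_rival(own: int, rival: int) -> int:
    """Unilateral rule: never price below the rival's observed price."""
    return max(own, rival)

a, b = 120, 120        # starting price in pence per litre
a += 25                # station A raises the price on her sign
for _ in range(3):     # each station repeatedly reacts to the other's sign
    b = match_rival(b, a)
    a = match_rival(a, b)

print(a, b)  # both end at 145 pence, with no communication at all
```

The simulation makes the legal point concrete: the outcome looks like coordination, but every individual step is an independent, unilateral reaction to public information.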

Maureen Ohlhausen, former acting chairman of the US Federal Trade Commission (FTC), argues convincingly that competition law does not prohibit this type of ‘conscious parallelism’ because there is simply no sensible remedy: “We cannot tell firms to ignore the public behaviour of their rivals when they set prices without deleting the ‘free’ in free market. Enjoining this kind of behaviour would inevitably lead to price regulation, which is completely inimical to the underlying purposes of the antitrust laws”. This is true regardless of whether the price-setting process involves algorithms or ladders and signs.

Put differently, if it is lawful for an employee to behave in a certain way, it should be lawful for a computer to do the same. But, just as with petrol stations, parallel algorithmic pricing or trading decisions can raise suspicions of collusion. Evidence that decisions are genuinely independent – for example, evidence that competition compliance is built into the design of an algorithm – can be invaluable when dealing with allegations of cartel behaviour.

Practical measures for compliance

Faced with a complex commercial environment and legal uncertainties, what can companies do to address antitrust risk stemming from their use of algorithms?

Three measures stand out. First, the design of pricing, trading or other algorithms should not be discussed with rivals. Just as human pricing or trading decisions should be taken independently, so too should the choice of algorithms and the decisions taken by those algorithms. Second, firms should not agree to be bound – along with their rivals – by the algorithmic decision making of a third party. Third, firms should ensure that any employees, consultants or outside providers working on algorithms are bound by strict confidentiality rules and firewalls to avoid hub and spoke concerns. This could include the right to audit outside parties’ systems to ensure that there is no cross-contamination.

Competition authorities around the world see algorithms as one of the next big challenges and they are already bringing enforcement cases. Companies relying on algorithms for pricing or trading decisions must ensure that the operations of their computers – not just the actions of their employees – comply with competition rules.

 

Paul Gilbert is counsel and Alexander Waksman is an associate at Cleary Gottlieb Steen & Hamilton LLP. Mr Gilbert can be contacted on +44 (0)20 7614 2335 or by email: pgilbert@cgsh.com. Mr Waksman can be contacted on +44 (0)20 7614 2333 or by email: awaksman@cgsh.com.

© Financier Worldwide

