Key Points
- The increasing use of algorithms to optimize pricing strategies has drawn the attention of competition authorities on both sides of the Atlantic, who fear the technology can facilitate price fixing and collusion.
- The DOJ, joined by eight states, recently filed its first civil enforcement action against an algorithm provider for allegedly facilitating price alignment and monopolization. Private plaintiffs are also bringing civil antitrust claims.
- As courts begin to delineate the boundaries of lawful algorithmic pricing, companies can reduce the risks of using these tools by, among other things, retaining final decision-making power over prices and exercising caution about any communications with competitors.
A range of businesses are increasingly turning to pricing algorithms to gain a competitive edge and increase revenue. At the same time, competition regulators are sharpening their focus on algorithmic pricing, intent on spotting anticompetitive or unfair practices driven or facilitated by these tools. Kamala Harris’ August 2024 economic plan spotlighted algorithmic pricing among its targets, and the Department of Justice (DOJ), joined by eight states, recently filed its first civil enforcement action alleging that an algorithm provider unlawfully facilitated information sharing and price alignment and engaged in monopolization.
Meanwhile, private plaintiffs are bringing civil antitrust claims against companies that employ algorithms in pricing, though with mixed success. The upshot of the government and private moves together is an evolving and uncertain legal landscape. Here is a primer on the issues from a board perspective.
The Indispensable Pricing Algorithm
In simplest terms, pricing algorithms are computer programs that assist in setting prices. They analyze data and can be programmed to provide pricing recommendations or even automatically adjust prices. By and large, they rely on the same types of data points that businesses have traditionally used to make pricing decisions, including historical data, current indicators of supply and demand in the market, and sometimes competitors’ prices, but are capable of considering a broader set of inputs.
And unlike humans or rudimentary spreadsheets, pricing algorithms can access vast amounts of information and process it in real time to suggest optimal prices, often through the use of artificial intelligence or machine learning techniques. That enables companies to adjust prices dynamically, based on a more accurate, real-time understanding of market conditions and competitors’ prices.
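To make the concept concrete, the short Python sketch below shows, in highly simplified form, how a pricing tool might turn the kinds of inputs described above (the company’s own historical prices, a demand signal, supply levels and publicly observable competitor prices) into a single recommendation. It is a hypothetical illustration only, not the method of any actual provider; the data fields, weights and the recommend_price function are assumptions chosen for clarity.

```python
from dataclasses import dataclass
from statistics import mean
from typing import List


@dataclass
class MarketSnapshot:
    # Illustrative inputs a pricing tool might consider (all hypothetical).
    own_recent_prices: List[float]         # the company's own historical prices
    demand_index: float                    # 1.0 = typical demand, >1.0 = elevated
    public_competitor_prices: List[float]  # publicly observable competitor prices
    inventory_ratio: float                 # remaining supply vs. typical level (1.0 = typical)


def recommend_price(snapshot: MarketSnapshot) -> float:
    """Return a suggested price based on internal history and public signals.

    A deliberately simple heuristic: anchor on the company's own recent
    average price, nudge it with demand and supply signals, and cap the
    result against publicly listed competitor prices.
    """
    anchor = mean(snapshot.own_recent_prices)

    # Raise the suggestion when demand is high or inventory is scarce,
    # lower it in the opposite case.
    adjustment = (
        1.0
        + 0.10 * (snapshot.demand_index - 1.0)
        - 0.05 * (snapshot.inventory_ratio - 1.0)
    )
    suggestion = anchor * adjustment

    # Sanity check against the public market: do not suggest a price far
    # above what competitors are openly advertising.
    if snapshot.public_competitor_prices:
        ceiling = 1.15 * mean(snapshot.public_competitor_prices)
        suggestion = min(suggestion, ceiling)

    return round(suggestion, 2)


if __name__ == "__main__":
    snapshot = MarketSnapshot(
        own_recent_prices=[100.0, 102.0, 98.0],
        demand_index=1.2,
        public_competitor_prices=[101.0, 103.0],
        inventory_ratio=0.9,
    )
    # The output is a recommendation only; a human retains final pricing authority.
    print(recommend_price(snapshot))
```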
The Regulatory Response and Risks
Government regulators have steadily increased their scrutiny of pricing algorithms. Most recently, in a July 2024 joint statement, the DOJ, Federal Trade Commission (FTC), U.K. Competition and Markets Authority and the European Commission promised to “be vigilant” of “the risk that algorithms can allow competitors to share competitively sensitive information, fix prices, or collude on other terms or business strategies in violation of our competition laws.”
The following month, the DOJ filed a civil enforcement action against an algorithm provider, alleging that the defendant facilitates the sharing of nonpublic, sensitive data and alignment of prices for multifamily rental housing. The DOJ’s complaint deems this provider “an algorithmic intermediary that collects, combines, and exploits landlords’ competitively sensitive information” and thereby “enriches itself and landlords at the expense of renters.”
“If anything, the use of A.I. or algorithmic-based technologies should concern us more because it’s much easier to price fix when you’re outsourcing it to an algorithm versus when you’re sharing manila envelopes in a smoke-filled room.” — Jonathan Kanter, Assistant Attorney General.
In the months before this lawsuit was filed, the DOJ and FTC had explained how, in their view, the risk of algorithmic “price fixing” can arise. Specifically, in a series of court filings in private suits, the agencies argued that it is “price fixing” for competitors to “jointly” delegate key aspects of their pricing to a common pricing algorithm provided by a third party. In the government’s view, that potentially amounts to a hub-and-spoke price-fixing conspiracy, with the algorithm provider serving as the hub and the competing algorithm users as the spokes. That would constitute a violation of Section 1 of the Sherman Act, which in some circumstances can be prosecuted criminally. In the agencies’ view, “price fixing” could occur even if:
- Each competitor retained authority to deviate from the pricing algorithm’s recommendations.
- The competitors adopted the common pricing algorithm at different times over an extended span.
- None of the competitors communicated directly with one another about adopting or using the algorithm.
It is enough, the agencies argued, that the competitors acted “jointly” by, for example, each relying on the same algorithm to make pricing decisions with the knowledge that their competitors would do the same.
Courts are not required to accept the DOJ and FTC’s arguments — and the courts that have considered them so far have not — but the agencies’ statements reflect the arguments DOJ is making in its own enforcement action and likely preview the approach the agencies will take going forward.
North Carolina, California, Colorado, Connecticut, Minnesota, Oregon, Tennessee and Washington joined the DOJ’s suit. In addition to these eight states, attorneys general in Arizona and the District of Columbia have opened their own investigations of pricing algorithms and filed civil actions alleging collusion in the multifamily rental housing market.
Private Actions and the Evolving Judicial Landscape
There has been a wave of civil antitrust lawsuits by private plaintiffs against algorithm providers and their customers. For example, in October 2022, the first putative class action complaint was filed alleging a conspiracy among landlords to inflate the prices of multifamily rental housing via the concurrent use of one software company’s pricing algorithms. That complaint was then consolidated with over 40 follow-on lawsuits. Plaintiffs have filed similar class action lawsuits concerning pricing algorithms used for Las Vegas casino hotels, Atlantic City casino hotels, luxury hotels and major health insurers.
Comparing rulings in two of these cases provides insight into where federal courts have begun to draw the line. In one case, plaintiffs alleged hotels conspired to adopt pricing suggestions provided by an algorithm for rooms on the Las Vegas Strip. The court dismissed the case, reasoning that the plaintiffs had not alleged that the hotels were required to accept the pricing recommendations, nor that the competing hotels had pooled their confidential information in the dataset the algorithm used to make its recommendations. The court also found the plaintiffs’ generic allegations of “machine learning” insufficient. (Plaintiffs are appealing the dismissal.)1
In the other case, by contrast, a federal court in Tennessee refused to dismiss a complaint alleging that multifamily rental housing managers conspired to adopt pricing suggestions provided by a pricing algorithm. The court reasoned that, unlike in the Las Vegas hotels case, plaintiffs alleged the algorithm’s recommendations are accepted 80% to 90% of the time and that the algorithm draws on a “melting pot” of confidential competitor information provided by its users and produces recommendations based on that information. (Of course, those allegations may not be borne out as the case proceeds.) In a similar case involving multifamily rental housing and a different pricing algorithm, a state court in California recently reached similar conclusions and declined to dismiss claims.
The Potential Cost of a Violation
Courts may ultimately conclude that pricing algorithms, on their own, do not pose anticompetitive risks or violate the antitrust laws at all. The use of algorithms to access and analyze vast amounts of information about market conditions, including competitor pricing, may in fact be profoundly pro-competitive, facilitating more informed, competitive pricing that better reflects supply and demand in the marketplace.
Yet, given the focus of government enforcers and the threat of private damages actions, companies should be mindful of the potential antitrust risks posed by the use of pricing algorithms and, where business considerations permit, take steps to reduce those risks.
The DOJ opted to bring a civil suit in its first case on algorithmic pricing, so it remains to be seen whether it will bring a criminal price-fixing case on this theory. The consequences of a criminal conviction for a defendant are far greater than those of a civil order to cease the conduct. If convicted, a company faces fines of up to $100 million, or twice the gain or loss from the offense, and individuals can be sentenced to up to 10 years in prison. While most foreign competition agencies do not proceed criminally, some routinely obtain large monetary penalties for price fixing.
On top of that, in the U.S., private plaintiffs can recover treble damages from companies found to have violated the Sherman Act, and the use of class actions can further increase companies’ exposure, pressuring defendants to settle before courts and juries can definitively address the merits. Private antitrust actions are also becoming more common in foreign jurisdictions.
Minimizing Risk: Questions To Ask and Mitigation Strategies
Risk assessment begins with determining how the algorithm functions:
- What are the algorithm’s data sources, for both training the algorithm and generating prices or pricing recommendations?
- What limits are there on how your company’s data can be used in making recommendations to its competitors?
- What role does the algorithm play in pricing decisions, and what other considerations factor into those decisions?
More specifically, here are questions boards and their companies can ask, together with risk-mitigating strategies addressed to those questions.
Does the algorithm generate prices or recommendations based solely on public data and the user’s internal data?
If the pricing algorithm uses data from competitors for its pricing determinations, antitrust risk can be reduced by limiting the algorithm’s inputs exclusively to public competitor data.
What limits are there on the potential uses of your company’s data?
Limiting how the pricing algorithm provider can use the company’s data (e.g., barring its use to make recommendations to competitors) can lower antitrust risk.
How does the company communicate with clients and competitors about use of pricing algorithms?
Exercise care when communicating with competitors about adopting or using pricing algorithms, because careless communications could be misinterpreted as evidence of an agreement among competitors to use and abide by the pricing algorithm.
What information is the company sharing directly with competitors?
Communications among competitors about competitively sensitive topics, such as prices, discounts or other concessions, demand, or capacity, can raise significant antitrust concerns. They are often seen as red flags by government investigators and private plaintiffs indicating possible price-fixing or customer- or supply-allocation conspiracies. In some circumstances, exchange of such information on its own, without an agreement, can amount to an antitrust violation.
What do the documents say?
Be aware that regulators and plaintiffs will review internal communications concerning the use of pricing algorithms. Clearly document decision-making regarding their adoption or use (e.g., a unilateral decision not coordinated with or dependent on competitors’ decision-making).
Does the company promote or mandate use of the recommended price?
Unless business considerations direct otherwise, treat algorithm-generated pricing recommendations as only one data point to help inform independent pricing decisions. Antitrust risk is lower when it is apparent that a company using the algorithm neither adopts recommendations automatically nor has policies requiring their automatic adoption.
______
1 Skadden represents one of the casino-hotel defendants and is involved in the litigation over algorithmic pricing in multifamily housing.
This memorandum is provided by Skadden, Arps, Slate, Meagher & Flom LLP and its affiliates for educational and informational purposes only and is not intended and should not be construed as legal advice. This memorandum is considered advertising under applicable state laws.