Competition watchdogs have vowed to protect open and fair competition amid the risks that the AI market could bring.
Regulatory bodies from the US, UK and Europe have signed a joint statement promising to protect consumers from unfair competition practices in the AI space.
As generative AI continues to evolve and Big Tech companies invest heavily in their own models, the statement promises that the signatories will work “in the interests of fair, open and competitive markets”.
The statement was signed by EU commissioner for competition Margrethe Vestager, UK Competition and Markets Authority CEO Sarah Cardell, US Federal Trade Commission chair Lina M Khan, and Jonathan Kanter, assistant attorney general at the US Department of Justice.
“Guided by our respective laws, we will work to ensure effective competition and the fair and honest treatment of consumers and businesses. This is grounded in the knowledge that fair, open and competitive markets will help unlock the opportunity, growth and innovation that these technologies could provide,” the statement reads.
“We are working to share an understanding of the issues as appropriate and are committed to using our respective powers where appropriate.”
An inflection point for AI
The statement also highlighted the current “technological inflection point” the world has reached with the evolution of AI, which can introduce new means of competing.
“This requires being vigilant and safeguarding against tactics that could undermine fair competition,” it said.
“Given the speed and dynamism of AI developments, and learning from our experience with digital markets, we are committed to using our available powers to address any such risks before they become entrenched or irreversible harms.”
One major concern for AI competition is the concentrated control of key inputs such as specialised chips or data at scale. These could be in the hands of a small number of companies that would then be able to exploit their position of power.
Another concern outlined in the statement is companies’ ability to extend their market power into AI-related markets due to the expansive use of foundation models. This can give tech giants the ability to control the channels through which AI or AI-enabled services are distributed.
In a statement from the UK’s CMA, Cardell said AI is a borderless technology. “That’s why we’ve come together with our EU and US partners to set out our commitment to help ensure fair, open and effective competition in AI drives growth and positive change for our societies,” she said.
In a post on X, Vestager said AI comes with “unique growth and innovation power that needs open and contestable markets to unlock its full potential”.
Competition heats up
Following the mass adoption of OpenAI’s ChatGPT, major tech giants were quick to enter the fray with their own large language models. From Google’s Bard, which got off to a rocky start, to Meta’s Llama, the latest version of which was just released, it is undeniable that this is Big Tech’s current battleground. In fact, the UK’s CMA launched a competition probe into Microsoft and Inflection AI earlier this month.
But fair competition and open markets have been a concern within the tech world for many years and competition authorities around the world have been kept busy with investigations into potential unfair practices.
In recent years, some major deals that have come under scrutiny include Microsoft’s acquisition of Activision Blizzard, Amazon’s plan to buy Roomba maker iRobot, and a multibillion-dollar merger between Adobe and Figma.
While the Microsoft-Activision deal did eventually go through following several concessions, the regulatory pressure proved to be too much for Amazon, which walked back its plans to buy iRobot in January of this year. Similarly, Adobe and Figma abandoned their major deal following intense scrutiny from multiple competition watchdogs.