Facebook owner Meta Platforms Inc is testing its first in-house chip for training artificial intelligence (AI) systems, a key milestone as it moves to design more of its own custom silicon and reduce reliance on external suppliers like Nvidia Corp, two sources told Reuters.
The world’s biggest social media company has begun a small deployment of the chip and plans to ramp up production for wide-scale use if the test goes well, the sources said.
The push to develop in-house chips is part of a long-term plan at Meta to bring down its mammoth infrastructure costs, as the company places expensive bets on AI tools to drive growth.
Meta, which also owns Instagram and WhatsApp, has forecast total expenses of US$114 billion to US$119 billion for this year, including up to US$65 billion in capital expenditure largely driven by spending on AI infrastructure.
One of the sources said Meta’s new training chip is a dedicated accelerator, meaning it is designed to handle only AI-specific tasks. This can make it more power-efficient than the integrated graphics processing units (GPUs) generally used for AI workloads.
Meta is working with Taiwan Semiconductor Manufacturing Co (TSMC, 台積電) to produce the chip, this person said.
The test deployment began after Meta finished its first “tape-out” of the chip, a significant marker of success in silicon development work that involves sending an initial design through a chip factory, the other source said.
A typical tape-out costs tens of millions of US dollars and takes roughly three to six months to complete, with no guarantee the test will succeed. A failure would require Meta to diagnose the problem and repeat the tape-out step.
Meta and TSMC declined to comment.
Meta executives have said they want to start using their own chips by next year for training, or the compute-intensive process of feeding the AI system reams of data to “teach” it how to perform.
As with its in-house inference chip, the goal for the training chip is to start with recommendation systems and later use it for generative AI products such as the Meta AI chatbot, the executives said.
“We’re working on how would we do training for recommender systems and then eventually how do we think about training and inference for gen AI,” Meta chief product officer Chris Cox said at the Morgan Stanley technology, media and telecom conference last week.
Cox described Meta’s chip development efforts as “kind of a walk, crawl, run situation” so far, but said executives considered the first-generation inference chip for recommendations to be a “big success.”
Meta previously pulled the plug on an in-house custom inference chip after it flopped in a small-scale test deployment similar to the one it is now running for the training chip, and instead reversed course, placing orders for billions of US dollars’ worth of Nvidia GPUs in 2022.
The social media company has remained one of Nvidia’s biggest customers since then, amassing an arsenal of GPUs to train its models, including its recommendation and advertising systems and its Llama foundation model series. The units also perform inference for the more than 3 billion people who use its apps each day.
The value of those GPUs has been thrown into question this year, as AI researchers increasingly express doubts about how much more progress can be made by continuing to “scale up” large language models by adding ever more data and computing power.
Those doubts were reinforced with the late-January launch of new low-cost models from Chinese start-up DeepSeek (深度求索), which optimize computational efficiency by relying more heavily on inference than most incumbent models.
In a DeepSeek-induced global rout in AI stocks, Nvidia shares lost as much as a fifth of their value at one point. They subsequently regained most of that ground, with investors wagering the company’s chips will remain the industry standard for training and inference, although they have dropped again on broader trade concerns.