Microsoft Corp on Wednesday unveiled its first homegrown artificial intelligence (AI) chip and cloud-computing processor in an attempt to take more control of its technology and ramp up its offerings in the increasingly competitive market for AI computing.
The company also announced new software that lets clients design their own AI assistants.
The Maia 100 chip, announced at the company’s annual Ignite conference in Seattle, will provide Microsoft Azure cloud customers with a new way to develop and run AI programs that generate content.
Microsoft is already testing the chip with its Bing and Office AI products, said Rani Borkar, a vice president who oversees Azure’s chip unit.
Microsoft’s main AI partner, ChatGPT maker OpenAI, is also testing the processor. Both Maia and the server chip, Cobalt, are to debut in some Microsoft data centers early next year.
“Our goal is to ensure that the ultimate efficiency, performance and scale is something that we can bring to you from us and our partners,” Microsoft CEO Satya Nadella said at the conference.
Maia will power Microsoft’s own AI apps first and then be available to partners and customers, he added.
Microsoft’s multiyear investment shows how critical chips have become to gaining an edge in AI and the cloud. Making them in-house lets companies wring performance and price benefits from the hardware. The initiative also could insulate Microsoft from becoming overly dependent on any one supplier, a vulnerability underscored by the industrywide scramble for Nvidia Corp’s AI chips.
Microsoft’s push into processors follows similar moves by cloud rivals. Amazon.com Inc acquired a chipmaker in 2015 and sells services built on several kinds of cloud and AI chips. Google began letting customers use its AI accelerator processors in 2018.
For a company of Microsoft’s scale, “it’s important to optimize and integrate” every element of its hardware to provide the best performance and avoid supply-chain bottlenecks, Borkar said in an interview. “And really, at the end of the day, to give customers the infrastructure choice.”
Microsoft will also sell customers services based on Nvidia’s latest H200 chip and Advanced Micro Devices Inc’s (AMD) MI300X processor, both intended for AI tasks, sometime next year. Still, the industry seems to be embarking on a lasting shift toward in-house chips. The transition is particularly bad news for Intel Corp, whose own AI chip efforts are running behind. Meanwhile, with Cobalt, Microsoft is joining efforts by Amazon and AMD to grab share in the server chip market, which Intel currently dominates.
Maia is designed to help AI systems more quickly process the massive amounts of data required to perform tasks such as recognizing speech and images. Azure Cobalt is a central processing unit that will come with 128 computing cores, or mini processors, putting it in the same league as products from Intel and AMD. The more cores the better, because they can quickly divide work into small tasks and do them all at once.
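The core-count advantage the article describes can be sketched in a few lines of Python. This is an illustrative example, not Microsoft code: it uses the standard library's `multiprocessing` module to split one large job into small tasks and run them on all available cores at once, which is the principle behind many-core server chips like Cobalt. The `square` workload is a stand-in for any per-item computation.

```python
# Sketch of the "many cores" idea: divide work into small tasks
# and run them simultaneously, one worker process per core.
from multiprocessing import Pool, cpu_count

def square(n: int) -> int:
    """One small task: square a single number."""
    return n * n

def parallel_squares(numbers: list[int]) -> list[int]:
    # The pool splits the list into chunks and distributes each
    # chunk across worker processes, one per available core.
    with Pool(processes=cpu_count()) as pool:
        return pool.map(square, numbers)

if __name__ == "__main__":
    print(parallel_squares(list(range(8))))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

With more cores, the pool can run more of these small tasks simultaneously, which is why chipmakers compete on core counts for server workloads.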
Cobalt also uses designs from Arm Holdings PLC, which proponents say are inherently more power-efficient because they evolved from chips built for battery-powered devices such as smartphones. Both chips are to be manufactured by Taiwan Semiconductor Manufacturing Co (台積電).
The company also announced Copilot Studio — software that lets clients customize AI assistant software from Microsoft or build their own AI assistants from scratch. Customers can also design ways for Microsoft’s copilot software to show up in their own existing apps.
In general, Microsoft said it is integrating its various AI copilots, as well as the Bing Chat AI features, into one piece of software that lets users access all of them.