Chip companies, all eyeing this market | Shenzhen Zhonghexin Optoelectronic Technology Co., Ltd. (2024)


  • Categories: News
  • Time of issue: 2024-06-03 16:18

Nvidia's stock rose 27 percent in May, lifting its market value to $2.7 trillion, behind only Microsoft and Apple among the world's most valuable publicly traded companies. Driven by soaring demand for its artificial intelligence processors, the chipmaker reported that sales roughly tripled year over year for the third consecutive quarter.

Mizuho Securities estimates that Nvidia controls 70 to 95 percent of the market for AI chips used to train and deploy models such as OpenAI's GPT. Nvidia's pricing power is reflected in its 78 percent gross margin, a strikingly high figure for a hardware company that must manufacture and ship physical products.

Rival chipmakers Intel and AMD posted gross margins of 41 percent and 47 percent, respectively, in the latest quarter.

Some experts describe Nvidia's position in the AI chip market as a moat. Its flagship AI graphics processing units (GPUs), such as the H100, combined with the company's CUDA software, give it such a head start on the competition that switching to an alternative is almost unthinkable.

Still, Nvidia CEO Jensen Huang said he is "concerned" that the 31-year-old company could lose its edge. His net worth has soared from $3 billion to about $90 billion over the past five years. He acknowledged at a conference late last year that a number of strong competitors were emerging.

"I don't think people are trying to put me out of business," Huang said in November. "I probably know they're trying to, so that's different."

Nvidia has pledged to release a new AI chip architecture every year, rather than every other year as it has historically, and to introduce new software that embeds its chips more deeply into AI applications. But Nvidia's GPUs aren't the only chips capable of running the complex math that generative AI relies on. If less powerful chips can do the same job, Huang's concern may be justified.
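That "complex math" is, overwhelmingly, large dense matrix multiplication. As a rough, hedged illustration (toy dimensions, plain NumPy, nothing Nvidia-specific), the attention computation at the heart of transformer models reduces to operations any sufficiently fast accelerator can run:

```python
import numpy as np

def attention_scores(queries, keys):
    """Toy scaled dot-product attention weights.

    The workload AI accelerators compete on is dominated by
    matrix multiplications like the one on the first line.
    """
    # (batch, seq, dim) @ (batch, dim, seq) -> (batch, seq, seq)
    scores = queries @ keys.transpose(0, 2, 1) / np.sqrt(queries.shape[-1])
    # Numerically stable softmax over the last axis
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
q = rng.normal(size=(2, 4, 8))
k = rng.normal(size=(2, 4, 8))
weights = attention_scores(q, k)
assert weights.shape == (2, 4, 4)
assert np.allclose(weights.sum(axis=-1), 1.0)
```

The point of the sketch: nothing in the arithmetic itself is tied to one vendor's silicon; the moat lies in how fast, and with what software support, the hardware executes it.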

The shift from training AI models to so-called inference (deploying trained models) could also open opportunities for companies to replace Nvidia's GPUs, especially where alternatives are cheaper to buy and run. Nvidia's flagship chips sell for about $40,000 or more, giving customers plenty of incentive to find alternatives.
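The training-versus-inference distinction matters because the two phases have very different compute profiles. A minimal sketch, using a toy linear model rather than any real AI workload: training means many forward passes, gradient computations and weight updates, while inference is a single forward pass over frozen weights, which is why cheaper chips can plausibly compete there.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w

# Training: repeated forward passes, gradients, and updates --
# the compute-heavy phase where Nvidia GPUs dominate today.
w = np.zeros(3)
lr = 0.1
for _ in range(200):
    pred = X @ w                            # forward pass
    grad = 2 * X.T @ (pred - y) / len(y)    # backward pass (gradient)
    w -= lr * grad                          # parameter update

# Inference: one forward pass with frozen weights -- far cheaper,
# and the phase where alternative chips hope to compete.
new_x = np.array([[1.0, 1.0, 1.0]])
prediction = new_x @ w
assert np.allclose(w, true_w, atol=1e-3)
```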

"Nvidia wants to have 100 percent market share, but customers don't want Nvidia to have 100 percent market share," said Sid Sheth, co-founder of rival D-Matrix. "The opportunity is too big. If any one company owns all of it, it's not healthy."

Founded in 2019, D-Matrix plans to launch a semiconductor card for servers later this year, aiming to reduce the cost and latency of running AI models. The company raised $110 million in September.

In addition to D-Matrix, companies ranging from multinationals to emerging startups are vying for a share of the AI chip market, which could reach $400 billion in annual sales over the next five years, according to market analysts and AMD. Over the past four quarters, Nvidia has generated about $80 billion in revenue, and Bank of America estimates the company's AI chip sales were $34.5 billion last year.

Many companies taking on Nvidia's GPUs are betting that a different architecture, or certain trade-offs, can produce a chip better suited to specific tasks. Device makers are also developing technology that could eventually do much of the AI computing currently performed in large GPU clusters in the cloud.

Fernando Vidal, co-founder of 3Fourteen Research, told CNBC: "No one would deny that Nvidia is the hardware that people want most when they train and run AI models today. But from hyperscale companies developing their own chips to small startups designing their own chips, we've made some progress in leveling the playing field."

AMD CEO Lisa Su wants investors to believe there's enough room in the space for many successful companies.

"The key is that there are a lot of options," Su told reporters in December when the company introduced its latest AI chip. "I think we're going to see not just one solution, but multiple solutions."

1. Traditional chipmakers

AMD makes GPUs for gaming and, like Nvidia, adapts them for AI in data centers. Its flagship chip is the Instinct MI300X. Microsoft has purchased AMD processors and offers access to them through its Azure cloud.

At the launch event, Lisa Su emphasized the chip's strong performance in inference rather than competing with Nvidia on training. Last week, Microsoft said it was using AMD Instinct GPUs to serve its Copilot models. Morgan Stanley analysts took the news as a sign that AMD's AI chip sales could exceed $4 billion this year, the company's publicly stated goal.

Intel, which was overtaken by Nvidia in revenue last year, is also struggling to gain a foothold in artificial intelligence. The company recently released Gaudi 3, the third version of its AI accelerator. This time, Intel compared it directly to the competition, saying it's a more cost-effective alternative that is better than Nvidia's H100 at running inference and faster at training models.

Bank of America analysts recently estimated that Intel will have less than 1 percent of the AI chip market this year. Intel says it has a $2 billion backlog of orders for the chip.

The main obstacle to wider adoption is likely to be software. Both AMD and Intel have joined a large industry group, the UXL Foundation, which includes Google and is working to create a free alternative to Nvidia's CUDA for the software that controls AI hardware.

In addition to chip rivals, Nvidia's customers are also developing their own chips.

2. Customer-developed chips

One potential challenge for Nvidia is that it is competing with some of its biggest customers. Cloud providers including Google, Microsoft and Amazon are all developing processors for internal use. The big three tech companies, plus Oracle, account for more than 40 percent of Nvidia's revenue.

Amazon launched its own AI chip, under the brand name Inferentia, in 2018; it is now in its second generation. In 2021, Amazon Web Services introduced Trainium for training. Customers can't buy the chips outright but can rent systems through AWS, which markets them as more cost-effective than Nvidia's.

Google may be the cloud provider most committed to its own chips. Since 2015, the company has used its so-called tensor processing units (TPUs) to train and deploy AI models. In May, Google announced Trillium, the sixth generation of the chip, which the company says it uses to develop its models, including Gemini and Imagen.

Google also uses Nvidia chips and offers them through its cloud.

Microsoft is not as far along. The company said last year that it was developing its own AI accelerator and processor, called Maia and Cobalt, respectively.

Meta is not a cloud provider, but the company needs enormous computing power to run its software and websites and to serve ads. While Facebook's parent company is buying billions of dollars' worth of Nvidia processors, it said in April that some of its homegrown chips were already in use in its data centers and delivered "higher efficiency" than GPUs.

Analysts at JPMorgan estimated in May that the market for building custom chips for large cloud providers could be worth as much as $30 billion, growing at a potential 20 percent a year.

3. Chip startups

Venture capitalists see an opportunity for new companies to join the field. According to PitchBook, they invested $6 billion in AI semiconductor companies in 2023, up slightly from $5.7 billion a year earlier.

It's a tough space for startups because semiconductors are expensive to design, develop, and manufacture. But there are also opportunities for differentiation.

For Cerebras Systems, a Silicon Valley AI chipmaker, the focus is on the fundamental operations and bottlenecks of AI rather than the general-purpose design of GPUs. The company was founded in 2015 and was valued at $4 billion in its most recent funding round, according to Bloomberg.

Andrew Feldman, the company's chief executive, said the Cerebras WSE-2 chip combines GPU capabilities with central processing and additional memory in a single device, making it better suited to training large models.

"We use giant chips, and they use a lot of small chips," Feldman said. "They have data transmission challenges that we don't have."

Feldman said his company, whose clients include the Mayo Clinic, GlaxoSmithKline and the U.S. military, is competing with Nvidia for business with its supercomputing systems.

"There's a lot of competition, and I think that's good for the ecosystem," Feldman said.

D-Matrix's Sheth said his company plans to launch a card with its chip later this year, enabling more computation to be done in memory rather than on chips such as GPUs. D-Matrix's product can be plugged into AI servers alongside existing GPUs, easing the workload on Nvidia's chips and helping reduce the cost of generative AI.

Sheth said customers are "very receptive to new solutions and very willing to push them into the market."

4. Apple and Qualcomm, the X factors

Perhaps the biggest threat to Nvidia's data center business is a change in processing location.

Developers are increasingly betting that AI workloads will move from server farms to the laptops, PCs and phones we own.

The large models developed by OpenAI require big, powerful GPU clusters for inference, but companies such as Apple and Microsoft are developing "small models" that require less power and data and can run on battery-powered devices. They may not be as capable as the latest version of ChatGPT, but they can still handle tasks such as summarizing text or visual search.

Apple and Qualcomm are updating their chips to run AI more efficiently, adding specialized sections for running AI models, called neural processors, which can offer privacy and speed advantages.

Qualcomm recently released a PC chip that lets laptops run Microsoft's AI services on the device. The company has also invested in several chipmakers producing low-power processors to run AI algorithms outside smartphones and laptops.

Thanks to the neural engine in its chips, Apple has been touting its latest laptops and tablets as optimized for artificial intelligence. At its upcoming developer conference, Apple plans to show off a slew of new AI features, most of which will likely run on the chips it designs for the iPhone.
