March 22 (Reuters) – Nvidia Corp (NVDA.O) on Tuesday announced new chips and technologies that it said will boost the computing speed of increasingly complicated artificial intelligence algorithms, stepping up competition against rival chipmakers vying for lucrative data center business.
The company provided details of the new graphics processing units (GPUs) that will be at the core of AI infrastructure, releasing the H100 chip and a new processor called the Grace CPU Superchip, based on British chip firm Arm Ltd’s technology. It is the first Arm-based chip Nvidia has unveiled since its deal to buy Arm fell apart last month.
Nvidia also announced its new supercomputer “Eos”, which it said will be the world’s fastest AI system when it begins operation later this year.
“Data centers are becoming AI factories – processing and refining mountains of data to produce intelligence,” said Nvidia Chief Executive Officer Jensen Huang at Nvidia’s AI developer conference online, calling the H100 chip the “engine” of AI infrastructure.
Nvidia said the new technologies together will help reduce computing times from weeks to days for some work involving training AI models.
Companies have been using AI and machine learning for a multitude of tasks, from recommending the next video to watch on TVs and cell phones to discovering new drugs.
“It’s clear from the latest announcements that Nvidia is becoming a more significant threat to Intel and AMD in the data center and cloud computing markets,” said Bob O’Donnell, chief analyst at TECHnalysis Research.
Intel Corp (INTC.O) has been the biggest maker of central processors for data centers, but competition for the lucrative, fast-growing space has been rising.
However, Vlad Galabov, head of the cloud and data center research practice at research firm Omdia, said he has concerns about the H100 chip’s power consumption, which he said might limit the processor’s broad market appeal.
Nvidia Chief Financial Officer Colette Kress said that with the new chips pushing AI computing forward, the company’s market opportunity was about a trillion dollars, spanning gaming, chips and systems, and enterprise businesses.
Nvidia, whose open-source software has been a key driver for companies to use its chips, said it was looking to monetize its software business even more in the future.
“Already we have been selling software to our enterprises and this is a couple hundred million dollars today and we believe this is a growth opportunity for us,” Kress said, adding that going forward the software business will help Nvidia’s gross margins improve at a time when chip component shortages and supply constraints have increased costs.
Software for the automotive market will also be a key driver going forward, Huang said. “Auto is on its way to be our next multi-billion dollar business,” he said.
Nvidia started shipping its autonomous vehicle computer “Drive Orin” this month, and Chinese electric vehicle maker BYD Co Ltd (002594.SZ) and luxury electric car maker Lucid Motors (LCID.O) will use Nvidia Drive for their next-generation fleets, he said.
Danny Shapiro, Nvidia’s vice president for automotive, said there was $11 billion worth of automotive business in the “pipeline” over the next six years, up from the $8 billion it forecast last year. The growth in anticipated revenue will come from hardware and from increased, recurring revenue from Nvidia software, Shapiro said.
Nvidia shares closed down 0.8% at $265.24 on the Nasdaq.
Reporting By Jane Lanhee Lee, additional reporting by Joseph White; Editing by Bernard Orr