Baidu’s ChatGPT Rival Ernie Bot Is Coming in March, Says CEO

Nvidia is cashing in on the AI craze, with its A100 chip powering most of the industry’s applications in the race for AI supremacy. Tech giants like Microsoft and Google are scrambling to incorporate AI tech into their search engines, while the likes of Stability AI and OpenAI are already ahead.

OpenAI released ChatGPT in November to immediate fanfare, compelling other tech firms to try to emulate it or come up with rival products. Such efforts require heavy-duty processing power, and Nvidia’s A100 has become one of the AI industry’s most critical tools.

Also read: US Copyright Office Says You Can’t Copyright AI-generated Images

The operating system of today’s AI

Nathan Benaich, an investor who publishes an AI-focused newsletter, says the A100 has become the “workhorse” for AI pros at the moment.

According to New Street Research, Nvidia holds 95% of the market for graphics processors that can be used for machine learning.

“Nvidia AI is essentially the operating system of AI systems today,” said Nvidia CEO Jensen Huang on the company’s earnings call with analysts on Wednesday.

The A100 is suited to the kind of machine learning models that power tools like ChatGPT, Bing AI, or Stable Diffusion. It can perform many simple calculations at once, which is important for training and using neural network models.
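
As a rough illustration of what “many simple calculations at once” means in practice, here is a minimal sketch of a single matrix multiply offloaded to a GPU. It assumes PyTorch is installed and a CUDA-capable card (such as an A100) is present; the tensor sizes are made up for demonstration.

```python
import torch

# Toy input batch and weight matrix; the sizes are arbitrary.
x = torch.randn(4096, 1024)
w = torch.randn(1024, 1024)

# Use a GPU if one is available, otherwise fall back to the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

# On a chip like the A100, this single matrix multiply is broken into
# thousands of simple multiply-add operations that run in parallel.
y = torch.matmul(x.to(device), w.to(device))

print(y.shape, y.device)
```

Training a neural network repeats operations like this billions of times, which is why specialized hardware matters so much.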

While companies like Google and Amazon are developing their own chips specially designed for AI workloads, the State of AI Report’s Compute Index indicates that AI hardware remains strongly consolidated around Nvidia.

As of December, more than 21,000 open-source AI papers said they used Nvidia chips. Further, most researchers included in the report used the V100, first released in 2017.

The future is definitely AI for Nvidia

Nvidia popularized the graphics processing unit (GPU) and gets the vast majority of its revenue from these specialized chips. The company designs and sells GPUs for gaming, crypto mining, and professional applications, as well as chip systems for use in vehicles, robotics, and other tools.

Its fourth-quarter earnings report, released this past Wednesday, showed an overall sales decline of 21%. But investors pushed the share price up by about 14% the following day, largely on the strength of the AI business.

According to the earnings report, the AI chip business – Data Center – rose 11% to $3.62 billion during Q4. The company said the growth was driven by US cloud service providers buying more of its products.

In the year to Feb. 23, the stock was up 65%. Huang said AI is at an “inflection point,” pushing businesses of all sizes to buy Nvidia’s chips to develop machine learning software.

“Generative AI’s versatility and capability has triggered a sense of urgency at enterprises around the world to develop and deploy AI strategies,” Huang said.

“The activity around the AI infrastructure that we built, and the activity around inferencing using Hopper and Ampere to inference large language models, has just gone through the roof in the last 60 days,” he added.

AI-as-a-service is on the way

The company also announced it will start offering a cloud-based AI-as-a-service option that will let smaller companies leverage its processing power to train AI models, including the kind that underpins ChatGPT.

Arete Research co-founder Brett Simpson told Yahoo Finance Live that it’s about more than just the likes of ChatGPT.

“Ultimately, the intensity of AI is far greater than conventional computing. So I think the revenue that’s associated with AI will be directly proportional to the computing opportunity here we see. The promise of technology goes way beyond a chatbot, so to speak,” said Simpson.

Yahoo Finance Live technology editor Daniel Howley, meanwhile, said: “For now, Nvidia’s future is tied directly to AI.”

Bullish sentiment for the stock

The hype around AI and ChatGPT, and Nvidia’s ability to monetize it, has drawn positive remarks from Wall Street giants like Goldman Sachs.

Goldman Sachs said it “was wrong” to sit on the sidelines in anticipation of a pullback in the company’s fundamentals and upgraded the stock from “Neutral” to “Buy” in a Thursday note.

“We believe the acceleration in AI development/adoption across hyper-scalers, as well as enterprises, will, if anything, serve the company’s leadership position as customers with any sense of urgency will lean on solutions that are scalable and available,” said Goldman.

Nvidia’s A100 was first introduced in 2020, before the more expensive H100 arrived in 2022. The company said the H100 brought in more chip revenue than the A100 in the quarter ended in January.

According to Nvidia, the H100 is the first of its data center GPUs to be optimized for transformers, an increasingly important technique that many of the latest AI applications use.
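
For context, the building block that makes transformers distinctive is scaled dot-product attention. The sketch below shows that computation in PyTorch; the shapes are illustrative only, and nothing here depends on any particular Nvidia hardware feature.

```python
import math
import torch

def scaled_dot_product_attention(q, k, v):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = q.size(-1)
    scores = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(d_k)
    weights = torch.softmax(scores, dim=-1)
    return torch.matmul(weights, v)

# Toy tensors: a batch of 2 sequences, 8 tokens each, 64-dimensional heads.
q = torch.randn(2, 8, 64)
k = torch.randn(2, 8, 64)
v = torch.randn(2, 8, 64)

out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([2, 8, 64])
```

Large language models stack many layers of this attention pattern, which is why hardware tuned for it matters.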

Nvidia says it wants to make AI training over one million percent faster. Talk about warp speed.
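
For scale, “over one million percent faster” works out to roughly a 10,000-fold speedup. Here is a quick back-of-the-envelope check (the arithmetic is illustrative, not a figure from Nvidia):

```python
# A speed increase of 1,000,000% means the new speed is the original
# plus 1,000,000/100 times the original, i.e. about 10,000x faster.
speedup_percent = 1_000_000
speedup_factor = 1 + speedup_percent / 100
print(speedup_factor)  # 10001.0
```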
