Nvidia CEO Announces New AI Chips

Nvidia announced on Monday, March 18, a new generation of artificial intelligence chips and software for running AI models.

The company announced its new products at its developer conference in San Jose, California. Nvidia, one of the main beneficiaries of the artificial intelligence boom, is seeking to strengthen its position in the market. It is the world’s leading supplier of chips for AI developers and is currently posting impressive financial results on the back of strong demand for those chips. The company’s market value has surpassed $2 trillion.

After OpenAI’s ChatGPT debuted at the end of 2022, artificial intelligence became the main object of global attention. The technology’s impressive capabilities have prompted many companies to start working in this area. Nvidia’s high-end server GPUs are used to train and deploy large artificial intelligence systems, and tech giants such as Meta and Microsoft have spent billions of dollars on these chips.

The new generation of Nvidia AI graphics processors announced in San Jose is named Blackwell. The first chip in this series, the GB200, will go on sale later this year.

Nvidia is encouraging its customers to purchase more powerful chips. From the company’s financial point of view, successful early sales of a new product tend to feed later demand: if several large customers buy the new generation of chips and build them into their products, the advantage they gain pushes other firms to adopt the more modern hardware as well, creating steady demand for the new chips.

At the same time, many companies and software makers still show significant interest in the current generation of Hopper H100s and similar chips. Nvidia CEO Jensen Huang said during the developer conference that Hopper is fantastic, but bigger GPUs are needed.

The company also presented new software called NIM (Nvidia Inference Microservice). The product is designed to simplify the deployment of artificial intelligence models and may give customers another reason to choose Nvidia chips over those of its competitors.

The company’s management says Nvidia is gradually transforming from a mercenary chip provider into a platform provider, like Microsoft or Apple, on which other brands can develop software.

Jensen Huang also stated that Blackwell, in terms of its capabilities, goes beyond the traditional concept of a chip. According to him, the product should be thought of as a platform.

Nvidia enterprise vice president Manuvir Das told media representatives that the commercial product being sold used to be the graphics processor, with software intended to help customers use it in different ways. According to him, the company now has a commercial software business that makes it easier to run programs on any of its GPUs, even older ones. Manuvir Das added that chips released a few years ago may be better suited for deploying artificial intelligence than for building it.

Nvidia’s vice president of enterprise also said that developers with an interesting AI model can put it in a NIM, after which the company will make sure it runs on all its GPUs and reaches many consumers.

It is worth noting that the company updates its GPU architecture every two years, an approach that delivers a significant increase in performance. Many of the artificial intelligence models released over the past year were trained on the Hopper architecture, used in chips such as the H100, which was announced in 2022.

Nvidia said that Blackwell-based processors, including the GB200, offer a significant performance upgrade for companies in the AI industry: 20 petaflops of AI performance, compared with about 4 petaflops for the H100. According to Nvidia, the added performance will allow companies in the artificial intelligence industry to train bigger and more complex AI models.
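For a rough sense of scale, the quoted figures imply about a fivefold jump in per-chip AI compute. The minimal sketch below simply divides the two numbers as stated, assuming they are measured at comparable precision, which the announcement as reported here does not specify.

```python
# Back-of-the-envelope comparison of the performance figures quoted above.
# Assumption: the 20 and 4 petaflop numbers refer to comparable precision.
gb200_ai_petaflops = 20  # GB200 figure cited by Nvidia
h100_ai_petaflops = 4    # approximate H100 figure cited for comparison

speedup = gb200_ai_petaflops / h100_ai_petaflops
print(f"GB200 delivers roughly {speedup:.0f}x the AI compute of an H100")  # ~5x
```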

The new chip includes what Nvidia calls a transformer engine, designed specifically to run artificial intelligence based on transformers, one of the main technologies underlying ChatGPT.

The Blackwell GPU is large, combining two separately manufactured dies into a single chip produced by TSMC. The GPU will also be offered as part of an entire server, the GB200 NVL72, which combines 72 Blackwell GPUs with other Nvidia parts designed to train artificial intelligence models.

Microsoft, Google, Amazon, and Oracle will sell access to the GB200 through their cloud services. The GB200 pairs two Blackwell B200 GPUs with one Arm-based Grace processor.

Nvidia announced that Amazon Web Services will build a server cluster with 20,000 GB200 chips. The company also said that the new system can deploy an artificial intelligence model with 27 trillion parameters, which exceeds even the largest known models from other firms, including GPT-4, which reportedly has 1.7 trillion parameters.

Many artificial intelligence researchers say that bigger models with more parameters and more data can demonstrate new capabilities.

Nvidia has not yet provided pricing for the GB200 or the systems that use this chip. The Hopper-based H100 sells for between $25,000 and $40,000 per chip, and, according to analyst estimates, a complete system costs $200,000 or more.

NIM makes it easier to use older Nvidia GPUs for inference, the process of running trained artificial intelligence software. The company also stresses that the product will be compatible with the hundreds of millions of Nvidia GPUs that users already own. Inference requires less computing power than the initial training of a new artificial intelligence model.

NIM is included in the Nvidia AI Enterprise software subscription. The company notes that the product will give customers the opportunity to run their own artificial intelligence models instead of buying access to AI results as a service from other firms. With these solutions, the brand is pursuing a strategy aimed at increasing both the number of customers buying Nvidia-based servers and the number of subscribers to Nvidia AI Enterprise, a license for which costs $4,500 per GPU per year.

The company intends to work with AI industry players such as Microsoft and Hugging Face to ensure that their artificial intelligence models are tuned to run on all compatible Nvidia chips.

Tom Plumb, CEO and portfolio manager at Plumb Funds, says that Blackwell’s debut is not a surprise. According to him, the new product confirms that Nvidia is at the forefront of the chip industry and remains a leader across graphics processing. He noted that there will still be room in the market for AMD and other companies, but added that Blackwell is evidence that Nvidia’s leadership is pretty insurmountable.

Analyst Jacob Borne says that Jensen Huang’s company has the opportunity to strengthen its dominance in the artificial intelligence industry, but noted that competition from AMD, Intel, startups, and even Big Tech firms developing their own chips could weaken Nvidia’s position somewhat.

As we reported earlier, Nvidia’s CEO spoke about the prospects of artificial general intelligence.

Serhii Mikhailov

Serhii’s track record of study and work spans six years at the Faculty of Philology and eight years in the media, during which he has developed a deep understanding of the industry and honed his writing skills. His areas of expertise include fintech, payments, cryptocurrency, and financial services. He keeps a close eye on the latest developments and innovations in these fields, believing they will have a significant impact on the future direction of the economy as a whole.