AI, Nvidia designs the processors of the future in Taiwan

All in on artificial intelligence. During the Computex conference in Taipei, Jensen Huang, CEO of Nvidia Corp., outlined the company's vision for the AI-driven future of processors. He not only revealed annual release plans for new accelerators, but also previewed two future products: the Blackwell Ultra chip for 2025 and the Rubin platform for 2026. The latter, in particular, promises significant improvements in energy efficiency, addressing growing concerns about the energy impact of AI data centers.

The strategy

The strategy of Nvidia, a leader in data center systems for artificial intelligence, extends well beyond hardware production. During his keynote at National Taiwan University, Huang illustrated how artificial intelligence is triggering a new industrial revolution, with the expectation that the technology will also spread to personal computers. Nvidia, already the main beneficiary of the huge wave of investment in AI, now seeks to expand its customer base beyond the cloud-computing giants to a broader range of industries and government agencies.

The promise: lower costs, lower energy consumption

Huang's philosophy, an approach he has labeled "CEO math," holds that growing "computational inflation" requires accelerated computing to handle huge volumes of data, and he promises 98% cost savings and a 97% reduction in power consumption thanks to Nvidia technologies. Alongside these chip innovations, the company is introducing new models and software tools, mainly oriented toward bringing AI features to PCs. At Computex, a collaboration with Microsoft and hardware partners showed off new laptops with AI enhancements under the Copilot+ brand. The presence of an Nvidia graphics card, however, significantly enhances the performance of these devices, expanding the capabilities of popular software such as games.

New horizons

In parallel, Nvidia is working on a new server design, the MGX platform, which will allow partners such as Hewlett Packard Enterprise and Dell Technologies to bring products based on its chips to market more quickly. Rivals such as Advanced Micro Devices and Intel are also adapting to this configuration, integrating their own processors alongside Nvidia chips. And that is not all: Nvidia's commitment to innovation is also reflected in the introduction of services such as Nvidia Inference Microservices, or NIM, which Huang described as "AI in a box" and which are now available to the public. These services are designed to accelerate the deployment of AI solutions and are offered free of charge, though a license is required for production use.

 