Intel Vision 2024 Offers New Look at Gaudi 3 AI Chip

After first announcing the existence of the Gaudi 3 AI accelerator last year, Intel is ready to put the chip in the hands of OEMs in Q2 2024. Intel announced this and other news, including a new Xeon 6 brand and an open Ethernet standard for AI workloads, at a pre-briefing held on April 1 ahead of the Intel Vision conference, which is being held April 8-9 in Phoenix, Arizona.

Gaudi 3 AI accelerator will ship to Dell, Hewlett Packard Enterprise, Lenovo and Supermicro

The Gaudi 3 will launch with Dell, Hewlett Packard Enterprise, Lenovo and Supermicro as OEM partners.

Intel Gaudi 3 will be available from vendors in three form factors: mezzanine card, universal baseboard or PCIe CEM. Intel said Gaudi 3 delivers 40% faster time-to-train for large language models and 50% faster LLM inferencing compared with NVIDIA’s H100 AI chip.
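
Intel shared only relative figures, not the underlying baselines, so the short sketch below simply applies the stated ratios to hypothetical H100 numbers to show what the claims would mean in practice; the baseline training time and throughput are illustrative assumptions, not published benchmarks.

```python
# Illustrative reading of Intel's relative performance claims for Gaudi 3.
# Only the 40% and 50% figures come from Intel; the H100 baseline numbers
# below are hypothetical placeholders, not published benchmark results.

h100_train_hours = 100.0      # assumed H100 time-to-train for some LLM, in hours
h100_tokens_per_sec = 1000.0  # assumed H100 inference throughput, tokens/second

# "40% faster time-to-train" read as 40% less wall-clock training time.
gaudi3_train_hours = h100_train_hours * (1 - 0.40)

# "50% faster inferencing" read as 1.5x inference throughput.
gaudi3_tokens_per_sec = h100_tokens_per_sec * 1.5

print(f"Time to train:  {h100_train_hours:.0f} h -> {gaudi3_train_hours:.0f} h")
print(f"Inference rate: {h100_tokens_per_sec:.0f} -> {gaudi3_tokens_per_sec:.0f} tokens/s")
```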

Gaudi 3 may go head-to-head with NVIDIA’s newly announced AI accelerator chip, Blackwell. Gaudi 3 is “highly competitive,” said Jeff McVeigh, corporate vice president & general manager of the Software Engineering Group at Intel. McVeigh noted that real-world testing for the two products hasn’t been possible yet.

New Xeon 6 brand coming in Q2

Xeon 6 processors will come in two variants: Performance-core (P-core) and Efficient-core (E-core). E-core processors will ship in Q2 2024, with P-core processors following soon after.

The two variants of Xeon 6 processors share the same platform foundation and software stack. The Performance-core is optimized for compute-intensive and AI workloads, while the Efficient-core prioritizes power efficiency on the same workloads. Compared with prior generations, the Intel Xeon 6 processor with E-cores shows a 2.4 times improvement in performance per watt and a 2.7 times improvement in performance per rack.

Because it needs fewer server racks, the Xeon 6 processor also shows marked energy savings compared with the 2nd Gen Intel Xeon processor, for a power reduction of up to 1 megawatt.
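
Intel did not detail the deployment behind the 1-megawatt figure, but a minimal back-of-the-envelope sketch shows how the stated 2.7 times performance-per-rack gain could yield savings of that order; the baseline rack count and per-rack power draw below are assumed values chosen only to illustrate the arithmetic.

```python
# Back-of-the-envelope rack-consolidation math (illustrative only).
# The 2.7x performance-per-rack figure is Intel's stated claim; the
# baseline rack count and per-rack power draw are assumptions.

baseline_racks = 100        # assumed 2nd Gen Xeon racks for a fixed workload
per_rack_power_kw = 15.0    # assumed average power draw per rack, in kW
perf_per_rack_gain = 2.7    # Intel's stated Xeon 6 E-core improvement

# Racks needed to deliver the same total performance on Xeon 6 E-cores.
xeon6_racks = baseline_racks / perf_per_rack_gain

power_saved_mw = (baseline_racks - xeon6_racks) * per_rack_power_kw / 1000
print(f"Racks: {baseline_racks} -> {xeon6_racks:.0f}")
print(f"Estimated power reduction: {power_saved_mw:.2f} MW")
```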

Network interface card supports open Ethernet standard for AI workloads

As part of Intel’s effort to provide a wide range of AI infrastructure, the company announced an AI network interface card for Intel Ethernet Network Adapters and Intel IPUs. The AI network interface cards, which are already being used by Google Cloud, will provide a secure way to offload functions like storage, networking and container management, and to manage AI infrastructure, Intel said. The intent is to enable organizations to train and run inference on the increasingly large generative AI models Intel predicts they will want to deploy, all over Ethernet.

Intel is working with the Ultra Ethernet Consortium to create an open standard for AI networking over Ethernet.

The AI network interface cards are expected to be available in 2026.

Wide-reaching scalable systems strategy aims to smooth out AI adoption

In order to prepare for what the company predicts will be the future of AI, Intel plans to roll out a scalable systems strategy for enterprise.

“We want it to be open and for enterprises to have choice in hardware, software and applications,” said Intel Senior Vice President and General Manager of Network and Edge Group Sachin Katti at the pre-briefing.

In order to do so, the scalable systems strategy provides Intel products for all segments of AI within the enterprise: hardware, software, frameworks and tools. Intel is working with a variety of partners to make this strategy a reality, including:

  • Google Cloud.
  • Thales.
  • Cohesity.
  • NAVER.
  • Bosch.
  • Ola/Krutrim.
  • NielsenIQ.
  • Seekr.
  • IFF.
  • CtrlS Group.
  • Landing AI.
  • Roboflow.

Intel predicts a future of AI agents and AI functions

Katti said in the pre-brief that enterprise is in an age of AI copilots. Next might come an age of AI agents, which can coordinate other AI to perform tasks autonomously, followed by an age of AI functions. The rise of AI functions could mean groups of agents taking on the work of an entire department, Katti said.

SEE: Articul8, makers of a generative AI software platform, spun out of Intel in January. (TechRepublic) 

Competitors to Intel

Intel is trying to set itself apart from competitors by focusing on interoperability in the open ecosystem. Intel competes in the AI chip space with:

  • NVIDIA, which announced the next-generation Blackwell chip in March 2024.
  • AMD, which in February 2024 announced a new architectural solution for AI inferencing based on AMD Ryzen Embedded processors.

Intel competes for chip manufacturing dominance with Taiwan Semiconductor Manufacturing Co., Samsung, IBM, Micron Technologies, Qualcomm and others.

TechRepublic is covering Intel Vision remotely.
