But the same qualities that make those graphics processing units, or GPUs, so effective at creating powerful AI systems from scratch make them less efficient at putting AI products to work.
That’s opened up the AI chip industry to rivals who think they can compete with Nvidia in selling so-called AI inference chips that are more attuned to the day-to-day running of AI tools and designed to reduce some of the huge computing costs of generative AI.
“These companies are seeing opportunity for that kind of specialized hardware,” said Jacob Feldgoise, an analyst at Georgetown University’s Center for Security and Emerging Technology. “The broader the adoption of these models, the more compute will be needed for inference and the more demand there will be for inference chips.”
WHAT IS AI INFERENCE?
It takes a lot of computing power to make an AI chatbot. It starts with a process called training or pretraining — the “P” in ChatGPT — that involves AI systems “learning” from the patterns of huge troves of data. GPUs are good at doing that work because they can run many calculations at a time on a network of devices in communication with each other.
However, once trained, a generative AI tool still needs chips to do the work — such as when you ask a chatbot to compose a document or generate an image. That’s where inferencing comes in. A trained AI model must take in new information and make inferences from what it already knows to produce a response.
GPUs can do that work, too. But it can be a bit like taking a sledgehammer to crack a nut.
“With training, you’re doing a lot heavier, a lot more work. With inferencing, that’s a lighter weight,” said Forrester analyst Alvin Nguyen.
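As a rough illustration of that cost gap (not how any particular chatbot is actually built), here is a minimal Python sketch in which a toy linear model stands in for a real network: training loops over a large batch of data many times, while inference is a single, much cheaper pass over one new input. All names, sizes, and numbers are illustrative.

```python
import numpy as np

# Toy illustration of the two phases described above: a tiny linear model
# stands in for a real neural network.

rng = np.random.default_rng(0)

# --- Training (compute-heavy): repeated passes over a large batch of data ---
X = rng.normal(size=(10_000, 8))                       # "huge trove" of example inputs
true_w = rng.normal(size=(8, 1))
y = X @ true_w + 0.1 * rng.normal(size=(10_000, 1))    # targets to learn from

w = np.zeros((8, 1))        # model weights, learned from scratch
lr = 0.01                   # learning rate
for epoch in range(500):    # many large matrix multiplications, over and over
    grad = X.T @ (X @ w - y) / len(X)   # gradient of the mean squared error
    w -= lr * grad                      # nudge the weights toward the data

# --- Inference (lightweight): one forward pass on a new input ---
new_input = rng.normal(size=(1, 8))     # e.g. a single user prompt
prediction = new_input @ w              # apply what the model already learned
print(prediction)
```

In this sketch the training loop touches all 10,000 examples on every one of its 500 passes, while answering a single query is one small matrix multiplication, which is the asymmetry that makes cheaper, inference-focused chips attractive.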