
IBM mulls using its own AI chip in new cloud service to lower costs


© Reuters. FILE PHOTO: The logo for IBM is displayed on a screen on the floor of the New York Stock Exchange (NYSE) in New York, U.S., June 27, 2018. REUTERS/Brendan McDermid/File Photo

By Stephen Nellis

SAN FRANCISCO (Reuters) – International Business Machines (NYSE:IBM) is considering using artificial intelligence chips that it designed in-house to lower the costs of operating a cloud computing service it made widely available this week, an executive said Tuesday.

In an interview with Reuters at a semiconductor conference in San Francisco, Mukesh Khare, general manager of IBM Semiconductors, said the company is contemplating using a chip called the Artificial Intelligence Unit as part of its new “watsonx” cloud service.

IBM is hoping to take advantage of the boom in generative AI, technology that can write human-like text, more than a decade after Watson, its first major AI system, failed to gain market traction.

One of the barriers the old Watson system faced was high cost, which IBM is hoping to address this time. Khare said using its own chips could lower the cloud service's costs because they are highly power-efficient.

IBM announced the chip’s existence in October but did not disclose the manufacturer or how it would be used.

Khare said the chip is manufactured by Samsung Electronics (OTC:SSNLF), which has partnered with IBM on semiconductor research, and that his company is considering it for use in watsonx.

IBM has no set date for when the chip could be available for use by cloud customers, but Khare said the company has several thousand prototype chips already working.

IBM has joined other tech giants such as Alphabet (NASDAQ:GOOGL)’s Google and Amazon.com (NASDAQ:AMZN) in designing its own AI chips.

But Khare said IBM was not trying to design a direct replacement for semiconductors from Nvidia (NASDAQ:NVDA), whose chips lead the market in training AI systems with vast amounts of data.

Instead, IBM’s chip aims to be cost-efficient at what AI industry insiders call inference, which is the process of putting an already trained AI system to use making real-world decisions.

“That’s where the volume is right now,” Khare said. “We don’t want to go toward training right now. Training is a different beast in terms of compute. We want to go where we can have the most impact.”
