

© Reuters. FILE PHOTO: The logo for IBM is displayed on a screen on the floor of the New York Stock Exchange (NYSE) in New York, U.S., June 27, 2018. REUTERS/Brendan McDermid/File Photo

By Stephen Nellis

SAN FRANCISCO (Reuters) – International Business Machines is considering the use of artificial intelligence chips that it designed in-house to lower the costs of operating a cloud computing service it made widely available this week, an executive said on Tuesday.

In an interview with Reuters at a semiconductor conference in San Francisco, Mukesh Khare, general manager of IBM Semiconductors, said the company is contemplating using a chip called the Artificial Intelligence Unit as part of its new “watsonx” cloud service.

IBM is hoping to take advantage of the boom in generative AI technologies that can write human-like text more than a decade after Watson, its first major AI system, failed to gain market traction.

One of the barriers the old Watson system faced was high costs, which IBM is hoping to address this time. Khare said using its own chips could lower cloud service costs because they are very power efficient.

IBM announced the chip’s existence in October but did not disclose the manufacturer or how it would be used.

Khare said the chip is manufactured by Samsung Electronics, which has partnered with IBM on semiconductor research, and that his company is considering it for use in watsonx.

IBM has no set date for when the chip could be available for use by cloud customers, but Khare said the company has several thousand prototype chips already working.

IBM has joined other tech giants such as Alphabet’s Google and Amazon.com in designing its own AI chips.

But Khare said IBM was not trying to design a direct replacement for semiconductors from Nvidia, whose chips lead the market in training AI systems with vast amounts of data.

Instead, IBM’s chip aims to be cost-efficient at what AI industry insiders call inference: the process of putting an already trained AI system to work making real-world decisions.

“That’s where the volume is right now,” Khare said. “We don’t want to go toward training right now. Training is a different beast in terms of compute. We want to go where we can have the most impact.”
