
    AI chip market will more than double in size over the next five years

    There has been a lot of talk about the potential of artificial intelligence (AI) solutions, and while there have been occasional breakthroughs, the general consumer still views AI as a novelty technology.

    That may soon change. According to two new reports from ABI Research, the AI chip market will more than double over the next five years.

    The first report, entitled, “Cloud AI Chipsets: Market Landscape and Vendor Positioning,” focuses on how cloud AI inference and training services are growing rapidly. The resulting AI chipset market is expected to grow from US$4.2 billion in 2019 to US$10 billion in 2024.

    For those unfamiliar, the cloud market for AI chipsets is broken into three segments. The public cloud is hosted by cloud service providers like AWS, Microsoft, Google, Alibaba, and others.

    Then there are enterprise data centers, which are effectively private clouds, plus what ABI calls “hybrid cloud,” meaning offerings that combine public and private clouds (VMware, HPE, Dell).

    The report also identified an additional, emerging segment – Telco clouds. This refers to cloud infrastructure deployed by telecom companies for their core network, IT and edge computing workloads.

    This new segment represents a big opportunity for AI chipset makers. Companies like Huawei, and to a lesser extent Nokia, are already rolling out ASICs that are optimized for telco network functions.

    ABI’s second report, entitled “Edge AI Chipsets: Technology Outlook and Use Cases,” puts the edge AI inference chipset market at US$1.9 billion in 2018. The report also identified, perhaps surprisingly, a market for training at the edge, which it placed at US$1.4 million for the same year.

    These numbers raise an obvious question: which applications are doing training at the edge today? Succinctly, they include gateways (historians or device hubs) and on-premise servers (in private clouds, but geographically located where the AI data is generated). Chipsets designed for training on on-premise servers include Nvidia’s DGX systems, Huawei’s gateways and servers built around its Ascend 910 chipset, and system-level products aimed at on-premise data centers from companies such as Cerebras Systems, Graphcore, and Habana Labs.

    All of that being said, the market for training at the edge will remain small, as most companies still prefer the cloud for AI training.

    ELE Times Research Desk (https://www.eletimes.com)
