    Technology

    Microsoft launches Maia 200 chip for Azure AI inference

    January 28, 2026

    MENA Newswire, SAN FRANCISCO: Microsoft on Jan. 26 introduced Maia 200, the second generation of its in-house artificial intelligence accelerator, built to run AI models in production across Azure data centres. The company said Maia 200 is designed for inference, the stage where trained models generate responses to live requests, and will be used to support a range of Microsoft AI services.

Microsoft Maia 200 targets faster AI inference in Azure data centres using custom silicon. (AI-generated image)

    Maia 200 is manufactured on TSMC’s 3-nanometer process and includes more than 140 billion transistors, Microsoft said. The chip pairs compute with a new memory system that includes 216 gigabytes of HBM3e high-bandwidth memory and about 272 megabytes of on-chip SRAM, aimed at sustaining large-scale token generation and other inference-heavy workloads.
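As a rough illustration of what that memory budget implies (an illustrative back-of-envelope calculation, not a Microsoft figure): model weights occupy roughly parameters × bits ÷ 8 bytes, so lower-precision formats let larger models, plus their key-value caches, fit inside 216 gigabytes of HBM3e.

```python
# Illustrative back-of-envelope sizing only -- the model sizes below are
# hypothetical examples, not Microsoft-published figures for Maia 200.

HBM_GB = 216  # Maia 200's stated HBM3e capacity

def weight_footprint_gb(params_billion: float, bits: int) -> float:
    """Approximate weight storage: parameters * bits / 8 bytes, in gigabytes."""
    return params_billion * 1e9 * bits / 8 / 1e9

for params_b in (70, 180, 400):      # hypothetical model sizes
    for bits in (16, 8, 4):          # common inference precisions
        gb = weight_footprint_gb(params_b, bits)
        fits = "fits" if gb < HBM_GB else "exceeds HBM"
        print(f"{params_b}B params @ {bits}-bit ≈ {gb:6.0f} GB -> {fits} ({HBM_GB} GB)")
```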

    Microsoft said Maia 200 delivers more than 10 petaflops of performance at 4-bit precision and about 5 petaflops at 8-bit precision, formats commonly used to run modern generative AI efficiently. The company also said the system is designed around a 750-watt power envelope and is built with scalable networking so chips can be linked for larger deployments.
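The article does not describe how those low-precision formats are applied on Maia 200. As a generic illustration of why 8-bit (and narrower) formats cut memory and bandwidth for inference, the sketch below shows simple symmetric int8 weight quantization in NumPy; it is not the numeric format or calibration scheme the chip actually uses.

```python
import numpy as np

# Generic symmetric int8 quantization sketch -- illustrative only.

rng = np.random.default_rng(0)
weights = rng.standard_normal((4096, 4096)).astype(np.float32)  # fp32 layer

scale = np.abs(weights).max() / 127.0            # one scale per tensor
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
dequant = q.astype(np.float32) * scale           # values the hardware would effectively compute with

print("fp32 size:", weights.nbytes / 1e6, "MB")
print("int8 size:", q.nbytes / 1e6, "MB")        # 4x smaller than fp32, 2x smaller than fp16
print("max abs error:", np.abs(weights - dequant).max())
```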

    The company said the new hardware has begun coming online in an Azure U.S. Central data centre in Iowa, with an additional location planned in Arizona. Microsoft described Maia 200 as its most efficient inference system deployed to date, reporting a 30% improvement in performance per dollar compared with its existing inference systems.

    AI inference focus and Azure deployment

    Microsoft said Maia 200 is intended to support AI products and services that rely on high-volume, low-latency model execution, including workloads running in Azure and Microsoft’s own applications. The company said it has designed the chip and the surrounding system as part of an end-to-end infrastructure approach that includes silicon, servers, networking and software for deploying AI models at scale.

    Alongside the chip, Microsoft announced early access to a Maia software development kit for developers and researchers working on model optimization. The company said the tooling is aimed at helping teams compile and tune models for Maia-based systems, and is structured to fit into common AI development workflows used for deploying inference in the cloud.
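The article does not detail the Maia SDK's interfaces. As a hedged sketch of how such tooling typically slots into an existing workflow, the example below exports a small PyTorch model to ONNX, a common hand-off point before a vendor toolchain compiles and tunes the graph for its accelerator; the export step is a stand-in, not the actual Maia SDK.

```python
import torch

# Generic "prepare a model for a cloud inference target" workflow.
# The real Maia SDK interfaces are not described in the article; ONNX export
# is used here only as a common stand-in for handing a model to a compiler.

class TinyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(512, 512)

    def forward(self, x):
        return torch.relu(self.linear(x))

model = TinyModel().eval()
example_input = torch.randn(1, 512)

# Export a deployable graph; a target-specific toolchain would then compile
# and tune this artifact for the accelerator.
torch.onnx.export(model, example_input, "tiny_model.onnx", opset_version=17)
print("exported tiny_model.onnx for downstream compilation and tuning")
```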

    Performance claims and model support

    Microsoft said Maia 200 is built to run large language models and advanced reasoning systems, and that it will be used for internal and hosted model deployments in Azure. The company has positioned the chip as a production inference accelerator, distinguishing it from training-focused systems that are typically used to build models before deployment.

Microsoft has accelerated its custom silicon work as demand has grown for compute to serve generative AI applications, where the cost and availability of accelerators can affect how quickly services scale. Maia 200 follows Maia 100, which Microsoft introduced in 2023, and is the latest generation in the company's dedicated AI accelerator line for data centre inference.
