Microsoft says the new chip is competitive against in-house solutions from Google and Amazon, but stops short of comparing to ...
Robin Rowe talks about coding, programming education, and China in the age of AI, featuring TrapC, a memory-safe version of the ...
After changing its requirements for high-bandwidth memory chips, Nvidia is close to certifying a new source for supplies.
Microsoft has introduced the second generation of its in‑house artificial intelligence processor, the Maia AI chip.
Amazon Web Services (AWS) today announced the general availability of Amazon EC2 X8i instances, new memory-optimized instances powered by custom Intel Xeon 6 processors with a sustained all-core turbo ...
Today, we’re proud to introduce Maia 200, a breakthrough inference accelerator engineered to dramatically improve the ...
Microsoft unveils its Maia 200 AI chip, offering 30% better performance and improved cloud efficiency with wider customer ...
Synopsys: Memory chip shortages from AI demand expected to last until 2027
Memory chips, prices and AI infrastructure are now tightly linked as the global semiconductor market enters what industry ...
Microsoft introduces Maia 200 to reduce AI cloud costs and power use
Microsoft has unveiled its second-generation artificial intelligence chip, Maia 200, as it pushes to strengthen its cloud business and ease reliance on Nvidia processors. Demand for artificial ...
With over 100 billion transistors, Maia 200 offers "powerhouse" AI inferencing possibilities, Microsoft says.
Microsoft has announced that Azure’s US central datacentre region is the first to receive a new artificial intelligence (AI) inference accelerator, Maia 200.