
With $70M in New Series C Funding, Mythic AI Plans Mass Production of Its Inferencing Chips 

Six months after unveiling its first M1108 Analog Matrix Processor (AMP) for AI inferencing, Mythic AI has just received a new $70 million Series C investment round to bring the chips to mass production and to develop its next hardware and software products.

The new investment round, which was led by BlackRock and Hewlett Packard Enterprise (HPE), brings the startup’s total funding to $165.2 million, the company said in a May 11 (Tuesday) press release. Several new investors also joined this round, including Alumni Ventures Group and UDC Ventures.

Mythic AI said it will use the latest funding to accelerate its plans to bring the M1108 AI inferencing chips to mass production, while also expanding its customer support and developing the next generation of its hardware platform.

The M1108 is aimed at deploying edge AI across a wide range of applications, including smart homes, AR/VR, drones, video surveillance, smart cities, manufacturing and more. Because the chips combine analog compute with flash memory, they can be more affordable and consume less power than competing chips, according to the company.

Other plans for the additional investment include increasing support for Mythic AI’s growing customer base across APAC, Europe and the U.S., as well as building out its software portfolio, the company said.

Tim Vehling of Mythic AI

Tim Vehling, the senior vice president for product and business development at Mythic AI, told EnterpriseAI that since announcing the M1108 chips last year, the company has been sampling them with customers. So far, the company, which was founded in 2012, has not reported any sales revenue.

“We have not announced any customers yet, but we have interest from customers in the video surveillance, industrial, AR/VR and drone markets,” said Vehling. Mythic is a fabless operation and uses a 40nm foundry in Japan for its chip production.

“We are planning to continue innovating our analog compute-in-memory technology to bring even higher AI performance at lower power” as its chip production continues, said Vehling.

The M1108 chips were unveiled in November 2020 as analog processors designed to deliver high performance with low power requirements for edge computing. The M1108 uses the Mythic Analog Compute Engine (ACE) for analog compute-in-memory, executing deep neural network (DNN) models and storing weight parameters on-chip with no external DRAM. The chips provide concurrent, independent execution of multiple DNNs and can be scaled with multiple chips on a single board to address large DNN applications, according to Mythic.

The M1108 architecture includes 108 AMP tiles, each with a Mythic ACE, a digital SIMD vector engine, a 32-bit RISC-V nano-processor, a network-on-chip (NoC) router and local SRAM, plus four control and interface tiles that provide a four-lane PCIe interface with up to 2 GB/s of bandwidth.
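Because the AMP tiles store DNN weights locally and larger models scale across multiple chips on a board, the partitioning problem is essentially one of packing layer weights into fixed-capacity tiles. The Python sketch below is a rough, hypothetical illustration of that idea only: the 108-tile count comes from the article, but the per-tile weight capacity, layer names and sizes are placeholder assumptions, not Mythic specifications or tooling.

```python
# Hypothetical sketch: packing a DNN's per-layer weight counts onto
# AMP-style tiles that hold parameters locally (no external DRAM).
# TILES_PER_CHIP reflects the article's 108-tile figure; the per-tile
# capacity below is an illustrative placeholder, not a Mythic spec.
from dataclasses import dataclass, field

TILES_PER_CHIP = 108            # 108 AMP tiles per M1108 chip (per the article)
WEIGHTS_PER_TILE = 1_000_000    # assumed capacity, for illustration only


@dataclass
class Tile:
    used: int = 0
    layers: list = field(default_factory=list)


def pack_layers(layer_sizes: dict, tiles_per_chip: int = TILES_PER_CHIP,
                capacity: int = WEIGHTS_PER_TILE) -> list:
    """Greedy first-fit packing of layer weights onto tiles, adding
    chips as needed (mirroring the multi-chip scaling described above)."""
    chips = [[Tile() for _ in range(tiles_per_chip)]]
    for name, size in layer_sizes.items():
        remaining = size
        while remaining > 0:
            # Find a tile with spare capacity; if none, add another chip.
            tile = next((t for chip in chips for t in chip if t.used < capacity), None)
            if tile is None:
                chips.append([Tile() for _ in range(tiles_per_chip)])
                continue
            chunk = min(remaining, capacity - tile.used)
            tile.used += chunk
            tile.layers.append((name, chunk))
            remaining -= chunk
    return chips


# Example: a small, made-up CNN whose weights fit comfortably on one chip.
model = {"conv1": 150_000, "conv2": 600_000, "conv3": 2_400_000, "fc": 500_000}
chips = pack_layers(model)
print(f"chips needed: {len(chips)}, tiles used: "
      f"{sum(1 for chip in chips for t in chip if t.used)}")
```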

Device makers and original equipment manufacturers (OEMs) can choose from the single-chip M1108 Mythic AMP or a range of PCIe card configurations, including M.2 M-key and M.2 A+E-key form factors, to fit a variety of needs.

Dan Olds, the chief data officer at Intersect360 Research, told EnterpriseAI that Mythic’s focus on the high-end edge AI market is an interesting choice for the company.

Dan Olds, analyst

“Their AI solution is also very inexpensive compared to server-based inferencing, which should give them a significant advantage in the market,” said Olds. “However, everything related to AI these days is highly competitive. They’re going to face a number of equally well-funded competitors, but not many competitors who are taking their approach when it comes to analog versus digital. My sense of the market is that most of the burgeoning AI chipsters are trying to shoehorn existing technology into a form factor that will be good enough for edge AI – Mythic is taking a radically different approach.”

Because Mythic uses its ultra-low-power AMP technology to accomplish its AI tasks, “it is able to dynamically adjust power based on workload with near zero power required when it is idle,” said Olds. “I’m impressed by their use of non-volatile flash memory as both a storage and processing medium, it’s a big step forward and one that will enable AI in devices ranging from AR/VR glasses to remote sensors.”

Several other AI inferencing chip companies have also been busy with similar funding announcements in the last month.

AI and ML accelerator startup Groq Inc. announced on April 14 the closing of a $300 million Series C fundraising round that had been rumored for more than a month. Co-led by Tiger Global Management and D1 Capital, with participation from The Spruce House Partnership and Addition, the round brings Groq’s total funding to $367 million, of which $300 million has been raised since the second half of 2020, according to the company.

And on April 13 SambaNova Systems announced that it is getting another $676 million in new funding, which it says it will use to directly take on AI market leader Nvidia. With the latest large cash infusion, SambaNova says it now has total funding of more than $1 billion and a valuation above $5 billion.
