
Inside Microsoft’s AI Announcements at Ignite 

From new copilots and AI development tools to vector search and AI chips, artificial intelligence featured prominently at Microsoft's annual Ignite developer conference last week. The company also unveiled some data news around OneLake and Microsoft Fabric.

It would be an understatement to say that Microsoft is bullish on copilots. “Microsoft is the Copilot company,” the company claims, “and we believe in the future there will be a Copilot for everyone and for everything you do.”

To that end, the company made a slew of copilot-related announcements and updates at Ignite 2023. For starters, it announced the general availability of Copilot for Microsoft 365, which it originally unveiled in March.

Since early adopters first started working with Copilot for Microsoft 365, Microsoft has made several additions, including a new dashboard that shows what the copilot is doing, new personalization capabilities, and new whiteboarding and note-taking capabilities in Copilot for Outlook. Updates have also been made to the Copilots for PowerPoint, Excel, and Microsoft Viva.

Outlook is getting a Copilot too

There's also a new Copilot for Service, which is targeted at customer service professionals. Security Copilot, which Microsoft launched earlier this year, will play a prominent role in the unified platform that combines Sentinel security analytics with Microsoft Defender XDR.

Copilot for Azure, meanwhile, serves as an AI companion for cloud administrators. “More than just a tool,” Microsoft declares, “it is a unified chat experience that understands the user’s role and goals, and enhances the ability to design, operate and troubleshoot apps and infrastructure.”

The company also rolled out Copilot Studio, a low-code tool designed to allow Microsoft 365 users to build their own custom copilots and connect them to business data. Its Bing Chat and Bing Chat Enterprise offerings have been replaced with (you’ll never guess) Copilot. “When you give Copilot a seat at the table,” the company says, “it goes beyond being your personal assistant to helping the entire team.”

Organizations that use Microsoft Teams to collaborate will soon be able to spin up 3D virtual meeting places using GenAI. Microsoft says its Teams customers will be able to request the creation of 3D meetings and objects using its AI Copilot system. The virtual reality (VR) version of Teams is due in January.

OpenAI and Nvidia Partnerships

Microsoft has a close partnership with OpenAI and is an investor in the company. All of the new capabilities that OpenAI announced two weeks ago at its DevDay event, such as GPT-4 Turbo and GPTs, will also be offered by Microsoft via the Azure OpenAI Service.

“As OpenAI innovates, we will deliver all of that innovation as part of Azure OpenAI,” Microsoft CEO Satya Nadella said.

As far as the timeline goes, the GPT-3.5 Turbo model with a 16K token prompt length will be generally available soon, GPT-4 Turbo will be available by the end of the month, and GPT-4 Turbo with Vision will soon be available in preview.
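For a sense of what consuming those models looks like in practice, here is a minimal Python sketch of a chat completion against an Azure OpenAI Service deployment using the openai package. The endpoint, deployment name, and API version below are placeholders, not values from the announcement.

```python
# Minimal sketch: calling a GPT-4 Turbo deployment through Azure OpenAI Service.
# The endpoint, deployment name, and API version are placeholders -- substitute
# the values from your own Azure OpenAI resource.
import os
from openai import AzureOpenAI  # openai >= 1.0

client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder endpoint
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-12-01-preview",  # assumed preview API version
)

response = client.chat.completions.create(
    model="gpt-4-turbo",  # your *deployment* name, not necessarily the model family name
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize Microsoft's Ignite AI news in one sentence."},
    ],
)
print(response.choices[0].message.content)
```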

Another partner critical to Microsoft's ambitions is Nvidia. The GPU maker and the software giant announced that Nvidia's new AI foundry service, which includes Nvidia AI Foundation Models, the NeMo framework, and DGX Cloud AI supercomputing, will be available on Azure.

 

Nvidia CEO Jensen Huang joined Microsoft CEO Nadella on stage. “You invited Nvidia’s ecosystem, all of our software stacks, to be hosted on Azure,” Huang said. “There’s just a profound transformation in the way that Microsoft works with the ecosystem.”

AI Development

The company made several announcements around AI development, including rolling out Azure AI Studio, which the company describes as a “hub” for exploring, building, testing, and deploying GenAI apps, or even your own custom copilots.

The company also unveiled a new offering called Windows AI Studio that allows developers to build and run AI models directly on the Windows operating system. Windows AI Studio will let developers access and experiment with a variety of language models, such as Microsoft's own Phi, Meta's Llama 2, and open source models sourced from Azure AI Studio or Hugging Face.

It also rolled out Model-as-a-Service, which will give developers access to the latest AI models from its model catalog. AI developers will be able to use Llama 2, upcoming premium models from Mistral, and Jais from G42 as API endpoints, the company says.
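In practice, "as an API endpoint" means a hosted HTTPS endpoint you call over REST. The sketch below is purely illustrative: the endpoint URL, payload fields, and authorization header are hypothetical placeholders rather than the documented Model-as-a-Service contract, which each endpoint's own documentation defines.

```python
# Hypothetical sketch of calling a hosted model endpoint (e.g. a Llama 2 deployment).
# The URL, auth scheme, and payload schema are illustrative placeholders only.
import os
import requests

endpoint = "https://my-llama2-endpoint.example.inference.ai.azure.com/v1/completions"  # placeholder
headers = {
    "Authorization": f"Bearer {os.environ['MAAS_API_KEY']}",  # placeholder auth scheme
    "Content-Type": "application/json",
}
payload = {"prompt": "Explain vector search in one paragraph.", "max_tokens": 200}  # placeholder schema

resp = requests.post(endpoint, headers=headers, json=payload, timeout=60)
resp.raise_for_status()
print(resp.json())
```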

Vector search, a feature of Azure AI Search, is now generally available. The company also added a new "prompt flow" capability to Azure Machine Learning, which it says will "streamline the entire development lifecycle" of GenAI and LLM apps.
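To illustrate what a vector query looks like, here is a rough sketch using the azure-search-documents Python client. The service endpoint, index name, vector field, and query embedding are placeholders, and the exact client classes have shifted across SDK versions, so treat this as indicative rather than definitive.

```python
# Rough sketch of a vector query against an Azure AI Search index using the
# azure-search-documents client (11.4+). Endpoint, index, field names, and the
# query embedding are placeholders.
import os
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.models import VectorizedQuery

client = SearchClient(
    endpoint="https://my-search-service.search.windows.net",  # placeholder
    index_name="docs-index",                                  # placeholder
    credential=AzureKeyCredential(os.environ["SEARCH_API_KEY"]),
)

query_vector = [0.01] * 1536  # stand-in for a real query embedding

results = client.search(
    search_text=None,  # pure vector query; pass text here as well for hybrid search
    vector_queries=[VectorizedQuery(vector=query_vector, k_nearest_neighbors=3, fields="contentVector")],
    select=["title", "content"],
)
for doc in results:
    print(doc["title"])
```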

New Chips

 

Microsoft CEO Satya Nadella holding up a new Arm chip (Image source: Microsoft)

Microsoft unveiled a new Arm-based CPU this week. Dubbed the Azure Cobalt, the new chip is 40% faster than the commercial Arm chips it currently uses, the company says. The Azure Cobalt will be offered exclusively in the Azure cloud and is designed for cloud workloads.

It also announced Azure Maia, which it calls an “AI accelerator chip” that’s designed to run cloud-based training and inferencing for AI workloads such as OpenAI models, Bing, GitHub Copilot and ChatGPT.

Some Data Stuff Too

It wasn’t all models all of the time at Ignite. Data, after all, lies at the heart of AI, and Microsoft made some data-related announcements at Ignite.

For instance, it announced that Microsoft Fabric OneLake, which it announced earlier this year, is available as a data store in Azure Machine Learning. The company says this will make it easier for data engineers to share “machine learning-ready data assets developed in Fabric.”

Microsoft announced the GA of Azure Data Lake Storage Gen2 (ADLS Gen2) "shortcuts," which will allow data engineers "to connect to data from external data lakes in ADLS Gen2 into OneLake through a live connection with target data."

The company also supports “Amazon S3 shortcuts” in OneLake, which it says will allow customers to “create a single virtualized data lake” that spans Amazon S3 buckets and OneLake, thereby eliminating the latency involved with copying data.

You can access Microsoft’s full slate of AI news from Ignite 2023 here. The full “book ‘o news,” including all 100 product announcements made at the show, is available here.

Editor's note: This article originally appeared on Datanami.

About the author: Alex Woodie

Alex Woodie has written about IT as a technology journalist for more than a decade. He brings extensive experience from the IBM midrange marketplace, including topics such as servers, ERP applications, programming, databases, security, high availability, storage, business intelligence, cloud, and mobile enablement. He resides in the San Diego area.
