Microsoft Build 2025 kicked off with a clear focus: empowering developers to build smarter, faster, and more securely with AI. This year's Build is packed with updates that directly impact how developers design, optimize, and scale AI-driven software. Here's a breakdown of the most relevant announcements.
The renaming of Windows Copilot Runtime to Windows AI Foundry is more than cosmetic. This rebrand, announced at Microsoft Build 2025, aligns Windows with Azure AI Foundry, offering parity in lifecycle support, from model selection and optimization to fine-tuning and deployment, directly on-device.
For developers building latency-sensitive or privacy-preserving AI features, this creates a local-first stack capable of bypassing the cloud entirely.
At the core of Foundry is Windows ML, the inferencing engine that supports ONNX models and runs efficiently across a wide range of hardware, from CPUs and GPUs to NPUs.
This flexibility allows developers to bring their own models (BYOM) while still leveraging platform-optimized inference, without needing to rewrite for each hardware backend.
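Since the new Windows ML is built on the ONNX Runtime, the basic bring-your-own-model flow can be sketched with the onnxruntime Python package. This is a minimal illustration, not Windows ML's own API surface; the model path, input name, and tensor shape are placeholders:

```python
# Minimal ONNX inference sketch with onnxruntime (Windows ML layers its own
# APIs on top of this engine; "model.onnx" and the tensor shape are placeholders).
import numpy as np
import onnxruntime as ort

# Prefer a hardware-accelerated execution provider if present, else fall back to CPU.
providers = [p for p in ("DmlExecutionProvider", "CPUExecutionProvider")
             if p in ort.get_available_providers()]
session = ort.InferenceSession("model.onnx", providers=providers)

input_name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)  # shape depends on the model
outputs = session.run(None, {input_name: dummy})
print(outputs[0].shape)
```

The same model file runs unchanged whether the selected provider targets the CPU, GPU, or NPU, which is the point of the BYOM story.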
With native access to open-source model catalogs, developers can now discover, download, and run models directly on-device. This capability, highlighted at Microsoft Build, makes it easier to build powerful local AI apps.
Inbox AI APIs on Copilot+ PCs now offer ready-to-use building blocks for common language and vision tasks, so apps can tap on-device models without bundling their own.
In addition, LoRA fine-tuning is now supported natively for the Phi Silica SLM (Small Language Model). Developers can personalize models with their own data, directly on-device—without full retraining.
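Phi Silica fine-tuning goes through Microsoft's own tooling rather than a public Python API, but the LoRA idea itself is easy to see in code. Below is a conceptual sketch using the Hugging Face peft library with an openly available Phi model as a stand-in; only the small low-rank adapter matrices are trained, never the full set of base weights:

```python
# Conceptual LoRA sketch with Hugging Face peft (a stand-in for the Phi Silica
# workflow, which uses Microsoft's own tooling rather than this API).
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("microsoft/phi-2")  # openly available stand-in
lora = LoraConfig(
    r=8,                                   # rank of the adapter matrices
    lora_alpha=16,                         # scaling factor for the adapter output
    target_modules=["q_proj", "v_proj"],   # attach adapters to attention projections
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora)
model.print_trainable_parameters()  # typically well under 1% of the base model's weights
# `model` can now be fine-tuned on your own data with a standard training loop.
```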
New APIs for Semantic Search & RAG
You can now build semantic search and retrieval-augmented generation (RAG) pipelines using local data, completely within Windows apps—no external cloud dependency required.
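The Windows APIs for this are new, so as a stand-in, here is a minimal local RAG sketch in Python: sentence-transformers for on-device embeddings, plain NumPy for retrieval. The document list and model name are just examples, and the final prompt would be handed to a local model such as Phi Silica:

```python
# Minimal local RAG sketch: embed documents, retrieve by cosine similarity,
# then build a grounded prompt for a local model. Not the Windows API itself.
import numpy as np
from sentence_transformers import SentenceTransformer

docs = [
    "The quarterly report is due on June 30.",
    "Expense claims must include a scanned receipt.",
    "The laptop refresh program starts in Q3.",
]
embedder = SentenceTransformer("all-MiniLM-L6-v2")  # small model, runs entirely on-device
doc_vecs = embedder.encode(docs, normalize_embeddings=True)

def retrieve(query: str, k: int = 2) -> list[str]:
    q = embedder.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ q                    # cosine similarity (vectors are unit-normalized)
    return [docs[i] for i in np.argsort(scores)[::-1][:k]]

question = "When is the quarterly report due?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
# `prompt` would be passed to a local SLM (e.g. Phi Silica) for the final answer.
```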
Windows 11 now includes native support for Model Context Protocol (MCP), allowing apps to act as context providers or skill endpoints for local AI agents.
Think of MCP as a universal adapter: agents can dynamically interface with any app exposing MCP hooks—like commands, views, or data—enabling a modular agentic system.
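To make the "universal adapter" idea concrete, here is a tiny MCP server sketched with the official Python SDK (the `mcp` package). The tool and resource shown are invented examples of the kind of hooks an app might expose to local agents:

```python
# Tiny MCP server sketch using the official Python SDK (pip install "mcp[cli]").
# The tool and resource names are invented examples of app-exposed hooks.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("notes-app")

@mcp.tool()
def create_note(title: str, body: str) -> str:
    """Create a note and return a confirmation an agent can act on."""
    # A real app would persist the note here.
    return f"Created note '{title}' ({len(body)} characters)."

@mcp.resource("notes://recent")
def recent_notes() -> str:
    """Expose recent notes as read-only context for agents."""
    return "Standup summary\nRelease checklist"

if __name__ == "__main__":
    mcp.run()  # serves the tool and resource over stdio by default
```

Any MCP-aware agent can discover `create_note` and `notes://recent` at runtime, which is exactly the modular, plug-in behavior described above.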
MCP is being integrated across Microsoft's ecosystem, from Windows 11 and GitHub to Copilot Studio and Azure AI Foundry.
This unified approach was a key theme at Microsoft Build 2025, underscoring Microsoft’s commitment to seamless, scalable AI development.
App Actions allow developers to expose granular app functions, such as "generate report," "start timer," or "analyze image," as invokable actions at the OS level. These actions are discoverable by Copilot, AI agents, and the system itself, providing deeper OS integration and opening new user acquisition channels.
Designed with intent-based invocation in mind, App Actions let your features surface contextually across Windows, reducing friction and boosting usage, all without modifying core app logic. This capability was highlighted as a key developer productivity enhancer at Microsoft Build 2025.
The VBS Enclave SDK enables apps to execute sensitive operations in isolated, hardware-backed enclaves using Hyper-V-based Virtualization Based Security. Ideal for credential handling, DRM, or confidential inference workloads, this keeps code and data shielded from the OS or other processes.
Additionally, Microsoft introduced Post-Quantum Cryptography (PQC) APIs at Build, helping developers future-proof apps against emerging quantum threats. These APIs allow early adoption of quantum-safe encryption algorithms, securing apps for the next generation of hardware.
In a major announcement at Microsoft Build 2025, Microsoft officially open-sourced WSL (Windows Subsystem for Linux), publishing its core components on GitHub.
Developers now have direct access to WSL internals and APIs, paving the way for community contributions, faster fixes, and custom integrations built on top of WSL.
Also announced was the upcoming open-sourcing of the GitHub Copilot Chat extension for VS Code, enabling developers to customize AI workflows directly inside their editor.
These developer tooling upgrades were showcased at Microsoft Build 2025, underscoring Microsoft’s commitment to enhancing developer productivity and flexibility.
At Microsoft Build 2025, Microsoft announced several updates to the Microsoft Store aimed at improving the developer experience and app distribution.
Microsoft is making multi-agent architectures accessible through Copilot Studio, enabling developers to design systems where multiple AI agents collaborate by delegating and coordinating tasks autonomously.
These multi-agent systems unlock new possibilities for complex workflows such as multi-step automation, distributed problem-solving, and composable AI services that mimic human team dynamics.
Currently in preview, the capability invites developers to experiment with agent orchestration at scale, and it was a highlight of this year's Microsoft Build.
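Copilot Studio itself is configured visually, but the orchestration pattern is easy to illustrate. Below is a plain-Python sketch of one orchestrator delegating sub-tasks to specialist agents and merging their results; the agent names and routing logic are purely illustrative:

```python
# Conceptual multi-agent delegation sketch (illustrative only; Copilot Studio
# wires this up declaratively rather than in code).
from dataclasses import dataclass
from typing import Callable

@dataclass
class Agent:
    name: str
    handle: Callable[[str], str]   # each agent turns a sub-task into a result

def research(task: str) -> str:
    return f"[researcher] key facts gathered for: {task}"

def write(task: str) -> str:
    return f"[writer] draft produced for: {task}"

def review(task: str) -> str:
    return f"[reviewer] draft checked for: {task}"

class Orchestrator:
    def __init__(self, agents: list[Agent]):
        self.agents = agents

    def run(self, goal: str) -> str:
        # Delegate the goal to each specialist in turn and merge the outputs.
        return "\n".join(agent.handle(goal) for agent in self.agents)

team = Orchestrator([Agent("researcher", research),
                     Agent("writer", write),
                     Agent("reviewer", review)])
print(team.run("summarize this week's support tickets"))
```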
Natural Language Web (NLWeb) is a Microsoft initiative introduced at Microsoft Build 2025 aimed at converting traditional websites into agent-driven, conversational interfaces.
By enabling natural language queries and interactions directly against web content, NLWeb empowers developers to build agentic web apps that are more intuitive, context-aware, and interactive.
While the exact involvement of OpenAI remains unclear, NLWeb signals Microsoft’s commitment to embedding conversational AI deeper into everyday web experiences, pushing the boundaries of web interactivity.
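In practice, interacting with an NLWeb-enabled site boils down to sending a natural-language query over HTTP and getting structured results back. The sketch below assumes a hypothetical /ask endpoint and response shape purely for illustration; consult the NLWeb project for the actual contract:

```python
# Hypothetical sketch of querying an NLWeb-enabled site; the /ask path and
# response fields are assumptions, not the project's documented contract.
import requests

def ask_site(base_url: str, query: str) -> list[dict]:
    resp = requests.get(f"{base_url}/ask", params={"query": query}, timeout=10)
    resp.raise_for_status()
    return resp.json().get("results", [])   # assumed: a list of structured items

for item in ask_site("https://example-store.com", "waterproof hiking boots under $150"):
    print(item.get("name"), "-", item.get("url"))
```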
At Microsoft Build 2025, Microsoft unveiled the new GitHub Coding Agent (“Project Padawan”), now generally available for Copilot Enterprise and Pro+ users. Unlike the classic Copilot, Project Padawan autonomously handles low-to-medium complexity tasks in well-tested codebases, such as fixing bugs, implementing incremental features, improving test coverage, and updating documentation.
This marks a significant evolution toward AI as a true collaborative peer, capable of more independent coding contributions rather than just offering suggestions.
Continuing the momentum from Microsoft Build, Copilot Studio is evolving from its low-code/no-code roots into a full-fledged platform for professional developers building complex AI agents.
New APIs for Microsoft 365 Copilot are being introduced, including a retrieval API currently in preview that allows developers to integrate intelligent data retrieval into their agents.
Crucially, Bring Your Own Model (BYOM) support from Azure AI Foundry is now in preview for Copilot Studio, enabling developers to deploy custom-trained models seamlessly within agent workflows, a major milestone for flexible, enterprise-grade AI solutions.
Microsoft is embedding security tools directly into Azure AI Foundry and Copilot Studio to protect AI applications throughout development and deployment, as highlighted at Microsoft Build 2025.
This initiative underscores Microsoft’s commitment to baking AI agent governance, compliance, and security into the AI development lifecycle, not just adding it as an afterthought.
Microsoft Build 2025 made it clear: AI is now a local-first, developer-first priority. With Windows AI Foundry unifying the model lifecycle, Copilot Studio enabling collaborative agents, and expanded hardware acceleration, Microsoft is delivering a full-stack AI platform built for real-world development.
Add in stronger security, open-sourced tools like WSL, and seamless deployment across platforms, and developers now have everything they need to build smarter, faster, and more securely. The future of software is AI-native, and Build 2025 just handed developers the blueprint.