Azure Updates
Azure Functions now supports creating an Azure OpenAI resource and an optional Azure AI Search vector store from the Flex Consumption portal experience. This makes it easier to build intelligent applications for text completion, chat with assistants, or retrieval-augmented generation (RAG) when you want to use your own company data with Azure OpenAI.
You can build intelligent apps using the native Azure OpenAI SDKs or use the updated Azure Functions OpenAI bindings to jumpstart development of the following scenarios:
Chat assistants
- Output binding to post new messages and chat with the LLM.
- Input binding to retrieve chat history from persisted storage.
- Assistant skills trigger to extend the LLM's capabilities through natural language.
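As a sketch of the flow behind the chat bindings (not the bindings' actual API -- the endpoint, deployment name, and helper below are hypothetical illustrations), a function can combine the history an input binding would return from storage with a new user message into an Azure OpenAI chat completions request:

```python
import json

# Hypothetical resource values for illustration only.
ENDPOINT = "https://my-resource.openai.azure.com"
DEPLOYMENT = "my-gpt-deployment"
API_VERSION = "2024-06-01"

def build_chat_request(history, user_message):
    """Combine persisted chat history with a new user message into the
    URL and JSON body of an Azure OpenAI chat completions call."""
    url = (
        f"{ENDPOINT}/openai/deployments/{DEPLOYMENT}"
        f"/chat/completions?api-version={API_VERSION}"
    )
    messages = history + [{"role": "user", "content": user_message}]
    return url, json.dumps({"messages": messages})

# History as the chat-history binding might return it from persisted storage.
history = [{"role": "system", "content": "You are a helpful assistant."}]
url, body = build_chat_request(history, "Summarize my open support tickets.")
```

In a real function the bindings handle the storage round trip and the HTTP call; this only shows the request shape the extension builds on your behalf.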
Retrieval-augmented generation (bring your own data for semantic search)
- Data ingestion with Functions bindings.
- Automatic chunking and embeddings creation.
- Storage of embeddings in vector databases including Azure AI Search, Azure Cosmos DB for MongoDB, and Azure Data Explorer.
- Input binding that takes a prompt, retrieves relevant documents, sends them to the OpenAI LLM, and returns the response to the user.
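The ingestion and retrieval steps above can be sketched in plain Python. This illustrates the pattern rather than the bindings themselves: the chunker approximates the automatic chunking, and the toy vectors stand in for embeddings an Azure OpenAI embeddings deployment would produce:

```python
import math

def chunk(text, size=400, overlap=50):
    """Split text into overlapping chunks before embedding, approximating
    the automatic chunking the data-ingestion binding performs."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_k(query_vec, store, k=3):
    """Return the k chunks whose embeddings are closest to the query,
    as a vector store such as Azure AI Search would on retrieval."""
    return sorted(store, key=lambda item: cosine(query_vec, item[1]), reverse=True)[:k]
```

The retrieved chunks are then prepended to the prompt that goes to the LLM, which is exactly what the semantic-search input binding automates end to end.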
Text completion for content summarization and creation
- Input binding that takes a prompt or content and returns the response from the LLM.
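As an example, a text completion input binding can be declared in function.json roughly as follows. This is a sketch based on the extension's samples; exact property names and the model/deployment settings may differ across preview versions:

```json
{
  "type": "textCompletion",
  "direction": "in",
  "name": "TextCompletionResponse",
  "prompt": "Who is {name}?",
  "maxTokens": "100"
}
```

The function then simply returns `TextCompletionResponse`; the binding handles the call to the Azure OpenAI deployment.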
These bindings have been improved in the latest preview to support managed identity and to use the latest underlying SDKs, along with multiple fixes based on customer feedback.
Azure ID: 467813 | Product category(s): Compute, Containers, Internet of Things | Update type(s): Features, Microsoft Ignite | Added to roadmap: 11/19/2024 | Last modified: 11/19/2024