APIM ❤️ OpenAI - 🧪 Labs for the GenAI Gateway capabilities of Azure API Management

What's new ✨

➕ the AI Foundry SDK lab
➕ the Content filtering and Prompt shielding labs
➕ the Model routing lab with OpenAI model-based routing
➕ the Prompt flow lab to try the Azure AI Studio Prompt Flow with Azure API Management
➕ priority and weight parameters to the Backend pool load balancing lab
➕ the Streaming tool to test OpenAI streaming with Azure API Management
➕ the Tracing tool to debug and troubleshoot OpenAI APIs using the Azure API Management tracing capability
➕ image processing to the GPT-4o inferencing lab
➕ the Function calling lab with a sample API on Azure Functions

Contents

  1. 🧠 GenAI Gateway
  2. 🧪 Labs
  3. 🚀 Getting started
  4. 🔨 Tools
  5. 🏛️ Well-Architected Framework
  6. 🎒 Show and tell
  7. 🥇 Other Resources

The rapid pace of AI advances demands experimentation-driven approaches for organizations to remain at the forefront of the industry. With AI steadily becoming a game-changer for an array of sectors, maintaining a fast-paced innovation trajectory is crucial for businesses aiming to leverage its full potential.

AI services are predominantly accessed via APIs, underscoring the essential need for a robust and efficient API management strategy. This strategy is instrumental for maintaining control and governance over the consumption of AI services.

With the expanding horizons of AI services and their seamless integration with APIs, there is considerable demand for a comprehensive AI Gateway pattern that broadens the core principles of API management, aiming to accelerate experimentation with advanced use cases and pave the road for further innovation in this rapidly evolving field. The well-architected principles of the AI Gateway provide a framework for the confident deployment of Intelligent Apps into production.

🧠 GenAI Gateway

[AI-Gateway flow diagram]

This repo explores the AI Gateway pattern through a series of experimental labs. The GenAI Gateway capabilities of Azure API Management play a crucial role within these labs, handling AI services' APIs with security, reliability, performance, overall operational efficiency and cost controls. The primary focus is on Azure OpenAI, which sets the standard reference for Large Language Models (LLMs), but the same principles and design patterns could potentially be applied to any LLM.
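
Because the gateway exposes the same API surface as Azure OpenAI, a client typically only needs to point at the Azure API Management endpoint and supply an APIM subscription key. As a rough illustration (not code taken from the labs themselves), a request through the gateway with the openai Python SDK might look like the sketch below; the endpoint, environment variable names, deployment name and API version are placeholders to replace with your own values:

```python
# Minimal sketch: calling Azure OpenAI through an Azure API Management gateway.
# APIM_GATEWAY_URL, APIM_SUBSCRIPTION_KEY, the deployment name and the API
# version are illustrative placeholders, not values defined by this repo.
import os

from openai import AzureOpenAI  # pip install openai

client = AzureOpenAI(
    azure_endpoint=os.environ["APIM_GATEWAY_URL"],  # e.g. https://<apim-name>.azure-api.net/<api-suffix>
    api_key=os.environ["APIM_SUBSCRIPTION_KEY"],    # APIM subscription key (the SDK sends it in the api-key header)
    api_version="2024-02-01",                       # any API version supported by your deployment
)

response = client.chat.completions.create(
    model="gpt-4o",  # the Azure OpenAI *deployment* name behind the gateway
    messages=[{"role": "user", "content": "Hello from the AI Gateway!"}],
)
print(response.choices[0].message.content)
```

The labs wire this up with Bicep and notebooks rather than hand-written scripts, but the calling pattern is the same: the application talks to the gateway, and the gateway applies its policies before forwarding the request to the backing Azure OpenAI deployment.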

🧪 Labs

Acknowledging the rising dominance of Python, particularly in the realm of AI, along with the powerful experimental capabilities of Jupyter notebooks, the following labs are structured around Jupyter notebooks, with step-by-step instructions, Python scripts, Bicep files and Azure API Management policies:

🧪 Backend pool load balancing (built-in): Playground to try the built-in load balancing backend pool functionality of Azure API Management against either a list of Azure OpenAI endpoints or mock servers.
🧪 Advanced load balancing (custom): Playground to try advanced load balancing (based on a custom Azure API Management policy) against either a list of Azure OpenAI endpoints or mock servers.
🧪 Access controlling: Playground to try the OAuth 2.0 authorization feature using an identity provider to enable more fine-grained access to OpenAI APIs by particular users or clients.
🧪 Token rate limiting: Playground to try the token rate limiting policy applied to one or more Azure OpenAI endpoints. When the token usage is exceeded, the caller receives a 429.
🧪 Token metrics emitting: Playground to try the emit token metric policy. The policy sends metrics to Application Insights about the consumption of large language model tokens through Azure OpenAI Service APIs.
🧪 Semantic caching: Playground to try the semantic caching policy. Uses vector proximity of the prompt to previous requests and a specified similarity score threshold.
🧪 Response streaming: Playground to try response streaming with Azure API Management and Azure OpenAI endpoints to explore the advantages and shortcomings associated with streaming.
🧪 Vector searching: Playground to try the Retrieval Augmented Generation (RAG) pattern with Azure AI Search, Azure OpenAI embeddings and Azure OpenAI completions.
🧪 Built-in logging: Playground to try the built-in logging capabilities of Azure API Management. Logs requests into Application Insights to track details and token usage.
🧪 SLM self-hosting (Phi-3): Playground to try the self-hosted Phi-3 Small Language Model (SLM) through the Azure API Management self-hosted gateway with OpenAI API compatibility.
🧪 GPT-4o inferencing: Playground to try the new GPT-4o model. GPT-4o ("o" for "omni") is designed to handle a combination of text, audio, and video inputs, and can generate outputs in text, audio, and image formats.
🧪 Message storing: Playground to test storing message details into Cosmos DB through the Log to event hub policy. With the policy we can control which data will be stored in the DB (prompt, completion, model, region, tokens, etc.).
🧪 Developer tooling (WIP): Playground to try the developer tooling available with Azure API Management to develop, debug, test and publish AI service APIs.
🧪 Function calling: Playground to try the OpenAI function calling feature with an Azure Functions API that is also managed by Azure API Management.
🧪 Model routing: Playground to try routing to a backend based on Azure OpenAI model and version.
🧪 Prompt flow: Playground to try the Azure AI Studio Prompt Flow with Azure API Management.
🧪 Content filtering: Playground to try integrating Azure API Management with the Azure AI Content Safety service to filter potentially offensive, risky, or undesirable content.
🧪 Prompt shielding: Playground to try Prompt Shields from the Azure AI Content Safety service, which analyzes LLM inputs and detects User Prompt attacks and Document attacks, two common types of adversarial inputs.

Each lab includes 🦾 Bicep files, an ⚙️ Azure API Management policy and a 🧾 Jupyter notebook.
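
To give a flavour of how the notebooks exercise a policy, the sketch below shows one way the token rate limiting lab could be driven from Python: keep sending requests through the gateway until it starts answering with 429. This is an illustrative sketch, not code from the lab; the client configuration and names mirror the placeholder example above.

```python
# Rough sketch of driving the token rate limiting policy: send requests until
# the gateway throttles with HTTP 429. All names are illustrative placeholders.
import os

from openai import AzureOpenAI, RateLimitError

client = AzureOpenAI(
    azure_endpoint=os.environ["APIM_GATEWAY_URL"],
    api_key=os.environ["APIM_SUBSCRIPTION_KEY"],
    api_version="2024-02-01",
)

for attempt in range(20):
    try:
        client.chat.completions.create(
            model="gpt-4o",
            messages=[{"role": "user", "content": "Tell me a long story."}],
        )
        print(f"request {attempt}: accepted")
    except RateLimitError:
        print(f"request {attempt}: throttled with 429 - token limit reached")
        break
```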

Backlog of experiments

  • Assistants load balancing
  • Logic Apps RAG
  • Semantic Kernel plugin
  • PII handling
  • Llama inferencing

Tip

Please use the feedback discussion to share your experiences, suggestions, ideas or lab requests, so that we can continuously improve.

🚀 Getting Started

Prerequisites

Quickstart

  1. Clone this repo and configure your local machine with the prerequisites. Or just create a GitHub Codespace and run it in the browser or in VS Code.
  2. Navigate through the available labs and select one that best suits your needs. For starters, we recommend the backend pool load balancing lab.
  3. Open the notebook and run the provided steps.
  4. Tailor the experiment to your requirements. If you wish to contribute to our collective work, we would appreciate a pull request.

Note

🪲 Please feel free to open a new issue if you find something that should be fixed or enhanced.

🔨 Tools

  • AI-Gateway Mock server is designed to mimic the behavior and responses of the OpenAI API, providing an efficient simulation environment for testing and developing the integration with Azure API Management and other use cases. The app.py can be customized to tailor the mock server to specific scenarios.
  • Tracing - invokes the OpenAI API with tracing enabled and returns the trace information.
  • Streaming - invokes the OpenAI API with streaming enabled and returns the response in chunks (see the sketch below).
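
As a point of reference for the Streaming tool, a streaming request through the gateway with the openai Python SDK could look like the sketch below; as before, the endpoint, key variable names and deployment name are placeholders rather than values defined by this repo:

```python
# Minimal streaming sketch: the gateway relays server-sent event chunks from
# Azure OpenAI back to the client. Names are illustrative placeholders.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["APIM_GATEWAY_URL"],
    api_key=os.environ["APIM_SUBSCRIPTION_KEY"],
    api_version="2024-02-01",
)

stream = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Stream a short poem."}],
    stream=True,  # ask for the response as incremental chunks
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()
```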

🏛️ Well-Architected Framework

The Azure Well-Architected Framework is a design framework that can improve the quality of a workload. The following labs are mapped to the Well-Architected Framework pillars (Security, Reliability, Performance, Operations, Costs) to set you up for success through architectural experimentation:

  • Request forwarding
  • Backend circuit breaking
  • Backend pool load balancing
  • Advanced load balancing
  • Response streaming
  • Vector searching
  • Built-in logging
  • SLM self-hosting

🎒 Show and tell

Tip

Install the VS Code Reveal extension, open AI-GATEWAY.md and click on 'slides' at the bottom to present the AI Gateway without leaving VS Code. Or just open AI-GATEWAY.pptx for a plain old PowerPoint experience.

🥇 Other resources

Numerous reference architectures, best practices and starter kits are available on this topic. Please refer to the resources provided if you need comprehensive solutions or a landing zone to initiate your project. We suggest leveraging the AI-Gateway labs to discover additional capabilities that can be integrated into the reference architectures.

We believe that there may be valuable content that we are currently unaware of. We would greatly appreciate any suggestions or recommendations to enhance this list.

🌐 WW GBB initiative

Disclaimer

Important

This software is provided for demonstration purposes only. It is not intended to be relied upon for any purpose. The creators of this software make no representations or warranties of any kind, express or implied, about the completeness, accuracy, reliability, suitability or availability with respect to the software or the information, products, services, or related graphics contained in the software for any purpose. Any reliance you place on such information is therefore strictly at your own risk.
