Learn Live: Bring Your Own AI Models to Intelligent Apps on AKS with Kaito
Full series information: https://aka.ms/learnlive-intelligent-...
More info here: https://aka.ms/learnlive-intelligent-...

Join us to learn how to run open-source Large Language Models (LLMs) with HTTP-based inference endpoints inside your AKS cluster using the Kubernetes AI Toolchain Operator (KAITO). We'll walk through the setup and deployment of containerized LLMs on GPU node pools and see how KAITO can help reduce the operational burden of provisioning GPU nodes and tuning model deployment parameters to fit GPU profiles.

---------------------
Learning objectives
  • Learn how to extend existing microservices with AI capabilities.
  • Understand how to use progressive enhancement to integrate AI capabilities into existing applications.
  • Learn how to use open-source or custom Large Language Models (LLMs) with existing applications.
  • Learn how to run open-source or custom Large Language Models on Azure Kubernetes Service (AKS).
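To make the last objective concrete, here is a minimal sketch of what the session walks through: enabling the managed KAITO add-on on an AKS cluster and declaring a KAITO Workspace so the operator provisions a GPU node and exposes an HTTP inference endpoint. The cluster name, resource group, GPU instance type, and the `falcon-7b` preset are illustrative assumptions; check the KAITO documentation for the presets and instance types your region supports.

```shell
# Sketch only -- cluster/resource-group names are placeholders.

# Enable the AI toolchain operator (KAITO) add-on on an existing AKS cluster.
az aks update --name myAKSCluster \
  --resource-group myResourceGroup \
  --enable-ai-toolchain-operator \
  --enable-oidc-issuer

# Declare a KAITO Workspace: the operator provisions a GPU node of the
# requested instance type, pulls the containerized model, and stands up
# an HTTP inference endpoint for it.
kubectl apply -f - <<EOF
apiVersion: kaito.sh/v1alpha1
kind: Workspace
metadata:
  name: workspace-falcon-7b
resource:
  instanceType: Standard_NC12s_v3
  labelSelector:
    matchLabels:
      apps: falcon-7b
inference:
  preset:
    name: falcon-7b
EOF

# When the workspace is ready, the model is reachable through a
# cluster service named after the workspace.
kubectl get svc workspace-falcon-7b
```

This is the pattern the session demonstrates: the Workspace custom resource captures both the GPU profile and the model choice, so KAITO handles node provisioning and deployment tuning rather than you sizing node pools by hand.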
---------------------
Presenters
  • Paul Yu, Senior Cloud Advocate, Microsoft
  • Ishaan Sehgal, Software Engineer, Microsoft
Moderators
  • Steven Murawski, Principal Cloud Advocate, Microsoft


Microsoft Reactor
