Run AI models on your local machine with Ollama [Pt 5]
22 likes · 840 views · Oct 15, 2024
Learn how to seamlessly integrate local AI models into your development workflow using Ollama. You'll see how to download, run, and interact with powerful AI models on your machine while maintaining compatibility with OpenAI's API. We'll explore the Phi-3 model family and discover how you can use it to build prototypes and experiment with AI applications.

Links:
Watch this series' playlist: https://aka.ms/genai-js
All slides and code samples: https://github.com/microsoft/generati...

#ai #phi3 #openai

Chapters:
00:00 Introduction to Local AI Models
00:12 Benefits of Using Local Models
00:52 Overview of Phi-3 Model Family
01:30 Introduction to Ollama
02:10 Installing Ollama and Downloading Models
03:10 Running a UI with Ollama
04:20 Using Ollama's HTTP API
05:50 OpenAI Compatible API Features
06:40 Next Steps with Ollama and Phi-3
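As a companion to the HTTP API chapter, here is a minimal sketch of calling Ollama's local REST endpoint from Node.js or TypeScript. It assumes Ollama is running on its default port (11434) and that the phi3 model has already been downloaded with `ollama pull phi3`; the prompt is illustrative.

```typescript
// Minimal sketch: call Ollama's local /api/generate endpoint.
// Assumes Ollama is running on localhost:11434 and `ollama pull phi3` has been run.
async function generate(prompt: string): Promise<string> {
  const response = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "phi3",   // the Phi-3 model covered in the video
      prompt,
      stream: false,   // return one JSON object instead of a token stream
    }),
  });
  const data = await response.json();
  return data.response; // the generated text
}

generate("Explain what a local AI model is in one sentence.")
  .then(console.log)
  .catch(console.error);
```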

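For the OpenAI-compatible API mentioned in the description, a sketch along these lines works with the official openai npm package by pointing its base URL at the local Ollama server. The model name and prompt are illustrative; Ollama ignores the API key, but the client requires a non-empty value.

```typescript
// Sketch: use the openai package against Ollama's OpenAI-compatible endpoint.
import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "http://localhost:11434/v1",
  apiKey: "ollama", // placeholder; Ollama does not check it
});

async function chat(): Promise<void> {
  const completion = await openai.chat.completions.create({
    model: "phi3",
    messages: [{ role: "user", content: "Write a haiku about local AI models." }],
  });
  console.log(completion.choices[0].message.content);
}

chat().catch(console.error);
```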

Microsoft Developer · 588K subscribers