Get started prompt engineering with local LLMs

Feb 16, 2024

Ollama makes it easy to run LLMs locally and provides experimental compatibility with OpenAI's APIs.
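Because that compatibility layer mirrors the OpenAI chat completions endpoint, the official `openai` Python client can talk to a local Ollama server simply by changing its base URL. Below is a minimal sketch, assuming Ollama is running on its default port (11434) and a model such as `llama2` has already been pulled; the model name is illustrative, and the API key is a placeholder the client requires but Ollama ignores.

```python
from openai import OpenAI

# Point the OpenAI client at the local Ollama server's
# OpenAI-compatible endpoint instead of api.openai.com.
client = OpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",  # required by the client, ignored by Ollama
)

# Send a basic chat completion request to a locally pulled model.
response = client.chat.completions.create(
    model="llama2",  # assumption: any model you have pulled with `ollama pull`
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Why is the sky blue?"},
    ],
)

print(response.choices[0].message.content)
```

The same pattern applies to other OpenAI client libraries: keep the request shape unchanged and redirect the base URL to the local server.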