Offline Chatbots with Ollama
September 10, 2024
Read on Medium
- deepseek
- chatbots
- react
- ollama
- ai

Generative AI tools like ChatGPT, Gemini, and Claude have been game-changers for many use cases, but they don't come without drawbacks. Privacy concerns, costs, and reliance on internet connectivity can be significant hurdles for some users.
Ollama solves these issues by giving users more control, letting them self-host open-source LLMs on their own machines. In just a few steps, you'll have your own free, local LLM at your fingertips. Plus, with Ollama's API and libraries, you can integrate this powerful AI into your own applications.
Getting Started with Ollama
1. Download and Install:
- Head over to Ollama’s official website and download the appropriate version for your device.

2. Run Ollama:
- Once installed, run `ollama` from your terminal to confirm the CLI is available.
3. Choose Your Model:
- Now just choose whichever LLMs you want to download. Some popular ones are Llama 3.1, Phi 3, and Mistral.
- For this example I'll use Llama 3.1. On the model's page you'll see options for the different parameter sizes; I'll go with 8b here, since at 4.7GB it's the most lightweight.

4. Download and Run:
- Copy the command `ollama run llama3.1:8b` and run it in your terminal.
- It will begin downloading the model, and once complete you will be able to chat with it right in the terminal.
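The steps above give you a chat in the terminal, but the same local model is also exposed over Ollama's REST API, which listens on `http://localhost:11434` by default. Here's a minimal sketch of sending a one-off prompt to the `/api/generate` endpoint (the helper names are my own):

```javascript
// Build the JSON body for Ollama's /api/generate endpoint.
// stream: false asks for a single JSON response instead of a token stream.
function buildGenerateRequest(model, prompt) {
  return { model, prompt, stream: false };
}

// Send a prompt to the local Ollama server and return the generated text.
async function generate(prompt) {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest("llama3.1:8b", prompt)),
  });
  const data = await res.json();
  return data.response; // the model's text output
}
```

With the model from step 4 downloaded and Ollama running, `generate("Why is the sky blue?")` resolves to the model's answer as a string.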
Taking It Further: Integrating Ollama into Your Apps
Ollama provides libraries in JavaScript and Python, so you can take it a step further and integrate these LLMs into your own apps. I was inspired to build a free, offline version of ChatGPT and came up with this implementation in React.
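For a chat app like this, you keep a running message history and send the whole conversation to the `/api/chat` endpoint each turn (the official `ollama` npm package wraps this same API). A rough sketch, with helper names of my own, not taken from the project above:

```javascript
// Append a message to an immutable chat history — the non-mutating
// style fits naturally with React state updates.
function appendMessage(history, role, content) {
  return [...history, { role, content }];
}

// Send the full conversation to Ollama's /api/chat endpoint and
// return the assistant's reply message ({ role, content }).
async function chat(history) {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.1:8b",
      messages: history,
      stream: false,
    }),
  });
  const data = await res.json();
  return data.message;
}
```

In a React component, each user submission would call `appendMessage` to update state, pass the new history to `chat`, then append the reply the same way.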

You can find the source code for this project here; it's a practical example of how to leverage Ollama LLMs in your own apps.
Thanks for reading! I’m a full-stack developer specializing in React, TypeScript, and Web3 technologies.
Check out more of my work at mrmendoza.dev
Find my open-source projects on GitHub
Connect with me on LinkedIn
Let me know if this article was helpful, or if there's anything you'd like to see next.