Setting Up Ollama for AI Insights
Learn how to install and configure Ollama to enable AI-powered insights in Umbra Budget.
Umbra Budget uses Ollama to provide AI-powered financial insights while keeping all your data 100% local and private. This guide will walk you through installing Ollama and configuring it to work with Umbra Budget.
What is Ollama?
Ollama is an open-source tool that lets you run large language models (LLMs) locally on your computer. By using Ollama, Umbra Budget can provide intelligent spending analysis, budget recommendations, and financial insights without ever sending your data to external servers.
Benefits of using Ollama:
- Complete Privacy: Your financial data never leaves your device
- No Internet Required: AI features work offline after initial setup
- No API Costs: No subscriptions or usage fees
- Fast Responses: Local processing means quick results
System Requirements
Before installing Ollama, ensure your system meets these requirements:
| Platform | Minimum RAM | Recommended RAM | Storage |
|---|---|---|---|
| macOS | 8 GB | 16 GB | 10 GB |
| Windows | 8 GB | 16 GB | 10 GB |
Note: AI models can be resource-intensive. For the best experience, we recommend at least 16 GB of RAM and an SSD.
Linux Users: While Ollama supports Linux, Umbra Budget for Linux is coming soon. You can install Ollama now and it will be ready when the Linux version of Umbra Budget is released.
Installation
macOS
- Download Ollama from the official website: Visit ollama.ai/download and download the macOS installer.
- Install the application: Open the downloaded .dmg file and drag Ollama to your Applications folder.
- Launch Ollama: Open Ollama from your Applications folder. You'll see the Ollama icon appear in your menu bar.
Alternatively, install via Homebrew:
brew install ollama
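Note that the Homebrew package typically installs just the command-line tools, not the menu bar app, so the server may not start on its own. A minimal way to start it yourself (assuming the Homebrew formula provides a background service; run brew services list to confirm):
# Start Ollama as a background service
brew services start ollama
# Or run the server in the foreground in a terminal
ollama serve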
Windows
- Download the installer from ollama.ai/download
- Run the installer and follow the on-screen instructions
- Launch Ollama from the Start menu
Linux
Install Ollama using the official install script:
curl -fsSL https://ollama.ai/install.sh | sh
Or, if your distribution packages Ollama, using its package manager:
# Ubuntu/Debian
sudo apt install ollama
# Fedora
sudo dnf install ollama
# Arch Linux
yay -S ollama
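On most distributions the official install script also registers Ollama as a systemd service, so the server starts automatically. If the API is not reachable after installation, a quick check (assuming a systemd-based system) is:
# Check whether the Ollama service is running
systemctl status ollama
# Start it now and enable it at boot if needed
sudo systemctl enable --now ollama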
Downloading a Model
After installing Ollama, you need to download a language model. Umbra Budget works best with these models:
Recommended: Llama 3.2 (3B)
A good balance of performance and resource usage:
ollama pull llama3.2
Lightweight: Phi-3 Mini
Best for systems with limited RAM:
ollama pull phi3:mini
Advanced: Llama 3.1 (8B)
Better quality responses, requires more resources:
ollama pull llama3.1
Verifying Installation
To verify Ollama is working correctly:
- Check if Ollama is running:
ollama --version
- Test the model:
ollama run llama3.2 "Hello, how are you?"
- Verify the API is accessible:
curl http://localhost:11434/api/tags
You should see a JSON response listing your installed models.
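The exact output depends on which models you have pulled. With llama3.2 installed, the response looks roughly like the following (values here are illustrative and abbreviated):
{
  "models": [
    {
      "name": "llama3.2:latest",
      "modified_at": "2025-01-15T10:30:00Z",
      "size": 2019393189
    }
  ]
}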
Configuring Umbra Budget
Once Ollama is installed and running:
- Open Umbra Budget and navigate to Settings
- Go to the AI Insights section
- Enable AI Features by toggling the switch
- Select your model from the dropdown menu (it will auto-detect installed models)
- Click Test Connection to verify everything is working
Configuration Options
| Setting | Description | Default |
|---|---|---|
| Model | The Ollama model to use | llama3.2 |
| API URL | Ollama server address | http://localhost:11434 |
| Temperature | Response creativity (0-1) | 0.7 |
| Context Length | How much history to include | 4096 |
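These settings map onto standard Ollama request parameters; temperature and context length correspond to the temperature and num_ctx options of Ollama's /api/generate endpoint. As a rough illustration (the exact request Umbra Budget sends is an assumption, and the prompt is hypothetical), you can reproduce an equivalent call with curl:
# Hypothetical request using the defaults from the table above
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Summarize spending by category for the last month.",
  "stream": false,
  "options": { "temperature": 0.7, "num_ctx": 4096 }
}'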
Using AI Insights
With Ollama configured, you can now use AI features in Umbra Budget:
Ask AI
Click the Ask AI button on the Insights page to ask questions about your finances:
- "How can I reduce my food expenses?"
- "What are my biggest spending categories this month?"
- "Am I on track to meet my savings goal?"
- "Suggest ways to save money based on my spending patterns"
Automatic Insights
The AI will automatically generate insights based on your transaction history:
- Spending pattern analysis
- Budget health assessment
- Savings recommendations
- Trend predictions
Troubleshooting
Ollama not detected
Symptoms: Umbra Budget shows "Ollama not found" or connection errors.
Solutions:
- Ensure Ollama is running (check for the menu bar icon on macOS)
- Verify the API is accessible:
curl http://localhost:11434/api/tags
- Check if the port is blocked by a firewall
- Restart Ollama and try again
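If Ollama is installed but nothing is listening on port 11434, you can start the server manually from a terminal and watch its log output. If you run the server on a non-default address (via the OLLAMA_HOST environment variable), make sure the API URL in Umbra Budget's settings matches it:
# Run the Ollama server in the foreground
ollama serve
# Example: bind to a specific address/port (the API URL setting must match)
OLLAMA_HOST=127.0.0.1:11434 ollama serve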
Slow responses
Symptoms: AI responses take a very long time or the app feels sluggish.
Solutions:
- Try a smaller model like phi3:mini
- Close other resource-intensive applications
- Reduce the context length in settings
- Consider upgrading your RAM
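In recent Ollama versions you can also check whether the model is running on your GPU or has fallen back to the CPU, which is much slower:
# Show loaded models, their memory use, and whether they run on GPU or CPU
ollama ps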
Out of memory errors
Symptoms: Ollama crashes or shows memory errors.
Solutions:
- Use a smaller model
- Close other applications to free up RAM
- Restart your computer and try again
Model not found
Symptoms: Selected model doesn't appear in Umbra Budget.
Solutions:
- Ensure the model is fully downloaded:
ollama list
- Pull the model again:
ollama pull llama3.2
- Restart Umbra Budget
Privacy & Security
When using Ollama with Umbra Budget:
- All processing is local: Your financial data never leaves your device
- No data collection: Ollama doesn't send usage data or analytics
- No internet required: After downloading the model, AI features work offline
- You control your data: Delete the model anytime with ollama rm <model>
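For example, to remove a model you no longer use and confirm it is gone:
# Remove a downloaded model, then list what remains
ollama rm llama3.1
ollama list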
Next Steps
Now that you have Ollama configured, explore these features:
- Understanding AI Insights - Learn how to interpret AI-generated insights
- Customizing Prompts - Create custom AI prompts for your needs
- Privacy Best Practices - Tips for maintaining your financial privacy