
Setting Up Ollama for AI Insights

Learn how to install and configure Ollama to enable AI-powered insights in Umbra Budget.

Umbra Budget uses Ollama to provide AI-powered financial insights while keeping all your data 100% local and private. This guide will walk you through installing Ollama and configuring it to work with Umbra Budget.

What is Ollama?

Ollama is an open-source tool that lets you run large language models (LLMs) locally on your computer. By using Ollama, Umbra Budget can provide intelligent spending analysis, budget recommendations, and financial insights without ever sending your data to external servers.

Benefits of using Ollama:

  • Complete Privacy: Your financial data never leaves your device
  • No Internet Required: AI features work offline after initial setup
  • No API Costs: No subscriptions or usage fees
  • Fast Responses: Local processing means quick results

System Requirements

Before installing Ollama, ensure your system meets these requirements:

Platform   Minimum RAM   Recommended RAM   Storage
macOS      8 GB          16 GB             10 GB
Windows    8 GB          16 GB             10 GB

Note: AI models can be resource-intensive. For the best experience, we recommend at least 16 GB of RAM and an SSD.
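
If you're not sure how much memory your machine has, you can check from a terminal; for example:

# macOS: print total installed RAM in bytes
sysctl -n hw.memsize

# Windows (Command Prompt): show total physical memory
systeminfo | findstr /C:"Total Physical Memory"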

Linux Users: While Ollama supports Linux, Umbra Budget for Linux is coming soon. You can install Ollama now and it will be ready when the Linux version of Umbra Budget is released.

Installation

macOS

  1. Download Ollama from the official website:
    Visit ollama.ai/download and download the macOS installer.
  2. Install the application:
    Open the downloaded .dmg file and drag Ollama to your Applications folder.
  3. Launch Ollama:
    Open Ollama from your Applications folder. You'll see the Ollama icon appear in your menu bar.

Alternatively, install via Homebrew:

brew install ollama
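
Note that the Homebrew formula installs the command-line version rather than the menu bar app, so you may need to start the server yourself:

# Run the Ollama server in the foreground
ollama serve

# Or let Homebrew manage it as a background service
brew services start ollama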

Windows

  1. Download the installer from ollama.ai/download
  2. Run the installer and follow the on-screen instructions
  3. Launch Ollama from the Start menu

Linux

Install Ollama using the official install script:

curl -fsSL https://ollama.ai/install.sh | sh

Or using your package manager, if your distribution packages Ollama:

# Ubuntu/Debian
sudo apt install ollama

# Fedora
sudo dnf install ollama

# Arch Linux
yay -S ollama
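
The official install script typically sets Ollama up as a systemd service. If the API isn't reachable after installation, you can check and start the service manually (assuming your distribution uses systemd):

# Check whether the Ollama service is running
systemctl status ollama

# Start it now and enable it at boot
sudo systemctl enable --now ollama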

Downloading a Model

After installing Ollama, you need to download a language model. Umbra Budget works best with these models:

Recommended: Llama 3.2

A good balance of performance and resource usage:

ollama pull llama3.2

Lightweight: Phi-3 Mini

Best for systems with limited RAM:

ollama pull phi3:mini

Advanced: Llama 3.1 (8B)

Better quality responses, requires more resources:

ollama pull llama3.1
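
Whichever model you choose, you can see what is installed and how much disk space each model uses, and remove any you no longer need:

# List installed models and their sizes
ollama list

# Remove a model to free disk space
ollama rm phi3:mini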

Verifying Installation

To verify Ollama is working correctly:

  1. Check that Ollama is installed:
    ollama --version
    
  2. Test the model:
    ollama run llama3.2 "Hello, how are you?"
    
  3. Verify the API is accessible:
    curl http://localhost:11434/api/tags
    

    You should see a JSON response listing your installed models.
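
As an optional end-to-end check, you can send a small generation request to the same local API that Umbra Budget uses; the prompt here is only an example:

curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Reply with a one-sentence greeting.",
  "stream": false
}'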

Configuring Umbra Budget

Once Ollama is installed and running:

  1. Open Umbra Budget and navigate to Settings
  2. Go to the AI Insights section
  3. Enable AI Features by toggling the switch
  4. Select your model from the dropdown menu (it will auto-detect installed models)
  5. Click Test Connection to verify everything is working

Configuration Options

Setting          Description                            Default
Model            The Ollama model to use                llama3.2
API URL          Ollama server address                  http://localhost:11434
Temperature      Response creativity (0-1)              0.7
Context Length   How much history to include (tokens)   4096
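
For reference, the Temperature and Context Length settings map to Ollama's temperature and num_ctx request options. Here is a rough sketch of the kind of request these settings translate into; the exact payload Umbra Budget sends is an assumption:

curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Summarize my top three spending categories.",
  "stream": false,
  "options": {
    "temperature": 0.7,
    "num_ctx": 4096
  }
}'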

Using AI Insights

With Ollama configured, you can now use AI features in Umbra Budget:

Ask AI

Click the Ask AI button on the Insights page to ask questions about your finances:

  • "How can I reduce my food expenses?"
  • "What are my biggest spending categories this month?"
  • "Am I on track to meet my savings goal?"
  • "Suggest ways to save money based on my spending patterns"

Automatic Insights

The AI will automatically generate insights based on your transaction history:

  • Spending pattern analysis
  • Budget health assessment
  • Savings recommendations
  • Trend predictions

Troubleshooting

Ollama not detected

Symptoms: Umbra Budget shows "Ollama not found" or connection errors.

Solutions:

  1. Ensure Ollama is running (check for the menu bar icon on macOS)
  2. Verify the API is accessible: curl http://localhost:11434/api/tags
  3. Check if the port is blocked by a firewall
  4. Restart Ollama and try again
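
If the connection still fails after these steps, confirm that something is actually listening on Ollama's default port (11434). On macOS or Linux:

# Show which process (if any) is listening on the Ollama port
lsof -i :11434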

Slow responses

Symptoms: AI responses take a very long time or the app feels sluggish.

Solutions:

  1. Try a smaller model like phi3:mini
  2. Close other resource-intensive applications
  3. Reduce the context length in settings
  4. Consider upgrading your RAM
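
You can also check whether the model is running on the GPU or has fallen back to the CPU, which is a common cause of slow responses:

# Show loaded models and whether they are using the GPU or CPU
ollama ps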

Out of memory errors

Symptoms: Ollama crashes or shows memory errors.

Solutions:

  1. Use a smaller model
  2. Close other applications to free up RAM
  3. Restart your computer and try again

Model not found

Symptoms: Selected model doesn't appear in Umbra Budget.

Solutions:

  1. Ensure the model is fully downloaded: ollama list
  2. Pull the model again: ollama pull llama3.2
  3. Restart Umbra Budget

Privacy & Security

When using Ollama with Umbra Budget:

  • All processing is local: Your financial data never leaves your device
  • No data collection: Ollama doesn't send usage data or analytics
  • No internet required: After downloading the model, AI features work offline
  • You control your data: Delete the model anytime with ollama rm <model>

Next Steps

Now that you have Ollama configured, explore these features: