
Overview

Ollama lets you run large language models locally on your machine, which makes it well suited to development, testing, and privacy-sensitive applications.

Model class: OpenAIChatModel (Ollama exposes an OpenAI-compatible API)

Authentication

export OLLAMA_BASE_URL="http://localhost:11434/v1/"  # Optional, this is the default
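Since the variable is optional, client code typically falls back to the default shown above when it is unset. A minimal sketch of that lookup (the helper name `resolve_base_url` is illustrative, not part of the library):

```python
import os

# Documented default endpoint for Ollama's OpenAI-compatible API.
DEFAULT_OLLAMA_BASE_URL = "http://localhost:11434/v1/"


def resolve_base_url() -> str:
    # Prefer the environment variable; fall back to the default.
    return os.environ.get("OLLAMA_BASE_URL", DEFAULT_OLLAMA_BASE_URL)


print(resolve_base_url())
```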

Examples

from upsonic import Agent, Task
from upsonic.models.openai import OpenAIChatModel

# Point the OpenAI-compatible model class at a locally running Ollama model
model = OpenAIChatModel(model_name="llama3.2", provider="ollama")
agent = Agent(model=model)

# Run a single task and print the agent's response
task = Task("Hello, how are you?")
result = agent.do(task)
print(result)

Parameters

| Parameter | Type | Description | Default | Source |
|---|---|---|---|---|
| `max_tokens` | `int` | Maximum tokens to generate | Model default | Base |
| `temperature` | `float` | Sampling temperature | 0.8 | Base |
| `top_p` | `float` | Nucleus sampling | 0.9 | Base |
| `seed` | `int` | Random seed | None | Base |
| `stop_sequences` | `list[str]` | Stop sequences | None | Base |
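These settings map onto fields of the OpenAI-compatible chat completions request body that Ollama serves at `/v1/chat/completions`. A sketch of how they might appear in a raw request payload (field names follow the OpenAI API; the values are illustrative, not recommendations):

```python
# Illustrative request body for Ollama's OpenAI-compatible
# /v1/chat/completions endpoint; values are examples only.
payload = {
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Hello, how are you?"}],
    "max_tokens": 256,    # cap on tokens generated in the reply
    "temperature": 0.8,   # sampling temperature (table default)
    "top_p": 0.9,         # nucleus sampling cutoff (table default)
    "seed": 42,           # fixed seed for more reproducible sampling
    "stop": ["\n\n"],     # stop sequences end generation early
}
print(sorted(payload))
```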