
Overview

OpenAI-like models are LLM providers that implement the OpenAI API specification. You can connect to any such endpoint with the standard OpenAIChatModel by passing custom base_url and api_key parameters.

Model Class: OpenAIChatModel
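Because these providers all implement the same chat-completions interface, the request that a client ultimately sends has a fixed shape regardless of vendor. A minimal sketch of that payload using only the standard library (the endpoint URL and model name below are placeholders, not real values):

```python
import json

# Placeholder values -- substitute your provider's endpoint and model.
BASE_URL = "https://your-custom-endpoint.com/v1"
MODEL_NAME = "your-model-name"

def build_chat_request(prompt: str) -> dict:
    """Build an OpenAI-style /chat/completions request body."""
    return {
        "model": MODEL_NAME,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("Hello, how are you?")
# The body would be POSTed to f"{BASE_URL}/chat/completions"
# with an "Authorization: Bearer <api_key>" header.
print(json.dumps(payload, indent=2))
```

This is why a single model class can cover many vendors: only the base URL and credentials change, not the request shape.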

Authentication

from os import getenv
from upsonic.models.openai import OpenAIChatModel
from upsonic.providers.openai import OpenAIProvider

model = OpenAIChatModel(
    model_name="your-model-name",
    provider=OpenAIProvider(api_key=getenv("YOUR_API_KEY"), base_url="https://your-custom-endpoint.com/v1")
)

Examples

from os import getenv
from upsonic import Agent, Task
from upsonic.models.openai import OpenAIChatModel
from upsonic.providers.openai import OpenAIProvider

model = OpenAIChatModel(
    model_name="your-model-name",
    provider=OpenAIProvider(api_key=getenv("YOUR_API_KEY"), base_url="https://your-custom-endpoint.com/v1")
)

agent = Agent(model=model)
task = Task("Hello, how are you?")
result = agent.do(task)
print(result)

Parameters

| Parameter | Type | Description | Default | Source |
| --- | --- | --- | --- | --- |
| `model_name` | `str` | Model identifier | Required | Base |
| `api_key` | `str` | API key for authentication | Required | Base |
| `base_url` | `str` | Custom endpoint URL | Required | Base |
| `max_tokens` | `int` | Maximum tokens to generate | Model default | Base |
| `temperature` | `float` | Sampling temperature (0.0-2.0) | 1.0 | Base |
| `top_p` | `float` | Nucleus sampling threshold | 1.0 | Base |
| `seed` | `int` | Random seed for reproducibility | None | Base |
| `stop_sequences` | `list[str]` | Sequences that stop generation | None | Base |
| `presence_penalty` | `float` | Penalize token presence (-2.0 to 2.0) | 0.0 | Base |
| `frequency_penalty` | `float` | Penalize token frequency (-2.0 to 2.0) | 0.0 | Base |
| `parallel_tool_calls` | `bool` | Allow parallel tool execution | True | Base |
| `timeout` | `float` | Request timeout in seconds | 600 | Base |
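The sampling parameters above map one-to-one onto fields of the same name in an OpenAI-style request body. A hedged sketch of merging a subset of them with the table's defaults (the helper and its validation are illustrative, not Upsonic's internals):

```python
# Defaults for a subset of the parameters, taken from the table above.
DEFAULTS = {
    "temperature": 1.0,
    "top_p": 1.0,
    "presence_penalty": 0.0,
    "frequency_penalty": 0.0,
    "parallel_tool_calls": True,
}

def apply_settings(body: dict, **overrides) -> dict:
    """Merge sampling settings into an OpenAI-style request body.

    Unknown keys are rejected; temperature is range-checked per the
    table (0.0 to 2.0). Illustrative only.
    """
    unknown = set(overrides) - set(DEFAULTS)
    if unknown:
        raise ValueError(f"unknown settings: {unknown}")
    settings = {**DEFAULTS, **overrides}
    if not 0.0 <= settings["temperature"] <= 2.0:
        raise ValueError("temperature must be in [0.0, 2.0]")
    return {**body, **settings}

body = apply_settings(
    {"model": "your-model-name", "messages": []},
    temperature=0.2,
)
```

Fields left at "Model default" or None in the table (such as max_tokens and seed) are simply omitted from the request so the provider applies its own defaults.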