This is the hello-world of ai_sdk. You only need two imports: the provider factory and the
generate_text helper.
## Install dependencies

```shell
pip install ai-sdk openai
```

Make sure the environment variable `OPENAI_API_KEY` is set.
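The provider reads the key from the environment at call time, so a missing key surfaces as a provider error mid-request. A small guard like the one below fails fast with a clearer message — a sketch only; `require_api_key` is not part of ai_sdk:

```python
import os

def require_api_key(name: str = "OPENAI_API_KEY") -> str:
    """Return the named key, or raise before any network call is made."""
    key = os.environ.get(name)
    if not key:
        raise RuntimeError(f"{name} is not set; export it before running")
    return key
```

Call `require_api_key()` once at startup so configuration mistakes are caught immediately rather than on the first generation.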
## Create a model instance & generate text

```python
from ai_sdk import openai, generate_text

# 1. Create the model once and reuse it across calls
model = openai("gpt-4.1-mini")

# 2. Generate a friendly greeting
res = generate_text(model=model, prompt="Say hi to the world in one sentence")
print(res.text)
```
Expected output (your wording may vary):

```
Hello, world – great to see you!
```
## What just happened?
- `openai()` returns a `LanguageModel` wrapper that talks to the OpenAI Chat Completions API.
- `generate_text()` sends the prompt, waits for the response, and returns a typed `GenerateTextResult`.
```python
print(res.finish_reason)  # "stop"
print(res.usage)          # TokenUsage(prompt_tokens=7, completion_tokens=9, ...)
```
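The shape of that result can be pictured roughly as the dataclasses below. This is a hypothetical sketch, not ai_sdk's actual class definitions: only `text`, `finish_reason`, `usage`, `prompt_tokens`, and `completion_tokens` appear in the output above; the `total_tokens` field stands in for the elided `...` and is an assumption.

```python
from dataclasses import dataclass

@dataclass
class TokenUsage:
    # prompt_tokens and completion_tokens are shown in the output above;
    # total_tokens is assumed here for illustration.
    prompt_tokens: int
    completion_tokens: int
    total_tokens: int

@dataclass
class GenerateTextResult:
    text: str
    finish_reason: str  # e.g. "stop"
    usage: TokenUsage
```

Because the result is a plain typed object, downstream code can read token counts or the finish reason directly instead of digging through raw JSON.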
That's it: no async boilerplate, no JSON parsing, no provider-locked SDK calls.