Calling Models Through APIs
Learn how applications connect to hosted model APIs, including requests, parameters, and basic error handling.
Explanation
Most production apps connect to a model through an API rather than running the model directly.
You must send the right payload, handle errors gracefully, and log important metadata.
Application code should separate prompt creation from transport logic.
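One way to sketch that separation: keep prompt construction in a pure function that only builds the request body, and keep networking in a second function that owns headers, timeouts, and error handling. The function names and payload fields below are illustrative, not a specific provider's API.

```python
def build_payload(question: str, temperature: float = 0.2) -> dict:
    """Prompt logic: build the request body. No networking happens here."""
    return {
        "input": f"Answer clearly and concisely.\n\nQuestion: {question}",
        "temperature": temperature,
    }

def send_payload(endpoint: str, api_key: str, payload: dict) -> dict:
    """Transport logic: owns headers, timeouts, and HTTP error handling."""
    import requests  # imported here so prompt-building stays dependency-free
    response = requests.post(
        endpoint,
        json=payload,
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()
```

Because `build_payload` touches no network, you can unit-test your prompt logic without mocking HTTP, and swap providers by changing only `send_payload`.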
Why this topic matters in practice
In generative AI products, the model is only one part of the system. The surrounding workflow determines whether the output is useful, safe, and maintainable. This lesson matters because it helps you connect the idea to tasks such as tutoring, search, copilots, business assistants, and production automation.
Examples
Web app
A tutorial site sends a user question to a backend service, then returns the answer to the browser.
Batch processing
A script loops through customer comments and summarizes each one through an API.
Back-office automation
An internal app sends policy text and a question, then stores the answer for review.
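The batch-processing scenario above can be sketched as a loop that summarizes each comment and records failures without halting the run. Here `call_model` is a stand-in for any API caller with the signature used later in this lesson; the payload fields are illustrative.

```python
def summarize_comments(comments, call_model, endpoint, api_key):
    """Summarize each comment; one failed request must not stop the batch."""
    results = []
    for comment in comments:
        payload = {"input": f"Summarize: {comment}", "temperature": 0.2}
        try:
            results.append(call_model(endpoint, api_key, payload))
        except Exception as err:  # in production, catch specific error types
            results.append({"error": str(err), "comment": comment})
    return results
```

Recording errors alongside successes keeps the output aligned with the input list, which makes it easy to retry only the failed items later.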
A generic API request example in Python
The code below is intentionally concise so the underlying pattern stays clear. It focuses on the application logic you can reuse, even if you later switch model providers or deployment environments.
```python
import requests

def call_model(endpoint, api_key, payload):
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json"
    }
    response = requests.post(endpoint, json=payload, headers=headers, timeout=30)
    response.raise_for_status()
    return response.json()

payload = {
    "input": "Explain embeddings in plain English.",
    "temperature": 0.2
}

# Replace with your actual endpoint and key.
# print(call_model("https://your-model-endpoint", "YOUR_API_KEY", payload))
```
How the coding section works
- The request function is generic so you can adapt it to your model provider.
- Separating headers, payload, and endpoint makes the code easier to maintain.
- Always add timeout handling and response checks in production.
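One common production hardening step is a retry wrapper around the call, since transient network or rate-limit errors often succeed on a second attempt. This is a minimal sketch assuming the caller raises an exception on failure (as `raise_for_status` does); the retry counts and backoff are illustrative defaults.

```python
import time

def call_with_retries(call_fn, payload, max_attempts=3, backoff_seconds=1.0):
    """Retry transient failures with simple exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return call_fn(payload)
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts: surface the original error
            # wait 1x, 2x, 4x ... the base delay between attempts
            time.sleep(backoff_seconds * 2 ** (attempt - 1))
```

In a real system you would retry only on errors known to be transient (timeouts, HTTP 429/5xx) rather than on every exception.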
Implementation advice
When turning this lesson into a real feature, think beyond the code snippet itself. Decide what inputs should be allowed, how you will validate outputs, how you will recover from errors, and how you will measure whether the feature is actually helping users. Those surrounding choices often determine whether an AI feature feels polished or unreliable.
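Two of those surrounding choices, validating outputs and logging for debugging, can be sketched together. The response shape and log field names below are hypothetical; adapt them to whatever your provider actually returns.

```python
import json
import logging
import time
import uuid

def validate_and_log(response: dict, started_at: float) -> str:
    """Reject unusable responses and log request metadata for debugging."""
    request_id = str(uuid.uuid4())
    text = response.get("output")  # assumed field name; varies by provider
    if not isinstance(text, str) or not text.strip():
        raise ValueError("model response missing usable 'output' field")
    logging.info(json.dumps({
        "request_id": request_id,
        "latency_ms": round((time.time() - started_at) * 1000),
        "output_chars": len(text),
    }))
    return text
```

Logging structured JSON (rather than free text) makes it straightforward to query latency and failure patterns later.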
Summary / key takeaways
- Model APIs are the bridge between your application and the model.
- Reliable integrations need timeouts, status checks, and logging.
- Keep prompt logic separate from transport code so both can evolve cleanly.
Exercises
- Why is timeout handling important in a model API call?
- Add a try/except block around the example function.
- List three fields you might want to log for debugging model requests.