Chat completion
Chat completion (or text completion) is simple with the Middlebop REST API. You can use your Middlebop API key with any supported model.
See our supported models
See the message syntax
OpenAI
You can create a chat completion by passing your Middlebop API key in the x-api-key
header and an array of messages in the POST body.
Make a POST call to https://api.middlebop.com/v1/chat/openai/
curl -X POST "https://api.middlebop.com/v1/chat/openai/" \
  -H "Content-Type: application/json" \
  -H "x-api-key: your_api_key_here" \
  -d '{
    "model": "gpt-4",
    "messages": [
      {
        "role": "user",
        "content": {
          "type": "text",
          "text": "Hello, who are you?"
        }
      }
    ]
  }'
Response
The response contains an array of response messages. In most cases the array holds a single text response, but when using function calling the model may return multiple function calls if it supports them.
{
  "responseMessages": [
    {
      "role": "assistant",
      "content": {
        "type": "text",
        "text": "Hello! I'm an AI developed by OpenAI. How can I help you today?"
      }
    }
  ]
}
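If you prefer calling the API from code rather than curl, the same request works with any HTTP client. Below is a minimal sketch in Python using the requests library. The endpoint, x-api-key header, and responseMessages field are taken from the examples above; the get_reply helper name is purely illustrative.

import requests

API_KEY = "your_api_key_here"  # your Middlebop API key
URL = "https://api.middlebop.com/v1/chat/openai/"

def get_reply(prompt: str) -> str:
    """Illustrative helper: send one user message and return the text reply."""
    payload = {
        "model": "gpt-4",
        "messages": [
            {"role": "user", "content": {"type": "text", "text": prompt}}
        ],
    }
    # requests sets Content-Type: application/json automatically when using json=
    resp = requests.post(URL, headers={"x-api-key": API_KEY}, json=payload)
    resp.raise_for_status()
    # The body holds a responseMessages array; take the first text message.
    first = resp.json()["responseMessages"][0]
    return first["content"]["text"]

print(get_reply("Hello, who are you?"))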
Google
Chat completions with Google models are just as simple using your Middlebop API key; the same key works for any model.
The only differences are the model in the request body and the endpoint.
Make a POST call to https://api.middlebop.com/v1/chat/google/
curl -X POST "https://api.middlebop.com/v1/chat/google/" \
  -H "Content-Type: application/json" \
  -H "x-api-key: your_api_key_here" \
  -d '{
    "model": "gemini-pro",
    "messages": [
      {
        "role": "user",
        "content": {
          "type": "text",
          "text": "Hello, who are you?"
        }
      }
    ]
  }'
Response
As with the OpenAI endpoint, the response contains an array of response messages. In most cases the array holds a single text response, but function calling may return multiple function calls if the model supports them.
{
  "responseMessages": [
    {
      "role": "assistant",
      "content": {
        "type": "text",
        "text": "Hello! I'm Gemini. How can I help you today?"
      }
    }
  ]
}
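Because both endpoints accept the same payload shape, a thin wrapper can switch providers by swapping only the endpoint path and model name. Here is a rough sketch, again in Python with requests; the provider-to-model mapping is only an example, not a fixed part of the API.

import requests

API_KEY = "your_api_key_here"
BASE_URL = "https://api.middlebop.com/v1/chat"

# Example defaults; pick whichever supported models you need.
DEFAULT_MODELS = {"openai": "gpt-4", "google": "gemini-pro"}

def chat(provider: str, text: str, model: str | None = None) -> list:
    """Send one user message to the chosen provider and return responseMessages."""
    payload = {
        "model": model or DEFAULT_MODELS[provider],
        "messages": [
            {"role": "user", "content": {"type": "text", "text": text}}
        ],
    }
    resp = requests.post(
        f"{BASE_URL}/{provider}/",
        headers={"x-api-key": API_KEY},
        json=payload,
    )
    resp.raise_for_status()
    return resp.json()["responseMessages"]

# Same call, different provider: only the endpoint and model change.
print(chat("openai", "Hello, who are you?"))
print(chat("google", "Hello, who are you?"))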