POST /batches
from together import Together
import os

client = Together(
    api_key=os.environ.get("TOGETHER_API_KEY"),
)

# "file_id" is the ID of a previously uploaded JSONL input file
batch = client.batches.create_batch("file_id", endpoint="/v1/chat/completions")

print(batch.id)

{
  "job": {
    "id": "01234567-8901-2345-6789-012345678901",
    "user_id": "user_789xyz012",
    "input_file_id": "file-input123abc456def",
    "file_size_bytes": 1048576,
    "status": "IN_PROGRESS",
    "job_deadline": "2024-01-15T15:30:00Z",
    "created_at": "2024-01-15T14:30:00Z",
    "endpoint": "/v1/chat/completions",
    "progress": 75,
    "model_id": "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
    "output_file_id": "file-output789xyz012ghi",
    "error_file_id": "file-errors456def789jkl",
    "error": "<string>",
    "completed_at": "2024-01-15T15:45:30Z"
  },
  "warning": "<string>"
}
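The 201 body above nests the job record under a "job" key, with an optional top-level "warning". A minimal sketch of parsing such a response with the standard json module (field values copied from the example; "warning" shown as null here):

```python
import json

# Example 201 response body, trimmed to a few fields from the sample above
response_body = """
{
  "job": {
    "id": "01234567-8901-2345-6789-012345678901",
    "status": "IN_PROGRESS",
    "progress": 75,
    "output_file_id": "file-output789xyz012ghi"
  },
  "warning": null
}
"""

payload = json.loads(response_body)
job = payload["job"]

# A job still running reports IN_PROGRESS and a percentage progress value
print(job["status"], job["progress"])  # IN_PROGRESS 75

# "warning" may be null even on a successful 201, so guard before using it
if payload.get("warning"):
    print("warning:", payload["warning"])
```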

Authorizations

Authorization
string · header · default: default · required

Bearer authentication header of the form Bearer <token>, where <token> is your auth token.
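For callers not using the SDK, the header can be built by hand. A minimal sketch; the base URL below is an assumption to illustrate the header form, so check the current API reference for the exact host and path:

```python
import os

# Assumed base URL for illustration only
BASE_URL = "https://api.together.xyz/v1"

# Falls back to a placeholder token when the env var is unset
token = os.environ.get("TOGETHER_API_KEY", "example-token")

# Bearer authentication header of the form "Bearer <token>"
headers = {
    "Authorization": f"Bearer {token}",
    "Content-Type": "application/json",
}
```

These headers would accompany the POST to the batches endpoint in any HTTP client.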

Body

application/json

Response

201
application/json

Job created (potentially with warnings)

The response is of type object.
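Since the example job above is returned with status IN_PROGRESS, a client typically polls until the status changes. A hedged sketch of that loop, written against any status-fetching callable rather than a specific SDK method (the exact retrieval call and the set of terminal status strings are assumptions; consult the SDK reference):

```python
import time

def wait_for_batch(get_batch, batch_id, poll_seconds=10, max_polls=360):
    """Poll a batch job until it leaves IN_PROGRESS.

    `get_batch` is any callable taking a batch ID and returning an object
    with a `.status` attribute -- for example a thin wrapper around the
    SDK's batch-retrieval method (name assumed, not confirmed here).
    """
    for _ in range(max_polls):
        job = get_batch(batch_id)
        if job.status != "IN_PROGRESS":
            return job  # terminal state reached (e.g. completed or failed)
        time.sleep(poll_seconds)
    raise TimeoutError(f"batch {batch_id} still running after {max_polls} polls")
```

Decoupling the loop from the SDK keeps it testable with a stub and resilient to client-library changes.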