List all batch jobs for the authenticated user
from together import Together
import os

# Create a client using the API key from the environment
client = Together(
    api_key=os.environ.get("TOGETHER_API_KEY"),
)

# List all batch jobs and print their IDs
batches = client.batches.list_batches()
for batch in batches:
    print(batch.id)
[
  {
    "id": "01234567-8901-2345-6789-012345678901",
    "user_id": "user_789xyz012",
    "input_file_id": "file-input123abc456def",
    "file_size_bytes": 1048576,
    "status": "IN_PROGRESS",
    "job_deadline": "2024-01-15T15:30:00Z",
    "created_at": "2024-01-15T14:30:00Z",
    "endpoint": "/v1/chat/completions",
    "progress": 75,
    "model_id": "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
    "output_file_id": "file-output789xyz012ghi",
    "error_file_id": "file-errors456def789jkl",
    "error": "<string>",
    "completed_at": "2024-01-15T15:45:30Z"
  }
]
Bearer authentication header of the form Bearer <token>, where <token> is your auth token.
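When calling the REST API directly rather than through the SDK, you pass this token in the Authorization header yourself. A minimal sketch, assuming the list endpoint is GET https://api.together.xyz/v1/batches (the SDK resolves the exact path for you) and that the body is the bare JSON array shown above:

import os
import requests

# Assumption: the batch list endpoint is GET /v1/batches on the public API host.
url = "https://api.together.xyz/v1/batches"
headers = {
    "Authorization": f"Bearer {os.environ['TOGETHER_API_KEY']}",
}

response = requests.get(url, headers=headers)
response.raise_for_status()

# Assumption: the response body is the bare JSON array shown above.
for batch in response.json():
    print(batch["id"], batch["status"])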
OK
The response is of type object[].
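Each element of the returned list carries the fields shown in the sample response (status, progress, output_file_id, and so on), so you can filter batch jobs client-side. A short sketch, assuming the SDK exposes those fields as attributes with the same names as in the JSON:

from together import Together
import os

client = Together(api_key=os.environ.get("TOGETHER_API_KEY"))

# Assumption: each batch object exposes the JSON fields above as attributes.
for batch in client.batches.list_batches():
    if batch.status == "IN_PROGRESS":
        print(f"{batch.id}: {batch.progress}% complete")
    else:
        print(f"{batch.id}: {batch.status}")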