Run AI tasks on large datasets with PromptLoop
Authenticate by including your API key in the `x-api-key` header of all requests. Here's an example using cURL:
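A minimal sketch of an authenticated request, assuming a placeholder base URL (substitute your actual API host) and the `/v0/tasks` endpoint listed in the rate-limit tables below:

```shell
# List available tasks. The x-api-key header carries your API key.
# NOTE: the base URL is a placeholder assumption, not documented above.
curl -H "x-api-key: $PROMPTLOOP_API_KEY" \
  "https://api.promptloop.example/v0/tasks"
```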
Each row of input data has two fields:

- `data_uuid`: a unique identifier for the input.
- `inputs`: an array containing the input data. Tasks specify whether each input should be a link or a string; this is checked in the validation step.
Invalid input data results in a `400 Bad Request` error.
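As a sketch, a single row combining the two required fields might be built like this (the field values are illustrative, and whether each entry in `inputs` must be a link or a string depends on the task):

```python
import json
import uuid

def make_row(inputs):
    """Build one input row with the required data_uuid and inputs fields."""
    return {
        "data_uuid": str(uuid.uuid4()),  # unique identifier for this row
        "inputs": list(inputs),          # link or string, per the task's spec
    }

rows = [make_row(["https://example.com"]), make_row(["some text input"])]
payload = json.dumps(rows)
```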
You can use the `/v0/batches/validate-data` endpoint to validate your input data before submitting a job. The response will indicate whether the data is valid.
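Before calling the endpoint, you can catch the most obvious problems locally. This sketch checks only the two requirements documented above (a unique `data_uuid` and an `inputs` array); the server-side validation remains the source of truth, e.g. for link-vs-string checks per task:

```python
def prevalidate(rows):
    """Return a list of error strings for rows violating documented requirements.

    Mirrors only what the docs state: each row needs a unique data_uuid
    and an inputs array. Server validation remains authoritative.
    """
    errors = []
    seen = set()
    for i, row in enumerate(rows):
        uid = row.get("data_uuid")
        if not uid:
            errors.append(f"row {i}: missing data_uuid")
        elif uid in seen:
            errors.append(f"row {i}: duplicate data_uuid {uid!r}")
        else:
            seen.add(uid)
        if not isinstance(row.get("inputs"), list):
            errors.append(f"row {i}: inputs must be an array")
    return errors
```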
Examples of Validation Errors
To process a batch:

1. **Launch a Job**: Call the `/v0/launch-job` endpoint to start processing. Ensure your data is properly formatted and includes the required fields.
2. **Monitor Job Status**: Check progress with the `/v0/job-status/{id}` endpoint, replacing `{id}` with the job ID you received in step 1.
3. **Retrieve Results**: Fetch the output from the `/v0/batch/{id}/results` endpoint. Each result includes the `data_uuid` passed in with that row. Order is not guaranteed to be maintained during processing, so use this field to match results back to their inputs.

The following rate limits apply to Task endpoints:

| API Method | Endpoint | TPM Restriction |
| --- | --- | --- |
| GET | /v0/tasks | 20 TPM |
| GET | /v0/tasks/{id} | 10 TPM |
| POST | /v0/tasks/{id} | 5 TPM |
Batch endpoints:

| API Method | Endpoint | TPM Restriction |
| --- | --- | --- |
| GET | /v0/batches | 10 TPM |
| POST | /v0/batches | 4 TPM |
| POST | /v0/batches/validate-data | 20 TPM |
| GET | /v0/batches/{id} | 20 TPM |
| PUT | /v0/batches/{id} | 4 TPM |
| GET | /v0/batches/{id}/results | 10 TPM |
If you exceed these limits, you'll receive a `429 Too Many Requests` response, so implement proper error handling in your application (for example, retrying after a delay) to manage these scenarios.
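The launch/monitor/retrieve steps, combined with a backoff on `429`, might look like the following sketch. The HTTP layer is injected so the retry logic stays visible and testable; the endpoint paths come from the docs, but the `status`/`"completed"` field names are assumptions:

```python
import time

def poll_job(job_id, get, interval=5.0, backoff=30.0):
    """Poll /v0/job-status/{id} until the job finishes, then fetch results.

    `get` is any callable mapping a path to (status_code, body_dict), so
    this works with requests, httpx, or a test stub. On a 429 response we
    wait `backoff` seconds instead of the normal polling interval.
    """
    while True:
        code, body = get(f"/v0/job-status/{job_id}")
        if code == 429:  # rate limited: back off before retrying
            time.sleep(backoff)
            continue
        # "status"/"completed" are assumed field names, not documented here.
        if body.get("status") == "completed":
            code, body = get(f"/v0/batch/{job_id}/results")
            return body
        time.sleep(interval)
```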
You can set `output_type=stream` for the `/v0/batch/{id}/results` endpoint. This allows for more efficient transfers.
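Assuming the stream delivers one JSON result per line (the wire format isn't specified here, so newline-delimited JSON is an assumption), consuming it might look like:

```python
import json

def iter_stream_results(lines):
    """Yield one parsed result per non-empty line of a streamed response.

    `lines` can be any iterable of str or bytes -- e.g. the line iterator
    of an HTTP client reading /v0/batch/{id}/results?output_type=stream.
    """
    for line in lines:
        if isinstance(line, bytes):
            line = line.decode("utf-8")
        line = line.strip()
        if line:
            yield json.loads(line)
```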
The `data` field contains a JSON array with the processed results for each input, similar to the response from the `/v0/batch/{id}/results` endpoint. The payload is sent with a `Content-Type: application/json` header.
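If this payload is delivered to an endpoint you host, one option is to index the `data` array by `data_uuid`, since ordering of results is not guaranteed. A sketch (the `result` field name inside each entry is an assumption):

```python
import json

def parse_payload(raw_body):
    """Parse an application/json payload and index results by data_uuid.

    Expects a body with a `data` array containing one entry per input
    row, as documented above.
    """
    payload = json.loads(raw_body)
    return {entry["data_uuid"]: entry for entry in payload.get("data", [])}
```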