TsgcHTTP_API_OpenAI › Methods › CreateBatch
Submits a new Batch job for asynchronous bulk inference at reduced cost
function CreateBatch(const aRequest: TsgcOpenAIClass_Request_Batch) : TsgcOpenAIClass_Response_Batch;
| Name | Type | Description |
|---|---|---|
| aRequest | const TsgcOpenAIClass_Request_Batch | Batch configuration: InputFileId (id of a previously uploaded JSONL file), Endpoint (e.g. /v1/chat/completions), CompletionWindow and Metadata. |
Returns the newly created Batch object, including its Id and initial Status (usually validating) (TsgcOpenAIClass_Response_Batch).
Calls POST /v1/batches to queue a large set of requests packaged as a JSONL file uploaded beforehand through UploadFile. The service processes the batch within the specified CompletionWindow (typically 24h) and, on success, produces an output file id containing the results. Poll RetrieveBatch to track progress.
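For reference, the uploaded input file is expected to be JSONL, one request per line. A minimal sketch of two chat-completion requests (the custom_id values and model name are illustrative, not taken from this documentation):

```json
{"custom_id": "req-1", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Hello"}]}}
{"custom_id": "req-2", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Goodbye"}]}}
```

Each custom_id is echoed back in the output file so individual responses can be matched to their originating requests.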
```pascal
var
  oRequest: TsgcOpenAIClass_Request_Batch;
  oBatch: TsgcOpenAIClass_Response_Batch;
begin
  oRequest := TsgcOpenAIClass_Request_Batch.Create;
  try
    // id of a JSONL file previously uploaded through UploadFile
    oRequest.InputFileId := 'file_abc123';
    oRequest.Endpoint := '/v1/chat/completions';
    oRequest.CompletionWindow := '24h';
    // queue the batch; the response carries the new batch id and status
    oBatch := oAPI.CreateBatch(oRequest);
    ShowMessage(oBatch.Id + ' - ' + oBatch.Status);
  finally
    oRequest.Free;
  end;
end;
```
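Since batches complete asynchronously, the returned id is typically polled with RetrieveBatch until a terminal status is reached. A minimal sketch, assuming RetrieveBatch takes the batch id and the response exposes Status and OutputFileId properties (the terminal status names follow the OpenAI Batch API: completed, failed, expired, cancelled):

```pascal
var
  oBatch: TsgcOpenAIClass_Response_Batch;
  vStatus: string;
begin
  repeat
    Sleep(60000); // poll once per minute; adjust to taste
    oBatch := oAPI.RetrieveBatch('batch_abc123');
    vStatus := oBatch.Status;
  until (vStatus = 'completed') or (vStatus = 'failed') or
        (vStatus = 'expired') or (vStatus = 'cancelled');

  if vStatus = 'completed' then
    // download the results with the output file id (e.g. via DownloadFile)
    ShowMessage('Output file: ' + oBatch.OutputFileId)
  else
    ShowMessage('Batch ended with status: ' + vStatus);
end;
```

In a GUI application, prefer a TTimer or background thread over Sleep so the main thread stays responsive.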