Send a structured list of input messages with text content, and the model will generate the next message in the conversation.
Send a simple "Hello!" message to a local Ollama model.
Ollama := TsgcHTTP_API_Ollama.Create(nil);
try
  Ollama.OllamaOptions.Host := 'http://localhost:11434';
  WriteLn(Ollama._CreateMessage('llama3', 'Hello!'));
finally
  Ollama.Free;
end;
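Under the hood, calls like this target Ollama's REST API. As a rough sketch of the equivalent raw request, assuming Ollama's documented POST /api/chat endpoint, using only Delphi's stock THTTPClient (the sgcWebSockets component is not needed for this sketch):

```pascal
uses
  System.Classes, System.SysUtils, System.Net.URLClient, System.Net.HttpClient;

// Hypothetical raw equivalent of _CreateMessage('llama3', 'Hello!'):
// POST a chat request to Ollama's /api/chat endpoint with streaming disabled.
var
  vClient: THTTPClient;
  vBody: TStringStream;
  vResponse: IHTTPResponse;
begin
  vClient := THTTPClient.Create;
  vBody := TStringStream.Create(
    '{"model": "llama3", "stream": false, ' +
    '"messages": [{"role": "user", "content": "Hello!"}]}',
    TEncoding.UTF8);
  try
    vResponse := vClient.Post('http://localhost:11434/api/chat', vBody, nil,
      [TNameValuePair.Create('Content-Type', 'application/json')]);
    // The response is a single JSON object; the reply text is in
    // message.content per Ollama's API documentation.
    WriteLn(vResponse.ContentAsString);
  finally
    vBody.Free;
    vClient.Free;
  end;
end;
```

A system prompt is expressed in the same payload by prepending a message with "role": "system" to the messages array, which is presumably what _CreateMessageWithSystem does on your behalf.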
Send a message with a system prompt to control the model's behavior.
Ollama := TsgcHTTP_API_Ollama.Create(nil);
try
  Ollama.OllamaOptions.Host := 'http://localhost:11434';
  WriteLn(Ollama._CreateMessageWithSystem('llama3',
    'You are a helpful assistant that responds in Spanish.',
    'What is the capital of France?'));
finally
  Ollama.Free;
end;
Use Server-Sent Events (SSE) to stream the response in real time. Assign the OnHTTPAPISSE event handler to receive the streaming events.
Ollama := TsgcHTTP_API_Ollama.Create(nil);
Ollama.OllamaOptions.Host := 'http://localhost:11434';
Ollama.OnHTTPAPISSE := OnSSEEvent;
Ollama._CreateMessageStream('llama3', 'Tell me a story.');
procedure TForm1.OnSSEEvent(Sender: TObject; const aEvent, aData: string;
  var Cancel: Boolean);
begin
  // aEvent contains the event type; aData contains the JSON payload
  // for this event. Set Cancel to True to stop receiving further events.
  Memo1.Lines.Add(aData);
end;
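Each aData payload is one JSON chunk of the stream. Below is a minimal parsing sketch, assuming Ollama's documented chat-stream chunk format (a "message" object whose "content" field carries the newly generated text, and a "done" boolean that is true on the final chunk), using Delphi's System.JSON:

```pascal
uses
  System.JSON;

// Sketch: append each incremental token to a memo as it arrives.
procedure TForm1.OnSSEEvent(Sender: TObject; const aEvent, aData: string;
  var Cancel: Boolean);
var
  vChunk: TJSONObject;
  vMessage: TJSONObject;
begin
  vChunk := TJSONObject.ParseJSONValue(aData) as TJSONObject;
  if vChunk = nil then
    Exit; // ignore payloads that are not JSON objects
  try
    // "message.content" holds the new fragment of the reply.
    if vChunk.TryGetValue<TJSONObject>('message', vMessage) then
      Memo1.Text := Memo1.Text + vMessage.GetValue<string>('content');
    // "done" marks the last chunk; Cancel := True here would also
    // abort the stream early if you no longer need the output.
    if vChunk.GetValue<Boolean>('done') then
      Memo1.Lines.Add('');
  finally
    vChunk.Free;
  end;
end;
```

Note that vMessage is owned by vChunk, so only vChunk is freed.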