Ollama Model Provider
```yaml
type: "io.kestra.plugin.ai.provider.Ollama"
```

Examples

Chat completion with Ollama

```yaml
id: chat_completion
namespace: company.ai

inputs:
  - id: prompt
    type: STRING

tasks:
  - id: chat_completion
    type: io.kestra.plugin.ai.completion.ChatCompletion
    provider:
      type: io.kestra.plugin.ai.provider.Ollama
      modelName: llama3
      endpoint: http://localhost:11434
      thinkingEnabled: true
      returnThinking: true
    messages:
      - type: SYSTEM
        content: You are a helpful assistant, answer concisely, avoid overly casual language or unnecessary verbosity.
      - type: USER
        content: "{{inputs.prompt}}"
```
Properties
endpoint (string, required)
Model endpoint.

modelName (string, required)
Model name.

baseUrl (string)
Base URL. Custom base URL to override the default endpoint (useful for local tests, WireMock, or enterprise gateways).

caPem (string)
CA PEM certificate content. CA certificate as text, used to verify SSL/TLS connections when using custom endpoints.

clientPem (string)
Client PEM certificate content. PEM client certificate as text, used to authenticate the connection to enterprise AI endpoints.
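
The TLS-related properties above are typically combined when Ollama sits behind an enterprise gateway. The provider block below is a minimal sketch using only the properties documented here; the gateway URL and secret names are placeholders, and the certificates are read via Kestra's secret() function rather than inlined.

```yaml
provider:
  type: io.kestra.plugin.ai.provider.Ollama
  modelName: llama3
  endpoint: http://localhost:11434
  # Placeholder gateway URL overriding the default endpoint
  baseUrl: https://ai-gateway.example.com
  # Certificates provided as text, pulled from Kestra secrets (secret names are illustrative)
  caPem: "{{ secret('OLLAMA_CA_PEM') }}"
  clientPem: "{{ secret('OLLAMA_CLIENT_PEM') }}"
```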