A2AAgent

Call a remote AI agent via the A2A protocol.

This tool allows an LLM to call a remote AI agent via the A2A protocol. Make sure to specify a name and a description so that the LLM understands what the tool does and can decide whether to call it.

```yaml
type: "io.kestra.plugin.ai.tool.A2AAgent"
```
Examples

Call a remote AI agent via the A2A protocol.

```yaml
id: ai-agent-with-agent-tools
namespace: company.ai

inputs:
  - id: prompt
    type: STRING
    defaults: |
      Each flow can produce outputs that can be consumed by other flows. This is a list property, so that your flow can produce as many outputs as you need.
      Each output needs to have an ID (the name of the output), a type (the same types you know from inputs, e.g., STRING, URI, or JSON), and a value, which is the actual output value that will be stored in internal storage and passed to other flows when needed.
tasks:
  - id: ai-agent
    type: io.kestra.plugin.ai.agent.AIAgent
    provider:
      type: io.kestra.plugin.ai.provider.GoogleGemini
      modelName: gemini-2.5-flash
      apiKey: "{{ kv('GEMINI_API_KEY') }}"
    systemMessage: Summarize the user message, then translate it into French using the provided tool.
    prompt: "{{inputs.prompt}}"
    tools:
      - type: io.kestra.plugin.ai.tool.A2AAgent
        description: Translation expert
        serverUrl: "http://localhost:10000"
Properties

Agent description

The description is used to tell the LLM what the tool does.

Server URL

The URL of the remote agent's A2A server.

Agent name

Default: `tool`

Set it to a value different from the default when you want to use multiple agents as tools within the same task.
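
For example, if a single task uses more than one A2AAgent tool, give each one a distinct name so the LLM can address the right remote agent. The snippet below is a minimal sketch based on the example above; the agent names, descriptions, and the second server URL are placeholders rather than real endpoints.

```yaml
tasks:
  - id: ai-agent
    type: io.kestra.plugin.ai.agent.AIAgent
    provider:
      type: io.kestra.plugin.ai.provider.GoogleGemini
      modelName: gemini-2.5-flash
      apiKey: "{{ kv('GEMINI_API_KEY') }}"
    systemMessage: Summarize the user message, then translate it using the provided tools.
    prompt: "{{ inputs.prompt }}"
    tools:
      # Distinct names let the LLM tell the two remote agents apart
      - type: io.kestra.plugin.ai.tool.A2AAgent
        name: french-translator             # placeholder name
        description: Translates text into French
        serverUrl: "http://localhost:10000"
      - type: io.kestra.plugin.ai.tool.A2AAgent
        name: german-translator             # placeholder name
        description: Translates text into German
        serverUrl: "http://localhost:10001"  # placeholder URL
```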