
Why webhook‑triggered tests?

Part of Avido’s secret sauce is that you can kick off a test without touching your code.
Instead of waiting for CI or redeploys, Avido sends an HTTP POST to an endpoint that you control.
Benefit | What it unlocks
Continuous coverage | Run tests against prod or staging as often as you like, on an automated schedule.
SME-friendly | Non-developers can trigger & tweak tasks from the Avido UI.

How it works

  1. A test is triggered in the dashboard or automatically.
  2. Avido POSTs to your configured endpoint.
  3. Validate the signature + timestamp with our API/SDK.
  4. Run your LLM flow using the prompt from the payload.
  5. Emit a trace that includes testId to connect results in Avido.
    Using OpenTelemetry? Set the avido.test.id span attribute instead — see the OpenTelemetry guide.
  6. Return 200 OK – any other status tells Avido the test failed.
Validating a webhook with the API requires both x-api-key and x-application-id headers. Use the application ID that issued the API key.
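The steps above can be sketched as a framework-agnostic handler. Everything here is illustrative: `validate`, `runLlmFlow`, and `emitTrace` are hypothetical stand-ins for your real signature check (e.g. the SDK's validateWebhook.validate), your LLM pipeline, and your trace ingestion.

```typescript
// Hypothetical dependency bundle — swap in your real SDK calls.
interface WebhookDeps {
  validate: (signature: string, timestamp: string, rawBody: string) => Promise<boolean>;
  runLlmFlow: (prompt: string) => Promise<string>;
  emitTrace: (testId: string, input: string, output: string) => Promise<void>;
}

// Returns the HTTP status your endpoint should respond with.
async function handleWebhook(
  rawBody: string,
  headers: Record<string, string | undefined>,
  deps: WebhookDeps,
): Promise<number> {
  const signature = headers['x-avido-signature'];
  const timestamp = headers['x-avido-timestamp'];
  if (!signature || !timestamp) return 401;

  // Step 3: validate against the raw, unmodified body.
  if (!(await deps.validate(signature, timestamp, rawBody))) return 401;

  const payload = JSON.parse(rawBody) as { prompt: string; testId: string };

  // Steps 4–5: run the flow, then emit a trace linked via testId.
  const output = await deps.runLlmFlow(payload.prompt);
  await deps.emitTrace(payload.testId, payload.prompt, output);

  // Step 6: 200 tells Avido the test ran; anything else marks it failed.
  return 200;
}
```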

Payload example

When Avido triggers your webhook endpoint, it sends:
Webhook payload
{
  "prompt": "Write a concise onboarding email for new users.",
  "testId": "123e4567-e89b-12d3-a456-426614174000",
  "metadata": {
    "customerId": "1",
    "priority": "high"
  }
}
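One way to type this payload in TypeScript (field names taken from the example above; the index signature reflects that additional fields may appear at any time):

```typescript
// Known fields from the payload example; the index signature admits
// any additional fields Avido may include (e.g. experiment data).
interface AvidoWebhookPayload {
  prompt: string;
  testId: string;
  metadata?: Record<string, string>;
  [extra: string]: unknown;
}

const payload: AvidoWebhookPayload = JSON.parse(
  '{"prompt":"Write a concise onboarding email for new users.",' +
  '"testId":"123e4567-e89b-12d3-a456-426614174000",' +
  '"metadata":{"customerId":"1","priority":"high"}}',
);
```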
Headers:
Header | Purpose
x-avido-signature | HMAC signature of the payload
x-avido-timestamp | Unix timestamp, in milliseconds, when the request was signed
Notes:
  • testId is always included in webhook payloads and is unique per test run. Pass it to your ingest.create() call so the trace is linked to the Avido test.
  • metadata is optional and only included when available from the originating task.
  • The webhook body may contain additional fields beyond those shown above. For example, experiment-related data is included when tests are part of an experiment. Do not assume a fixed schema — always handle unknown fields gracefully.
You do not need to set traceId when responding to a webhook. The server generates one automatically. If you do set it, use a random UUID — do not reuse the testId as the traceId. See the Tracing guide for details.
Always validate the raw body. The HMAC signature is computed over the exact JSON payload sent by Avido. You must forward the raw, unmodified request body to the /v0/validate-webhook endpoint (or use it directly when verifying the signature). If you parse and re-serialize the body, differences in key ordering or whitespace can cause signature verification to fail.
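To see why this matters, here is a standalone sketch. The secret and the HMAC-SHA256 scheme are illustrative assumptions, not Avido's actual signing internals — the point is only that any byte-level difference, even whitespace, produces a different signature.

```typescript
import { createHmac } from 'node:crypto';

const secret = 'demo-secret'; // hypothetical — not a real Avido secret

const sign = (body: string) =>
  createHmac('sha256', secret).update(body).digest('hex');

// The exact bytes as sent (note the spaces after the colons)…
const rawBody = '{"prompt": "Hi", "testId": "abc"}';

// …versus a parse → re-stringify round trip, which drops the spaces.
const reserialized = JSON.stringify(JSON.parse(rawBody));

console.log(sign(rawBody) === sign(reserialized)); // false — validation would fail
```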

Verification flow

If validation fails, respond 401 (or other 4xx/5xx). Avido marks the test as failed.

Body validation best practices

The webhook body is not a fixed schema. While the most common fields are prompt, testId, and metadata, the payload may include additional fields at any time (for example, experiment data when a test is part of an experiment). Your endpoint should:
  1. Accept any valid JSON body — do not reject requests that contain unknown fields.
  2. Forward the entire raw body for signature validation — the HMAC signature covers every field in the payload. If you strip, rename, or re-order fields before calling /v0/validate-webhook, the signature check will fail.
  3. Read only the fields you need after validation succeeds — safely ignore fields you don’t recognize.
When using a framework that automatically parses JSON (e.g., Express, Flask, FastAPI), pass the parsed object directly to the validation endpoint or SDK method. Avoid manually re-serializing it, as subtle differences (key order, whitespace) can break signature verification.
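Point 3 in practice: after validation succeeds, read only the fields you need and let everything else pass through untouched. A minimal sketch (the function name and defaults are illustrative):

```typescript
// Read only the fields we need; unknown fields are ignored, not rejected.
function readKnownFields(body: Record<string, unknown>) {
  const prompt = typeof body.prompt === 'string' ? body.prompt : '';
  const testId = typeof body.testId === 'string' ? body.testId : '';
  const metadata = (body.metadata ?? {}) as Record<string, unknown>;
  return { prompt, testId, metadata };
}

// Unknown fields (here "experiment") are simply skipped over.
const fields = readKnownFields({
  prompt: 'Hi',
  testId: 'abc',
  experiment: { experimentId: 'x' },
});
```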

Code examples

curl --request POST \
  --url https://api.avidoai.com/v0/validate-webhook \
  --header 'Content-Type: application/json' \
  --header 'x-application-id: <application-id>' \
  --header 'x-api-key: <api-key>' \
  --data '{
  "signature": "abc123signature",
  "timestamp": 1687802842609,
  "body": {
    "prompt": "Write a concise onboarding email for new users.",
    "testId": "123e4567-e89b-12d3-a456-426614174000",
    "metadata": {
      "customerId": "1",
      "priority": "high"
    }
  }
}'

SDK method reference

validateWebhook.validate()

Verifies that an incoming webhook request was signed by Avido.
const { valid } = await client.validateWebhook.validate({
  signature,  // from x-avido-signature header
  timestamp,  // from x-avido-timestamp header
  body,       // the full parsed request body
});
Parameter | Type | Description
signature | string | The x-avido-signature header value
timestamp | string | The x-avido-timestamp header value
body | object | The full webhook request body (parsed JSON)
Returns an object with a valid boolean field.
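In Node, incoming header values can be string, string[], or undefined depending on the framework, so it's worth normalizing them before calling validate(). A small illustrative helper:

```typescript
type HeaderValue = string | string[] | undefined;

// Normalize a header that may arrive as string, string[], or undefined.
function headerString(value: HeaderValue): string {
  if (Array.isArray(value)) return value[0] ?? '';
  return value ?? '';
}

function extractAvidoHeaders(headers: Record<string, HeaderValue>) {
  const signature = headerString(headers['x-avido-signature']);
  const timestamp = headerString(headers['x-avido-timestamp']);
  if (!signature || !timestamp) {
    throw new Error('Missing x-avido-signature or x-avido-timestamp header');
  }
  return { signature, timestamp };
}
```

You would then pass the result along with the parsed body: `client.validateWebhook.validate({ signature, timestamp, body })`.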

Experiments

When a test is triggered as part of an experiment, the webhook payload includes an additional experiment field. Experiments let you compare different configurations (variants) of your LLM pipeline against a baseline, and the experiment field tells your application which overrides to apply.

Experiment payload

Webhook payload with experiment
{
  "prompt": "Write a concise onboarding email for new users.",
  "testId": "123e4567-e89b-12d3-a456-426614174000",
  "metadata": {
    "customerId": "1",
    "priority": "high"
  },
  "experiment": {
    "experimentId": "aaa11111-bbbb-cccc-dddd-eeeeeeeeeeee",
    "experimentVariantId": "fff22222-3333-4444-5555-666666666666",
    "overrides": {
      "response_generator": {
        "temperature": 0.3,
        "system": "You are a concise assistant."
      }
    }
  }
}
Field | Type | Description
experiment.experimentId | string (uuid) | Unique identifier for the experiment
experiment.experimentVariantId | string (uuid) | Unique identifier for the variant being tested
experiment.overrides | Record<string, Record<string, unknown>> | Configuration overrides keyed by inference step name (e.g. response_generator, classifier). Each step's value is an object of parameter overrides (e.g. temperature, system, model)
How overrides work: Your application already has its own configuration (prompts, models, parameters, etc.) for each step in your LLM pipeline. The overrides object tells you what to change: each key is a step name, and each value contains the specific settings to overwrite for that step. For example, if overrides contains { "response_generator": { "temperature": 0.3 } }, you should use 0.3 as the temperature for your response_generator step instead of whatever your default is — and keep all other settings as-is.
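The merge described above can be written as a small pure function. Names and defaults here are illustrative — your pipeline config will look different:

```typescript
type StepConfig = Record<string, unknown>;
type PipelineConfig = Record<string, StepConfig>;

// Shallow-merge experiment overrides into per-step defaults:
// overridden keys win, everything else is kept as-is.
function applyOverrides(
  config: PipelineConfig,
  overrides: Record<string, StepConfig>,
): PipelineConfig {
  const merged: PipelineConfig = { ...config };
  for (const [step, params] of Object.entries(overrides)) {
    merged[step] = { ...merged[step], ...params };
  }
  return merged;
}

// Hypothetical defaults for a single-step pipeline.
const defaults: PipelineConfig = {
  response_generator: { temperature: 0.7, model: 'my-default-model' },
};
const merged = applyOverrides(defaults, {
  response_generator: { temperature: 0.3 },
});
// merged.response_generator → { temperature: 0.3, model: 'my-default-model' }
```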

Using experiments in your application

When your webhook handler receives a payload with experiment, apply the overrides to the corresponding steps in your LLM pipeline:
app.post('/avido/webhook', async (req, res) => {
  // ... validation omitted for brevity ...
  const body = req.body;

  // Check if this test is part of an experiment
  if (body.experiment) {
    const { overrides } = body.experiment;

    // Apply overrides to each inference step in your pipeline.
    // The keys in `overrides` match the step names you defined in Avido.
    for (const [stepName, params] of Object.entries(overrides)) {
      // Example: override the temperature and system prompt
      // for your "response_generator" step
      applyStepConfig(stepName, params);
    }
  }

  const result = await runAgent(body.prompt); // your LLM call

  await client.ingest.create({
    events: [
      {
        type: 'trace',
        timestamp: new Date().toISOString(),
        testId: body.testId,
        input: body.prompt,
        output: result,
      },
    ],
  });

  return res.status(200).send('OK');
});

// Example helper — adapt to your framework / LLM client
function applyStepConfig(stepName: string, params: Record<string, unknown>) {
  // Look up the step in your pipeline config and merge overrides
  pipelineConfig[stepName] = {
    ...pipelineConfig[stepName],
    ...params,
  };
}
You don’t need to change your trace ingestion code. The testId already links the trace back to the correct experiment variant in Avido. Just pass body.testId as usual.
If your application doesn’t run experiments yet, you can safely ignore the experiment field — its presence is optional and your existing webhook handler will continue to work without changes.

Troubleshooting

Problem | Cause | Fix
Signature always invalid | Body was re-serialized before validation | Pass the parsed JSON object directly; don't JSON.stringify it first
Missing signature header | Framework strips custom headers | Check your reverse proxy / load balancer isn't dropping x-avido-* headers
Test shows as failed | Endpoint returned non-200 status | Ensure you return 200 OK after successful processing
Trace not linked to test | Missing testId in ingestion | Pass body.testId in your ingest.create() call
Experiment overrides not applied | experiment field ignored in handler | Check for body.experiment and apply overrides to your pipeline config before running the LLM call

Next steps

  • Send us Trace events to capture your full LLM workflow.
  • Using OpenTelemetry? See the OpenTelemetry Integration guide.
  • Schedule or trigger tasks from Tasks in the dashboard.
  • Invite teammates so they can craft evals and review results directly in Avido.