Windmill AI

Windmill provides AI assistance throughout your coding experience.

OpenAI integration

If you're instead interested in calling OpenAI from your scripts, flows and apps, see OpenAI integration.

To enable Windmill AI, go to the "Windmill AI" tab in the workspace settings and add a resource for a supported model. Code completion is disabled by default, but you can enable it in the same tab.

Enable Windmill AI

Windmill AI for scripts

AI Chat

The script editor includes an integrated AI chat panel designed to assist with coding tasks directly. The assistant can generate code, identify and fix issues, suggest improvements, add documentation, and more.

Key features include:

  • Granular Apply/Reject: When the AI suggests code changes, you can apply or discard specific parts of the suggestion, rather than having to accept or reject the entire block.
  • Contextual Awareness: You can provide additional context to guide the AI’s suggestions, such as database schemas, diffs from the deployed version, or runtime errors.
  • Quick Actions: The panel includes built-in shortcuts for common tasks like fixing bugs or optimizing code, making it faster to apply standard improvements.

Code completion

The script editor includes AI-powered code autocomplete. Pressing Tab accepts suggested code snippets generated from the surrounding context, not just the text at the cursor.

It can be enabled/disabled at the user level in the script editor settings.

Summary copilot

From your code, the AI assistant can generate a script summary.

Legacy AI features

The following guides still apply to editing the code of inline scripts inside flows and apps.

Code generation


In a code editor (Script, Flow, Apps), click on AI and write a prompt describing what the script should do. The generated script will follow Windmill's main requirements and features (exposing a main function, importing libraries, using resource types, declaring required parameters with types). Moreover, when creating an SQL or GraphQL script, the AI assistant will take the selected database or GraphQL API schema into account.

Prompt
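
As a rough illustration of those requirements, a generated TypeScript (Bun) script typically takes the following shape. The resource type, its fields and the parameters below are purely illustrative assumptions, not the assistant's actual output.

```typescript
// Illustrative sketch only: a Windmill script exposes an exported main
// function, declares typed parameters, and can take a resource type
// (here a hypothetical `Postgresql` resource) as a parameter.
type Postgresql = {
  host: string;
  port: number;
  user: string;
  dbname: string;
  password: string;
};

export async function main(db: Postgresql, minAge: number = 18) {
  // The generated body would use the resource and parameters here,
  // e.g. to query the database and return the matching rows.
  return { connectedTo: db.host, minAge };
}
```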

Pro tips

The AI assistant is particularly effective when generating Python and TypeScript (Bun runtime) scripts, so we recommend using these languages when possible. Moreover, you will get better results if you specify in the prompt what the script should take as parameters, what it should return, and the specific integrations and libraries it should use, if any. For instance, in the demo video above we wanted to generate a script that fetches the commits of a GitHub repository passed as a parameter, so we wrote: "fetch and return the commits of a given repository on GitHub using octokit".
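
For that prompt, the generated script could look roughly like the sketch below. The resource type fields and the exact octokit calls are illustrative assumptions, not the assistant's verbatim output.

```typescript
import { Octokit } from "octokit";

// Illustrative Windmill resource type holding a GitHub token.
type Github = {
  token: string;
};

export async function main(gh: Github, owner: string, repo: string) {
  const octokit = new Octokit({ auth: gh.token });

  // Fetch the most recent commits of the given repository.
  const { data: commits } = await octokit.rest.repos.listCommits({
    owner,
    repo,
  });

  return commits;
}
```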

Code editing

Inside the AI Gen popup, you can choose to edit the existing code rather than generate new code. The assistant will modify the code according to your prompt.

Code fixing

When code execution fails, you will be offered an "AI Fix". The assistant will automatically read the code, explain what went wrong, and suggest a way to fix it.

Windmill AI for flows

Generate workflows from prompts.

Windmill AI for flows supports two creation modes: sequence flows and trigger flows. In both cases, you specify a prompt for each step. The AI assistant will then generate the code step by step and link the steps together. At the end of each step, you will have the opportunity to review, edit or even regenerate the step's code.

For each action, you can either choose a script from the Hub or generate one from scratch using Windmill AI. At the end of the process, flow inputs are inferred and you just need to fill them in.

Sequence flows

Generate a flow consisting of a sequence of scripts.

Trigger flows

Trigger flows are designed to pull data from an external source and return all of the new items since the last run, without resorting to external webhooks. A trigger script is intended to be used as a scheduled poll, combining schedules and states (rich JSON objects persisted from one run to the next) to compare the current execution with the previous one and process each new item in a for loop.

If there are no new items, the flow will be skipped.
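
As a sketch of what such a trigger script can look like in TypeScript (Bun), the example below keeps track of the last item it has seen using Windmill's state helpers. The polled data source, its fields and the helper function are hypothetical.

```typescript
import * as wmill from "windmill-client";

// Hypothetical shape of the items returned by the polled source.
type Item = { id: number; createdAt: string };

export async function main() {
  // State persisted by the previous run (undefined on the first run).
  const lastSeenId: number | undefined = await wmill.getState();

  // Hypothetical call to the external source being polled.
  const items: Item[] = await fetchItemsFromExternalApi();

  // Keep only the items that are newer than what the previous run saw.
  const newItems = items.filter(
    (item) => lastSeenId === undefined || item.id > lastSeenId
  );

  if (newItems.length > 0) {
    // Persist the newest id so the next run only returns newer items.
    await wmill.setState(Math.max(...newItems.map((item) => item.id)));
  }

  // An empty result means there is nothing new, and the flow is skipped.
  return newItems;
}

// Placeholder for the external API call (hypothetical).
async function fetchItemsFromExternalApi(): Promise<Item[]> {
  return [];
}
```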


The inputs of the for-loop action are automatically filled in with the outputs of the trigger step.

Moreover, the flow is automatically set to run every 15 minutes when deployed. The schedule can then be customized (e.g. to every 30 seconds). This allows you to avoid relying on webhooks sent by external APIs, which can be tedious to configure.

Summary copilot for steps

From your code, the AI assistant can generate a summary for flow steps.

Step input copilot

When adding a new step to a flow, the AI assistant will suggest inputs based on the previous steps' results and flow inputs.

Flow loops iterator expressions from context

When adding a for loop, the AI assistant will suggest iterator expressions based on the previous steps' results.
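
For example, if a previous step with id `a` returned an object containing an `items` array (hypothetical step id and field name), the suggested iterator expression could look like:

```typescript
// Hypothetical iterator expression over the result of step `a`
results.a.items
```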

Flow branches predicate expressions from prompts

When adding a branch, the AI assistant will suggest predicate expressions from a prompt.
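
As a hypothetical illustration, a prompt such as "take this branch when the order total exceeds 100" could yield a predicate like:

```typescript
// Hypothetical predicate referencing the result of step `a`
results.a.total > 100
```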

CRON schedules from prompt

The AI assistant can generate CRON schedules from a prompt.
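
As a hypothetical illustration of this mapping, prompts translate to cron expressions along these lines (standard cron notation shown; check the exact field format your schedule expects):

```
"every 15 minutes"       →  */15 * * * *
"every weekday at 9 AM"  →  0 9 * * 1-5
```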

Models

Windmill AI supports:

  • OpenAI's models (o1 currently not supported)
  • Azure OpenAI's models
    • Base URL format: https://{your-resource-name}.openai.azure.com/openai/deployments/{deployment-id}
  • Anthropic's models (including Claude 3.7 Sonnet in extended thinking mode)
  • Mistral's Codestral
  • DeepSeek's models
  • Google's Gemini models
  • Groq's models
  • OpenRouter's models
  • Together AI's models
  • Custom AI: base URL and API key of any AI provider that is OpenAI API-compatible

If you'd like to use a model that isn't in the default dropdown list, you can use any model supported by a provider (like gpt-4o-2024-05-13 or claude-3-7-sonnet-20250219) by simply typing the model name in the model input field and pressing enter.

AWS Bedrock

For models hosted on AWS Bedrock, you can use the Custom AI provider and follow the instructions in the Windmill AI using AWS Bedrock guide.