monday.com
Autofill with AI
Improving AI setup for non-technical builders
Final Autofill with AI setup, connecting inputs, instructions, and outputs in one clear loop.
About
Autofill with AI allows monday.com users to automatically fill board columns using AI, based on existing data and lightweight instructions.
As monday.com introduced prebuilt AI actions, we saw strong interest but low successful adoption. Many users struggled to understand how their configuration choices influenced the AI’s output, leading to setup abandonment and low trust in the feature.
I redesigned the Autofill with AI setup experience to help non-technical users understand cause and effect through interaction rather than explanation.
My role
Product Designer
I led the end-to-end design of the Autofill with AI setup experience, from problem framing and exploration to validation and rollout. I partnered closely with a PM and engineers, shaping key decisions around flexibility, constraints, and how much control to expose without compromising predictability.
Outcome
Setup conversion increased from 35.12% to 40.03% (a ~14% relative lift), reducing setup abandonment
Average AI actions per account increased by 25% (from 87 to 109) across ~2.5k new AI columns
Timeline
May - Jul 2025
End-to-end setup flow: translating an IT request to English.
Context
The product
Autofill with AI allows users to set an automation that fills a specific column using a combination of a prompt and existing board data.
The user
The primary users are monday.com builders. Some are technical, but many are not. Their goal is to create reliable, easy-to-use boards that help teammates follow intended workflows.
Constraints
• The overall structure of the AI action setup needed to remain intact
• Users needed flexibility to add instructions and iterate toward their desired outcome
• Changes were limited to the existing side panel UI
• Design decisions needed to be validated before full engineering investment
Problem
What wasn’t working
Users struggled to get the AI to produce the results they expected. They didn’t understand what the dropdowns meant or how their choices affected the outcome, leading many to abandon setup or lose trust after unsatisfactory results.
Why it mattered
This confusion directly impacted adoption. Users who failed on their first attempt were unlikely to try again, limiting the feature’s long-term value.
Previous setup experience, where users selected inputs and instructions without clear visibility into how those choices affected the AI’s output.
Exploration
Early hypotheses
• Users assumed the AI already had full context of the board and didn’t understand why an input needed to be selected
• If users understood the relationship between input and output, they would be able to set up the automation more successfully on their first attempt
Explored directions
We tested presenting the setup as a single readable sentence to surface the relationship between inputs and outputs. Early testing showed improvement, leading us to invest in a broader redesign of the setup experience.
Low-effort experiment to validate the sentence-based mental model before committing to a larger redesign.
One source of inspiration was monday.com’s Automation Center. It uses a sentence-based structure to help users configure complex logic in a readable, predictable way, and it’s a pattern builders already know.
We hypothesized that applying a similar structure to AI actions could reduce cognitive load and make intent clearer without introducing new concepts.
Solution
Teaching input and output through language
We reframed the setup as a sentence that clearly communicated cause and effect:
Extract [this] from [input], using the following instructions.
Instead of explaining AI concepts, this made input and output understandable through natural language. Only the sentence chips are editable, each opening a focused menu that reduces cognitive load and prevents invalid setups.
Empty state of the redesigned Extract action, using sentence structure to make inputs and outputs understandable from the start.
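To make the structure concrete, here is a minimal, hypothetical sketch (in TypeScript, not monday.com’s actual implementation) of how a sentence-based Extract configuration might be represented, where each editable chip maps to a constrained field instead of free text:

```typescript
// Hypothetical model of the sentence-based Extract setup.
// Each editable chip in "Extract [this] from [input], using the
// following instructions" maps to one constrained field.

interface ExtractActionConfig {
  outputColumnId: string;   // "[this]" — the column the AI fills
  inputColumnIds: string[]; // "[input]" — board columns used as context
  instructions?: string;    // optional guidance, revealed after the first preview
}

// Because the chips only offer valid choices, validation reduces to
// checking that the required fields are present before running a preview.
function isReadyForPreview(config: ExtractActionConfig): boolean {
  return config.outputColumnId.length > 0 && config.inputColumnIds.length > 0;
}

// Example: extracting a country from a free-text address column.
const example: ExtractActionConfig = {
  outputColumnId: "country",
  inputColumnIds: ["address"],
  instructions: "Return only the country name in English.",
};

console.log(isReadyForPreview(example)); // true
```

The point of the sketch is the constraint itself: if every chip opens a focused menu of valid options, an incomplete or contradictory setup simply cannot be saved.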
Making outputs predictable
We added a preview of seven items so users could immediately see how their selections affected the output, helping them connect configuration choices to real results and increasing trust in the AI.
Progressive disclosure and iteration
To avoid overwhelming users, we minimized upfront explanation. Additional instructions were revealed only after users saw the first preview, encouraging learning through interaction and iteration.
Applying the same sentence-based empty state pattern across all AI actions to create a consistent and predictable setup experience.
Learnings
• Concepts that feel obvious to teams building AI are not intuitive to most users
• Users rarely invest time learning a feature if they fail on the first attempt
• Setting clear expectations is critical for building trust in AI systems
This project reinforced a principle I apply when designing generative tools: users do not need to understand the system, but they do need to understand cause and effect. Simplifying complex systems into familiar language and pairing that clarity with immediate previews builds trust and supports iteration.
This approach applies beyond monday.com to any generative system where users must guide outcomes without being taught the underlying model or mechanics.
Background
Autofill with AI evolved over time. The first version was an open prompt block that allowed users to freely describe what they wanted AI to fill. While this helped us learn how advanced users wanted to use AI, it required strong prompt-writing skills and a clear understanding of AI capabilities.
Based on these learnings, we introduced prebuilt AI blocks such as extract, translate, and assign labels. This project focuses on improving the setup experience for these prebuilt blocks, making them accessible to a much broader audience.