The Iteration Trap: When AI Makes You a Spectator
Key Facts
- The trap: iterate with AI → stop reading output → run more cycles → hope for magic
- Root cause: you don’t have clear acceptance criteria - you’re iterating because you don’t know what you want
- The iteration trap is really a clarity trap
- “Read after every cycle” treats the symptom, not the cause
- The real failure: “I couldn’t articulate what was wrong with it”
The Story
- Ran playbook generation with two LLMs connected
- Kept iterating, expecting something great to emerge
- Stopped actually reading what was being produced
- Just wanted to run one more cycle, wait for the magic
- Finally read the output: “I don’t think it’s really good… just presenting the data”
- The iteration felt productive. The output wasn’t.
The Pattern
idea
→ iterate with AI
→ stop reading
→ run more cycles
→ hope for magic
→ finally read output
→ meh
You became a spectator hoping the slot machine pays out. The AI was generating, you were waiting, nobody was thinking.
Why This Happens
- You don’t have clear acceptance criteria - without knowing what “good” looks like, you can’t evaluate
- Iteration feels like progress - dopamine hit of activity without cognitive load of evaluation
- Evaluation is harder than generation - requires you to have a mental model of what you want
- Variable reward schedule - unpredictable output quality creates slot machine dynamics
The Real Fix
“Read after every cycle” treats the symptom, not the cause. You can read and still be trapped - nodding along, prompting again because “it’s not quite right” without knowing why.
Structural fixes that actually work:
- Define done before you start - Write 2-3 specific criteria for what “good” looks like. Not vibes - concrete checkboxes.
- Constrain cycles upfront - “I will do 3 iterations max.” Forces you to evaluate seriously because you’re spending a finite budget.
- Externalize the evaluation - After each output, write: “This is/isn’t acceptable because ___.” If you can’t fill in the blank, you don’t have criteria. Stop iterating and go define them.
- Default to single-shot - Treat iteration as expensive. If you need 5+ cycles, the problem is upstream: unclear requirements, wrong tool, insufficient context.
The Reframe
The iteration trap is really a clarity trap. You iterate because you don’t know what you want.
If you find yourself iterating without reading, stop and ask: “What would make the next output obviously acceptable or obviously unacceptable?”
If you can’t answer that, you’re not ready to iterate. You’re ready to think.
Takeaway
- Iteration without criteria is just busy work
- The moment you hope for magic, stop
- Define what “done” looks like before you start
- If you can’t articulate what’s wrong, the problem isn’t the output - it’s your clarity