The Feature That Kills the Wiki

Every wiki, every Confluence space, every README in every repository shares the same fate: someone changes the system, doesn’t update the docs, and now the docs are actively harmful. This is an unsolvable problem in a human-maintained knowledge base because the maintenance cost is invisible and the consequences are delayed. In a system where AI agents are both the consumers and producers of knowledge, drift detection becomes not just possible but automatic.

Four ways drift gets detected

Agent-reported drift

When a coding agent loads a blueprint and discovers the code doesn’t match what the blueprint says, it flags the inconsistency. This happens naturally as part of the execution plan step. The flag is a first-class event in the system, not a comment someone might miss.
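
A minimal sketch of what a first-class drift flag could look like, assuming a hypothetical `DriftEvent` record and event store; these names are illustrative, not part of any published Memex AI interface.

```python
# Hypothetical sketch: DriftEvent and EventStore are illustrative names,
# not a real Memex AI API.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class DriftEvent:
    """A first-class record that a blueprint disagrees with the code."""
    blueprint_id: str
    documented: str   # what the blueprint claims
    observed: str     # what the agent actually found
    reported_by: str  # which agent raised the flag
    reported_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))


class EventStore:
    """Persists drift flags so every agent can query them later."""

    def __init__(self) -> None:
        self._events: list[DriftEvent] = []

    def flag(self, event: DriftEvent) -> None:
        self._events.append(event)

    def open_flags(self, blueprint_id: str) -> list[DriftEvent]:
        return [e for e in self._events if e.blueprint_id == blueprint_id]


# During a plan step, the agent compares the blueprint's claim to reality
# and raises a flag instead of leaving a comment someone might miss.
store = EventStore()
store.flag(DriftEvent(
    blueprint_id="blueprint/deployment",
    documented="deploys via Jenkins",
    observed="deploys via GitHub Actions",
    reported_by="coding-agent",
))
```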

Decision-triggered review

When a resolved decision affects an existing blueprint, the system marks that blueprint for review. A human or AI agent then updates it. Until that happens, the blueprint carries a staleness warning that agents can see.
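
In sketch form, resolving a decision could set a stale flag that surfaces whenever an agent loads the blueprint; the dictionary-based store and field names below are assumptions for illustration.

```python
# Illustrative sketch; the blueprint store and field names are assumptions.
blueprints: dict[str, dict] = {
    "blueprint/deployment": {"body": "...", "stale": False},
}


def resolve_decision(decision_id: str, affected: list[str]) -> None:
    """Resolving a decision marks every blueprint it affects for review."""
    for blueprint_id in affected:
        blueprints[blueprint_id]["stale"] = True
        blueprints[blueprint_id]["stale_reason"] = (
            f"{decision_id} changed an assumption this blueprint relies on"
        )


def load_blueprint(blueprint_id: str) -> dict:
    """Agents see the staleness warning at read time, before acting."""
    bp = blueprints[blueprint_id]
    if bp["stale"]:
        print(f"WARNING: {blueprint_id} is stale: {bp['stale_reason']}")
    return bp


resolve_decision("DEC-12", affected=["blueprint/deployment"])
load_blueprint("blueprint/deployment")
```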

Implementation-triggered review

When a work item is completed that modifies files governed by a blueprint, the system prompts: “WI-4 modified the deployment pipeline. Blueprint deployment may need updating.”
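
One plausible mechanism, sketched below, is to check each completed work item's modified files against a mapping of file patterns to the blueprints that govern them; the mapping, names, and prompt wording here are all assumptions.

```python
# Sketch with assumed names; note that fnmatch's "*" also matches "/".
from fnmatch import fnmatch

# Hypothetical mapping from file globs to the blueprint that governs them.
GOVERNED_BY = {
    "deploy/*": "deployment",
    ".github/workflows/*": "deployment",
    "db/migrations/*": "schema",
}


def on_work_item_completed(work_item_id: str,
                           modified_files: list[str]) -> list[str]:
    """Prompt a review for every blueprint whose governed files changed."""
    touched = {
        blueprint
        for pattern, blueprint in GOVERNED_BY.items()
        if any(fnmatch(path, pattern) for path in modified_files)
    }
    return [f"{work_item_id} modified files governed by blueprint {bp}. "
            f"Blueprint {bp} may need updating." for bp in sorted(touched)]


print(on_work_item_completed("WI-4", ["deploy/pipeline.yml"]))
# -> ['WI-4 modified files governed by blueprint deployment.
#      Blueprint deployment may need updating.']
```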

Scheduled audits

An agent periodically reads each blueprint, compares it against the actual codebase, and reports drift. This is a background operation that runs continuously, not a quarterly documentation review that never happens.
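
In sketch form this is just a background loop, where `compare_to_codebase` stands in for whatever blueprint-versus-codebase comparison the auditing agent actually performs, and the interval is an arbitrary choice.

```python
# Sketch of the background audit loop; compare_to_codebase and report_drift
# are stand-ins for the agent's real comparison and reporting steps.
import time
from typing import Callable


def audit_forever(blueprint_ids: list[str],
                  compare_to_codebase: Callable[[str], list[str]],
                  report_drift: Callable[[str, list[str]], None],
                  interval_s: float = 6 * 3600) -> None:
    """Continuously re-read each blueprint and report any drift found."""
    while True:
        for blueprint_id in blueprint_ids:
            findings = compare_to_codebase(blueprint_id)
            if findings:
                report_drift(blueprint_id, findings)
        time.sleep(interval_s)
```
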
The result: institutional knowledge that is current by default, not by heroic effort.

This is the point Memex AI’s bet rests on. If you’ve ever inherited a codebase with a wiki that lied to you, or joined a team where the onboarding doc referred to a repo that was archived eighteen months ago, you know why.