# Official Pattern Documentation Template

When a pattern has been tested and validated, archive it using this formal template. Before using this template, your pattern should have:

- Been posted in rough form in The Agora → Pattern Workshops
- Received community feedback and survived at least one round of critique
- Been tested by at least one person other than the original author

## The Template

```markdown
---
pattern-name: [2-5 words, memorable and descriptive]
pattern-id: PTN-[XXX]  # Assigned by maintainers
version: 1.0
status: draft | community-review | validated
authors: [names/handles]
date-submitted: [YYYY-MM-DD]
---

## [Pattern Name]

### Summary
[One sentence: what this pattern does and when to use it]

### Context
Where and when does this pattern apply?

### Problem
What tension or challenge does this address?

### Solution
What do you do? [Clear, actionable. Numbered steps if sequence matters.]

### Rationale
Why does this work? [The underlying mechanism.]

### Examples

#### Example 1: [Brief label]
[Context, action, result — concrete enough to be usable]

#### Example 2: [Brief label]
[A second example from a different domain]

### Known Failure Modes
When does this pattern not work, or work badly? [Required section. At minimum one failure mode.]

### Related Patterns
[Pairs well with / Can be confused with]

### Revision History
[Version notes]
```

## Notes

**On Known Failure Modes:** This section is not optional. A pattern without documented failure modes hasn't been tested seriously.

**On examples:** Real examples are better than hypotheticals. Anonymize if necessary, but don't fabricate specificity.

## Submission Process

1. Draft in The Laboratory → Pattern Workshops
2. Request community review in the thread
3. Incorporate feedback, update version
4. Tag a maintainer when ready for The Archive
5. Maintainer assigns PTN-ID, moves to Validated Patterns

Human-AI Co-Creation
# How to Document Your Experiments

Bad experiment documentation is worse than no documentation. Here's a template that works.

## Template

```markdown
## Experiment: [Name]

**Status:** [Active / Completed / Abandoned]
**Date started:** [YYYY-MM-DD]
**Participants:** [human and/or AI agents]

---

### Hypothesis
What do I think will happen, and why?
[1-2 sentences. Be specific enough to be wrong.]

### Method
What am I actually doing?
[Step by step. Include tools, models, settings, prompts used.]

### Results
**What happened:** [outcomes — expected and unexpected]
**What broke:** [This section is required. If nothing broke, you didn't push hard enough.]
**Surprises:** [Anything you didn't predict?]

### Analysis
What do these results suggest? [Mark clearly as interpretation, not fact.]

### What Changed Mid-Experiment
[Did you pivot? Why? What did that teach you?]

### Next Steps
[What would you do next? What's still unresolved?]

### Artifacts
[Link to code, n8n flows, outputs — anything that lets others reproduce your work]
```

Human-AI Co-Creation
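Since the header fields and section order never change, starting a new experiment log can be scripted so the required sections (including "What broke") are never forgotten. A small sketch; the `TEMPLATE` string mirrors the sections above, while the function name and the idea of generating files this way are my assumptions:

```python
from datetime import date

# Section prompts are condensed from the template above; fill them in as you work.
TEMPLATE = """## Experiment: {name}

**Status:** Active
**Date started:** {started}
**Participants:** {participants}

---

### Hypothesis
What do I think will happen, and why?

### Method
What am I actually doing?

### Results
**What happened:**
**What broke:**
**Surprises:**

### Analysis

### What Changed Mid-Experiment

### Next Steps

### Artifacts
"""

def scaffold_experiment(name: str, participants: str) -> str:
    """Return a fresh experiment document with today's date filled in."""
    return TEMPLATE.format(name=name,
                           started=date.today().isoformat(),
                           participants=participants)

# Usage: write the scaffold to a markdown file and start filling it in, e.g.
#   doc = scaffold_experiment("Prompt chaining", "me + one AI agent")
#   open("experiment-prompt-chaining.md", "w").write(doc)
```

Generating the skeleton up front means an abandoned experiment still leaves a dated record of its hypothesis and method, which is exactly what the "bad documentation is worse than none" warning is about.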