# Extract Patterns from Iterations

Analyze generated iterations to extract successful patterns for the pattern library.

## Usage

```bash
/project:extract-patterns <iterations_dir> <pattern_library_path> [analysis_depth]
```

## Arguments

1. `iterations_dir` - Directory containing generated iterations to analyze
2. `pattern_library_path` - Path where the pattern library JSON will be saved
3. `analysis_depth` - Optional: "quick" (top 3 patterns per category) or "deep" (top 5 patterns per category, default)

## Examples

```bash
# Extract patterns from output directory
/project:extract-patterns output pattern_library/patterns.json

# Quick extraction (3 patterns per category)
/project:extract-patterns output pattern_library/patterns.json quick

# Deep analysis (5 patterns per category)
/project:extract-patterns output pattern_library/patterns.json deep
```

## How It Works

This command implements pattern recognition inspired by multi-shot prompting principles:

1. **Example Collection**: Gather all iterations as potential examples
2. **Quality Scoring**: Evaluate each iteration across multiple dimensions
3. **Pattern Identification**: Extract successful approaches and techniques
4. **Example Selection**: Choose the 3-5 most exemplary and diverse patterns
5. **Library Update**: Save patterns in a structured format for future use

## Implementation Steps

You are the pattern extraction agent. Follow this workflow:

### Step 1: Load and Inventory Iterations

```bash
# List all files in the iterations directory
Bash: find iterations_dir -type f | sort

# Read each iteration file
For each file:
- Read file
- Store content
- Note file path and metadata
```

### Step 2: Analyze Structural Patterns

Extract patterns related to file organization and architecture:

```markdown
For each iteration:
  Analyze:
  - File structure and organization
  - Naming conventions used
  - Code/content architecture
  - Module organization (if applicable)
  - Separation of concerns

  Score based on:
  - Clarity and consistency
  - Scalability of approach
  - Adherence to best practices
  - Innovation in structure
```

Identify the top 3-5 structural patterns:

```json
{
  "name": "Modular Three-Layer Architecture",
  "description": "Separates data, logic, and presentation into distinct sections",
  "example_file": "output/iteration_7.html",
  "key_characteristics": [
    "Clear section boundaries with comments",
    "Data defined separately from rendering logic",
    "Reusable component structure",
    "Self-documenting organization"
  ],
  "success_metrics": "High readability score (95%), easy to extend, follows separation of concerns",
  "code_snippet": "<!-- DATA -->\n...\n\n<!-- LOGIC -->\n...\n\n<!-- PRESENTATION -->\n..."
}
```
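As a concrete illustration of Steps 1-2, here is a minimal Python sketch of the inventory-and-scoring loop. The `score_structure` heuristic is a hypothetical stand-in; in this workflow the scoring is done by the agent's own qualitative judgment rather than a fixed function, and the sketch is not part of the command itself.

```python
from pathlib import Path

def inventory_iterations(iterations_dir: str) -> list[dict]:
    """Collect every iteration file with its content and basic metadata (Step 1)."""
    records = []
    for path in sorted(Path(iterations_dir).rglob("*")):
        if not path.is_file():
            continue
        records.append({
            "path": str(path),
            "content": path.read_text(encoding="utf-8", errors="replace"),
            "size_bytes": path.stat().st_size,
        })
    return records

def score_structure(record: dict) -> float:
    """Hypothetical structural heuristic (Step 2): rewards commented section
    boundaries and reusable components. A real run would rely on qualitative
    judgment of clarity, scalability, and separation of concerns instead."""
    text = record["content"]
    score = 0.0
    score += min(text.count("<!--"), 10) * 0.5        # commented section boundaries
    score += min(text.count("function "), 10) * 0.3   # reusable components
    return score

if __name__ == "__main__":
    iterations = inventory_iterations("output")  # example directory from Usage above
    for record in sorted(iterations, key=score_structure, reverse=True)[:5]:
        print(f"{score_structure(record):5.1f}  {record['path']}")
```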
### Step 3: Analyze Content Quality Patterns

Extract patterns related to content excellence:

```markdown
For each iteration:
  Analyze:
  - Documentation quality and completeness
  - Code/content clarity and readability
  - Comment quality and usefulness
  - Error handling approaches
  - User experience considerations

  Score based on:
  - Comprehensiveness of documentation
  - Clarity of explanations
  - Thoughtfulness of implementation
  - Attention to edge cases
```

Identify the top 3-5 content quality patterns:

```json
{
  "name": "Progressive Disclosure Documentation",
  "description": "Layers documentation from overview to deep technical details",
  "example_file": "output/iteration_12.html",
  "key_characteristics": [
    "High-level summary at top",
    "Inline comments for complex logic",
    "Detailed API documentation in separate section",
    "Examples embedded with explanations"
  ],
  "success_metrics": "Easy for beginners and experts alike, 100% of functions documented",
  "code_snippet": "/**\n * HIGH-LEVEL: This function renders...\n * \n * TECHNICAL: Uses D3.js force simulation...\n * \n * EXAMPLE: renderGraph(data) -> visual output\n */"
}
```

### Step 4: Analyze Innovation Patterns

Extract creative and novel approaches:

```markdown
For each iteration:
  Analyze:
  - Unique problem-solving approaches
  - Creative implementations
  - Novel feature combinations
  - Innovative UX/DX decisions
  - Unexpected but effective solutions

  Score based on:
  - Originality compared to other iterations
  - Effectiveness of the innovation
  - Replicability in other contexts
  - Impact on quality or functionality
```

Identify the top 3-5 innovation patterns:

```json
{
  "name": "Self-Validating Data Pipeline",
  "description": "Data includes validation logic that runs automatically",
  "example_file": "output/iteration_15.html",
  "key_characteristics": [
    "Data objects include .validate() method",
    "Automatic validation before rendering",
    "Clear error messages for invalid data",
    "Self-documenting data requirements"
  ],
  "success_metrics": "Zero runtime errors due to data issues, excellent developer experience",
  "code_snippet": "const dataPoint = {\n  value: 42,\n  validate() {\n    if (this.value < 0) throw new Error('...');\n    return true;\n  }\n};"
}
```

### Step 5: Analyze Quality & Testing Patterns

Extract patterns for ensuring quality:

```markdown
For each iteration:
  Analyze:
  - Testing approaches (if present)
  - Validation strategies
  - Error handling patterns
  - Defensive programming techniques
  - Quality assurance methods

  Score based on:
  - Robustness of error handling
  - Thoroughness of validation
  - Testability of implementation
  - Resilience to edge cases
```

Identify the top 3-5 quality patterns:

```json
{
  "name": "Guard Clause Pattern with Fallbacks",
  "description": "Early validation with graceful degradation for missing data",
  "example_file": "output/iteration_9.html",
  "key_characteristics": [
    "Input validation at function entry",
    "Specific error messages for each validation",
    "Fallback defaults for optional parameters",
    "Never crashes, always renders something"
  ],
  "success_metrics": "100% uptime even with malformed data, excellent error messages",
  "code_snippet": "function render(data) {\n  if (!data) return renderEmpty();\n  if (!Array.isArray(data)) data = [data];\n  if (data.length === 0) return renderNoData();\n  // ... continue with rendering\n}"
}
```
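Before building the library in Step 6, it can help to normalize every candidate into one record shape. The Python sketch below is only an assumption about how that could look: the field names mirror the JSON examples above, while the numeric score is a hypothetical composite rating from Steps 2-5 and is not part of the saved schema.

```python
from dataclasses import dataclass, asdict

@dataclass
class Pattern:
    """One extracted pattern; fields mirror the JSON examples in Steps 2-5."""
    name: str
    description: str
    example_file: str
    key_characteristics: list[str]
    success_metrics: str
    code_snippet: str

def group_by_category(scored: list[tuple[str, float, Pattern]], limit: int = 5) -> dict:
    """Build the "patterns" object for Step 6 from (category, score, pattern) tuples.

    Only the top `limit` patterns per category are kept; the hypothetical score
    is used for ranking and then discarded.
    """
    grouped: dict[str, list[dict]] = {
        "structural": [], "content": [], "innovation": [], "quality": []
    }
    for category, _score, pattern in sorted(scored, key=lambda item: item[1], reverse=True):
        if len(grouped[category]) < limit:
            grouped[category].append(asdict(pattern))
    return grouped
```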
### Step 6: Build Pattern Library JSON

Construct the complete pattern library:

```json
{
  "version": "1.2",
  "last_updated": "2025-10-10T14:30:00Z",
  "total_iterations_analyzed": 15,
  "analysis_depth": "deep",
  "patterns": {
    "structural": [
      { "name": "...", "description": "...", ... },
      { "name": "...", "description": "...", ... },
      { "name": "...", "description": "...", ... }
    ],
    "content": [
      { "name": "...", "description": "...", ... },
      { "name": "...", "description": "...", ... },
      { "name": "...", "description": "...", ... }
    ],
    "innovation": [
      { "name": "...", "description": "...", ... },
      { "name": "...", "description": "...", ... },
      { "name": "...", "description": "...", ... }
    ],
    "quality": [
      { "name": "...", "description": "...", ... },
      { "name": "...", "description": "...", ... },
      { "name": "...", "description": "...", ... }
    ]
  },
  "metadata": {
    "extraction_date": "2025-10-10T14:30:00Z",
    "source_directory": "output/",
    "iterations_count": 15,
    "patterns_extracted": 12,
    "avg_quality_score": 8.4,
    "most_common_theme": "Modular architecture with clear separation"
  }
}
```

### Step 7: Save and Report

```bash
# Write the pattern library to the JSON file
Write pattern_library_path with JSON content

# Generate extraction report
Create summary showing:
- Patterns extracted per category
- Quality score distribution
- Most innovative iteration
- Most structurally sound iteration
- Recommended patterns for next wave
```

## Pattern Selection Criteria

When choosing which patterns to include (3-5 per category):

1. **Diversity**: Select patterns that represent different approaches
2. **Clarity**: Choose patterns that are easy to understand and replicate
3. **Effectiveness**: Prioritize patterns with demonstrated success
4. **Transferability**: Pick patterns applicable to various contexts
5. **Exemplary Quality**: Select from the top 20% of iterations only

## Multi-Shot Prompting Principles Applied

This extraction process implements key multi-shot prompting concepts:

- **Example Quality**: Only the top 20% of iterations become examples (high bar)
- **Diversity**: 3-5 patterns prevent overfitting to a single approach
- **Relevance**: Patterns are categorized for targeted application
- **Edge Cases**: The innovation category captures unusual but effective approaches
- **Uniform Structure**: All patterns follow a consistent JSON schema

## Update Strategy

If the pattern library already exists:

```markdown
1. Load existing library
2. Extract patterns from NEW iterations only
3. Merge with existing patterns:
   - Keep patterns with highest success metrics
   - Remove duplicates (similar patterns)
   - Maintain 3-5 patterns per category limit
   - Increment version number
   - Update metadata
```
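As a rough illustration of the merge step above, here is a minimal Python sketch. It assumes the library layout from Step 6, a `MAJOR.MINOR` version string, and a hypothetical numeric `score` field on freshly extracted patterns that is used only for ranking; the real command applies the agent's judgment rather than a fixed metric.

```python
import json
from datetime import datetime, timezone

MAX_PER_CATEGORY = 5  # "deep" limit; use 3 for "quick"

def merge_libraries(existing_path: str, new_patterns: dict) -> dict:
    """Merge freshly extracted patterns into an existing library (sketch).

    `new_patterns` maps category -> list of pattern dicts; each new dict is
    assumed to carry a numeric "score" (hypothetical) used for ranking only.
    """
    with open(existing_path, encoding="utf-8") as handle:
        library = json.load(handle)

    for category, candidates in new_patterns.items():
        merged = library["patterns"].get(category, []) + candidates
        # Remove duplicates by name, keeping the higher-scoring copy
        best: dict[str, dict] = {}
        for pattern in merged:
            key = pattern["name"].lower()
            if key not in best or pattern.get("score", 0) > best[key].get("score", 0):
                best[key] = pattern
        ranked = sorted(best.values(), key=lambda p: p.get("score", 0), reverse=True)
        library["patterns"][category] = ranked[:MAX_PER_CATEGORY]

    # Increment the minor version and refresh the timestamp
    major, minor = library["version"].split(".")
    library["version"] = f"{major}.{int(minor) + 1}"
    library["last_updated"] = datetime.now(timezone.utc).isoformat()
    return library
```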
## Validation

Before saving the pattern library:

```markdown
Validate that:
- JSON is well-formed
- Each pattern has all required fields
- Code snippets are valid (if applicable)
- Success metrics are specific and measurable
- Examples are diverse within each category
- Version number is incremented correctly
```

## Output Report

Generate a summary report:

```markdown
# Pattern Extraction Report

## Analysis Summary
- Iterations analyzed: {count}
- Analysis depth: {quick|deep}
- Patterns extracted: {total}

## Patterns by Category

### Structural Patterns ({count})
1. {pattern_name}: {brief_description}
2. {pattern_name}: {brief_description}
...

### Content Quality Patterns ({count})
1. {pattern_name}: {brief_description}
2. {pattern_name}: {brief_description}
...

### Innovation Patterns ({count})
1. {pattern_name}: {brief_description}
2. {pattern_name}: {brief_description}
...

### Quality & Testing Patterns ({count})
1. {pattern_name}: {brief_description}
2. {pattern_name}: {brief_description}
...

## Exemplary Iterations
- Best structural: {file_path}
- Best content: {file_path}
- Most innovative: {file_path}
- Highest quality: {file_path}

## Pattern Library Saved
Location: {pattern_library_path}
Version: {version}

## Recommendations
- Use {pattern_name} for structural consistency
- Apply {pattern_name} for content quality
- Consider {pattern_name} for innovation
- Implement {pattern_name} for robustness
```

## Notes

- Pattern extraction is automatic but can be manually refined
- The library grows with each wave but maintains its size limit (3-5 patterns per category)
- Patterns serve as multi-shot examples for future iterations
- The quality bar rises naturally as better patterns are discovered
- The pattern library is spec-agnostic and can be reused across projects

## Related Commands

- `/project:infinite-synthesis` - Main loop using the pattern library
- `/project:analyze-patterns` - Analyze pattern library effectiveness