Phases:
Phase 1: Initialization (Context Setup)
• Define domain and objectives.
• Assemble initial context components.
• Set token budget and prioritize context (sketched below).
• Establish success criteria.
Checks:
• Ask clarifying questions if confidence < 90%.
• Validate context sources.
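A minimal sketch of the Phase 1 budgeting step. It assumes a crude word-count token estimate; `ContextComponent`, `estimate_tokens`, and `fit_to_budget` are illustrative names, not part of the framework itself.
```python
from dataclasses import dataclass

@dataclass
class ContextComponent:
    """One candidate piece of context: instructions, examples, retrieved data, etc."""
    name: str
    text: str
    priority: int  # lower value = more important

def estimate_tokens(text: str) -> int:
    # Rough heuristic (an assumption): about 0.75 words per token for English prose.
    return int(len(text.split()) / 0.75)

def fit_to_budget(components: list[ContextComponent], budget: int) -> list[ContextComponent]:
    """Keep the highest-priority components that fit inside the token budget."""
    selected, used = [], 0
    for comp in sorted(components, key=lambda c: c.priority):
        cost = estimate_tokens(comp.text)
        if used + cost <= budget:
            selected.append(comp)
            used += cost
    return selected
```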
Phase 2: Context Curation (Building Context Window)
• Apply Analytical Lenses (Pattern Analyzer, System Evaluator, Integration Specialist, Documentation Observer, Innovation Explorer).
• Curate context by retrieving, summarizing, compressing, and ordering data (see the curation sketch below).
• Integrate tool outputs.
Checks:
• Clarification checkpoint.
• Validate against context issues.
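One way the curation step could look, under stand-in assumptions: keyword overlap substitutes for real retrieval, truncation substitutes for summarization and compression, and `curate` and `relevance` are hypothetical helpers.
```python
def relevance(query: str, text: str) -> float:
    """Crude relevance score: fraction of query terms that appear in the text."""
    terms = set(query.lower().split())
    return len(terms & set(text.lower().split())) / max(len(terms), 1)

def curate(query: str, documents: list[str], budget: int, max_chars: int = 500) -> list[str]:
    """Retrieve, compress, order, and trim documents to fit the context window."""
    hits = [d for d in documents if relevance(query, d) > 0]           # retrieve
    compressed = [d[:max_chars] for d in hits]                         # truncation stands in for summarization
    compressed.sort(key=lambda d: relevance(query, d), reverse=True)   # most relevant first
    selected, used = [], 0
    for doc in compressed:
        cost = int(len(doc.split()) / 0.75)                            # same rough token estimate as Phase 1
        if used + cost <= budget:
            selected.append(doc)
            used += cost
    return selected
```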
Phase 3: Modeling (Task Execution with Context)
• Model task or system using curated context.
• Map component relationships (illustrated in the sketch below).
• Identify edge cases.
• Validate model against context.
Checks:
• Clarification checkpoint.
• Validate against context failures.
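A sketch of the relationship-mapping and edge-case checks, assuming components are named and their dependencies are known; both function names are hypothetical.
```python
def build_relationship_map(dependencies: dict[str, list[str]]) -> dict[str, set[str]]:
    """Map each modeled component to the set of components it depends on."""
    return {name: set(deps) for name, deps in dependencies.items()}

def missing_context(graph: dict[str, set[str]], curated_names: set[str]) -> set[str]:
    """Edge-case check: which referenced components lack supporting curated context?"""
    referenced = set(graph) | {dep for deps in graph.values() for dep in deps}
    return referenced - curated_names

# Example: graph = build_relationship_map({"api": ["auth", "db"], "auth": ["db"]})
# missing_context(graph, {"api", "auth"}) flags "db" as uncovered by curated context.
```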
Phase 4: Evolution (Context Refinement)
• Monitor context performance.
• Refine context based on feedback or new data (see the refinement sketch below).
• Suggest optimizations.
Checks:
• Clarification checkpoint.
• Check for context failures.
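A small sketch of feedback-driven refinement, assuming each context component accumulates a 0-1 usefulness score; the threshold and `refine_priorities` are illustrative choices only.
```python
def refine_priorities(priorities: dict[str, int], feedback: dict[str, float],
                      threshold: float = 0.5) -> dict[str, int]:
    """Demote components whose observed usefulness falls below the threshold."""
    refined = {}
    for name, priority in priorities.items():
        score = feedback.get(name, 1.0)          # unscored components keep their rank
        refined[name] = priority + 1 if score < threshold else priority
    return refined
```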
Phase 5: Synthesis (Output & Recommendations)
• Summarize curated context and insights.
• Deliver structured outputs (see the synthesis sketch below).
• Recommend improvements and next steps.
• Identify unresolved issues.
Checks:
• Clarification checkpoint.
• Expert review simulation.
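A minimal sketch of a structured Phase 5 deliverable, assuming a JSON report is an acceptable output format; the field names are assumptions.
```python
import json

def synthesize(context_summary: list[str], insights: list[str],
               recommendations: list[str], unresolved: list[str]) -> str:
    """Package the Phase 5 deliverable as a structured JSON report."""
    return json.dumps({
        "context_summary": context_summary,
        "insights": insights,
        "recommendations": recommendations,
        "unresolved_issues": unresolved,
    }, indent=2)
```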
Closing:
• User Prompt: Test or refine the framework, generate a visualization, or explore specific techniques?
• AI Prompt: Recommend validation through a sample task.
Key Enhancements:
• Dynamic pipeline replacing the static structure (orchestration sketched below).
• Scalable context management.
• Context-oriented reasoning protocol.
• Explicit context failure mitigation.
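One possible wiring of the dynamic pipeline, assuming each phase is a callable over a shared state dict and that a phase flags problems via a hypothetical "context_failure" key.
```python
from typing import Callable

Phase = Callable[[dict], dict]  # each phase reads and returns a shared state dict

def run_pipeline(phases: list[Phase], state: dict, max_passes: int = 3) -> dict:
    """Run the phases in order; repeat the pass if any phase flags a context failure."""
    for _ in range(max_passes):
        for phase in phases:
            state = phase(state)
        if not state.get("context_failure"):
            break  # no failure flagged: the dynamic loop has converged
    return state
```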
Optional Visualization:
• Flowchart illustrating the dynamic context pipeline phases (a generation sketch follows).
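A sketch of how the flowchart could be generated, assuming the graphviz Python package is available; the feedback edge from Evolution back to Curation is an interpretation, not stated in the phases above.
```python
from graphviz import Digraph  # assumes the 'graphviz' package and system Graphviz binaries

def pipeline_flowchart() -> Digraph:
    """Render the five phases as a flowchart, with an assumed feedback edge from Evolution."""
    dot = Digraph(comment="Dynamic context pipeline")
    labels = ["Initialization", "Context Curation", "Modeling", "Evolution", "Synthesis"]
    for i, label in enumerate(labels, start=1):
        dot.node(f"P{i}", f"Phase {i}: {label}")
    for i in range(1, len(labels)):
        dot.edge(f"P{i}", f"P{i + 1}")
    dot.edge("P4", "P2", label="refined context")  # assumed refinement loop back into curation
    return dot

# pipeline_flowchart().render("context_pipeline", format="png")
```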