Best Practices
Learn proven techniques for creating accurate, reliable analyses that drive better decisions.
Foundation Principles
Start with Clear Objectives
Before building any model, clearly define:
- What decision(s) are you trying to make?
- When do you need to make them?
- What specific questions need answers?
- What level of accuracy is needed?
- Who will use the results and how?
Embrace "Good Enough" Modeling
Perfect models don't exist. Focus on building models that are:
- Useful: Address the key decision
- Understandable: Stakeholders can follow the logic
- Actionable: Lead to clear next steps
- Timely: Available when the decision needs to be made
Model Building Best Practices
Keep It Simple
Start with the simplest model that addresses your question, then add complexity only if needed:
- Begin with 3-5 key Areas of Uncertainty or Decision
- Test simple versions before building complex ones
Validate Your Inputs
Your outputs are only as good as your inputs:
- Use multiple sources: Don't rely on single data points
- Check historical data: How accurate were past estimates?
- Get expert input: Validate assumptions with people who know the domain
- Test extreme values: Make sure your ranges are realistic
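One practical way to check historical data is to score how accurate past estimates actually were. The sketch below uses mean absolute percentage error (MAPE) as the accuracy measure; the model, data, and numbers are purely illustrative.

```python
# Hypothetical example: score past estimates against actual outcomes
# using mean absolute percentage error (MAPE). All numbers are illustrative.

def mape(estimates, actuals):
    """Mean absolute percentage error across paired estimates and actuals."""
    errors = [abs(e - a) / abs(a) for e, a in zip(estimates, actuals)]
    return sum(errors) / len(errors)

past_estimates = [100, 250, 80, 400]   # what was forecast at the time
past_actuals   = [120, 230, 95, 500]   # what actually happened

print(f"Historical MAPE: {mape(past_estimates, past_actuals):.1%}")
```

A consistently high error on past estimates is a signal to widen the ranges you use in new models.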
Document Everything
Future you (and others) will thank you:
- Record data sources and assumptions
- Explain why you chose the specific alternative values for each Area of Uncertainty or Decision, as well as the probability assigned to each uncertainty case
- Note any limitations or concerns
- Keep track of model versions and changes
Avoid the Overconfidence Trap
People tend to be overconfident in their estimates. Counter this tendency by actively challenging your assumptions:
- Make ranges wider than your first instinct
- Ask "What could go wrong?" and "What could go better than expected?"
- Challenge assumptions with devil's advocate thinking
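Overconfidence can be measured. One simple check, sketched below with illustrative numbers, is interval coverage: if your stated 80% ranges were well calibrated, roughly 80% of actual outcomes should have landed inside them.

```python
# Minimal calibration check (illustrative data): coverage well below the
# stated confidence level is the signature of overconfident, too-narrow ranges.

def coverage(ranges, actuals):
    """Fraction of actual outcomes that fell inside the stated (low, high) range."""
    hits = sum(1 for (lo, hi), a in zip(ranges, actuals) if lo <= a <= hi)
    return hits / len(actuals)

stated_80pct_ranges = [(90, 110), (200, 300), (70, 90), (350, 450)]
actual_outcomes     = [120, 230, 95, 500]

print(f"Coverage of stated 80% ranges: {coverage(stated_80pct_ranges, actual_outcomes):.0%}")
```

If coverage comes in far below the stated level, widen your ranges until history says they are honest.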
Sanity Check
Always test your model:
- Extreme value testing: Set variables to min/max and see if results make sense
- Order of magnitude checks: Are results in the right ballpark?
- Trend analysis: Do results change in expected directions when inputs change?
- Historical comparison: How do results compare to past similar situations?
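Extreme value testing can be automated. The sketch below uses a hypothetical profit model with made-up ranges: each input is pushed to its minimum and maximum in turn while the others stay at base case, and you eyeball each result for plausibility.

```python
# Extreme-value testing sketch for a hypothetical profit model.
# Ranges and base-case values are illustrative assumptions.

def profit(units_sold, price, unit_cost, fixed_cost):
    return units_sold * (price - unit_cost) - fixed_cost

ranges = {
    "units_sold": (0, 50_000),
    "price": (5.0, 15.0),
    "unit_cost": (2.0, 8.0),
    "fixed_cost": (10_000, 60_000),
}
base = {"units_sold": 20_000, "price": 10.0, "unit_cost": 4.0, "fixed_cost": 30_000}

for name, (lo, hi) in ranges.items():
    for extreme in (lo, hi):
        result = profit(**{**base, name: extreme})
        print(f"{name}={extreme:>10}: profit = {result:,.0f}")
# Ask of each line: do the sign and magnitude make sense?
# e.g. units_sold = 0 should give profit = -fixed_cost.
```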
Sensitivity Testing
Understand what drives your results:
- Identify the most influential variables
- Test different distribution assumptions
- Vary correlation assumptions
- Check robustness to outliers
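A simple way to identify the most influential variables is one-at-a-time sensitivity: swing each input across its range while holding the others at base case, then rank inputs by the size of the output swing (the ordering behind a tornado chart). The model and ranges below are hypothetical.

```python
# One-at-a-time sensitivity sketch (hypothetical model and ranges).

def profit(units_sold, price, unit_cost, fixed_cost):
    return units_sold * (price - unit_cost) - fixed_cost

base = {"units_sold": 20_000, "price": 10.0, "unit_cost": 4.0, "fixed_cost": 30_000}
ranges = {
    "units_sold": (5_000, 50_000),
    "price": (8.0, 12.0),
    "unit_cost": (3.0, 6.0),
    "fixed_cost": (20_000, 45_000),
}

swings = {}
for name, (lo, hi) in ranges.items():
    low_out = profit(**{**base, name: lo})    # everything at base except this input
    high_out = profit(**{**base, name: hi})
    swings[name] = abs(high_out - low_out)

# Largest swing first: these are the inputs worth refining.
for name, swing in sorted(swings.items(), key=lambda kv: -kv[1]):
    print(f"{name:>10}: output swing = {swing:,.0f}")
```

Inputs with small swings can usually be frozen at base case; inputs with large swings deserve better data and wider scrutiny.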
Communication Best Practices
Lead with Insights
Structure your communication around key insights:
- Bottom line up front: What's the recommendation?
- Key insights: What did you learn?
- Supporting evidence: What analysis supports this?
- Methodology: How did you do the analysis?
Address Uncertainty Honestly
Don't hide uncertainty; embrace it:
- Show ranges of outcomes and cumulative likelihoods, not just point estimates
- Discuss key assumptions and their impact
- Acknowledge what you don't know
- Suggest ways to reduce uncertainty if needed
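Showing ranges and cumulative likelihoods can be as simple as reporting percentiles of a sampled outcome. The sketch below assumes a hypothetical triangular distribution for the outcome; only the reporting pattern is the point.

```python
# Reporting a range and cumulative likelihoods instead of a point estimate,
# using a small Monte Carlo sample of a hypothetical outcome.
import random

random.seed(42)
# Hypothetical assumption: outcome ~ triangular(low=50, mode=100, high=300)
samples = sorted(random.triangular(50, 300, 100) for _ in range(10_000))

def percentile(sorted_samples, p):
    """Value below which fraction p of the samples fall."""
    idx = min(int(p * len(sorted_samples)), len(sorted_samples) - 1)
    return sorted_samples[idx]

for p in (0.10, 0.50, 0.90):
    print(f"P{int(p * 100):02d}: {percentile(samples, p):,.0f}")
# Report e.g. "there is an 80% chance the outcome falls between P10 and P90",
# rather than a single number.
```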
Make It Visual
Use charts effectively:
- Choose chart types that support your message
- Highlight key insights with annotations
- Use consistent colors and formatting
- Avoid chart junk that distracts from the message
- Don't put too many charts in a single slide
Organizational Best Practices
Build Modeling Capability
Create sustainable modeling practices:
- Train multiple people, not just one expert
- Develop standard templates and approaches
- Create review processes for important analyses
- Build a library of past analyses for reference
Establish Governance
For critical decisions, establish clear processes:
- Define who can approve model assumptions
- Require peer review for high-stakes analyses
- Document model validation procedures
- Archive models and results for future reference
Learn from Results
Track how your models perform:
- Compare predictions to actual outcomes through lookback analyses whenever possible
- Identify patterns in modeling errors
- Update future models based on lessons learned
- Share learnings across the organization
Ethical Considerations
Avoid Bias
Be aware of potential biases:
- Confirmation bias: Looking for results that confirm preconceptions
- Anchoring bias: Being overly influenced by initial estimates
- Availability bias: Overweighting recent or memorable events
- Optimism bias: Being unrealistically positive about outcomes
Consider Stakeholder Impact
Think about who is affected by your analysis:
- Are there groups who might be harmed by the decision?
- Have you considered unintended consequences?
- Are you transparent about limitations and assumptions?
- Do stakeholders understand the uncertainty in your results?
Final Recommendations
Remember that the goal of analysis is better decisions, not perfect models. Focus on:
- Clarity: Make your analysis easy to understand and act on
- Honesty: Be transparent about uncertainty and limitations
- Relevance: Address the actual decision being made
- Timeliness: Deliver insights when they're needed
The best analysis is the one that leads to better decisions. Keep that goal in mind, and you'll create valuable, actionable insights.