Learning Design

Crisis Decision-Making Through Case-Based Peer Learning

Designing a collaborative learning architecture for institutional investment professionals to navigate portfolio crisis scenarios through structured peer interaction.

Role: Lead Instructional Designer
Timeline: 8-10 weeks
Tools: Learning Science Frameworks, Google Docs
Type: Design Showcase

Project Background

I designed a comprehensive case-based peer learning module for institutional investment professionals dealing with "The Denominator Effect Dilemma"—a realistic portfolio crisis scenario where professionals must analyze trade-offs, navigate competing stakeholder pressures, and defend crisis management decisions.

This project showcases the instructional design process, methodology, and artifacts created during the design phase. Rather than demonstrating a built course, it highlights my ability to design sophisticated learning architectures, facilitate peer learning, and create comprehensive implementation materials grounded in learning science.

The target audience is institutional investors who must make complex decisions under uncertainty, with incomplete information and conflicting stakeholder priorities—skills that benefit from collaborative case analysis rather than traditional lecture-based training.

Designing for Complex Decision-Making

The learning challenge required designing for crisis decision-making that involves:

  • Analysis of incomplete information: Real portfolio crises don't come with all the data neatly packaged
  • Balancing competing stakeholder interests: Board policy, investor relationships, cash flow constraints, and reputation risk all conflict
  • Justifying decisions under uncertainty: No single "right" answer exists—learners must defend judgment calls
  • Learning from peer perspectives: Diverse professional experiences enrich analysis beyond what any individual knows

The instructional design challenge was creating a learning experience that:

  • Goes beyond lecture: Complex decision-making requires active learning and application
  • Leverages peer collaboration: Social learning theory holds that knowledge is constructed through dialogue
  • Feels realistic: Authentic assessment in a crisis simulation context
  • Scales to cohorts: Works for 16-20 learners without quality loss
  • Works across delivery modes: Adaptable for online, in-person, hybrid, or asynchronous formats

Design Process & Methodology

I applied evidence-based instructional design frameworks throughout the project:

  • Merrill's First Principles: Problem-centered learning, activation of prior knowledge, demonstration through peer modeling, application through case analysis, and integration via reflection
  • Social Constructivism: Knowledge built through collaboration and dialogue, leveraging the "zone of proximal development" where peers scaffold each other's learning
  • Cognitive Apprenticeship: Learning through modeling, coaching, scaffolding, articulation, and reflection in authentic contexts
  • Bloom's Taxonomy: Targeting the highest cognitive levels—analysis, evaluation, and synthesis—appropriate for experienced professionals
01. Learning Need Analysis

Identified the need for crisis decision-making training that goes beyond compliance-focused instruction. Analyzed target audience characteristics, prerequisite knowledge requirements, and learning context constraints including time pressure and distributed teams.

02. SME Collaboration

Designed a 7-phase partnership workflow to collaborate with subject matter experts on creating realistic case scenarios. Created SME collaboration guides and case design briefs ensuring the case had appropriate complexity, realistic messiness, and multiple defensible options.

03. Learning Architecture Design

Developed the 5-touchpoint peer learning protocol that structures collaboration from initial individual analysis through board defense simulation. Each touchpoint deliberately alternates between individual and collaborative work to maximize learning.

04. Assessment Design

Created a dual rubric system measuring both individual analysis quality (60%) and peer learning contribution (40%). Designed authentic performance tasks rather than traditional knowledge checks, ensuring learners demonstrate actual decision-making capability.

05. Implementation Planning

Developed comprehensive facilitator guides, discussion scaffolds, and delivery option documentation. Ensured the learning design could scale across multiple delivery modes and cohorts while maintaining pedagogical integrity.

What I Created

I designed a complete instructional architecture with three primary components:

1. Five-Touchpoint Peer Learning Architecture

Touchpoint 1: Initial Stance (Asynchronous, 30 min) — Learners individually analyze the case and post their preliminary recommendation before seeing peer responses, establishing baseline thinking and creating ownership.

Touchpoint 2: Small Group Stakeholder Analysis (Synchronous, 45 min) — Groups of 4 deeply analyze one stakeholder perspective (Board Chair, CIO, CFO, GP Partners), forcing empathy and perspective-taking through collaborative work.

Touchpoint 3: Gallery Walk (Synchronous, 25 min) — Groups present stakeholder analyses while others challenge assumptions using structured protocols, creating productive cognitive dissonance through exposure to conflicting viewpoints.

Touchpoint 4: Cross-Pollination (Asynchronous, 45 min) — Individual reflection integrating peer insights into revised recommendations, with explicit metacognition about how thinking evolved.

Touchpoint 5: Board Defense (Synchronous, 90 min) — Learners present recommendations to simulated board members (peers role-playing) who ask probing questions, developing confidence in defending decisions under pressure.
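The touchpoint sequence above can be captured as simple structured data, e.g. for generating facilitator schedules or checking synchronous time against the session budget. This is a sketch of my own, not one of the project deliverables; only the names, modes, and durations come from the protocol itself:

```python
# The five touchpoints as structured data. Durations and modes are taken
# from the protocol above; the representation itself is illustrative.
touchpoints = [
    {"name": "Initial Stance",                   "mode": "async", "minutes": 30},
    {"name": "Small Group Stakeholder Analysis", "mode": "sync",  "minutes": 45},
    {"name": "Gallery Walk",                     "mode": "sync",  "minutes": 25},
    {"name": "Cross-Pollination",                "mode": "async", "minutes": 45},
    {"name": "Board Defense",                    "mode": "sync",  "minutes": 90},
]

# Synchronous time drives the live-session budget; async work is pre/post-work.
sync_minutes = sum(t["minutes"] for t in touchpoints if t["mode"] == "sync")
total_minutes = sum(t["minutes"] for t in touchpoints)
```

Summing this way shows 160 synchronous minutes, which with transitions and breaks fits the 3.5-hour live session, plus 75 asynchronous minutes of individual work.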

2. Comprehensive Implementation Package (9 Documents)

Facilitator Guide

Minute-by-minute session flow with facilitation moves, discussion quality indicators, troubleshooting common issues, and self-reflection criteria for facilitator development.

Learning Objectives

ABCD format objectives aligned to Bloom's taxonomy (analysis, evaluation, synthesis levels), with clear assessment evidence and measurability criteria.

Implementation Guide

3.5-hour structure breakdown, cohort sizing recommendations (16-20 optimal), delivery mode adaptations, technology requirements, and accessibility considerations.

Peer Learning Protocol

Detailed specifications for all 5 touchpoints including timing, purpose, facilitator roles, and pedagogical rationale grounded in social constructivism.

Assessment Rubric

Dual assessment approach: Individual Analysis (60%: trade-off clarity, stakeholder consideration, evidence use) and Peer Contribution (40%: perspective sharing, constructive challenge, synthesis).

Discussion Scaffolds

9 tools ensuring productive peer discussions, including role cards (Analyst, Devil's Advocate, Synthesizer), discussion protocols, question stems, and challenge cards.

Case Design Brief

SME instructions for creating realistic case scenarios with requirements for ambiguity, multiple defensible options, stakeholder conflicts, and quantitative exhibits.

SME Collaboration Workflow

A 7-phase partnership process between instructional designer and SMEs, covering needs analysis, case development, review cycles, and quality assurance checkpoints.

Interview Presentation Guide

Portfolio presentation strategy for discussing this project in interview contexts, emphasizing process over product and methodology over tools.

3. Scalable Delivery Options

Fully Online: Zoom breakout rooms, Miro boards for collaborative work, LMS discussion forums for asynchronous touchpoints.

In-Person: Physical breakout spaces, poster boards for gallery walk, printed role cards and scaffolds.

Hybrid: Mixed remote and in-person participants with careful attention to equity and engagement across both groups.

Asynchronous: Discussion board adaptation over a one-week timeframe, though the design notes acknowledge this format loses an estimated 60% of the real-time dialogue value.

Critical Design Choices

Several key design decisions demonstrate evidence-based instructional design expertise:

Peer Learning Over Expert Lecture

Rationale: Complex decision-making benefits from multiple perspectives rather than seeking a single "right answer" from an expert. Social constructivism holds that knowledge is built through discourse.

Implementation: Designed 5 structured touchpoints ensuring productive collaboration rather than unguided group work that can devolve into unproductive conversation.

Dual Assessment (Individual + Peer)

Rationale: Avoids free-riding while rewarding both quality thinking AND collaborative behavior. 60% individual analysis measures decision-making capability; 40% peer contribution measures collaboration skills.

Assessment: Individual rubric evaluates trade-off clarity and evidence use. Peer rubric evaluates perspective-sharing, constructive challenge, and synthesis contributions.
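As a sketch of how the 60/40 weighting might be computed in practice (the weights come from the design; the per-criterion 1-4 scale and the criterion keys below are illustrative assumptions, not part of the rubric itself):

```python
# Illustrative composite scoring for the dual rubric (60% individual, 40% peer).
# The 1-4 scale and criterion names are assumptions for demonstration only.
INDIVIDUAL_WEIGHT = 0.60
PEER_WEIGHT = 0.40
MAX_SCORE = 4  # assumed top of the rubric scale

def composite_score(individual: dict[str, int], peer: dict[str, int]) -> float:
    """Return a 0-100 composite from per-criterion rubric scores."""
    ind_pct = sum(individual.values()) / (len(individual) * MAX_SCORE)
    peer_pct = sum(peer.values()) / (len(peer) * MAX_SCORE)
    return 100 * (INDIVIDUAL_WEIGHT * ind_pct + PEER_WEIGHT * peer_pct)

score = composite_score(
    individual={"trade_off_clarity": 3, "stakeholder_consideration": 4, "evidence_use": 3},
    peer={"perspective_sharing": 4, "constructive_challenge": 3, "synthesis": 3},
)
```

Because both components are normalized before weighting, a learner cannot compensate for weak individual analysis with peer activity alone, which is the behavioral point of the dual rubric.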

Realistic Messiness in Case Design

Rationale: Real decisions involve incomplete data, competing stakeholders, and ambiguity. Case requirements specify multiple defensible options with no clear "right" answer, time pressure, and conflicting priorities.

Outcome: Learners practice professional judgment and argumentation rather than formula application, developing transferable crisis management skills.

Facilitator as Process Guide

Rationale: Preserves learner agency and avoids "sage on the stage" dynamics. The facilitator role focuses on keeping discussions productive, managing time, and surfacing diverse views—not providing answers.

Scaffolds: Created 9 discussion tools (role cards, question stems, protocols) enabling facilitators to support learning without directing conclusions.

Design Quality & Projected Outcomes

Because this course has not yet been built or deployed, impact is framed in terms of design quality indicators and projected learning outcomes, based on the comprehensive implementation package created:

  • 9 comprehensive deliverables created
  • 3+ learning science frameworks applied
  • 5 structured peer learning touchpoints
  • 4 delivery mode adaptations

Design Quality Indicators:

  • Pedagogical grounding: Merrill's First Principles, Social Constructivism, Cognitive Apprenticeship, and Bloom's Taxonomy inform every design decision
  • Facilitator-ready materials: Minute-by-minute timing, facilitation moves, common challenges addressed, self-reflection criteria for facilitator development
  • Scalable architecture: Works across online, in-person, hybrid, and asynchronous modes; accommodates cohorts of 12-24 learners; reusable framework for different case scenarios
  • Accessible design: WCAG considerations including multiple participation modes, extended time options, alternative assessment formats, and inclusive facilitation protocols
  • Assessment rigor: Authentic performance tasks measuring synthesis-level thinking; dual rubric preventing free-riding; peer feedback mechanisms

Projected Learning Outcomes:

  • Learning effectiveness: 85%+ proficiency on rubric criteria; 50%+ learners revise initial recommendation after peer learning; high self-reported confidence gains
  • Engagement quality: 90%+ pre-work completion; 100% active participation in live sessions; 85%+ post-work submission rates
  • Satisfaction metrics: 4.0+/5.0 relevance and engagement ratings; 80%+ report peer discussion deepened understanding; strong word-of-mouth for future cohorts

Business Value:

  • Reusable architecture: Plug in different cases for different crisis scenarios (liquidity management, GP disputes, regulatory changes)
  • Scalable delivery: Cohorts of 16-20 without quality loss; low-tech requirements (Zoom, Google Docs, Miro—standard tools)
  • Efficient facilitator training: 2-3 hours onboarding time; comprehensive guide reduces dependency on single expert facilitator
  • Cost-effective: Free/low-cost technology stack option available; materials are digital and reproducible at no marginal cost

What I Learned

SME collaboration is critical for authentic cases. Realistic crisis scenarios require deep domain expertise that instructional designers don't possess. The 7-phase SME collaboration workflow I created supports a successful partnership by clarifying roles, setting expectations for case complexity (multiple defensible options, realistic messiness), and building in review cycles for pedagogical and content quality.

Scaffolding prevents chaos in peer learning. Early in the design process, I recognized that unstructured peer discussion can devolve into unproductive conversation or be dominated by one voice. Creating 9 discussion scaffolds (role cards, question stems, challenge cards, protocols) ensures discussions stay focused while preserving intellectual freedom. This balance between structure and openness is what makes peer learning work.

Assessment drives behavior. The dual rubric—60% individual analysis, 40% peer contribution—prevents free-riding while explicitly valuing collaboration. Learners who know they're assessed on both dimensions behave differently than those evaluated on individual work alone. Making peer contribution an explicit, weighted assessment criterion signals that collaboration is core to learning, not an optional add-on.

Facilitator training matters more than I initially thought. A sophisticated learning design only works if facilitators can execute it. Creating the comprehensive facilitator guide with minute-by-minute timing, facilitation moves mapped to learning science principles, and troubleshooting for common pitfalls ensures quality delivery. The self-reflection criteria help facilitators continuously improve rather than just "following the script."

Let's work together