
Cross-Disciplinary Software Team Spaces

A Pattern Language

Display of Work

Summary

Leave prototypes and work-in-progress visible. This invites curiosity and helps ideas spread across teams.

Context

Teams create valuable artifacts, prototypes, and work-in-progress that could inspire and inform other teams if made visible. Success requires a culture that values transparency and psychological safety around sharing imperfect work, as well as a commitment to ongoing curation and maintenance.

Problem

When teams keep their work hidden, they miss opportunities to learn from each other’s approaches and to discover unexpected connections between projects. Empty display spaces signal that sharing work is neither expected nor valued.

Solution

Create dedicated spaces for displaying current work, prototypes, and visual artifacts so that team members can see and be inspired by each other’s progress. Support these spaces with systematic display design, rotation protocols, and engagement techniques.

Display Design Guidelines:

Physical Display Infrastructure:

Digital Display Integration:

Display Layout Principles:

Rotation Protocols:

Content Lifecycle Management:

  1. Active Display Phase (2-4 weeks):
    • Current work-in-progress prominently featured
    • Daily updates encouraged for rapidly evolving projects
    • Team members add sticky note comments and questions
    • Progress photos and iteration documents accumulated
  2. Mature Display Phase (2-3 weeks):
    • Completed work with lessons learned documents
    • Success metrics and outcomes clearly visible
    • Connection points to other projects highlighted
    • Invitation for others to build upon or adapt approaches
  3. Archive Transition Phase (1 week):
    • Digital documentation created before physical removal
    • Key artifacts moved to permanent collection or storage
    • Impact stories and collaboration outcomes documented
    • Space prepared for incoming work displays
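
Curators can keep rotation dates honest with a small amount of tooling. The sketch below is a minimal Python illustration of tracking an item’s position in the lifecycle above; the specific durations chosen from each range, and the DisplayItem fields, are assumptions to be adapted locally.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Phase durations drawn from the lifecycle above; the exact week counts within
# each stated range are assumptions to tune per organization.
PHASES = [
    ("Active Display", timedelta(weeks=3)),      # 2-4 weeks
    ("Mature Display", timedelta(weeks=2)),      # 2-3 weeks
    ("Archive Transition", timedelta(weeks=1)),  # 1 week
]


@dataclass
class DisplayItem:
    title: str
    team: str
    displayed_on: date

    def current_phase(self, today: date) -> tuple[str, date]:
        """Return the item's current phase and the date of the next transition."""
        boundary = self.displayed_on
        for name, duration in PHASES:
            boundary += duration
            if today < boundary:
                return name, boundary
        return "Archived", boundary


# An item put up on 1 March is still in its Active Display phase two weeks later.
item = DisplayItem("Checkout prototype", "Payments", date(2024, 3, 1))
print(item.current_phase(date(2024, 3, 15)))  # ('Active Display', datetime.date(2024, 3, 22))
```

Printed next to each display, the computed transition date can double as a visible “rotates on” label.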

Rotation Schedule Framework:

Curator Role Implementation:

Engagement Techniques:

Interactive Elements:

  1. Annotation Opportunities:
    • Sticky note parking lots for questions and observations
    • Shared whiteboard space next to displays for collaborative sketching
    • Comment QR codes linking to digital discussion threads (see the sketch after this list)
    • “Build on this” invitation cards for extending displayed work
  2. Guided Discovery Activities:
    • Weekly “Gallery Walks”: 30-minute structured tours with work creators as guides
    • Monthly “Cross-Pollination Sessions”: Facilitated discussions connecting work across teams
    • Quarterly “Innovation Archaeology”: Deep dives into how displayed work evolved over time
    • “Maker Rounds”: Peer review of work-in-progress (like medical rounds)
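
Comment QR codes, mentioned above, can be produced with off-the-shelf tooling. A minimal sketch, assuming the open-source qrcode package and a placeholder discussion-thread URL scheme (discuss.example.com and the display identifiers are hypothetical):

```python
# Assumes the open-source `qrcode` package (pip install "qrcode[pil]").
import qrcode


def make_display_qr(display_id: str, base_url: str = "https://discuss.example.com/displays") -> str:
    """Create a QR code image pointing at a display's discussion thread; return the file path."""
    thread_url = f"{base_url}/{display_id}"   # hypothetical thread URL scheme
    img = qrcode.make(thread_url)             # returns a PIL image
    path = f"qr_{display_id}.png"
    img.save(path)                            # print this and pin it next to the physical display
    return path


print(make_display_qr("payments-checkout-prototype"))
```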

Social Proof Mechanisms:

  1. Visibility Metrics:
    • Engagement tracking: Simple dot voting for “this inspired me” or “I learned from this” (a tally sketch follows this list)
    • Collaboration stories: Documented cases where displayed work led to team cooperation
    • Iteration evidence: Before/after photos showing how feedback influenced development
    • Connection mapping: Visual network of how displayed work influenced other projects
  2. Recognition Systems:
    • “Display of the Month”: Community voting for most engaging or educational display
    • “Cross-Pollination Awards”: Recognition for work that inspired collaboration across teams
    • “Process Transparency Champions”: Celebrating teams that share authentic work-in-progress
    • “Evolution Stories”: Showing how displayed work changed based on community feedback
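
Dot votes and reactions only become social proof if someone tallies them. A minimal sketch, assuming votes are transcribed into simple (display, category) records; all display names are hypothetical:

```python
from collections import Counter, defaultdict

# Hypothetical engagement log: one record per physical dot or digital reaction.
votes = [
    ("checkout-prototype", "this inspired me"),
    ("checkout-prototype", "I learned from this"),
    ("onboarding-flow-map", "this inspired me"),
    ("checkout-prototype", "this inspired me"),
]

totals = Counter(display for display, _ in votes)
by_category: dict[str, Counter] = defaultdict(Counter)
for display, category in votes:
    by_category[display][category] += 1

# "Display of the Month" candidate: the most-engaged display this period.
winner, count = totals.most_common(1)[0]
print(winner, count)              # checkout-prototype 3
print(dict(by_category[winner]))  # {'this inspired me': 2, 'I learned from this': 1}
```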

Structured Engagement Events:

  1. “Demo Derby” Sessions (Weekly, 30 minutes):
    • Rapid-fire 3-minute presentations of newly displayed work
    • Q&A focused on process and lessons learned rather than just outcomes
    • Explicit invitation for others to build upon or adapt approaches
    • Document spontaneous collaboration agreements
  2. “Work-in-Progress Critiques” (Bi-weekly, 45 minutes):
    • Architecture studio-style critique sessions adapted for software teams
    • Structured feedback protocol: observations, questions, suggestions
    • Focus on generative critique that improves work rather than just evaluation
    • Rotation of critique leadership to develop facilitation skills
  3. “Failure Archaeology” (Monthly, 60 minutes):
    • Dedicated sessions for displaying and discussing failed experiments
    • Emphasis on learning extraction and knowledge preservation
    • Create “failure artifacts” that prevent repeating mistakes
    • Celebrate intelligent failures that advanced team understanding

Technology-Enhanced Engagement:

  1. Augmented Reality Integration:
    • AR markers on physical displays that reveal digital layers when viewed through mobile apps
    • Historical progression viewing: see how work evolved over time through AR overlay
    • Hidden documentation: process notes, decision rationales, and context accessible via AR
    • Remote collaboration: distributed team members can leave AR comments on physical displays
  2. Social Network Integration:
    • Internal social feeds highlighting display interactions and collaborations
    • Expertise discovery: connect people based on displayed work and demonstrated interests
    • Serendipity algorithms: suggest connections between seemingly unrelated displayed work
    • Knowledge graph building: automatic tagging and linking of displayed artifacts
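
A small amount of code is enough to prototype the serendipity idea before investing in a full knowledge graph. The sketch below suggests cross-team pairings by tag overlap (Jaccard similarity); the artifacts, teams, and tags are hypothetical:

```python
# Minimal sketch of a "serendipity" suggestion: recommend pairs of displayed
# artifacts whose tag sets overlap but that come from different teams.
from itertools import combinations

artifacts = {
    "checkout-prototype":   {"team": "Payments", "tags": {"latency", "api-design", "user-testing"}},
    "onboarding-flow-map":  {"team": "Growth",   "tags": {"user-testing", "funnel", "copywriting"}},
    "fraud-model-notebook": {"team": "Risk",     "tags": {"latency", "ml", "api-design"}},
}


def jaccard(a: set[str], b: set[str]) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0


suggestions = []
for (name_a, a), (name_b, b) in combinations(artifacts.items(), 2):
    if a["team"] != b["team"]:  # only surface cross-team connections
        score = jaccard(a["tags"], b["tags"])
        if score > 0:
            suggestions.append((score, name_a, name_b))

for score, x, y in sorted(suggestions, reverse=True):
    print(f"{x} <-> {y}  (tag overlap {score:.2f})")
```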

Content Guidelines:

Authentic Work-in-Progress:

Balanced Information Architecture:

Intellectual Property Considerations:

Forces

Consequences

Positive

Negative

Examples

Architecture and Design Studios:

MIT Architecture Department:

IDEO Design Studios:

Software Development Organizations:

Spotify Engineering:

GitHub Engineering:

Research and Development Labs:

Bell Labs Historical Model:

Google Research:

Manufacturing and Production:

Toyota Production System:

3M Innovation Centers:

Hybrid/Remote Adaptations:

Automattic (WordPress.com):

GitLab Distributed Development:

Anti-Examples and Lessons Learned:

Failed Implementations:

Success Factors:

Implementation

Phase 1: Infrastructure and Guidelines (2-4 weeks)

  1. Space Assessment and Design:
    • Identify high-traffic areas suitable for display installation
    • Design display layouts that optimize sight lines and interaction zones
    • Install physical infrastructure (magnetic walls, pin-up boards, lighting)
    • Set up digital display systems with interactive capabilities
  2. Content and Curation Guidelines:
    • Develop clear criteria for what work should be displayed
    • Create templates for consistent display formatting
    • Establish IP and confidentiality guidelines for safe sharing
    • Define curator roles and rotation schedules

Phase 2: Pilot Implementation (4-8 weeks)

  1. Early Adopter Engagement:
    • Recruit 2-3 volunteer teams for initial display pilots
    • Provide training on display design and curation techniques
    • Document early successes and challenges for broader rollout
    • Create feedback loops for continuous improvement
  2. Engagement Pattern Development:
    • Establish regular rotation schedules and review processes
    • Implement annotation systems and feedback mechanisms
    • Begin structured engagement events (gallery walks, critique sessions)
    • Track engagement metrics and collaboration outcomes

Phase 3: Scaling and Optimization (8-16 weeks)

  1. Organization-Wide Rollout:
    • Expand display infrastructure to all team areas
    • Train additional curators and display stewards
    • Integrate displays with existing team rituals and processes
    • Develop technology solutions for digital display integration
  2. Advanced Engagement Techniques:
    • Implement AR and social network integration features
    • Establish recognition systems and success celebrations
    • Create advanced facilitation techniques for cross-team collaboration
    • Develop analytics and measurement systems for continuous improvement
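
Measurement does not need heavy tooling to start. A minimal sketch of a weekly engagement roll-up, assuming interactions are logged as dated events (the event kinds shown are hypothetical):

```python
# Aggregate engagement events by ISO week so curators can see whether a
# display area is gaining or losing attention over time.
from collections import Counter
from datetime import date

events = [
    {"date": date(2024, 5, 6),  "kind": "dot_vote"},
    {"date": date(2024, 5, 8),  "kind": "gallery_walk_attendance"},
    {"date": date(2024, 5, 14), "kind": "dot_vote"},
    {"date": date(2024, 5, 15), "kind": "collaboration_started"},
]

weekly = Counter(e["date"].isocalendar()[:2] for e in events)  # (year, ISO week) -> count
for (year, week), count in sorted(weekly.items()):
    print(f"{year}-W{week:02d}: {count} engagement events")
```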

Sources