What Content Creators Can Teach Us About Research Management
Written on February 5, 2026
Tim, founder of MediaStorm (影视飓风), shared how his team produced 150 videos in 2024 while maintaining quality. As someone who leads junior researchers, I found unexpected parallels between scaling content production and managing research projects.
The Four Pillars of MediaStorm’s Workflow
Tim structures his operation around four pillars:
- Organization Architecture - How teams are structured
- Editorial/Topic Selection - How ideas are chosen and developed
- Production - How work gets executed
- Post-Production - How outputs are refined and delivered
Let me break down each pillar and show how it translates to research.
Pillar 1: The Middle-Platform Model
MediaStorm uses a “middle-platform” (中台) approach: technical specialists (cinematographers, editors, producers) are pooled in a central planning department rather than assigned to fixed teams. Projects draw from this shared pool based on their needs.
For research teams:
Instead of assigning each junior researcher to a single project, consider creating shared capability pools:
- Methods pool: Students skilled in specific techniques (statistical analysis, qualitative coding, programming)
- Domain pool: Students with expertise in particular research areas
- Writing pool: Students who excel at literature review, paper drafting, or visualization
When a new project starts, you assemble the right combination from these pools. This prevents the common problem of one student being overloaded while another waits for their project to progress.
Practical implementation:
- Maintain a skills matrix showing each student’s capabilities
- For each new project, explicitly identify which skills are needed at each phase
- Cross-train students so the pools aren’t too thin
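To make this concrete, here is a minimal sketch of what a skills matrix and pool lookup could look like in code. The student names, skill labels, and data structure are my own illustrative assumptions, not anything from Tim's talk; in practice a spreadsheet works just as well.

```python
# Illustrative sketch only: names, skills, and the structure are assumptions.
skills_matrix = {
    "Alice": {"statistical analysis", "writing"},
    "Bob":   {"qualitative coding", "visualization"},
    "Chen":  {"programming", "statistical analysis"},
}

def assemble_team(required_skills):
    """For each required skill, list the students who can cover it."""
    return {
        skill: [name for name, skills in skills_matrix.items() if skill in skills]
        for skill in required_skills
    }

# A new project needs analysis and visualization support.
print(assemble_team(["statistical analysis", "visualization"]))
# {'statistical analysis': ['Alice', 'Chen'], 'visualization': ['Bob']}
```

Any skill that maps to a single name is a pool that is too thin, which is exactly where cross-training should go first.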
Pillar 2: The HKRR Framework for Topic Selection
MediaStorm evaluates content ideas using the HKRR framework:
- Happiness - Is it enjoyable?
- Knowledge - Does it teach something valuable?
- Resonance - Does it connect emotionally?
- Rhythm - Does it have good pacing and flow?
Good content delivers on at least one of these dimensions well.
Adapted for research projects:
Before committing resources to a research direction, evaluate it against a similar framework:
- Novelty - Does this advance the field meaningfully?
- Feasibility - Can we actually execute this with available resources and skills?
- Impact - Will this matter to practitioners or other researchers?
- Fit - Does this align with the team’s capabilities and interests?
Practical implementation:
- Create a shared “idea library” using Feishu, Airtable, or similar tools
- In weekly meetings, have students pitch ideas with explicit ratings on each dimension
- Make selection criteria transparent so students learn to self-evaluate before proposing
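If you want the ratings to stay comparable from week to week, something as simple as the sketch below is enough. The 1-5 scale, the equal weighting of the four dimensions, and the example ideas are assumptions on my part; adjust the weights to your group's priorities.

```python
from dataclasses import dataclass

@dataclass
class Idea:
    title: str
    novelty: int      # 1-5: does this advance the field meaningfully?
    feasibility: int  # 1-5: can we execute it with current resources and skills?
    impact: int       # 1-5: will practitioners or other researchers care?
    fit: int          # 1-5: does it match the team's capabilities and interests?

    def score(self) -> int:
        # Equal weights are an assumption; weight the dimensions as you see fit.
        return self.novelty + self.feasibility + self.impact + self.fit

ideas = [
    Idea("Replication of X with a larger sample", novelty=2, feasibility=5, impact=3, fit=4),
    Idea("New method for Y", novelty=4, feasibility=2, impact=4, fit=3),
]

# Rank pitches before the weekly meeting so discussion starts from the strongest ones.
for idea in sorted(ideas, key=Idea.score, reverse=True):
    print(f"{idea.score():>2}  {idea.title}")
```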
Pillar 3: The Shell Theory for Packaging
One of Tim’s key insights is the “shell theory” (壳理论): important content often needs an engaging “shell” to attract attention. A video about camera sensors becomes more watchable when framed as a “camera awards show.”
For research:
Academic writing often fails not because the ideas are bad, but because they’re packaged poorly. The same research can be framed as:
- A technical contribution (new method)
- A practical solution (solves real problem)
- A surprising finding (challenges assumptions)
- A synthesis (connects disparate ideas)
Practical implementation:
- When students draft papers, have them write 3 different abstracts with different framings
- Discuss which venue each framing suits best
- Practice the “so what?” test: can you explain why a busy person should care?
Pillar 4: Workflow Automation and Visibility
MediaStorm evolved through three management phases:
- Excel era - Manual tracking, scattered deliverables
- Multi-table era - Centralized tracking but limited process visibility
- Project management era - Automated notifications, embedded checkpoints, real-time dashboards
The key insight: as teams scale, informal coordination breaks down. You need systems that make progress visible and enforce quality gates.
For research teams:
Research projects often fail not from lack of effort, but from lack of visibility. Advisors don’t know students are stuck. Students don’t know their work is blocking others.
Practical implementation:
- Use a shared project board (Trello, Notion, or even a shared doc) with clear stages: Literature Review → Method Design → Data Collection → Analysis → Writing → Revision
- Require brief weekly status updates (3 sentences max)
- Define explicit “done” criteria for each stage
- Set up automated reminders for stalled tasks
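The automated-reminder piece is the part most worth scripting, since it is what informal coordination loses first. Here is a minimal sketch under assumed names and an assumed 14-day idle threshold; most project boards can export data that looks roughly like this.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Stage names mirror the board above; the Task fields, the 14-day threshold,
# and the example data are illustrative assumptions.
STAGES = ["Literature Review", "Method Design", "Data Collection",
          "Analysis", "Writing", "Revision"]

@dataclass
class Task:
    project: str
    owner: str
    stage: str
    last_update: date

def stalled(tasks, max_idle_days=14, today=None):
    """Return tasks with no update for more than max_idle_days."""
    today = today or date.today()
    return [t for t in tasks if today - t.last_update > timedelta(days=max_idle_days)]

tasks = [
    Task("Survey paper", "Alice", "Writing", date(2026, 1, 5)),
    Task("Field study", "Bob", "Data Collection", date(2026, 2, 1)),
]

for t in stalled(tasks, today=date(2026, 2, 5)):
    print(f"Reminder: '{t.project}' ({t.stage}, owner {t.owner}) idle since {t.last_update}")
```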
The Three-Step Management Loop
MediaStorm’s management philosophy boils down to three steps:
- Direction - Set the right goals at the start
- Correction - Catch problems early and adjust
- Reflection - Learn from each project for the next
This maps perfectly to research supervision:
Direction:
- At project kickoff, explicitly define success criteria
- Ensure the student can articulate the research question in one sentence
- Agree on a rough timeline with milestones
Correction:
- Weekly check-ins focused on blockers, not status reports
- “What’s preventing you from making progress?” is more useful than “What did you do?”
- Intervene early when scope creeps or methodology drifts
Reflection:
- After each paper submission or project completion, hold a brief retrospective
- What worked? What would we do differently?
- Document lessons so they compound across projects
Scaling Quality: The Bottleneck Problem
MediaStorm discovered that expanding output requires identifying bottlenecks. They compressed project cycles from 35-40 days to 5 days, not by working faster, but by eliminating the waiting time between steps.
For research:
Projects rarely stall because people aren’t working hard enough. They stall because:
- Students wait for advisor feedback
- Advisors wait for student drafts
- Everyone waits for data access or IRB approval
Practical implementation:
- Map out the critical path for each project type
- Identify where waiting typically occurs
- Create backup tasks so students are never blocked completely
- Batch feedback: set specific times for review rather than ad-hoc responses
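A rough way to see where the calendar time actually goes is to log working days and waiting days per stage and sum them. The numbers below are made up for illustration; the point is that the waiting column, not the working column, is usually where the compression happens.

```python
# Made-up example numbers: (stage, days actively worked, days spent waiting).
phases = [
    ("Literature Review", 5, 2),
    ("Method Design",     4, 6),   # waiting on advisor feedback
    ("Data Collection",   7, 10),  # waiting on data access / IRB approval
    ("Analysis",          6, 3),
    ("Writing",           8, 5),   # waiting on co-author review
]

working = sum(w for _, w, _ in phases)
waiting = sum(q for _, _, q in phases)
print(f"Cycle time: {working + waiting} days ({working} working, {waiting} waiting)")

# The biggest wins come from the longest waits, not from working faster.
for name, _, q in sorted(phases, key=lambda p: p[2], reverse=True)[:2]:
    print(f"Longest wait: {name} ({q} days)")
```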
What This Means for Junior Researcher Development
The deeper lesson from MediaStorm isn’t about efficiency—it’s about creating systems that let people focus on what they do best.
Tim’s team can produce 150 quality videos a year because:
- Each person knows their role
- Handoffs are clear and automatic
- Quality standards are explicit
- Problems surface quickly
For research supervision, this means:
- Clear expectations reduce anxiety and wasted effort
- Visible progress keeps everyone aligned
- Structured feedback accelerates learning
- Systematic reflection compounds improvements
The goal isn’t to turn research into a factory. It’s to remove the friction that prevents good research from happening.
What systems have you found useful for managing research teams? I’d love to hear at persdre@gmail.com.