Sonder Labs
Status: In Progress
AI-powered content platform for structured knowledge delivery. Building the infrastructure for scalable AI content creation.
Overview
Sonder Labs is an experimental platform exploring how AI can be used to create structured, high-quality educational content at scale. The goal isn't to replace human creators, but to handle the mechanical work of content production so experts can focus on the strategic and creative aspects.
The Problem
Creating consistent, well-structured content is time-intensive:
- Research and outlining — Finding reliable sources, organizing information
- Writing — Translating knowledge into clear, engaging prose
- Formatting — Maintaining consistent style and structure
- Distribution — Publishing to multiple platforms with platform-specific formatting
All of this compounds when you're trying to maintain a regular publishing schedule across multiple formats (articles, videos, social posts).
Approach
Rather than building a generic "AI content generator," Sonder Labs focuses on structured workflows for specific content types:
- Knowledge articles — Research-backed, structured guides following a consistent template
- Concept explanations — Breaking down complex topics into digestible pieces
- Implementation guides — Step-by-step instructions for practical tasks
Each content type has its own workflow, quality checks, and output format.
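One way to picture this per-type separation is a small workflow registry. This is a minimal illustrative sketch, not the platform's actual code; all type names, field names, and step labels here are assumptions.

```typescript
// Hypothetical workflow registry: each content type carries its own
// ordered steps, quality checks, and output format.
type ContentType = "knowledge-article" | "concept-explanation" | "implementation-guide";

interface Workflow {
  contentType: ContentType;
  steps: string[];         // ordered pipeline stages
  qualityChecks: string[]; // validations run before publishing
  outputFormat: "mdx" | "pdf" | "thread";
}

const workflows: Record<ContentType, Workflow> = {
  "knowledge-article": {
    contentType: "knowledge-article",
    steps: ["research", "outline", "draft", "review"],
    qualityChecks: ["sources-cited", "template-complete"],
    outputFormat: "mdx",
  },
  "concept-explanation": {
    contentType: "concept-explanation",
    steps: ["outline", "draft", "simplify", "review"],
    qualityChecks: ["reading-level", "template-complete"],
    outputFormat: "mdx",
  },
  "implementation-guide": {
    contentType: "implementation-guide",
    steps: ["research", "draft", "verify-steps", "review"],
    qualityChecks: ["steps-numbered", "code-tested"],
    outputFormat: "mdx",
  },
};

function getWorkflow(type: ContentType): Workflow {
  return workflows[type];
}
```

Keeping workflows as data rather than hard-coded branches is what makes adding a new content type a configuration change instead of a rewrite.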
System Architecture
The platform consists of several interconnected systems:
Content Pipeline (n8n)
- Research automation — Gathering and organizing source material
- Structured generation — Using templates and AI to create initial drafts
- Quality validation — Checking for accuracy, clarity, and completeness
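The three pipeline stages above compose into a simple chain. The sketch below stubs out the research and generation steps (in practice these are n8n workflows calling external services) just to show the shape of the data flowing through; every name here is illustrative.

```typescript
// A draft accumulates sources, a body, and any validation issues.
type Draft = { topic: string; sources: string[]; body: string; issues: string[] };

function research(topic: string): Draft {
  // Stub: the real pipeline gathers and organizes source material via n8n.
  return { topic, sources: [`notes:${topic}`], body: "", issues: [] };
}

function generate(draft: Draft): Draft {
  // Stub: the real pipeline fills the template via an AI drafting step.
  return { ...draft, body: `# ${draft.topic}\n\n...` };
}

function validate(draft: Draft): Draft {
  // Mechanical checks only; usefulness still requires human review.
  const issues: string[] = [];
  if (!draft.body.startsWith("# ")) issues.push("missing-title");
  if (draft.sources.length === 0) issues.push("no-sources");
  return { ...draft, issues };
}

const result = validate(generate(research("vector-databases")));
```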
Content Management (Next.js + Supabase)
- MDX storage and rendering
- Version control for articles
- Metadata management (tags, categories, relationships)
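Metadata is what makes relationships between pieces queryable. A rough sketch of the idea, with field names that are assumptions rather than the real Supabase schema:

```typescript
// Hypothetical article metadata record.
interface ArticleMeta {
  slug: string;
  version: number;   // simple version counter per article
  tags: string[];
  category: string;
}

// Two articles are related if they share at least one tag.
function related(article: ArticleMeta, all: ArticleMeta[]): ArticleMeta[] {
  return all.filter(
    (a) => a.slug !== article.slug && a.tags.some((t) => article.tags.includes(t))
  );
}
```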
Publishing System
- Multi-format export (web, PDF, email newsletter)
- Platform-specific formatting (Twitter threads, LinkedIn posts)
- Scheduled publication
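Platform-specific formatting is mostly mechanical transformation. As one hedged example of the kind of step involved, here is a sketch of splitting prose into tweet-sized chunks for a thread (the 280-character limit is Twitter's; the function itself is illustrative):

```typescript
// Split text into thread segments of at most `limit` characters,
// breaking only at word boundaries. A single word longer than the
// limit is emitted as its own (oversized) segment.
function toThread(text: string, limit = 280): string[] {
  const words = text.split(/\s+/).filter((w) => w.length > 0);
  const tweets: string[] = [];
  let current = "";
  for (const word of words) {
    const candidate = current ? `${current} ${word}` : word;
    if (candidate.length > limit && current) {
      tweets.push(current);
      current = word;
    } else {
      current = candidate;
    }
  }
  if (current) tweets.push(current);
  return tweets;
}
```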
Stack
- Next.js 14 — Frontend and content rendering
- OpenAI GPT-4 — Content generation and transformation
- n8n — Workflow automation and orchestration
- Supabase — Database and content storage
- Vercel — Hosting and deployment
Current Status
The core content pipeline is functional and has been used to generate multiple pieces of published content. Current work focuses on:
- Improving quality validation (reducing factual errors)
- Building a feedback loop (how did users engage with the content?)
- Scaling to multiple content types simultaneously
Lessons Learned
1. Templates are critical. Without strong structural templates, AI-generated content becomes generic. The template defines the thinking structure, not just the format.
2. Quality validation can't be automated (yet). While AI can check for basic issues (broken markdown, missing sections), evaluating whether content is actually useful requires human judgment.
3. The workflow is the product. The visible content is the output, but the real value is in the reusable workflow. Building systems that can be adapted to different content types is more valuable than one-off generators.
4. Context matters more than capability. Using GPT-4 vs GPT-3.5 makes less difference than providing the right context, examples, and constraints. Focus on the input structure, not just the model.
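In practice, "focus on the input structure" means assembling the template, examples, and constraints into the prompt deterministically rather than relying on the model. A minimal sketch, assuming a hypothetical `buildPrompt` helper (not the platform's actual code):

```typescript
// The structured context fed to the model alongside the topic.
interface PromptInput {
  template: string;       // structural template for the content type
  examples: string[];     // few-shot examples of the desired output
  constraints: string[];  // explicit rules: tone, length, audience
  topic: string;
}

// Assemble the sections in a fixed order so every draft gets the
// same scaffolding regardless of model or topic.
function buildPrompt(input: PromptInput): string {
  return [
    `Follow this template:\n${input.template}`,
    `Examples:\n${input.examples.join("\n---\n")}`,
    `Constraints:\n${input.constraints.map((c) => `- ${c}`).join("\n")}`,
    `Topic: ${input.topic}`,
  ].join("\n\n");
}
```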
What's Next
- Audio generation using text-to-speech for each article
- Automated cross-linking between related pieces
- Content personalization based on reader expertise level
- Integration with YouTube for video script generation
Need help implementing AI in your business?
I offer consulting services for AI automation and workflow implementation. From strategy to execution.