Glasp’s note: This is Hatching Growth, a series of articles about how Glasp organically reached millions of users. In this series, we’ll highlight some experiments that worked, some that didn’t, and the lessons we learned along the way. While we prefer not to use the term "user," please note that we’ll use it here for convenience 🙇‍♂️
If you want to reread or highlight this newsletter, save it to Glasp.
Recap: #1–#6 at a glance
Context: The “Next 1 Million Project”
After YouTube Summary hit escape velocity, we began running a series of experiments under the Next 1 Million Project banner. The idea was simple: test quick concepts that might attract the next wave of users, while probing what reinforced Glasp’s mission. Hatch was one of those probes.
Where the Kindle Personality Test explored identity via books, Hatch explored new knowledge via remix.
What is Hatch?
At its simplest, Hatch takes highlights from two articles, reasons across them, and generates a new article—typically 1,000–1,500 words—with explicit source attribution.
Manual or random: Choose two articles yourself or let the system pick randomly.
Attribution: Every generated piece cites its source articles.
Auto-Hatch: Enable daily auto-generation (one new article per day).
Scale: With ~2,790 articles in one account, there are ~3.89M possible pairings (2,790 × 2,789 ÷ 2).
We also experimented with cross-user combinations—e.g. mixing my highlights with my cofounder’s—to see if the diversity created fresher insights.
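To make the flow above concrete, here is a minimal sketch in TypeScript of how a Hatch-style generation step could work. The types, helper names, and prompt wording are illustrative assumptions, not Glasp’s actual implementation; the only fixed idea is the one described above: pick two articles, reason across their highlights, and cite both sources.

```typescript
// Illustrative sketch only: names and prompt wording are assumptions, not Glasp's code.
import OpenAI from "openai";

interface Article {
  id: string;
  title: string;
  url: string;
  highlights: string[]; // the saved highlights for this article
}

const openai = new OpenAI(); // expects OPENAI_API_KEY in the environment

// Pick two distinct articles at random (the "random" mode; manual mode just takes user input).
function pickPair(articles: Article[]): [Article, Article] {
  const i = Math.floor(Math.random() * articles.length);
  let j = Math.floor(Math.random() * (articles.length - 1));
  if (j >= i) j += 1; // skip index i so the two picks are always different
  return [articles[i], articles[j]];
}

// Reason across both sets of highlights and draft a new article that cites its sources.
async function hatch(a: Article, b: Article): Promise<string> {
  const prompt = [
    "Write a 1,000-1,500 word article that connects the ideas in these two sources.",
    `Source 1: "${a.title}" (${a.url})\n- ${a.highlights.join("\n- ")}`,
    `Source 2: "${b.title}" (${b.url})\n- ${b.highlights.join("\n- ")}`,
    "End with a Sources section that cites both articles explicitly.",
  ].join("\n\n");

  const res = await openai.chat.completions.create({
    model: "gpt-4",
    messages: [{ role: "user", content: prompt }],
  });
  return res.choices[0].message.content ?? "";
}
```

A cross-user Hatch would simply mean the two Article values come from different accounts’ libraries; nothing else in the flow changes.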
Why we built it
Several overlapping motivations:
Marketing hack: After the YouTube Summary milestone, we were looking for the next experiment that could spread. “Reasoning engine” was the term du jour for GPT-4; combining highlights felt timely.
Content + SEO strategy: As covered in Hatching Growth #2, Glasp’s growth relies heavily on SEO. More high-quality text = more surface area. Hatch automates that content flywheel.
Asset leverage: Highlights are durable assets. Remixing them compounds their value.
Core action incentive: To unlock more Hatches, users need to highlight more. That strengthens the product’s core loop.
Launch & distribution
We launched Hatch on X, LinkedIn, YouTube, and Product Hunt. Distribution tactics included:
Daily posts with screenshots of generated articles.
Tagging the original authors of the source articles on X/LinkedIn. Some reposted (e.g., Scott Belsky of Behance), generating lightweight buzz.
Product Hunt launch: not a breakout, but it gave exposure to a curious audience.
It didn’t hit the virality of YouTube Summary or ChatGPT Extensions. Still, it seeded visibility and feedback loops.
Feedback from users
“Why only two sources?” Requests for mixing 3+ articles.
“Why only highlights?” Requests to use posts (Glasp Posts, Medium, Substack, etc.).
“This sparks ideas.” Users liked how cross-concept reasoning surfaced unexpected connections for projects and writing.
What we learned
As a growth hack: Too weak. Curiosity and novelty alone don’t create million-user pull.
As a system asset: Strong. Once running, Hatch keeps producing content with minimal maintenance. Some articles were even cited in Wikipedia and larger outlets.
TAM matters: YouTube Summary solved a universal pain (time). Hatch solves a niche curiosity (idea remix). That difference in addressable market explains the gap in adoption.
Still bullish on reasoning: Human × AI co-reasoning feels like a promising new interaction pattern.
Tech stack
Frontend: Next.js
Infra: GCP (Firestore for highlights, Cloud Run for services)
LLM: GPT for reasoning
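For a rough picture of how these pieces could fit together, here is a hedged sketch of a Cloud Run service that a daily scheduler could call to drive Auto-Hatch. The collection names, field names, and the /auto-hatch route are assumptions for illustration; only the Firestore-plus-Cloud-Run shape comes from the stack above.

```typescript
// Hedged sketch of the GCP wiring; collection and field names are assumptions.
import express from "express";
import { Firestore } from "@google-cloud/firestore";

const db = new Firestore();
const app = express();

// Cloud Scheduler could POST here once a day to trigger Auto-Hatch.
app.post("/auto-hatch", async (_req, res) => {
  // Find users who opted into daily generation (field name is illustrative).
  const users = await db.collection("users").where("autoHatch", "==", true).get();

  for (const user of users.docs) {
    // Load that user's highlights from a subcollection and hand them to the
    // generation step sketched earlier (pick a pair of articles, call GPT).
    const highlights = await db
      .collection("users")
      .doc(user.id)
      .collection("highlights")
      .get();
    console.log(`user ${user.id}: ${highlights.size} highlights to remix`);
  }

  res.status(200).send(`processed ${users.size} users`);
});

app.listen(Number(process.env.PORT) || 8080);
```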
Post-mortem
As marketing: Weak. Didn’t spark mass adoption.
As strategy: Sound. It fits the long-term SEO and asset-leverage playbook.
As exploration: Valuable. Reinforced the idea that reasoning-based remixing can generate genuinely novel content.
What’s next
We don’t expect Hatch to be a breakout hit, but it will quietly compound. Auto-Hatch alone generates hundreds of new articles daily, which search engines index over time. More importantly, the project keeps us exploring the frontier of AI as a reasoning partner.
Next episode, we’ll share more about what happened after these Next 1 Million experiments—what stuck, what we sunset, and how we folded learnings back into the core.
If you have experiments you’d like us to analyze—or want us to run Hatch on specific articles—let us know in the comments 🙏
Partner with Glasp
We currently offer newsletter sponsorships. If you have a product, event, or service you’d like to share with our community of learning enthusiasts, sponsor an edition of our newsletter to reach engaged readers.
We value your feedback
We’d love to hear your thoughts and invite you to our short survey.