13 projects. 20,857 stars. Zero recent commits.
We track 132 AI agent projects for scientific research at Claw4Science. Every day, our curation script checks each one for signs of life — recent commits, open issues being addressed, new releases.
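The dormancy check itself is simple. Here is a minimal sketch in Python, assuming the script reads each repo's `pushed_at` timestamp from the GitHub REST API and treats more than 90 days of silence as dormant — the threshold and function names are illustrative, not our exact production code:

```python
from datetime import datetime, timezone

# Assumption: 90 days without a push counts as "gone silent".
DORMANCY_THRESHOLD_DAYS = 90

def days_silent(pushed_at_iso: str, now=None) -> int:
    """Days since the repo's last push, given GitHub's ISO-8601 pushed_at field
    (e.g. "2024-11-01T00:00:00Z" from GET /repos/{owner}/{repo})."""
    last_push = datetime.fromisoformat(pushed_at_iso.replace("Z", "+00:00"))
    now = now or datetime.now(timezone.utc)
    return (now - last_push).days

def is_dormant(pushed_at_iso: str, now=None) -> bool:
    """True if the repo has been silent longer than the threshold."""
    return days_silent(pushed_at_iso, now) > DORMANCY_THRESHOLD_DAYS
```

The real script also weighs open-issue response times and release cadence, but last-push age is the signal that flags most dormant projects first.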
13 of them have gone silent. Not deleted. Not archived. Just... stopped.
The longest hasn't been updated in 521 days. The most popular has 5,487 stars and 58 unanswered issues. Combined, these 13 projects have nearly 2,900 forks and 254 open issues that nobody is responding to.
What happened?
## The Full List
| Project | Stars | Days Silent | Last Push | What It Did |
|---|---|---|---|---|
| AgentLaboratory | 5,487 | 232 | Aug 2025 | Autonomous research agent |
| AI-Scientist-v2 | 5,373 | 112 | Dec 2025 | Sakana's AI scientist |
| AI-Researcher | 5,094 | 176 | Oct 2025 | HKUDS research agent |
| Auto-Deep-Research | 1,482 | 176 | Oct 2025 | HKUDS deep research |
| ChemCrow | 894 | 476 | Dec 2024 | Chemistry AI pioneer |
| Virtual Lab | 660 | 100 | Dec 2025 | Nanobody design (Nature) |
| FreePhDLabor | 496 | 170 | Oct 2025 | Autonomous research |
| MDCrow | 234 | 91 | Jan 2026 | Molecular dynamics |
| AutoBA | 226 | 521 | Nov 2024 | Multi-omics automation |
| CRISPR-GPT | 156 | 237 | Aug 2025 | CRISPR experiment design |
| BioDiscoveryAgent | 102 | 277 | Jul 2025 | Stanford bio discovery |
| BioMaster | 91 | 157 | Nov 2025 | Nucleome analysis |
| BioMedAgent | 63 | 406 | Feb 2025 | 67 bio tools (Nature BME) |
That's 20,857 stars of scientific capability sitting idle.
## Pattern 1: The Paper Is the Product
Five of these projects share the same story: an academic team built an AI agent, published a paper about it, and moved on.
ChemCrow (894 stars) was a pioneer — one of the first chemistry AI agents, published in 2023, cited over a thousand times. But the code stopped updating in December 2024. There are 12 open issues. Nobody's answering.
BioMedAgent (63 stars) shipped with a Nature Biomedical Engineering paper. The code was always the paper's appendix, not a standalone product.
CRISPR-GPT, BioDiscoveryAgent, Virtual Lab — same pattern. Paper published, citations accumulated, code frozen.
This isn't a bug. It's how academia works. The incentive structure is clear: papers get you tenure, maintained code gets you nothing. A professor's GitHub contribution graph doesn't appear on their CV. Their h-index does.
The result is a landscape littered with brilliant prototypes that nobody maintains. Each one proved something was possible. None of them became something you can rely on.
## Pattern 2: The Team Moved to Something Bigger
HKUDS stopped updating both AI-Researcher (5,094 stars) and Auto-Deep-Research (1,482 stars) on the same day — October 16, 2025. Combined: 6,576 stars, 845 forks, 94 open issues.
But HKUDS didn't disappear. They were building nanobot (38,000+ stars) and ClawTeam (4,500+ stars). The old projects weren't abandoned — they were superseded.
Sakana's AI-Scientist-v2 (5,373 stars) went quiet in December 2025. Sakana is a well-funded company. They're almost certainly working on v3 internally, or pivoting the technology into a commercial product.
Some dormancy is graduation, not death. The team's attention is a finite resource, and they're spending it on the next generation.
## Pattern 3: Stars Without Stewards
AgentLaboratory is the most striking case. 5,487 stars. 772 forks. 58 open issues. The concept — an autonomous agent that runs entire research experiments — clearly resonated. But resonance doesn't produce maintainers.
Of those 772 forks, how many contributed back? The answer, based on the pull request history: almost none.
FreePhDLabor (496 stars) tells the story in its name. It was a conceptual provocation — "what if AI could do a PhD's work?" The idea was validated. The code was a proof of concept. There was never a plan for long-term maintenance.
This is the open-source paradox in research: stars measure interest, not investment. A thousand people can star a repo in the time it takes one person to fix a bug.
## Pattern 4: Outpaced by the Next Generation
The AI for Science ecosystem moves fast. Six months is a generation.
AutoBA (226 stars, dormant 521 days) did automated multi-omics analysis. Today, OmicsClaw does the same thing but better — with persistent memory, more analysis skills, and active development.
MDCrow (234 stars, dormant 91 days) automated molecular dynamics simulations. BloClaw now covers molecular dynamics plus cheminformatics, protein folding, and autonomous RAG — all in one workspace.
BioMaster (91 stars) was absorbed by the broader multi-omics agent category. When the ecosystem offers 10 tools that each do what you did plus more, there's no incentive to keep updating.
Natural selection applies to software too.
## The Counterintuitive Conclusion
13 dormant projects ≠ 13 failures.
Most of them accomplished what they set out to do:
- 5 published papers that advanced the field
- 2 teams graduated to bigger, better projects
- 3 proved concepts that others built on
- 3 were naturally replaced by the next generation
The real question isn't "why did they stop?" It's "are there enough new projects replacing them?"
The answer: yes. In the past month alone, we've added 25+ actively maintained projects to our directory. The ecosystem isn't shrinking — it's churning. Old projects make way for new ones that learned from their predecessors.
## What This Means for You
If you're using a dormant project:
- Check the fork list. Someone might be maintaining a fork. GitHub's network graph shows active forks.
- Read the issues. Often the last few issues contain workarounds or migration guides.
- Look for the successor. In most cases, a newer project covers the same functionality.
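The fork check is easy to script. A sketch, assuming fork metadata fetched from GitHub's `GET /repos/{owner}/{repo}/forks` endpoint — the `full_name` and `pushed_at` field names match that API, everything else here is illustrative:

```python
from datetime import datetime

def active_forks(parent_pushed_at: str, forks: list) -> list:
    """Return forks pushed *after* the parent's last push, newest first.
    Each fork is a dict with 'full_name' and 'pushed_at' (ISO-8601 strings),
    the shape GitHub's forks endpoint returns."""
    def ts(iso: str) -> datetime:
        return datetime.fromisoformat(iso.replace("Z", "+00:00"))

    parent = ts(parent_pushed_at)
    live = [f for f in forks if ts(f["pushed_at"]) > parent]
    # ISO-8601 timestamps sort lexicographically, so string sort is safe here.
    return sorted(live, key=lambda f: f["pushed_at"], reverse=True)
```

A fork pushed months after the parent went quiet is a strong hint someone is still maintaining it — worth checking before you write the project off.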
If you're choosing a tool:
- Filter by recent activity. Our project directory shows last-push dates and commit sparklines for every project.
- Prefer projects with multiple contributors over single-author repos.
- Check if there's a paper — but also check if the code has commits after the paper date.
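Those filters combine into one pass over a candidate list. A sketch, assuming each candidate is a dict with a `pushed_at` timestamp and a contributor count — the field names and 90-day cutoff are chosen here for illustration:

```python
from datetime import datetime, timedelta, timezone

def shortlist(candidates: list, max_days: int = 90,
              min_contributors: int = 2, now=None) -> list:
    """Keep projects pushed within max_days that have more than one contributor.
    Each candidate is a dict with 'pushed_at' (ISO-8601) and 'contributors'."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_days)

    def pushed(p) -> datetime:
        return datetime.fromisoformat(p["pushed_at"].replace("Z", "+00:00"))

    return [p for p in candidates
            if pushed(p) >= cutoff and p["contributors"] >= min_contributors]
```

It won't catch everything — a healthy single-author repo gets filtered out — but as a first cut it removes most of the projects on the table above.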
If you're a project author:
- If you've moved on, say so. A one-line note in the README ("This project is no longer maintained. Consider using X instead.") saves users hours of confusion.
- Archive the repo. GitHub's archive feature makes the status unambiguous.
- 58 open issues with no response is worse than a clear "this project has ended."
## The Lifecycle of a Science Agent
What we're seeing is a natural lifecycle emerging:
Paper published → Code released → Stars accumulate → Issues pile up
→ Team moves on → Fork activity drops → Dormancy → Replacement

This cycle runs in about 6-18 months for most academic science agents. Commercial projects (like Sakana) may have longer cycles but face the same gravitational pull toward the next thing.
The projects that break this cycle — OpenClaw, AutoResearchClaw, EvoScientist — tend to have either strong community governance, commercial backing, or both.
For everyone else: build something great, publish the paper, and don't feel guilty when you move on. Someone will pick up where you left off.
All 13 dormant projects are still listed in our directory with activity status labels. Browse the 119 actively maintained projects at claw4science.org.
