Ecosystem Report: March Was Wild

Apr 1, 2026

Monthly ecosystem report covering March 18–31, 2026. Data as of April 1.


The Month in One Sentence

We started March with 54 projects on Claw4Science. We're ending it with 107. That's not a typo.


What Happened

March 2026 was the month the OpenClaw science ecosystem went from "interesting niche" to "you can't ignore this anymore." Here's the timeline of how it unfolded:

Week 1 (Mar 18–22): Claw4Science launched. The first batch — OpenClaw, NanoBot, PicoClaw, and a handful of science-focused tools like EvoScientist and MedgeClaw — gave us 54 projects. Respectable, but quiet.

Week 2 (Mar 23–25): The discovery phase. Searching GitHub page by page turned up project after project: ScienceClaw (four of them, all different), PaperClaw (six!), DrugClaw (two), ResearchClaw (four). The naming collision problem became impossible to ignore.

Week 3 (Mar 26–29): The floodgates. Cross-referencing with the Pantheon database added Karpathy's autoresearch (60K stars!), Alibaba's DeepResearch, and AgentLaboratory. We also found NVIDIA's NemoClaw, Hermes Agent (now 21K stars), and Edict — a multi-agent orchestration system inspired by China's ancient Three Departments and Six Ministries government structure. Yes, you read that correctly.

Week 4 (Mar 30–31): Consolidation. OpenBioMed Skills (PharMolix + Tsinghua AIR) joined the skill hubs. We hit 107 projects and 24 skill hubs.


The Numbers

Metric               Start of March   End of March   Change
Projects             54               107            +98%
Skill hubs           19               24             +26%
Blog posts           10               25             +150%
Skills surveyed      0                2,203          New
Google impressions   0                917/week       New
Google clicks        0                22/week        New

The Fastest Risers

Some projects didn't just arrive — they arrived running.

Hermes Agent went from "not on our radar" to 21K stars in March. Built by Nous Research, it's the agent that solves a problem nobody else was addressing: making local open-source models actually work with tool calls. The same model that fails on OpenClaw works on Hermes, because Hermes ships per-model parser adapters. If you care about data privacy and zero API costs, this is the one to watch.

Edict — 12.8K stars for a multi-agent system inspired by the imperial Chinese bureaucracy. Nine AI "ministers" coordinate via a Kanban-style dashboard, each responsible for a domain. Absurd? Maybe. Effective? The star count suggests yes.

autoresearch — Andrej Karpathy's entry into the space, now at 63K stars. Enough said.


The Dormant Projects

Not everyone made it through March in good shape.

  • AutoBA (JoshuaChou2018) — last updated November 2024. The multi-omic analysis agent appears abandoned.
  • BioDiscoveryAgent (Stanford SNAP) — last updated July 2025. Experimental design tool for genetic perturbations, but no recent activity.
  • CellAgent — removed from the directory. 12 stars, last updated October 2024.

These aren't failures — they're the natural lifecycle of open-source research projects. Papers get published, grad students graduate, priorities shift. But if you're choosing a tool for production use, check the pushed_at timestamp before committing.


The Naming Wars

March revealed something unexpected: the OpenClaw science ecosystem has a serious naming collision problem.

Name           Independent projects   Largest
ScienceClaw    4                      beita6969 (392★)
PaperClaw      6                      guhaohao0991 (185★)
ResearchClaw   4                      wentorai (400★)
MedClaw        3                      zteyesreal (45★)
DrugClaw       2                      QSong (135★)
SciClaw        3                      drpedapati (19★)

We built disambiguation pages for all six collision groups. But the bigger question is: why does this keep happening? Our theory: the "X-Claw" naming pattern is so natural that independent teams converge on the same name without knowing about each other. It's a sign of healthy ecosystem growth — and a warning that discoverability matters.


The Gap Nobody's Filling

A WeChat article from "生物信息与育种" (Bioinformatics & Breeding) made an observation that stuck with us: every science AI agent is focused on medical and human genetics. Agricultural genomics — plant GWAS, pan-genomes, molecular breeding — remains untouched.

The technical foundations are identical. Variant calling doesn't care if the DNA comes from a patient or a soybean. The first "AgriClaw" would have zero competition. If you're in ag-biotech and reading this: the field is wide open.


The Skill Landscape

We completed the first comprehensive survey of the science skill ecosystem: 2,203 skills across 12 hubs, classified into 34 categories.

The headline finding: Genomics alone accounts for 25% of all skills (550 out of 2,203). Literature search is next at 5.6%. Ecology and environmental science? 1.2%. The distribution tells you exactly where the bioinformatics community has been spending its AI energy — and where it hasn't.

Full interactive survey →


What to Watch in April

  1. Hermes Agent — at 21K stars and climbing, it could become the default non-OpenClaw agent for privacy-conscious researchers
  2. OpenBioMed Skills — 45 skills from PharMolix + Tsinghua AIR; watch for adoption in the drug discovery community
  3. The agriculture gap — someone's going to fill it. The question is who
  4. Our paper — the Claw4Science survey paper is in progress. 107 projects, 2,203 skills, and a lineage tree of the entire ecosystem


Next report: May 1, 2026. Data refreshed daily via GitHub API.
