This directory contains comprehensive research papers documenting the innovations, architecture, and impact of the Zoo Network ecosystem and Gym training platform.
| File | Size | Topics |
|---|---|---|
| zoo-foundation-mission.tex | 43KB (~1,100 lines) | Conservation AI, Educational AI, Frontier AI research, 501(c)(3) mission |
| zoo-network-architecture.tex | 48KB (~1,200 lines) | L2 AI specialization layer, HLLM integration, federated learning, experience ledger |
| zoo-tokenomics.tex | 2,050+ lines | 100% airdrop model, validator requirements, contribute-to-access economics, DAO governance |
| gym-training-platform.tex | 49KB (~1,228 lines) | 100+ model support, 8 training methods, educational democratization |
| hllm-training-free-grpo.tex | 71KB (~50 pages) | 99.8% cost reduction, Hamiltonian invariant, semantic advantages, 3-stage algorithm |
| experience-ledger-dso.tex | 54KB (~1,600 lines) | Decentralized Semantic Optimization, Byzantine-robust curation, IPFS/Arweave storage |
| zip-002-zen-reranker.tex | 49KB (~1,288 lines) | Native 7680-dim embeddings, BitDelta compression (31.87×), Byzantine aggregation |
- Total Papers: 7 comprehensive LaTeX research papers
- Total Lines: ~9,500
- Total Size: 362KB
- Total References: 140+ academic citations
- Cross-References: All papers extensively cite each other
- First 100% airdrop AI token (zero VC/private sales)
- Gini coefficient 0.61 (vs industry 0.80-0.90)
- Democratic validator participation (1,000 ZOO minimum stake)
- 99.8% cost reduction via Training-Free GRPO ($18 vs $10,000+)
- 100× data efficiency (100 examples vs 10,000+)
- Context optimization vs parameter updates
- Hamiltonian conservation law (Ψ·Θ = κ)
- Decentralized Semantic Optimization (DSO)
- Experience Ledger with Byzantine fault tolerance
- Three-layer storage architecture
- 7680-dimensional native embeddings
- 87,000+ users across 142 countries
- 420,000+ models trained
- Democratizing AI training worldwide
PDFs are automatically compiled when .tex files are pushed to main/master:

```bash
git add papers/*.tex
git commit -m "Update research papers"
git push
```

PDFs will be available in:
- Artifacts: Download from GitHub Actions runs (90-day retention)
- Releases: Tagged releases with all PDFs bundled
- Repository: Committed to the `papers/pdfs/` directory
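
If the GitHub CLI is installed, the compiled PDF artifacts can also be pulled locally from the command line. A minimal sketch, assuming the papers are built in a GitHub-hosted `zooai/gym` repository and using whatever artifact names the workflow defines:

```bash
# List the most recent workflow run to find its run ID.
gh run list --repo zooai/gym --limit 1

# Download that run's artifacts (the compiled PDFs) into papers/pdfs/.
# Replace <run-id> with an ID from the command above.
gh run download <run-id> --repo zooai/gym --dir papers/pdfs
```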
Requires LaTeX installation (MacTeX, TeX Live, or MiKTeX):

```bash
cd papers

# Compile single paper
pdflatex paper-name.tex
bibtex paper-name
pdflatex paper-name.tex
pdflatex paper-name.tex

# Compile all papers
for tex in *.tex; do
  base="${tex%.tex}"
  pdflatex "$tex"
  bibtex "$base" 2>/dev/null || true
  pdflatex "$tex"
  pdflatex "$tex"
done
```

Requirements:
- LaTeX distribution (texlive-full recommended)
- BibTeX for bibliography processing
- Standard LaTeX packages (amsmath, graphicx, hyperref, etc.)
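
If latexmk is available (it ships with texlive-full), it can drive the pdflatex/bibtex cycle above automatically; a minimal sketch, assuming latexmk is on the PATH:

```bash
cd papers

# latexmk reruns pdflatex and bibtex as many times as needed to
# resolve citations and cross-references.
latexmk -pdf paper-name.tex    # single paper
latexmk -pdf *.tex             # all papers
latexmk -c                     # remove auxiliary files afterwards
```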
For Understanding Zoo Ecosystem:
1. Zoo Foundation Mission (overview)
2. Zoo Network Architecture (technical infrastructure)
3. Zoo Tokenomics (economic model)

For AI Training Innovations:
4. Gym Training Platform (practical tooling)
5. HLLM Training-Free GRPO (theoretical breakthrough)
6. Experience Ledger & DSO (implementation details)

For Technical Specifications:
7. ZIP-002: Zen-Reranker (embedding technology)
| Approach | AIME24 Accuracy | Cost | Data | Time |
|---|---|---|---|---|
| Traditional Fine-Tuning | 80.0% | $10,000+ | 10K+ examples | Days |
| Training-Free GRPO | 82.7% | $18 | 100 examples | Hours |
All papers extensively cross-reference each other for academic rigor:
- Zoo Foundation Mission → All other Zoo papers
- Zoo Network Architecture → Tokenomics, HLLM, Experience Ledger, Gym
- Zoo Tokenomics → Foundation, Architecture, HLLM, Experience Ledger, Gym
- Gym Platform → HLLM, Experience Ledger, Tokenomics
- HLLM Training-Free GRPO → Gym, Experience Ledger, Zoo Network
- Experience Ledger → HLLM, Zoo Network, Gym, Tokenomics, Zen-Reranker
- ZIP-002 Zen-Reranker → Experience Ledger, Zoo Network
All papers use arXiv-style versioning: vYYYY.MM
Example:
- v2023.05 — May 2023 (Initial Version)
- v2024.06 — June 2024 (Major Revision)
- v2025.09 — September 2025 (Current)
See PAPER_TIMELINE.md for complete version history.
If you use these papers in your research, please cite:
```bibtex
@misc{zoo2025foundation,
  title={Zoo Labs Foundation: AI-Blockchain Research for Conservation, Education, and Frontier Science},
  author={Zoo Labs Foundation Inc},
  year={2025},
  organization={zoo.ngo}
}

@misc{zoo2025tokenomics,
  title={Zoo Network Tokenomics: Fair Distribution Through Complete Airdrop},
  author={Zoo Labs Foundation Inc},
  year={2025},
  organization={zoo.ngo}
}

@misc{zoo2025grpo,
  title={HLLM with Training-Free GRPO: 99.8\% Cost Reduction via Semantic Optimization},
  author={Zoo Labs Foundation Inc},
  year={2025},
  organization={zoo.ngo}
}

@misc{zoo2025dso,
  title={Experience Ledger: Decentralized Semantic Optimization for Large Language Models},
  author={Zoo Labs Foundation Inc},
  year={2025},
  organization={zoo.ngo}
}

@misc{zoo2025gym,
  title={Gym: A Unified Platform for AI Model Training and Fine-Tuning at Scale},
  author={Zoo Labs Foundation Inc},
  year={2025},
  organization={zoo.ngo}
}

@misc{zip2025reranker,
  title={ZIP-002: Zen-Reranker with Native 7680-Dimensional Embeddings},
  author={Zoo Labs Foundation Inc},
  year={2025},
  organization={zoo.ngo}
}
```

- Website: https://zoo.ngo
- GitHub: https://github.com/zooai/gym
- Documentation: https://docs.zoo.ngo
- Hugging Face: https://huggingface.co/zooai
- Discord: https://discord.gg/zooai
- Twitter: https://twitter.com/zoolabsfdn
For questions about these research papers:
- Email: research@zoo.ngo
- Issues: https://github.com/zooai/gym/issues
- Discussions: https://github.com/zooai/gym/discussions
All research papers are released under Creative Commons Attribution 4.0 International (CC BY 4.0).
You are free to:
- Share — copy and redistribute the material
- Adapt — remix, transform, and build upon the material
Under the following terms:
- Attribution — You must give appropriate credit
See LICENSE file for full terms.
Last Updated: October 28, 2025
Organization: Zoo Labs Foundation Inc (501(c)(3) Non-Profit)
Status: 7 papers complete, ready for publication