
Milestones & Roadmap

March & April 2025 - Caffeinated Sloth

Mission: Build the market's best single-node inference server - methodical yet surprisingly fast.

March Focus: Precision Architecture

This month we're tightening Cortex's foundation: improving security protocols, enhancing reliability, and making configuration more flexible. We'll also roll out new benchmarking tools to measure our progress and create tutorials to help users get the most from the platform.

April Focus: Accelerated Evolution

April is all about making Cortex faster and easier to use. We're launching two videos - an intro to Cortex on YouTube and a technical presentation at C++Now. The Python engine gets supercharged with vLLM integration, and we're adding Intel GPU support to expand hardware compatibility.

Key Deliverables:

  • Hardened security measures
  • Flexible configuration options
  • Comprehensive benchmarking suite
  • Python engine performance boost
  • Intel GPU support
  • Introduction and technical videos

May 2025 - Quantum Waffle

Mission: Expand Cortex's reach through better distribution channels and improved developer experience.

May Focus: Distribution & Access

This month is about making Cortex available everywhere developers want it. We're adding support for multiple package managers, creating specialized Docker images for different use cases, and launching our redesigned documentation site. The new Model Hub will make finding and using the right models simpler than ever, while expanded GPU support ensures more hardware compatibility.

Key Deliverables:

  • Package manager integrations (npm, pip, etc.)
  • Specialized Docker images for various deployment scenarios
  • Redesigned documentation site with improved navigation
  • New Model Hub interface with enhanced filtering and metrics
  • Support for additional GPU architectures
  • Streamlined developer workflows and CLI improvements