# Tesslate — Complete Reference

> The open-source AI builder stack for startups and growing companies. Ship like a superteam.

## About Tesslate

Tesslate Corporation is a software company founded in 2025, based in North Carolina, USA. We build Tesslate Studio, an open-source, self-hostable AI-powered development platform. Our mission: make it possible for small teams and solo developers to build and ship software at the speed of much larger organizations, using AI agents that handle the heavy lifting — from code generation to deployment.

## Tesslate Studio — Product Details

### What It Is

Tesslate Studio is an all-in-one AI-native development environment. It combines:

- **Full-stack code generation**: Describe what you want in natural language — a prompt, a sketch, a spec, or a goal — and Studio generates the frontend, backend, database, APIs, and container configuration.
- **AI agent orchestration**: Build, customize, and chain AI agents to automate repetitive development tasks. Agents can write code, run tests, deploy services, and more.
- **Built-in development tools**: Code editor, integrated terminal, live browser preview, database viewer, and deployment pipeline — all in one interface.
- **Smart Save**: One-click snapshots of your entire project state. Version your work without thinking about git.
- **Agent Marketplace**: Pre-built agent templates you can use immediately or customize.
### Architecture

Studio uses a Docker-based microservices architecture:

- **Frontend**: React 19 SPA
- **Backend**: Python/FastAPI
- **Database**: PostgreSQL
- **LLM Routing**: LiteLLM (supports OpenAI, Anthropic, Google, and any OpenAI-compatible provider)
- **Container Orchestration**: Docker Compose with Traefik reverse proxy
- **Devserver**: Isolated Docker container for running user code safely

### Supported LLM Providers

Tesslate Studio supports all major LLM providers through LiteLLM:

- OpenAI (GPT-4o, GPT-4.1, o3, o4-mini)
- Anthropic (Claude Opus 4.6, Claude Sonnet 4.6, Claude Haiku 4.5)
- Google (Gemini 2.5 Pro, Gemini 2.5 Flash)
- Any OpenAI-compatible API endpoint
- Self-hosted models via vLLM, Ollama, llama.cpp, LM Studio

### Installation

**One-command install:**

```bash
curl -fsSL https://tesslate.com/install.sh | bash
```

**Manual install:**

1. Install Docker Desktop (macOS/Windows) or Docker Engine (Linux)
2. `git clone https://github.com/TesslateAI/Studio.git && cd Studio`
3. `cp .env.example .env` — edit `SECRET_KEY` and `LITELLM_MASTER_KEY`
4. `docker compose up -d`
5. `docker build -t tesslate-devserver:latest -f orchestrator/Dockerfile.devserver orchestrator/`
6. Open http://studio.localhost

**Requirements:**

- Docker Desktop or Docker Engine
- Git
- 8 GB RAM minimum (16 GB recommended)
- An LLM API key

### Pricing Tiers

| Plan | Price | Projects | Storage | Key Features |
|------|-------|----------|---------|--------------|
| Free | $0 | 3 | 1 GB | All AI models, $150 credits, open source |
| Basic | $20/mo | 15 | 5 GB | BYOK, priority support |
| Pro | $49/mo | Unlimited | 25 GB | GPU workflows, priority builds, advanced agents |
| Ultra | $149/mo | Unlimited | 100 GB | Dedicated resources, team collab, custom agents |

All paid plans support bring-your-own API keys. Annual billing saves ~17%.

## OmniCoder

OmniCoder is Tesslate's free, open-source coding model.
It's a 9-billion-parameter language model fine-tuned for code generation and developer tool integration.

### Key Details

- **Model ID**: Tesslate/OmniCoder-9B
- **Parameters**: 9 billion
- **Context Length**: 131,072 tokens
- **Max Output**: 16,384 tokens
- **License**: Open source
- **HuggingFace**: https://huggingface.co/Tesslate/OmniCoder-9B

### Supported Tools

OmniCoder can be configured as the backend model for:

- Claude Code (Anthropic Messages API)
- Codex CLI (OpenAI Responses API)
- OpenCode (OpenAI Chat Completions)
- Cline (VS Code extension)
- Roo Code (VS Code extension)
- Kilo Code (VS Code extension)
- Cursor
- Goose
- Crush

### Setup

```bash
curl -fsSL https://tesslate.com/install_omnicoder.sh | bash
```

The script interactively configures your chosen serving endpoint and tools.

### Serving Options

- Tesslate cloud endpoint (free during beta)
- Local vLLM (localhost:8000)
- Local llama.cpp (localhost:8080)
- Local Ollama (localhost:11434)
- Local LM Studio (localhost:1234)
- Custom endpoint

## Company Information

- **Legal Name**: Tesslate Corporation
- **Founded**: 2025
- **Location**: North Carolina, USA
- **Website**: https://tesslate.com
- **GitHub Organization**: https://github.com/TesslateAI

### Social Media

- X/Twitter: https://x.com/tesslateai
- LinkedIn: https://www.linkedin.com/company/tesslate/
- Instagram: https://instagram.com/tesslateai
- Discord: https://discord.gg/WgXabcN2r2

### Contact

- General inquiries: team@tesslate.com
- Security reports: security@tesslate.com
- Legal questions: legal@tesslate.com
- Book a call: https://calendar.app.google/ZFFv4Gs7M7UM6FWx9

## All Links

- Homepage: https://tesslate.com
- Cloud Studio: https://studio.tesslate.com
- Install page: https://tesslate.com/install
- Pricing: https://tesslate.com/pricing
- Documentation: https://docs.tesslate.com
- Blog: https://tesslate.com/blog
- Changelog: https://tesslate.com/changelog
- Manifesto: https://tesslate.com/manifesto
- Careers:
  https://tesslate.com/careers
- Newsroom: https://tesslate.com/newsroom
- Security: https://tesslate.com/security
- Privacy Policy: https://tesslate.com/privacy
- Terms of Service: https://tesslate.com/terms
- GitHub (Studio): https://github.com/TesslateAI/Studio
- HuggingFace (OmniCoder): https://huggingface.co/Tesslate/OmniCoder-9B
- Discord: https://discord.gg/WgXabcN2r2
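The local serving options listed under OmniCoder above (vLLM, llama.cpp, Ollama, LM Studio) all expose OpenAI-compatible chat endpoints, so one request shape works against any of them. Below is a minimal sketch of that request for a local vLLM server; the port comes from the serving-options list, but treat the exact endpoint path, payload fields, and model name as assumptions to verify against your own server's documentation.

```python
import json

# OpenAI-compatible chat-completions payload for a locally served OmniCoder.
# BASE_URL assumes local vLLM from the serving-options list; adjust the port
# for llama.cpp (8080), Ollama (11434), or LM Studio (1234).
BASE_URL = "http://localhost:8000/v1"

payload = {
    "model": "Tesslate/OmniCoder-9B",  # model ID from the Key Details section
    "messages": [
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": "Write a Python function that reverses a string."},
    ],
    "max_tokens": 1024,  # must stay within OmniCoder's 16,384-token output cap
}

# Equivalent curl (endpoint path assumed from the OpenAI-compatible convention):
#   curl http://localhost:8000/v1/chat/completions \
#     -H "Content-Type: application/json" \
#     -d '<payload JSON>'
print(json.dumps(payload, indent=2))
```

Because every tool in the Supported Tools list speaks one of these OpenAI-style APIs, pointing it at `BASE_URL` with this model ID is, in outline, what the `install_omnicoder.sh` setup script automates.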