Overview
Lyceum Cloud is a managed compute platform for running code and serving models on cloud GPUs. The dashboard, CLI, and REST API all expose the same underlying capabilities, organised around a few core building blocks.
Quickstart
Create an account, install the CLI or VS Code extension, and run your first job.
What you can do
Serverless runs
Submit Python scripts, Docker images, or Docker Compose stacks. Pay per second of execution.
VM instances
Provision long-lived GPU VMs with SSH access for interactive work and custom environments.
Dedicated inference
Deploy any Hugging Face model to a dedicated GPU endpoint with autoscaling.
Object storage
Per-user S3-compatible bucket for inputs, outputs, and shared datasets.
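Because the bucket is S3-compatible, standard S3 tooling can talk to it. A minimal sketch, assuming a hypothetical endpoint URL, bucket name, and credentials (take the real values from the dashboard), using boto3 with a custom endpoint:

```python
import os

def object_key(prefix: str, local_path: str) -> str:
    """Join a dataset prefix and the file's basename into a clean object key."""
    name = os.path.basename(local_path)
    return "/".join(p.strip("/") for p in (prefix, name) if p)

def make_client(endpoint_url: str, access_key: str, secret_key: str):
    """Create an S3 client pointed at the platform's S3-compatible endpoint."""
    import boto3  # imported lazily so the pure helpers above stay dependency-free
    return boto3.client(
        "s3",
        endpoint_url=endpoint_url,        # storage URL shown in the dashboard
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
    )

def upload(client, bucket: str, prefix: str, local_path: str):
    """Upload a local file under the given prefix in your bucket."""
    client.upload_file(local_path, bucket, object_key(prefix, local_path))
```

The same client handles downloads (`client.download_file`) and listing (`client.list_objects_v2`), so existing S3 pipelines usually need only the `endpoint_url` change.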
Ways to access the platform
Dashboard
Web UI for everything: launching runs and VMs, deploying models, managing storage, billing, and API keys.
CLI
The lyceum command-line tool for runs, VMs, inference, and storage from your terminal.
REST API
Programmatic access to every dashboard feature. OpenAI-compatible inference endpoints.
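Because the inference endpoints are OpenAI-compatible, any OpenAI-style client can call them. A minimal sketch with the standard library, assuming a hypothetical base URL and model name (the real host and your deployed models come from the dashboard):

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build an OpenAI-style /v1/chat/completions request: (url, headers, body)."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    url = base_url.rstrip("/") + "/v1/chat/completions"
    return url, headers, json.dumps(body).encode()

def chat(base_url: str, api_key: str, model: str, prompt: str) -> str:
    """Send the request and return the first choice's message content."""
    url, headers, data = build_chat_request(base_url, api_key, model, prompt)
    req = urllib.request.Request(url, data=data, headers=headers, method="POST")
    with urllib.request.urlopen(req) as resp:
        out = json.loads(resp.read())
    return out["choices"][0]["message"]["content"]
```

An OpenAI-compatible shape means official OpenAI SDKs should also work by overriding their base URL to point at your endpoint.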
Authentication
Every request to the platform is authenticated with a bearer token in the Authorization header. Two token types are supported:
- API keys — long-lived tokens prefixed lk_, created and managed in API Keys. Use these for CLI, scripts, and integrations.
- JWT tokens — short-lived tokens issued by the dashboard login flow. Use these for testing in the browser or API playground.
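Whichever token type you use, the request shape is identical. A minimal sketch of attaching the bearer token with Python's standard library (the base URL and endpoint path here are illustrative assumptions, not documented routes):

```python
import json
import urllib.request

# Hypothetical API host -- substitute the real base URL from the dashboard.
BASE_URL = "https://api.example.com"

def bearer_headers(token: str) -> dict:
    """Build the Authorization header; works for both lk_ API keys and JWTs."""
    return {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }

def get(path: str, token: str):
    """Send an authenticated GET request and decode the JSON response."""
    req = urllib.request.Request(BASE_URL + path, headers=bearer_headers(token))
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

The only difference between the two token types is lifetime and origin; the header format does not change.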
Generate an API key
Create your first API key from the dashboard.

