Lyceum provides integrated S3 storage that automatically mounts to your execution environments. Upload once, access everywhere.
Overview
Lyceum Cloud provides S3-compatible storage that seamlessly integrates with your computational workloads. Upload data files, scripts, and dependencies, then access them from any execution environment.
Persistent Storage: Files persist across executions with automatic backup
S3 Compatible: Full S3 API compatibility with familiar tools
Auto-mounting: Files automatically available in all environments
Upload Files
VS Code
Web Dashboard
API
Quick Upload Methods
Right-click Upload
Right-click any file or folder → “Upload to Lyceum Cloud”
The VS Code extension provides the fastest way to upload files during development.
Dashboard Upload
Select Files
Click “Upload Files” or drag and drop your files
Programmatic Upload

curl -X POST https://api.lyceum.technology/api/v2/external/storage/upload \
  -H "Authorization: Bearer <token>" \
  -F "file=@/path/to/your/file.txt" \
  -F "key=data/file.txt"
The key parameter is optional. If not provided, the original filename will be used.
import requests

with open('data.csv', 'rb') as f:
    response = requests.post(
        'https://api.lyceum.technology/api/v2/external/storage/upload',
        headers={'Authorization': f'Bearer {token}'},
        files={'file': f},
        data={'key': 'datasets/data.csv'}
    )
Access Files in Code
Files are automatically mounted in your execution environment under the path you define (default: /lyceum/):
Load Data
Environment Variables
List Files
import pandas as pd

# Files are automatically available under /lyceum/
df = pd.read_csv('/lyceum/datasets/data.csv')

# Process your data
results = df.describe()

# Save outputs back to storage
results.to_csv('/lyceum/outputs/results.csv')
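The List Files tab can be sketched with plain standard-library calls: since storage is mounted as an ordinary directory, walking the mount point is enough to enumerate everything. This is an illustrative sketch assuming the default `/lyceum/` mount, not an official helper:

```python
import os

def list_mounted_files(root="/lyceum"):
    """Walk the storage mount point and return file paths relative to it."""
    found = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            found.append(os.path.relpath(os.path.join(dirpath, name), root))
    return sorted(found)

# Example: print every file visible to the execution environment
for path in list_mounted_files():
    print(path)
```

If you configured a non-default mount path, pass it as `root`.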
File Management
API Request

curl -X GET "https://api.lyceum.technology/api/v2/external/storage/list-files" \
  -H "Authorization: Bearer <token>" \
  -G -d "prefix=data/" -d "max_files=100"
Response

{
  "success": true,
  "data": [
    {
      "key": "data/dataset.csv",
      "size": 1024000,
      "last_modified": "2024-01-15T10:30:00Z",
      "etag": "\"9bb58f26192e4ba00f01e2e7b136bbd8\""
    }
  ]
}
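The same listing can be done from Python with requests. The `list_files` wrapper and the `total_size` helper below are illustrative sketches; `total_size` operates directly on the `data` array in the response shape shown above:

```python
import requests

API = "https://api.lyceum.technology/api/v2/external/storage"

def list_files(token, prefix="", max_files=100):
    """Call the list-files endpoint and return the 'data' entries."""
    resp = requests.get(
        f"{API}/list-files",
        headers={"Authorization": f"Bearer {token}"},
        params={"prefix": prefix, "max_files": max_files},
    )
    resp.raise_for_status()
    return resp.json()["data"]

def total_size(entries):
    """Sum the byte sizes reported for a set of entries."""
    return sum(e["size"] for e in entries)
```

For example, `total_size(list_files(token, prefix="data/"))` reports how much of your storage a prefix occupies.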
VS Code Extension

Click the download icon next to any file in the Cloud Files panel.

API Request

curl -X GET "https://api.lyceum.technology/api/v2/external/storage/download/{file_key}" \
  -H "Authorization: Bearer <token>" \
  -o downloaded_file.txt
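The same download can be sketched in Python with requests. Percent-encoding the key (while keeping `/` as a separator) matters when keys contain spaces; `download_url` and `download` below are illustrative helpers, not part of an official client:

```python
import requests
from urllib.parse import quote

API = "https://api.lyceum.technology/api/v2/external/storage"

def download_url(file_key):
    """Build the download URL, percent-encoding the key but keeping '/'."""
    return f"{API}/download/{quote(file_key, safe='/')}"

def download(file_key, token, dest):
    """Stream a file from storage to a local path."""
    headers = {"Authorization": f"Bearer {token}"}
    with requests.get(download_url(file_key), headers=headers, stream=True) as r:
        r.raise_for_status()
        with open(dest, "wb") as f:
            for chunk in r.iter_content(chunk_size=8192):
                f.write(chunk)
```

Streaming in chunks keeps memory flat even for large datasets.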
Single File

curl -X DELETE "https://api.lyceum.technology/api/v2/external/storage/delete/{file_key}" \
  -H "Authorization: Bearer <token>"

Entire Folder

curl -X DELETE "https://api.lyceum.technology/api/v2/external/storage/delete-folder/{folder_prefix}" \
  -H "Authorization: Bearer <token>"
S3 Client Access
Generate temporary S3 credentials for direct access using standard S3 tools:
Get Credentials
Python (boto3)
AWS CLI
curl -X POST https://api.lyceum.technology/api/v2/external/storage/credentials \
-H "Authorization: Bearer <token>"
Response:

{
  "access_key": "AKIAIOSFODNN7EXAMPLE",
  "secret_key": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
  "session_token": "AQoDYXdzEPT//////////",
  "endpoint": "https://s3.lyceum.technology",
  "bucket_name": "user-bucket-123",
  "expires_at": "2024-01-15T12:00:00Z"
}
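Because the credentials are temporary, long-running jobs should check `expires_at` before reusing them and re-fetch when needed. A small sketch (the helper name is illustrative) that parses the timestamp format shown in the response above:

```python
from datetime import datetime, timezone

def credentials_expired(creds, now=None):
    """True once the temporary credentials' expires_at has passed."""
    if now is None:
        now = datetime.now(timezone.utc)
    expires = datetime.fromisoformat(creds["expires_at"].replace("Z", "+00:00"))
    return now >= expires
```

If `credentials_expired(creds)` returns True, call the credentials endpoint again before making further S3 requests.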
import boto3
from botocore.config import Config

# Configure the S3 client with the temporary credentials
s3_client = boto3.client(
    's3',
    aws_access_key_id='YOUR_ACCESS_KEY',
    aws_secret_access_key='YOUR_SECRET_KEY',
    aws_session_token='YOUR_SESSION_TOKEN',
    endpoint_url='https://s3.lyceum.technology',
    region_name='us-east-1',
    config=Config(signature_version='s3v4')
)

# Upload a file
s3_client.upload_file('local.txt', 'user-bucket-123', 'remote.txt')

# Download a file
s3_client.download_file('user-bucket-123', 'remote.txt', 'local.txt')

# List objects
response = s3_client.list_objects_v2(Bucket='user-bucket-123')
for obj in response.get('Contents', []):
    print(obj['Key'], obj['Size'])
# Configure credentials (no spaces around '=' in shell assignments)
export AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY
export AWS_SECRET_ACCESS_KEY=YOUR_SECRET_KEY
export AWS_SESSION_TOKEN=YOUR_SESSION_TOKEN
# Upload file
aws s3 cp local.txt s3://user-bucket-123/remote.txt \
--endpoint-url https://s3.lyceum.technology
# Download file
aws s3 cp s3://user-bucket-123/remote.txt local.txt \
--endpoint-url https://s3.lyceum.technology
# Sync directory
aws s3 sync ./local_folder s3://user-bucket-123/remote_folder \
--endpoint-url https://s3.lyceum.technology
Bulk Upload
Upload multiple files in one request:
curl -X POST https://api.lyceum.technology/api/v2/external/storage/upload-bulk \
  -H "Authorization: Bearer <token>" \
  -F "files=@file1.csv" \
  -F "files=@file2.csv" \
  -F "folder_prefix=data/"
The folder_prefix parameter is optional and will be prepended to all uploaded filenames.
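The same bulk upload can be sketched from Python with requests. The multipart field name `files` mirrors the curl flags above, and the `bulk_fields`/`upload_bulk` helpers are illustrative, not an official client:

```python
import os
import requests

API = "https://api.lyceum.technology/api/v2/external/storage"

def bulk_fields(paths, folder_prefix=None):
    """Build the multipart fields mirroring the curl -F flags."""
    fields = [("files", (os.path.basename(p), open(p, "rb"))) for p in paths]
    if folder_prefix:
        # A (None, value) tuple sends a plain form field, not a file part
        fields.append(("folder_prefix", (None, folder_prefix)))
    return fields

def upload_bulk(token, paths, folder_prefix=None):
    """POST several files to storage in one request."""
    resp = requests.post(
        f"{API}/upload-bulk",
        headers={"Authorization": f"Bearer {token}"},
        files=bulk_fields(paths, folder_prefix),
    )
    resp.raise_for_status()
    return resp.json()
```

Repeating the `files` field once per file is how requests (and curl) express multiple files under one multipart name.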
Best Practices
Never store API keys or credentials in storage
Use temporary S3 credentials when sharing access
Regularly audit and clean up unused files
Keep local backups of critical data
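The audit-and-clean-up practice can be sketched as a filter over the list-files response shown earlier; `stale_files` and the 90-day cutoff below are illustrative choices, not a built-in policy:

```python
from datetime import datetime, timedelta, timezone

def stale_files(entries, max_age_days=90, now=None):
    """Return keys of files not modified within max_age_days."""
    if now is None:
        now = datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_age_days)
    stale = []
    for e in entries:
        modified = datetime.fromisoformat(e["last_modified"].replace("Z", "+00:00"))
        if modified < cutoff:
            stale.append(e["key"])
    return stale
```

Feed the result to the delete endpoint (after reviewing it manually) to keep your bucket tidy.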
Storage is user-scoped and isolated. You can only access files within your own storage bucket. All data is encrypted and requires authentication.