This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
- `npm run dev` - Start development server on localhost:5173
- `npm run dev:host` - Start development server with host access
- `npm run build` - Build for production (includes SSR build and customization)
- `npm start` - Start production server from build
- `npm run preview` - Preview production build
- `npm run check` - Run Svelte type checking
- `npm run check:watch` - Run type checking in watch mode
- `npm run format` - Format code with Prettier
- `npm run lint` - Check code formatting (use this for validation)
- `npm run db:generate` - Generate Drizzle migrations
- `npm run db:migrate` - Run database migrations
- `npm run db:studio` - Open Drizzle Studio for database inspection
- `npm run bundle` - Create distribution bundle
- `npm run dist` - Full build and bundle process
- Frontend: SvelteKit 2.x with Svelte 5, TypeScript
- Styling: TailwindCSS 4.x with Skeleton UI components
- Database: Drizzle ORM with PostgreSQL (PGLite for embedded use)
- Real-time: WebSockets via sveltekit-io for live updates
- Build: Vite with Node.js adapter
Key Directories:
- `src/lib/client/` - Client-side components and utilities
- `src/lib/server/` - Server-side logic (database, sockets, AI adapters)
- `src/lib/shared/` - Shared constants and utilities
- `src/routes/` - SvelteKit routes and API endpoints
Core Components:
- `src/lib/client/components/` - Reusable Svelte components organized by feature
- `src/lib/client/connectionForms/` - AI connection configuration forms
- `src/lib/server/connectionAdapters/` - AI model integration adapters
- `src/lib/server/sockets/` - WebSocket event handlers
- `src/lib/server/utils/promptBuilder/` - Advanced prompt compilation system
The application uses a comprehensive schema defined in `src/lib/server/db/schema.ts`:
Core Entities:
- `users` - User accounts with active configuration references
- `connections` - AI model connections (OpenAI, Ollama, LM Studio, etc.)
- `characters` - AI-controlled characters with rich metadata
- `personas` - User-controlled characters/identities
- `chats` - Conversations (1:1 or group) with message history
- `lorebooks` - Advanced worldbuilding and context management
Configuration Tables:
- `samplingConfigs` - AI generation parameters (temperature, top-p, etc.)
- `contextConfigs` - Handlebars templates for prompt formatting
- `promptConfigs` - System prompts and instructions
WebSocket-based real-time updates using sveltekit-io:
- All CRUD operations emit to user rooms for live UI updates
- Socket handlers in `src/lib/server/sockets/` manage events
- Each client connects to a single user room for security
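The emit-to-user-room flow described above can be sketched as follows; the handler and event names here are illustrative, not the repository's actual API, and a minimal in-memory emitter stands in for sveltekit-io:

```typescript
// Minimal emitter interface standing in for the sveltekit-io server object.
type Emitter = { to(room: string): { emit(event: string, payload: unknown): void } };

interface SocketContext {
  userId: string;
  io: Emitter;
}

// A CRUD handler persists the change, then emits the result to the user's
// room so every connected client for that user updates live.
function handleCharacterUpdate(ctx: SocketContext, character: { id: string; name: string }) {
  // ...persist via Drizzle here...
  ctx.io.to(`user:${ctx.userId}`).emit('character:updated', character);
}

// In-memory emitter to demonstrate the flow without a real socket server.
const received: Array<{ room: string; event: string; payload: unknown }> = [];
const io: Emitter = {
  to: (room) => ({ emit: (event, payload) => received.push({ room, event, payload }) }),
};

handleCharacterUpdate({ userId: 'u1', io }, { id: 'c1', name: 'Ada' });
console.log(received[0].room, received[0].event); // user:u1 character:updated
```

Scoping emits to `user:${userId}` is what keeps one user's updates from leaking into another user's clients.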
Modular adapter pattern for AI model support:
- `BaseConnectionAdapter` - Abstract base for all AI connections
- Individual adapters for OpenAI, Ollama, LM Studio, Llama.cpp
- `PromptBuilder` - Advanced prompt compilation with Handlebars templating
- Token counting and context management per connection type
Sophisticated prompt compilation system with:
- Handlebars-based templating with custom helpers
- Modular content inclusion strategies
- Lorebook integration with keyword matching (vectorization planned)
- Token-aware context truncation
- Multiple output formats (chat completion vs. text completion)
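Token-aware truncation, as listed above, can be sketched as dropping the oldest messages until the history fits a budget. This is an illustrative standalone example, not the repo's actual `PromptBuilder` API, and it uses a naive character-based token estimate where real adapters would use a model-specific tokenizer:

```typescript
interface Message { role: 'user' | 'assistant'; content: string }

// Rough token estimate (~4 characters per token); a stand-in for a real tokenizer.
const countTokens = (text: string) => Math.ceil(text.length / 4);

function truncateToBudget(messages: Message[], budget: number): Message[] {
  const kept: Message[] = [];
  let used = 0;
  // Walk from newest to oldest so the most recent turns survive truncation.
  for (let i = messages.length - 1; i >= 0; i--) {
    const cost = countTokens(messages[i].content);
    if (used + cost > budget) break;
    kept.unshift(messages[i]);
    used += cost;
  }
  return kept;
}

const history: Message[] = [
  { role: 'user', content: 'a'.repeat(400) },      // ~100 tokens
  { role: 'assistant', content: 'b'.repeat(400) }, // ~100 tokens
  { role: 'user', content: 'c'.repeat(400) },      // ~100 tokens
];
console.log(truncateToBudget(history, 250).length); // 2 (oldest message dropped)
```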
Always run `npm run db:generate` after schema changes, then `npm run db:migrate` to apply them.
When adding new socket events, register them in `src/lib/server/sockets/index.ts` following the existing pattern.
New AI connection types should extend `BaseConnectionAdapter` and implement its required methods. Also add connection defaults and sampling key mappings for the new backend.
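A new adapter might look like the following sketch. The base class and method names here are illustrative stand-ins for the repo's `BaseConnectionAdapter` (defined inline so the example is self-contained), and the Ollama-style parameter names are an assumption:

```typescript
interface CompletionRequest { prompt: string; temperature: number }

// Stand-in for the repo's abstract BaseConnectionAdapter.
abstract class ConnectionAdapterSketch {
  constructor(protected baseUrl: string) {}
  // Each backend maps generic sampling keys to its own parameter names.
  abstract mapSamplingKeys(params: Record<string, number>): Record<string, number>;
  abstract buildRequestBody(req: CompletionRequest): Record<string, unknown>;
}

class OllamaAdapterSketch extends ConnectionAdapterSketch {
  mapSamplingKeys(params: Record<string, number>) {
    // "num_predict" as the max-token key is Ollama-style naming, assumed for illustration.
    return { temperature: params.temperature, num_predict: params.maxTokens };
  }
  buildRequestBody(req: CompletionRequest) {
    return {
      model: 'llama3', // hypothetical default model
      prompt: req.prompt,
      options: this.mapSamplingKeys({ temperature: req.temperature, maxTokens: 256 }),
    };
  }
}

const adapter = new OllamaAdapterSketch('http://localhost:11434');
const body = adapter.buildRequestBody({ prompt: 'Hello', temperature: 0.7 });
console.log(body.options); // { temperature: 0.7, num_predict: 256 }
```

Keeping the sampling-key mapping in one overridable method is what lets a single generic `samplingConfigs` row drive every backend.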
The advanced lorebook system supports:
- World lore (global context)
- Character lore (character-specific context)
- History entries (timeline-based context)
- Binding system for dynamic character references
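Keyword-based activation (the matching strategy named earlier, ahead of the planned vectorization) can be sketched as below; the entry shape is assumed for illustration, since the real schema lives in `src/lib/server/db/schema.ts`:

```typescript
interface LoreEntry { keywords: string[]; content: string }

// An entry is injected into the prompt when any of its keywords appears in
// the recent chat text (case-insensitive whole-word match).
function activeEntries(entries: LoreEntry[], recentText: string): LoreEntry[] {
  const text = recentText.toLowerCase();
  return entries.filter((entry) =>
    entry.keywords.some((k) => new RegExp(`\\b${k.toLowerCase()}\\b`).test(text)),
  );
}

const lore: LoreEntry[] = [
  { keywords: ['Eldoria'], content: 'Eldoria is a floating city.' },
  { keywords: ['dragon', 'wyrm'], content: 'Dragons hoard memories, not gold.' },
];
console.log(activeEntries(lore, 'Tell me about the dragon of Eldoria').length); // 2
```

A production version would also escape regex metacharacters in keywords and respect per-entry scan depth and priority.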
Components are organized by feature area. Form components follow consistent patterns for WebSocket-based CRUD operations with optimistic updates.
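The optimistic-update flow used by form components follows the usual shape: apply the change to local state immediately, then roll back if the server rejects it. This is a hedged sketch with illustrative names, not the actual component API, and a callback stands in for the WebSocket round trip:

```typescript
type Character = { id: string; name: string };

function optimisticRename(
  store: Map<string, Character>,
  id: string,
  name: string,
  save: (c: Character) => boolean, // stand-in for the WebSocket round trip
) {
  const previous = store.get(id);
  if (!previous) return;
  store.set(id, { ...previous, name }); // optimistic local update
  if (!save(store.get(id)!)) {
    store.set(id, previous); // server rejected the change: roll back
  }
}

const store = new Map([['c1', { id: 'c1', name: 'Ada' }]]);
optimisticRename(store, 'c1', 'Grace', () => false); // simulate rejection
console.log(store.get('c1')!.name); // Ada (rolled back)
```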
No specific test framework is currently configured. Manual testing through the dev server is the primary approach.