feat: Add custom system prompt support for LLM text formatting #90

@KojoBarbie

Summary

The LLM-based text formatting pipeline currently uses a hardcoded system prompt and application-type-specific rules defined in formatter-prompt.ts. Users have no way to add their own instructions to influence how the LLM formats transcribed text.

Current Behavior

  • The system prompt ("You are a professional text formatter...") and application-type-specific rules (email, chat, notes, etc.) are hardcoded in apps/desktop/src/pipeline/providers/formatting/formatter-prompt.ts
  • FormatterConfig only exposes enabled, modelId, and fallbackModelId
  • The Settings UI only allows toggling formatting on/off and selecting a model
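A rough sketch of the current config shape, inferred only from the fields listed above (the exact types in types/formatter.ts are assumptions, not copied from source):

```typescript
// Hypothetical sketch of the current FormatterConfig, based on the three
// fields named in this issue; exact types are assumptions.
interface FormatterConfig {
  enabled: boolean;
  modelId: string;
  fallbackModelId?: string;
}

// Example value matching that shape ("some-model" is a placeholder id):
const config: FormatterConfig = {
  enabled: true,
  modelId: "some-model",
};
```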

Proposed Change

Allow users to define a custom system prompt that is appended to the existing built-in system prompt when sent to the LLM. The built-in prompt and application-type rules remain unchanged and always apply — the custom prompt provides additional user-specific instructions on top of them.

Example use cases

  • “Always use polite language when writing.”
  • “Leave technical terms in English.”
  • “Use bullet points frequently.”
  • Domain-specific instructions (medical, legal, etc.)

Implementation scope

  1. Type/Schema: Add an optional customSystemPrompt field to FormatterConfig in types/formatter.ts
  2. Database: Update AppSettingsData in db/schema.ts to persist the custom prompt
  3. Prompt construction: Update constructFormatterPrompt() in formatter-prompt.ts to append the user-defined prompt after the built-in prompt when present
  4. tRPC router: Update setFormatterConfig mutation in trpc/routers/settings.ts to accept the new field
  5. Settings UI: Add a textarea to FormattingSettings.tsx for editing the custom system prompt, with a clear/reset option
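Step 3 could look roughly like the sketch below. `buildBuiltInPrompt` and the `ApplicationType` values are stand-ins, not the actual helpers in formatter-prompt.ts; only the append logic reflects the proposal:

```typescript
// Minimal sketch of the prompt-construction change. The built-in prompt
// builder and application-type handling are placeholders.
type ApplicationType = "email" | "chat" | "notes";

function buildBuiltInPrompt(appType: ApplicationType): string {
  // Stand-in for the hardcoded system prompt + application-type rules.
  return `You are a professional text formatter... [rules for ${appType}]`;
}

function constructFormatterPrompt(
  appType: ApplicationType,
  customSystemPrompt?: string
): string {
  const builtIn = buildBuiltInPrompt(appType);
  const custom = customSystemPrompt?.trim();
  // The built-in prompt always applies; a non-empty custom prompt is
  // appended after it, never substituted for it.
  return custom ? `${builtIn}\n\n${custom}` : builtIn;
}
```

Trimming the custom prompt means a whitespace-only value behaves like the unset default.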

Behavior

  • Custom prompt empty (default): Existing behavior — only the built-in system prompt is sent
  • Custom prompt set: Built-in system prompt + custom prompt are concatenated and sent together to the LLM
  • The built-in prompt is never replaced or overridden by the custom prompt
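For illustration, this is roughly what the resulting request could look like in an OpenAI-style chat payload; the desktop app's actual provider interface may differ, and both prompt strings here are placeholders:

```typescript
// Illustrative only: the concatenated system message in an OpenAI-style
// chat request. Strings are placeholders, not the real prompts.
const builtIn = "You are a professional text formatter...";
const custom = "Leave technical terms in English.";

const messages = [
  // Built-in prompt first, custom prompt appended after it.
  { role: "system", content: `${builtIn}\n\n${custom}` },
  { role: "user", content: "raw transcribed text here" },
];
```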

Happy to contribute a PR if this is something you'd accept.

Metadata

Labels

enhancement (New feature or request)
