Personal app for recurring checklists and journaling.
- Quickstart: `docs/quickstart.md`
- User guide: `docs/user-guide.md`
- Public API guide: `docs/public-api.md`
- Public API OpenAPI spec: `docs/public-api.openapi.yaml`
- Infra / CDK setup: `infra/README.md`
- GitHub Actions infra deploy setup: `docs/infra-ci-setup.md`
- Next.js / React
- Vercel
- AWS (S3, DynamoDB, SQS, Lambda, Secrets Manager)
- Upstash / Vercel KV
- Purify.ts
- Docker (for local Redis fallback)
Prerequisites:
- Node.js
- Docker
Setup:
- Install dependencies: `npm install`
- Copy the app env template: `cp .env.example .env.local`
- Fill in the required local runtime values in `.env.local`
- Start local Redis fallback: `docker compose up`
- Start the app: `npm run dev`
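Before running the steps above, a small preflight check can confirm the prerequisites are on `PATH`. This helper is a sketch for convenience, not a script shipped in the repo:

```shell
# check_tools: print any of the given tools that are not found on PATH.
# Sketch only -- not part of this repository.
check_tools() {
  missing=""
  for tool in "$@"; do
    # `command -v` succeeds only if the tool is resolvable on PATH.
    command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
  done
  if [ -n "$missing" ]; then
    echo "missing:$missing"
  else
    echo "ok"
  fi
}

# The prerequisites listed above: Node.js (and npm) plus Docker.
check_tools node npm docker
```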
For infra deployment from your machine, also:
- Copy `infra/.env.example` to `infra/.env`
- Fill in the CDK deploy values
- Run `cd infra && npm ci`
Use `.env.example` as the template.
Required:
- `AUTH_SECRET` - Password/JWT signing secret.
- `AWS_REGION`
- `AWS_ROLE_ARN` - App IAM role ARN assumed by the Next.js runtime.
- `AWS_ACCESS_KEY_ID` - Base IAM user access key used to assume `AWS_ROLE_ARN`.
- `AWS_SECRET_ACCESS_KEY` - Base IAM user secret used to assume `AWS_ROLE_ARN`.
- `AWS_BUCKET_NAME`
- `AWS_JOURNAL_VECTOR_BUCKET_NAME`
- `AWS_JOURNAL_VECTOR_INDEX_NAME`
- `AWS_JOURNAL_VECTOR_DIMENSION`
- `AWS_TABLE_NAME`
- `AWS_JOBS_QUEUE_URL` - Queue URL used by the app to enqueue transcription jobs.
- `OPENAI_API_KEY` - Required for app-side OpenAI usage (journal analysis / embeddings).
Production-required in Vercel:
- `VERCEL_PROJECT_PRODUCTION_URL` - Hostname only, no protocol.
- `KV_REST_API_URL`
- `KV_REST_API_TOKEN`
Optional:
- `NODE_ENV`
- `LOG_LEVEL`
- `ADMIN_USERNAMES`
- `JOURNAL_VECTOR_TOP_K`
- `JOURNAL_VECTOR_MAX_DISTANCE`
- `MIN_JOURNAL_ANALYSIS_CHARS`
- `OPENAI_JOURNAL_ANALYSIS_MODEL`
- `VERCEL_GIT_COMMIT_SHA`
- `NEXT_PUBLIC_THEME_OVERRIDE` - `light`, `dark`, or `system`
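Taken together, a filled-in `.env.local` might look like the sketch below. Every value shown is an illustrative placeholder (region, ARNs, bucket and table names are invented); the real values come from your AWS account, your OpenAI dashboard, and the infra deploy outputs:

```shell
# .env.local sketch -- all values below are illustrative placeholders.
AUTH_SECRET="replace-with-a-long-random-string"
AWS_REGION="eu-west-1"
AWS_ROLE_ARN="arn:aws:iam::123456789012:role/app-runtime"
AWS_ACCESS_KEY_ID="AKIAXXXXXXXXXXXXXXXX"
AWS_SECRET_ACCESS_KEY="replace-me"
AWS_BUCKET_NAME="my-app-bucket"
AWS_JOURNAL_VECTOR_BUCKET_NAME="my-app-journal-vectors"
AWS_JOURNAL_VECTOR_INDEX_NAME="journal-index"
AWS_JOURNAL_VECTOR_DIMENSION="1536"
AWS_TABLE_NAME="my-app-table"
AWS_JOBS_QUEUE_URL="https://sqs.eu-west-1.amazonaws.com/123456789012/my-jobs-queue"
OPENAI_API_KEY="sk-replace-me"

# Optional overrides
LOG_LEVEL="debug"
NEXT_PUBLIC_THEME_OVERRIDE="system"
```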
These are not Vercel env vars. They are set on the deployed Lambda by the CDK stack.
Required:
- `AWS_REGION`
- `AWS_BUCKET_NAME`
- `AWS_TABLE_NAME`
- `AWS_APP_SECRET_NAME` - Secrets Manager secret name the worker reads to get `OPENAI_API_KEY`.
- `OPENAI_TRANSCRIPTION_MODEL` - Use a real model id such as `whisper-1`.
- `OPENAI_TRANSCRIPTION_STRUCTURING_MODEL`
Optional:
- `MAX_RECEIVE_ATTEMPTS`
- `TIMEOUT_IN_MIN`
Use `infra/.env.example` as the template.
Required:
- `BASE_URL` - Production hostname only, no protocol.
- `OPENAI_API_KEY`
- `OPENAI_TRANSCRIPTION_MODEL`
- `OPENAI_TRANSCRIPTION_STRUCTURING_MODEL`
- `AWS_ACCOUNT`
- `AWS_REGION`
- `AWS_ALARM_EMAIL`
- `AWS_JOURNAL_VECTOR_DIMENSION`
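A filled-in `infra/.env` might look like this sketch. All values are placeholders; in particular, the structuring model id shown is only an assumption (the doc names `whisper-1` for transcription and `gpt-4o-mini` only for journal analysis):

```shell
# infra/.env sketch -- illustrative placeholders only.
BASE_URL="app.example.com"
OPENAI_API_KEY="sk-replace-me"
OPENAI_TRANSCRIPTION_MODEL="whisper-1"
# Assumed model id -- pick whichever structuring model you actually use.
OPENAI_TRANSCRIPTION_STRUCTURING_MODEL="gpt-4o-mini"
AWS_ACCOUNT="123456789012"
AWS_REGION="eu-west-1"
AWS_ALARM_EMAIL="alerts@example.com"
AWS_JOURNAL_VECTOR_DIMENSION="1536"
```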
Deploy the infra stack first. See `infra/README.md`.
For `.github/workflows/infra-cdk.yml`, configure the production GitHub environment.
GitHub secrets:
- `AWS_ROLE_TO_ASSUME_PROD`
- `OPENAI_API_KEY`
GitHub vars:
- `BASE_URL`
- `AWS_REGION`
- `AWS_ALARM_EMAIL`
- `AWS_JOURNAL_VECTOR_DIMENSION`
- `OPENAI_TRANSCRIPTION_MODEL`
- `OPENAI_TRANSCRIPTION_STRUCTURING_MODEL`
After infra deploy, pull the generated AWS app secret values:
`./scripts/pull-aws-secrets.sh`
Set the resulting runtime values in Vercel Production:
- `AUTH_SECRET`
- `VERCEL_PROJECT_PRODUCTION_URL`
- `AWS_REGION`
- `AWS_ROLE_ARN`
- `AWS_ACCESS_KEY_ID`
- `AWS_SECRET_ACCESS_KEY`
- `AWS_BUCKET_NAME`
- `AWS_JOURNAL_VECTOR_BUCKET_NAME`
- `AWS_JOURNAL_VECTOR_INDEX_NAME`
- `AWS_JOURNAL_VECTOR_DIMENSION`
- `AWS_TABLE_NAME`
- `AWS_JOBS_QUEUE_URL`
- `OPENAI_API_KEY`
- `KV_REST_API_URL`
- `KV_REST_API_TOKEN`
Optional but recommended in Vercel:
- `LOG_LEVEL=info`
- `OPENAI_JOURNAL_ANALYSIS_MODEL=gpt-4o-mini`
- `ADMIN_USERNAMES=<comma-separated admins>`
After Vercel env vars are set, redeploy the app so all runtime env changes take effect.
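If you pull the production values back to your machine (for example with the Vercel CLI's `vercel env pull`), a quick sanity check like this sketch can confirm nothing in the required list above is missing before you redeploy. The helper name is invented for illustration:

```shell
# check_required_env: print any of the named environment variables that are
# unset or empty in the current shell. Sketch only -- not part of this repo.
check_required_env() {
  missing=""
  for name in "$@"; do
    # Indirect lookup: expand the variable whose name is in $name.
    eval "val=\${$name:-}"
    if [ -z "$val" ]; then
      missing="$missing $name"
    fi
  done
  if [ -n "$missing" ]; then
    echo "missing:$missing"
  else
    echo "ok"
  fi
}

# The Vercel Production runtime list from the section above.
check_required_env AUTH_SECRET VERCEL_PROJECT_PRODUCTION_URL AWS_REGION \
  AWS_ROLE_ARN AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_BUCKET_NAME \
  AWS_JOURNAL_VECTOR_BUCKET_NAME AWS_JOURNAL_VECTOR_INDEX_NAME \
  AWS_JOURNAL_VECTOR_DIMENSION AWS_TABLE_NAME AWS_JOBS_QUEUE_URL \
  OPENAI_API_KEY KV_REST_API_URL KV_REST_API_TOKEN
```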
- Local development uses the Docker Redis fallback instead of Vercel KV.
- The app runtime and the worker runtime are separate env surfaces. Do not put Lambda-only vars into Vercel unless you intentionally need them there.
- The app runtime key is `AWS_JOBS_QUEUE_URL`, not `AWS_TRANSCRIPTION_JOBS_QUEUE_URL`.