A comprehensive platform for building phonologically-valid constructed languages with AI-powered features, alphabet-to-IPA mapping, and advanced lexicon management. Built with modern web technologies, AWS cloud infrastructure, and enterprise-grade backup and disaster recovery capabilities.
- Overview
- Architecture
- Features
- Prerequisites
- Quick Start
- Development Setup
- Infrastructure
- Database Management
- Backup & Recovery
- Data Retention
- Database Schema
- API Documentation
- Frontend Architecture
- Security
- Deployment
- Monitoring & Logging
- Testing
- Disaster Recovery
- Contributing
- Troubleshooting
- License
PhaserAI is a production-ready web application designed for constructed language (conlang) creators. It provides tools for:
- Creating custom alphabets with IPA phoneme mapping
- Managing comprehensive lexicons with etymology tracking
- Validating phonological rules and syllable structures
- Generating AI-powered word suggestions
- Importing/exporting linguistic data
- Multi-language translation support
- Frontend: React 19 + TypeScript with 50+ components
- Backend: AWS serverless architecture with RDS PostgreSQL
- Infrastructure: CDK-managed AWS resources with automated backups
- Security: Cognito authentication with row-level security
- Performance: <50MB Docker images, sub-second API responses
- Reliability: 99.9% uptime target with 4-hour RTO, 1-hour RPO
┌─────────────────┐     ┌──────────────────┐     ┌─────────────────┐
│    React SPA    │     │   API Gateway    │     │  RDS PostgreSQL │
│   (Frontend)    │────▶│    + Lambda      │────▶│   (Database)    │
│                 │     │    (Backend)     │     │                 │
└─────────────────┘     └──────────────────┘     └─────────────────┘
         │                       │                        │
         │              ┌──────────────────┐              │
         └─────────────▶│   Cognito Auth   │◀─────────────┘
                        │    (Identity)    │
                        └──────────────────┘
                                 │
         ┌───────────────────────┼────────────────────────┐
         │                       │                        │
         ▼                       ▼                        ▼
┌─────────────────┐     ┌──────────────────┐     ┌─────────────────┐
│   AWS Backup    │     │    Migration     │     │   Monitoring    │
│  + S3 Storage   │     │      Lambda      │     │   + Alerting    │
│                 │     │                  │     │                 │
└─────────────────┘     └──────────────────┘     └─────────────────┘
Frontend:
- Framework: React 19 with TypeScript
- Build Tool: Vite 5.4+ with SWC
- Styling: Tailwind CSS + shadcn/ui components
- State Management: Zustand + TanStack Query
- Forms: React Hook Form + Zod validation
- Routing: React Router v6
Backend:
- API: AWS API Gateway + Lambda functions
- Database: Amazon RDS PostgreSQL 15.8 with Multi-AZ
- Authentication: AWS Cognito with OAuth
- Infrastructure: AWS CDK (TypeScript)
DevOps & Operations:
- Containerization: Docker multi-stage builds
- Web Server: Nginx with security headers
- Monitoring: CloudWatch + health checks
- Backup: AWS Backup + S3 with lifecycle management
- Migration: Automated database schema versioning
- CI/CD: Ready for GitHub Actions integration
- Define consonants, vowels, and diphthongs
- Map alphabet letters to IPA phonemes
- Bidirectional conversion (alphabet ↔ IPA)
- Support for multi-character mappings (e.g., "th" → "θ")
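Multi-character mappings require longest-match-first lookup so that digraphs like "th" win over single letters. A minimal, hypothetical sketch of the alphabet-to-IPA direction (not the app's actual converter; the function name and signature are illustrative):

```typescript
type Mapping = Record<string, string>;

function toIPA(word: string, mapping: Mapping): string {
  // Sort graphemes longest-first so "th" is tried before "t"
  const graphemes = Object.keys(mapping).sort((a, b) => b.length - a.length);
  let out = "";
  let i = 0;
  while (i < word.length) {
    const match = graphemes.find(g => word.startsWith(g, i));
    if (match) {
      out += mapping[match];
      i += match.length;
    } else {
      out += word[i]; // pass through unmapped characters unchanged
      i += 1;
    }
  }
  return out;
}
```

The reverse (IPA → alphabet) direction would use the inverted mapping with the same longest-match strategy.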
- Real-time syllable structure checking
- Customizable phonotactic rules
- Violation tracking and reporting
- Visual feedback for invalid constructions
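At its core, syllable structure checking reduces to matching a phoneme sequence against a CV template. A simplified sketch, assuming phonemes are already segmented (illustrative names, not PhaserAI's full phonotactic engine):

```typescript
// "C" slots must hold consonants, "V" slots vowels.
function matchesPattern(
  syllable: string[],
  pattern: string,
  consonants: Set<string>,
  vowels: Set<string>,
): boolean {
  if (syllable.length !== pattern.length) return false;
  return syllable.every((ph, i) =>
    pattern[i] === "C" ? consonants.has(ph) : vowels.has(ph));
}
```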
- Interactive consonant chart (manner × place)
- Vowel chart (height × backness)
- Highlight enabled phonemes
- Click-to-insert functionality
- Parent-child word relationships
- Derivation type classification
- Historical sound changes
- Visual etymology trees
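Parent-child relationships make etymology a tree walk. A hypothetical sketch of computing a word's derivation depth from parent links (field names mirror the `word_etymology` schema shown later in this README; the helper itself is illustrative):

```typescript
interface EtymLink {
  word_id: string;
  parent_word_id: string | null;
}

function etymologyDepth(wordId: string, links: EtymLink[]): number {
  // Index child → parent, then walk up until we hit a root
  const parent = new Map<string, string | null>(
    links.map(l => [l.word_id, l.parent_word_id] as [string, string | null]),
  );
  let depth = 0;
  let cur: string | null | undefined = parent.get(wordId);
  while (cur) {
    depth += 1;
    cur = parent.get(cur);
  }
  return depth;
}
```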
- OpenAI-powered word generation
- Phonologically-aware suggestions
- Context-sensitive recommendations
- Batch generation capabilities
- Translation management
- Language code standardization
- Bulk import/export
- Search across translations
- Lexicon growth tracking
- Phoneme frequency analysis
- Etymology depth metrics
- Usage pattern visualization
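Phoneme frequency analysis is, at bottom, a counter over segmented words. An illustrative sketch (the real analysis pipeline is not shown in this README):

```typescript
function phonemeFrequencies(words: string[][]): Map<string, number> {
  const freq = new Map<string, number>();
  for (const phonemes of words) {
    for (const p of phonemes) {
      freq.set(p, (freq.get(p) ?? 0) + 1);
    }
  }
  return freq;
}
```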
- JSON/CSV import/export
- Backup and restore
- Data validation
- Migration tools
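For the CSV export path, a minimal sketch of serializing word rows (the column set is an assumption based on the word fields shown elsewhere in this README, not the app's actual export format):

```typescript
interface WordRow {
  word: string;
  ipa: string;
  pos: string[];
  is_root: boolean;
}

function toCSV(rows: WordRow[]): string {
  // Quote fields and double embedded quotes, per RFC 4180
  const esc = (s: string) => `"${s.replace(/"/g, '""')}"`;
  const header = "word,ipa,pos,is_root";
  const lines = rows.map(r =>
    [esc(r.word), esc(r.ipa), esc(r.pos.join(";")), String(r.is_root)].join(","));
  return [header, ...lines].join("\n");
}
```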
- Dark/light theme toggle
- Responsive design
- Keyboard shortcuts
- Accessibility compliance
- Node.js: 18.0+ (LTS recommended)
- Package Manager: pnpm 8.10+ (preferred) or npm 9+
- Docker: 20.10+ (for containerized development)
- Git: 2.30+ with SSH key configured
- AWS CLI: 2.0+ configured with appropriate permissions
- CDK: 2.170+ installed globally (`npm install -g aws-cdk`)
- Permissions: IAM user with CDK deployment permissions
- RDS: PostgreSQL database hosting
- Lambda: Serverless function execution
- API Gateway: REST API management
- Cognito: User authentication
- Secrets Manager: Credential storage
- VPC: Network isolation
# Clone the repository
git clone https://github.com/your-org/phaserai.git
cd phaserai
# Install frontend dependencies
pnpm install
# Install infrastructure dependencies
cd infra
npm install
cd ..

# Copy environment template
cp .env.example .env
# Edit .env with your configuration
# Required variables:
# - VITE_API_URL: Your API Gateway URL
# - VITE_COGNITO_USER_POOL_ID: Cognito User Pool ID
# - VITE_COGNITO_CLIENT_ID: Cognito App Client ID
# - VITE_AWS_REGION: AWS region (e.g., us-east-1)
# - VITE_COGNITO_DOMAIN: Cognito hosted UI domain

cd infra
# Bootstrap CDK (first time only)
cdk bootstrap
# Deploy all stacks
npm run deploy
# Note the outputs for your .env file

# Connect to your RDS instance and run:
psql -h your-rds-endpoint -U phaserai_admin -d phaserai_dev
# Execute the schema
\i database-schema.sql

# Start development server
pnpm run dev
# Access at http://localhost:5173

# Install dependencies
pnpm install
# Start development server with hot reload
pnpm run dev
# Run linter
pnpm run lint
# Build for production
pnpm run build
# Preview production build
pnpm run preview

cd infra
# Compile TypeScript
npm run build
# Watch for changes
npm run watch
# Synthesize CloudFormation
npm run synth
# Show differences
npm run diff
# Deploy changes
npm run deploy

# Build development image
docker build -f Dockerfile.dev -t phaserai-dev .
# Run with hot reload
docker run -p 5173:5173 \
-v "$(pwd):/app" \
-v /app/node_modules \
--env-file .env \
--name phaserai-dev-container \
phaserai-dev

# Build production image
docker build \
--build-arg VITE_API_URL="$VITE_API_URL" \
--build-arg VITE_COGNITO_USER_POOL_ID="$VITE_COGNITO_USER_POOL_ID" \
--build-arg VITE_COGNITO_CLIENT_ID="$VITE_COGNITO_CLIENT_ID" \
-t phaserai-prod .
# Run production container
docker run -p 3001:80 \
--name phaserai-prod-container \
phaserai-prod

PhaserAI uses a single, consolidated GitHub Actions workflow (ci-cd.yml) that combines all development activities:
Pipeline Stages:
- Validation & Quality - Code quality, security scans, formatting checks
- Build & Test - Frontend build, Docker images, Lambda layers, infrastructure
- Deploy - ECR push, EC2 deployment, vulnerability scanning
- Notification - Pipeline summary and PR comments
Triggers:
- Pull Requests: Validation and build only
- Main Branch: Full pipeline with deployment
- Schedule: Weekly security scans
- Manual: Custom environment deployment
Key Features:
- Eliminates duplicate activities across workflows
- Intelligent job skipping based on changes
- Parallel execution where possible
- Comprehensive security scanning
- Automated PR feedback
# Run linter
pnpm run lint
# Fix auto-fixable issues
pnpm run lint --fix

# Type check without emitting
npx tsc --noEmit
# Watch mode
npx tsc --noEmit --watch
Purpose: Manages RDS PostgreSQL instance and networking
Resources:
- VPC: 3-tier architecture (public/private/database subnets)
- RDS Instance: PostgreSQL 15.8 with encryption and Multi-AZ
- Security Groups: Database and Lambda access control
- Secrets Manager: Database credentials
- Subnet Groups: Database isolation
Configuration:
// Environment-specific sizing
const instanceType = environment === 'prod'
? ec2.InstanceType.of(ec2.InstanceClass.T3, ec2.InstanceSize.SMALL)
: ec2.InstanceType.of(ec2.InstanceClass.T3, ec2.InstanceSize.MICRO);
// Production features
multiAz: environment === 'prod',
deletionProtection: environment === 'prod',
backupRetention: environment === 'prod' ? cdk.Duration.days(7) : cdk.Duration.days(1),

Purpose: Comprehensive backup and recovery management
Resources:
- AWS Backup Vault: Automated backup scheduling with encryption
- S3 Backup Bucket: Long-term storage with intelligent tiering
- Verification Lambda: Python-based backup integrity checking
- SNS Topic: Automated notifications for backup issues
- CloudWatch Events: Scheduled backup verification
Backup Schedule:
Production:
Daily: 2:00 AM UTC (35-day retention)
Weekly: Sunday 1:00 AM UTC (1-year retention)
Monthly: 1st day 12:00 AM UTC (7-year retention)
Storage Lifecycle:
Standard → IA (30 days) → Glacier (90 days) → Deep Archive (365 days)

Purpose: Automated database schema management
Resources:
- Migration Lambda: Serverless migration runner
- Custom Resource: Automatic migration on deployment
- VPC Integration: Secure database access
- CloudWatch Logs: Migration execution tracking
Features:
- Versioned migration files with UP/DOWN sections
- Automatic execution during CDK deployment
- Rollback capabilities for failed migrations
- Migration status tracking and reporting
Purpose: Serverless API with Lambda functions
Resources:
- API Gateway: REST API with CORS
- Lambda Functions: Users, Languages, Words handlers
- Lambda Layers: PostgreSQL driver (psycopg2)
- IAM Roles: Secrets Manager access
- CloudWatch Logs: Function logging
Endpoints:
GET /health
POST /users
GET /users/{userId}
PUT /users/{userId}
GET /users/{userId}/languages
GET /languages
POST /languages
GET /languages/{languageId}
PUT /languages/{languageId}
DELETE /languages/{languageId}
GET /languages/{languageId}/words
GET /words
POST /words
GET /words/{wordId}
PUT /words/{wordId}
DELETE /words/{wordId}
Purpose: User authentication and authorization
Resources:
- User Pool: Email-based authentication
- User Pool Client: SPA configuration
- User Pool Domain: Hosted UI
- Identity Providers: Google OAuth (optional)
Features:
- Email verification required
- Password policy enforcement
- Account recovery via email
- OAuth integration ready
- Hosted UI for sign-in/sign-up
Purpose: Secure database access for administration
Resources:
- EC2 Instance: t3.micro bastion host
- Security Group: SSH and database access
- Elastic IP: Static IP address
- Key Pair: SSH key management
### Deployment Commands
cd infra
# Deploy all stacks including backup and migration
cdk deploy --all --context notificationEmail=admin@example.com
# Deploy specific stacks
cdk deploy phaserai-prod-database-dev
cdk deploy phaserai-backup-dev
cdk deploy phaserai-migration-dev
# Destroy all stacks (careful!)
cdk destroy --all
# Show deployment differences
cdk diff
# Synthesize CloudFormation templates
cdk synth

The CDK stacks use context variables for configuration:
# Set context variables
cdk deploy --context appName=phaserai \
--context environment=prod \
--context notificationEmail=admin@example.com \
--context googleClientId=your-client-id \
--context googleClientSecret=your-secret

PhaserAI uses a comprehensive database migration system for schema versioning and automated deployments.
infra/migrations/
├── 20250101_120000_initial_schema.sql
├── 20250102_143000_add_etymology_tables.sql
├── 20250103_091500_add_user_preferences.sql
└── README.md
# Check migration status
./scripts/migrate.sh status
# Apply all pending migrations
./scripts/migrate.sh up
# Rollback last migration
./scripts/migrate.sh down
# Production migration with backup
ENVIRONMENT=prod ./scripts/migrate.sh up

Migrations run automatically during CDK deployment via the MigrationStack custom resource.
- Always use transactions: Wrap changes in BEGIN/COMMIT blocks
- Make migrations idempotent: Use IF NOT EXISTS clauses
- Test thoroughly: Test on development before production
- Backup before production: Automatic backups before migration
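The runner's core decision — which timestamped files still need to apply — can be sketched as follows. This is an in-memory stand-in for illustration only; the real MigrationStack Lambda queries the `schema_migrations` table:

```typescript
interface Migration {
  version: string;       // timestamped filename prefix, e.g. "20250101_120000"
  up: () => void;
  down: () => void;
}

function pendingMigrations(all: Migration[], applied: Set<string>): Migration[] {
  // Timestamped versions sort lexicographically in chronological order;
  // skip anything already recorded as applied.
  return [...all]
    .sort((a, b) => a.version.localeCompare(b.version))
    .filter(m => !applied.has(m.version));
}
```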
Production Environment:
Daily Backups:
Time: 2:00 AM UTC
Retention: 35 days
Storage: Standard → IA (30 days)
Weekly Backups:
Time: Sunday 1:00 AM UTC
Retention: 1 year
Storage: IA → Glacier (30 days)
Monthly Backups:
Time: 1st day 12:00 AM UTC
Retention: 7 years
Storage: Glacier → Deep Archive (90 days)
Staging/Development:
Daily Backups:
Time: 3:00/4:00 AM UTC
Retention: 7/3 days
Storage: Standard only

# Daily automated verification (runs at 6:00 AM UTC)
./scripts/backup-verification.sh
# Manual verification with restoration test
./scripts/backup-verification.sh --test-restore
# Production verification with notifications
ENVIRONMENT=prod ./scripts/backup-verification.sh --notification-email admin@example.com

- Automated Scheduling: AWS Backup service with lifecycle management
- Cross-Region Replication: Geographic redundancy for disaster recovery
- Encryption: All backups encrypted at rest and in transit
- Integrity Verification: Daily automated backup validation
- Point-in-Time Recovery: 7-day PITR for production databases
Production:
RTO (Recovery Time Objective): 4 hours
RPO (Recovery Point Objective): 1 hour
Data Loss Tolerance: Maximum 1 hour
Staging:
RTO: 8 hours
RPO: 24 hours
Data Loss Tolerance: Maximum 24 hours

# Test database recovery
./scripts/disaster-recovery-test.sh --test-type database
# Test infrastructure recovery
./scripts/disaster-recovery-test.sh --test-type infrastructure
# Test cross-region failover
./scripts/disaster-recovery-test.sh --test-type cross-region
# Run all DR tests
./scripts/disaster-recovery-test.sh --test-type all

- Database Corruption: Restore from latest automated backup
- Regional Outage: Failover to cross-region backup
- Infrastructure Failure: Redeploy from CDK source control
- Security Breach: Clean backup restoration with credential rotation
| Data Type | Production | Staging | Development |
|---|---|---|---|
| Active user accounts | While active + 3 years | 30 days | 7 days |
| Deleted user accounts | 30 days recovery period | 7 days | 1 day |
| User preferences | Same as user account | 30 days | 7 days |
| Authentication logs | 1 year | 30 days | 7 days |
| Data Type | Production | Staging | Development |
|---|---|---|---|
| Languages & words | While user active + 30 days | 7 days | 3 days |
| Etymology data | While user active + 30 days | 7 days | 3 days |
| Violation logs | 90 days | 30 days | 7 days |
| Performance metrics | 1 year detailed, 3 years aggregated | 30 days | 7 days |
| Data Type | Production | Staging | Development |
|---|---|---|---|
| Application logs | 90 days | 30 days | 7 days |
| Security logs | 1 year | 90 days | 7 days |
| Audit trails | 7 years (compliance) | 90 days | 7 days |
| Backup data | See backup schedule | 7 days | 3 days |
- GDPR Compliance: Right to access, rectification, erasure, and portability
- CCPA Compliance: California consumer privacy rights
- SOX Compliance: 7-year audit trail retention
- Automated Cleanup: Scheduled jobs for expired data removal
- User Rights: Self-service data export and deletion requests
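Automated cleanup boils down to deleting rows older than a retention cutoff. A trivial, illustrative helper (the authoritative day counts live in the retention tables above; the function name is hypothetical):

```typescript
function retentionCutoff(now: Date, retentionDays: number): Date {
  // Anything created before this timestamp is past its retention window
  return new Date(now.getTime() - retentionDays * 24 * 60 * 60 * 1000);
}
```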
# Check retention compliance
cd infra && npm run retention:check
# Generate retention report
cd infra && npm run retention:report
# Manual cleanup (development only)
cd infra && npm run retention:cleanup

CREATE TABLE app_8b514_users (
user_id TEXT PRIMARY KEY, -- Cognito user ID
email TEXT NOT NULL UNIQUE,
username TEXT NOT NULL,
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);

CREATE TABLE app_8b514_languages (
id UUID DEFAULT uuid_generate_v4() PRIMARY KEY,
user_id TEXT REFERENCES app_8b514_users(user_id),
name TEXT NOT NULL,
phonemes JSONB NOT NULL DEFAULT '{"consonants":[],"vowels":[],"diphthongs":[]}',
alphabet_mappings JSONB NOT NULL DEFAULT '{"consonants":{},"vowels":{},"diphthongs":{}}',
syllables TEXT NOT NULL DEFAULT 'CV',
rules TEXT NOT NULL DEFAULT '',
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);

CREATE TABLE app_8b514_words (
id UUID DEFAULT uuid_generate_v4() PRIMARY KEY,
language_id UUID REFERENCES app_8b514_languages(id),
word TEXT NOT NULL,
ipa TEXT NOT NULL,
pos TEXT[] NOT NULL DEFAULT '{}',
is_root BOOLEAN NOT NULL DEFAULT false,
embedding FLOAT[] NULL, -- For AI features
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);

CREATE TABLE app_8b514_translations (
id UUID DEFAULT uuid_generate_v4() PRIMARY KEY,
word_id UUID REFERENCES app_8b514_words(id),
language_code TEXT NOT NULL, -- ISO 639-1 codes
meaning TEXT NOT NULL,
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);

CREATE TABLE app_8b514_word_etymology (
id UUID DEFAULT uuid_generate_v4() PRIMARY KEY,
word_id UUID REFERENCES app_8b514_words(id),
parent_word_id UUID REFERENCES app_8b514_words(id),
derivation_type VARCHAR(50), -- 'compound', 'affix', 'sound_change'
derivation_notes TEXT,
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);

CREATE TABLE app_8b514_phonological_violations (
id UUID DEFAULT uuid_generate_v4() PRIMARY KEY,
word_id UUID REFERENCES app_8b514_words(id),
violation_type VARCHAR(100) NOT NULL,
description TEXT NOT NULL,
severity VARCHAR(20) DEFAULT 'warning',
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);

CREATE TABLE schema_migrations (
version VARCHAR(50) PRIMARY KEY,
applied_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
checksum VARCHAR(64),
execution_time_ms INTEGER,
description TEXT
);

Performance-optimized indexes for common queries:
-- Core table indexes
CREATE INDEX idx_languages_user_id ON app_8b514_languages(user_id);
CREATE INDEX idx_words_language_id ON app_8b514_words(language_id);
CREATE INDEX idx_words_is_root ON app_8b514_words(is_root);
CREATE INDEX idx_translations_word_id ON app_8b514_translations(word_id);
CREATE INDEX idx_translations_language_code ON app_8b514_translations(language_code);
-- Etymology indexes
CREATE INDEX idx_word_etymology_word ON app_8b514_word_etymology(word_id);
CREATE INDEX idx_word_etymology_parent ON app_8b514_word_etymology(parent_word_id);
-- Validation indexes
CREATE INDEX idx_phonological_violations_word ON app_8b514_phonological_violations(word_id);

All API endpoints (except /health) require authentication via AWS Cognito JWT tokens.
Headers:
Authorization: Bearer <cognito-jwt-token>
Content-Type: application/json
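A hypothetical sketch of how each request is shaped — every call except /health carries the Cognito JWT in the Authorization header. The builder name and shape are illustrative, not PhaserAI's actual client code:

```typescript
interface ApiRequest {
  url: string;
  method: string;
  headers: Record<string, string>;
  body?: string;
}

function buildRequest(
  baseUrl: string,
  path: string,
  token: string,
  method: string = "GET",
  body?: unknown,
): ApiRequest {
  return {
    url: `${baseUrl}${path}`,
    method,
    headers: {
      Authorization: `Bearer ${token}`, // Cognito-issued JWT
      "Content-Type": "application/json",
    },
    body: body === undefined ? undefined : JSON.stringify(body),
  };
}
```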
Standard Error Response:
{
"error": "Error message",
"code": "ERROR_CODE",
"details": {}
}

HTTP Status Codes:
- 200: Success
- 201: Created
- 400: Bad Request
- 401: Unauthorized
- 403: Forbidden
- 404: Not Found
- 500: Internal Server Error
GET /health

Response:
{
"status": "healthy",
"timestamp": "2025-01-01T00:00:00Z"
}

Get User:
GET /users/{userId}

Create User:
POST /users
Content-Type: application/json
{
"user_id": "cognito-user-id",
"email": "user@example.com",
"username": "username"
}

List User Languages:
GET /users/{userId}/languages

Get Language:
GET /languages/{languageId}

Create Language:
POST /languages
Content-Type: application/json
{
"user_id": "cognito-user-id",
"name": "My Conlang",
"phonemes": {
"consonants": ["p", "t", "k"],
"vowels": ["a", "i", "u"],
"diphthongs": []
},
"alphabet_mappings": {
"consonants": {"p": "p", "t": "t", "k": "k"},
"vowels": {"a": "a", "i": "i", "u": "u"},
"diphthongs": {}
},
"syllables": "CV",
"rules": "No consonant clusters"
}

List Language Words:
GET /languages/{languageId}/words

Create Word:
POST /words
Content-Type: application/json
{
"language_id": "uuid",
"word": "taku",
"ipa": "taku",
"pos": ["noun"],
"is_root": true,
"translations": [
{
"language_code": "en",
"meaning": "house"
}
]
}

src/
├── components/     # Reusable UI components
│   ├── ui/         # shadcn/ui base components
│   ├── forms/      # Form-specific components
│   ├── charts/     # Data visualization
│   └── layout/     # Layout components
├── pages/          # Route components
├── hooks/          # Custom React hooks
├── lib/            # Utilities and services
├── types/          # TypeScript type definitions
└── styles/         # Global styles
Auth Store (useAuthStore):
interface AuthState {
user: User | null;
isAuthenticated: boolean;
isLoading: boolean;
signIn: (email: string, password: string) => Promise<void>;
signOut: () => Promise<void>;
initializeAuth: () => Promise<void>;
}

Language Store (useLanguageStore):
interface LanguageState {
currentLanguage: Language | null;
languages: Language[];
setCurrentLanguage: (language: Language) => void;
addLanguage: (language: Language) => void;
updateLanguage: (id: string, updates: Partial<Language>) => void;
}

Query Keys:
const queryKeys = {
users: ['users'] as const,
user: (id: string) => ['users', id] as const,
languages: ['languages'] as const,
userLanguages: (userId: string) => ['languages', 'user', userId] as const,
language: (id: string) => ['languages', id] as const,
words: ['words'] as const,
languageWords: (languageId: string) => ['words', 'language', languageId] as const,
word: (id: string) => ['words', id] as const,
};

export function useLanguageData(languageId: string) {
const { data: language, isLoading: languageLoading } = useQuery({
queryKey: queryKeys.language(languageId),
queryFn: () => api.getLanguage(languageId),
});
const { data: words, isLoading: wordsLoading } = useQuery({
queryKey: queryKeys.languageWords(languageId),
queryFn: () => api.getWords(languageId),
enabled: !!languageId,
});
return {
language,
words,
isLoading: languageLoading || wordsLoading,
};
}

export function usePhonemeValidation(language: Language) {
return useCallback((word: string, ipa: string) => {
const violations: Violation[] = [];
// Validate phonemes exist in language
const ipaPhonemes = parseIPA(ipa);
const validPhonemes = [
...language.phonemes.consonants,
...language.phonemes.vowels,
...language.phonemes.diphthongs,
];
ipaPhonemes.forEach(phoneme => {
if (!validPhonemes.includes(phoneme)) {
violations.push({
type: 'invalid_phoneme',
description: `Phoneme '${phoneme}' not defined in language`,
severity: 'error',
});
}
});
return violations;
}, [language]);
}

Using React Hook Form + Zod:
const languageSchema = z.object({
name: z.string().min(1, 'Language name is required'),
phonemes: z.object({
consonants: z.array(z.string()).min(1, 'At least one consonant required'),
vowels: z.array(z.string()).min(1, 'At least one vowel required'),
diphthongs: z.array(z.string()),
}),
syllables: z.string().min(1, 'Syllable structure required'),
rules: z.string(),
});
type LanguageFormData = z.infer<typeof languageSchema>;
export function LanguageForm() {
const form = useForm<LanguageFormData>({
resolver: zodResolver(languageSchema),
defaultValues: {
name: '',
phonemes: { consonants: [], vowels: [], diphthongs: [] },
syllables: 'CV',
rules: '',
},
});
// Form implementation...
}- User Registration/Login: Cognito handles user management
- JWT Token: Issued by Cognito, includes user claims
- API Authorization: Lambda functions validate JWT tokens
- Database Access: Row-level security based on user_id
Nginx Configuration:
# Security headers
add_header X-Frame-Options DENY always;
add_header X-Content-Type-Options nosniff always;
add_header X-XSS-Protection "1; mode=block" always;
add_header Referrer-Policy "strict-origin-when-cross-origin" always;
add_header Content-Security-Policy "default-src 'self'; script-src 'self' 'unsafe-inline'; style-src 'self' 'unsafe-inline';" always;

-- Enable RLS on all user tables
ALTER TABLE app_8b514_languages ENABLE ROW LEVEL SECURITY;
ALTER TABLE app_8b514_words ENABLE ROW LEVEL SECURITY;
ALTER TABLE app_8b514_translations ENABLE ROW LEVEL SECURITY;
-- Example policy for languages table
CREATE POLICY "Users can only access their own languages" ON app_8b514_languages
FOR ALL USING (user_id = current_setting('app.current_user_id'));

- Frontend: Zod schemas for all forms
- Backend: Lambda function input validation
- Database: Constraints and triggers
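The RLS policy above relies on `app.current_user_id` being set for each request. A sketch of how a handler might scope queries per transaction via `set_config` — `QueryFn` is a stand-in for a pg client call, not the project's actual data layer:

```typescript
type QueryFn = (sql: string, params?: unknown[]) => Promise<unknown>;

async function withUser<T>(
  query: QueryFn,
  userId: string,
  fn: () => Promise<T>,
): Promise<T> {
  // is_local = true scopes the setting to the current transaction only
  await query("SELECT set_config('app.current_user_id', $1, true)", [userId]);
  return fn();
}
```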
- Database Credentials: AWS Secrets Manager
- API Keys: Environment variables (not in code)
- OAuth Secrets: Secure parameter store
Implemented Protections:
- ✅ SQL Injection: Parameterized queries
- ✅ XSS: Content Security Policy + input sanitization
- ✅ CSRF: SameSite cookies + CORS configuration
- ✅ Authentication: JWT tokens with expiration
- ✅ Authorization: Role-based access control
- ✅ Data Exposure: Row-level security
- ✅ Logging: Structured logging without sensitive data
- Run all tests (`npm test`)
- Build passes without warnings (`npm run build`)
- Environment variables configured
- Database migrations applied
- SSL certificates ready
- Monitoring configured
cd infra
# Deploy database stack first
cdk deploy phaserai-prod-database-prod
# Deploy API stack
cdk deploy phaserai-prod-api-prod
# Deploy auth stack
cdk deploy phaserai-auth-prod
# Deploy bastion (optional)
cdk deploy phaserai-prod-bastion-prod

# Build production image
docker build \
--build-arg VITE_API_URL="$VITE_API_URL" \
--build-arg VITE_COGNITO_USER_POOL_ID="$VITE_COGNITO_USER_POOL_ID" \
--build-arg VITE_COGNITO_CLIENT_ID="$VITE_COGNITO_CLIENT_ID" \
--build-arg VITE_AWS_REGION="$VITE_AWS_REGION" \
--build-arg VITE_COGNITO_DOMAIN="$VITE_COGNITO_DOMAIN" \
-t phaserai:latest .
# Tag for registry
docker tag phaserai:latest your-registry/phaserai:latest
# Push to registry
docker push your-registry/phaserai:latest

# .env.development
VITE_API_URL=https://dev-api.phaserai.com
VITE_COGNITO_USER_POOL_ID=us-east-1_devpool123
VITE_COGNITO_CLIENT_ID=devclient123

# .env.staging
VITE_API_URL=https://staging-api.phaserai.com
VITE_COGNITO_USER_POOL_ID=us-east-1_stagingpool123
VITE_COGNITO_CLIENT_ID=stagingclient123

# .env.production
VITE_API_URL=https://api.phaserai.com
VITE_COGNITO_USER_POOL_ID=us-east-1_prodpool123
VITE_COGNITO_CLIENT_ID=prodclient123

# .github/workflows/deploy.yml
name: Deploy to Production
on:
push:
branches: [master]
jobs:
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
with:
node-version: '18'
cache: 'pnpm'
- run: pnpm install
- run: pnpm run lint
- run: pnpm run test
- run: pnpm run build
deploy-infrastructure:
needs: test
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: aws-actions/configure-aws-credentials@v2
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- run: cd infra && npm install
- run: cd infra && npm run deploy
deploy-application:
needs: [test, deploy-infrastructure]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: docker/build-push-action@v4
with:
context: .
push: true
tags: your-registry/phaserai:latest
build-args: |
VITE_API_URL=${{ secrets.VITE_API_URL }}
VITE_COGNITO_USER_POOL_ID=${{ secrets.VITE_COGNITO_USER_POOL_ID }}

Application Health:
GET /health

Response:
{
"status": "healthy",
"timestamp": "2025-01-01T00:00:00Z",
"version": "1.0.0",
"database": "connected",
"memory": "45MB",
"uptime": "2h 15m"
}

Docker Health Check:
HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
CMD wget --no-verbose --tries=1 --spider http://localhost/ || exit 1

// Structured logging
const logger = {
info: (message: string, meta?: object) => {
if (import.meta.env.PROD) {
console.log(JSON.stringify({ level: 'info', message, ...meta, timestamp: new Date().toISOString() }));
} else {
console.log(message, meta);
}
},
error: (message: string, error?: Error, meta?: object) => {
const logData = {
level: 'error',
message,
error: error?.message,
stack: error?.stack,
...meta,
timestamp: new Date().toISOString(),
};
console.error(JSON.stringify(logData));
},
};

log_format json_combined escape=json
'{'
'"time_local":"$time_local",'
'"remote_addr":"$remote_addr",'
'"request":"$request",'
'"status":$status,'
'"body_bytes_sent":$body_bytes_sent,'
'"http_referer":"$http_referer",'
'"http_user_agent":"$http_user_agent",'
'"request_time":$request_time'
'}';
access_log /var/log/nginx/access.log json_combined;

# View application logs
docker logs -f phaserai-prod-container
# Monitor resource usage
docker stats phaserai-prod-container
# Check health status
curl http://localhost/health
# View nginx logs
docker exec phaserai-prod-container tail -f /var/log/nginx/access.log

Lambda Function Logs:
- Automatic log group creation
- 1-week retention by default
- Structured JSON logging
RDS Monitoring:
- Performance Insights enabled (production)
- CloudWatch metrics for CPU, memory, connections
- Automated backups with point-in-time recovery
API Gateway Logs:
- Request/response logging
- Error rate monitoring
- Latency tracking
Backup Monitoring:
- Daily backup verification at 6:00 AM UTC
- Automated alerts for backup failures
- Cross-region backup status monitoring
- Storage cost optimization tracking
- Framework: Jest + React Testing Library
- Coverage: Components, hooks, utilities
- Location: src/**/__tests__/
- Framework: Jest + MSW (Mock Service Worker)
- Coverage: API integration, user flows
- Location: src/__tests__/integration/
- Framework: Playwright
- Coverage: Critical user journeys
- Location: e2e/
- Framework: CDK Unit Tests + AWS Testing Library
- Coverage: CDK stack validation, resource configuration
- Location: infra/test/
# Install testing dependencies
pnpm add -D jest @testing-library/react @testing-library/jest-dom
pnpm add -D @testing-library/user-event msw playwright
# Run tests
pnpm test
# Run tests with coverage
pnpm test --coverage
# Run e2e tests
pnpm test:e2e

// src/components/__tests__/LanguageForm.test.tsx
import { render, screen, fireEvent, waitFor } from '@testing-library/react';
import { LanguageForm } from '../LanguageForm';
describe('LanguageForm', () => {
it('validates required fields', async () => {
render(<LanguageForm onSubmit={jest.fn()} />);
fireEvent.click(screen.getByRole('button', { name: /create language/i }));
await waitFor(() => {
expect(screen.getByText('Language name is required')).toBeInTheDocument();
});
});
it('submits valid form data', async () => {
const onSubmit = jest.fn();
render(<LanguageForm onSubmit={onSubmit} />);
fireEvent.change(screen.getByLabelText(/language name/i), {
target: { value: 'Test Language' },
});
fireEvent.click(screen.getByRole('button', { name: /create language/i }));
await waitFor(() => {
expect(onSubmit).toHaveBeenCalledWith({
name: 'Test Language',
// ... other form data
});
});
});
});

// src/lib/__tests__/api-client.test.ts
import { api } from '../api-client';
import { server } from '../mocks/server';
describe('API Client', () => {
beforeAll(() => server.listen());
afterEach(() => server.resetHandlers());
afterAll(() => server.close());
it('creates a new language', async () => {
const languageData = {
name: 'Test Language',
phonemes: { consonants: ['p'], vowels: ['a'], diphthongs: [] },
};
const result = await api.createLanguage(languageData);
expect(result).toMatchObject({
id: expect.any(String),
name: 'Test Language',
});
});
});

// e2e/language-creation.spec.ts
import { test, expect } from '@playwright/test';
test('user can create a new language', async ({ page }) => {
await page.goto('/dashboard');
// Login flow
await page.click('text=Sign In');
await page.fill('[data-testid=email]', 'test@example.com');
await page.fill('[data-testid=password]', 'password123');
await page.click('[data-testid=sign-in-button]');
// Create language
await page.click('text=Create New Language');
await page.fill('[data-testid=language-name]', 'My Test Language');
await page.click('[data-testid=add-consonant]');
await page.fill('[data-testid=consonant-input]', 'p');
await page.click('[data-testid=add-vowel]');
await page.fill('[data-testid=vowel-input]', 'a');
await page.click('[data-testid=create-language-button]');
// Verify creation
await expect(page.locator('text=My Test Language')).toBeVisible();
});

PhaserAI includes comprehensive disaster recovery testing capabilities to ensure business continuity.
Database Recovery Testing:
```bash
# Test database backup restoration
./scripts/disaster-recovery-test.sh --test-type database

# Test with cleanup disabled (for investigation)
./scripts/disaster-recovery-test.sh --test-type database --no-cleanup
```

Infrastructure Recovery Testing:

```bash
# Test complete infrastructure deployment from source
./scripts/disaster-recovery-test.sh --test-type infrastructure

# Test with notifications
./scripts/disaster-recovery-test.sh --test-type infrastructure --notification-email admin@example.com
```

Cross-Region Recovery Testing:

```bash
# Test cross-region failover capabilities
./scripts/disaster-recovery-test.sh --test-type cross-region
```

Security Breach Recovery Testing:

```bash
# Test security incident recovery procedures
./scripts/disaster-recovery-test.sh --test-type security
```

Comprehensive DR Testing:

```bash
# Run all disaster recovery tests
./scripts/disaster-recovery-test.sh --test-type all --notification-email team@example.com
```

Monthly Tests:
- Database recovery validation
- Backup integrity verification
- Infrastructure deployment testing
Quarterly Tests:
- Cross-region failover testing
- Complete infrastructure recovery
- Security breach simulation
Annual Tests:
- Full disaster simulation
- Regional outage scenario
- Complete data center loss simulation
Database Recovery:
- Assess Impact: Determine scope of database issues
- Stop Traffic: Prevent further data corruption
- Identify Backup: Select appropriate recovery point
- Restore Database: Use AWS Backup or point-in-time recovery
- Validate Data: Run integrity checks
- Resume Operations: Gradually restore traffic
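The restore step above can be sketched with the AWS CLI; the instance identifier and restore time below are hypothetical placeholders — substitute your own:

```shell
# Point-in-time restore of the production instance (identifiers are hypothetical).
SOURCE_DB="phaserai-prod"
TARGET_DB="${SOURCE_DB}-restored"

aws rds restore-db-instance-to-point-in-time \
  --source-db-instance-identifier "$SOURCE_DB" \
  --target-db-instance-identifier "$TARGET_DB" \
  --restore-time "2025-01-15T03:00:00Z"

# Block until the restored instance is reachable, then run integrity checks.
aws rds wait db-instance-available --db-instance-identifier "$TARGET_DB"
```

The restored instance comes up under a new endpoint; point validation queries at it before cutting traffic over.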
Infrastructure Recovery:
- Source Control: Ensure latest CDK code is available
- Deploy Infrastructure: Use CDK to recreate resources
- Restore Configuration: Apply environment-specific settings
- Test Connectivity: Verify all services are operational
- Update DNS: Point traffic to new infrastructure
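A minimal sketch of the redeploy step, assuming the CDK app lives in `infra/` as described elsewhere in this README (the flags shown are illustrative, not a fixed procedure):

```shell
# Recreate all stacks from source, non-interactively (run from the repo root).
APP_DIR="infra"

cd "$APP_DIR" && npm install

# Preview what will be created, then deploy every stack.
npx cdk diff
npx cdk deploy --all --require-approval never
```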
Security Breach Recovery:
- Isolate Systems: Prevent further compromise
- Rotate Credentials: Update all passwords and keys
- Clean Restore: Restore from known-good backup
- Security Patches: Apply latest security updates
- Monitor: Enhanced monitoring for suspicious activity
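The credential-rotation step might look like the following; the secret id, user pool id, and username are hypothetical placeholders:

```shell
# Rotate the database credential held in Secrets Manager (id is hypothetical).
SECRET_ID="phaserai/prod/db-credentials"
aws secretsmanager rotate-secret --secret-id "$SECRET_ID"

# Force a compromised account to reauthenticate on every device.
aws cognito-idp admin-user-global-sign-out \
  --user-pool-id "us-east-1_EXAMPLE" \
  --username "compromised-user"
```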
- Detailed Procedures: See docs/DISASTER_RECOVERY_PLAN.md
- Migration Guide: See docs/MIGRATION_GUIDE.md
- Data Retention Policy: See docs/DATA_RETENTION_POLICY.md
1. Fork & Clone

   ```bash
   git clone https://github.com/your-username/phaserai.git
   cd phaserai
   ```

2. Create Feature Branch

   ```bash
   git checkout -b feature/your-feature-name
   ```

3. Development Setup

   ```bash
   pnpm install
   cd infra && npm install && cd ..
   cp .env.example .env
   # Configure your .env file
   ```

4. Make Changes

   - Follow TypeScript best practices
   - Add tests for new functionality
   - Update documentation as needed
   - Follow existing code style

5. Test Your Changes

   ```bash
   pnpm run lint
   pnpm run test
   pnpm run build

   # Test infrastructure changes
   cd infra && npm run test

   # Test migrations
   ./scripts/migrate.sh status
   ```

6. Commit & Push

   ```bash
   git add .
   git commit -m "feat: add new feature description"
   git push origin feature/your-feature-name
   ```

7. Create Pull Request

   - Provide clear description
   - Link related issues
   - Include screenshots for UI changes
- Use strict mode
- Prefer interfaces over types for object shapes
- Use proper generic constraints
- Document complex types
```typescript
// Good
interface LanguageFormProps {
  language?: Language;
  onSubmit: (data: LanguageFormData) => Promise<void>;
  isLoading?: boolean;
}

// Avoid
type LanguageFormProps = {
  language: Language | undefined;
  onSubmit: Function;
  isLoading: boolean | undefined;
};
```

- Use functional components with hooks
- Prefer composition over inheritance
- Extract custom hooks for reusable logic
- Use proper prop types
```tsx
// Good
export function LanguageForm({ language, onSubmit, isLoading = false }: LanguageFormProps) {
  const { register, handleSubmit, formState: { errors } } = useForm<LanguageFormData>();

  return (
    <form onSubmit={handleSubmit(onSubmit)}>
      {/* Form content */}
    </form>
  );
}

// Avoid
export const LanguageForm: React.FC<any> = (props) => {
  // Component implementation
};
```

- Use Tailwind CSS classes
- Follow mobile-first responsive design
- Use CSS custom properties for theming
- Maintain consistent spacing scale
```tsx
// Good
<div className="flex flex-col gap-4 p-6 bg-white rounded-lg shadow-sm">
  <h2 className="text-xl font-semibold text-gray-900">Title</h2>
  <p className="text-gray-600">Description</p>
</div>

// Avoid inline styles
<div style={{ display: 'flex', padding: '24px' }}>
```

Use Conventional Commits:
- `feat:` New features
- `fix:` Bug fixes
- `docs:` Documentation changes
- `style:` Code style changes (formatting, etc.)
- `refactor:` Code refactoring
- `test:` Adding or updating tests
- `chore:` Maintenance tasks
Examples:
feat: add phoneme validation to word creation form
fix: resolve IPA chart rendering issue on mobile
docs: update API documentation for languages endpoint
refactor: extract common form validation logic
test: add unit tests for etymology tracking
chore: update dependencies to latest versions
Use conventional commit format:
feat: add real-time phonological validation
## Description
Brief description of changes
## Type of Change
- [ ] Bug fix
- [ ] New feature
- [ ] Breaking change
- [ ] Documentation update
## Testing
- [ ] Unit tests pass
- [ ] Integration tests pass
- [ ] Manual testing completed
## Screenshots (if applicable)
[Add screenshots for UI changes]
## Checklist
- [ ] Code follows project style guidelines
- [ ] Self-review completed
- [ ] Documentation updated
- [ ] Tests added/updated

**Describe the bug**
A clear description of the bug
**To Reproduce**
Steps to reproduce the behavior
**Expected behavior**
What you expected to happen
**Screenshots**
If applicable, add screenshots
**Environment:**
- OS: [e.g. macOS, Windows, Linux]
- Browser: [e.g. Chrome, Firefox, Safari]
- Version: [e.g. 1.0.0]

**Is your feature request related to a problem?**
A clear description of the problem
**Describe the solution you'd like**
A clear description of what you want to happen
**Describe alternatives you've considered**
Alternative solutions or features considered
**Additional context**
Any other context or screenshots

Problem: Environment variables are undefined in the application
Solutions:
1. Check `.env` file format (no quotes around values):

   ```
   # Correct
   VITE_API_URL=https://api.example.com

   # Wrong
   VITE_API_URL="https://api.example.com"
   ```

2. Ensure variables start with the `VITE_` prefix
3. Restart the development server after changes
4. Check build-time vs runtime variables
Problem: Changes not reflected in development container
Solutions:
1. Verify volume mounting:

   ```bash
   docker run -v "$(pwd):/app" -v /app/node_modules ...
   ```

2. Check file permissions (especially on Windows/WSL)
3. Ensure `.dockerignore` doesn't exclude source files
4. Try rebuilding the development image
Problem: Cannot connect to RDS database
Solutions:
- Check security group rules
- Verify VPC configuration
- Confirm database endpoint and credentials
- Test connection from bastion host:
psql -h your-rds-endpoint -U phaserai_admin -d phaserai_dev
Problem: Database migrations failing or stuck
Solutions:
1. Check migration status:

   ```bash
   ./scripts/migrate.sh status
   ```

2. Verify database connectivity:

   ```bash
   psql -h your-db -U user -d database -c "SELECT 1;"
   ```

3. Check migration file syntax (psql has no dry-run mode; apply the file to a disposable copy of the database inside a single transaction instead):

   ```bash
   psql -h your-db -U user -d database_copy --single-transaction --set ON_ERROR_STOP=1 -f migration.sql
   ```

4. Manual migration rollback:

   ```bash
   ./scripts/migrate.sh down
   ```

5. Reset migration state (development only):

   ```sql
   DELETE FROM schema_migrations WHERE version = 'problematic_version';
   ```
Problem: Backup verification script reporting issues
Solutions:
1. Check backup vault access:

   ```bash
   aws backup list-recovery-points-by-backup-vault --backup-vault-name your-vault
   ```

2. Verify backup age:

   ```bash
   ./scripts/backup-verification.sh --environment prod
   ```

3. Test backup restoration:

   ```bash
   ./scripts/backup-verification.sh --test-restore
   ```

4. Check the S3 backup bucket:

   ```bash
   aws s3 ls s3://your-backup-bucket/
   ```

5. Review CloudWatch logs for the backup Lambda function
Problem: DR tests failing or timing out
Solutions:
1. Check AWS permissions:

   ```bash
   aws sts get-caller-identity
   aws backup list-backup-vaults
   ```

2. Verify test environment resources:

   ```bash
   ./scripts/disaster-recovery-test.sh --test-type database --no-cleanup
   ```

3. Check restoration timeout settings:

   ```bash
   # Increase timeout for large databases
   export DR_TEST_TIMEOUT=3600
   ```

4. Review test logs:

   ```bash
   tail -f /tmp/dr-test-report-*.json
   ```
Problem: Cognito authentication failing
Solutions:
- Verify Cognito configuration in `.env`
- Check callback URLs in the Cognito console
- Ensure user pool client settings are correct
- Clear browser cache and cookies
- Check CloudWatch logs for detailed errors
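The app-client settings can be inspected from the terminal; the pool id and client id below are hypothetical placeholders:

```shell
USER_POOL_ID="us-east-1_EXAMPLE"   # hypothetical
CLIENT_ID="example-client-id"      # hypothetical

# Confirm callback URLs and allowed OAuth flows match the values in .env.
aws cognito-idp describe-user-pool-client \
  --user-pool-id "$USER_POOL_ID" \
  --client-id "$CLIENT_ID" \
  --query 'UserPoolClient.{Callbacks:CallbackURLs,Flows:AllowedOAuthFlows}'
```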
Problem: Production build fails
Solutions:
1. Check TypeScript errors:

   ```bash
   npx tsc --noEmit
   ```

2. Verify all dependencies are installed:

   ```bash
   pnpm install
   ```

3. Clear the build cache:

   ```bash
   rm -rf dist node_modules/.vite
   pnpm install
   ```

4. Check for missing environment variables in the build
Problem: API requests returning 500 errors
Solutions:
- Check Lambda function logs in CloudWatch
- Verify database connection from Lambda
- Check IAM permissions for Secrets Manager
- Test Lambda functions individually
- Verify API Gateway integration configuration
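The first two checks can be done from the terminal; `aws logs tail` requires AWS CLI v2, and the function name and payload here are hypothetical:

```shell
FUNCTION_NAME="phaserai-api-handler"   # hypothetical name

# Surface recent errors from the function's log group.
aws logs tail "/aws/lambda/$FUNCTION_NAME" --since 1h

# Invoke the function directly, bypassing API Gateway, to isolate the failure.
aws lambda invoke \
  --function-name "$FUNCTION_NAME" \
  --payload '{"httpMethod":"GET","path":"/health"}' \
  --cli-binary-format raw-in-base64-out \
  /tmp/lambda-response.json
```

If the direct invocation succeeds but the endpoint still returns 500, the problem is likely in the API Gateway integration rather than the function itself.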
```bash
# Check Docker container logs
docker logs -f container-name

# Inspect running container
docker exec -it container-name /bin/sh

# Check network connectivity
docker exec container-name ping database-endpoint

# View environment variables
docker exec container-name env

# Check file permissions
docker exec container-name ls -la /app

# Test database connection
docker exec container-name psql -h endpoint -U user -d database
```

```bash
# Check CDK diff
cd infra && cdk diff

# Validate CloudFormation template
cd infra && cdk synth

# Check AWS credentials
aws sts get-caller-identity
```

Diagnostics:
- Check CloudWatch metrics for Lambda duration
- Monitor RDS performance insights
- Review database query performance
- Check network latency
Solutions:
- Add database indexes for common queries
- Implement caching (Redis/ElastiCache)
- Optimize Lambda function code
- Use connection pooling
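As an example of the first item, a hypothetical index for lexicon lookups filtered by language — the host, table, and column names are assumptions, so adapt them to the actual schema:

```shell
DB_HOST="your-rds-endpoint"   # hypothetical
DB_NAME="phaserai_dev"

# CONCURRENTLY builds the index without blocking writes on a live table.
psql -h "$DB_HOST" -U phaserai_admin -d "$DB_NAME" \
  -c "CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_words_language_id ON words (language_id);"
```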
Diagnostics:
```bash
# Analyze bundle
pnpm run build
npx vite-bundle-analyzer dist
```

Solutions:
- Implement code splitting
- Remove unused dependencies
- Use dynamic imports for large components
- Optimize images and assets
1. Check Documentation: Review this README and linked guides
2. Search Issues: Look for similar problems in GitHub issues
3. Create Issue: Use issue templates for bug reports or feature requests
4. Run Diagnostics: Use built-in diagnostic tools

   ```bash
   # Health check
   curl http://localhost/health

   # Backup verification
   ./scripts/backup-verification.sh

   # DR test
   ./scripts/disaster-recovery-test.sh --test-type database

   # Migration status
   ./scripts/migrate.sh status
   ```

5. Community: Join our Discord/Slack for real-time help
6. Contact: Reach out to maintainers for urgent issues
- Migration Guide: Complete database migration procedures
- Disaster Recovery Plan: Comprehensive DR procedures and testing
- Data Retention Policy: Data lifecycle and compliance requirements
- `./scripts/migrate.sh`: Database migration management
- `./scripts/backup-verification.sh`: Backup integrity verification
- `./scripts/disaster-recovery-test.sh`: DR testing and validation
- BackupStack: Automated backup and verification system
- MigrationStack: Database schema management and versioning
- ProductionDatabaseStack: RDS instance with Multi-AZ and encryption
- ProductionApiStack: Serverless API with Lambda functions
- CognitoAuthStack: User authentication and authorization
Β© 2025 PhaserAI - All rights reserved
This project is proprietary software. Unauthorized copying, modification, distribution, or use of this software is strictly prohibited without explicit written permission from the copyright holders.
For commercial licensing inquiries, please contact: licensing@phaserai.com
By contributing to this project, you agree that your contributions will be licensed under the same terms as the project.
Built with ❤️ for constructed language creators worldwide
PhaserAI empowers linguists, writers, and world-builders to create rich, phonologically-consistent constructed languages with enterprise-grade reliability, comprehensive backup systems, and automated disaster recovery capabilities.