Timeline: 2 months
Role: Frontend Engineer → Full Stack Implementation
Status: Live in Production
Built a modern dating platform featuring AI-powered conversational onboarding, replacing traditional forms with an intelligent chat assistant. The system processes user conversations, generates structured data, and automates application processing with PDF generation and email notifications.
User visits site → Next.js renders pages from Payload CMS
User starts chat → Together.ai LLM asks questions in JSON format
User uploads selfie → File stored in AWS S3
Chat completes (~100 messages) → Background job queued in Payload
Job executes → LLM extracts key-value pairs from conversation
Key-value pairs generated → LLM converts to structured JSON dataset
JSON dataset processed → Application record created in Payload CMS
Application saved → AWS SES sends emails to user and admins
Admin views application → Can download formatted PDF via custom button
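The chat step above relies on the LLM answering "in JSON format". A minimal sketch of how such a turn could be parsed defensively is shown below; the field names (`question`, `field`, `done`) are illustrative assumptions, not the production schema.

```typescript
// Hypothetical shape of one assistant turn in the onboarding chat.
// Real field names may differ; these are assumptions for illustration.
interface ChatQuestion {
  question: string; // text shown to the user
  field: string;    // key the answer will be stored under
  done: boolean;    // true once the interview is complete
}

// LLMs sometimes wrap JSON in prose or code fences, so parse defensively:
// grab the first {...} block and validate the fields before trusting it.
function parseChatQuestion(raw: string): ChatQuestion | null {
  const match = raw.match(/\{[\s\S]*\}/);
  if (!match) return null;
  try {
    const obj = JSON.parse(match[0]);
    if (typeof obj.question === "string" && typeof obj.field === "string") {
      return { question: obj.question, field: obj.field, done: Boolean(obj.done) };
    }
  } catch {
    // invalid JSON: caller can re-prompt the model
  }
  return null;
}
```

If parsing fails, the chat loop can simply re-ask the model rather than crash the session.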
Problem: Payload CMS stores files locally by default. Files uploaded in local development didn’t appear in production.
Solution: Integrated AWS S3 with IAM user configuration. All file uploads now route to S3, ensuring consistency across environments.
Learning: Spent time understanding AWS IAM permissions and S3 bucket policies.
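A sketch of the S3 wiring, assuming Payload 3's `@payloadcms/storage-s3` plugin (older Payload versions used `@payloadcms/plugin-cloud-storage` with an S3 adapter instead, so check your version). Collection and environment variable names are assumptions.

```typescript
// payload.config.ts (sketch only, not the production config)
import { buildConfig } from "payload";
import { s3Storage } from "@payloadcms/storage-s3";

export default buildConfig({
  // ...collections, db adapter, etc.
  plugins: [
    s3Storage({
      collections: { media: true }, // route the media collection's uploads to S3
      bucket: process.env.S3_BUCKET!,
      config: {
        region: process.env.AWS_REGION,
        credentials: {
          // IAM user scoped to this bucket (s3:PutObject / GetObject / DeleteObject)
          accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
          secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
        },
      },
    }),
  ],
});
```

With this in place, local development and production both read and write the same bucket, which is what resolves the "files missing in production" problem.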
Problem: Initial YAML output format failed consistently. LLMs couldn’t generate valid YAML despite explicit instructions and formatting requirements.
Solution: Switched to JSON output. This reduced failures, but we still saw a ~70% failure rate when asking the LLM to both extract information and format it in a single prompt.
Final Solution: Implemented a two-step pipeline: first have the LLM extract key-value pairs from the conversation, then have a second call convert those pairs into the structured JSON dataset.
Result: Failure rate dropped to near-zero. LLMs perform better with single-task instructions.
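The two-step pipeline can be sketched as below. The `llm` function stands in for a Together.ai chat call, and the prompts and types are illustrative assumptions, not the production code; the point is that each call carries exactly one task.

```typescript
type LLM = (prompt: string) => Promise<string>;

// Step 1: extraction only. The model returns plain "key: value" lines, no JSON yet.
async function extractPairs(llm: LLM, transcript: string): Promise<string> {
  return llm(`Extract the applicant's details from this conversation as plain key-value lines:\n\n${transcript}`);
}

// Step 2: formatting only. The model turns those lines into a flat JSON object.
async function formatAsJson(llm: LLM, pairs: string): Promise<Record<string, string>> {
  const raw = await llm(`Convert these key-value lines to a flat JSON object. Output only JSON:\n\n${pairs}`);
  // strip accidental code fences before parsing
  return JSON.parse(raw.replace(/^```(json)?|```$/gm, "").trim());
}

async function runPipeline(llm: LLM, transcript: string): Promise<Record<string, string>> {
  const pairs = await extractPairs(llm, transcript); // single task: extract
  return formatAsJson(llm, pairs);                   // single task: format
}
```

Splitting the tasks means each prompt's failure mode is independently recoverable: a bad extraction can be retried without re-running the formatting step, and vice versa.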
Problem: Processing ~100 messages through multiple LLM calls takes 30-60 seconds, and the user interface can't be blocked while that runs.
Solution: Implemented background job system using Payload’s built-in cron daemon. User sees thank you page immediately while processing happens asynchronously.
Learning: Spent 8 hours learning Node.js async patterns, daemon processes, and job queue management.
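A minimal in-memory sketch of the queue-then-respond pattern described above. In production this role is played by Payload's built-in jobs/cron system; this only illustrates the shape of "respond immediately, process on the next tick".

```typescript
type Job = { id: number; run: () => Promise<void> };

const queue: Job[] = [];
let nextId = 1;

// Called from the request handler: enqueue and return immediately,
// so the thank-you page is sent without waiting on any LLM call.
function enqueue(run: () => Promise<void>): number {
  const id = nextId++;
  queue.push({ id, run });
  return id;
}

// Called by a cron tick: drain whatever has been queued since the last tick.
async function tick(): Promise<number> {
  let processed = 0;
  while (queue.length > 0) {
    const job = queue.shift()!;
    await job.run(); // e.g. run the two-step LLM extraction here
    processed++;
  }
  return processed;
}
```

Unlike this sketch, a real job system also needs persistence and retry semantics, which is exactly what Payload's queue provides out of the box.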
Problem: Railway's Hobby plan blocks outbound SMTP, and managed email services (Resend, Mailgun, SendGrid) cost $15-20/month.
Solutions Evaluated: managed providers (Resend, Mailgun, SendGrid) versus going lower-level with a raw cloud email API.
Final Solution: AWS SES via its API with Nodemailer, sidestepping the SMTP block and costing pennies with the same deliverability.
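The notification step can be sketched as a pure content builder that is then handed to Nodemailer's SES transport. The actual send call is left as a comment because it needs AWS credentials; the `Application` field names are illustrative assumptions.

```typescript
// Hypothetical application record shape (assumption for illustration).
interface Application {
  name: string;
  email: string;
  [key: string]: string;
}

// Build the email content from the application JSON.
function buildNotification(app: Application) {
  const body = Object.entries(app)
    .map(([key, value]) => `${key}: ${value}`)
    .join("\n");
  return {
    to: app.email,
    subject: `New application from ${app.name}`,
    text: `A new application was submitted:\n\n${body}`,
  };
}

// In production, roughly (Nodemailer's SES transport, AWS SDK v3):
//   const transporter = nodemailer.createTransport({ SES: { ses, aws } });
//   await transporter.sendMail(buildNotification(app));
```

Keeping the content builder pure makes it trivial to unit-test without touching AWS at all.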
Problem: Client needed formatted PDF reports from structured application data.
Solution: Created custom Payload CMS element that appears before save actions. Clicking “Download PDF” generates human-friendly formatted document from application JSON.
Implementation: 2 hours of formatting and layout refinement.
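A library-agnostic sketch of the formatting half of the "Download PDF" step: flatten the application JSON into human-friendly labeled lines, which a PDF library then lays out. The rendering code itself is omitted, and the label logic is an assumption, not the production implementation.

```typescript
// Turn an application record into labeled lines for PDF layout.
function toPdfLines(app: Record<string, unknown>): string[] {
  return Object.entries(app).map(([key, value]) => {
    // "firstName" -> "First Name" for human-friendly labels
    const label = key
      .replace(/([a-z])([A-Z])/g, "$1 $2")
      .replace(/^./, (c) => c.toUpperCase());
    return `${label}: ${String(value)}`;
  });
}
```

Each line then becomes a text row in the generated document, so most of the "2 hours of formatting" is tuning spacing and page breaks rather than data handling.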
Single-step extraction and formatting failed 70% of the time. LLMs handle single-task instructions more reliably than multi-task prompts.
Despite clear instructions and critical formatting tags, LLMs consistently produced invalid YAML. JSON proved significantly more reliable.
Cost optimization. Managed services cost $15-20/month for features we didn't need. AWS SES costs pennies with the same deliverability.
Processing 100+ messages with multiple LLM calls takes 30-60 seconds. Background processing maintains smooth UX while handling heavy compute.
Payload’s local file storage breaks in multi-environment deployments. S3 provides consistent file access across development and production.
✅ Delivered on time within 2-month timeline
✅ Successfully deployed full-stack application with AI integration
✅ Implemented cost-effective infrastructure (avoided $20/month email costs)
✅ Built reliable LLM pipeline with near-zero failure rate
✅ Automated application processing with background jobs
✅ Created admin-friendly PDF export system
Backend Development
Cloud Infrastructure
AI Integration
DevOps
Frontend: Next.js + React + Tailwind + Shadcn
CMS: Payload CMS
Database: PostgreSQL (Railway)
Storage: AWS S3
Email: AWS SES + Nodemailer
AI: Together.ai GPT OSS 120B
Hosting: Railway
Domain/DNS: One.com
This project pushed me outside my frontend comfort zone into full-stack territory. I navigated AWS services, implemented AI pipelines, built background job systems, and delivered a production-ready dating platform with conversational onboarding.
The most valuable lesson: complex problems often need simple solutions. When the LLM failed at multi-tasking, breaking it into sequential steps solved the issue. When email services were expensive, going lower-level with AWS SES cut costs by 95%.
Frontend engineers can absolutely ship full-stack projects. It just requires willingness to learn, systematic problem-solving, and not being afraid to dive into unfamiliar territory.
Send me an email or message me on LinkedIn if you're looking for someone who builds without BS.