A working record of the AI tools and automations I've built — the problem each one solves, how it works, and how I think about scaling it beyond the person or team who first used it.
Kitchen teams were wasting hours ideating — sifting through books, memory, and reports. I built a culinary ideation interface powered by the Claude API and Gemini image generation. Chefs prompt with an ingredient, season, or technique and get back flavour pairings, plating ideas, recipe directions, and generated visuals. Outputs save to a recipe bank, capturing institutional knowledge that previously walked out the door with staff turnover.
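The core of the interface is just structured prompt assembly plus an append-only recipe bank. A minimal sketch of those two pieces (function names, prompt wording, and the JSONL file format are illustrative, not the production code):

```python
import json

def build_ideation_prompt(ingredient=None, season=None, technique=None):
    """Assemble a structured culinary ideation prompt to send to the Claude API."""
    parts = [p for p in (
        f"Ingredient: {ingredient}" if ingredient else None,
        f"Season: {season}" if season else None,
        f"Technique: {technique}" if technique else None,
    ) if p]
    return (
        "You are a culinary ideation assistant. Given:\n"
        + "\n".join(parts)
        + "\nReturn flavour pairings, plating ideas, and recipe directions as JSON."
    )

def save_to_recipe_bank(record: dict, bank_path="recipe_bank.jsonl"):
    """Append one ideation result to the recipe bank so it outlives staff turnover."""
    with open(bank_path, "a") as f:
        f.write(json.dumps(record) + "\n")
```

In the real interface the prompt goes to the Claude API and the response (plus the Gemini-generated visual) is what gets banked; the append-only file is the simplest shape that makes the knowledge searchable later.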
Digitising vinyl was a multi-step pain: record a full side, manually split into tracks, find metadata, tag each file, source cover art, export — every step a different tool. I built a macOS app that handles the entire pipeline end-to-end. It automatically detects silence to split tracks, queries the Discogs API for metadata and cover art, tags each file, and exports in your format of choice. Started as a personal terminal script, now has 30 GitHub stars and genuine community traction on r/vinyl.
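The track-splitting step boils down to finding long runs of near-silence in the recorded side. A rough sketch of that detection pass (thresholds, parameter names, and the amplitude model are illustrative; the app works on real audio buffers, not Python lists):

```python
def detect_track_boundaries(samples, rate, threshold=0.01, min_silence=2.0):
    """Scan amplitude samples and return the indices where a silence gap of at
    least min_silence seconds ends — i.e. candidate track start points."""
    min_run = int(min_silence * rate)   # silence gap length in samples
    boundaries, run_start = [], None
    for i, s in enumerate(samples):
        if abs(s) < threshold:          # quiet enough to count as silence
            if run_start is None:
                run_start = i           # silence run begins
        else:
            if run_start is not None and i - run_start >= min_run:
                boundaries.append(i)    # long gap just ended: new track starts here
            run_start = None
    return boundaries                   # trailing run-out silence is ignored
```

Each boundary becomes a split point; the resulting files are then matched against the Discogs release for titles, track order, and cover art.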
GTM implementations are slow, error-prone, and require constant context-switching. I built an MCP server setup that connects Claude directly to GTM and GA4. Tags are deployed by prompting Claude in plain English: scroll-depth events, conversion clicks, custom dimensions. GA4 can be queried conversationally — "what pages are losing people on mobile?" returns an actual answer, not a dashboard to interpret. Implementation time is down from 1–2 days to 1 hour per client.
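Mechanically, an MCP server is a set of named tools the model can call with structured arguments. A simplified sketch of that tool-dispatch pattern (the registry, tool name, and payload shape are illustrative; the real server uses the MCP SDK and calls the actual GTM API):

```python
TOOLS = {}

def tool(name):
    """Register a function as a callable tool, MCP-style."""
    def wrap(fn):
        TOOLS[name] = fn
        return fn
    return wrap

@tool("create_scroll_depth_tag")
def create_scroll_depth_tag(container_id: str, thresholds: list) -> dict:
    # The production tool calls the GTM API here; this stub returns the payload
    # it would send, which is enough to show the shape of the interaction.
    return {"container": container_id, "trigger": "scroll", "thresholds": thresholds}

def dispatch(call: dict) -> dict:
    """Route a model tool call {'name': ..., 'arguments': {...}} to its handler."""
    return TOOLS[call["name"]](**call["arguments"])
```

When a chef-facing prompt like "track 25/50/75% scroll on the blog" comes in, Claude picks the tool and fills the arguments; the server only has to validate and forward them.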
Running multiple AI projects simultaneously means costs blow out fast if every task hits the most expensive model. I built a Claude Code wrapper that routes tasks to Opus, Sonnet, or Haiku based on complexity — with SQLite cost logging per project. Integrated as an MCP server so routing happens automatically across all active projects. Weekly cost summaries by project make AI spend visible and manageable.
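The routing logic is the interesting part: classify the task, pick the cheapest adequate model, log the usage per project. A stripped-down sketch (the heuristic, model names, and schema are illustrative; the real wrapper uses richer complexity signals and token-based cost accounting):

```python
import sqlite3

# Illustrative tier-to-model mapping, not exact model IDs.
MODELS = {"simple": "claude-haiku", "moderate": "claude-sonnet", "complex": "claude-opus"}

def classify(task: str) -> str:
    """Crude complexity heuristic standing in for the wrapper's real classifier."""
    if len(task) > 500 or "architecture" in task.lower():
        return "complex"
    if len(task) > 120:
        return "moderate"
    return "simple"

def route(task: str, project: str, db="costs.db") -> str:
    """Pick a model tier for the task and log the usage against the project."""
    model = MODELS[classify(task)]
    con = sqlite3.connect(db)
    con.execute("CREATE TABLE IF NOT EXISTS usage (project TEXT, model TEXT, chars INTEGER)")
    con.execute("INSERT INTO usage VALUES (?, ?, ?)", (project, model, len(task)))
    con.commit()
    con.close()
    return model
```

The weekly summary is then a single `GROUP BY project, model` query over the `usage` table, which is what makes spend visible without any extra tooling.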
Students learning AI in a marketing context need to see it applied to real decisions — not just "AI can write copy." I built an interactive tool that takes campaign parameters (channel mix, budget, audience targeting, creative approach) and returns AI-generated performance predictions with reasoning. Used live in webinars across multiple cohorts, built to run reliably with 50–100 students at a time.
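The tool's input side is a validated parameter set that gets turned into a reasoning prompt. A minimal sketch of that shape (field names, validation rules, and prompt wording are illustrative):

```python
from dataclasses import dataclass

@dataclass
class Campaign:
    channel_mix: dict   # e.g. {"search": 0.6, "social": 0.4}, must sum to 1.0
    budget: float
    audience: str
    creative: str

    def validate(self):
        total = sum(self.channel_mix.values())
        if abs(total - 1.0) > 1e-6:
            raise ValueError(f"channel mix must sum to 1.0, got {total}")
        if self.budget <= 0:
            raise ValueError("budget must be positive")

def prediction_prompt(c: Campaign) -> str:
    """Build the prompt asking the model for a prediction *with* its reasoning."""
    c.validate()
    return (
        "Predict this campaign's performance and explain your reasoning step by step.\n"
        f"Channels: {c.channel_mix}\nBudget: ${c.budget:,.0f}\n"
        f"Audience: {c.audience}\nCreative: {c.creative}"
    )
```

Validating before the API call is also what keeps it stable in a live webinar: a malformed student input fails fast with a readable error instead of burning a model call.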
A small social enterprise needed to produce consistent content at higher volume — case studies, program comms, social posts — without the headcount to match. I built an AI-assisted content production workflow with a structured human review stage. Claude handles first-draft generation using brand voice guidelines; a team member reviews and approves before anything goes out. The workflow is documented in Asana with clear handoff points so nothing slips.
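The review gate is essentially a small state machine: content can only move along allowed transitions, so nothing reaches "published" without a human approval step. A sketch of that idea (stage names and transitions are illustrative; the real handoffs live in Asana):

```python
from enum import Enum

class Stage(Enum):
    DRAFT = "draft"            # Claude first draft from brand voice guidelines
    IN_REVIEW = "in_review"    # human team member reviews
    APPROVED = "approved"      # cleared by a person
    PUBLISHED = "published"

ALLOWED = {
    Stage.DRAFT: {Stage.IN_REVIEW},
    Stage.IN_REVIEW: {Stage.APPROVED, Stage.DRAFT},  # reviewer can send back
    Stage.APPROVED: {Stage.PUBLISHED},
    Stage.PUBLISHED: set(),
}

def advance(current: Stage, to: Stage) -> Stage:
    """Enforce the handoff: there is no path from draft to published
    that skips human review."""
    if to not in ALLOWED[current]:
        raise ValueError(f"illegal transition {current.value} -> {to.value}")
    return to
```

Encoding the workflow this way is what makes "nothing slips" checkable rather than aspirational: an out-of-order handoff is an error, not an oversight.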