Parker Rex Daily · May 28, 2025

LLM Experience is the New Developer Experience (or at least part of the AI picture)

LLMs redefine developer experience: Express to FastAPI migration, GCP insights, TanStack vs Next, and practical dev strategies.

Show Notes

Pushing through the Express to FastAPI move, this daily update dives into how LLMs shift the developer experience, plus practical notes on AI SDLC, background agents, and the roadmap for the next-gen tooling.

LLMs are shaping the dev experience

  • LLMs change tool choices more than coding languages do: pick stable, well-supported stacks that LLMs can exploit well.
  • Design patterns matter: even with LLMs, avoid hand-rolled edge cases; lean on proven patterns (e.g., the Remix/T3 stack suits fast, reactive front ends).
  • For feature work, use a three-solution approach: generate three viable options, then pick and refine.
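The three-solution approach can be sketched as a tiny loop over the model. This is a hypothetical illustration: `ask_llm` is a stand-in for whatever client you actually use (OpenAI, Anthropic, etc.), not a real API.

```python
def ask_llm(prompt: str) -> str:
    # Stand-in for a real LLM client call; swap in your provider's SDK.
    return f"Proposed design for: {prompt}"

def three_solutions(feature: str) -> list[str]:
    """Generate three viable options for a feature; pick one and refine it."""
    return [
        ask_llm(f"Option {i}: design '{feature}' with a different approach")
        for i in range(1, 4)
    ]

options = three_solutions("events feed")
print(len(options))  # 3
```

The point is to force divergence before convergence: three genuinely different designs, then a deliberate pick, rather than accepting the model's first answer.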

GCP: bullish but with caveats

  • Strong case for GCP: robust data-science tooling, TPUs, and a deployment-first mindset.
  • Not for every use case: enterprise DB services can be pricey for small projects (consider alternatives like Supabase at smaller scales).
  • If you want deeper context, check out the GCP playlist Parker mentions.

AI SDLC and context: designing with prompts

  • AI SDLC framework: use structured prompts to guide development.
    • Idea prompt
    • PRD prompt and PRD+ prompt (poke holes)
    • Architecture prompt (Python and TypeScript examples)
    • System patterns and tasks prompts
  • The goal: give the AI enough context to handle complex feature additions in a monorepo (apps include FastAPI, Express, Discord bots, Next.js web app).
  • Three-solution method applies here too: generate multiple viable approaches before committing.
  • Augment, co-pilots, and context tooling matter to keep the AI grounded in your repo and patterns.
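The prompt chain above can be modeled as a simple ordered pipeline. A minimal sketch, assuming nothing beyond the stage names from the show notes; the `PromptStage` class and `build_prompts` helper are illustrative, not from any real tool.

```python
from dataclasses import dataclass

@dataclass
class PromptStage:
    """One stage of the AI SDLC: a named prompt template."""
    name: str
    template: str

    def render(self, context: dict) -> str:
        return self.template.format(**context)

# Stage order follows the show notes: idea → PRD → PRD+ → architecture → tasks.
SDLC_STAGES = [
    PromptStage("idea", "Describe the feature idea: {feature}"),
    PromptStage("prd", "Write a PRD for: {feature}"),
    PromptStage("prd_plus", "Poke holes in this PRD: {feature}"),
    PromptStage("architecture", "Propose a {language} architecture for: {feature}"),
    PromptStage("tasks", "Break {feature} into implementation tasks"),
]

def build_prompts(feature: str, language: str = "Python") -> dict[str, str]:
    """Render every stage prompt; feed each to your LLM in order."""
    ctx = {"feature": feature, "language": language}
    return {stage.name: stage.render(ctx) for stage in SDLC_STAGES}

prompts = build_prompts("Discord bot slash commands")
print(prompts["architecture"])
```

Each stage's output becomes context for the next, which is exactly what gives the model enough grounding to handle a complex feature in a monorepo.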

Background agents: practical patterns and pain points

  • Context is everything: agents need access to the right code, docs, and prompts to do real work.
  • Tools mentioned: Augment, Cursor, and a co-pilot strategy to scaffold agent workflows.
  • Watch for reliability and cost: early builds can spin in loops; prefer reproducible templates and guarded workflows.

Architecture and roadmap: where the build is headed

  • Current focus: migrate from Express to FastAPI; build a thin back end with a rich front end.
  • Stack direction: FastAPI + Next.js, with a modular monorepo hosting API, Discord bots, and a web app.
  • Vision for V0 (production-ready UX): a dashboard-like experience with an AI-ops playground and scripting inside the app.
    • Features to expect: overview, events, repositories, announcements
    • AI integrations playground: a prompt library with inputs, outputs, and model selections
    • Learning tracks, user-generated paths, and co-pilots trained for different frameworks/languages
    • Context brains-based data scraping (Builder.io) to power knowledge and prompts

V0 product UX focus (high-level)

  • Prompt playground: test and compare prompts, with metadata like popularity and inputs/outputs
  • Repository and membership views: who can access what
  • Integrations: GitHub, Discord, etc. wired up
  • Settings & news: billing, invoices, notifications, and a live news/data feed
  • Learning paths and co-pilots: tailored to languages/frameworks you’re learning
  • Data pipeline and learning content: curated from docs, YouTube playlists, and community sources
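A prompt-playground entry with the metadata described above could be modeled as below. Field names (`model`, `popularity`) are assumptions for illustration; the real product may store different attributes.

```python
from dataclasses import dataclass, field

@dataclass
class PromptEntry:
    """One playground prompt: template, model selection, inputs, popularity."""
    name: str
    template: str
    model: str
    inputs: list[str] = field(default_factory=list)
    popularity: int = 0

def top_prompts(library: list[PromptEntry], n: int = 3) -> list[str]:
    """Rank prompts by popularity for a 'most popular' playground view."""
    ranked = sorted(library, key=lambda p: p.popularity, reverse=True)
    return [p.name for p in ranked[:n]]

library = [
    PromptEntry("prd", "Write a PRD for {feature}", "gpt-4o", ["feature"], popularity=42),
    PromptEntry("idea", "Brainstorm around {topic}", "claude", ["topic"], popularity=17),
]
print(top_prompts(library))  # → ['prd', 'idea']
```

Storing inputs and model selection per entry is what lets users test and compare the same prompt across models, which is the core of the playground idea.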

Security and privacy note

  • Be mindful: background testing and agents may access source code depending on the tool’s policy.
  • Review privacy policies for the tools you use (Cursor, Augment, etc.) and proceed accordingly.

Community strategy and mindset

  • The “marketing flywheel”: teaching and sharing builds relationships and opens doors, even if monetization isn’t the primary goal.
  • Building in a tech desert context benefits from community learning, not just code.

Quick takeaways and action items

  • If you’re starting a new AI-enabled feature, run through the AI SDLC prompts and generate three options first.
  • For tool selection, favor stacks that maximize context and reliability for agents (e.g., stable back ends + robust front-end patterns).
  • Start small with background agents and scale context carefully to avoid runaway costs.
  • Watch the upcoming V1 and V0 timeline (June launch; beta in a week) and consider how the playgrounds and prompts can plug into your current projects.