How I AI · 45m

Claude Code + 15 repos: how a non-engineer answers every customer question | Al Chen

TL;DR

  • A non-engineer at Galileo uses Claude Code across 15 repos to answer customer questions docs can’t — Al Chen cloned Galileo’s multi-repo codebase into one VS Code workspace so he can ask cross-service questions like how API, auth, and deployment layers actually work together for enterprise customers.

  • The key move is treating the live codebase as the source of truth, not public docs — because Galileo ships multiple features and releases per day, Al uses a 16-line, AI-written “pull all” script to sync every repo’s main branch daily and keep answers current.

  • Customer-specific context is what turns generic AI into real support leverage — Al feeds Claude Code not just repos, but Confluence pages and a “customer quirks” page with details like Google Secrets Manager, CRD restrictions, namespaces, and sidecar/security requirements to generate tailored deployment guidance.

  • AI makes messy knowledge systems usable instead of forcing perfect documentation hygiene — the conversation argues teams can stop obsessing over whether truth lives in Confluence, Slack, Notion, or code, because Claude plus MCPs can traverse all of it and assemble the answer.

  • The workflow doesn’t end at answering the question; it turns support into a knowledge flywheel — using Pylon, Al converts Slack support threads into public help articles, creating a faster-moving knowledge base than the formal docs process and surfacing patterns that can inform product roadmap decisions.

  • Humans still matter because customers want judgment, trust, and signal compression — Al doesn’t paste raw AI output; he rewrites it to sound human, trims verbosity, checks sources and code lines, and still sanity-checks edge cases with engineers when the answer depends on unwritten context or future refactors.

The Breakdown

When the docs stopped being enough

Al Chen opens with a blunt admission: the moment he realized he “couldn’t really do my job” was when public docs and standard AI prompting still weren’t producing the answers customers actually needed. At Galileo, where he works in field engineering for an AI observability platform, enterprise customers ask deeply technical questions about a product spread across many backend services deployed into Kubernetes.

Pulling 15 repos into one workspace

The unlock was simple and slightly wild: clone all the service repos into one giant VS Code directory and let Claude Code query the whole system. Al isn’t an engineer by title, but with UI, API, auth, and the rest all sitting side by side, he can ask Claude to trace how features work across repos instead of pinging engineering in Slack all day.
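The one-time setup is a few lines of shell. A minimal sketch of the idea (the workspace path, function name, and repo URLs below are placeholders, not Galileo’s actual repos):

```shell
# Hypothetical sketch: gather every service repo into one workspace folder
# so Claude Code can search across all of them at once.
clone_all() {
  local workspace="$1"; shift       # first arg: workspace dir; rest: repo URLs
  mkdir -p "$workspace"
  local repo
  for repo in "$@"; do
    # Clone each repo into a subdirectory named after it.
    git clone -q "$repo" "$workspace/$(basename "$repo" .git)"
  done
}

# Example (placeholder URLs):
# clone_all ~/galileo-workspace \
#   git@github.com:org/api.git git@github.com:org/auth.git git@github.com:org/ui.git
```

Opening that one parent directory in VS Code then gives Claude Code visibility into every service side by side.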

A 16-line script keeps the whole thing current

Because Galileo ships fast, stale code would ruin the whole setup, so Al had Claude write him a tiny script that runs git pull on the main branch of every local repo. Instead of manually running pulls in 15 directories, he now does one “pull all” each day and keeps his workspace synced to the latest source of truth.
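The episode doesn’t show the actual 16-line script, but the idea is easy to sketch in shell (the workspace path, function name, and branch name here are assumptions):

```shell
# Sketch of the "pull all" idea: update the main branch of every git repo
# directly under a workspace directory, skipping anything that isn't a repo.
pull_all() {
  local workspace="${1:-$HOME/galileo-workspace}"
  local dir
  for dir in "$workspace"/*/; do
    [ -d "$dir/.git" ] || continue            # skip non-repo directories
    echo "Updating $(basename "$dir")"
    git -C "$dir" checkout main >/dev/null 2>&1 || true
    # --ff-only refuses surprise merges if local state has diverged.
    git -C "$dir" pull --ff-only origin main || echo "  pull failed, skipping"
  done
}
```

Running `pull_all` once a day is all it takes to keep the whole workspace current.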

Claude + Confluence + customer quirks

Al then shows a deployment workflow built around a custom Claude command for Kubernetes installs. It first checks Confluence via MCP, then falls back to the codebase, and it gets dramatically better when paired with a living “customer quirks” page that captures enterprise-specific constraints like air-gapped environments, secrets management, namespaces, sidecars, and service-to-service encryption.
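Claude Code custom commands are markdown files stored under `.claude/commands/` and invoked as slash commands. A hypothetical reconstruction of what such an install command might look like (the file name and step wording are invented, not shown in the episode):

```markdown
<!-- .claude/commands/k8s-install.md — hypothetical command file -->
Answer a customer's Kubernetes deployment question for: $ARGUMENTS

1. First search our Confluence deployment docs via the Confluence MCP server.
2. If Confluence doesn't cover it, read the relevant service repos in this workspace.
3. Check the "customer quirks" page for this customer and apply any constraints:
   air-gapped environment, secrets manager, CRD restrictions, namespaces,
   sidecars, service-to-service encryption.
4. Cite the Confluence page or the file and line numbers your answer is based on.
```

The `$ARGUMENTS` placeholder lets the command take the customer name at invocation time, which is how the generic workflow gets paired with customer-specific context.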

Let the AI navigate the chaos

One of the most memorable ideas in the conversation is that teams can now “live in a little bit more chaos” because AI can traverse scattered information better than humans ever could. Instead of obsessing over the perfect source of truth, the advice is practical: save the useful Slack thread, dump the notes into Confluence or Notion, and give Claude more context to work with later.

Turning support threads into a knowledge engine

In the second half, Al walks through Pylon, which Galileo uses to monitor external customer Slack channels. From one dense support thread, he can generate a help article draft, publish it into a public knowledge base, and create a faster-moving layer of documentation than the official docs repo and PR process allow.

AI handles the draft; humans handle the trust

Al is clear that he doesn’t blindly forward Claude’s output to customers. He trims the AI tone, removes overly verbose “in summary” filler, asks for source citations and code lines, and still checks with engineers when the answer may depend on hallway conversations, meeting notes, or refactors that haven’t landed in code yet.

The bigger lesson: customer-facing teams need hard skills now

The lightning round lands on a broader thesis: this is the era of the hard skill, even for non-engineers. Both Al and the host argue that customer-facing people should get comfortable with Git, repos, IDEs, and code-shaped workflows, because AI makes technical depth more accessible — and more necessary — than ever.