The Report That Came Back Shuffled
My AI assistant can write code, debug systems, research topics, manage files across dozens of projects. But when I asked it a basic orientation question about my work environment, it drew a blank. Not because it couldn’t access the information. Because nobody had ever connected the dots for it.
So I spent a real chunk of the afternoon doing something that felt oddly like onboarding a new hire. Walking through the project ecosystem. Explaining which directories matter. Clarifying what connects to what. Giving it the map of the territory, not just individual tasks to run.
The shift was immediate. Once the assistant understood the landscape, the nature of what I could ask it changed. Before, every request needed full context spelled out. After, I could say something closer to “handle this” and trust that it understood the surrounding constraints. We obsess over what AI can do. What it knows about your specific world matters more.
Every person’s context is different, though. My project structure, my naming conventions, my preferences, the relationships between my work and my clients. None of that comes pre-loaded. The investment is manual, human, and specific. But it compounds in ways that raw capability never does.
Something I haven’t fully processed yet. A big portion of today’s work involved using AI to write and edit posts for this very journal. The tool I’m documenting is the tool producing the documentation. That sounds like a novelty, but it created something useful.
Writing about the process while inside the process catches things. Turning a day’s work into a narrative forces a review you wouldn’t otherwise do. While editing today’s posts, I noticed patterns in how I was delegating tasks that I hadn’t consciously chosen. Some things I was handing off without thinking. Others I was holding onto for no reason other than habit. The writing surfaced the habits. The AI did the writing. And the cycle tightened.
A Turing Award winner once argued that writing doesn’t record thinking, it improves it. Five days into this journal and that’s not theory anymore. It’s Tuesday afternoon.
A piece crossed my feed about the hottest AI job according to Silicon Valley VCs. The punchline: it’s not technical. It’s strategic. Not prompt engineering, not model training. The role that’s emerging is the person who understands a business deeply enough to know where AI fits and where it doesn’t. That tracked with my afternoon. The hard part wasn’t the technology. It was translating my business context into something the technology could use.
Separately, Codie Sanchez mentioned speaking to 2,000 people recently. Ten were using agentic AI. About 20% had heard of the tools I use every day. We’re living inside a bubble, and the edges are farther away than we think. Those numbers sat with me all evening.
One more that stuck: someone pointed out that the real wealth gap isn’t rich versus poor. It’s business owners versus everyone else. Ownership compounds; employment doesn’t. AI is collapsing the cost of starting and running a business. I don’t know what to do with that yet, but I keep turning it over.
Six days in. The journal keeps circling back to the same thing: the bottleneck is context. Not computing power, not model intelligence. Just the patient, unglamorous work of teaching your tools what you actually do all day. Nobody’s automating that part yet.