Let’s address the elephant in the room: AI. People have mixed feelings about it. I do too. AI is powerful and it is dangerous. It will make it harder to tell truth from propaganda as the tools get better at sounding confident. It can speed up work, and it can speed up harm.

Here’s where I land. AI is great at writing, organizing, and analyzing information, and at helping me write code. AI should not be making decisions that affect people’s lives. Not in employment, not in healthcare, not in housing, and not anywhere else with real-world stakes. I’ve had a manager who seemed to outsource critical thinking to ChatGPT, copying generic answers and leaning on it to make management decisions. That’s not leadership, and it’s not what these tools should be used for.

There are ethical issues we need to solve. Large language models were trained on work made by creators who didn’t consent. The energy and water required to train and run these systems carry a real cost, and the emissions from AI datacenters are significant and growing. These costs matter. And while we’re at it, here’s another line in the sand: I don’t think AI should be used to make art. It can imitate style, but it doesn’t carry lived experience. If people want to explore AI as a medium, consent and compensation for human artists are table stakes.

Used responsibly, fact-checked, and audited, AI can be a powerful productivity tool. Here are the ways it helps me every day.

Reclaim for calendar management

I’ve used Reclaim for years and I can’t imagine not having it. I live and die by my calendar, and my schedule changes constantly. Reclaim lets me feed it tasks with due dates and it blocks focused time automatically. That single feature has saved me from dropping balls more times than I can count. Smart 1:1s find the best time to connect with a teammate without back-and-forth. Syncing my work and personal calendars keeps me from double-booking myself. When my calendar reflects reality, everything else gets easier.

ChatGPT for writing and information organization

I’m a good idea-er, not a naturally clean writer. My first drafts are chaotic. ChatGPT helps me turn a pile of nonlinear notes into something other humans can read. I still take a few passes to nail tone and voice, but it’s faster than wrestling a blank Google Doc alone.

It’s also useful for analysis. I can paste a lot of data and ask for an executive summary, or I can ask it to walk through details step by step. I can point it at documentation and ask a nuanced question that a simple find-in-page won’t uncover. It isn’t perfect. Sometimes it gets things wrong or hallucinates. Trust but verify. Cross-check claims. Treat it like a sharp tool, not an oracle.
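What does "trust but verify" look like in practice? One cheap check I can automate: any number the model puts in a summary should be traceable back to the data I gave it. Here's a minimal sketch of that idea; the function name and sample strings are mine, not part of any tool.

```python
import re

def unverified_numbers(summary: str, source: str) -> list[str]:
    """Return numeric claims in `summary` that never appear in `source`.

    A crude cross-check: pull every figure the model stated and flag
    the ones that can't be found in the original data.
    """
    summary_numbers = re.findall(r"\d+(?:\.\d+)?%?", summary)
    return [n for n in summary_numbers if n not in source]

# Hypothetical example: the model invents a churn figure.
source = "Q3 revenue was 1.2M, up 8% over Q2; churn held at 3%."
summary = "Revenue hit 1.2M, an 8% increase, with churn near 5%."

print(unverified_numbers(summary, source))  # flags the hallucinated "5%"
```

It's deliberately dumb: substring matching misses reworded units and derived figures. But even a dumb check catches the most common failure mode, which is a confident number that appears nowhere in the source.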

GitHub Copilot for code

Copilot is the newest part of my toolkit and it quickly became a favorite. I often know what I want to build and get stuck on the how. Before Copilot, that meant hours of searching, trying, failing, deciphering cryptic errors, and tab-explosions. Now I can iterate in minutes instead of hours. Copilot suggests scaffolds, explains errors in plain language, and unblocks me when I’m close but not quite there.

It helps in code reviews too. It surfaces improvements I wouldn’t have thought of, like refactors for readability or safer patterns for edge cases. It feels like pairing with a knowledgeable coworker who never gets tired of answering my questions. For straightforward tasks, I can write a well-defined GitHub issue, assign it to Copilot, and focus my attention on the work that needs a human brain.

Running an experiment: product owner mode

I’m experimenting with a project where I’m not writing a single line of code. I’m acting as a product owner. I write product requirements, user stories, feature requests, tasks, and bug reports. Copilot does the programming. The learning curve is real, but I have a working prototype of something I never would have built solo. Maybe I’ll share it, maybe I won’t. Either way, it’s been fun and it proves a point: with clear thinking and tight feedback loops, these tools can extend what a single person can produce.

Guardrails that keep me honest

AI is a power tool. Power tools demand respect. Here are the rules I try to follow:

  • Keep humans in the loop for decisions that affect people. Advice is fine. Decisions are not.
  • Fact-check, cite sources, and audit outputs. If I can’t verify it, I don’t ship it.
  • Prefer consented and licensed data and models when possible. Compensate creators.
  • Be mindful of the environmental cost. Efficiency matters.
  • Don’t use AI to make art. Encourage human creators. If AI is involved, make consent and compensation explicit.
  • Remember the failure modes: confident nonsense, subtle bias, and security risks. Design processes that catch them.

The payoff

With these guardrails, AI gives me time back. Reclaim defends focused work. ChatGPT turns messy notes into clear drafts and helps me interrogate information. Copilot accelerates development and explains the weird parts. I still own the outcomes. I still do the thinking. The tools make me faster and more consistent, not less responsible.

That’s the point. AI shouldn’t replace judgment, taste, or accountability. Used well, it amplifies them. Used poorly, it erodes them. Treat it like a circular saw: read the manual, mind your fingers, keep the blade sharp, and measure twice before you cut.

💖 Support My Work

If you enjoyed this post and want to support my blogging and open-source work, consider becoming a sponsor on GitHub Sponsors. Your support helps me continue creating and sharing valuable resources!
