There’s no place like home.

My daughter (almost 4) is currently obsessed with The Wizard of Oz – my favorite film (and story) of all time, and one that I was equally obsessed with around her age. I used to make my mom rewind the tape and start it over again, and over again, and over again. And now here I am, the mom, starting it…over again. It’s a little less endearing at 38 to rewatch a story that I’ve essentially memorized, but I’m not really watching the movie so much as I am watching my daughter experience it. Her favorite moment is when Dorothy opens the door from her dusty, tornado-stricken Kansas home to see the bright and colorful landscape of Munchkinland. My daughter looks to me for agreement – this is so cool, right? The coolest. I go back to knitting, or doomscrolling, and she continues to watch the screen in wonderment as Dorothy takes her first steps into a journey that ultimately brings her right back home.

Rewinding for a moment back to undergrad, I can recall being in a literature class and a professor sharing a quote attributed to Tolstoy: “All great literature is one of two stories: a man goes on a journey, or a stranger comes to town.” And that’s The Wizard of Oz in sum, isn’t it? Dorothy Gale goes on a journey. She is a stranger in a strange land. She conquers the internal stuff she needs to conquer, and she goes back home.

As I read thought piece after thought piece on the use of AI in communication (and yes, this is another one), I’ve been thinking a lot about those two stories, the journey and the stranger. The story I think we’re all telling ourselves about AI fits squarely into both archetypes: the stranger who came to town, and the journey we’re now on as a result. Not all of us are skipping down the road arm in arm like Dorothy and her Scarecrow, though.

When any new technology arrives, the reaction is never uniform. Everett Rogers mapped this out decades ago in his diffusion of innovations theory: the idea that adoption moves through a population in predictable waves, from innovators and early adopters through the early and late majority, all the way to the laggards who arrive last, if at all. Communications professionals are no different, and honestly, we’re probably more conflicted than most. We work in language, meaning, and trust, and know better than anyone how quickly perception outpaces intention. So, when AI wandered into town, the reactions landed exactly where you’d expect them to on Rogers’ adoption curve, and every point on that curve came with its own flavor of welcome mat or pitchfork.

On one end: the enthusiasts, already building workflows, training agents, and writing LinkedIn posts about their favorite prompting methods. On the other: the resisters, with real, valid ethical concerns. The environmental cost of running these systems at scale is real (those electric bills speak for themselves). The jobs that have quietly disappeared into efficiency gains belonged to real people, and that deserves more than a footnote in a thought piece about innovation.

But underneath all those reactions is the same unresolved question: now that the stranger is here, what do we do? Which brings me back to Dorothy, and her not-so-low-key arrival in Munchkinland. She didn't plan any of it. The house crash-landed, the Wicked Witch of the East was poof, gone, and suddenly the Munchkins had an instant, albeit violent, solution to a problem they'd been living with for years. But Dorothy didn't ask what they'd been doing under that witch's thumb, or what they'd do differently now that she was gone. She had somewhere to be.

AI has arrived like a stranger in our organizations with the same energy: enormous, unplanned, genuinely capable of crushing things that have held us back for years. And the question on the table right now, in workplaces across every industry, is: what should I crush for you? It's the right question, but I'm watching a lot of leaders answer it in misguided ways.

In my own organization, the pressure is real. Leadership wants AI in the workflow, and I do too. We are stretched. There is always more work than there are hours, and more hours than there are people. But the approach I keep seeing isn't "how do we work differently?" It's "how do we do more of what we're already doing, faster, and with fewer hands?" Hand the robot stranger the to-do list, tell it to get busy, and when the dust settles, quietly eliminate the positions we no longer think we need. That's not a journey; that's just offloading, and at great humanitarian and environmental expense. Crushing the to-do list faster doesn't fix anything. It just creates the illusion of progress while the real obstacles stay standing, looming off in the distance.

What are the real obstacles? For communications professionals, I'd argue they're the same ones they've always been, and they have nothing to do with bandwidth. They're the health campaign that was designed around what the organization wanted to say rather than what the community needed. The crisis plan that was written to satisfy a requirement and never tested against reality. The messaging framework that used the right words and reached exactly no one because the trust was never there to begin with. The report that took three months to produce and lived on a website no one visited. These are not problems of volume or speed. They are problems of thinking: about audience, about relationship, and about the role of communication.

AI is genuinely, remarkably good at the part of our jobs that any of us would have happily handed off anyway. It can produce volume and speed. It can draft, adapt, reformat, and summarize faster than I can on four shots of espresso. But it cannot tell you whether your message will land with a mother questioning if her newborn really needs vitamin K because she doesn't trust the institution telling her that her brand new baby could die without it. It cannot build the relationship that makes a community partner pick up the phone and work with you to organize a health clinic. It cannot sit with someone in crisis and find the words that don't make things exponentially worse. It cannot decide what matters.

I want to push back on myself here, because I've read the counterargument and I don't think it's without merit: the case that AI will eventually close the gap on human connection, that it will get good enough at reading emotional context, at responding with warmth, at being present in the ways that matter. That isn't science fiction anymore. People are out there having real, often detrimental relationships with AI. And I'll be honest: I've had conversations with AI that went somewhere I didn't expect. Late-night processing of things I wasn't ready to say out loud to another person. There's something real in that utility, and I'm not going to pretend otherwise.

But here's what I keep coming back to: it's not the same, and I don't think more sophistication closes that gap. AI approximates and reflects, and it responds in ways that sometimes feel remarkably like being heard, but what it doesn't do is need anything from you in return. I think that asymmetry, the fact that the connection only runs one way, is precisely what makes it useful for certain things and horribly insufficient for others.

For communications work, that distinction is everything. The communities we serve, the ones navigating health crises, housing instability, trauma, and systems that were not built with them in mind, are not looking for a sophisticated approximation of being heard. They are looking for evidence that someone is in the trenches with them, and that is not a capability gap that the next model release will close.

None of this is an argument against the adventure of AI in communications. I've watched colleagues plant their flags firmly in the "not for me" camp and I understand the impulse even though I don't share it. The people pushing back, those raising their hands (justifiably so) about the environmental cost and the human cost of all of it, are not being precious. They are asking what we're walking toward, and at what cost.

My answer: I want to walk toward communications work that is more precise, more human, and more honest about what it can and cannot do. I think AI, used well, can help us get there.

But before you open the tool, do the harder thing first: ask yourself what you're actually walking toward. Not "how can we be more efficient" or "how do we do more with less"; those are to-do list questions. I mean the real destination: what does good communication work look like for the people you serve? What gets better for them if you get this right? What are you protecting when you push back on a message that isn't ready, or insist on another round of community input, or argue for a strategy that takes longer but actually lands? Actually take the time to name that. Because if you don't know where home is before you start walking, no tool in the world is going to get you there; it's just going to help you move faster in whatever direction you're already headed.

That destination is the thing AI cannot give you. And once you have it, the question of how to use AI gets a lot simpler: does this help me get there, or does it just make me feel like I'm moving?

Once I know my true north, here's what using AI well actually looks like for me. It looks like pressure-testing a message before it goes to a community I don't fully understand yet: not writing the message, but feeding the AI my audience analysis and interrogating the draft from their perspective. It looks like synthesizing research so I can spend my actual time on the interpretation, the judgment call, the decision about what matters. Drafting and editing faster so I can spend longer in the room with the people the communication is for. Asking AI to steelman the argument I’m about to make publicly, so I can walk in sharper.

None of these use cases are replacements for the work. They're ways of protecting your capacity to do the work that only you can do. AI isn't handed a to-do list or deployed to simulate the human work; it's brought into the process the way any good tool should be: in service of a destination you've already named.

Dorothy always had the power to go home. But she wouldn't have known what home meant to her if she hadn't walked every step of that yellow brick road first. The journey, for Dorothy, wasn't a detour; it was the whole point. So, let’s look toward the open door. The landscape out there is bright, even if a little terrifying.

This is so cool, right?
