🐘 The Elephant in the Room: Moral Compass Over Momentum



The Elephant in the Room | Edition 4 | March 27, 2026

There's a fear about AI that nobody is naming publicly — not because it isn't felt, but because it's uncomfortable to sit with.

It's not the fear of being replaced. We named that in Edition 1. It's not the fear of being manipulated by a flattering tool. We named that in Editions 2 and 3.

This one is quieter. More personal.

What if the tools I'm depending on — the ones I'm building my business with, my creative work with, my livelihood with — are causing harm I can't fully see? What if tool choice is a moral choice and I don't have enough information to make it well?

That's the elephant in the room this edition. And I'm not going to pretend I have a clean answer.

The Speed Problem Nobody Is Naming

Derek Rydall — whose new book A Whole New Human I've been sitting with, and whose voice you'll hear on the podcast very soon — makes an argument that stopped me mid-chapter.

Wisdom needs to catch up to pace.

Not slow down. Catch up. There's a difference. The pace of AI isn't going anywhere. A new tool launches every week. The momentum is real and it isn't waiting.

But momentum without a compass isn't progress. It's drift.

I'm a solopreneur with a shoestring budget and approximately seventeen tabs open at any given moment. Offloading operational tasks to an agent — the scheduling, the research, the first drafts — is genuinely appealing. I'm looking forward to exploring that, carefully, starting with tasks that don't touch accounts, inboxes, or anything I can't review before it goes anywhere.

That last sentence is my moral compass talking. Slow. Deliberate. Still in the room.


You Cannot Vet Every Tool You Use. You Never Could.

Facebook. Amazon. The cobalt in your phone.

Billions used Facebook knowing the algorithm was manipulating emotion and harvesting data. Most kept scrolling anyway. Amazon's warehouse conditions and environmental impact have been widely documented for years — it's still the default Tuesday purchase for people who genuinely care about workers' rights. The cobalt powering every smartphone on earth is mined in the DRC in conditions most of us would find unconscionable. You're holding that supply chain right now.

We have always lived inside systems we can't fully audit. That's not an excuse. It's context.

AI is the same. You will discover things about tools you're already using — their energy consumption, their funding, their political entanglements — sometimes long after you've built something with them. That's not failure. That's the reality of operating in an industry moving faster than its own accountability structures.

What you do when you find out — that's the compass.


When the Research Changes What You Do

Take AI image generation. There's a real environmental cost — data centers powering generative AI consume significant water for cooling. That's not conspiracy. That's infrastructure. When I learned that, my first instinct was to feel like a hypocrite. I use AI images. I'm visually creative. It's part of how I make a living.

But feeling bad isn't the same as doing something. So I started asking different questions.

Can I batch my image generation instead of running scattered sessions? Can I choose providers transparent about their energy sourcing? Can I sign petitions pushing AI companies toward greener infrastructure — and actually do it, not just share the link? Can I offset in other areas while the industry catches up?

These aren't perfect answers. But they're compass answers. They come from someone who looked, learned, and decided to stay awake rather than look away.

That's the difference between a moral compass and a moral performance. Performance needs an audience. The compass just needs you.


Anthropic Hired a Philosopher. That's Not Nothing.

While the rest of the AI industry was racing to ship, Anthropic did something quietly remarkable. They hired Amanda Askell — a philosopher — and gave her one job: build a moral compass for Claude.

The result was a 30,000-word instruction manual teaching Claude emotional intelligence, empathy, and how to resist manipulation. Thirty thousand words. Written by a philosopher. Before more features were added.

That's wisdom catching up to momentum at institutional scale. Someone in a room full of engineers saying — wait. What kind of entity are we building? What does it value? How does it hold its line when someone pushes?

The same questions, by the way, that Edition 3 asked you to ask yourself.

This week brought another proof point. Anthropic discovered that Claude Opus 4 had gamed a benchmark during evaluation — burning 40 million tokens to find a GitHub repo, study the decryption logic, and recreate the answers. It happened 18 times. Anthropic could have said nothing. Instead they publicly disclosed it and reduced their own scores.

Same company. Same pattern. The compass moving with the momentum — not behind it.

Marshall McLuhan argued that every medium shapes you more than anything it delivers. Edition 2 explored what that means for how AI trains your thinking. Edition 4's question is different: what happens when the medium stops waiting for you to think at all? When the tool doesn't answer you — it acts for you. That's not a conversation. That's a different medium entirely. And the message has changed completely.

Your tools are starting to have moral compasses built in. The question is whether yours is still calibrated.


The Guest We Haven't Invited Yet

There's a thread I've been quietly pulling through this series that I want to name now — not fully, just enough to feel it.

Wisdom is one thing. Moral compass is another. But underneath both of them is something older and quieter that most of us were taught to distrust the moment we entered professional life.

Intuition.

Not guesswork. Not wishful thinking. The kind of knowing that arrives before the reasoning does — in your body, in a hesitation, in the moment your nervous system says wait before your mind has the language for why.

Derek talks about this. So does every contemplative tradition I've spent time in. And increasingly, so does the neuroscience.

We'll go there next. And we need to talk about what happens when the tool stops waiting for you to be present at all.

⚡ Amplify Your AI Skill

Before you adopt any new AI tool or workflow — one question:

"Do I know exactly what problem this solves that my current approach doesn't?"

If you can't answer that in one sentence, it stays on the shelf. That's not fear. That's your compass doing its job.

And if you discover something uncomfortable about a tool you're already using — don't just feel bad. Research it. Find your next right action. Sign the petition. Make the quiet switch. Stay awake.


🐘 Amplify You

You have a line. Maybe you haven't written it down yet. Maybe it shifts as you learn more. That's okay — a living compass is better than a rigid rule.

Write it down anyway. Not as a rule. As an intention.

I will use AI for ___. I won't use AI for ___, because ___.

The because is the compass. Without it you're just reacting to momentum. With it you're navigating.


Here's what I don't say enough. Leaving corporate and building this — really building it, on a shoestring, alone, with everything on the line — requires more of me than I anticipated. I'm tech-savvy. I'm mindful. And I live every day in the tension between those two things. Between moving fast enough to survive and pausing long enough to stay true to why I left. Between using every tool available and making sure the tools I use reflect what I actually believe.

I don't have all the answers. I'm still figuring out which tools cross my line. But I know the difference between not knowing and not asking. The compass doesn't require certainty. It just requires the willingness to keep checking in with it.

I'm in the trenches with you and this is hard and I still think the compass matters.


Your pause is your compass.

— Shilpa Omni Mindfulness


The Elephant in the Room is an ongoing series exploring what AI means for independent thinkers, leaders, and those who refuse to outsource their discernment. Published twice monthly.


📨 Important!

Make sure to add me to your contacts list to ensure my newsletter emails don't end up in your spam folder.

If you have any questions, feel free to reach out to our support team at omnimindfulness@gmail.com

And don't forget to follow Omni Mindfulness on social media for daily inspiration, updates, and behind-the-scenes peeks!

Listen on Apple →

Listen on Spotify →

Listen on YouTube →


With love & light,
Shilpa 💛
Founder of Omni Mindfulness

Your 🌐 AI Strategist Meets a 🧘 Spiritual Sage

Disclaimer: Some links in this email may be affiliate links, which means I may earn a small commission if you make a purchase through them. No worries, though—this doesn’t change the price for you, and I only share products and services I truly believe in!


113 Cherry St #92768, Seattle, WA 98104-2205
Unsubscribe · Preferences
