More Budget Won’t Save You. We Can’t Outspend Inefficiency Anymore.
Spending doesn’t fix inefficiency, it just funds failure. The only way forward is structured, deliberate thinking before execution.
This morning, I was sitting on the rocks in front of Øresund, zoning out a bit, just watching the water.
For once, I wasn’t rushing through my day to day, I was just still.
It reminded me of Ryan Holiday’s book, Stillness is the Key, and how real clarity comes in the moments where you stop forcing motion. I’m still working on adopting stillness as a way of life, but when it comes to my work? That’s different.
People assume being fast means cutting corners. But I’m fast because I have a process. One that values deep, critical thinking first, prioritizing long-term impact over quick wins, and doing what’s actually best for the client, not surface-level “we’ll figure it out later” decisions. And if there’s one thing I’ve struggled with over the last 15 years of my career, it’s accepting inefficiency.
Businesses treat inefficiency like a budget problem rather than a decision-making one.
When something doesn’t work, they spend more on ads, tools, hiring, or automation, assuming scale can compensate for bad strategy. But more budget doesn’t fix misalignment. It just makes failure more expensive. A broken system with more budget is still broken and burns through cash at a faster rate.
These are some examples of how businesses subsidize their own inefficiencies:
a. More Budget, Same Broken Strategy. Instead of fixing weak messaging or poor targeting, companies throw money at campaigns, hoping volume will override the problem. It never does.
b. More Tools, More Chaos (and more overhead). Tech stacks keep expanding, not because they add efficiency, but because buying software feels like problem-solving. Data-driven type sh..
c. More People, Same Problems (and even more overhead). Hiring increases, not because of real workload demands, but because inefficiencies are misdiagnosed as resource shortages. This results in overstaffing, reorgs, and layoffs, a predictable cycle masquerading as a business strategy.
The digital environment that once rewarded brute-force spending has changed. Brand perception is no longer dictated by ad spend, it’s shaped in decentralized, real-time environments where businesses have no control. Conversations move faster than campaigns. Customer journeys are non-linear. Attention is fragmented.
Yet most companies are still operating as if paid acquisition alone can sustain them.
The Psychology of Avoiding Thinking
If inefficiency is so costly, why do businesses keep making the same mistakes? Why does spending still feel like the right answer, even when it obviously isn’t?
Because business culture rewards speed, optics, and performative progress over precision. Deep thinking, actual strategic problem-solving is actively disincentivized.
Thinking forces teams to confront uncertainty. It requires intellectual discomfort, challenging assumptions, questioning existing strategies, and admitting blind spots.
That’s why businesses gravitate toward quick, surface-level solutions that create the illusion of efficiency.
As products of these environments, we’re wired to believe that doing something is always better than doing nothing.
In an environment where speed is mistaken for competence, stillness is seen as stagnation.
In most organizations, productivity is measured by output rather than outcomes. Thinking is dismissed as “overcomplicating,” while shallow, fast execution is celebrated as agility.
This results in:
Projects moving forward without clear objectives because “we’ll figure it out later.”
Teams shipping unfinished work under the assumption that problems can be patched later, when later is always more expensive.
Confusing reactivity with adaptability, making decisions based on pressure instead of intelligence.
When speed is the only KPI that matters, execution becomes the goal instead of effective problem-solving.
Thinking OS: A Playbook for Operationalizing Thinking
The goal isn’t to move slower, it’s to replace habitual, reactive spending with structured, deliberate thinking before execution.
Over the years, I’ve worked with teams across product, data, marketing, and operations, and I’ve seen firsthand how reactive decision-making burns resources, time, and momentum.
I’ve put together my own process as a way to structure my decision-making, borrowing from strategic planning, risk management, and execution frameworks used across different industries. I’ve kept refining it over the years, across different experiences, and I like to think of it as a Thinking OS.
It’s not a new concept, but it’s a practical, tactical system that I’ve been applying as a commercial leader to improve the efficiency and clarity of how I work. That’s why the quality of my work is always mindful and focused on excellence over speed. Even though I am pretty fast :)
This is how it works.
Core Principles
Explicit “Kill” Criteria
Define success and failure before starting. If an initiative doesn’t meet the criteria, it gets cut. No dragging it along “just in case.” Without this, teams waste resources keeping inefficient projects alive.
Success and failure must be clearly defined before launching any initiative.
If a project lacks clear "kill criteria," it signals unpreparedness and wasted resources.
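To make this concrete, here’s a minimal sketch of what pre-agreed kill criteria could look like in code. The initiative, thresholds, and field names are all hypothetical, invented for illustration; the point is that success and failure conditions are written down before launch, and the check against them is mechanical, not a debate.

```python
from dataclasses import dataclass

@dataclass
class Initiative:
    """Hypothetical initiative record; names and numbers are illustrative."""
    name: str
    min_conversion_lift: float   # success threshold, agreed before launch
    max_weeks_without_lift: int  # kill criterion, agreed before launch

def should_kill(observed_lift: float, weeks_elapsed: int, initiative: Initiative) -> bool:
    # If the pre-agreed window has passed and the lift target isn't met, cut it.
    # No "just in case" extensions: the criteria were fixed up front.
    return (weeks_elapsed >= initiative.max_weeks_without_lift
            and observed_lift < initiative.min_conversion_lift)

campaign = Initiative("retargeting-test", min_conversion_lift=0.05,
                      max_weeks_without_lift=6)
print(should_kill(observed_lift=0.01, weeks_elapsed=8, initiative=campaign))  # True
```

The design choice worth noting: because the thresholds live in the initiative itself, "should we keep this alive?" stops being a matter of who argues loudest in the review meeting.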
Decision Reversal Ratio
If decisions never get revisited, it’s a red flag. Either teams are moving too fast without enough upfront thinking, or they’re too rigid to adapt.
Track how often decisions are revisited or reversed.
If no decisions ever get walked back, teams are prioritizing speed over thinking.
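The ratio itself is trivial to compute once you keep a decision log. This is a sketch under the assumption of a simple log where each decision carries a `reversed` flag; the log entries and field names are made up for illustration.

```python
def decision_reversal_ratio(decisions: list[dict]) -> float:
    """Share of logged decisions that were later revisited or reversed.

    A ratio stuck at 0 can signal rigidity (nothing is ever re-examined);
    a very high ratio can signal too little upfront thinking.
    """
    if not decisions:
        return 0.0
    return sum(d["reversed"] for d in decisions) / len(decisions)

# Hypothetical decision log for one quarter
log = [
    {"id": "switch-crm",        "reversed": False},
    {"id": "pause-paid-social", "reversed": True},
    {"id": "hire-freelancer",   "reversed": False},
    {"id": "new-pricing-page",  "reversed": False},
]
print(decision_reversal_ratio(log))  # 0.25
```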
Proactive Thinking Time
Most inefficiencies don’t come from a lack of action but from a lack of thinking before action.
Integrate structured thinking into workflows, not just post-mortems.
Anticipate problems before they happen, instead of reacting to them later.
Stop Funding the Wrong Metrics
Track how many low-value initiatives were avoided, not just how many were completed.
Measure decision quality, not just results. If a team hits revenue targets through reactive, costly moves, that’s not efficiency, that’s luck.
Prioritize long-term impact over short-term optics. If acquisition spikes but churn follows, the growth model is broken.
How do you put it into practice?
a. Create Decision Thresholds
Not all decisions require the same level of scrutiny. The depth of review should scale with investment.
Under $1K: Teams move fast, own execution.
$1K–$10K: Requires documented objectives and rationale.
Over $10K: Must pass strategic vetting, clear exit criteria, and executive approval.
PS: Obviously, these numbers would look different depending on the context.
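The threshold routing above can be sketched as a tiny function. The dollar cutoffs mirror the illustrative numbers in the list, and the level names are my own shorthand; in practice you’d swap in your organization’s actual tiers.

```python
def scrutiny_level(cost_usd: float) -> str:
    """Map an investment size to the review it requires.

    Thresholds are the illustrative ones from the article; adjust per context.
    """
    if cost_usd < 1_000:
        return "team owns execution"
    if cost_usd <= 10_000:
        return "documented objectives and rationale"
    return "strategic vetting, exit criteria, executive approval"

print(scrutiny_level(500))     # team owns execution
print(scrutiny_level(5_000))   # documented objectives and rationale
print(scrutiny_level(50_000))  # strategic vetting, exit criteria, executive approval
```

The point of encoding it, even informally, is that nobody relitigates how much scrutiny a given spend deserves; the tier is decided by the number, not the mood of the meeting.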
b. Run Challenge Sessions
Before major launches or project kickoffs, force structured debate. Every initiative should have an assigned “devil’s advocate” challenging its assumptions.
Define rules of engagement to keep discussions productive.
Rotate the devil’s advocate role to prevent bias.
Document key takeaways for organizational learning.
c. Do Cost Attribution Analysis
Most businesses only measure ROI, but what about the opportunities they ignored?
Assess what didn’t get funded as a result of this decision.
Regularly review whether resource allocation is aligned with long-term strategy.
Make cost attribution transparent so teams feel the weight of trade-offs.
d. Make Strategic Decision Pauses
Before finalizing major strategic moves, enforce a 24-hour cooling period.
Pause prevents reactionary decisions.
Forces second-order thinking: what happens if this works? If it doesn’t?
Great story Juliana, but how can I practically do this?
You might be thinking: "This sounds great, but getting an entire organization to implement this? Good luck."
And you’d be right. Rolling out structured decision-making at scale is hard. Too many moving parts, too many types of personalities, too much "we’ve always done it this way."
That’s why I don’t really wait for buy-in.
I use this system as a standalone process for myself on every project I touch. Before I commit to anything, whether it’s a strategy, an analysis, a new product, anything, I run it through these filters. I define what success looks like, what failure looks like, and whether it’s the best thing for my client and their clients.
It’s why I’m able to move fast without making a mess.
Long-term impact > surface-level urgency.
So yeah, implementing this org-wide? Difficult.
Using it yourself? Very doable. It’s just something you do before you throw yourself into anything.
And yes, we can’t outspend inefficiency anymore.
Most of the time, good enough is good enough. It keeps the wheels turning, avoids overcomplication, and gets the job done.
But when good enough stops working, it fails completely. The shortcuts compound, the inefficiencies stack up, and suddenly, businesses aren’t just moving fast, they’re running in circles, burning resources to compensate for bad decisions.
The only way forward is reinforcing thinking where it matters. Cutting what should never have existed. Replacing reaction with clarity. That’s what being efficient really means.
Until next time,
x
Juliana
PS: This way of working was heavily influenced over time by Kyle Brodeur, who was my manager when I was working at CXL, and of course my good friend, and partner in building AI products, Krasimir Bambalov. Two people that changed everything about how I think of project delivery. Definitely follow and connect with them.
PPS: Shorter version of the newsletter this week because I still have to wrap up my presentation for Conversion Boost Copenhagen happening next Tuesday, March 18th. Excited to see some of you there and wish me luck 😂