We no longer live with machine learning. We live within it.

When journalist A.J. Jacobs decided to spend 48 hours without AI, he expected a few inconveniences: no Netflix recommendations, no autocorrect, no chatbots. What he did not expect was how quickly he would discover how profoundly artificial intelligence had already woven itself into every corner of modern life.

What caught my attention most in his story was not the humour or the logistical challenge of the experiment, but the care with which Jacobs defined AI before beginning. He did not limit it to generative content tools or image creators. Instead, he included machine learning as a whole, thereby encompassing all the systems that learn from data, adjust predictions, and quietly power everything from weather forecasts to traffic lights to spam filters and energy grids.

That choice recognised that machine learning has been shaping our world for decades without most of us noticing it. Just as ecosystems are made up of interdependent species, today’s human systems are intertwined with algorithmic ones. We are participants in, and willing data providers to, an algorithmic ecology that influences how we live, lead, and relate.

The Ecology We Inhabit Is Not the One We Designed

When we speak about AI, we often imagine futuristic robots or clever chatbots. But beneath those visible forms lies machine learning, a quiet, adaptive force that has shaped our world for decades.

It optimises shipping routes, predicts customer churn, scans resumes, recommends news, and manages infrastructure. Each model learns from another, forming invisible chains of influence that link industries, governments, and individuals.

We have, in effect, built a vast algorithmic ecosystem, a web of interdependent systems that learn, adapt, and influence each other continuously.
No single person or organisation designed it, yet all of us are part of it.

Like nature, this ecosystem is complex, self-organising, and full of feedback loops. Unlike nature, it lacks inherent balance. It evolves toward efficiency, not necessarily toward wisdom.

Interdependence Without Awareness

In ecology, imbalance in one area sends ripples through the whole. The same is true of our digital systems.

For example, a tweak in supply-chain logistics can alter global shipping routes and, with them, global energy use and emissions. A bias embedded in one company’s recruitment AI can quietly influence hiring norms.

We live within this algorithmic interdependence but often without any awareness of it.

As Jacobs discovered, even trying to “opt out” of AI exposes just how dependent we have become. It is not a question of whether something is touched by machine learning, but to what degree.

Leadership in a System You Did Not Design

For leaders, this raises an unsettling truth:
You are accountable for outcomes that are shaped by systems you did not build and cannot fully see.

Here, systemic leadership offers valuable guidance. Drawing on the work of Jan Jacob Stam, who explores organisations as living systems governed by invisible fields and patterns, we can think of machine learning as part of the organisational field, an unseen force that shapes behaviour, decision-making, and culture.

In this view, leadership is not about controlling the algorithmic system. It is about becoming aware of the field you are already in.

That awareness unfolds in three dimensions:

  1. Systemic awareness – seeing your organisation as part of a larger ecosystem of data flows, platforms, and feedback loops.
  2. Ethical awareness – asking who benefits, who is excluded, and what dynamics your technology choices reinforce.
  3. Temporal awareness – recognising that machine learning compounds over time; today’s small bias becomes tomorrow’s norm.
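The compounding in point 3 can be made concrete with a toy simulation (every number and function here is an illustrative assumption, not data from any real system): imagine a screening model that is retrained each cycle on the selections it made in the previous cycle. A tiny initial skew in the training data grows with every retraining pass.

```python
# Toy feedback-loop sketch: a model retrained on its own past decisions
# amplifies a small initial skew. Purely illustrative; not a real system.

def retrain(share_a: float, bias: float = 0.05) -> float:
    """One retraining cycle: the model over-selects the group that
    dominates its training data by a small margin `bias`, and the
    resulting selections become the next cycle's training data."""
    picked_a = share_a * (1 + bias)          # group A slightly over-selected
    picked_b = (1 - share_a) * (1 - bias)    # group B slightly under-selected
    return picked_a / (picked_a + picked_b)  # group A's share next cycle

share = 0.52  # group A is 52% of the initial data: a barely visible skew
for cycle in range(10):
    share = retrain(share)

print(round(share, 2))  # after 10 cycles the 52% skew has grown to ~0.75
```

The mechanism is just compounding odds: each cycle multiplies the group-A odds by (1 + bias) / (1 − bias), so a 2-point skew becomes a 25-point one within ten retraining passes. That is what "today’s small bias becomes tomorrow’s norm" looks like in miniature.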

From Users to Stewards

In natural ecosystems, every species contributes to maintaining balance. In the algorithmic ecology, leaders are called to act as stewards of relationships between human and algorithmic systems.

That stewardship might include:

  • Creating transparency around how algorithms inform key decisions.
  • Including algorithmic impact in strategic and ethical reviews, alongside social and environmental impact.
  • Encouraging teams to question how dependence on predictive tools affects creativity, empathy, and judgment.

As Jan Jacob Stam reminds us, awareness itself is an intervention. When leaders become conscious of the invisible dynamics shaping their organisations, the system begins to shift.

Learning to Live Consciously in the Machine

A.J. Jacobs ended his 48-hour experiment both amused and unsettled. The world, he concluded, is far more algorithmic than we realise, and the line between the digital and the human has all but dissolved.

Machine learning is now part of our collective ecosystem. The task for leaders is not to escape it, nor to surrender to it, but to lead within it, with awareness, ethics, and curiosity. We may not have designed this system, but we can learn to navigate it wisely.
And in doing so, perhaps we can restore balance, humanity, and meaning in the age of the machine.
