Recently, a Los Angeles jury returned an unprecedented verdict: a young woman successfully sued Meta and YouTube, arguing that the platforms were intentionally designed to be addictive and that this addiction harmed her mental health. She was awarded $6 million in damages. Meta and Google say they will appeal, arguing that teen mental health is complex and cannot be blamed on a single app.
They are not entirely wrong.
But they are not entirely right either.
Because this case is not really about one young woman. It is about the kind of technological world we are building, and what that world is doing to us in return.
I remember an evening around 2001. My wife and I were invited to dinner at a colleague’s house — the colleague had cooked for us. While we chatted in the kitchen and around the table, her husband sat in the living room playing video games for much of the evening. I remember feeling that our interactions with him were childlike and avoidant. Since then, I have had many clients describe losing not just evenings, but years inside games. One client told me he had spent more than 20,000 hours on three games. He spoke about responsibility to the server, to the other players, about keeping morale up, organizing events, making sure everyone succeeded. In another context, we would call him a leader, a manager, a community builder.
But the world he was holding together existed on a server that could be turned off.
And yet, I have also seen technology function as something like medicine.
In 1992, when the Toronto Blue Jays won the World Series, I was volunteering at a hospital in Mississauga. This was just before the dot-com explosion, before the internet became what it is now. There was a patient there with many complications of multiple sclerosis. His wife had left him, and he was largely alone. But someone — maybe even a small group of dedicated people — had set him up with a ham radio.
Through that radio, he could leave the hospital without leaving the hospital. He could talk to truckers on the major highways running through Toronto. For hours at a time, he could take part in conversations, communities, and friendships. You could see the change in him. It was connection. It was dignity.
I remember thinking at the time: this is good medicine.
I was reminded of that again recently. A friend of mine was caring for his dying mother — she was 95 years old. In her final days, they spent hours together watching YouTube videos of life back in China. They followed one particular channel — a grandmother planting vegetables, harvesting them, cooking simple meals, living close to the land. My friend told me how peaceful it was, how it seemed to comfort his mother, how the two of them sat together watching scenes that reminded her of earlier life, earlier rhythms, earlier worlds.
How wonderful that technology could give them that.
A window back into memory.
A shared space at the end of a life.
So the question is not whether technology is good or bad. That is too simple.
The real question is: When does technology expand a life, and when does it consume one?
This is where the thinking of Bertrand Russell feels prophetic. He once warned that the real danger is not that machines become more like humans, but that humans become more like machines — more predictable, more conditioned, more responsive to stimulus than to meaning.
Social media platforms and online games are not neutral. They are highly refined behavioural machines. They measure where you look, how long you look, what makes you angry, what makes you laugh, what makes you stay. Then they adjust. They learn you the way a casino learns a gambler.
A book ends.
A movie ends.
A baseball game ends.
Even a casino will sometimes close.
But the scroll does not end.
The game world does not end.
The server is always running.
So we are left with a strange situation: we have built machines that are extraordinarily good at holding human attention, and we are only beginning to ask what that means for a human life.
Because attention is not a small thing.
Attention is your life.
I think back to that man with MS and his ham radio. That technology gave him more life. It connected him to the world when his world had become very small.
I think of my friend and his mother, watching videos of a grandmother planting and harvesting vegetables in China. That technology gave them comfort, memory, and a way of being together at the end.
That is what good technology does.
It gives you more life.
The danger is when technology starts to take more life than it gives.
And that is the question now facing parents, therapists, teachers, and maybe soon, the courts:
Are these machines serving human beings, or are human beings slowly learning to serve the machines?
That is not just a technical question.
It is a moral one.
And increasingly, it is a clinical one.
Because in therapy rooms everywhere now, we are not just treating anxiety, depression, or loneliness.
We are treating people who are in relationships with machines — machines that know exactly how to keep them from leaving.
The task ahead is not to destroy these technologies. Like that ham radio in 1992, they can be extraordinary tools for connection and meaning.
The task is to make sure they remain tools —
and do not quietly become environments that take more life than they give.
Because in the end, the real question is very simple:
Does this technology give you more life, or does it take your life and turn it into time spent on a server that never turns off?