I was reading through a long post with hundreds of comments, left by people after a FB quiz on the vaccine.

It was a perfect illustration of the quagmire our society is in. Social media is a kind of endgame in the old divide-and-conquer strategy that has been in play for as long as there have been humans around. As a species we need and seek leadership. In an ideal world, leaders are beneficent, putting the good of the people ahead of their personal interests. In this western world of ours, we have experienced a long period of plenty, with food, engineering and technology lavishing their gifts upon us. What once was true science has morphed into a kind of religion: scientism.

At the same time, over recent decades we have allowed in the rot of safetyism. I say rot because safetyism erodes that all-important sense of self-reliance (by limiting the risks children are allowed to take, for example), while giving the community a sense that the state will always be looking after them and has their best interests at heart. This may be genuine for a while, but it leads inevitably to a moment when a less benevolent plan can be set in motion, one that depends completely on the community's absolute confidence that the powers of government are there to protect them and keep them safe.

When the vast bulk of wealth is concentrated in a few hands; when sovereign nations are indebted to world banks to the point where their individual currencies are controlled by those banks; when the population swells to many billions, with ever more people wanting a share of the material riches of the more fortunate; when technology expands its reach to the extent that nearly every person carries a smartphone loaded with social media applications, then society has entered a dangerous place where beneficence is a figment of the imagination. The game becomes one of social control, and the mechanism for enacting that control is the people themselves. We find ourselves divided into individual digital units, with widely varying levels of experience and intelligence.

Only a very few see it coming, and those who do cannot be heard through the din of a highly conditioned majority who are, in the current moment, unwittingly applauding the inward creep of a tyrannical state, because they believe it is keeping them safe, that it is for the "greater good". There is a mental blindness to anything that contradicts that belief, and a tendency to aggression when it is challenged.

How difficult it is, whichever side of the divide you are on, to look with an open heart and mind into the opposite viewpoint. Social media feeds the differences. People who feel violated by another's opinion vent their angst at each other, and this creates the distraction necessary for the bigger changes in social management to be crafted into what is perceived as law.

Jaron Lanier is a polymath who was instrumental in the creation of some of today's social media. His realisations are well worth reading.



“…’If the first half of the twentieth century was the era of the technical engineers, the second half may well be the era of the social engineers’ – and the twenty-first century, I suppose, will be the era of the World Controllers, the scientific caste system and Brave New World…The older dictators fell because they could never supply their subjects with enough bread, enough circuses, enough miracles and mysteries… Under a scientific dictatorship education will really work – with the result that most men and women will grow up to love their servitude and will never dream of revolution. There seems to be no good reason why a thoroughly scientific dictatorship should ever be overthrown.”

– Aldous Huxley’s “Brave New World Revisited”



I am forever grateful to Wim Hof, the ice man, whom we discovered some years ago. His breath techniques have seriously strengthened my lungs, and his ice baths have resulted in my body adapting to temperatures of 10-11 degrees with equanimity. A great sense of well-being comes from his practices.





Something entirely new is happening in the world. Just in the last five or ten years, nearly everyone started to carry a little device called a smartphone on their person all the time that’s suitable for algorithmic behavior modification. A lot of us are also using related devices called smart speakers on our kitchen counters or in our car dashboards. We’re being tracked and measured constantly, and receiving engineered feedback all the time. We’re being hypnotized little by little by technicians we can’t see, for purposes we don’t know. We’re all lab animals now.

Algorithms gorge on data about you, every second. What kinds of links do you click on? What videos do you watch all the way through? How quickly are you moving from one thing to the next? Where are you when you do these things? Who are you connecting with in person and online? What facial expressions do you make? How does your skin tone change in different situations? What were you doing just before you decided to buy something or not? Whether to vote or not?

All these measurements and many others have been matched up with similar readings about the lives of multitudes of other people through massive spying. Algorithms correlate what you do with what almost everyone else has done.

The algorithms don’t really understand you, but there is power in numbers, especially in large numbers. If a lot of other people who like the foods you like were also more easily put off by pictures of a candidate portrayed in a pink border instead of a blue one, then you probably will be too, and no one needs to know why. Statistics are reliable, but only as idiot demons.

Are you sad, lonely, scared? Happy, confident? Getting your period? Experiencing a peak of class anxiety?

So-called advertisers can seize the moment when you are perfectly primed and then influence you with messages that have worked on other people who share traits and situations with you.

I say “so-called” because it’s just not right to call direct manipulation of people advertising. Advertisers used to have a limited chance to make a pitch, and that pitch might have been sneaky or annoying, but it was fleeting. Furthermore, lots of people saw the same TV or print ad; it wasn’t adapted to individuals. The biggest difference was that you weren’t monitored and assessed all the time so that you could be fed dynamically optimized stimuli—whether “content” or ad—to engage and alter you.

Now everyone who is on social media is getting individualized, continuously adjusted stimuli, without a break, so long as they use their smartphones. What might once have been called advertising must now be understood as continuous behavior modification on a titanic scale.

Please don’t be insulted. Yes, I am suggesting that you might be turning, just a little, into a well-trained dog, or something less pleasant, like a lab rat or a robot. That you’re being remote-controlled, just a little, by clients of big corporations. But if I’m right, then becoming aware of it might just free you, so give this a chance, okay?

A scientific movement called behaviorism arose before computers were invented. Behaviorists studied new, more methodical, sterile, and nerdy ways to train animals and humans.

One famous behaviorist was B. F. Skinner. He set up a methodical system, known as a Skinner box, in which caged animals got treats when they did something specific. There wasn’t anyone petting or whispering to the animal, just a purely isolated mechanical action—a new kind of training for modern times. Various behaviorists, who often gave off rather ominous vibes, applied this method to people. Behaviorist strategies often worked, which freaked everyone out, eventually leading to a bunch of creepy “mind control” sci-fi and horror movie scripts.

An unfortunate fact is that you can train someone using behaviorist techniques, and the person doesn’t even know it. Until very recently, this rarely happened unless you signed up to be a test subject in an experiment in the basement of a university’s psychology building. Then you’d go into a room and be tested while someone watched you through a one-way mirror. Even though you knew an experiment was going on, you didn’t realize how you were being manipulated. At least you gave consent to be manipulated in some way. (Well, not always. There were all kinds of cruel experiments performed on prisoners, on poor people, and especially on racial targets.)

This book argues in ten ways that what has become suddenly normal—pervasive surveillance and constant, subtle manipulation—is unethical, cruel, dangerous, and inhumane. Dangerous? Oh, yes, because who knows who’s going to use that power, and for what?


You may have heard the mournful confessions from the founders of social media empires, which I prefer to call “behavior modification empires.”

Here’s Sean Parker, the first president of Facebook:

We need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever.… It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology.… The inventors, creators—it’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom on Instagram, it’s all of these people—understood this consciously. And we did it anyway … it literally changes your relationship with society, with each other.… It probably interferes with productivity in weird ways. God only knows what it’s doing to our children’s brains.

Here’s Chamath Palihapitiya, former vice president of user growth at Facebook:

The short-term, dopamine-driven feedback loops we’ve created are destroying how society works.… No civil discourse, no cooperation; misinformation, mistruth. And it’s not an American problem—this is not about Russian ads. This is a global problem.… I feel tremendous guilt. I think we all knew in the back of our minds—even though we feigned this whole line of, like, there probably aren’t any bad unintended consequences. I think in the back, deep, deep recesses of, we kind of knew something bad could happen.… So we are in a really bad state of affairs right now, in my opinion. It is eroding the core foundation of how people behave by and between each other. And I don’t have a good solution. My solution is I just don’t use these tools anymore. I haven’t for years.

Better late than never. Plenty of critics like me have been warning that bad stuff was happening for a while now, but to hear this from the people who did the stuff is progress, a step forward.

For years, I had to endure quite painful criticism from friends in Silicon Valley because I was perceived as a traitor for criticizing what we were doing. Lately I have the opposite problem. I argue that Silicon Valley people are for the most part decent, and I ask that we not be villainized; I take a lot of fresh heat for that. Whether I’ve been too hard or too soft on my community is hard to know.

The more important question now is whether anyone’s criticism will matter. It’s undeniably out in the open that a bad technology is doing us harm, but will we—will you, meaning you—be able to resist and help steer the world to a better place?

Companies like Facebook, Google, and Twitter are finally trying to fix some of the massive problems they created, albeit in a piecemeal way. Is it because they are being pressured or because they feel that it’s the right thing to do? Probably a little of both.

The companies are changing policies, hiring humans to monitor what’s going on, and hiring data scientists to come up with algorithms to avoid the worst failings. Facebook’s old mantra was “Move fast and break things,” and now they’re coming up with better mantras and picking up a few pieces from a shattered world and gluing them together.

This book will argue that the companies on their own can’t do enough to glue the world back together.

Because people in Silicon Valley are expressing regrets, you might think that now you just need to wait for us to fix the problem. That’s not how things work. If you aren’t part of the solution, there will be no solution.

This first argument will introduce a few key concepts behind the design of addictive and manipulative network services. Awareness is the first step to freedom.

Copyright © 2018 by Jaron Lanier

Sometime in the '70s I remember visualising our society as a dinosaur running down a slope. It has gathered momentum and finds it can't slow down. There's a cliff way up ahead. It understands it has a problem and, thinking the problem is internal, gets its long neck between its running legs and thrusts its head up its fundamental orifice to rectify it.