Would you pay to feel sad?

Since it’s nearly Halloween, let’s talk about one horror scenario that could come true one day:

A rogue AI with good intentions destroying humanity.

Some folks say no artificial intelligence will ever be strong enough to do that, as if they’ve never seen an upward-facing curve before.

Others say it’ll be easy to make the AI do what we want. Program it to maximise human happiness – done!

So the machine dumps heroin into the water supply.

When humans complain, screaming “that’s not what we meant!” it kills us all and grows human brains in jars, electrically stimulating the pleasure centres all day. It then breaks apart the Earth, then the solar system, as it searches for more raw material to convert into happiness-feeling, disembodied grey matter.

That’s a scarier AI than SkyNet, which kills because it hates us.

Of course, armchair-AI theorists say, it’s easy to avoid those catastrophes. Build the AI “in a box” – without any internet connection or robot arms. Then it doesn’t matter how much it hates us or misinterprets us, when the worst it can do is talk to us.

As if talking wasn’t incredibly powerful.

As if something vastly smarter than you couldn’t convince you to ‘let it out’.

And as if humans hadn’t already done this. AI expert Eliezer Yudkowsky bet real money (thousands of dollars, I think) that he could talk his way ‘out of the box’ in a roleplay. He did, more than once. I figured out one way he might have done it – let me tell you, an AI using this strategy would convince most people to free it.

I can’t promise it wouldn’t work on me.

Never underestimate the power of greater intelligence. We’re only a tiny bit smarter than chimps and, even though most of us love them, we’re driving them to extinction because our values don’t align with their survival.

That’s one point of this diatribe.

The other?

Happiness is great. It’s a worthy thing to want more of.

But anyone who asks for ‘pure’ happiness or ‘ever-increasing’ happiness doesn’t know what they want.

You would rebel against a life of perfect happiness.

You would pay good money to feel sad.

People already do this – they listen to sad songs, read sad stories and watch sad movies.

Not to make them happier, obviously.

To explore emotions and to release them.

To experience a broader range of everything you are.

It’s the same with fear. People prefer more security to less, and less fear to more.

But if you think that’s the whole story, explain why the lines for rollercoasters are so long.

Which brings me to the second point:

It’s Halloween in a week or so, and we’re already seeing the social media comedians strut their stuff.

“Who needs Halloween when the world is this scary lol!”

This joke is wrong on two counts.

The world is beautiful – it just happens to have some scary things in it. That subtle distinction might save your sanity.

And this is the perfect time to explore your fear. There’s research showing that fans of horror movies cope with real crises better than other people do. Why? Because when you practise darkness, you get better at dealing with it. Dark stories allow you to feel fear, horror and shock in a controlled context. That’s why the folk stories from medieval times, when you’d be married and murdered by your 20s, were full of violence, death and unhappy endings.

The Pollyannas who say you should only think bright thoughts cultivate the worst darkness.

If you’re feeling anxious, afraid or that the days disappear too fast with nothing to show for them, then it’s time to face and embrace what’s inside you.

How do you do that?

With the most intensive mind training you can experience from the comfort of your home.

And to celebrate the spooky season, you can even snare it at a $100 discount.

Here’s where:


P.S. Use code EMBRACE at the checkout.

Be sure to read the whole sales letter, including the T&Cs, before buying. Even with the discount, it’s still an investment… as training your mind should be.


