This Computer Scientist Built an App That Randomized His Life

What would happen if you gave up your free will?

Eliza
11 min read · Dec 17, 2020

Algorithms control more of our experiences than ever before. What we watch on Netflix, what we listen to on Spotify, what gets recommended to us on Instagram, all of these choices are governed by software designed to learn our preferences and feed us more of what we want. But what if those algorithms didn’t care what we wanted? What would life be like if we truly had no idea what was coming next? That’s the question Max Hawkins set out to answer a few years ago. This is his story.

It was already getting dark by the time the car pulled up outside Max Hawkins’ apartment. His phone buzzed as a notification told him the details of its make, model, and registration number. The car had been ordered from his own Uber account, which wouldn’t have been particularly noteworthy if he’d been the one who requested it. Or if he had any idea where it was supposed to be taking him. But he didn’t.

He got in anyway.

The perfect life

A few weeks earlier, Max had been living the perfect life. After graduating from Carnegie Mellon University, he’d landed his perfect job as a software engineer at Google. He was based in the perfect city, San Francisco, famous for its perfect weather and perfect-looking people.

Every morning, Max woke up full of energy at precisely 7 a.m., dropped in at his favorite coffee shop to pick up his favorite coffee, and cycled to work via a carefully optimized route that took him precisely 15 minutes and 37 seconds. Everything in Max’s life was exactly as he wanted. Yet he couldn’t shake the feeling that he wasn’t in control.

He didn’t figure out why this was until he read a research paper about a location-based machine-learning algorithm. According to the paper, if you fed the algorithm the coordinates of all the places you’d been for the past week, it would predict with surprising accuracy where you were going to be on the following day.

If a computer knew what he was going to do before he did, why was he even necessary?

As Max considered what would happen if he were to input his own details, he realized how easy it would be to anticipate where he would be on any given day. A fancy algorithm probably wasn’t even necessary. Anybody who observed his routine for a week would be able to predict his future movements down to the meter.
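As a rough illustration of that point (a toy sketch, not the model from the paper Max read), a week of logged visits and a frequency count are all it takes to “predict” a routine like his:

```python
from collections import Counter, defaultdict

# Toy sketch only, not the paper's machine-learning model: a frequency table
# over one week of (hour, place) visits is already enough to "predict" a
# routine as rigid as Max's.

def build_model(visits):
    """visits: iterable of (hour, place) pairs from the past week."""
    by_hour = defaultdict(Counter)
    for hour, place in visits:
        by_hour[hour][place] += 1
    return by_hour

def predict(model, hour):
    """Guess tomorrow's location at `hour`: the most-visited place at that hour."""
    counts = model.get(hour)
    return counts.most_common(1)[0][0] if counts else None

week = [(8, "coffee shop"), (9, "office"), (19, "home")] * 7  # a very Max-like week
model = build_model(week)
print(predict(model, 8))  # -> coffee shop
```

For anyone who wakes, commutes, and works on the same schedule every day, even this crude guess is almost always right.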

The idea that his pursuit of the perfect lifestyle made him predictable bothered him, but what bothered him even more was the question of what his role in the decision-making process was. Was he really living? Or was he just stuck in a meticulously ordered rut? After all, if a computer knew what he was going to do before he did, why was he even necessary?

Related reading: “Free Will in an Algorithmic World” (onezero.medium.com). In this brave new world, many of our choices aren’t choices at all.

Build a machine to beat a machine

Instead of spiraling deeper into an existential crisis, Max did what most computer scientists would do when faced with a difficult problem: he built a solution. He decided the best way to break out of the loop he was stuck in was to make his life so unpredictable that even he wouldn’t know what was coming next.

Which brings us back to the Uber.

The first step in Max’s plan was an app that randomized his social life. If he told it he wanted to go out to eat, for example, it would pick a Google listing at random, order an Uber on his behalf, and send it to his apartment.

Crucially, it would do all of this without giving him any idea where he was going until he got there, so there was no way for his preferences to interfere. When a friend asked him to choose somewhere to go that evening, Max took the opportunity to test it out. He fired up the app, the Uber arrived at his apartment, and they let it whisk them off into the night.
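Stripped to its essentials, the app only had to do three things: fetch nearby listings, pick one with no regard for ratings or preferences, and order the car without ever revealing the choice. Here’s a minimal sketch of that flow, using hypothetical stand-ins for the places and ride-hailing calls (the real APIs involve accounts, keys, and client libraries not shown here):

```python
import random

def fetch_nearby_places(lat: float, lon: float) -> list[dict]:
    """Hypothetical stand-in for a places lookup (e.g. Google Places)."""
    # In the real app this would call an external API; here we fake a few listings.
    return [{"name": "taqueria"}, {"name": "karaoke bar"}, {"name": "planetarium"}]

def order_ride(pickup: tuple[float, float], destination: dict) -> None:
    """Hypothetical stand-in for a ride-hailing request (e.g. Uber)."""
    # A real version would create a ride request via an API client.
    pass

def surprise_me(home: tuple[float, float]) -> None:
    places = fetch_nearby_places(*home)
    choice = random.choice(places)  # no ratings, no filters, no preferences
    order_ride(home, choice)
    # Deliberately never show the destination to the rider:
    # the surprise is the whole point.
    print("A car is on its way. Destination: unknown.")

surprise_me((37.7749, -122.4194))
```

The important design decision is in the last two lines of `surprise_me`: the program knows the destination, but the rider doesn’t.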

Instead of spiraling deeper into an existential crisis, Max did what most computer scientists would do when faced with a difficult problem: he built a solution.

After a few minutes, Max and his friend found themselves driving through a part of San Francisco that they’d never been in before, and as they looked at the unfamiliar surroundings, doubts began to set in. What kind of person lets a machine send them to some random spot in a big city? What’s so terrible about sticking to what you know? What if the place they were going to didn’t have locally sourced, organic, grass-fed beef?

But before they had a chance to answer any of these questions, the driver announced they’d arrived. In his 2017 TEDx Talk, Max describes what happened next:

When the driver told us we’d reached our destination we thought it must have been a joke. We showed up in front of this austere brick building with a wrought iron fence in front of it and a sign that said the words: San Francisco General Hospital Psychiatric Emergency Center.

In what may be a sign from the universe, the computer sent me to the local mental hospital.

Adventures in randomized living

Despite this minor setback, Max was hooked on his new adventures in randomization. He expanded his app to randomize things like where he got his hair cut and where he bought his groceries and what music he listened to on Spotify.

When he transitioned to freelance work, he let an algorithm choose random cities around the world for him to live in. He socialized by attending random Facebook events. He signed up for a website that eliminated random foods from his diet. He even got a random image from the internet tattooed on his chest. For over two years, Max left every aspect of his daily life to the capricious whims of machines.

And it was during this time, while attempting a downward dog in a randomly selected acrobatic-yoga class in Mumbai, the city that had been randomly chosen as his home for the next few months, that Max had an epiphany: His preferences had been a prison, and abandoning them had set him free.

The nagging feeling that he wasn’t in control of his life was born of how narrow it had become. His life was made up of a small selection of things he’d decided were for him, and it excluded everything else. By opening himself to new experiences, he’d discovered places and met people he would never have found by staying in his comfort zone. And by doing so, he enjoyed countless experiences that he’d previously been missing out on, not because he couldn’t appreciate what they had to offer, but because it would never have occurred to him to try.

Invisible options

As search and social media algorithms control more and more of what we’re exposed to, we find ourselves in a similar situation. Amazon knows what books we like, YouTube knows what videos we enjoy, and Facebook knows who our friends are, how we vote, and pretty much everything else about us. And all of these services are laser-focused on showing us more of those things to keep us engaged.

What does this emphasis on the things we already like leave in our blind spot? What are we missing because we don’t know it’s there? And as these systems become more ubiquitous, how long before we stop noticing that anything is missing? As Eric Schmidt, former CEO of Google, says, the end goal of these systems is to work so well that we don’t even need to think: “I actually think most people don’t want Google to answer their questions. They want Google to tell them what they should be doing next.”

What does it mean for us if this vision of the future comes true? If a machine can predict our desires so accurately that we can simply push a button and an Uber will take us somewhere we’ll love, do we still have free will? And more importantly, what happens when the machine suggests something that it shouldn’t?

What kind of person lets a machine send them to some random spot in a big city?

Kill a deer

The dangers of ceding too much control to our robot overlords are vividly illustrated by another of Max’s creations: a printer that suggests random activities.

To set it up, he wrote a program that scanned thousands of books, blog posts, and Wikipedia articles for verb phrases, looking for constructions such as “she went to the park and…”. With a little time and some grammar processing, it could then produce an almost limitless pool of ideas.
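A rough sketch of that kind of grammar processing, assuming the NLTK library (the corpus handling and verb normalization here are guesses at the approach, not Max’s actual code; newer NLTK releases may also require the punkt_tab and _eng variants of these resources):

```python
import random
import re

import nltk

# One-time downloads for sentence splitting and part-of-speech tagging.
nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

def extract_activities(corpus: str) -> list[str]:
    """Collect clauses that follow '... and <verb> ...' as candidate activities."""
    activities = []
    for sentence in nltk.sent_tokenize(corpus):
        # Look at whatever comes after the word "and".
        for clause in re.split(r"\band\b", sentence)[1:]:
            words = nltk.word_tokenize(clause.strip(" .,;!?"))
            if not words:
                continue
            # Keep clauses that start with a verb, e.g. "fed the ducks".
            # A fuller version would lemmatize the verb into an imperative
            # ("feed the ducks") before printing it as a suggestion.
            if nltk.pos_tag(words)[0][1].startswith("VB"):
                activities.append(" ".join(words))
    return activities

corpus = "She went to the park and fed the ducks. He packed a bag and drove north."
ideas = extract_activities(corpus)
if ideas:
    print("Today's suggestion:", random.choice(ideas))
```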

Most of them weren’t very useful. The list was full of suggestions that were too vague or abstract, such as “go to the park and think you are a tree” or “meet a certain goal.” But one day, Max fired up the printer, and it came up with this: “Kill a deer.”

Deer hunting hadn’t been part of Max’s perfectly optimized life so far, so his first instinct was to throw the piece of paper away and move on with his life. But then he asked himself the same question he’d been asking since the whole experiment began: “Why is that not something that I could do?”

When he thought about it, he realized there wasn’t a particularly good reason. He ate meat without any moral discomfort, even though he knew that doing so required the death of countless animals. He might even have eaten deer at some point. He didn’t have any particular problem with guns. Wasn’t his feeling that he couldn’t kill a deer just another self-limiting preference?

So he tried. Despite his misgivings, Max went on a hunting trip with a friend. The plan was to get to the point where he had the deer in his sights, then he’d decide whether to pull the trigger or not. (Luckily for the deer, Max and his friend didn’t spot any on their hunt.)

A randomly generated suggestion from a source he knew to be completely arbitrary led him to entertain the idea of doing something he felt uncomfortable with.

If you know your pop psychology, this probably doesn’t shock you. The ease with which our morality can be influenced is well documented in experiments like Stanley Milgram’s obedience experiments or Philip Zimbardo’s Stanford prison experiment. Once we get used to following orders, it’s alarming how far we’ll allow ourselves to be pushed. Only now, those orders are controlled by algorithms so complex that no human is quite sure what they’ll suggest next.

Question your preferences

Most of us never think to question our preferences, and as a result, our actions are almost entirely guided by them. Just over 90% of iPhone users plan to stick with their iPhones when they upgrade. The same goes for 86% of Samsung phone users. Over 99% of U.S. voters in battleground states voted the same way in 2020 as they did in 2016.

Why is this? Did these people genuinely decide that their previous choice remains the best option for them? Do any of us? Research shows that often, we simply latch onto the first option we’re exposed to. Our subconscious then fills in the gaps retroactively, convincing us that we chose it deliberately. Liberal or conservative, iPhone or Android, that thing we like to do in bed — we tell ourselves that we’ve chosen these preferences even though we don’t truly understand where they came from. And in the process, we miss out on everything that falls outside those boundaries. Here’s how Max puts it in his TEDx Talk:

My preference had blinded me from the complexity and the richness of the world. And following the [computer’s suggestions] gave me the courage to live outside of my comfort zone. To discover parts of the human experience that I ignored because they were too different or “not for me.”

Following our preferences makes sense. They keep us comfortable and safe and even happy. But following them to the exclusion of everything else keeps us stagnant. Surprise, novelty, adventure — these are the things that give life meaning, and we can only find them outside the ordinary.

As computers and our habits work to keep us in our bubbles, there’s the danger that our lives become smaller. From music to politics to hobbies to culture, we spend most of our time seeing a fraction of the whole picture. This isn’t to say that it’s bad to have preferences, of course, but that it’s worth reminding ourselves to challenge them from time to time. It’s the algorithm’s job to keep life comfortable. But it’s up to us to keep things interesting.
