Is Google… evil?

Google’s motto is “don’t be evil”.

It’s written into its code of conduct.

Paul Buchheit, the creator of Gmail, came up with the motto in 2000.

He stated that it was: “A bit of a jab at a lot of the other companies, especially our competitors, who at the time, in our opinion, were kind of exploiting the users to some extent.”

Oh how times have changed.

While the world’s ire has been focused on Facebook’s exploitative data policies, Google has been given a mostly free ride.

But the thing is, the amount of data Google has on you makes Facebook’s haul look pathetic.

As Dylan Curran pointed out on Twitter, Google:

  • Stores every location you’ve ever visited with your phone.
  • Stores all your search history across all your devices. So even if you delete your searches on your phone, they’ll still be on your computer.
  • Creates an advertising profile based on your location, age, hobbies, career, interests, relationship status, and likely weight and income, so it can better target ads.
  • Stores information on every app and extension you use – where you use them, how often, and who you interact with through them.
  • Stores all your YouTube history – every video you’ve ever watched or partially watched. From this it can work out your political views and mental state.
  • Has a file of all your data that would fill around 3 million Word documents.
  • Also records and stores any images and files you download while browsing.
  • Stores every event you’ve ever put in your calendar, whether you attended, and what time you arrived.
  • Stores all your deleted Google Drive files.
  • Has a record of every step you’ve ever taken with your phone, if you’re using Google Fit.
  • Stores every photo and video you’ve ever taken or viewed on your phone, if you’re using Google Photos.
  • Has a file of every ad you’ve ever clicked or viewed.
  • Stores every email you’ve ever received, including ones you’ve deleted or marked as spam.

Curran also made a good point about what this information could be used for in the wrong hands.

But what if Google itself is “the wrong hands”?

Given this, you have to laugh at the irony of Paul Buchheit’s reason for creating the “don’t be evil” slogan.

Remember, he said it was: “A bit of a jab at a lot of the other companies, especially our competitors, who at the time, in our opinion, were kind of exploiting the users to some extent.”

Well, if all of the above isn’t “kind of exploiting users to some extent” I don’t know what is.

Maybe that’s why when Google restructured and created Alphabet as its parent company, Alphabet went for “do the right thing”, rather than “don’t be evil”.

The problem is, “doing the right thing” may actually end up with Google doing what many would consider to be evil.

Google hit the news a few months ago when it revealed it was providing artificial intelligence (AI) to the US military through something called “Project Maven”. That AI, it was claimed, would be used to kill people more efficiently.

Google released a statement saying its AI wouldn’t be used to directly kill people. As The Verge reported:

Google has described its work on Project Maven as “non-offensive,” and Diane Greene, the head of Google’s cloud operation who sits on Alphabet’s board of directors, said the technology will not be used to “operate or fly drones” and “will not be used to launch weapons.” But this is not enough for the many employees who signed the letter addressed to Pichai. “While this eliminates a narrow set of direct applications,” the letter reads, “the technology is being built for the military, and once it’s delivered it could easily be used to assist in these tasks.”

That was over a month ago, and since then things have hotted up. In fact, around “a dozen” Google employees have quit over Google’s involvement with the US military.

And according to an open letter, which has been signed by over 300 academics, there are over 3,100 Google employees opposed to its participation in Project Maven.

If you’re not aware of why so many tech experts and academics are calling for Google to not work on autonomous weapon systems, you should watch the video below.

(Click to watch on YouTube. But remember, Google will be watching you watching it.)

That video shows how mini drones, like those used all around the world in drone racing, could be fitted with explosives and controlled through AI. Swarms of them could kill any army or assassinate any figure worldwide.

Traditional armies and weapons would be useless against a swarm of thousands of mini explosive drones. And the crazy part is, the technology to build these swarms of “slaughterbots” already exists.

It’s no surprise that Google employees are quitting Google over this. Imagine knowing you helped create a weapon that could be used to kill any person, or any community of people, on the planet at will.

Although, there’s always the argument that “if we don’t develop it first someone else will.” Still, it’s a long way from Google’s “don’t be evil” motto, isn’t it?

And then we have the revelation that came out this week about how Google plans to use all the data it has on you. Something called the “Selfish Ledger”.

Google’s “Selfish Ledger” is the stuff of a dystopian nightmare

Earlier this week a video surfaced of an internal Google video from back in 2016.

Called the Selfish Ledger, it equates the selfish gene theory – whereby we are just vessels for our genes to survive – with a data equivalent.

In Google’s vision, we are no more than vessels to be used by the ledger. The ledger is the store of all our data.

Just think about that for a second. In Google’s vision, our data is more important, valuable and has more right to life than us. We are merely its earthly custodians.

Directly transcribed from the video:

“User-centered design principles have dominated the world of computing for many decades, but what if we looked at things a little differently? What if the ledger could be given a volition or purpose rather than simply acting as a historical reference? What if we focused on creating a richer ledger by introducing more sources of information? What if we thought of ourselves not as the owners of this information, but as custodians, transient carriers, or caretakers?”

You can watch the full video on The Verge, here.

This video was never meant to be seen by the public, and once you watch it, you’ll understand why.

If you don’t want to watch it, here’s a summary, from The Verge’s article:

The video was made in late 2016 by Nick Foster, the head of design at X (formerly Google X) and a co-founder of the Near Future Laboratory. The video, shared internally within Google, imagines a future of total data collection, where Google helps nudge users into alignment with their goals, custom-prints personalized devices to collect more data, and even guides the behavior of entire populations to solve global problems like poverty and disease.

This project is much more fitting with Alphabet’s new motto: “do the right thing”. Alphabet seems to believe that it knows what’s best for the human race, and it will use all the endless data it has on us to make us comply with its vision.

Of course, this “Selfish Ledger” was never developed. And never will be. It was merely “a thought experiment”, Google said, once the video came to light.

“You either die a hero or live long enough to see yourself become the villain”

When I started writing this article, I just thought it was funny that Google seems to have gone against its own “don’t be evil” motto.

I thought it would be sort of a fun, tongue-in-cheek piece. But the more I’ve researched it, the more I find myself questioning if Google really is, well, evil. At least to some extent.

The question doesn’t seem quite as funny to me now.

The whole thing reminds me of one of the key themes from Christopher Nolan’s The Dark Knight.

If you haven’t seen the film, one of the main characters is “a good man”. An unquestionably moral man who has a strong character and sense of right and wrong.

The Joker, played by the late Heath Ledger, sees this good man and wants to break him. He wants to show that even a good man has his limits. He wants to prove that there is no objective good and evil.

And so he puts this good man through a series of ordeals. The good man loses his morality and becomes a total nihilist. Once devoid of his morals, he becomes a killer who decides whether people should live or die with the flip of a coin.

In a moment of clarity, this once good man explains what happened to him. And in doing so explains what happens to many once “good” people and “good” organisations:

“You either die a hero, or you live long enough to see yourself become the villain,” he says.

History is full of well-meaning monsters who believed they were doing the right thing, but who in hindsight would be considered villains.

I wonder if that’s how people will one day look back on Google, or on any of our current internet giants, for that matter.

Now, having said all this, I just paid to upgrade my Google Drive storage from 100GB to 1TB, using my Android phone and Google Pay.

Is Google turning into the villain? Maybe, but I’d argue many people, myself included, will find it too much effort to look for a better solution.

At the end of the day, we’re all hypocrites. Just like Google.

Until next time,

Harry Hamburg
Editor, Exponential Investor


From time to time we may tell you about regulated products issued by Southbank Investment Research Limited. With these products your capital is at risk. You can lose some or all of your investment, so never risk more than you can afford to lose. Seek independent advice if you are unsure of the suitability of any investment. Southbank Investment Research Limited is authorised and regulated by the Financial Conduct Authority. FCA No 706697.

© 2019 Southbank Investment Research Ltd. Registered in England and Wales No 9539630. VAT No GB629 7287 94.
Registered Office: 2nd Floor, Crowne House, 56-58 Southwark Street, London, SE1 1UN.
