Why and when did you think of writing this book?

I can give you the exact date. August 27, 2015. That was when I saw on my newsfeed a post by Mark Zuckerberg. A billion people had been on Facebook the day before. I had been looking at Facebook for almost a decade before that, and knew what it was trying to do. But the idea that a billion people, such a big percentage of the world, were on that one network on the same day is a lot different from, say, watching the World Cup. And I realised Zuckerberg was doing something no one had done before. And I figured I had to write this book and tell that story: how that happened, who they were, and what it meant.

But did your hypothesis change by the time you finished the book?

Yeah, dramatically. I had done a book about Google earlier. It worked very well. So, I thought I’d do it the same way — I’d get Facebook to give me access to the employees, and they would not be able to control what I’d write, or even read it before I was finished. It took me a while to convince Facebook, but I finally did. And I started almost a year to the day after that first post. I started in 2016 and went to Nigeria with Mark Zuckerberg. A couple of months later, the election happened in the US, and everything changed for Facebook. All of a sudden, the criticisms that had been festering exploded, and Facebook entered a period where it was under extreme criticism and had to own up to the mistakes it had made. So, it really did change the book a lot, and changed the way I had to report it.

What is the most significant change that happened over the course of writing the book?

The biggest change was the narrative switch from what was basically a success story to one where you had to examine how they got to that success and what decisions they made to achieve that amazing growth, which turned out to be a dangerous thing. How did that happen? It was the playwright Anton Chekhov who said that if you have hung a gun on the wall in the first act, then it should be fired in a following one. When I was reporting the book, these guns were going off, so I had to go back to Act One and see where the guns were planted. So that was how the book was done.

But access to the company can also come with riders...

There was no problem at all. There was no compromise. I just had this benefit of access and, obviously, I’m able to tell when I’m being spun, when they’re not telling me the whole truth. I was under no obligation to Facebook. It was information I wanted. There would have been compromises if I had been gullible and taken everything they said at face value. But I’d like to think that I know how to judge the information I get.

To some, Facebook is the best thing to have happened to technology in a long time. And to some others, it is the worst. What are your learnings?

I certainly wouldn’t call Facebook the best thing that happened to technology. Look at the arc of history. I started writing about technology just as the personal computer revolution was happening. There were great companies built on that. And then I looked at the Internet, and great companies were built on that. And now there’s the social revolution, and Facebook, which was built on that, is, for all its problems, a great company.

But the difference is that the later you get, the faster companies have the opportunity to grow and the more pervasive they become, because in technology one thing doesn’t just end before the next thing comes along. At each turn of the wheel, things get bigger and more powerful, affect people more, touch more people’s lives and change those lives more, because they’re built on the previous advance. So, the personal computer industry gets computers into people’s hands; the Internet connects all the computers; and the social revolution connects all the people — each, happening on top of the other, becomes more powerful.

So, each company winds up having a bigger influence on all of us. Each generation gets a bigger set of weapons or tools. So in a way, the story of Facebook isn’t just the story of Facebook, it’s the story of where technology is.

There are reasons to believe the sort of social experiment Facebook has introduced has taken us in some wrong directions, which even Facebook hadn’t imagined it would...

I like the idea that you call it an ‘experiment’. Because obviously, they didn’t intend to say this was an experiment. They said this was a business. “This is our vision”. But it really was an experiment. And in experiments you don’t know what’s going to happen. So, you have a responsibility. When you’re doing something so profoundly different and important and do not treat it like an experiment, you’re in trouble. Because it’s the real thing and you should try to anticipate the negative consequences that can come of what you do. That’s what Facebook did not do. That’s what Mark Zuckerberg, in particular, didn’t want to consider or would brush off saying he would deal with it later.

In the book, I describe instance after instance where Zuckerberg wasn’t interested in anticipating and dealing in advance with the potential consequences. So, in the past three years, they have had to apologise for not doing that. They say they are going to try to do that now. But because they didn’t run it with the understanding that Facebook is real and not an experiment, we’ve had to suffer for their mistakes.

How much of an overlap is there between Zuckerberg the person and Facebook, his product? If Mark were a product, would it look like FB?

You can’t pull them apart. It’s like weaving a thread into a fabric. If you pull the Mark Zuckerberg thread from Facebook, I don’t think you have anything. Facebook is Mark Zuckerberg’s image. In the book I talk about how, in 2011-2012, he roped in a few people who tried to figure out what the company’s values were, and the person in charge of it was a young woman named Molly Graham. She was the daughter of Donald E Graham, who was, at the time, CEO of The Washington Post. She said Mark was the value. In order to understand the values of Facebook, I had to understand the values of Zuckerberg. And those are the same values. You go to Facebook headquarters, and you see all these posters that say “move fast and break things”, “what would you do if you weren’t afraid?”, etc. These are all projections of Mark’s thinking. So, in the Venn diagram of Mark Zuckerberg and Facebook, it’s a pretty complete overlap.

Critics say Zuckerberg is very slow in trusting people. A lot of people in his generation, allegedly, seem to be cool about sharing. Does this psychology reflect in the overall character of Facebook?

That’s an interesting idea. It is true that Zuckerberg is slow to trust people. And to this day, for all important jobs, he relies on people he’s known for a long time. Take the coronavirus; the person at Facebook in charge of health is the one who literally had pizza with Zuckerberg on February 4, 2004, when Facebook was launched. So that shows. The irony is that while he is slow to trust others, he had this vision for the world where everyone should trust each other to share their information. And then you should trust Facebook with that information. So, I think what happened was he took a look at his generation and felt, yes, they’re ready to share, but when it came to him personally trusting, not so much.

Given its ubiquity and grand influence, we can say that our society is being curated by Facebook’s algorithms today. Do they realise the influence and are they concerned about it?

Facebook at certain points became “interested” — not exactly concerned — in this. That’s when they did the famous ‘mood study’, which I write about in the book. They tried to find out whether people would be in a better mood if their friends posted positive things on Facebook, or vice versa. They did an experiment on that, and drew a lot of criticism for tampering with people’s moods. But that’s exactly what it does. It has the unintended effect of driving us to outrage. The way the algorithms work is that we engage more with sensational things. It’s been well documented that fake news or make-believe stories get far more engagement than real stories from real publications. That’s a consequence Facebook originally embraced: things can go viral on Facebook, and they thought that was good. They didn’t realise until recently that they had to deal with it, and that it could be corrosive to society.

There is criticism that you have ignored the data mining practices of Facebook.

I could get to a lot of places — the book is over 500 pages — but I still could not get to everything. I don’t give a detailed analysis of how Facebook does that (data mining). This book is a story, and I did talk about that. But it’s not a guide to how Facebook does data. I’m not going to be putting in charts about where all the information comes from. Even Zuckerberg doesn’t know where the information comes from. But I talked about how he went to Congress, and they had questions about exactly that. He didn’t know the answer.

After the Cambridge Analytica scandal and all the controversy around political advertising on Facebook, what’s the future of FB going to look like? What are the solutions to its maladies?

I don’t have the solutions. I don’t think Zuckerberg does either. What I learned while writing the book is that Facebook didn’t have to be this way. Facebook, as we know it, is shaped by a number of decisions, and many of them were taken recklessly, or certainly without regard to the unintended consequences they could have. It’s so difficult to find a solution, because they painted themselves into a corner by creating such a product. It’s like how the Internet was built without thinking too much about security, and now it’s really difficult to get good Internet security. People inside Facebook were warning about this as it happened. And Zuckerberg went ahead and made a decision, figuring you could fix things later on.

So, what should be a consumer’s approach to Facebook today?


Title: Facebook: The Inside Story
Author: Steven Levy
Hardcover: 592 pages
Publisher: Blue Rider Press (25 February 2020)

I would recommend that everyone dive into the privacy settings and make sure that they’re not sharing more than they want to share. You can control that. Under pressure, over the years, Facebook has made those settings accessible. They were once really hard to understand; you almost needed a PhD in computer science to work the privacy settings, but not anymore. On Facebook, I concentrate a lot on the people I know and avoid the crowd.

Just in the past few weeks, because of the coronavirus, our world has changed so much, and in a strange way Facebook has become more useful. It’s better equipped to handle misinformation about the coronavirus and more responsible in being proactive about spreading good information about the virus. So, this is a tool that we could use not to scream at each other, or get worked up by made-up stories, but to be in touch with each other. Ever since I wrote the book, the story has changed so much. So, it is impossible to predict what’s in store for Facebook. It’s impossible to say whether there will even be a Facebook in five years.
