
Mark Zuckerberg on Facebook’s hardest year, and what comes next

“We will dig through this hole, but it will take a few years.”

It’s been a tough year for Facebook. The social networking juggernaut found itself engulfed by controversies over fake news, electoral interference, privacy violations, and a broad backlash to smartphone addiction. Wall Street has noticed: The company has lost almost $100 billion in market value in recent weeks.

Behind Facebook’s hard year is a collision between the company’s values, ambitions, business model, and mind-boggling scale. Mark Zuckerberg, the founder of Facebook, has long held that the company’s mission is to make the world more open and connected — with the assumption being that a more open and connected world is a better world. That assumption has been sorely tested over the past year. As we’ve seen, a more open world can make it easier for governments to undermine each other’s elections from afar; a more connected world can make it easier to spread hatred and incite violence.

In 2017, Facebook hit more than 2 billion monthly users — and that’s to say nothing of the massive user bases of Facebook-owned properties like Instagram and WhatsApp. There is no way to track, or even understand, all that is happening on Facebook at any given time. Problems that look small in the moment — like organized disinformation campaigns mounted by Russia — reveal themselves, in retrospect, to be massive, possibly even world-changing, events.


I spoke with Zuckerberg on Friday about the state of his company, the implications of its global influence, and how he sees the problems ahead of him.

“I think we will dig through this hole, but it will take a few years,” Zuckerberg said. “I wish I could solve all these issues in three months or six months, but I just think the reality is that solving some of these questions is just going to take a longer period of time.”

But what happens then? What has this past year meant for Facebook’s future? In a 2017 manifesto, Zuckerberg argued that Facebook would help humanity take its “next step” by becoming “the social infrastructure” for a truly global community.

Remarkably, Facebook’s scale makes this a plausible vision. But it comes with a dark side: Has Facebook become too big to manage, and too dangerous when it fails? Should the most important social infrastructure of the global community be managed by a single company headquartered in Northern California? And does Zuckerberg’s optimism about human nature and the benefits of a connected world make it harder for him to see the harm Facebook can cause?

The full conversation with Zuckerberg can be heard on my podcast, The Ezra Klein Show. A transcript, lightly edited for length and clarity, follows.

Ezra Klein

I want to begin with something you said recently in an interview, which is that Facebook is now more like a government than a traditional company. Can you expand on that idea?

Mark Zuckerberg

Sure. People share a whole lot of content and then sometimes there are disputes between people around whether that content is acceptable, whether it’s hate speech or valid political speech; whether it is an organization which is deemed to be a bad or hateful or terrorist organization or one that’s expressing a reasonable point of view.

I think more than a lot of other companies, we’re in a position where we have to adjudicate those kinds of disputes between different members of our community. And in order to do that, we’ve had to build out a whole set of policies and governance around how that works.

But I think it’s actually one of the most interesting philosophical questions that we face. With a community of more than 2 billion people all around the world, in every different country, where there are wildly different social and cultural norms, it’s just not clear to me that we, sitting in an office here in California, are best placed to always determine what the policies should be for people all around the world. And I’ve been working on and thinking through: How can you set up a more democratic or community-oriented process that reflects the values of people around the world?

That’s one of the things that I really think we need to get right. Because I’m just not sure that the current state is a great one.


Ezra Klein

I’d love to hear more about where your thinking is on that because when Facebook gets it wrong, the consequences are on the scale of when a government gets it wrong. Elections can lose legitimacy, or ethnic violence can break out.

It makes me wonder, has Facebook just become too big and too vast and too consequential for normal corporate governance structures, and also normal private company incentives?

Mark Zuckerberg

We’re continually thinking through this. As the internet gets to a broader scale and some of these services reach a bigger scale than anything has before, we’re constantly confronted with new challenges. I try to judge our success not by, “Are there no problems that come up?” But, “When an issue comes up, can we deal with it responsively and make sure that we can address it so that those kinds of issues don’t come up again in the future?”

You mentioned our governance. One of the things that I feel really lucky we have is this company structure where, at the end of the day, it’s a controlled company. We are not at the whims of short-term shareholders. We can really design these products and decisions with what is going to be in the best interest of the community over time.

Ezra Klein

That is one of the ways Facebook is different, but I can imagine reading it both ways. On the one hand, your control of voting shares makes you more insulated from short-term pressures of the market. On the other hand, you have a lot more personal power. There’s no quadrennial election for CEO of Facebook. And that’s a normal way that democratic governments ensure accountability. Do you think that governance structure makes you, in some cases, less accountable?

Mark Zuckerberg

I certainly think that’s a fair question. My goal here is to create a governance structure around the content and the community that reflects more what people in the community want than what short-term-oriented shareholders might want. And if we do that well, then I think that could really break ground on governance for an internet community. But if we don’t do it well, then I think we’ll fail to handle a lot of the issues that are coming up.

Here are a few of the principles. One is transparency. Right now, I don’t think we are transparent enough around the prevalence of different issues on the platform. We haven’t done a good job of publishing and being transparent about the prevalence of those kinds of issues, and the work that we’re doing and the trends of how we’re driving those things down over time.

A second is some sort of independent appeal process. Right now, if you post something on Facebook and someone reports it and our community operations and review team looks at it and decides that it needs to get taken down, there’s not really a way to appeal that. I think in any kind of good-functioning democratic system, there needs to be a way to appeal. And I think we can build that internally as a first step.

But over the long term, what I’d really like to get to is an independent appeal. So maybe folks at Facebook make the first decision based on the community standards that are outlined, and then people can get a second opinion. You can imagine some sort of structure, almost like a Supreme Court, that is made up of independent folks who don’t work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world.


Ezra Klein

One thing that has been damaging for Facebook over the past year is that a concern will arise and initially the answer is, “Very, very few people saw fake news.” Or, “Very, very few people saw anything from Russia-related bots.” And then slowly it comes out, “No, actually it was more. Millions. Maybe hundreds of millions.”

The problem wasn’t the lack of transparency; it was how to know we could trust what was coming out. And one of the reasons I’m interested to hear you broach the idea of independent institutions is I wonder if part of transparency has to be creating modes of information that are independent.

Mark Zuckerberg

Yeah, I think that’s a good point. And I certainly think what you’re saying is a fair criticism. It’s tough to be transparent when we don’t first have a full understanding of where the state of some of the systems [is]. In 2016, we were behind on understanding and on operational excellence in preventing things like misinformation and Russian interference. And you can bet that’s a huge focus for us going forward.

Right now in the company, I think we have about 14,000 people working on security and community operations and review, just to make sure that we can really nail down some of those issues that we had in 2016.

After the 2016 US elections, a number of months later, there were the French elections. And for that, we spent a bunch of time developing new AI tools to find the kind of fake accounts spreading misinformation and we took down — I think it was more than 30,000 accounts, and I think the reports out of France were that people felt like that was a much cleaner election on social media.

A few months later, there were the German elections. And there, we augmented the playbook again to work directly with the election commission in Germany. If you work with the government in a country, they’ll really actually have a fuller understanding of what is going on and what are all the issues that we would need to focus on.

And then fast-forward to last year, 2017, and the special election in Alabama. We deployed a number of new tools that we’d developed to find fake accounts who were trying to spread false news, and we got them off before a lot of the discussion around the election. And again, I think we felt a lot better about the result there.


Ezra Klein

Let me ask you about your tools to punish that misbehavior, though. The risk-reward of manipulating a national election using Facebook is very high. If you’re Russia and you get caught hacking into our election systems — which they also tried to do — and you fail and Hillary Clinton wins, the consequences of that can be really severe. Sanctions could be tremendous, and you could even imagine something like that escalating into armed conflict.

If you do this on Facebook, maybe you get caught and your bots get shut down, but Facebook, in not being a government, really doesn’t have the ability to punish. If Cambridge Analytica messes with everybody’s privacy, you can’t throw them in jail in the way that, if you’re a doctor and you repeatedly violate HIPAA, the government makes sure you face very severe legal consequences. So do you have capacity to do not just detection but sanction? Is there a way to increase the cost of using your platform for these kinds of efforts?

Mark Zuckerberg

I can walk through how we’re basically approaching this.

There are three big categories of fake news. There’s a group of people who are like spammers. These are the people who, in pre-social media days, would’ve been sending you Viagra emails. The basic playbook that you want to run on that is just make it non-economical. So the first step, once we realized that this was an issue, was a number of them ran Facebook ads on their webpages. We immediately said, “Okay. Anyone who’s even remotely sketchy, no way are you going to be able to use our tools to monetize.” So the amount of money that they made went down.

Then they’re trying to pump this content into Facebook with the hopes that people will click on it and see ads and make money. As our systems get better at detecting this, we show the content less, which drives the economic value for them down. Eventually, they just get to a point where they go and do something else.

The second category is state actors. That’s basically the Russian interference effort. And that is a security problem. You never fully solve it, but you strengthen your defenses. You get rid of the fake accounts and the tools that they have. We can’t do this all by ourselves, so we try to work with local governments everywhere who have more tools to punish them and have more insight into what is going on across their country so that they can tell us what to focus on. And that one I feel like we’re making good progress on too.

Then there’s the third category, which is the most nuanced, which are basically real media outlets who are saying what they think is true but have varying levels of accuracy or trustworthiness. And that is actually the most challenging portion of the issue to deal with. Because there, I think, there are quite large free speech issues. Folks are saying stuff that may be wrong, but they mean it, they think they’re speaking their truth, and do you really want to shut them down for doing that?

So we’ve been probably the most careful on that piece. But this year, we’ve rolled out a number of changes to News Feed that try to boost in the ranking broadly trusted news sources. We’ve surveyed people across the whole community and asked them whether they trust different news sources.

Take the Wall Street Journal or New York Times. Even if not everyone reads them, the people who don’t read them typically still think they’re good, trustworthy journalism. Whereas if you get down to blogs that may be on more of the fringe, they’ll have their strong supporters, but people who don’t necessarily read them often don’t trust them as much.


Ezra Klein

I’m somebody who came up as a blogger and had a lot of love for the idea of the open internet and the way the gates were falling down. One thing I hear when I listen to the third solution there is that it also creates a huge return to incumbency.

If you’re the New York Times and you’ve been around for a long time and you’re well-known, people trust you. If you’re somebody who wants to begin a media organization two months from now, people don’t know if they can trust you yet. If Facebook is the way people get their news, and the way Facebook ranks its News Feed is by privileging news people already trust, it’s going to be a lot harder for new organizations to break through.

Mark Zuckerberg

That’s an important point that we spend a lot of time thinking about. One of the great things about the internet and the services we’re trying to build is that you’re giving everyone a voice. That’s so deep in our mission. We definitely think about that in all the changes that we’re making.

I think it’s important to keep in mind that of all the strategies that I just laid out, they’re made up of many different actions, which each have relatively subtle effects. So the broadly trusted shift that I just mentioned, it changes how much something might be seen by, I don’t know, just call it in the range of maybe 20 percent.

What we’re really trying to do is make it so that the content that people see is actually really meaningful to them. And one of the things I think we often get criticized for, incorrectly in this case, is people saying, “Hey, you’re just ranking the system based on what people like and click on.”

That’s actually not true. We moved past that many years back. There was this issue with clickbait, where there were a bunch of publications that would push content into Facebook, [and] people would click on them because they had sensational titles but then would not feel good about having read that content. So that was one of the first times that those basic metrics around clicks, likes, and comments on the content really stopped working to help us show the most meaningful content.

The way that this works today, broadly, is we have panels of hundreds or thousands of people who come in and we show them all the content that their friends and pages who they follow have shared. And we ask them to rank it, and basically say, “What were the most meaningful things that you wish were at the top of feed?”

And then we try to design algorithms that just map to what people are actually telling us is meaningful to them. Not what they click on, not what is going to make us the most revenue, but what people actually find meaningful and valuable. So when we’re making shifts — like the broadly trusted shift — the reason why we’re doing that is because it actually maps to what people are telling us they want at a deep level.

Ezra Klein

One of the things that has been coming up a lot in the conversation is whether the business model of monetizing user attention is what is letting in a lot of these problems. Tim Cook, the CEO of Apple, gave an interview the other day and he was asked what he would do if he was in your shoes. He said, “I wouldn’t be in this situation,” and argued that Apple sells products to users, it doesn’t sell users to advertisers, and so it’s a sounder business model that doesn’t open itself to these problems.

Do you think part of the problem here is the business model where attention ends up dominating above all else, and so anything that can engage has powerful value within the ecosystem?

Mark Zuckerberg

You know, I find that argument, that if you’re not paying then somehow we can’t care about you, to be extremely glib and not at all aligned with the truth. The reality here is that if you want to build a service that helps connect everyone in the world, then there are a lot of people who can’t afford to pay. And therefore, as with a lot of media, having an advertising-supported model is the only rational model that can support building this service to reach people.

That doesn’t mean that we’re not primarily focused on serving people. I think probably to the dissatisfaction of our sales team here, I make all of our decisions based on what’s going to matter to our community and focus much less on the advertising side of the business.

But if you want to build a service which is not just serving rich people, then you need to have something that people can afford. I thought Jeff Bezos had an excellent saying on this in one of his Kindle launches a number of years back. He said, “There are companies that work hard to charge you more, and there are companies that work hard to charge you less.” And at Facebook, we are squarely in the camp of the companies that work hard to charge you less and provide a free service that everyone can use.

I don’t think at all that that means that we don’t care about people. To the contrary, I think it’s important that we don’t all get Stockholm syndrome and let the companies that work hard to charge you more convince you that they actually care more about you. Because that sounds ridiculous to me.


Ezra Klein

So I’m also within an advertising model, and I have a lot of sympathy for the advertising model. But I also think the advertising model can blind us. It creates incentives that we operate under and justify. And one of the questions I wonder about is whether diversifying the model doesn’t make sense. If I understand, and I might not, WhatsApp, which is also part of Facebook, is subscription, right? People pay a small amount?

Mark Zuckerberg

No, we actually got rid of that.

Ezra Klein

Well, see, there you go. Shows what I know.

Mark Zuckerberg

But keep going.

Ezra Klein

The broader point I want to make is that you don’t need to only serve rich people to diversify away from just being about attention. And when it is about attention, when it is about advertising, when you need to show growth to Wall Street, that does pull you toward getting more and more and more of people’s attention over time.

I did an interview with Tristan Harris, who’s been a critic of Facebook. And we were talking about your announcement that some of the changes you’re making have brought down, a little bit, the amount of time people are spending on the platform. And he made the point, “You know that’s great. But he couldn’t do that by 50 percent. Wall Street would freak out; his board would freak out.” There are costs to this model, and I do wonder how you think about at least protecting yourself against some of them dominating in the long run.

Mark Zuckerberg

Well, I think our responsibility here is to make sure that the time people spend on Facebook is time well spent. We don’t have teams who have, as their primary goal, making it so people spend more time. The way I design the goals for the teams is that you try to build the best experience you can. I don’t think it’s really right to assume that people spending time on a service is bad. But at the same time, I also think maximizing the time that people spend is not really the goal either.

In the last year, we’ve done a lot of research into what drives well-being for people. And what uses of social networks are correlated with happiness and long-term measures of health and all the measures of well-being that you’d expect, and what areas are not as positive.

And the thing we’ve found is that you can break Facebook and social media use into two categories. One is where people are connecting and building relationships, even if it’s subtle, even if it’s just I post a photo and someone I haven’t talked to in a while comments. That person is reminding me that they care about me.

The other part of the use is basically content consumption. So that’s watching videos, reading news, passively consuming content in a way where you’re not actually interacting with anyone or building a relationship. And what we find is that the things that are about interacting with people and building relationships end up being correlated with all of the measures of long-term well-being that you’d expect, whereas the things that are primarily just about content consumption, even if they’re informative or entertaining and people say they like them, are not as correlated with long-term measures of well-being.

So this is another shift we’ve made in News Feed and our systems this year. We’re prioritizing showing more content from your friends and family first, so that way you’ll be more likely to have interactions that are meaningful to you and that more of the time you’re spending is building those relationships.

That change actually took time spent down a little bit. That was part of what I was talking about on that earnings call. But over the long term, even if time spent goes down, if people are spending more time on Facebook actually building relationships with people they care about, then that’s going to build a stronger community and build a stronger business, regardless of what Wall Street thinks about it in the near term.


Ezra Klein

I want to ask you another question about the advertising model, and this one is trickier because it bears very directly on my industry. Something I’ve seen recently has been a perception at Facebook that a lot of the critical coverage from the media comes from journalists angry that Facebook is decimating the advertising market that journalism depends on. And there is that view. The publisher of Dow Jones, Will Lewis, said that the diversion of advertising dollars into Facebook and Google is killing news and that it has to stop.

Is he right or wrong? And given that so much of the advertising on Facebook wraps around news that journalism organizations are paying to report and publish, what responsibility do you feel you have to the people creating real news for their business model to work, given that their products create value, not just for the world but for Facebook itself?

Mark Zuckerberg

So I do think a big responsibility that we have is to help support high-quality journalism. And that’s not just the big traditional institutions, but a big part of what I actually think about when I’m thinking about high-quality journalism is local news. And I think that there are almost two different strategies in terms of how you address that.

For the larger institutions, and maybe even some of the smaller ones as well, subscriptions are really a key point on this. I think a lot of these business models are moving toward a higher percentage of subscriptions, where the people who are getting the most value from you are contributing a disproportionate amount to the revenue. And there are certainly a lot of things that we can do on Facebook to help people, to help these news organizations, drive subscriptions. And that’s certainly been a lot of the work that we’ve done and we’ll continue doing.

In local news, I think some of the solutions might be a little bit different. But I think it’s easy to lose track of how important this is. There’s been a lot of conversation about civic engagement changing, and I think people can lose sight of how closely tied that can be to local news. In a town with a strong local newspaper, people are much more informed; they’re much more likely to be civically active. On Facebook, we’ve taken steps to show more local news to people. We’re also working with them specifically, creating funds to support them and working on both subscriptions and ads that should hopefully create a more thriving ecosystem.

Ezra Klein

I’ve been thinking a lot, in preparing for this interview, about the 2017 manifesto where you said you wanted Facebook to help humankind take its next step. You wrote that “progress now requires humanity coming together, not just as cities or nations, but also as a global community,” and suggested that Facebook could be the social infrastructure for that evolution.

In retrospect, I think a key question here has become whether creating infrastructure where all the tensions of countries and ethnicities and regions and ideologies can more easily collide into each other will actually help us become that global community or whether it will further tear us apart. Has your thinking on that changed at all?

Mark Zuckerberg

Sure. I think over the last few years, the political reality has been that there’s a lot of people feeling left behind. And there’s been a big rise of isolationism and nationalism that I think threatens the global cooperation that will be required to solve some of the bigger issues, like maintaining peace, addressing climate change, eventually collaborating a lot in accelerating science and curing diseases and eliminating poverty.

So this is a huge part of our mission. One of the things I found heartening is if you ask millennials what they identify the most with, it’s not their nationality or even their ethnicity. The plurality identifies as a citizen of the world. And that, I think, reflects the values of where we need to go in order to solve some of these bigger questions.

So now the question is how do you do that? I think it’s clear that just helping people connect by itself isn’t always positive. A much bigger part of the focus for me now is making sure that as we’re connecting people, we are helping to build bonds and bring people closer together, rather than just focused on the mechanics of the connection and the infrastructure.

There’s a number of different pieces that you need to do here. I think civic society basically starts bottom-up. You need to have well-functioning groups and communities. We’re very focused on that. You need a well-informed citizenry, so we’re very focused on the quality of journalism, that everyone has a voice, and that people can get access to the content they need. That, I think, ends up being really important.

Civic engagement, both being involved in elections and increasingly working to eliminate interference and different nation-states trying to interfere in each other’s elections, ends up being really important. And then I think part of what we need to do is work on some of the new types of governance questions that we started this conversation off with because there hasn’t been a community like this that has spanned so many different countries.

So those are some of the things that I’m focused on. But right now a lot of people aren’t as focused on connecting the world or bringing countries closer together as maybe they were a few years back. And I still view that as an important part of our vision for where the world should go — that we do what we can to stay committed to that and hopefully can help the world move in that direction.


Ezra Klein

One of the scary stories I’ve read about Facebook over the past year is that it had become a real source of anti-Rohingya propaganda in Myanmar, and thus become part of an ethnic cleansing. Phil Robertson, who’s a deputy director of Human Rights Watch in Asia, made the point that Facebook is dominant for news information in Myanmar but Myanmar is not an incredibly important market for Facebook. It doesn’t get the attention we give things that go wrong in America. I doubt you have a proportionate amount of staff in Myanmar to what you have in America. And he said the result is you end up being like “an absentee landlord” in Southeast Asia.

Is Facebook too big to effectively manage its global scale in some of these other countries, the ones we don’t always talk about in this conversation?

Mark Zuckerberg

So one of the things I think we need to get better at as we grow is becoming a more global company. We have offices all over the world, so we’re already quite global. But our headquarters is here in California and the vast majority of our community is not even in the US, and it’s a constant challenge to make sure that we’re putting due attention on all of the people in different parts of the community around the world.

The Myanmar issues have, I think, gotten a lot of focus inside the company. I remember, one Saturday morning, I got a phone call and we detected that people were trying to spread sensational messages through — it was Facebook Messenger in this case — to each side of the conflict, basically telling the Muslims, “Hey, there’s about to be an uprising of the Buddhists, so make sure that you are armed and go to this place.” And then the same thing on the other side.

So that’s the kind of thing where I think it is clear that people were trying to use our tools in order to incite real harm. Now, in that case, our systems detect that that’s going on. We stop those messages from going through. But this is certainly something that we’re paying a lot of attention to.


Ezra Klein

I think if you go back a couple years in technology rhetoric, a lot of the slogans people had that were read optimistically have come to take on darker connotations too. The idea that “anything is possible.” Our sense of what “anything” means there has become wider. Or the idea that you want to make the world more open and connected — I think it’s become clearer that an open and connected world could be a better world or it could be a worse world.

So, when you think about the 20-year time frame, what will you be looking for to see if Facebook succeeded, if it actually made the world a better place?

Mark Zuckerberg

Well, I don’t think it’s going to take 20 years. I think the basic point that you’re getting at is that we’re really idealistic. When we started, we thought about how good it would be if people could connect, if everyone had a voice. Frankly, we didn’t spend enough time investing in, or thinking through, some of the downside uses of the tools. So for the first 10 years of the company, everyone was just focused on the positive.

I think now people are appropriately focused on some of the risks and downsides as well. And I think we were too slow in investing enough in that. It’s not like we did nothing. I mean, at the beginning of last year, I think we had 10,000 people working on security. But by the end of this year, we’re going to have 20,000 people working on security.

In terms of resolving a lot of these issues, it’s just a case where, because we didn’t invest enough, I think we will dig through this hole, but it will take a few years. I wish I could solve all these issues in three months or six months, but I just think the reality is that solving some of these questions is just going to take a longer period of time.

Now, the good news there is that we really started investing more, at least a year ago. So if it’s going to be a three-year process, then I think we’re about a year in already. And hopefully, by the end of this year, we’ll have really started to turn the corner on some of these issues.

But getting back to your question, I think human nature is generally positive. I’m an optimist in that way. But there’s no doubt that we have a responsibility to amplify the good parts of what people can do when they connect, and to mitigate and prevent the bad things that people might do to try to abuse each other.

And over the long term, I think that’s the big question. Have we enabled people to come together in new ways — whether that’s creating new jobs, creating new businesses, spreading new ideas, promoting a more open discourse, allowing good ideas to spread through society more quickly than they might have otherwise? And on the other side, did we do a good job of preventing the abuse? Of making it so that governments aren’t interfering in each other’s civic elections and processes? Are we eliminating, or at least dramatically reducing, things like hate speech?

We’re in the middle of a lot of issues, and I certainly think we could’ve done a better job so far. I’m optimistic that we’re going to address a lot of those challenges, and that we’ll get through this, and that when you look back five years from now, 10 years from now, people will look at the net effect of being able to connect online and have a voice and share what matters to them as just a massively positive thing in the world.

