Mark Zuckerberg, Facebook CEO: “It is time for us to adopt a new company brand to encompass everything that we do, to reflect who we are and what we hope to build. I am proud to announce that starting today, our company is now Meta. Our mission remains the same. It’s still about bringing people together. Our apps and their brands, they’re not changing either and we are still the company that designs technology around people.”
“Building our social media apps will always be an important focus for us. But right now, our brand is so tightly linked to one product that it can’t possibly represent everything that we’re doing today, let alone in the future. Over time, I hope that we’re seen as a metaverse company and I want to anchor our work and our identity on what we’re building towards.”
“From now on, we’re going to be metaverse first, not Facebook first. That means that over time you won’t need to use Facebook to use our other services. As our new brand starts showing up in our products, I hope that people come to know the Meta brand and the future that we stand for. It’s a future that is beyond one company and that will be made by all of us.”
“We’ve built things that have brought people together in new ways. We’ve learned a lot from struggling with social issues and living under closed platforms. Now it is time to take everything that we’ve learned and help build the next chapter. I’m dedicating our energy to this more than any other company in the world. If this is the future that you want to see, I’m hoping that you will join us, because the future is going to be beyond anything that we can imagine.”
Omar Akhtar, Research Director, Altimeter: “Instead of acknowledging in very explicit terms all the negative publicity and the reasons for that negative publicity, Facebook has very much decided to leave everything behind and go forward and you can see that in the name change, Meta, which means beyond. They’re really looking to dazzle us with everything that they’re going to do and it’s not even in the near future. This is very much five years out, maybe even ten years out. The kind of stuff that they’re talking about isn’t close to being a reality.”
“But as in life, and this applies to tech companies as well, you can’t just leave the dirty stuff aside or sweep it under the rug or put it to one side and say you’re just going to focus on the good stuff. You have to address those things head on, and that’s something we didn’t see in this talk from Zuckerberg and company.”
“What this does is it allows Zuckerberg to say: look, you can get away from that toxic old Facebook platform, and we can appeal to you, Gen Z and all you TikTokkers out there, we’ve got something new and exciting for you. You can see a lot of this was really geared towards that youth generation that Facebook, or rather Meta now, is trying so hard to go after.”
Antigone Davis, Global Head of Safety at Facebook: “One of the things that I think is very important in all of this is allowing independent researchers to be able to do work to look at the impact, aside from us.”
John Nicolson, Member of Joint Committee on the Draft Online Safety Bill: “Agreed, agreed. So why are you not letting them do more of it?”
Davis: “We are working to try to do that. So, we’re working with academic institutions to…”
Nicolson: “What does that mean? ‘We’re working towards it’ means ‘we’re not doing it now’. Why are you not doing it now?”
Davis: “Because there are privacy obligations we have around people’s data and what we’d like to be able to do is provide that data to researchers in a way that meets those privacy obligations. Again, I think this is actually an area where a regulator like Ofcom could help to say to us: ‘Here are the privacy obligations that have to be met.’ We can show that we’ve met those obligations and then enable researchers to do the work that they want to do.”
Nicolson: “You’re not coping with this. You should be handing over your research to folk who can do better than you in assessing the damage that Facebook and Instagram are doing.”
Davis: “You are correct that somewhere between about 12% and 30% of people, on the various issues, identify having issues. We actually do take those things quite seriously. So, in the context of one of the issues, let’s just take eating disorders for a moment, because that’s the one with the highest number. We actually don’t serve weight-loss ads to teens. We actually surface resources to teens who may be searching for this content. We work with experts to identify how we can put proper warning systems in place for people, surface that content, and try not to recommend that content. If you look at something like suicide and self-harm ideation, another very serious issue, not only do we surface resources when someone searches for it, but we’ve actually built in the opportunity right within our platform to identify that content, to flag that…”
Nicolson: “Listen, it’s not working. It’s not working, because the figures are too high.”
Davis: “Quite the contrary. It’s my experience that we take down these pages because they violate our policies. We’ve invested in AI to identify those pages…”
Nicolson: “Right, why did you only take them down after Apple issued you with that threat?”
Davis: “Our AI is not perfect. It’s something that we’re continuing to always improve.”
Nicolson: “You can say that again.”
Davis: “People also flag this content for us, and when they flag this content for us, we will also remove it.”
Nicolson: “Come on, that wasn’t Apple flagging the content. Apple threatened you with something financially disastrous for you, which was to have Facebook removed from the App Store. That’s when you leapt into action.”