If you still use Facebook Messenger, then you need to be aware of the social media giant’s warning that an “essential” security update is now seriously delayed. And while you might be thinking about hanging on, what was seen as a simple update has become horribly twisted. For Messenger’s 1.3 billion users, is it finally time to quit?
If you’re a regular reader of this column, you’ll be well versed in the critical differences between Messenger and Facebook’s other hyper-scale messaging platform, WhatsApp. While the latter encrypts all the content between senders and recipients, ensuring that no-one—not even Facebook—can read any of it, Messenger does not. The company admits that it monitors content, and we recently exposed it for doing even worse.
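To make that distinction concrete, here is a minimal sketch of the end-to-end principle using the PyNaCl library. This is an illustration only, not Facebook’s or WhatsApp’s implementation (WhatsApp uses the Signal protocol): the point is simply that content is encrypted on the sender’s device with the recipient’s public key, so a server in the middle relays ciphertext it cannot read.

```python
# Minimal illustration of end-to-end encryption using PyNaCl (pip install pynacl).
# Illustrative sketch only; this is not how WhatsApp or Messenger is built.
from nacl.public import PrivateKey, Box

# Each party generates a keypair; only public keys ever leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"meet at 6pm")

# A relaying server only ever sees `ciphertext`, which it cannot decrypt
# without one of the private keys held on the devices.

# Bob decrypts with his private key and Alice's public key.
receiver_box = Box(bob_key, alice_key.public_key)
assert receiver_box.decrypt(ciphertext) == b"meet at 6pm"
```

A service like Messenger that does not work this way holds the content in readable form server-side, which is precisely what allows it to be monitored.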
“The first step of keeping people safe is, you have to have strong security,” WhatsApp boss Will Cathcart said last month, as the secure messenger launched yet another privacy campaign around the strength of its encryption. “We think governments shouldn’t be out there trying to encourage tech companies to offer weak security.”
So, on the face of it, Facebook’s promise back in early 2019 to extend WhatsApp’s encryption to protect Messenger content was a good thing, right? Many users and security professionals certainly think so. “We need encryption for all conversations, all platforms,” ESET’s Jake Moore tells this week’s Straight Talking Cyber—the video at the top of this story. “Otherwise, companies will be selling our data, profiting from it.”
But with Messenger, this isn’t as straightforward as it might seem. WhatsApp is a point-to-point messenger: you can only contact people whose numbers you know. With Messenger, you can search and browse the site, contacting those you do not.
Facebook’s most controversial update in years risks the platform “failing to protect children from avoidable harm,” the NSPCC, one of the world’s leading children’s advocacy groups, warned this week, telling me that the evidence it has seen suggests “a significant drop in reports of child abuse” on Facebook’s sites.
Where a messenger is directly linked to a social media site that hosts user profiles, especially where that includes minors, there are serious risks in shielding messaging content, in preventing that content from being monitored. This isn’t a theoretical risk. The investigators who brought U.K. pedophile David Wilson to justice this year say that he may not have been caught with Facebook’s expanded encryption in place.
U.K. children’s charity NSPCC tells me that “10% of child sexual offences on Facebook-owned platforms take place on WhatsApp, but they account for less than 2% of child abuse the company reports to police because they can’t see the content of messages.” We saw the impact this can have with the 58% reduction in child exploitation reports after the EU’s ePrivacy directive took effect. While an emergency reprieve has since addressed this, child advocates say it illustrates the impact encryption will have.
When asked about this issue, Facebook told me that “we’re building strong safety measures that are designed to prevent harm from happening in the first place and give people controls to respond if it does. Working together also gives us more information to identify abusive accounts and allows us to introduce safety features behind the scenes like restricting interactions between adults and minors.”
Facebook’s spokesperson also told me that encrypting Messenger would not impact its ability to flag and prevent such online harms. But earlier this year, the company’s head of global policy management, when asked by U.K. lawmakers whether cases of child abuse might “disappear” once encryption is in place, admitted: “I would expect the numbers to go down. If content is being shared and we don’t have access to that content, if it is content we cannot see then it is content we cannot report.”
A few weeks ago, the head of Britain’s MI5 warned that by encrypting Messenger, Facebook would give a “free pass” to “some of the worst people in our society.” And this week, the U.K. police lead on child protection has said much the same: “Unless the tech industry starts to take this really seriously, we are going to see the exponential growth in the numbers of images we are recording, the continuing abuse of children… and an ever growing number of people that have a sexual interest in children.”
But WhatsApp’s Cathcart warns that the downside of weakening encryption is worse. “Imagine there was a proposal from the government to put a video camera in every living room in a country hooked up to the internet,” he told the Guardian last month, “so the government can turn it on when they’re investigating a crime… Because it’s technical, sometimes the horror of what’s being proposed is lost.”
This debate is far from clear cut. On the one hand, messaging without end-to-end encryption in place puts content and privacy at risk. But, on the other hand, where that encryption is in place, and where adults might message minors, it introduces child safety risks. The real Messenger issue is child safety, grooming and radicalization, not messaging between bad actors engaged in plotting terrorism and serious crime.
The answer, in my view, is pretty obvious. Facebook has a secure messenger in the form of WhatsApp. This is the platform that popularized end-to-end encryption in the first place, and it uses this security as its USP. Facebook users can turn to WhatsApp to message those they know securely and privately. Facebook doesn’t need to offer the same level of “absolute” security for Messenger and Instagram. It can certainly protect against casual monitoring or interception, but it can also facilitate case-by-case interception or forensic examination through the use of protected backdoors.
As a security professional, it’s anathema to me to pull back from end-to-end encryption as a universal default. But as a father, it’s obvious we need to find some balance here. I keep returning to the fact that users can browse Facebook and click to message users at will. Vast numbers of children do not have privacy protections in place, and while Facebook tells me it can prevent adults from contacting unknown children, anecdotal data and sheer common sense tell me that’s not an absolute.
“The lessons of the past five years make it absolutely clear that technology companies and governments must prioritize private and secure communication,” Cathcart says. He’s right, of course, but it doesn’t have to be one size fits all. And whether that means not encrypting platforms like Messenger, or allowing backdoors or restricting encryption to adult accounts, options should be explored.
And this brings me to the real crux of the issue here. The serious danger for Facebook is that if it pushes forwards with expanding WhatsApp’s security to include Messenger, and if lawmakers insist that any such expansion comes with compromises, then there is a real risk that WhatsApp’s own security will be weakened.
Conversely, of course, Facebook seemingly takes the view that fully integrating its platforms is the best way to keep them under the same roof. Whether that changes in light of its recent antitrust victory remains to be seen.
Facebook should accept limitations on any messaging encryption expansion as a trade-off to protect WhatsApp. Messenger users reading this should switch their personal chats to WhatsApp (or Signal) and leave Messenger for trivial comms and kids. You should certainly not be using it as the default messaging platform if you’re on Android, where switching from the OS default is an option—unlike on iOS.
It makes sense for Facebook to integrate its messengers. Between them, Messenger, WhatsApp and Instagram serve half the world’s online population. And while privacy enhancements are always welcome, please don’t forget the profit motive in any such move. Messengers are sticky, they generate invaluable metadata, and they increasingly bring businesses and customers together. Combining three already giant platforms into the world’s messaging behemoth is not being done to protect our privacy.
Encrypting Facebook Messenger is an answer to a problem that doesn’t need to exist. Facebook can choose to better control its own data harvesting and monitoring, and it can continue to work with law enforcement to flag bad behaviors on its platform.
You absolutely should stick to encrypted messaging platforms. That means switching where you might be using Facebook Messenger, Telegram (which isn’t end-to-end encrypted by default) or SMS. It doesn’t mean we need to secure all communications against lawful interception, without any regard for the unintended consequences.
Source: www.forbes.com