In private Facebook groups devoted to natural treatments for cancer and other ailments, hundreds of thousands of members tell each other that baking soda, apple cider vinegar, and frankincense are cures that doctors don’t want you to know about. Parents of children with autism have their own groups devoted to unscientific treatments — like swallowing bleach — that they believe will “heal” their children.
Facebook on Tuesday announced that it was taking steps to limit the reach of these false and sometimes dangerous claims by treating them as similar to clickbait or spam. The announcement was the latest move by Facebook, which, along with Google, has recently started taking more aggressive action against medical misinformation on its platform, where it thrived for years.
Facebook will “down-rank” posts that it believes contain health misinformation, meaning those posts will appear in the news feeds of fewer users, and less prominently. The down-ranking will also apply to some posts from Facebook groups devoted to natural treatments, which will show up less often in the news feeds of group members.
The down-ranking process will use keywords and phrases that commonly appear in posts containing exaggerated or false health claims, but which tend not to appear in posts containing accurate information on those topics. Facebook’s News Feed algorithms will use those suspicious phrases, which the company has identified with the help of health-care professionals, to predict which posts might contain sensational health claims.
“Misleading health content is particularly bad for our community,” Travis Yeh, a Facebook product manager, wrote in a blog post. “So, last month we made two ranking updates to reduce (1) posts with exaggerated or sensational health claims and (2) posts attempting to sell products or services based on health-related claims.”
Medical misinformation isn’t a problem that Facebook created. But on the massive social network, bogus treatments have thrived. Facebook has always said that its mission is to connect people with each other. In the case of medical misinformation, it has been a place where vulnerable and desperate people can connect to those who promise to know how to “cure” any disease (and sometimes, sell the promised “cure” for profit).
The announcement of Facebook’s latest move against medical misinformation comes a week after The Washington Post reported on the popular, private Facebook groups where hundreds of thousands of cancer patients and their families seek natural alternatives to medical treatment. In groups like “Alternative Cancer Treatments” (7,000 members), “Colloidal Silver Success Stories” (9,000 members) and “Natural healing + foods” (more than 100,000 members), members trade anecdotes as proof that alternative treatments can cure various cancers and other illnesses.
The changes Facebook announced on Tuesday could mean that members of groups like “Natural healing + foods” see fewer posts from other group members in their news feeds. However, those who prefer to navigate directly to group pages will still be able to easily access posts, including those containing medical misinformation.
On Tuesday, The Wall Street Journal also reported on similar examples of cancer misinformation on the platform.
NBC News reported in May on private Facebook groups that promote dangerous methods to “heal” autism in children, including forcing a child to ingest bleach.
As The Post reported, medical misinformation has thrived on platforms like Facebook and Google, drawing millions of views and creating communities of believers devoted to sharing dubious health advice. Social media companies in recent months have taken steps to curb the spread of exaggerated and harmful medical claims. Those changes came after a series of measles outbreaks across the United States prompted renewed scrutiny of the role social-media platforms play in the spread of anti-vaccine conspiracy theories.
In March, Facebook announced measures it would take to combat anti-vaccine content on its platform. The company said it would stop recommending that users join groups that contain anti-vaccine content, and block advertisements promoting those conspiracy theories.
But even as social-media companies moved to stop anti-vaccine misinformation, the online world remained saturated with medical misinformation. Until very recently, the top search results on YouTube for “cure for cancer” included videos falsely promising that baking soda and certain diets could cure cancer. Those videos have millions of views.
In May, YouTube changed how it treats search results for “cure for cancer” and several other cancer-related terms; although the videos containing viral misinformation are still available on YouTube, they no longer appear in the top results, and authoritative sources are displayed prominently.