Facebook appears to be testing the addition of Wikipedia knowledge panels in search results, according to reports from multiple users.
Based on the screenshots shared on Twitter, this feature is reminiscent of Google’s integration with Wikipedia.
Here’s an example that was spotted a few days ago:
New? Facebook shows Wikipedia snippits in search results
h/t @jc_zijl pic.twitter.com/zcbQJmauhE
— Matt Navarra | 🚨 #StayAtHome (@MattNavarra) June 9, 2020
Just like in Google’s search results, the Wikipedia knowledge box in Facebook search shows key details about the entity being searched for.
You’ll also notice there’s a lone Instagram link, which is a stark contrast to Google’s search results containing links to all popular social media profiles.
Unlike Google’s knowledge panels, which link to a number of external domains where people can learn more about an entity, Facebook is trying to keep people within the Facebook ecosystem as much as possible.
Here’s another example that looks similar:
#Facebook is introducing a new panel containing biographies of public figures, taken from Wikipedia.
The interface is very similar to Google’s. pic.twitter.com/LmMrmhtBXY
— Giulio S. (@astaniscia86) June 9, 2020
I’m able to replicate these knowledge boxes in Facebook search, and can confirm that all links keep users within Facebook.
For example, if you click on an entity under “See Also,” Facebook will load another set of search results with that entity as the new query.
It’s unknown how widely this feature is rolled out at the moment, though Facebook confirms to TechCrunch it’s still in the pilot stage.
The pilot program is currently running in English on iOS, desktop, and mobile web.
Information is gathered from publicly available data, including Wikipedia, about public figures, places, and specific interests.
Why Facebook is now deciding to include Wikipedia snippets in search results is anyone’s guess, although the timing is suspect: the company has been heavily criticized in recent weeks for its fact-checking policies.
Critics say Facebook should be doing more to keep misinformation off its platform before it spreads and potentially causes harm to users.
Some argue Facebook should be more like Twitter with regards to how it handles misinformation, but the company doesn’t see it that way.
In fact, Facebook CEO Mark Zuckerberg is on record stating he doesn’t think his company should be the “arbiter of truth.”
After such a bold statement, it’s interesting how Facebook is now making an effort to provide context about entities being searched for.
What Facebook is doing is giving users the resources to do their own fact checking. It’s not actually making any changes to the platform or its policies.
And those resources are questionable at best, as Wikipedia is not always a reliable source of accurate information.
Anyone can edit a Wikipedia entry, which can lead to instances of misinformation.
However, Facebook is at least asking users “Is this information accurate?” with the option to click ‘yes’ or ‘no.’ Presumably, clicking the ‘no’ button would give users an opportunity to submit a correction.
So there are some checks and balances, but the addition of Wikipedia knowledge panels is unlikely to be enough to silence the critics of Facebook’s policies.
It may, however, be enough to keep people somewhat informed about hot-button issues such as 5G and the coronavirus, which are frequent targets of misinformation campaigns.