Facebook Shares New 'Widely Viewed Content' Report to Counter the Idea that it Amplifies Division


Facebook is very keen to dispel the notion that it helps amplify divisive content and misinformation, and it’s been working over the past few months to devise a new means of proving exactly that, resulting in the launch of its latest quarterly report, which it’s calling its ‘Widely Viewed Content’ update.

As explained by Facebook:

“Over time, the Widely Viewed Content Report will provide more detail on the most-viewed content that people see on Facebook. It starts with the Top 20 most-viewed domains, links, Pages and posts in News Feed in the past quarter, and excludes ads but includes content recommended by Facebook within News Feed units like Suggested For You.”

With this report, Facebook is looking to underline the fact that, despite some news coverage suggesting that political content dominates user feeds, that is simply not the case.

“The vast majority of content viewed in News Feed during Q2 2021 (87.1%) did not include a link to a source outside of Facebook. Only about 12.9% of News Feed content views in the US during Q2 2021 were on posts that contain links.”

Which makes sense – Facebook has always sought to prioritize posts from friends and family, and has made specific algorithm changes to boost this further over time. But it is interesting to note the wording here – Facebook is saying that the most ‘viewed’ content on its platform is clearly not links to divisive political content, given the overall stats.

But ‘views’ and ‘engagement’ are two different things, and in this case, that distinction is very important.

To provide some more context: in November last year, Facebook published an official response to a Twitter account created by New York Times journalist Kevin Roose, which shares the ten Facebook posts that see the most engagement in the app each day.

This listing is powered by Facebook’s own data, accessible via CrowdTangle, its monitoring and analytics platform, which is primarily used by journalists. The daily list of the posts that see the most active engagement on the platform is generally dominated by divisive political commentators, most of them right-leaning, which appears to underline Facebook’s role in amplifying such content.

Facebook does not like this characterization, and as noted, last November, it sought to explain that this listing wasn’t actually a true reflection of what gains the most traction on Facebook, with engagement data only one piece of the broader puzzle.

As per Facebook:

“Most of the content people see [on Facebook], even in an election season, is not about politics. In fact, based on our analysis, political content makes up about 6% of what you see on Facebook. This includes posts from friends or from Pages (which are public profiles created by businesses, brands, celebrities, media outlets, causes and the like).”   

So while this content may see a high level of engagement, that still doesn’t mean that it’s what people see more of in the app. 

To counter this, Facebook first reportedly sought to implement new options for how it displays data within CrowdTangle, with a view to essentially painting a more favorable picture of what content actually gains traction on the platform.

That, according to the New York Times, didn’t go as planned:

“Several executives proposed making reach data public on CrowdTangle, in hopes that reporters would cite that data instead of the engagement data they thought made Facebook look bad. But [Brandon] Silverman, CrowdTangle’s chief executive, replied in an email that the CrowdTangle team had already tested a feature to do that and found problems with it. One issue was that false and misleading news stories also rose to the top of those lists.”

So, no matter how Facebook was looking to spin it, these types of divisive posts were still gaining traction, which shows that, even with the aforementioned algorithm updates to move away from such sharing, this remains the type of content that sees the most engagement on The Social Network.

Which leads us back to this new report – with this context in mind, Facebook has once again sought to re-shape the narrative that its systems help to amplify divisive content, this time by shifting the discussion from ‘engagement’ – the posts that Facebook users actually comment on, Like and share – to ‘views’, or the content that people actually see in their feeds.

So how does Facebook define ‘Views’?

“Content views are registered when a piece of content appears on someone’s News Feed, is visible on their phone, computer or tablet, and is present long enough to be seen; ‘Content viewers’ refers to the number of accounts who have viewed a piece of content.”
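As a rough illustration of those two definitions, here’s a minimal sketch of how a counting system along these lines could work. The logic, names and visibility threshold here are all hypothetical – Facebook doesn’t publish how long ‘long enough to be seen’ actually is:

```python
# Hypothetical sketch of tallying "content views" vs "content viewers",
# per the definitions quoted above (not Facebook's actual code).
from collections import defaultdict

MIN_VISIBLE_SECONDS = 1.0  # assumed threshold; the real value is not published

views = defaultdict(int)    # content_id -> total registered views
viewers = defaultdict(set)  # content_id -> distinct accounts that viewed it

def record_impression(content_id, account_id, seconds_visible):
    """Register a view only if the content was on screen long enough."""
    if seconds_visible >= MIN_VISIBLE_SECONDS:
        views[content_id] += 1
        viewers[content_id].add(account_id)

record_impression("post1", "alice", 2.5)
record_impression("post1", "alice", 3.0)  # same account, counts as a second view
record_impression("post1", "bob", 0.2)    # too brief; doesn't register at all

print(views["post1"])         # 2 views
print(len(viewers["post1"]))  # 1 distinct viewer
```

Note how the same account can generate multiple ‘views’, while ‘viewers’ stays a count of distinct accounts – which is why the two numbers can diverge sharply.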

This is a critically important distinction in this context. The original CrowdTangle data that Facebook is now trying to play down shows the types of posts that people on Facebook are actively engaging with, while this report shows the types of content and posts that appear in user feeds, but which users may not actually click on, comment on, Like, etc.

This is just the content that people see as they scroll – which is a significant difference.
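To make that distinction concrete, here’s a toy example with entirely made-up numbers – the same set of posts can top very different lists depending on which metric you rank by:

```python
# Toy illustration (hypothetical numbers, not Facebook data): ranking the same
# posts by views (impressions) vs engagement produces different "top" lists.
posts = [
    {"name": "Family photo",         "views": 900_000, "engagements": 4_000},
    {"name": "Recipe video",         "views": 750_000, "engagements": 9_000},
    {"name": "Political commentary", "views": 120_000, "engagements": 85_000},
]

top_by_views = max(posts, key=lambda p: p["views"])["name"]
top_by_engagement = max(posts, key=lambda p: p["engagements"])["name"]

print(top_by_views)       # "Family photo" - benign content tops the views list
print(top_by_engagement)  # "Political commentary" - tops the engagement list
```

In other words, a ‘most viewed’ report and a ‘most engaged’ report can both be accurate while telling very different stories.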

So, based on this, what do Facebook users see more of?

Well, there are also some complications here – but first, this is the list of the top 10 most widely viewed domains, based on Facebook links, over the past three months.

[Image: Facebook Most Viewed Content report]

So people are seeing YouTube links and UNICEF content, nothing controversial here.

In terms of the most viewed links, there are also few divisive and/or controversial domains:

[Image: Facebook Most Viewed Content report]

Recipes, ‘Reppin’ for Christ’ – see, these are not all right-wing commentators and political conspiracy theories.

Though they are strange – as Ethan Zuckerman points out, the listing includes a speaking agency representing former Green Bay Packers players, a CBD seller and the aforementioned ‘Reppin for Christ’, which sells ‘stylish, pro-Jesus apparel’.

How are these among the most viewed links across all of Facebook? Seems like someone’s spamming the heck out of these pages, as there doesn’t appear to be any organic driver that could be pushing that type of referral traffic.

Even the most viewed Facebook Pages over the past three months listing refutes the idea that Facebook is loaded with right-wing propaganda:

[Image: Facebook Most Viewed Content report]

It’s all entertainment – light, harmless. According to these figures, Facebook is not disproportionately amplifying division and political angst, and health misinformation would also appear to be at a minimum.

But there are some key distinctions here that Facebook has sought to gloss over – most notably through the use of quarterly stats, as opposed to daily, real-time info.

News stories generally only gain traction on the specific day they break, so they may not see the same presence in a quarterly listing, which highlights cumulative views over time. And focusing on views is almost meaningless when you consider that Facebook is also including content recommended by Facebook within News Feed units (like ‘Suggested For You’).

And again, the usage of ‘views’ over ‘engagement’ also seems deceptive, in that these are the posts that are displayed in user feeds, not the ones that they’re actively engaging with. 

Facebook has also listed domain-level info for external links, as opposed to the specific URLs that users are linking to. What are the actual YouTube links being shared? That type of deeper drill-down would also provide more context – but it seems that more context is not necessarily what Facebook’s looking to achieve with this report.

It seems, more likely, that Facebook is looking for a means to counter the notion that it bears some responsibility for amplifying political movements. And this data also doesn’t measure shares in private groups, shares in message threads, etc.




