Facebook is once again trying to address the dumpster fire that is public comments on its site.
As anyone who has moderated or even just browsed a Facebook page knows, the comments section on public posts is usually a cesspool of hatred, bigotry, spam, and irrelevance. Even by the (low) standards of the internet, Facebook comments are famously awful. And bad comments aren't merely unpleasant to read; they can actually reduce the credibility of the content they are commenting on.

Now Facebook is introducing a new comment ranking system to try to tackle this problem. Comments on public posts made by Pages or people with many followers will be ranked, with the aim of showing the most relevant and highest-quality comments at the top. To determine the quality of comments, Facebook will use data from four metrics.
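Facebook hasn't spelled out what those four metrics are, but the general idea of scoring comments on quality signals and sorting by score can be illustrated with a minimal sketch. The signal names and weights below (reactions, replies, reports, whether the original poster replied) are hypothetical stand-ins for illustration only, not Facebook's actual criteria.

```python
# Illustrative sketch of quality-based comment ranking.
# The signals and weights are invented; Facebook has only said that
# comments will be scored and the highest-quality ones shown first.
from dataclasses import dataclass


@dataclass
class Comment:
    text: str
    reactions: int          # hypothetical signal: positive interactions
    replies: int            # hypothetical signal: engagement depth
    reports: int            # hypothetical signal: negative feedback
    author_is_poster: bool  # hypothetical signal: reply from the Page itself


def quality_score(c: Comment) -> float:
    """Combine signals into a single score; weights are arbitrary for the demo."""
    score = 1.0 * c.reactions + 2.0 * c.replies - 5.0 * c.reports
    if c.author_is_poster:
        score += 3.0
    return score


def rank_comments(comments: list[Comment]) -> list[Comment]:
    """Return comments ordered so the highest-scoring appear at the top."""
    return sorted(comments, key=quality_score, reverse=True)


if __name__ == "__main__":
    demo = [
        Comment("First!", reactions=0, replies=0, reports=2, author_is_poster=False),
        Comment("Great reporting, thanks.", reactions=12, replies=3, reports=0, author_is_poster=False),
        Comment("Here's some added context...", reactions=4, replies=1, reports=0, author_is_poster=True),
    ]
    for c in rank_comments(demo):
        print(f"{quality_score(c):6.1f}  {c.text}")
```

In practice, Facebook's system will be far more involved, but any approach of this shape stands or falls on how well its signals actually capture comment quality.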
This is far from the first time Facebook has tried to address the quality of comments. From new visual designs for comments to emoji comment reactions, the company has tried to improve the appearance of comment sections before. And an experiment with downvoting was an attempt to raise the quality of comments as well as their look.
But the problem with comments may run deeper than something a few cosmetic improvements or ranking algorithms can fix. Facebook has shown itself to be woefully inadequate to the task of moderating content on its platform, with hate speech being allowed to proliferate and fake news spreading like wildfire. The company has been hiring more human moderators but still relies on A.I. for the majority of its moderation, and there are many types of negative content that A.I. can't catch because it lacks an understanding of social context.
The new comment ranking system may help to some extent, but until Facebook tackles the site-wide issues with its platform, it will only be a band-aid on a deeper problem.