How can a) the value of free speech be reconciled with b) the differing understandings of customer satisfaction on social media platforms? The answer is this: through moderation under standard rules for any given platform, and by placing counter-narrative content in secluded spaces.
As we know, some groups of people, for one reason or another, will wish to publish content deemed by others as radical, while these others will wish to see the same, “appalling” content taken down from internet spaces. Tech companies were initially reluctant to impose strict regimes, preferring to uphold the value of free speech. But they cannot please everyone. With the shift that has taken place in recent years, at least the big tech companies now embrace counter-extremism measures to take viral content off their platforms.
Content Moderation Is Central
There is always a phase of market observation, of trial and error. But the issue of problematic content is no longer new to tech sites. In recent years, there has been public outcry, media criticism and calls from politicians to impose stricter governing rules. Companies have to set up clear benchmarks and rules, the application of which should be relatively transparent, given technical constraints and limited financial resources.
While it is true that companies ought to employ more moderators (the human factor, in a positive sense, being crucial to moderation), we can hope that the implementation of company regimes will become more consistent over the years and across platforms.
Rising Scrutiny, Standard Rules
For the more critical among us, this may mean that some low-level content will remain, unsatisfactory as that may be. For others, it will mean that their contestable contributions will be subject to moderation and taken down. This will be unsatisfactory for them, too. But company scrutiny will probably rise, and rightfully so, as the departments responsible for moderation within the tech companies are built up and enlarged. These departments are not token structures; they are tasked with fulfilling an important role.
Securing Private Spaces
On the other hand, there is a tendency to create more secluded spaces within social media platforms. Within these, two or more people meet, as is already the case with instant messaging. The trend toward more secluded spaces will improve user experience and ease of use on some platforms, but it will mean less broad discussion, and it will create more so-called filter bubbles, inaccessible to the public, in which accountability is lower. This is worrisome because, should the trend prevail, communication within those filter bubbles will still trickle into society at large, despite the smallness of the individual spheres. Those who support the trend argue that while private conversation is legitimate, group discussions of problematic subjects, especially in the shadows, are proven to reinforce stereotypes, simplification and, possibly, falsehoods.
Our proposition would be that Facebook and others enhance user experience, among other things, by distributing counter-narrative ads and news with content from reliable sources anonymously, balancing the stereotypes within the filter bubbles to some degree. Public discussion should still be encouraged via more user-convenient spheres where groups of people can meet. These spaces are to be accessible and should be, and remain, attractive. Reported, indexed or known violations of laws must be sanctioned in secluded spaces as in public spaces. No double standard.
Thorsten Koch, MA, PgDip
Updated December 2019