How can the value of free speech be reconciled with the differing understandings of customer satisfaction on social media platforms? Through moderation based on standard rules that apply to every platform, and by placing counter-narrative content in secluded spaces.
As we know, some groups of people, for one reason or another, will wish to publish content that others deem radical, while those others will wish to see the same, “appalling” content taken down from online spaces. Tech companies were initially reluctant to impose strict regimes, preferring to uphold the value of free speech. But they cannot be everybody’s darling.
Content Moderation Is Central
There is always a phase of market observation, of trial and error. But the issue of problematic content is no longer new to tech sites. In recent months and years, there has been public outcry, media criticism and calls from politicians to impose stricter governing rules. Companies have to set up clear benchmarks and rules, whose application will be more or less transparent, given technical constraints and limited financial resources.
While it is true that companies ought to employ more moderators, the human factor, in a positive sense, being crucial to moderation, we can hope that the implementation of company regimes will become more consistent over the years.
Rising Scrutiny, Standard Rules
For the more critical among us, this may mean that some low-level content will remain, unsatisfactory as that may be. For others, it will mean that their contestable contributions will be subject to moderation and taken down. Company scrutiny will probably increase as the departments and bodies responsible for moderation within tech companies are built up and enlarged. These departments and bodies are not mere alibis; they are tasked with fulfilling an important role.
On the other hand, there is a tendency to create more secluded spaces within social media platforms. Within these, two or more people meet, as is already the case with instant messaging.
Securing Private Spaces
The trend toward more secluded spaces will improve user experience and ease of use on some platforms, but it will narrow broad discussion and create more so-called filter bubbles, inaccessible to the public, in which accountability is lower. This is worrisome, as communication within those filter bubbles will, on the whole, trickle down manifestly into society. While private conversation is legitimate, group discussions of problematic subjects, especially in the shadows, have been shown to reinforce stereotypes, simplification and, possibly, falsehoods.
My proposition is that the enhancement of user experience by Facebook and others should include, among other things, the anonymous distribution of counter-narrative ads and news with content from reliable sources, balancing the stereotypes within the filter bubbles to some degree. Public discussion should still be encouraged via more convenient spheres where whole groups of people can meet. These spaces are accessible to everybody and should be, and remain, more attractive. Reported, indexed or otherwise known violations of the law must be sanctioned in secluded spaces just as in public spaces. No double standard.
Thorsten Koch, MA, PgDip