CHARLOTTE – You might have noticed a shift occurring across many social media platforms, especially Facebook—new rules are in place when it comes to what content is allowed.
Officially, Facebook is updating its terms and conditions this month to explicitly say it can remove your content; in practice, it has always been able to do that.
Since the rise of fake news, spam, and scams on social media, there has been immense pressure on the platforms to crack down on problematic content.
Facebook has faced the most criticism and seems to be taking the most notable actions as well.
They are working to shut down hate groups and any events coordinated in support of them. Fact-check labels have been popping up below links to outside articles to debunk misinformation immediately.
Users are also limited in how many people they can send a private message to at once, which has proven to significantly slow the spread of fake news.
As mental health and the negative effects of social media become a bigger concern, more changes are likely to be on the way to combat those effects.
Instagram is known for taking measures to remove hate speech and spam from the news feed and the comment section.
Even TikTok will remove videos with certain key phrases and words the platform has flagged as troublesome.
As users, we can expect quicker and more severe consequences for things like hate speech, cyberbullying, and fake news across all social media platforms.
A wise rule I use for my daughters: if you wouldn’t say it to your grandmother, don’t say it online.