Hate Speech Online

Hate speech online affects many people. On social media platforms in particular, users are increasingly confronted with derogatory or hostile comments. What is meant by hate speech, and how can one successfully defend oneself against it?

Hate speech is a form of communication aimed at inciting hatred against individuals or groups, discriminating, or dehumanizing them. It targets characteristics attributed to a particular group of people, such as origin, religion, gender, disability, or sexual orientation. It implies that because of these characteristics individuals or groups have less value or fewer rights.

Online in particular, hate speech represents a growing problem that can lead to extreme polarization. Especially on social media, viral spaces can emerge in which hate speech dominates. Such dynamics spill over into the offline world and fuel hatred and violence.

It is therefore all the more important to clearly identify hate speech on social media and combat it effectively. At the same time, removing or restricting content in order to limit hate speech can also diminish someone's freedom of expression. This tension gives rise to complex content moderation dilemmas.

Examples of Hate Speech

  • Comparisons with animals or dehumanizing descriptions: Members of a marginalized community with the characteristics mentioned above are degraded by being compared to animals or described as subhuman (e.g. equating Black people with apes).
  • Generalized attribution of negative characteristics: Criminal or dangerous behavior is collectively attributed to members of a marginalized community (e.g. “All young migrants are rapists and sex offenders.”).
  • Claimed superiority: The speaker places themselves above a marginalized community and makes derogatory statements about it (e.g. sexist insults or discrimination against members of the LGBTQIA+ community).
  • Trivialization or approval of violence against protected groups: The destruction, oppression, or persecution of such groups through crimes or disasters is trivialized, relativized, glorified, or mocked (e.g. trivializing the Holocaust).

Is Hate Speech a Criminal Offense?

“Hate speech” is not a legally defined term and is therefore not, as such, a criminal offense. Whether a statement understood as hate speech also constitutes a criminal offense must be examined on a case-by-case basis. In Germany, relevant offenses may include, among others:

  • Insult (§ 185 German Criminal Code)
  • Defamation (§ 187 German Criminal Code)
  • Incitement to hatred (§ 130 German Criminal Code)
  • Threats (§ 241 German Criminal Code)

If the elements of such an offense are met, this is often referred to as hate crime. The issue has gained significant importance in Germany in recent years. In 2021, the Gesetz zur Bekämpfung des Rechtsextremismus und der Hasskriminalität (Act to Combat Right-Wing Extremism and Hate Crime) entered into force, tightening several criminal law provisions in order to prosecute offenses in the digital space more effectively.

Insult

Hate speech is often accompanied by a punishable insult. Derogatory posts or comments about members of a protected group generally violate their personal honor under § 185 of the German Criminal Code. Example: The term “Schwuchtel” was classified as a criminal insult by the Frankfurt am Main District Court (judgment of 15 January 2021 – 907 Cs – 7680 Js 229740/19).

Defamation

There are also scenarios in which fabricated facts are attributed to members of a protected group, significantly degrading them and fueling hatred and discrimination. In such cases, the offense of defamation (§ 187 German Criminal Code) may additionally be fulfilled.

Incitement to Hatred

There is considerable overlap between hate speech and incitement to hatred (§ 130 German Criminal Code). In particular, incitement to hatred against a national, religious, or ethnic group, calls for violence, or violations of human dignity are punishable. Example: Referring to “foreigners” as “social parasites” was assessed as incitement to hatred by the Frankfurt am Main Higher Regional Court (judgment of 15 August 2000 – 2 Ss 147/00).

Threats

If hate speech is accompanied by a threat (§ 241 German Criminal Code), it not only involves degradation but also the announcement of a future unlawful act—often due to a protected characteristic of the affected person. For example, if someone insults people with a migration background and simultaneously threatens to commit bodily harm against them, this conduct is also punishable as a threat under § 241 of the German Criminal Code.

User Rights also regularly handles cases in which not only hate speech but additional criminal content is identified.

As a Victim, How Can I Protect Myself Against Hate Speech?

In addition to the legal provisions of the German Criminal Code that criminalize certain forms of hate speech, community guidelines on social media platforms are also decisive. Every major platform has rules that define and prohibit hate speech or hateful behavior. Content that violates these rules must generally be removed. In the event of repeated violations, an account may even be suspended.

What to Do

1. Report the content to the platform

As a first step, you should report the content directly to the platform. Under Article 16 of the Digital Services Act (DSA), online platforms are obliged to remove illegal content once they have been notified of it. Large platforms such as Instagram, Facebook, or TikTok must also provide an internal complaint mechanism. This means that if the platform does not act after your report, you can file an appeal.

Important: Content may be prohibited under community guidelines but not be criminally punishable—or vice versa. You should therefore always pay close attention to why you are reporting something and which justification you choose. This can later be decisive, including before a court or an out-of-court dispute settlement body such as User Rights.

2. Seek Support

As a general rule: No one has to endure hate speech alone. A victim can seek help, for example through:

  • legal advice, e.g. lawyers specializing in media and freedom of expression law
  • HateAid, a non-profit organization that supports victims of digital violence
  • the Media Authority of North Rhine-Westphalia (Landesanstalt für Medien NRW), which bundles information and counseling services
  • trusted persons or psychological professionals if the burden becomes too great

3. File a Criminal Complaint

In many cases, it may be advisable to report hate speech to the police in order to initiate criminal proceedings against the author. The biggest problem is that perpetrators often act anonymously online. Identification can therefore be difficult and time-consuming.

4. Out-of-Court Dispute Resolution Under the DSA

Since the DSA became fully applicable at the beginning of 2024, there has been another course of action: affected persons can contact a certified out-of-court dispute settlement body under Article 21 DSA, such as User Rights. We can examine whether the platform was justified in leaving a reported hate comment online. Unlike court proceedings, this procedure is free of charge for users and is generally significantly faster.

5. Civil Law Action Against the Platform

If a platform clearly fails to remove criminal or prohibited hate speech despite being notified, civil law action against the platform is also possible. However, court proceedings can involve high costs and take a considerable amount of time.

6. Combining Different Options

The various options—reporting to the platform, filing a criminal complaint, civil court action, or out-of-court dispute resolution—are independent of one another and can be used in parallel. You can therefore involve an out-of-court dispute settlement body and go to court at the same time.

How User Rights Helps You

User Rights supports you in two hate speech scenarios on social media.

Scenario 1: You report a hate comment that affects you directly

Have you been confronted with hate speech on social media, and do you believe that the comment violates the platform's community guidelines on hate speech or hateful behavior? Then you can proceed as follows:

  1. Report the content to the platform: Large social networks such as Facebook, Instagram or TikTok are obliged under the Digital Services Act (DSA) to provide reporting mechanisms for hate speech content. Using this function, you can report the comment directly. If the platform concludes that there is a violation of the hate speech guidelines, it must take action—for example, by removing the content.
  2. Secure evidence: Take screenshots of the comment or save chat histories. Such records are helpful if the platform removes the content or it otherwise stops being visible later; you can then use them as evidence.
  3. Submit the case to User Rights: If the platform does not respond appropriately or rejects a measure, you can submit the case to User Rights. We independently examine whether the platform acted correctly or whether it must take action.

Scenario 2: The platform accuses you of hate speech and imposes a measure against you

Has content you published been removed or otherwise restricted because the platform accuses you of hate speech? If you believe that this moderation decision is unjustified, you can proceed as follows:

  1. File an appeal with the platform: Most major platforms such as Instagram, TikTok, or Facebook offer a reporting and remedy procedure. There you can directly appeal the measure and explain why you consider the accusation of hate speech to be unfounded.
  2. Secure evidence: Save all relevant information, such as screenshots of the affected content, and keep them safe. The more evidence, the better. Also take screenshots of the messages in which the platform informed you of the measure.
  3. Submit the case: If the platform does not respond appropriately and does not reverse its decision, you can submit the case to us. We will examine whether the platform acted in accordance with its guidelines.