Questions and answers: Act to Improve Enforcement of the Law in Social Networks
Why do we need a new law?
Statutory compliance rules are needed to make social networks process user complaints more quickly and effectively.
We are currently witnessing a major shift in social discourse on the Internet, especially on social networks. Online discussions are often aggressive, abusive and hateful. Anybody can fall victim to hate crime and suffer defamation as a result of their opinions, skin colour, background, religion, gender or sexuality. Hate crime may seriously threaten the peace in a liberal, open and democratic society if it is not suppressed and prosecuted effectively. This is why we need to improve the way the law is enforced on social networks. Any content that fulfils the statutory definition of a criminal offence (e.g. incitement to hatred, insult, defamation, or disturbing the public peace by providing misleading information about a criminal offence) should be taken down as quickly as possible.
By 2015, the increasing spread of hate crime – especially on social networks such as Facebook, YouTube and Twitter – was serious enough for the German Federal Ministry of Justice and Consumer Protection to set up a Task Force including network operators and representatives of civil society. The companies involved in the Task Force promised to improve the way they handle hate crime reported on their websites. They committed to setting up user-friendly mechanisms for people to report offensive posts. They also pledged that reported content would be reviewed within 24 hours by teams of legally and linguistically qualified staff, and that any unlawful content would be taken down. Such decisions would be taken on the basis of German law.
The voluntary commitments undertaken by these companies have led to some initial improvements. But more is required. Criminally punishable content is still not being deleted in sufficient quantities. In January/February 2017, the youth protection organisation "jugendschutz.net" monitored the deletion activities of social networks and concluded that user-flagged hate crime is still not dealt with quickly and effectively enough. While YouTube now deletes criminal content in 90% of cases, Facebook managed only 39%. At Twitter, only 1% of user reports resulted in deletion.
Social networks must take responsibility when people misuse their platforms to commit hate crime and disseminate criminally punishable fake news. The tools currently available and the system of voluntary commitments by social networks are not enough: law enforcement continues to be fraught with problems. New legal rules for social networks, including fines for non-compliance, are therefore needed if quick and effective action is to be taken against hate crime on the web.
Which providers does the new law apply to?
The compliance rules set out in the new law will only affect social networks. They do not apply to all "service providers" under the Telemedia Act (Telemediengesetz, TMG). Subsection (1), second sentence, of the new law defines social networks as "telemedia service providers which, for profit-making purposes, operate Internet platforms which are designed to enable users to exchange and share any content with other users or to make such content available to the public". Services enabling individual communication, particularly email or messaging services, are not included in this definition.
Additionally, the new law does not apply to social networks dedicated to specific user groups or topics. This means that professional networks, special-interest communities, online gaming platforms and shopping websites are not covered by the new law either.
Platforms with journalistic/editorial content are also excluded. This also applies to websites that use the infrastructure of another social network to make their own journalistic/editorial content available, e.g. in the form of a Facebook page or profile.
Finally, social networks are exempt from the obligations stipulated in the new law if they have fewer than two million registered users in the Federal Republic of Germany.
Does the new law also apply to social networks that are not based in Germany?
Yes. The draft ensures that the compliance duties established, as well as the corresponding regulatory offences committed in the event of non-compliance, are also applicable to actions taken abroad.
In which cases and how fast will social networks have to delete or block unlawful content?
Social networks must remove or block access to manifestly unlawful content within 24 hours of receiving a complaint. For all other reported content, social networks must decide "immediately" whether to remove it, which generally means within 7 days of receiving the complaint.
The 7-day deadline can be exceeded if the decision on whether content is lawful depends on whether a factual allegation is true or false, or is clearly determined by other factual circumstances. In these cases, the social network may give the user an opportunity to respond to the complaint before the decision is made.
The 7-day deadline may also be exceeded if, within that period, the social network refers the matter to a "recognised institution of regulated self-governance", and agrees to accept the decision of that institution. Upon referral, the institution must decide whether the content is lawful or unlawful within 7 days.
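The deadline rules described above can be summarised as a small decision procedure. The following sketch is purely illustrative; the function name, parameters and simplified categories are my own, not terms used in the Act, and real cases naturally require legal assessment rather than a lookup.

```python
from datetime import datetime, timedelta

def removal_deadline(received_at: datetime,
                     manifestly_unlawful: bool,
                     depends_on_factual_allegation: bool = False,
                     referred_to_self_regulation: bool = False):
    """Illustrative sketch of the statutory deadlines (not legal advice).

    Returns the decision deadline, or None where the Act allows the
    7-day deadline to be exceeded.
    """
    if manifestly_unlawful:
        # Manifestly unlawful content: remove or block within 24 hours
        return received_at + timedelta(hours=24)
    if depends_on_factual_allegation or referred_to_self_regulation:
        # Deadline may be exceeded: the network may first hear the user,
        # or a recognised self-regulation institution decides the case
        # within 7 days of referral.
        return None
    # All other reported content: decide "immediately", i.e. usually
    # within 7 days of receiving the complaint
    return received_at + timedelta(days=7)
```

For example, a complaint about manifestly unlawful content received at noon on 1 October must be resolved by noon on 2 October, while a referral to a recognised self-regulation institution suspends the fixed deadline for the network itself.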
Who can register as a self-regulation institution?
In order to be recognised as an "institution of regulated self-governance", an organisation must hire independent experts to review content. Also, the organisation must have sufficient resources to ensure that decisions on the lawfulness of reported and referred content can be made within 7 days. A set procedure must be in place for reviewing content. Each institution must have established procedural rules and its own complaints service/hotline for reporting content. Appointments to the decision-making body should be diverse, and must involve Germany's regional media bodies.
In establishing a system of regulated self-governance, complaints services/hotlines shall be set up to ensure that users whose content has been unfairly taken down can also lodge a complaint. This will ensure that any legally permissible content that is unfairly blocked can be restored quickly and easily.
Furthermore, self-regulation institutions must be funded by a number of social network providers or organisations, thus guaranteeing that the appropriate resources are available. In addition, the institution must remain open to new members, especially social networks. Details of the exact system in place for reviewing referred content will be set out in the institution's own rules of procedure. The affected parties (social network, complainant, user) will have the opportunity to submit their opinion to the institution.
The system of regulated self-governance will itself be subject to oversight. Each self-regulation institution must be recognised by the Federal Office of Justice. Such recognition can be granted subject to supplementary requirements, or withdrawn. This may occur in cases where those reviewing content cannot be deemed to be independent and/or possessing the necessary expertise, or where a swift review is not guaranteed. It ensures that the compliance rules enacted in the new law are not undermined.
Must unlawful content be secured prior to deletion for use as evidence?
Yes. The operators of social networks have a duty to retain deleted content for a further 10 weeks for use in prosecuting criminal offences. This content must be retained within the EU.
What constitutes "manifestly unlawful" content?
The draft covers content that fulfils the statutory definition of a criminal offence. Content is manifestly unlawful if no detailed examination is required in order to recognise that it fulfils such definition.
Which criminal offences are covered?
The new law applies to content that constitutes one of the following offences under the German Criminal Code (Strafgesetzbuch, StGB):
- Section 86 (Dissemination of propaganda material of unconstitutional organisations),
- Section 86a (Using symbols of unconstitutional organisations),
- Section 89a (Preparation of a serious violent offence endangering the state),
- Section 91 (Encouraging the commission of a serious violent offence endangering the state),
- Section 100a (Treasonous forgery),
- Section 111 (Public incitement to crime),
- Section 126 (Breach of the public peace by threatening to commit offences),
- Section 129 (Forming criminal organisations),
- Section 129a (Forming terrorist organisations),
- Section 129b (Criminal and terrorist organisations abroad),
- Section 130 (Incitement to hatred),
- Section 131 (Dissemination of depictions of violence),
- Section 140 (Rewarding and approving of offences),
- Section 166 (Defamation of religions, religious and ideological associations),
- Section 184b (Distribution, acquisition and possession of child pornography) in conjunction with section 184d (Distribution of pornographic performances by broadcasting, media services or telecommunications services),
- Sections 185 to 187 (Insult, malicious gossip, defamation),
- Section 201a (Violation of intimate privacy by taking photographs),
- Section 241 (Threatening the commission of a felony), or
- Section 269 (Forgery of data intended to provide proof).
Both the physical and mental elements of these offences must be fulfilled. Furthermore, there must have been no justification for the act (e.g. the act served to exercise legitimate interests).
What is the difference between deleting content and blocking it?
When content is blocked it is no longer available in the relevant jurisdiction, i.e. within Germany. However, it remains accessible in other countries. When content is deleted ("taken down"), it is no longer available anywhere in the world.
Why should social networks decide what is legal and what is not? Is that not the job of law enforcement agencies?
This obligation is already set out in section 10 of the Telemedia Act, under which service providers are obliged to delete unlawful content as soon as they become aware of it. It is therefore up to service providers themselves to decide whether reported content is unlawful. Those who operate services or infrastructure and make them available to third parties have a duty, upon receiving specific indications of abuse, to limit their own contribution to that abuse wherever this is still possible. There is no general provision in German civil law requiring that disputes surrounding such content first be resolved by the courts.
Social networks do not set the criteria for determining which content must be taken down. They must make their decisions purely on the basis of German criminal law. The new law does not create any new duties to delete. Rather, it seeks to ensure that the existing law is respected and enforced.
The aim in ensuring that social networks take down criminal posts is to guarantee a free, open and democratic culture of communication, and to protect the groups and individuals affected by hate crime. Completely independently of this, those who post unlawful content on the Internet must also be prosecuted. This goes without saying. Ensuring that perpetrators face the full force of the law is and will remain the job of law enforcement.
Are social networks only obliged to delete or block unlawful content once such content has been reported to them?
Yes. The new law does not require social networks to search their platforms proactively for unlawful content. The "notice and take down" procedure set out in the EU's E-Commerce Directive remains in place.
Who can report unlawful content and how do they do this?
Individual users, specialised Internet complaints hotlines and civil-society organisations can report content via the channels established for them. The specific reporting channel in each case will depend on the social network being used to share, post or publish the content concerned. The new law stipulates that the operators of social networks must provide users with an easily recognisable, directly accessible and permanently available procedure for reporting unlawful content.
What about the public reporting obligations now imposed on social networks?
In the future, social networks that receive more than 100 reports of unlawful content in any calendar year must report publicly on how they deal with criminal content on their platforms. In particular, reports must contain the following information:
- Mechanisms in place to submit reports about criminal content,
- Criteria applied in deciding whether to take down/block content,
- Number of complaints filed within the reporting period, broken down according to who reported the content (complaints hotline or user) and the reason for the complaint,
- Number of complaints within the reporting period that resulted in take down/access blocking (also broken down according to who reported the content and the reason for the complaint).
Reports must also include the number of cases referred to recognised self-regulation institutions.
The reports must be drawn up twice a year and published in the electronic Federal Gazette, as well as on the social network's own website. In the latter case, the report must be easy to find.
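The required contents of such a report can be sketched as a simple data structure. The field names below are my own shorthand for the statutory items listed above, not terms from the Act, and the threshold check merely restates the 100-reports-per-year trigger.

```python
from dataclasses import dataclass

@dataclass
class TransparencyReport:
    """Illustrative sketch of the half-yearly report contents (field names are not statutory terms)."""
    reporting_mechanisms: list       # mechanisms for submitting reports about criminal content
    takedown_criteria: list          # criteria applied when deciding to take down/block content
    complaints_by_source: dict       # e.g. {"user": 1200, "hotline": 300}
    complaints_by_reason: dict       # complaints broken down by reason
    removals_by_source: dict         # complaints that led to take down/blocking, by source
    removals_by_reason: dict         # ... and by reason for the complaint
    referrals_to_self_regulation: int  # cases referred to recognised institutions

def must_report(reports_of_unlawful_content_per_year: int) -> bool:
    # The public reporting obligation applies to networks receiving
    # more than 100 reports of unlawful content per calendar year
    return reports_of_unlawful_content_per_year > 100
```

A network receiving 101 reports in a calendar year would fall under the obligation; one receiving exactly 100 would not.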
When can social networks be issued with a penalty?
According to the draft, intentional or negligent failure to comply with the reporting obligation constitutes a regulatory offence, as do violations of the obligation to maintain an effective complaints management system. Providers who do not name a person authorised to accept service and to receive information requests from German law enforcement authorities will also be committing a regulatory offence.
The regulatory offence of failing to implement an effective complaints management system is committed when a company does not meet organisational duties (not simply because it has failed to delete content in an individual case). Fines will therefore be imposed for systematic shortcomings: a regulatory offence is committed by anyone who intentionally or negligently fails to set up a complaints management system or fails to set one up correctly. The same applies to anyone who fails to rectify organisational deficiencies, or who fails to meet the statutory requirements for complaints management over a "not insignificant period". This reflects the aim of the new legislation: to ensure that social networks set up effective complaints management procedures to review content in individual cases and prevent the chilling effects of hate speech. Social networks will therefore not face fines for a misjudged decision in a single case.
Who receives the fine will depend mainly on the specific infraction in question and how the company concerned is structured. This might be the person in charge of the office responsible for handling complaints or another member of the social network's management.
Because section 130 of the Regulatory Offences Act (Ordnungswidrigkeitengesetz, OWiG) is applicable as well, it is also possible that the owner of the company operating the social network is prosecuted. This might happen in cases where violations could have been prevented or made much more difficult with proper supervision in place. According to section 30 of the Regulatory Offences Act, regulatory fines can also be imposed on legal persons and associations of persons.
Who decides whether to impose a fine?
Under the new law, the Federal Office of Justice will be responsible for prosecuting such regulatory offences and imposing fines. If the Federal Office of Justice cites systematic failings in a social network's review practice – as attested by a number of erroneous decisions – as the basis for asserting that no real procedure is in place to deal with reports of unlawful content, the Office must refer the matter to court for a preliminary ruling. The competent court will then issue a binding decision as to whether the content that the network has not taken down in each of these cases was lawful or not.
The court with jurisdiction over the matter will also be the court that rules on any objection to the regulatory fine order.
How big might the fine be?
Infringements may be punished with a fine of up to 5 million euros. The maximum fine set out in the law increases to 50 million euros (section 30 (2), third sentence, of the Regulatory Offences Act) when issued against legal persons and associations of persons. Such regulatory fines for legal persons and associations can also be issued where the offence was committed abroad and/or the infringer is a foreign company.
Can users turn to the authorities if a social network has not deleted criminal content or taken it down on time?
Yes. Users also have the option of submitting a complaint to the authority responsible for issuing fines. This authority is obliged to inform the user of the outcome of its review.
How can a fine notice or civil action be filed if the social network does not have a seat in Germany?
According to the new law, the providers of social networks will be obliged to have a person available in Germany who is authorised to receive service. In civil proceedings and in proceedings for the imposition of a fine pursuant to the new law, providers must name that person without delay.
This applies to all social networks, regardless of where they are based (in Germany or abroad). Civil lawsuits against social networks so far have shown that European service mechanisms (recorded delivery with acknowledgement of receipt in civil proceedings) are generally insufficient. This is why fast and secure channels are urgently required for service of process. In particular, this will help the courts in cases of criminally punishable fake news, and allow those affected by such content to take legal action quickly.
Providers will be required to make information easily available on their platforms about the person authorised to receive service.
Who can law enforcement authorities turn to if they need information from network operators in the course of a criminal investigation?
The new law stipulates that social networks must name a person authorised to receive information requests from law enforcement authorities. This person is obliged to respond to information requests within 48 hours of receipt. Where the response provided does not fully answer the questions submitted in the request, reasons must be provided for this.
What remedies are available to a company when fined?
The administrative decision imposing a fine can be challenged by way of an objection filed with the Local Court (Amtsgericht) in the district where the authority imposing the fine has its seat. Under certain circumstances a complaint on points of law (Rechtsbeschwerde) against the Local Court decision may be filed at the Higher Regional Court (Oberlandesgericht).
Does the new law deal with fake news?
Yes, insofar as the case in question crosses the line of constituting a criminal offence. Fake news may fulfil the elements of criminal offences such as insult, malicious gossip and defamation, which are covered by the new law.
Why does the law not include a "right of reply"?
The right of reply for telemedia operators is set out in the German Interstate Agreement on Broadcasting and Telemedia (Rundfunkstaatsvertrag) and is restricted to telemedia services with journalistic/editorial content. It would be up to Germany's constituent states (Länder) to extend the Interstate Agreement.
Doesn't taking down or blocking criminal content restrict freedom of expression?
No. The statutory compliance rules for social networks set out in the new law simply serve to ensure that providers take swift and rigorous action to meet their existing legal obligations, i.e. to delete or block unlawful content at the latest upon being made aware that this content is available on their platform. They do not impose any new restrictions on Article 5 para. 1 of Germany's constitution, the Basic Law (freedom of expression). The law refers explicitly to criminal content. Such content is not protected by the freedom of expression.
Can I demand that the operator of a social network disclose the identity of an infringer if that person has violated my personality rights?
Yes. Anyone whose personality rights are violated by way of the criminal offences falling under the new law (e.g. by way of a post constituting insult) can demand that the social network concerned disclose who committed the offence. Such entitlement to information is founded in existing general principles of civil law. With the new legislation, we are ensuring that these entitlements can actually be asserted. Operators of social networks are given the power to disclose the subscriber information of the infringer to the injured party.
Who decides whether information about the author of a post violating a person's personality rights is disclosed to the injured party?
Social networks must be ordered to disclose this information by a civil court with jurisdiction over the matter. For a court to issue such an order, the injured party must apply to the court (e.g. in injunction proceedings). The court must examine whether the requirements for disclosure are met: personality rights must have been violated, the injured party must require the information in order to assert claims arising from the violation (cease and desist, compensation), and the disclosure must be proportionate to the offence.
Is the new law compatible with the E-Commerce Directive?
Yes. Pursuant to Article 14 (3) of the E-Commerce Directive, Member States are allowed to establish "procedures governing the removal or disabling of access to information".
Did Germany have to notify the European Commission of the new law?
Yes. The new law was subject to notification. According to the Notification Directive (EU 2015/1535), Member States must notify the European Commission of every draft law pertaining to services in the information society, and may not adopt this law until three months have passed since notification. During this period, the Commission and other Member States have the opportunity to submit an opinion. No opinion has been submitted on this law by the Commission.