User Generated Content and the Fediverse: A Legal Primer

More and more people are trying federated alternatives to social media like Mastodon, either by joining “instances” hosted by other people or by creating their own, running free, open-source software on servers they control. (Learn more about this movement and how to join the fediverse.)

The fediverse is not a single, massive social media platform like Facebook or YouTube. It’s an ever-expanding ecosystem of interconnected sites and services, whose users can interact with each other no matter which of those sites and services they have an account with. That means people can customize and have more control over their social media experience, with less dependence on a monoculture developed by a handful of tech giants.

However, it also means some legal risk for those hosting instances. Fortunately, there are relatively easy ways to reduce that risk if you plan ahead. To help, this guide covers some common legal issues, along with some practical considerations.

Two Important Notes: (1) This guide focuses on the legal risks posed by hosting other people’s content, under U.S. law. In general, the safe harbors and immunities discussed below will not protect you if you are directly infringing copyright or defaming another person. (2) Many of us at EFF are lawyers, but we are not your lawyers. This guide is intended to provide a high-level overview of U.S. law and should not be considered legal advice specific to your situation.


Copyright

Copyright law gives rightsholders substantial control over the use of expressive works, subject to several important limitations, such as fair use. If some of your users share infringing material through your instance, and you are found responsible for that infringement under the doctrine of “secondary liability” for copyright infringement, you could face devastating damages.

However, the Digital Millennium Copyright Act (17 U.S.C. § 512) creates a “safe harbor” from copyright liability for service providers, including instance administrators, who respond expeditiously to notices claiming that they are hosting or linking to infringing material. Taking advantage of the safe harbor protects you from litigating complex secondary liability questions and avoids the risk that you could eventually be held liable.

The safe harbor does not apply automatically. First, it is subject to two disqualifications: (1) actual or “red flag” knowledge of the specific infringement; and (2) profiting from infringing activity that you have the right and ability to control. The standards for both categories are controversial; if you’re concerned about either, you may wish to consult an attorney.

Second, a service provider must take some affirmative steps to qualify:

  1. Designate a DMCA agent with the Copyright Office.

This could be the best $6 you’ll ever spend. Under the process discussed below, the DMCA agent serves as the official point of contact for copyright complaints. Note that your registration must be renewed every three years, and if you let it lapse, you may lose your safe harbor protection. You must also post the agent’s contact information on your website, for example on a publicly accessible page describing your instance and its policies.

  2. Have a clear DMCA policy, including a repeat infringer policy, and follow it.

To be eligible for the safe harbor, all service providers must “adopt and reasonably implement . . . and inform subscribers and account holders . . . of” a policy that provides for the termination in appropriate circumstances of “repeat infringers.” There is no standard definition of “repeat infringer,” but some services use a “three strikes” policy, terminating accounts after three unrebutted claims of infringement. Given that copyright claims are often abused to take down lawful speech, you may want to consider a more flexible approach that gives users ample opportunity to appeal before termination. Courts that have examined what counts as “reasonable implementation” of a termination policy have stressed that service providers need not be perfect.

Hosting services, the category most Mastodon instances will likely fall into, must also follow the “notice and takedown” process, which requires services to remove allegedly infringing material upon notice. To be effective under the DMCA, a notice must contain the following information:

  • The name, address, and physical or electronic signature of the complaining party
  • Identification of the infringing material and its Internet location (e.g., a URL)
  • Information sufficient to identify the copyrighted work
  • A statement that the complaining party has a good faith belief that the use is not authorized by the copyright owner, its agent, or the law
  • A statement, under penalty of perjury, that the information in the notice is accurate and that the complaining party is authorized to act on behalf of the copyright owner

Providers are not required to respond to DMCA notices that do not substantially include all of these elements. Copyright owners, for their part, must consider whether a targeted use is a lawful fair use before sending a notice.

Users can respond with a counter-notice if they believe they have been unfairly targeted. You should forward the counter-notice to the rightsholder. At that point, the copyright claimant has 10 to 14 business days to file a lawsuit. If they don’t, you can restore the material and remain free from liability.

A proper counter notification must contain the following information:

  • The user’s name, address, phone number, and physical or electronic signature [512(g)(3)(A)]
  • Identification of the material and its location before removal [512(g)(3)(B)]
  • A statement under penalty of perjury that the material was removed by mistake or misidentification [512(g)(3)(C)]
  • Consent to the jurisdiction of the user’s local federal district court or, if the user is overseas, an appropriate judicial body [512(g)(3)(D)]

To help the process go smoothly, it’s a good idea to forward the original takedown notice to the user so they can understand who is complaining and why.

Finally, service providers must “accommodate and not interfere with standard technical measures . . . used by copyright owners to identify or protect copyrighted works.” To qualify as a “standard technical measure,” a measure must have been developed pursuant to a “broad consensus” of copyright owners and service providers in an open, fair, voluntary, multi-industry standards process, and must not impose “substantial costs” on service providers. As of 2022, nothing appears to qualify.

State Law and Federal Civil Claims, or Why Section 230 Isn’t Just “Big Tech” Protection

Thanks to Section 230 of the Communications Decency Act, online intermediaries that host or republish speech are protected against a range of laws, such as state defamation law, that might otherwise be used to hold them legally responsible for what their users say and do. Section 230 applies to virtually any online service that hosts third-party content, including web hosting companies, domain registrars, email providers, social media platforms, and Mastodon instances.

Section 230 provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” (47 U.S.C. § 230). This protects providers from liability for what their users say across a wide range of legal contexts. It also shields you from liability that might arise from removing users’ posts or from other moderation decisions. Unlike the DMCA, Section 230 does not require service providers to take any affirmative steps to qualify for protection.

However, in 2018, Congress passed FOSTA/SESTA, which created new civil and criminal liability for anyone who “owns, manages, or operates an interactive computer service” and creates content (or hosts third-party content) with the intent to “promote or facilitate the prostitution of another person.” The law also expanded criminal and civil liability to treat any online speaker or platform that allegedly assists, supports, or facilitates sex trafficking as if they were themselves participating “in a venture” with individuals directly engaged in sex trafficking.

EFF represents several plaintiffs challenging the constitutionality of FOSTA/SESTA. As of this writing, the law is still on the books.

Privacy and Anonymity

Your users may sign up under pseudonyms, and people who want to unmask them may ask you to disclose any personally identifiable information you hold about them. They may seek that information as part of a legal action, but also to retaliate in other ways. Law enforcement may also request information from you as part of a criminal investigation or prosecution.

If you receive a subpoena or other legal process demanding that you produce such information, consider consulting an attorney to determine whether you are required to comply. Best practice is to notify the affected users as soon as possible so they can challenge the subpoena in court. Many such challenges have succeeded, given the strong First Amendment protections for anonymous speech. You may have to delay notification in an emergency, under a gag order, or when notice would be futile, but the best practice is to publicly commit to providing notice once the emergency has passed or the gag order has expired.

Consider publishing a regularly updated transparency report with useful data about how often governments seek your users’ data and how often you provide it. Even if you receive few or no government requests, a report showing zero requests is itself useful information for your users.

Also, consider whether you are collecting or retaining more information than you need, such as logs of IP addresses, timestamps, or reading activity. If you don’t have it, no one can come looking for it. It is also best practice to publish guidance for law enforcement explaining how you respond to government data requests, including what information you cannot provide.

Child Sexual Abuse Material (CSAM)

Service providers are required, on pain of criminal penalties, to report any CSAM they find on their servers to the CyberTipline operated by the National Center for Missing and Exploited Children (NCMEC), a private nonprofit. NCMEC shares these reports with law enforcement. However, you are not required to affirmatively monitor your instance for CSAM.

Other Legal Issues

In our litigious society, there are many “causes of action” — reasons for filing a lawsuit — that a creative and determined plaintiff can come up with. Aggressive plaintiffs sometimes use dubious claims in an attempt to suppress protected speech. If you receive a threat or a request for information that seems improper, feel free to contact EFF ([email protected]) and we will do our best to help.
