Unity Services Content Transparency

Published: February 15, 2024

This page contains policies that set out what content is and isn’t allowed to be provided by a user of Unity Services (collectively, “Provided Content”). Unity Services are all services governed by the Unity Terms of Service. Unity moderates Provided Content in our Offerings based on these policies with the aim of creating a safe online environment for users.

The supplemental policies and principles below constitute “Additional Terms” and are incorporated into the Unity Terms of Service, applicable Additional Terms, or such other applicable agreements between you and Unity Technologies SF (“Unity”) or its applicable Affiliates, which govern your use of the Offerings (collectively, “Unity Terms”). Capitalized terms used but not defined herein have the meanings given such terms in the Unity Terms of Service or the applicable Additional Terms.

Before using any Unity Offerings, please review the following information and ensure that you are in compliance when using our Offerings. Please note that examples provided are for illustrative purposes only and are not exhaustive.

Unacceptable Content

Unacceptable Content means any Prohibited Content or Restricted Content provided by a third party*.

  • Prohibited Content is any of Your Content that is not allowed on the product/ service pursuant to the terms applicable to such Offering.
  • Restricted Content is any of Your Content that is not allowed on the product/ service pursuant to the terms applicable to such Offering without prior authorization by Unity or without additional requirements or restrictions.

*Please note, Unacceptable Content refers only to content provided by our customers; we do not have policy-based content restrictions for content provided by our customers’ end-users, e.g., a game player. To understand what content restrictions may apply to our customers' end-users, please review the content policy of that customer.

Prohibited Content

  • Intellectual Property and Legal Infractions
    • Illegal Content, including content that infringes, misappropriates or attempts to infringe or misappropriate any third party right such as intellectual property or proprietary rights of any person (including privacy and publicity rights) or violates or attempts to violate any applicable laws or regulations.
  • Misleading and Deceptive Content
    • Fraudulent, false, misleading, deceptive, or defamatory content.
    • Unsolicited or unauthorized communications, such as promotional materials, email, junk mail, spam, chain letters or other forms of solicitation.
    • Internet “links” to content that is not associated with, connected to, or related to the original content.
  • Offensive and Harmful Content
    • Hateful or discriminatory content, including offensive content that is based on race, gender, color, religious belief, sexual orientation, or disability.
    • Profane or vulgar content.
    • Harmful, threatening, obscene, infringing, harassing, disturbing, violent or shocking content.
    • Pornographic or sexually suggestive content.
  • Malicious and Destructive Content
    • Malicious/ Deceptive Software or Operations such as viruses, worms, defects, malware, spyware, malicious code or other destructive content that could have an adverse impact on any software, data, computer systems, networks, hardware, or devices.
  • Promotional and Solicitation Content
    • Content promoting illegal or harmful activities or substances.
    • Content which promotes or incites any of the above.

Restricted Content

  • Content that implies Unity’s sponsorship or affiliation, including any form of content utilizing a Unity trademark, logo, URL, or product name.
  • Content containing personal information, including child, sensitive, or biometric personal information as may be defined by applicable laws.

Actions Due to Unacceptable Content

Depending on the product/ service the Unacceptable Content appears on, Unity may take some or all of the following actions:

  • Provide information about a user
  • Remove/ disable access to/ restrict visibility of content
  • Not approve the content for upload
  • Require modification of content
  • Suspend/ terminate portions of the service
  • Suspend/ terminate an account
  • Suspend/ terminate/ restrict monetization of content
  • Alert local authorities if we suspect a serious criminal offense

In determining whether to suspend or terminate an account, the following circumstances are taken into account:

  • If an account is responsible for providing prohibited content, particularly if such content is illegal.
  • Severity of the violation: Unity reserves the right to permanently suspend an account without warning for serious violations.
  • Repeated violations: whether an account has repeatedly violated Unity’s content policies.

As a user of the Services, you direct Unity to action content in order to comply with the Digital Services Act (Regulation (EU) 2022/2065) (“DSA”) and any other applicable laws. Unity will provide its users of the Unity Offerings with aggregate reporting regarding Unity’s compliance with the DSA upon the user’s request.

If you are subject to an action under Article 17 of the DSA and we have your electronic contact details, we will provide you with a Statement of Reasons.


Procedures & Measures for Content Moderation

Unity conducts content moderation activities as a result of a report. You can see how to report Unacceptable Content below under "Reporting Unacceptable Content".

Unity also has additional procedures and measures for content moderation as outlined below.

Unity Muse

Automation tooling may be used to analyze content in the following ways:

  • To detect and restrict Unacceptable Content being provided by users

Where we use automation, we ensure safeguards are in place such as:

  • Relying on technology that engages in testing, feedback and human reinforcement
  • Human determination and review of testing of the threshold criteria
  • Communication with those subject to a moderation action

Unity Gaming Services

Procedures and measures are determined by our customers, who utilize the services in their applications. Below, you can find more information on the tools we provide.

Moderation

The Moderation platform ingests reports from integrated services (e.g., Safe Voice), and our customer can review the reports and determine if or how to act on the detections.

Safe Voice

Automation tooling may be used to analyze content in the following ways:

  • Detect unwanted content based on customer defined thresholds

Where we use automation, we ensure safeguards are in place such as:

  • Regular evaluations and testing, feedback and human reinforcement
  • Customer defined thresholds
  • Customer ability to engage in human review

Vivox

Automation tooling may be used to analyze content in the following ways:

  • Detect unwanted content based on customer defined thresholds
  • Restrict unwanted content based on customer defined thresholds

Where we use automation, we ensure safeguards are in place such as:

  • Relying on technology that engages in testing, feedback and human reinforcement
  • Customer defined thresholds
  • Customer ability to engage in human review

User Generated Content

User Generated Content allows our customers to moderate provided content. Moderation may occur either proactively or retroactively. The customer can choose to allow content to be published immediately or require all content to be approved by a moderator before it's published. In either scenario, the customer may moderate the content based on their own policies, including if more than a certain number of users report it as inappropriate.

Reporting Unacceptable Content

If you are residing in the European Union, you can report Unacceptable Content here. Once received, a member of our team will review the report and take any necessary action. There is no automation or algorithmic decision making in this process.

When submitting a report, please include the following information to assist Unity in identifying Unacceptable Content:

  • Cloud SDK and Cloud Python SDK:
    • appID
    • appName
  • Asset Manager:
    • project id
    • organization id
    • if available: asset id & asset version
  • User Generated Content:
    • ProjectId
    • EnvironmentId
    • ContentId
    • Game name
    • Content name + description

Additional reporting mechanisms are outlined below.

Please note, while the below are valid mechanisms for reporting content, they are not intended to satisfy a Notice and Action mechanism (Article 16) under the DSA. For Article 16 notices, please use the European Union mechanism listed above.

Safe Voice

You can additionally report content through in-app mechanisms as may be configured by our customer. Please note, these reports will be sent to our customer to review. Unity will not take action on such reports and if you wish to make a report to be reviewed by Unity please do so through the ticketing system.

Vivox

You can additionally report content through in-app mechanisms as may be configured by our customer. Please note, these reports will be sent to our customer to review. Unity will not take action on such reports and if you wish to make a report to be reviewed by Unity please do so through the ticketing system.

User Generated Content

You can additionally report content through in-app mechanisms as may be configured by our customer. Please note, these reports will be sent to our customer to review. Unity will not take action on such reports and if you wish to make a report to be reviewed by Unity please do so through the ticketing system.

If you would like to report an Intellectual Property infringement under the DMCA, please see IP Policy & Takedown Requests.

Appealing Content Moderation Restrictions

If you are residing in the European Union and believe we have made an incorrect decision about a content moderation restriction imposed on your content or account, you may submit an appeal here within six months of being notified of the restriction. When you submit an appeal, it will be reviewed by a member of our team. Regardless of the outcome, you will be notified of our decision as well as available possibilities for redress.

Unity Gaming Services

If the moderation action was taken by our customer, you should submit any appeals with them.


Termination of Services

You can terminate your use of the Services by giving notice. You can find the grounds for termination as well as any notice requirements in the applicable Terms of Service.

Digital Services Act

This section sets out the provisions applicable to individuals residing in the European Union under the Digital Services Act (“DSA”).

Transparency Reports

Unity has prepared the following transparency reports to comply with our obligations under the DSA.

Redress Options

If you are an individual or entity residing in the EU, you will have a number of redress options available as outlined in this section.

The redress options do not preclude you from seeking judicial redress or exercising any rights under the Unity Terms or such other applicable agreements between you and Unity or its applicable Affiliates that govern your use of the Services.

Notices submitted under Article 16

If you submitted an Article 16 notice through Unity’s Content Report Ticketing System and have concerns regarding the decision made, you may submit a complaint here.

Appealing a Decision

If we have taken an action on your content or account and you wish to appeal it, you may submit an appeal here within six (6) months of the action. The appeal should include the following information:

  • Your contact information;
  • Identification of the content and moderation action in question;
  • A statement explaining the reasons why you believe the content or account was wrongfully removed/disabled;
  • Any supporting evidence or legal arguments to substantiate your claims.

Out of Court Dispute Settlement

If you remain dissatisfied with the outcome of our internal review, you have the option to engage in a dispute settlement process outside of the court system. This is a non-binding process that allows you to have your dispute reviewed by a neutral third party. You are entitled to select any out-of-court dispute settlement body that has been certified by your Member State.

Judicial Proceeding

If you believe that your concerns are not adequately addressed through our internal mechanisms or out-of-court settlement, you have the option to pursue legal action through the appropriate legal channels, such as filing a lawsuit or complaint in accordance with the applicable laws and regulations.

Suspension of Users under the DSA

Submitting Manifestly Unfounded Notices & Complaints

If you misuse our complaint notification system by frequently submitting complaints that are manifestly unfounded, we may suspend your access to the complaint notification system. We will notify you prior to enacting a suspension.

We consider three unfounded notices, or two unfounded notices alleging offenses referred to in Articles 3 to 7 of Directive 2011/93/EU, to be sufficient for a suspension.

A suspension will last thirty (30) days, and we will issue a warning prior to enacting it. A second suspension will increase to sixty (60) days, a third suspension to ninety (90) days, and so on.

Designated Point of Contact

Pursuant to Articles 11 and 12 of the DSA, the DSA Compliance Lead has been designated as Unity’s point of contact for communications with Member State authorities, the European Commission, the European Board for Digital Services, and recipients of the service.

The EU Member State in which we have our main establishment is Denmark. The languages that can be used to communicate with Unity are English and Danish.