Unity Services Content Transparency
Published: February 15, 2024
Last Edit/Review: April 10, 2026
On April 10, 2026, we implemented updates to improve clarity around the UK Online Safety Act (“OSA”). These changes expand definitions of unacceptable content, detail enforcement actions such as suspensions, and revise the set of applicable services. The policy now provides more information on automation and moderation processes, introduces enhanced child safety measures, clarifies reporting procedures, and updates contact information for OSA inquiries.
This page contains policies that set out what content is and isn’t allowed to be provided by a user of Unity Services (collectively, “Provided Content”). Unity Services are all services governed by the Unity Terms of Service. Unity moderates Provided Content in our Offerings based on these policies with the aim of creating a safe online environment for users.
The supplemental policies and principles below constitute “Additional Terms” and are incorporated into the Unity Terms of Service, applicable Additional Terms, or such other applicable agreements between you and Unity Technologies SF (“Unity”) or its applicable Affiliates, which governs your use of the Offerings (collectively, “Unity Terms”). Capitalized terms used but not defined herein have the meanings given such terms in the Unity Terms of Service or the applicable Additional Terms.
Before using any Unity Offerings, please review the following information and ensure that you are in compliance when using our Offerings. Please note that examples provided are for illustrative purposes only and are not exhaustive.
Unacceptable Content
Unacceptable Content means any Prohibited Content or Restricted Content provided by a third party*.
- Prohibited Content is any of Your Content that is not allowed on the product/ service pursuant to the terms applicable to such Offering.
- Restricted Content is any of Your Content that is not allowed on the product/ service pursuant to the terms applicable to such Offering without prior authorization by Unity or without additional requirements or restrictions.
*Please note, Unacceptable Content refers only to content provided by our customers; we do not have policy-based content restrictions for content provided by our customers’ end-users, e.g. a game-player. To understand what content restrictions may apply to our customers' end-users, please review the content policy of that customer.
Prohibited Content
- Intellectual Property and Legal Infractions
- Illegal Content, such as terrorist content, including content that infringes, misappropriates, or attempts to infringe or misappropriate any third-party right, including the intellectual property or proprietary rights of any person (including privacy and publicity rights), or that violates or attempts to violate any applicable laws or regulations.
- Misleading and Deceptive Content
- Fraudulent, false, misleading, deceptive, or defamatory content.
- Unsolicited or unauthorized communications, such as promotional materials, email, junk mail, spam, chain letters or other forms of solicitation.
- Internet “links” to content that is not associated with, connected to, or related to the original content.
- Offensive and Harmful Content
- Hateful or discriminatory content, including offensive content that is based on race, gender, color, religious belief, sexual orientation, or disability.
- Profane or vulgar content.
- Harmful, threatening, obscene, infringing, harassing, disturbing, violent or shocking content.
- Pornographic or sexually suggestive content.
- Malicious and Destructive Content
- Malicious/ Deceptive Software or Operations such as viruses, worms, defects, malware, spyware, malicious code or other destructive content that could have an adverse impact on any software, data, computer systems, networks, hardware, or devices.
- Promotional and Solicitation Content
- Content promoting illegal or harmful activities or substances.
- Content which promotes or incites any of the above.
Restricted Content
- Content that implies Unity’s sponsorship or affiliation, including any form of content utilizing a Unity trademark, logo, URL, or product name.
- Please see our Guidelines for Using Unity Trademarks for more information.
- Content containing personal information, including child, sensitive, or biometric personal information as may be defined by applicable laws.
- Please see the Unity Terms of Service or applicable Additional Terms for when this is authorized.
Actions Due to Unacceptable Content
Depending on the product/ service the Unacceptable Content appears on, Unity may take some or all of the following actions:
- Provide information about a user
- Remove/ disable access to/ restrict visibility of content
- Not approve the content for upload
- Require modification of content
- Suspend/ terminate portions of the service
- Suspend/ terminate an account
- Suspend/ terminate/ restrict monetization of content
- Alert local authorities if we suspect a serious criminal offense
We may remove an account if we have reasonable grounds to infer that an account is operated by or on behalf of a terrorist organization (https://www.gov.uk/government/publications/proscribed-terror-groups-or-organisations--2/proscribed-terrorist-groups-or-organisations-accessible-version). Such accounts will be promptly removed in accordance with our legal obligations, including the Online Safety Act.
In determining whether to suspend or terminate an account, the following circumstances are taken into account:
- Whether the account is responsible for providing Prohibited Content, particularly if such content is illegal.
- Severity of the violation: Unity reserves the right to permanently suspend an account without warning for serious violations.
- Repeated violations: whether the account has repeatedly violated Unity’s content policies.
As a user of the Services, you direct Unity to action content in order to comply with the Digital Services Act (Regulation (EU) 2022/2065) (“DSA”), UK Online Safety Act 2023 (“OSA”) and any other applicable laws. Unity will provide its users of the Unity Offerings with aggregate reporting regarding Unity’s compliance with the DSA upon the user’s request.
If you are subject to an action under Article 17 of the DSA and we have your electronic contact details, we will provide you with a Statement of Reasons.
For users in the United Kingdom, Unity will take additional actions as required under the UK Online Safety Act, including reporting and escalating content that poses a risk of serious harm to children or vulnerable users. Unity will cooperate with Ofcom (the UK’s online safety regulator) as required by law.
Procedures & Measures for Content Moderation
Unity conducts content moderation activities in response to reports. You can see how to report Unacceptable Content below under "Reporting Unacceptable Content".
Unity also has additional procedures and measures for content moderation as outlined below.
AI-Powered Tools, including Assistant
Automation tooling may be used to analyze content in the following ways:
- Proactive, automated AI content classification to enhance user safety. This system automatically scans all user prompts before they are processed by the AI, and all AI-generated responses before they are shown to the user.
- The system works by analyzing text and images to detect content that may violate our policies, specifically looking for content that falls into four primary harm categories: Harassment, Hate Speech, Sexually Explicit, and Dangerous. If content is detected that violates these policies, it is automatically blocked to minimize the risk of users encountering illegal or harmful content. An illustrative sketch of this flow appears after the safeguards list below.
Where we use automation, we ensure safeguards are in place such as:
- Relying on technology that engages in testing, feedback and human reinforcement
- Human determination and review of the threshold criteria
- Communication with those subject to a moderation action
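For illustration only, here is a minimal sketch of the pre- and post-generation gate described above, assuming a hypothetical classifier: the prompt is checked before it reaches the AI, and the response is checked before it is shown, against the four harm categories listed. The names used (classify_harm, moderated_completion, HARM_CATEGORIES) are placeholders and do not represent Unity's actual implementation.

```python
# Illustrative sketch only; all names and logic are hypothetical, not Unity's actual system.

HARM_CATEGORIES = ("Harassment", "Hate Speech", "Sexually Explicit", "Dangerous")


def classify_harm(text: str) -> set:
    """Hypothetical classifier: returns the harm categories detected in `text`.

    A production system would call a trained text/image classifier here; this
    placeholder exists only to show the control flow described above.
    """
    return set()


def moderated_completion(prompt: str, generate_response) -> str:
    """Scan the prompt before generation and the response before display."""
    if classify_harm(prompt) & set(HARM_CATEGORIES):
        return "[blocked: prompt flagged for a harm category]"
    response = generate_response(prompt)
    if classify_harm(response) & set(HARM_CATEGORIES):
        return "[blocked: response flagged for a harm category]"
    return response


# Example: with the placeholder classifier nothing is flagged, so the response passes through.
print(moderated_completion("hello", lambda p: "hi there"))
```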
Unity Gaming Services
Procedures and measures are determined by our customers, who utilize the services in their applications. Below, you can find more information on the tools we provide.
Moderation
The Moderation platform ingests reports from the integrated Vivox service, and our customers can review the reports and determine if or how to act on the detections.
Vivox
Automation tooling may be used to analyze content in the following ways (an illustrative sketch follows the safeguards list below):
- Detect unwanted content based on customer-defined thresholds
- Restrict unwanted content based on customer-defined thresholds
Where we use automation, we ensure safeguards are in place such as:
- Relying on technology that engages in testing, feedback and human reinforcement
- Customer-defined thresholds
- Customer ability to engage in human review
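For illustration only, the sketch below shows threshold-based filtering of the kind described above: each message is scored per category, and anything at or above a customer-defined threshold is flagged for restriction. The scoring function and category names are assumptions for the example and are not part of the Vivox API.

```python
# Illustrative sketch only; the scoring function and categories are hypothetical.
from typing import Callable, Dict


def should_restrict(
    message: str,
    score_message: Callable[[str], Dict[str, float]],
    thresholds: Dict[str, float],
) -> bool:
    """Return True if any per-category score meets the customer-defined threshold."""
    scores = score_message(message)
    return any(scores.get(category, 0.0) >= limit for category, limit in thresholds.items())


# Example: a customer restricting messages that score at or above 0.8 on "profanity".
thresholds = {"profanity": 0.8, "harassment": 0.9}
print(should_restrict("hello team", lambda text: {"profanity": 0.1}, thresholds))  # False
```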
Reporting Unacceptable Content
If you are residing in the European Union or in the United Kingdom, you can report Unacceptable Content here. Once received, a member of our team will review the report and take any necessary action. There is no automation or algorithmic decision making in this process. We aim to resolve complaints, where possible, within 1 to 7 days, depending on severity, ensuring they are addressed without undue delay.
When submitting a report, please include the following information to assist Unity in identifying Unacceptable Content (an illustrative example follows the list):
- Cloud SDK and Cloud Python SDK:
- appID
- appName
- Asset Manager:
- project id
- organization id
- if available: asset id & asset version
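For illustration only, the identifiers above could be gathered along these lines before submitting a ticket; every value shown is a placeholder, not a real identifier.

```python
# Illustrative placeholders only; substitute the identifiers from your own project.

cloud_sdk_report = {
    "appID": "<your app ID>",
    "appName": "<your app name>",
}

asset_manager_report = {
    "project_id": "<your project ID>",
    "organization_id": "<your organization ID>",
    # Include these only if available:
    "asset_id": "<asset ID>",
    "asset_version": "<asset version>",
}
```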
Additional reporting mechanisms are outlined below.
Please note, while the below are valid mechanisms for reporting content, they are not intended to satisfy a Notice and Action mechanism (Article 16) under the DSA. For Article 16 notices, please use the European Union mechanism listed above.
Vivox
You can additionally report content through in-app mechanisms as may be configured by our customer. Please note, these reports will be sent to our customer to review. Unity will not take action on such reports; if you wish to make a report to be reviewed by Unity, please do so through the ticketing system.
Appealing Content Moderation Restrictions
If you are residing in the European Union or the United Kingdom and believe we have made an incorrect decision about a content moderation restriction imposed on your content or account, you may submit an appeal here within six months of being notified of the restriction. When you submit an appeal, it will be reviewed by a member of our team. Regardless of the outcome, you will be notified of our decision as well as available possibilities for redress.
Unity Gaming Services
If the moderation action was taken by our customer, you should submit any appeals with them.
Termination of Services
You can terminate your use of the Services by giving notice. You can find the grounds for termination as well as any notice requirements in the applicable Terms of Service.
- Unity Terms of Service
- Parsec supplemental provisions
- Mars supplemental provisions
- Unity AI supplemental provisions
- Unity Pulse supplemental provisions
- Unity Simulation Services supplemental provisions
- Unity Consulting Services supplemental provisions
- Unity Certification Program supplemental provisions
- Unity Learning Partner Instructor-Led Training supplemental provisions
- Unity Self-Paced Courseware Additional Terms supplemental provisions
Digital Services Act
This section sets out the provisions applicable to individuals residing in the European Union under the Digital Services Act (“DSA”).
Transparency Reports
Unity has prepared the following transparency reports to comply with our obligations under the DSA.
Redress Options
If you are an individual or entity residing in the EU, you will have a number of redress options available as outlined in this section.
The redress options do not preclude you from seeking judicial redress or any rights under the Unity Terms or such other applicable agreements between you and Unity or its applicable Affiliates, which governs your use of the Service.
Notices submitted under Article 16
If you submitted an Article 16 notice through Unity’s Content Report Ticketing System and have concerns regarding the decision made, you may submit a complaint here.
Appealing a Decision
If we have taken an action on your content or account and you wish to appeal it, you may submit an appeal here within six (6) months of the action. The appeal should include the following information:
- Your contact information;
- Identification of the content and moderation action in question;
- A statement explaining the reasons why you believe the content or account was wrongfully removed/disabled; and
- Any supporting evidence or legal arguments to substantiate your claims.
Out of Court Dispute Settlement
If you remain dissatisfied with the outcome of our internal review, you have the option to engage in a dispute settlement process outside of the court system. This is a non-binding process that allows you to have your dispute reviewed by a neutral third party. You are entitled to select any out-of-court dispute settlement body that has been certified by your Member State.
Judicial Proceeding
If you believe that your concerns are not adequately addressed through our internal mechanisms or out-of-court settlement, you have the option to pursue legal action through the appropriate legal channels, such as filing a lawsuit or complaint in accordance with the applicable laws and regulations.
Online Safety Act
Judicial Proceeding
You have an option to bring a claim to the appropriate court for breach of contract if:
- anything that you generate, upload or share is taken down, or access to it is restricted, in breach of the terms of service, or
- you are suspended or banned from using the service in breach of the terms of service.
Suspension of Users under the DSA & OSA
Submitting Manifestly Unfounded Notices & Complaints
If you misuse our complaint notification system by frequently submitting complaints that are manifestly unfounded, we may suspend your access to the complaint notification system. We will notify you prior to enacting a suspension.
We consider three unfounded notices or two unfounded notices alleging offenses referred to in Articles 3 to 7 of Directive 2011/93/EU to be sufficient for a suspension.
We also take into account the safety duties regarding illegal content in Section 10 of the OSA, which require us to take proportionate measures, including blocking users from accessing the service or particular content, to prevent, mitigate, and manage the risks of illegal content and harm to individuals.
A suspension will last thirty (30) days, and we will issue a warning prior to enacting it. A second suspension will increase to sixty (60) days, a third suspension to ninety (90) days, and so on.
Designated Point of Contact
Pursuant to Articles 11 and 12 of the DSA, the DSA Compliance Lead has been designated as Unity’s point of contact for communications with Member State authorities, the European Commission, the European Board for Digital Services, and recipients of the service.
- For inquiries, please email DigitalServicesAct@unity3d.com
- To submit reports under the DSA, please do so here.
The EU Member State in which we have our main establishment is Denmark. The language(s) which can be used to communicate with Unity are English and Danish.
Pursuant to Section 21 of the OSA and this content policy, you are able to submit reports of illegal content here (LINK TO PORTAL). This mechanism is available to users and also serves as our designated point of contact for UK authorities.
Child Safety and Content Moderation
Unity does not design its services for children or expect children to use them.
If children (those under 18) can use a Unity experience, these child safety rules apply to both users and creators, and are designed to follow laws like the OSA and DSA.
Unity reviews its services and features to find things that could be unsafe for children and teens. This includes illegal content, like child sexual abuse material, and dangerous behavior, like grooming. Unity also adds safety features that are meant to fit different age groups and help protect young people from harmful content or contact.
As explained above, some kinds of content and behavior are not allowed. These include:
- Child sexual abuse material (CSAM)
- Sexual content involving minors, whether real or fictional
- Pornographic content
- Content about harmful substances
- Grooming
- Predatory behavior
- Bullying
- Content that encourages hate
- Sharing private information about minors, like their location or personal details
- Abusive content
- Violent content, especially violence toward people or animals, or content that gives instructions for violence
- Content that encourages eating disorders or harmful body shame
- Content that encourages self-harm, suicide, or dangerous behavior by young people
- Any other content that could seriously harm children
Unity uses human moderators to find and block content or behavior that breaks these rules. If content breaks the rules, Unity may remove it or block people from seeing it, try to stop it from being uploaded again, and keep evidence if the law requires it. If Unity finds child sexual abuse material, we will report it to the proper authorities as required by law.
Unity has reporting tools you can use to report unsafe situations involving children or content that breaks the rules and includes minors. Reports about minors are treated as especially important. When possible, Unity may update the person who made the report and share links to safety help. If someone posts, shares, or helps spread harmful or banned content, Unity may warn them, limit what their account can do, suspend their account, or permanently ban it. If there is a serious and immediate danger, Unity may also contact the proper authorities, as required by law.
Unity keeps records of its safety and moderation decisions, including ones about child safety. We check whether our safety rules and tools are working and update them when risks change or when the law, such as the DSA and OSA, requires it. Unity will also make important updates and share enforcement information when needed. Everyone using Unity services must follow these rules and the law. If someone breaks them, Unity may remove content, limit account features, ban the account, or report the issue to law enforcement.