Introduction

Fandom is the world’s largest platform for in-depth information about gaming, TV, movies, and pop culture. It comprises a network of fan-curated wikis (editable sites), forums, and other venues: a one-stop shop for fans of all types to explore and discuss their favorite stories, characters, and lore.

Fandom encompasses over 50 million content pages covering every fictional universe ever created. Our user-generated content spans Fandom wikis and discussions, as well as GameFAQs, GameSpot, Comic Vine, Metacritic, and Giant Bomb. The latter five properties are referred to in this report as “Fandom Brands”.

This report covers the dates February 1, 2024 to January 31, 2025.

Fandom discloses other elements of our site moderation via our primary Transparency Report, which is linked here. The following statements are provided specifically for compliance with the European Union’s Digital Services Act.

Orders from EU Member States

During the covered period, Fandom did not receive any orders from Member States’ authorities, either in the form of “Orders to act against illegal content” (DSA Article 9) or “Orders to provide information” (Article 10). 

Notice Submissions

The following are the numbers and details from our overall support system for 2/1/2024 – 1/31/2025.

Type of problem reported (Fandom wikis and discussions)   Notices via support desk   Notices via Trusted Flaggers   Notices processed automatically
Copyright or trademark issue (DMCA)                       214                        0                              0
Request to be forgotten (DMCR)                            589                        0                              0
Underage user (GDPR and equivalents)                      252                        0                              0
Private information exposure                              669                        0                              0

Our median time for taking action on these notices was 595 minutes (just under ten hours).

Replies to requests we receive are handled by our support staff or escalated to the Trust and Safety Team or other relevant departments. Where necessary, the Legal Team is consulted.

For DMCA takedown requests, the support team assesses the request to ensure it is legally compliant and, if it is, removes the content. The reporter is notified of the removal by email.

The uploader’s name is recorded and administrative action is taken: a notice is posted on the public removal log for each violation, an official warning is issued on the second offense, and the user is banned on the third.

“Requests to be forgotten” are assessed by support staff to ensure they are genuine and that the user resides in a country or state where this right applies. If the request is valid, we use a dedicated tool to remove all details of that user from wikis and discussions.

Underage users (under 16 in the European Economic Area; under 13 in most other cases) have their accounts disabled and any private personal information removed when they are identified.

Private information and explicit content are removed by support staff as needed.

Issues on Fandom Brands are handled separately; these included 15 DMCA notices in the covered timeframe. Other issues were not tagged individually, so they are not included in this report.

Content Moderation

On Fandom wikis and discussions, most moderation is performed by the platform’s users. Some users are granted additional tools to better moderate the wikis where they are active, but content is managed collaboratively, with all users empowered to moderate it.

Our most trusted users utilise scripts and bots, including machine learning tools, to identify problem content. Final actions (removing content, banning users, etc.) are taken manually, except where the probability of bad-faith editing is very high, in which case action can be taken automatically. These users can also build tools that speed up and improve moderation on the site, for example a one-click option to close wikis that violate our creation policies.

In addition to user moderation, images uploaded to the site are scanned by machine learning models, which remove images deemed to be outside our Terms of Use: on average 0.01% of the roughly 2 million images uploaded each month, or about 200 images. Images scored as borderline are directed to manual reviewers for further review. Images that do not pass this second-level assessment are escalated to our Trust and Safety Team for a final decision.
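
As a rough illustration of this tiered pipeline, the first-level routing could be sketched in Python as below. The thresholds, names, and score semantics are illustrative assumptions, not Fandom's actual implementation.

    # Illustrative sketch of tiered image-review routing (assumed values).
    AUTO_REMOVE_THRESHOLD = 0.98  # assumed: near-certain policy violation
    BORDERLINE_THRESHOLD = 0.60   # assumed: uncertain scores go to humans

    def route_image(violation_score: float) -> str:
        """Route an uploaded image by its model-assigned violation score."""
        if violation_score >= AUTO_REMOVE_THRESHOLD:
            return "remove"         # deemed outside the Terms of Use
        if violation_score >= BORDERLINE_THRESHOLD:
            return "manual_review"  # borderline: trained reviewers decide,
                                    # escalating failures to Trust and Safety
        return "publish"            # passes first-level review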

Manual reviewers are trained by the Trust and Safety Team using in-house documentation and are monitored for accuracy.

The Fandom Brands also enable volunteer users to moderate forums and other user-generated content. Actions taken by moderators include setting users to “ignored” status, removing content (including “nuking”, i.e. removing all content from a specified user), and warning and banning users. These actions are logged and can be reviewed by staff. In the case of GameFAQs, all bans must be reviewed by staff before implementation.

Moderator Actions on Fandom Brands, 2024
Month        GameFAQs    GameSpot    Comic Vine    Giant Bomb
January      1,792       794         1,413         928
February     2,063       832         1,432         953
March        2,032       892         1,397         935
April        1,996       791         1,684         803
May          2,061       897         1,710         830
June         1,640       878         2,639         1,034
July         2,239       832         1,666         1,047
August       1,530       873         1,922         995
September    1,616       827         1,340         822
October      1,757       805         1,375         887
November     1,194       731         1,336         850
December     940         757         1,766         983

Restricting Access

Restrictions on Fandom consist of bans on registered users or on users identified by their IP address, preventing them from using the interactive aspects of the network. We do not differentiate between users from the EU and those from other countries, so the numbers below reflect global actions.

For Fandom wikis and discussions, we banned 367,081 accounts and logged-out visitors between 2/1/2024 and 1/31/2025. This included 299,622 blocks for posting spam to the site, 14,048 for adding problem content or other damaging submissions, and 10,272 for other Terms of Use violations.

For Fandom Brands, the number of moderator actions is combined into one figure for each property as shown above.

Automated Content Moderation

As noted above, we use automated systems for a first-level review of all images uploaded to Fandom wikis and discussions. We are also currently trialing a system that will do the same for text, based on an assessment of high-risk submissions (for example, those made by very new users). Our volunteer users also utilise machine learning and filtering scripts to find spam wikis and content. These are then reviewed manually, unless the content is found to be very high risk, in which case it is deleted automatically.
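
A sketch of what such high-risk gating might look like, assuming purely illustrative signals and cutoffs (e.g. the seven-day account-age window below is an assumption, not a stated policy):

    # Illustrative sketch of flagging text submissions for automated
    # first-level review; signals and cutoffs are assumptions.
    from datetime import datetime, timedelta

    NEW_ACCOUNT_WINDOW = timedelta(days=7)  # assumed "very new user" cutoff

    def is_high_risk(account_created: datetime, prior_edits: int,
                     now: datetime) -> bool:
        """Flag submissions from very new or no-history accounts for review."""
        return now - account_created < NEW_ACCOUNT_WINDOW or prior_edits == 0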

We also automatically delete wikis that were created and then abandoned by their founders: wikis with very little content, few visitors, and little activity.
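
A minimal sketch of the kind of check that could flag such a wiki, assuming illustrative thresholds for content, traffic, and inactivity (none of these values come from this report):

    # Illustrative sketch of an abandoned-wiki check; field names and
    # thresholds are assumptions, not Fandom's actual criteria.
    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class WikiStats:
        page_count: int        # content pages on the wiki
        monthly_visitors: int  # unique visitors in the last month
        last_edit: datetime    # timestamp of the most recent edit

    def is_abandoned(wiki: WikiStats, now: datetime) -> bool:
        """True if the wiki has very little content, few visitors,
        and no recent activity (assumed thresholds)."""
        return (wiki.page_count < 5
                and wiki.monthly_visitors < 10
                and now - wiki.last_edit > timedelta(days=180))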

Appeals

In the period 2/1/2024 – 1/31/2025, we did not have a formal internal appeals system; appeals were received on an ad hoc basis via webform and email. The number of queries tagged “user block” in our support system was 3,748, which approximates the number of ban appeals we received, though it also includes other queries about bans. We did not record detailed information about the basis of each complaint, the decisions taken, or the number of times the initial decision was reversed. This total covers all appeals worldwide, not only those from the EU.

In the period in question, no disputes were submitted to out-of-court dispute settlement bodies.