4.3 (new feature): Automatic Community Moderation

Discussion in 'IPS' started by tony45, Jan 8, 2018.

  1. tony45

    tony45 Participant

    One huge benefit of running your own Invision Community is the moderation tools.

    Out of the box, Invision Community allows you to turn members into moderators. Better still, you can define what these moderators have permission to do.

    Part of this moderation suite is the report system. The report system allows your members to flag posts that need a moderator's attention.

    There comes a time when your community is so successful that it can be a little tough to keep up with all the content and reports.

    Community Moderation

    This new feature leverages your member reports to automatically remove objectionable content from public view.

    You, as the admin, define the thresholds for content. For example, you may decide that hiding a post requires 5 reports.

    This reduces your moderators' workload and lets you crowd-source moderation.

    Let's take a look at this feature in a little more detail.

    Reporting Content

    When a member reports a piece of content, they now have the option to set a type, such as "Spam" or "Offensive". These options can count towards the threshold. Once the threshold has been passed, the item is hidden.


    The threshold can be set up by creating rules in the Admin CP.

    Admin Set Up

    At the heart of the system are the rules. You can create custom rules in the Admin CP to determine the thresholds.


    For example, you may decide that:

    A member with fewer than 10 posts needs only 5 reports to hide the content.

    But you may want to give more experienced members a higher threshold as there is more trust.

    You simply add a new rule:

    A member who joined over a year ago with over 500 posts needs 10 reports to hide content.

    You can do that easily with the rules system: it scans all the rules and picks the one most suitable for the member.
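    The rule-matching described above can be sketched roughly like this. This is an illustrative model only, not Invision Community's actual implementation; the `Rule` fields and the "most restrictive qualifying rule wins" choice are assumptions based on the examples given.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Rule:
        min_posts: int             # member must have at least this many posts
        min_account_age_days: int  # account must be at least this old
        reports_to_hide: int       # reports needed before content is hidden

    # The two example rules from the post above.
    RULES = [
        Rule(min_posts=0,   min_account_age_days=0,   reports_to_hide=5),
        Rule(min_posts=500, min_account_age_days=365, reports_to_hide=10),
    ]

    def threshold_for(member_posts: int, account_age_days: int) -> int:
        """Scan all rules and pick the one most suitable for this member."""
        matching = [r for r in RULES
                    if member_posts >= r.min_posts
                    and account_age_days >= r.min_account_age_days]
        # Assumption: when several rules match, the highest (most trusting)
        # threshold applies, so experienced members are harder to silence.
        return max(r.reports_to_hide for r in matching)
    ```

    So a brand-new member would need 5 reports to have content hidden, while a member who joined over a year ago with over 500 posts would need 10.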


    It's as simple as that.


    Once an item has received enough reports to match the threshold, it is automatically hidden from view.


    A notification is sent to all moderators who opt in for notifications. This notification shows inline in the notifications center.


    It can also optionally be sent via email for those who want to know without checking the site.


    Restoring the content

    Of course, a moderator may decide that the content is fine and un-hide it. Once a piece of content has been un-hidden, automatic moderation will not hide it again.

    Report Types

    Depending on your community, the default types may not be suitable or relevant. You may also want to set up other report types.


    You can do this via the Admin CP.

    Preventing Abuse

    Your first thought may be that a single member can report a single item multiple times to force content to be hidden.

    The system counts each unique member as only one point towards the threshold. A single member can report an item 5 times, but they are still counted only once.


    You can also set a time limit between reporting the same item. This will prevent a member reporting a single item multiple times in succession.
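    The two abuse protections just described, unique reporters only, plus a cooldown between repeat reports, could be sketched like this. The class, field names, and one-hour cooldown are hypothetical; the actual product exposes these as Admin CP settings.

    ```python
    from datetime import datetime, timedelta

    COOLDOWN = timedelta(hours=1)  # assumed admin-configurable interval

    class ReportedItem:
        def __init__(self, threshold: int):
            self.threshold = threshold
            self.last_report: dict[int, datetime] = {}  # member_id -> last report
            self.hidden = False
            self.unhidden_by_moderator = False  # once un-hidden, never re-hidden

        def report(self, member_id: int, now: datetime) -> bool:
            """Record a report; return False if rejected by the cooldown."""
            last = self.last_report.get(member_id)
            if last is not None and now - last < COOLDOWN:
                return False  # same member, same item, too soon
            self.last_report[member_id] = now
            # Unique reporters only: the dict keys count once per member.
            if (not self.unhidden_by_moderator
                    and len(self.last_report) >= self.threshold):
                self.hidden = True  # auto-hide once the threshold is reached
            return True
    ```

    Note the `unhidden_by_moderator` flag modelling the earlier point: once a moderator restores a piece of content, automatic moderation will not hide it again.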


    Of course, the member can delete their report if it was in error.


    Report Center

    The Report Center is the hub for all reported content. Invision Community 4.3 adds a filter to view a specific report type. The reports themselves also show the type of report.


    We hope that this new feature will be a huge help and time saver for you and your moderators.

    We'd love to hear your thoughts, please let us know what you think and if you have any questions.


  2. Alfa1

    Alfa1 Moderator

    That is really nice and compelling!
  3. Paul M

    Paul M Dr Pepper Addict

    On the surface, it seems a cool feature - but also open to abuse, allowing members to gang up on posts or other members (it happens).

    Of course, you can have rules in place to punish abusers, but that's just creating more work for your team.
    I would be very hesitant to allow this on any forum I ran; I prefer moderation decisions to be taken by moderators, not "crowd driven".
  4. Ramses

    Ramses Adherent

    This feature is usergroup permission based, so you could assign some trusted users to a new group.
    Or, if you have too many abusive members, you can raise the number of reports needed to hide a post.
    For my community it would be a very valuable addition.
  5. LeadCrow

    LeadCrow Apocalypse Admin

    I have to agree with Paul.

    Automated moderation is risky if it doesn't include a procedure to automatically upgrade and degrade the worth of a report, or the trustworthiness of the specific users doing the reporting. Users could brigade unpopular users or content, and it would successfully get censored unless a moderator reverted it.

    Imagine how messy things would get if staff weren't around and active enough to fix this. Trolls and bots could quickly hide a forum's content.
  6. Alfa1

    Alfa1 Moderator

    I already use much of this functionality on XenForo (I do not have report types, unfortunately): if a post/member receives a lot of negative reputation, or if a post is reported more than X times, it goes to moderation.

    At the start there was some abuse by members ganging up, but this was really easy to solve. Firstly, I increased the threshold for various usergroups so that it was beyond the reach of a few members teaming up. At the same time, I publicly announced that abuse is treated as a serious offence.
    That fixed the issue completely.

    Also: it's not complete crowd moderation; it just hides problematic posts from the public until staff have reviewed them. So if it's a false positive, it's no issue - just approve the post and nothing is lost.
  7. we_are_borg

    we_are_borg Moderator

    One way to handle this would be a system that detects patterns in reporting: if too many of the same people report together, a warning is given to moderators so they can check whether people are teaming up.
  8. Joel R

    Joel R Adherent

    All it does is hide the post. The crowd moderation does NOT trigger any final moderation like deleting / editing / banning. Moderators are also immediately notified, so they can look at the post in more detail to confirm.

    In general, this is a rules based system. If you have people constantly abusing your rules, then you need to re-evaluate your rules (as well as your community, but that's another issue).