4.3 (new feature): Automatic Community Moderation

Discussion in 'IPS' started by tony45, Jan 8, 2018.

  1. tony45

    tony45 Participant

    One huge benefit of running your own Invision Community is its suite of moderation tools.

    Out of the box, Invision Community allows you to turn members into moderators. Better still, you can define what these moderators have permission to do.

    Part of this moderation suite is the report system. The report system allows your members to flag posts that need a moderator's attention.

    There comes a time when your community is so successful that it can be a little tough to keep up with all the content and reports.


    Community Moderation

    This new feature leverages your member reports to automatically remove objectionable content from public view.

    As the admin, you define thresholds for the content. For example, you may decide that a post needs 5 reports before it is hidden.

    This reduces the workload for your moderators and enables you to crowdsource moderation.

    Let's take a look at this feature in a little more detail.


    Reporting Content

    When a member reports a piece of content, they now have the option to set a type, such as "Spam" or "Offensive". These options can count towards the threshold. Once the threshold has been passed, the item is hidden.
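
    To make the mechanics concrete, here is a rough Python sketch of the threshold idea. The names and structure are illustrative only, not Invision Community's actual PHP internals:

        from dataclasses import dataclass, field

        COUNTABLE_TYPES = {"Spam", "Offensive"}   # assumed: types that count

        @dataclass
        class ReportedItem:
            reporters: set = field(default_factory=set)
            hidden: bool = False

        def handle_report(item: ReportedItem, reporter_id: int,
                          report_type: str, hide_threshold: int = 5) -> None:
            # Only report types that count toward the threshold are tallied.
            if report_type not in COUNTABLE_TYPES:
                return
            item.reporters.add(reporter_id)     # a set: one member, one point
            if len(item.reporters) >= hide_threshold:
                item.hidden = True              # removed from public view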


    The threshold can be set up by creating rules in the Admin CP.


    Admin Setup

    At the heart of the system are the rules. You can create custom rules in the Admin CP to determine the thresholds.


    For example, you may decide that:

    A member with fewer than 10 posts needs only 5 reports to hide the content.

    But you may want to give more experienced members a higher threshold as there is more trust.

    You simply add a new rule:

    A member who joined over a year ago with over 500 posts needs 10 reports to hide content.

    The rules system makes this easy: it scans all of your rules and picks the one most suitable for the member.
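
    In code terms, the rule scan might look something like this Python sketch (an assumed structure, not the actual ACP rule engine). Each rule pairs member conditions with a threshold, and the most specific matching rule wins:

        from dataclasses import dataclass
        from datetime import datetime, timedelta
        from typing import Callable, List

        @dataclass
        class Member:
            posts: int
            joined: datetime

        @dataclass
        class Rule:
            conditions: List[Callable[[Member], bool]]
            threshold: int                  # reports needed to hide content

        RULES = [
            # "A member with fewer than 10 posts needs only 5 reports."
            Rule([lambda m: m.posts < 10], threshold=5),
            # "A member who joined over a year ago with over 500 posts needs 10."
            Rule([lambda m: m.posts > 500,
                  lambda m: datetime.now() - m.joined > timedelta(days=365)],
                 threshold=10),
        ]

        def threshold_for(member: Member, default: int = 5) -> int:
            matching = [r for r in RULES if all(c(member) for c in r.conditions)]
            if not matching:
                return default              # assumed fallback threshold
            # "Most suitable" is read here as the most specific matching rule.
            return max(matching, key=lambda r: len(r.conditions)).threshold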


    It's as simple as that.


    Notifications

    Once an item has received enough reports to match the threshold, it is automatically hidden from view.


    A notification is sent to all moderators who have opted in to these notifications. The notification shows inline in the notification center.


    It can also optionally be sent via email for those who want to know without checking the site.
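
    The fan-out described above could be sketched like this (hypothetical names throughout; send_email is just a stub):

        def send_email(address: str, subject: str, body: str) -> None:
            print(f"to {address}: {subject}")   # stand-in for a real mailer

        def notify_moderators(item_id: int, moderators: list) -> None:
            for mod in moderators:
                if not mod.wants_auto_hide_alerts:      # opted out entirely
                    continue
                mod.notifications.append(               # inline notification
                    f"Item {item_id} was automatically hidden")
                if mod.email_alerts:                    # optional email copy
                    send_email(mod.email, "Content automatically hidden",
                               f"Item {item_id} reached its report threshold.")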


    Restoring the Content

    Of course, a moderator may decide that the content is fine and un-hide it. Once a piece of content has been un-hidden, automatic moderation will not hide it again.
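
    That behaviour amounts to a one-line guard in the threshold check. In the earlier sketch it could look like this (the restored_by_moderator flag is assumed, not a documented field):

        def should_auto_hide(item, report_count: int, threshold: int) -> bool:
            # Once a moderator has un-hidden an item, it stays visible
            # no matter how many further reports come in.
            return report_count >= threshold and not item.restored_by_moderator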


    Report Types

    Depending on your community, the default types may not be suitable or relevant. You may also want to set up other report types.


    You can do this via the Admin CP.


    Preventing Abuse

    Your first thought may be that a single member can report a single item multiple times to force content to be hidden.

    The system counts each unique member as only one point towards the threshold. A single member can report an item 5 times, but they are still counted just once.
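
    As in the earlier sketch, this is simply counting distinct members rather than raw reports. A quick illustration with made-up data:

        # (item_id, member_id) pairs: member 7 reported the same post 3 times.
        reports = [("post-42", 7), ("post-42", 7), ("post-42", 7),
                   ("post-42", 9), ("post-42", 11)]

        points = len({member for _item, member in reports})
        print(points)   # 3 -- repeat reports collapse to one point each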




    You can also set a time limit between reports of the same item. This prevents a member from reporting a single item multiple times in succession.
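
    A possible shape for that cooldown check, assuming a 24-hour window (the real limit is whatever you configure):

        from datetime import datetime, timedelta

        REPORT_COOLDOWN = timedelta(hours=24)   # assumed; set in the Admin CP
        last_report: dict = {}                  # (member_id, item_id) -> time

        def may_report(member_id: int, item_id: str) -> bool:
            now = datetime.now()
            previous = last_report.get((member_id, item_id))
            if previous and now - previous < REPORT_COOLDOWN:
                return False                    # still inside the time limit
            last_report[(member_id, item_id)] = now
            return True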


    Of course, the member can delete their report if it was in error.


    Report Center

    The Report Center is the hub for all reported content. Invision Community 4.3 adds a filter to view a specific report type. The reports themselves also show the type of report.


    We hope that this new feature will be a huge help and time saver for you and your moderators.

    We'd love to hear your thoughts; please let us know what you think and whether you have any questions.

    https://invisioncommunity.com/news/product-updates/43-automatic-community-moderation-r1059/
     

  2. Alfa1

    Alfa1 Moderator

    That is really nice and compelling!
     
  3. Paul M

    Paul M Dr Pepper Addict

    On the surface, it seems a cool feature - but also open to abuse
    (allowing members to gang up on posts or other members; it happens).

    Of course, you can have rules in place to punish abusers, but that's just creating more work for your team.
    I would be very hesitant to allow this on any forum I ran; I prefer moderation decisions to be taken by moderators, not "crowd driven".
     
  4. Ramses

    Ramses Adherent

    This feature is usergroup permission based, so you could assign some trusted users to a new group.
    Or you can set the number of reports necessary to hide a post to a higher value if you have too many abusive members.
    For my community it would be a very valuable addition.
     
  5. LeadCrow

    LeadCrow Apocalypse Admin

    I have to agree with Paul.

    Automated moderation is risky if it doesn't include a procedure to automatically upgrade and downgrade the worth of a report, or the trustworthiness of the specific users doing the reporting. Users could brigade unpopular users or content, and it would successfully get censored unless a moderator reverted it.

    Imagine how messy things would get if staff weren't around and active enough to fix this. Trolls and bots could quickly hide a forum's content.
     
  6. Alfa1

    Alfa1 Moderator

    I already use much of this functionality on XenForo (I do not have report types, unfortunately): if a post/member receives a lot of negative reputation, or if a post is reported more than X times, then it goes to moderation.

    At the start there was some abuse by members ganging up, but this was really easy to solve. Firstly, I increased the threshold for various usergroups, so that it was beyond the reach of a few members teaming up. At the same time, I publicly announced that abuse is seen as a serious offence.
    That fixed the issue completely.

    Also: it's not complete crowd moderation; it just hides problematic posts from the public until staff have reviewed them. So if it's a false positive, it's no issue. Just approve the post and nothing is lost.
     
  7. we_are_borg

    we_are_borg Moderator

    One way to handle this would be a system that spots patterns in reporting: if too many of the same people report together, a warning is given to moderators so they can check whether people are teaming up.
     
  8. Joel R

    Joel R Adherent

    All it does is hide the post. The crowd moderation does NOT trigger any final moderation like deleting / editing / banning. Moderators are also immediately notified, so they can look at the post in more detail to confirm.

    In general, this is a rules-based system. If you have people constantly abusing your rules, then you need to re-evaluate your rules (as well as your community, but that's another issue).
     