Online Community Moderation: How to deal with disruptive members
Mike Harrower in Community engagement · Dec 17, 2019 · 9 min read
Online community moderation is a full-time job, and larger businesses often have entire departments dedicated to the task. But these efforts can pay off enormously if you take a strategic approach to online community moderation and management.
Any online community can attract undesirable types who, hiding behind the anonymity of the internet, indulge in all manner of bad behaviour. With spammers, trolls, and social engineering scammers presenting a constant threat, brands need to be proactive when it comes to protecting themselves and their customers. That’s where online community moderation comes in.
But there’s more to running a healthy community than simply laying down and enforcing the rules. That’s why you also need community management. Whereas community moderators help to maintain a positive atmosphere by cracking down on disruptive behaviour and assisting managers in their efforts, community managers take more of a leadership role. In doing so, they’re often tasked with creating community rules and guidelines, defining responsibilities, and leading the conversation by setting a good example.
Know when to draw the lines
Before you can deal with disruptive community members, you first need to decide exactly what constitutes bad behaviour. This definition won’t just define your guidelines and policies; it will also set the standards for your community moderators and managers. While some provocation is obvious and outright, not all cases of trolling, for example, are actually intentional. After all, people often feel more comfortable speaking their mind in online forums and social networks than they do in real life.
One of the most important things in any brand community is to recognise the line between constructive criticism and abuse. A brand community should exist for the benefit of both you and your customers, and no one’s going to stick around long if they feel they’re being silenced. Occasional negative feedback, for example, can actually be beneficial to a business, since it helps them identify key customer concerns and improve their offers accordingly.
Sometimes, it’s necessary to have more than one community manager and/or moderator to weigh in on more contentious discussions and ensure they keep on track.
Other infractions are clearer but often not especially severe. For example, people posting off-topic in the wrong forum or group can become a problem but are rarely ill-intentioned. Another common frustration is when certain people post excessively so that no one else has much of a chance to get heard. Again, while not usually malicious, such behaviour can be disruptive to your community if it’s left unchecked.
Keep your community rules and guidelines visible
It’s an unfortunate yet unavoidable fact that most people will blissfully ignore your community rules, but that doesn’t mean you shouldn’t have them. If nothing else, an acceptable use policy provides some recourse if you ever need to warn or ban a member. Your rules and guidelines should always be clearly visible and should take precedence over everything else, save for the key value proposition of joining your community in the first place. A short reminder should ideally be visible on every page, and the entire policy should itself be as short and concise as possible.
You can be sure that no one’s going to read through reams of legalese to get into your brand community! Another tactic is to make your community guidelines a regular part of your discussions. Chances are, there’ll be many situations where you can refer to them and, in doing so, encourage members to learn more about how you handle moderation.
Enable peer-to-peer moderation
Online community moderation quickly becomes a practical impossibility at scale. When moderators find themselves working overtime trying to remove spam comments or tackle minor infractions, the chance of human error increases too. In the end, moderators and managers can spend more time policing their communities than getting actively involved in the discussions.
To prevent this problem and keep your community healthy, you need to think about scalability from the outset. And that shouldn’t mean constantly increasing your roster of moderators. The better approach is to enable peer-to-peer moderation to let your members choose what sort of content they find most valuable. Social media ‘likes’ are the most basic form of peer-to-peer moderation, since they provide a quick and simple endorsement. Other platforms, like Reddit and most community forums, allow members to upvote and downvote content so that the content with the highest ratings earns the most visibility. Others go even further and add gamification tactics like ranks, karma points, and badges. Finally, any online community should include a reporting feature to allow members to flag spam and other policy infractions.
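As a minimal sketch of the idea (not any particular platform’s implementation), peer-to-peer moderation can be as simple as ranking a feed by net votes and holding heavily reported posts back for review. All the names, fields, and thresholds below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    upvotes: int = 0
    downvotes: int = 0
    reports: int = 0

    @property
    def score(self) -> int:
        # Net community rating: upvotes minus downvotes.
        return self.upvotes - self.downvotes

def rank_feed(posts, report_threshold=3):
    """Order posts by community rating; posts reported at least
    `report_threshold` times are held back for moderator review."""
    visible = [p for p in posts if p.reports < report_threshold]
    return sorted(visible, key=lambda p: p.score, reverse=True)

posts = [
    Post("alice", "Helpful tip", upvotes=10, downvotes=1),
    Post("bob", "Spammy link", downvotes=2, reports=5),
    Post("carol", "Good question", upvotes=4),
]
print([p.author for p in rank_feed(posts)])  # ['alice', 'carol']
```

The community does the ranking; a moderator only needs to look at what crosses the report threshold.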
With peer-to-peer moderation enabled, moderators and managers will have to spend less time dealing with otherwise trivial issues. Instead, they’ll be able to focus on proactively modelling the behaviour they want to see.
Lead by example to provide a sense of purpose
Brand communities are driven by a unified sense of purpose, whether that’s to offer peer-to-peer support, provide a space for feedback and testing, or facilitate customer success. When a customer visits your online community for the first time, they’ll often do what other members are doing. That’s why positive developments don’t come from rules and guidelines alone, but from the actions of your managers, moderators, and super users. To get started on the right track, team members should always greet new members. In fact, regular participation from brand representatives is a must for maintaining a healthy community.
You can also add some consistency to the mix by organising thematic online events and discussion threads, such as Monday Motivations or Throwback Thursdays. Another highly effective way to lead by example is to highlight user-generated content (UGC). This includes any posts, pictures, or other content shared by your community members. It can be something as simple as a motivational meme or as complex as an in-depth video review of your latest product. Most of the time, members who have gone to the effort of creating a great post will appreciate the recognition. With UGC in the spotlight, the ability to set a good example shifts to your most valuable community members, which simplifies online community moderation.
How to ban the most troublesome users
There will come a time when every community manager or moderator has to unleash the banhammer. While a conventional ban should keep out most trolls and spammers, others might not be so quick to get the message. The most determined malefactors will just keep coming back, opening new accounts and even going so far as to hide their real IP addresses. That’s where the benefits of shadow banning come into play. Shadow banning is a discreet approach to online community moderation that bans users without their knowledge: they can keep posting, but their posts are hidden from the rest of the community. Their determined efforts to damage your community fall on deaf ears, discouraging them from further trolling.
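The mechanics are simple to sketch. In this hypothetical Python example, a shadow-banned member’s posts are filtered out of every feed except their own, so the ban is not obvious to them:

```python
# Members whose posts are hidden from everyone except themselves.
shadow_banned = {"troll42"}

def visible_posts(posts, viewer):
    """Filter a feed for a given viewer. A shadow-banned author still
    sees their own posts, so the ban is invisible to them."""
    return [
        p for p in posts
        if p["author"] not in shadow_banned or p["author"] == viewer
    ]

feed = [
    {"author": "alice", "text": "Welcome!"},
    {"author": "troll42", "text": "Inflammatory bait"},
]
print([p["author"] for p in visible_posts(feed, viewer="alice")])    # ['alice']
print([p["author"] for p in visible_posts(feed, viewer="troll42")])  # ['alice', 'troll42']
```

From the troll’s side, nothing looks wrong; from everyone else’s side, the disruption simply never appears.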
Blatant but less serious policy infractions are usually best met with an initial warning from one of your moderators. Further disruptive behaviour might then result in a temporary or permanent ban. Temporary bans are ideal for giving troublesome members a chance to cool off and re-evaluate their behaviour, and keeping an automatic record of warnings per member saves a lot of time over manually keeping notes.
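A minimal sketch of such an escalation ladder might look like this; the thresholds (one warning, then a temporary ban, then a permanent one) are illustrative, not a recommendation, and the automatic per-member log is what replaces manual note-keeping:

```python
from collections import defaultdict

# Per-member infraction log, so no one has to keep notes by hand.
warnings = defaultdict(int)

def escalate(member):
    """Record an infraction and return the resulting action."""
    warnings[member] += 1
    if warnings[member] == 1:
        return "warning"
    if warnings[member] == 2:
        return "temporary_ban"  # e.g. a 7-day cooling-off period
    return "permanent_ban"

print(escalate("bob"))  # warning
print(escalate("bob"))  # temporary_ban
print(escalate("bob"))  # permanent_ban
```

Every moderator applies the same ladder, so members are treated consistently no matter who handles the report.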
Combined, these tactics will help you maintain a healthier and more productive community to the benefit of both your brand and its biggest fans.
Disciple social spaces help brands enjoy all the benefits of community with an independent, valuable, and trusted platform in a safe space that they own and control. Start building your brand community by telling us about your goals.
Improve Your Online Community Moderation with Disciple
The need for online community moderation is not open for debate. Without effective moderation, community building efforts are destined to fail as loose cannons drive away valued community members along with their friends and associates.
Community builders need to devise effective community moderation guidelines and then have a simple and effective way to implement them. That is exactly what you are able to do when you choose Disciple as your online community platform.
Enforce Community Guidelines with Disciple Moderation Features
Disciple's suite of online community moderation tools is extensive and puts control over user-generated content squarely in your hands where it belongs.
Of course, with control over user-generated content comes the responsibility to effectively prevent members from infringing on copyrights, bullying or badgering other members, or generally being a counterproductive member of the community.
Disciple moderation tools provide the ability to fulfil all forum moderator responsibilities in a simple and timely fashion. Those tools include, but are not limited to:
Shadow banning - Shadow banning is when the moderator decides to make a post invisible to everyone but the person who posted it. This is a subtle way to prevent questionable content from creating ill-will within the community.
Removal of content - Forum moderator responsibilities include the need to occasionally disable or even unpublish user content. Disciple provides this ability, as well as the ability to reverse these actions if the issue that created the problem is resolved in a satisfactory fashion.
Email verification - It’s common for blocked users on social media or online forums to try and return using fake email addresses. Disciple provides instant email verification that prevents this from happening.
Automatic post scores - Disciple community software incorporates Google AI technology that automatically assigns an “appropriateness” score to every post. Moderators can use these scores to help them make online community moderation decisions.
Members blocking members - If for any reason a member wishes to block content generated by another user, they have the ability to do so with Disciple. They can also report other members’ posts if they feel those posts violate community guidelines.
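To illustrate how automated scores can assist, rather than replace, a human moderator, here is a hypothetical triage routine. The thresholds, and the assumption that higher scores mean more appropriate content on a 0-to-1 scale, are ours for illustration, not Disciple’s actual behaviour:

```python
def triage(score, publish_at=0.8, review_at=0.3):
    """Route a post by an automated appropriateness score in [0, 1],
    where higher means more appropriate. Thresholds are illustrative."""
    if score >= publish_at:
        return "publish"
    if score >= review_at:
        return "queue_for_review"
    return "hold_for_moderator"

print(triage(0.95))  # publish
print(triage(0.5))   # queue_for_review
print(triage(0.1))   # hold_for_moderator
```

Clear-cut posts go straight through, and human judgement is reserved for the ambiguous middle band.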
With Disciple, you’re able to provide a lively environment that is both safe and responsible.
Online community moderation is a full-time job, and larger businesses often have entire departments dedicated to the task. But these efforts can pay off enormously if you take a strategic approach to online community moderation and management.
Any online community can attract undesirable types who, hiding behind the anonymity of the internet, indulge in all manner of bad behaviour. With spammers, trolls, and social engineering scammers presenting constant threat, brands need to be proactive when it comes to protecting themselves and their customers. That’s where online community moderation comes in.
But there’s more to running a healthy community than simply laying down and enforcing the rules. That’s why you also need community management. Whereas community moderators help to maintain a positive atmosphere by cracking down on disruptive behaviour and assisting managers in their efforts, community managers take more of a leadership role. In doing so, they’re often tasked with creating community rules and guidelines, defining responsibilities, and leading the conversation by setting a good example.
Know when to draw the lines
Before you can deal with disruptive community members, you first need to decide exactly what constitutes bad behaviour. This definition won’t just define your guidelines and policies; it will also set the standards for your community moderators and managers. While some provocation is obvious and outright, not all cases of trolling, for example, are actually intentional. After all, people often feel more comfortable speaking their mind in online forums and social networks than they do in real life.
One of the most important things in any brand community is to recognise the line between constructive criticism and abuse. A brand community should exist for the benefit of both you and your customers, and no one’s going to stick around long if they feel they’re being silenced. Occasional negative feedback, for example, can actually be beneficial to a business, since it helps them identify key customer concerns and improve their offers accordingly.
Sometimes, it’s necessary to have more than one community manager and/or moderator to weigh in on more contentious discussions and ensure they keep on track.
Other infractions are clearer but often not especially severe. For example, people posting off-topic in the wrong forum or group can become a problem but are rarely ill-intentioned. Another common frustration is when certain people post excessively so that no one else has much of a chance to get heard. Again, while not usually malicious, such behaviour can be disruptive to your community if it’s left unchecked.
Keep your community rules and guidelines visible
It’s an unfortunate yet unavoidable fact that most people will blissfully ignore your community rules, but that doesn’t mean you shouldn’t have them. If nothing else, an acceptable use policy provides some recourse if you ever need to warn or ban a member. Your rules and guidelines should always be clearly visible and should take precedence over everything else, save for the key value proposition of joining your community in the first place. A short reminder should ideally be visible on every page, and the entire policy should itself be as short and concise as possible.
You can be sure that no one’s going to read through reams of legalese to get into your brand community! Another tactic is to make your community guidelines a regular part of your discussions. Chances are, there’ll be many situations where you can refer to them and, in doing so, encourage members to learn more about how you handle moderation.
Enable peer-to-peer moderation
Online community moderation quickly becomes a practical impossibility at scale. When moderators find themselves working overtime trying to remove spam comments or tackle minor infractions, the chance of human error increases too. In the end, moderators and managers can pend more time policing their communities than getting actively involved in the discussions.
To prevent this problem and keep your community healthy, you need to think about scalability from the outset. And that shouldn’t mean constantly increasing your roster of moderators. The better approach is to enable peer-to-peer moderation to let your members choose what sort of content they find more valuable. Social media ‘likes’ are the most basic form of peer-to-peer moderation, since they provide a quick and simple endorsement. Other platforms, like Reddit and most community forums, allow members to upvote and downvote content so that with the highest ratings earns the most visibility. Others go even further to add gamification tactics like ranks, karma points, and badges. Finally, any online community should include a reporting feature to allow members to flag spam and other policy infractions.
With peer-to-peer moderation enabled, moderators and managers will have to spend less time dealing with otherwise trivial issues. Instead, they’ll be able to focus on proactively modelling the behaviour they want to see.
Lead by example to provide a sense of purpose
Brand communities are driven by a unified sense of purpose, whether that’s to offer peer-to-peer support, provide a space for feedback and testing, or facilitate customer success. When a customer visits your online community for the first time, they’ll often do what other members are doing. That’s why positive developments don’t come from rules and guidelines alone, but from the actions of your managers, moderators, and super users. To get started on the right track, team members should always greet new members. In fact, regular participation from brand representatives is a must for maintaining a healthy community.
You can also add some consistency to the mix by organising thematic online events and discussion threads, such as Monday Motivations or Throwback Thursdays. Another highly effective way to lead by example is to highlight user-generated content (UGC). This includes any posts, pictures, or other content shared by your community members. It can be something as simple as a motivational meme or as complex as an in-depth video review of your latest product. Most of the time, members who have gone to the effort of creating a great post will appreciate the recognition. With UGC in the spotlight, the ability to set a good example shifts to your most valuable community members, which simplifies online community moderation.
How to ban the most troublesome users
There will come a time for every community manager or moderator to unleash the banhammer. While conventional banning should keep out most trolls and spammers, there are others who might not be so quick to get the message. The most determined of malefactors will just keep coming back, opening new accounts and even going so far as to hide their real IP addresses. That’s where the benefits of shadow banning come into play. Shadow banning is a discreet approach in online community moderation that bans users without their knowledge. Instead, they can keep posting, but their posts will be hidden from the community. Instead, their determined efforts to damage your community will land on deaf ears, thus discouraging them from further trolling.
Blatant but less serious policy infractions are usually best off being met with an initial warning from one of your moderators. Further disruptive behaviour might then result in a temporary or permanent ban. Temporary bans are ideal for giving troublesome members a chance to cool off and reevaluate their behaviour. This saves a lot of time over manually keeping notes.
Combined, these tactics will help you maintain a healthier and more productive community to the benefit of both your brand and its biggest fans.
Disciple social spaces help brands enjoy all the benefits of community with an independent, valuable, and trusted platform in a safe space that they own and control. Start building your brand community by telling us about your goals.
Improve Your Online Community Moderation with Disciple
The need for online community moderation is not open for debate. Without effective moderation, community building efforts are destined to fail as loose cannons drive away valued community members along with their friends and associates.
Community builders need to devise effective community moderation guidelines and then have a simple and effective way to implement them. That is exactly what you are able to do when you choose Disciple as your online community platform.
Enforce Community Guidelines with Disciple Moderation Features
Disciple's suite of online community moderation tools is extensive and puts control over user-generated content squarely in your hands where it belongs.
Of course, with control over user-generated content comes the responsibility to effectively prevent members from infringing on copyrights, bullying or badgering other members, or generally being a counterproductive member of the community.
Disciple moderation tools provide the ability to fulfil all forum moderator responsibilities in a simple and timely fashion. Those tools include, but are not limited to:
Shadow banning - Shadow banning is when the moderator decides to make a post invisible to everyone but the person who posted it. This is a subtle way to prevent questionable content from creating ill-will within the community.
Removal of content - Forum moderator responsibilities include the need to occasionally disable or even unpublish user content. Disciple provides this ability, as well as the ability to reverse these actions if the issue that created the problem is resolved in a satisfactory fashion.
Email verification - It’s common for blocked users on social media or online forums to try and return using fake email addresses. Disciple provides instant email verification that prevents this from happening.
Automatic post scores - Disciple community software incorporates Google AI technology that automatically assigns an “appropriateness” score to every post. Moderators can use these scores to help them make online community moderation decisions.
Members blocking members - If for any reason a member wishes to block content generated by another user they have the ability to do so with Disciple. They can also report other member’s posts if they feel those posts violate community guidelines.
With Disciple, you’re able to provide a lively environment that is both safe and responsible.
Mike
Harrower
in
Dec 17, 2019
9
min read
Community engagement
Mike
Harrower
in
Community engagement
Dec 17, 2019
9
min read
See how a Disciple community app can elevate your business
Online community moderation is a full-time job, and larger businesses often have entire departments dedicated to the task. But these efforts can pay off enormously if you take a strategic approach to online community moderation and management.
Any online community can attract undesirable types who, hiding behind the anonymity of the internet, indulge in all manner of bad behaviour. With spammers, trolls, and social engineering scammers presenting constant threat, brands need to be proactive when it comes to protecting themselves and their customers. That’s where online community moderation comes in.
But there’s more to running a healthy community than simply laying down and enforcing the rules. That’s why you also need community management. Whereas community moderators help to maintain a positive atmosphere by cracking down on disruptive behaviour and assisting managers in their efforts, community managers take more of a leadership role. In doing so, they’re often tasked with creating community rules and guidelines, defining responsibilities, and leading the conversation by setting a good example.
Know when to draw the lines
Before you can deal with disruptive community members, you first need to decide exactly what constitutes bad behaviour. This definition won’t just define your guidelines and policies; it will also set the standards for your community moderators and managers. While some provocation is obvious and outright, not all cases of trolling, for example, are actually intentional. After all, people often feel more comfortable speaking their mind in online forums and social networks than they do in real life.
One of the most important things in any brand community is to recognise the line between constructive criticism and abuse. A brand community should exist for the benefit of both you and your customers, and no one’s going to stick around long if they feel they’re being silenced. Occasional negative feedback, for example, can actually be beneficial to a business, since it helps them identify key customer concerns and improve their offers accordingly.
Sometimes, it’s necessary to have more than one community manager and/or moderator to weigh in on more contentious discussions and ensure they keep on track.
Other infractions are clearer but often not especially severe. For example, people posting off-topic in the wrong forum or group can become a problem but are rarely ill-intentioned. Another common frustration is when certain people post excessively so that no one else has much of a chance to get heard. Again, while not usually malicious, such behaviour can be disruptive to your community if it’s left unchecked.
Keep your community rules and guidelines visible
It’s an unfortunate yet unavoidable fact that most people will blissfully ignore your community rules, but that doesn’t mean you shouldn’t have them. If nothing else, an acceptable use policy provides some recourse if you ever need to warn or ban a member. Your rules and guidelines should always be clearly visible and should take precedence over everything else, save for the key value proposition of joining your community in the first place. A short reminder should ideally be visible on every page, and the entire policy should itself be as short and concise as possible.
You can be sure that no one’s going to read through reams of legalese to get into your brand community! Another tactic is to make your community guidelines a regular part of your discussions. Chances are, there’ll be many situations where you can refer to them and, in doing so, encourage members to learn more about how you handle moderation.
Enable peer-to-peer moderation
Online community moderation quickly becomes a practical impossibility at scale. When moderators find themselves working overtime trying to remove spam comments or tackle minor infractions, the chance of human error increases too. In the end, moderators and managers can pend more time policing their communities than getting actively involved in the discussions.
To prevent this problem and keep your community healthy, you need to think about scalability from the outset. And that shouldn’t mean constantly increasing your roster of moderators. The better approach is to enable peer-to-peer moderation to let your members choose what sort of content they find more valuable. Social media ‘likes’ are the most basic form of peer-to-peer moderation, since they provide a quick and simple endorsement. Other platforms, like Reddit and most community forums, allow members to upvote and downvote content so that with the highest ratings earns the most visibility. Others go even further to add gamification tactics like ranks, karma points, and badges. Finally, any online community should include a reporting feature to allow members to flag spam and other policy infractions.
With peer-to-peer moderation enabled, moderators and managers will have to spend less time dealing with otherwise trivial issues. Instead, they’ll be able to focus on proactively modelling the behaviour they want to see.
Lead by example to provide a sense of purpose
Brand communities are driven by a unified sense of purpose, whether that’s to offer peer-to-peer support, provide a space for feedback and testing, or facilitate customer success. When a customer visits your online community for the first time, they’ll often do what other members are doing. That’s why positive developments don’t come from rules and guidelines alone, but from the actions of your managers, moderators, and super users. To get started on the right track, team members should always greet new members. In fact, regular participation from brand representatives is a must for maintaining a healthy community.
You can also add some consistency to the mix by organising thematic online events and discussion threads, such as Monday Motivations or Throwback Thursdays. Another highly effective way to lead by example is to highlight user-generated content (UGC). This includes any posts, pictures, or other content shared by your community members. It can be something as simple as a motivational meme or as complex as an in-depth video review of your latest product. Most of the time, members who have gone to the effort of creating a great post will appreciate the recognition. With UGC in the spotlight, the ability to set a good example shifts to your most valuable community members, which simplifies online community moderation.
How to ban the most troublesome users
There will come a time for every community manager or moderator to unleash the banhammer. While conventional banning should keep out most trolls and spammers, there are others who might not be so quick to get the message. The most determined of malefactors will just keep coming back, opening new accounts and even going so far as to hide their real IP addresses. That’s where the benefits of shadow banning come into play. Shadow banning is a discreet approach in online community moderation that bans users without their knowledge. Instead, they can keep posting, but their posts will be hidden from the community. Instead, their determined efforts to damage your community will land on deaf ears, thus discouraging them from further trolling.
Blatant but less serious policy infractions are usually best off being met with an initial warning from one of your moderators. Further disruptive behaviour might then result in a temporary or permanent ban. Temporary bans are ideal for giving troublesome members a chance to cool off and reevaluate their behaviour. This saves a lot of time over manually keeping notes.
Combined, these tactics will help you maintain a healthier and more productive community to the benefit of both your brand and its biggest fans.
Disciple social spaces help brands enjoy all the benefits of community with an independent, valuable, and trusted platform in a safe space that they own and control. Start building your brand community by telling us about your goals.
Improve Your Online Community Moderation with Disciple
The need for online community moderation is not open for debate. Without effective moderation, community building efforts are destined to fail as loose cannons drive away valued community members along with their friends and associates.
One of the most important things in any brand community is to recognise the line between constructive criticism and abuse. A brand community should exist for the benefit of both you and your customers, and no one’s going to stick around long if they feel they’re being silenced. Occasional negative feedback can actually be beneficial to a business, since it helps identify key customer concerns and improve offerings accordingly.
Sometimes, it’s necessary to have more than one community manager and/or moderator to weigh in on more contentious discussions and ensure they keep on track.
Other infractions are clearer but often not especially severe. For example, people posting off-topic in the wrong forum or group can become a problem but are rarely ill-intentioned. Another common frustration is when certain people post excessively so that no one else has much of a chance to get heard. Again, while not usually malicious, such behaviour can be disruptive to your community if it’s left unchecked.
Keep your community rules and guidelines visible
It’s an unfortunate yet unavoidable fact that most people will blissfully ignore your community rules, but that doesn’t mean you shouldn’t have them. If nothing else, an acceptable use policy provides some recourse if you ever need to warn or ban a member. Your rules and guidelines should always be clearly visible and should take precedence over everything else, save for the key value proposition of joining your community in the first place. A short reminder should ideally be visible on every page, and the entire policy should itself be as short and concise as possible.
You can be sure that no one’s going to read through reams of legalese to get into your brand community! Another tactic is to make your community guidelines a regular part of your discussions. Chances are, there’ll be many situations where you can refer to them and, in doing so, encourage members to learn more about how you handle moderation.
Enable peer-to-peer moderation
Online community moderation quickly becomes a practical impossibility at scale. When moderators find themselves working overtime trying to remove spam comments or tackle minor infractions, the chance of human error increases too. In the end, moderators and managers can spend more time policing their communities than getting actively involved in the discussions.
To prevent this problem and keep your community healthy, you need to think about scalability from the outset. And that shouldn’t mean constantly increasing your roster of moderators. The better approach is to enable peer-to-peer moderation and let your members choose what sort of content they find most valuable. Social media ‘likes’ are the most basic form of peer-to-peer moderation, since they provide a quick and simple endorsement. Other platforms, like Reddit and most community forums, allow members to upvote and downvote content so that the content with the highest ratings earns the most visibility. Others go even further and add gamification tactics like ranks, karma points, and badges. Finally, any online community should include a reporting feature to allow members to flag spam and other policy infractions.
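As a rough sketch of how vote-based ranking and member reporting fit together, here is a minimal feed-ranking function. The `Post` structure, field names, and report threshold are all illustrative, not any particular platform’s API:

```python
from dataclasses import dataclass

@dataclass
class Post:
    """A community post with simple up/down vote tallies."""
    title: str
    upvotes: int = 0
    downvotes: int = 0
    reports: int = 0  # member flags for spam or policy infractions

    @property
    def score(self) -> int:
        # Net score: the most-endorsed content earns the most visibility.
        return self.upvotes - self.downvotes

def rank_feed(posts: list[Post], report_threshold: int = 3) -> list[Post]:
    """Sort posts by net score; hide posts that members have flagged repeatedly."""
    visible = [p for p in posts if p.reports < report_threshold]
    return sorted(visible, key=lambda p: p.score, reverse=True)

feed = rank_feed([
    Post("Helpful tutorial", upvotes=12, downvotes=1),
    Post("Spammy link", downvotes=5, reports=4),
    Post("Product question", upvotes=3),
])
print([p.title for p in feed])  # reported spam filtered out, best content first
```

The point of the sketch is that members do the triage: the community’s own votes and reports decide what surfaces, so moderators only review what gets flagged.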
With peer-to-peer moderation enabled, moderators and managers will have to spend less time dealing with otherwise trivial issues. Instead, they’ll be able to focus on proactively modelling the behaviour they want to see.
Lead by example to provide a sense of purpose
Brand communities are driven by a unified sense of purpose, whether that’s to offer peer-to-peer support, provide a space for feedback and testing, or facilitate customer success. When a customer visits your online community for the first time, they’ll often do what other members are doing. That’s why positive developments don’t come from rules and guidelines alone, but from the actions of your managers, moderators, and super users. To get started on the right track, team members should always greet new members. In fact, regular participation from brand representatives is a must for maintaining a healthy community.
You can also add some consistency to the mix by organising thematic online events and discussion threads, such as Monday Motivations or Throwback Thursdays. Another highly effective way to lead by example is to highlight user-generated content (UGC). This includes any posts, pictures, or other content shared by your community members. It can be something as simple as a motivational meme or as complex as an in-depth video review of your latest product. Most of the time, members who have gone to the effort of creating a great post will appreciate the recognition. With UGC in the spotlight, the ability to set a good example shifts to your most valuable community members, which simplifies online community moderation.
How to ban the most troublesome users
There will come a time for every community manager or moderator to unleash the banhammer. While conventional banning should keep out most trolls and spammers, some users won’t be so quick to get the message. The most determined malefactors will just keep coming back, opening new accounts and even going so far as to hide their real IP addresses. That’s where the benefits of shadow banning come into play. Shadow banning is a discreet approach to online community moderation that bans users without their knowledge: they can keep posting, but their posts are hidden from the rest of the community. Their determined efforts to damage your community fall on deaf ears, discouraging them from further trolling.
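The shadow-banning idea described above boils down to a visibility filter: a shadow-banned author still sees their own posts, so they get no signal that they’ve been banned, but nobody else does. A minimal sketch, with purely illustrative names:

```python
def visible_posts(posts: list[dict], viewer: str, shadow_banned: set[str]) -> list[dict]:
    """Return the posts a given viewer should see.

    Posts from shadow-banned authors are hidden from everyone
    except the banned author themselves.
    """
    return [
        p for p in posts
        if p["author"] not in shadow_banned or p["author"] == viewer
    ]

posts = [
    {"author": "alice", "text": "Welcome!"},
    {"author": "troll42", "text": "flame bait"},
]
banned = {"troll42"}

print(visible_posts(posts, viewer="bob", shadow_banned=banned))      # only alice's post
print(visible_posts(posts, viewer="troll42", shadow_banned=banned))  # both posts
```

Because the filter runs at read time, the banned user’s experience is unchanged from their point of view, which is exactly what stops them from immediately creating a new account.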
Blatant but less serious policy infractions are usually best met with an initial warning from one of your moderators. Further disruptive behaviour might then result in a temporary or permanent ban. Temporary bans are ideal for giving troublesome members a chance to cool off and reevaluate their behaviour, and when they expire automatically, you save a lot of time over manually keeping notes on who was banned and when.
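A temporary ban with automatic expiry can be sketched as a small lookup of ban end-times; the class and method names here are hypothetical, just to show the mechanism:

```python
from datetime import datetime, timedelta, timezone

class BanList:
    """Track temporary bans that lift themselves; no manual note-keeping needed."""

    def __init__(self) -> None:
        self._bans: dict[str, datetime] = {}  # username -> ban expiry time

    def ban(self, user: str, days: int) -> None:
        self._bans[user] = datetime.now(timezone.utc) + timedelta(days=days)

    def is_banned(self, user: str) -> bool:
        expiry = self._bans.get(user)
        if expiry is None:
            return False
        if datetime.now(timezone.utc) >= expiry:
            del self._bans[user]  # cooldown over: lift the ban automatically
            return False
        return True

bans = BanList()
bans.ban("hothead", days=7)
print(bans.is_banned("hothead"))  # True
print(bans.is_banned("newbie"))   # False
```

Checking the expiry lazily, at the moment of each access, keeps the design simple: no background job is needed to lift bans on schedule.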
Combined, these tactics will help you maintain a healthier and more productive community to the benefit of both your brand and its biggest fans.
Disciple social spaces help brands enjoy all the benefits of community with an independent, valuable, and trusted platform in a safe space that they own and control. Start building your brand community by telling us about your goals.
Improve Your Online Community Moderation with Disciple
The need for online community moderation is not open for debate. Without effective moderation, community building efforts are destined to fail as loose cannons drive away valued community members along with their friends and associates.
Community builders need to devise effective community moderation guidelines and then have a simple and effective way to implement them. That is exactly what you are able to do when you choose Disciple as your online community platform.
Enforce Community Guidelines with Disciple Moderation Features
Disciple's suite of online community moderation tools is extensive and puts control over user-generated content squarely in your hands where it belongs.
Of course, with control over user-generated content comes the responsibility to effectively prevent members from infringing copyrights, bullying or badgering other members, or otherwise behaving counterproductively within the community.
Disciple moderation tools provide the ability to fulfil all forum moderator responsibilities in a simple and timely fashion. Those tools include, but are not limited to:
Shadow banning - Shadow banning is when the moderator decides to make a post invisible to everyone but the person who posted it. This is a subtle way to prevent questionable content from creating ill-will within the community.
Removal of content - Forum moderator responsibilities include the need to occasionally disable or even unpublish user content. Disciple provides this ability, as well as the ability to reverse these actions if the issue that created the problem is resolved in a satisfactory fashion.
Email verification - It’s common for blocked users on social media or online forums to try and return using fake email addresses. Disciple provides instant email verification that prevents this from happening.
Automatic post scores - Disciple community software incorporates Google AI technology that automatically assigns an “appropriateness” score to every post. Moderators can use these scores to help them make online community moderation decisions.
Members blocking members - If for any reason a member wishes to block content generated by another user, they can do so with Disciple. They can also report other members’ posts if they feel those posts violate community guidelines.
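To illustrate how an automatic “appropriateness” score like the one mentioned above might feed into moderation decisions, here is a hedged sketch. It assumes a model assigns each post a score between 0 and 1 and routes posts by threshold; the thresholds, function, and parameter names are purely illustrative and say nothing about Disciple’s actual scoring internals:

```python
def triage(post_text: str, appropriateness: float,
           auto_hide_below: float = 0.2, review_below: float = 0.6) -> str:
    """Route a post based on a model-assigned appropriateness score in [0, 1].

    The thresholds are illustrative; a real community would tune them
    to its own tolerance for borderline content.
    """
    if appropriateness < auto_hide_below:
        return "hidden"        # almost certainly abusive: hide immediately
    if appropriateness < review_below:
        return "needs-review"  # borderline: queue for a human moderator
    return "published"

print(triage("You are all idiots", 0.1))    # hidden
print(triage("This product is bad", 0.5))   # needs-review
print(triage("Thanks for the help!", 0.9))  # published
```

The value of score-based triage is that moderators only review the middle band, rather than every post, which keeps human attention focused where judgement is actually needed.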
With Disciple, you’re able to provide a lively environment that is both safe and responsible.