Greetings, glorious testers!
Check out Alpha Two Announcements here to see the latest news on Alpha Two.
Check out general Announcements here to see the latest news on Ashes of Creation & Intrepid Studios.
To get the quickest updates regarding Alpha Two, connect your Discord and Intrepid accounts here.
In-Game Community Regulation Tools
superhero6785
Member, Alpha Two
Sitting around last night I had some thoughts around in-game tools & settings we could be given to help 'self regulate' the community. Let me know what you think.
Obviously games have features to report players, but I thought it would be nice that if enough people reported someone for say "spamming" or "harassment" or even "foul language", that they would be put on a corresponding list. Then, in your settings there could be different checkboxes to "Automatically mute spammers/language/harassment". This would allow the community to very quickly identify and flag players accordingly and allow each individual player to regulate their own chat.
An added benefit is that this could be useful even for younger players, or for parents who let their kids play (maybe even with them) and don't want them subjected to more "adult" chat: chat that perhaps isn't exactly against the ToS or censored, but that most people in game have identified as such, or that a player has even self-identified as producing.
My goal here is to give as much freedom to players to be who they are and play how they want to play, but having a way to automatically filter players out of chat, or even identify players via some small icon on their profile, gives players a way to choose the type of people they want to interact with in chat and party up with in the game.
Some caveats - you'd need to be notified when your account is placed on a list, and there'd have to be an appeal process to prevent things like false mass reporting - in which case I think anyone caught false reporting should be put on a "can no longer report players" list.
Thoughts? Additions? Tell me why I'm wrong and this is a horrible idea?
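To make the mechanic concrete, here's a rough sketch (in Python, with made-up names, and a threshold number that's purely a tuning knob, not anything official) of what the report-threshold lists and per-player mute checkboxes could look like under the hood:

```python
from collections import defaultdict

# Hypothetical sketch of the idea above: once enough *distinct* players
# report someone in a category, that player lands on the matching list,
# and each client mutes whichever lists it has opted into.
REPORT_THRESHOLD = 200  # tuning knob, not an official number

class ReportLists:
    def __init__(self, threshold=REPORT_THRESHOLD):
        self.threshold = threshold
        # category -> reported player -> set of distinct reporters
        self.reports = defaultdict(lambda: defaultdict(set))

    def report(self, category, reported, reporter):
        # a set means the same reporter can't stack reports
        self.reports[category][reported].add(reporter)

    def is_listed(self, category, player):
        return len(self.reports[category][player]) >= self.threshold

def should_mute(lists, settings, player):
    """settings is a per-client dict like {"spam": True, "harassment": False},
    mirroring the checkboxes in the post above."""
    return any(lists.is_listed(cat, player)
               for cat, enabled in settings.items() if enabled)
```

The key design point is that reports per reporter are deduplicated, so one angry player can't single-handedly push someone over the threshold.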
Comments
I've got minimal general concerns about it, it's worked out for me when I've used/implemented such systems as well.
Obviously it results in some 'abuse' but it also lets oversensitive people avoid drama.
I'd be more concerned about the sorts of people who explicitly make it their goal to make others uncomfortable, putting in even more effort once they realize no one is paying attention to them, since a nontrivial number of those people do it precisely for the attention.
True. But that's where the regular ToS enforcement systems would kick in, ideally.
If players want more control there should be options to make it so you can only hear friends, guild members, party members, etc and customize your own options to do what you want with it.
We would all be on some kind of list at some point. Annoy me? Straight to the spammer list. Think about it.
We don't have community policing in real life because it would subject everyone to abuse (sometimes it does sneak in, and the results are always bad). We would amplify that behavior in an online world where there are no true consequences.
You are thinking about how YOU would use it. You are not expanding that to the broader community.
In my case I'm considering this and saying the following:
1. With enough separate lists, this is fine
2. With high enough thresholds, this is fine
3. Most of the time this has no meaningful effect on any given player because of 1 and 2, in my experience
So, while you might not agree or share my experiences, I'm definitely saying that I accept the things you're talking about as 'positive' in an MMO community chat microcosm.
"Mute all players reported for harassment over 200 times". Set to off by default.
Be abused, by people like you, who go around looking for reasons to be insulted
All these games need is /block.
From there move on.
But no, rather cry about everything.
If something needs to be off by default, there most likely is an issue, since off had to be the standard.
It leads to more toxic uses in players' minds, and they will act on it, wasting people's time on appeals. It is better to have reports taken seriously than to let reporting become a weapon (even a dulled one). A PvX MMORPG is one of the worst places for it, too, given the nature of PvP and its big impact on the world.
I remember in New World I got backstabbed, and some guilds were able to use that to try to turn an entire faction against me, lmao. Things like these are just used as weapons, and I would have been reported for harassment just because of people's desire to inflict damage, large or small.
Could you clarify what the 'damage' was, though?
I thought we were just talking about people not being able to see things a person says in a specific chat setting (such as Node Chat vs 'global' chat, etc).
What was the effect on your account? I think OP is talking about muting primarily.
Whew, you guys really keep me on my toes, for sure...
Yes, that was an example. Please help me by providing a better one; I'm a little busy, so I'd appreciate anyone offering an example that doesn't mislead about the nature of the game's chat system while still allowing me to make the point.
If you believe the point itself is flawed because no global chat exists, please lmk that too.
In this case it would be trying to get you muted, hoping as many people as possible have it turned on, or advertising it to people specifically to isolate someone they don't like.
So if I were to do it: sensing a weakness to exploit in someone, you get your guild and others to mass report them on the strength of rumors. Then learn the area they farm in and advertise the feature to people there, telling them to turn it on, to grow the number of people who can't hear the target speak.
That mindset will grow the desire for people to turn the mute on out of spite, annoyance, anger, etc., until it is the norm, making mass reports even more effective, and more so still if you can make having it on the standard for the game/server.
Then thank you, at least it's clear.
The thing you are talking about is literally the point of the system, so I understand your disagreement with it.
One solution I've used for this is to make it so that reports from the same day don't count or only put the person on the list for a short period, like a 3 day listing.
That way if a group wants to target a specific person, they must coordinate multiple people to report that person every day for the same supposed offense.
No such thing is perfect and all have to be tuned to the goals of the game, just adding to the thread.
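To illustrate the decay rule above, here's a minimal sketch (Python, with invented names and thresholds; the 3-day numbers are just placeholders matching the example) of per-day report counting with a short-lived listing, so a brigade has to re-coordinate distinct reporters every single day:

```python
from datetime import date, timedelta
from collections import defaultdict

# Hypothetical sketch: only one report per reporter per day counts,
# a listing needs several consecutive days of reports, and it expires
# after LISTING_DAYS, so one-off mass reporting does nothing.
DAILY_REPORTS_NEEDED = 3   # distinct reporters required on a single day
STREAK_DAYS_NEEDED = 3     # consecutive qualifying days before listing
LISTING_DAYS = 3           # how long a listing then lasts

class DecayingReports:
    def __init__(self):
        # player -> day -> set of reporters that day (set dedupes per day)
        self.by_day = defaultdict(lambda: defaultdict(set))
        self.listed_until = {}

    def report(self, player, reporter, today):
        self.by_day[player][today].add(reporter)
        # list only after enough reporters on each of the last
        # STREAK_DAYS_NEEDED consecutive days
        days = [today - timedelta(days=i) for i in range(STREAK_DAYS_NEEDED)]
        if all(len(self.by_day[player][d]) >= DAILY_REPORTS_NEEDED
               for d in days):
            self.listed_until[player] = today + timedelta(days=LISTING_DAYS)

    def is_listed(self, player, today):
        return self.listed_until.get(player, date.min) > today
```

With these rules, five people all reporting someone on one day has no effect, while three people reporting on each of three straight days produces a listing that lapses on its own a few days later.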
If it is like that, I don't see why it would be abused; it would take too much energy. Also, if it were being abused, it would have to be certain people doing it constantly.
I think that would be fairer, especially on top of something detecting whether what they are saying follows certain patterns. And we don't end up in a monthly war of people trying to get others comm-muted to put them at a disadvantage so they lose a max-level node.
I'd agree, but it's more of an overarching thing: the effort involved in doing it constantly. The more difficult it is, the less likely the whole community is to agree on it as a universal thing, or to get enough people to make it so.
So fewer moving parts means less chance of it becoming standard for most people to use. If most people don't turn it on, it changes nothing except for a few players.
Requiring a good number of people, reporting in repetitive fashion, plus word detection, is effective; I'd say more than 200 people, though, imo.
But it would have to be something that is monitored for abuse.
In my experience this is actually the point.
If we agree that spamming and harassment are bad, but that heavy-handed responses to reporting are also bad, then we have a sliding scale. Set it correctly for the game and that's it. Being muted in open chat just doesn't seem like that big a deal in a PvX game with heavy community functions, if implemented even close to right.