A Few XBO Nation News Updates
Cliff Bleszinski is working on a new game; picture below.
Microsoft launches Xbox Enforcement United program
New player-run program will crowdsource community opinion to determine if certain content violates Xbox Live Code of Conduct.
Microsoft today launched a service called Xbox Enforcement United, a program that allows users to help determine if other players are violating the Xbox Live Code of Conduct.
A beta version of the program is live now and is open only to Xbox Live Ambassadors who meet age, Gamerscore, and Xbox Live tenure requirements. Microsoft intends to open the program up to a broader community when the beta comes to a close. Gamers interested in taking part in the beta can head to the program's new website.
The Xbox Enforcement United program will crowdsource community opinion about content found on Xbox Live, beginning with Gamertags, that may be in violation of the Xbox Live Code of Conduct.
Those in the program will be asked to identify questionable content relating to profane words/phrases, topics or content of a sexual nature, hate speech, and sensitive historic/current events, as well as various "sound alike" or "look alike" words or phrases potentially masked by l33t speak, phonetic tricks, or multiple languages.
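Microsoft hasn't said how masked words get surfaced to reviewers, but one plausible approach to catching "look alike" tags is normalizing common l33t-speak substitutions before checking a prohibited-terms list. The substitution table and blocklist below are purely illustrative, not anything Microsoft has published:

```python
# Hypothetical sketch: undo common character substitutions, then check a
# blocklist. LEET_MAP and BLOCKLIST are invented for illustration.
LEET_MAP = str.maketrans({"3": "e", "1": "l", "0": "o", "4": "a",
                          "@": "a", "$": "s", "5": "s", "7": "t"})

BLOCKLIST = {"badword"}  # placeholder for a real prohibited-terms list

def normalize(gamertag: str) -> str:
    """Lowercase the tag and reverse common l33t-speak substitutions."""
    return gamertag.lower().translate(LEET_MAP)

def looks_questionable(gamertag: str) -> bool:
    """Flag a tag if any blocklisted term survives normalization."""
    normalized = normalize(gamertag)
    return any(term in normalized for term in BLOCKLIST)

print(looks_questionable("B4dw0rd_99"))  # True: normalizes to "badword_99"
```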
These opinions are then fed into an algorithm developed by Microsoft's enforcement team to determine whether "enforcement action" is needed, such as requiring a user to change their Gamertag to something more appropriate.
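The enforcement team hasn't disclosed how the algorithm weighs those opinions. As a rough sketch of how crowdsourced verdicts could feed an enforcement decision, the code below weights each reviewer's vote by a hypothetical track-record score; the minimum review count, the threshold, and the reviewer_accuracy field are all assumptions:

```python
# Illustrative vote aggregation for crowdsourced content review.
from dataclasses import dataclass

@dataclass
class Verdict:
    reviewer_id: str
    violates: bool            # reviewer's opinion on the content
    reviewer_accuracy: float  # 0..1, hypothetical past agreement rate

def needs_enforcement(verdicts: list[Verdict],
                      min_reviews: int = 5,
                      threshold: float = 0.7) -> bool:
    """Weigh each vote by the reviewer's track record and compare the
    weighted share of 'violates' votes against a cutoff."""
    if len(verdicts) < min_reviews:
        return False  # not enough opinions to act on
    total = sum(v.reviewer_accuracy for v in verdicts)
    flagged = sum(v.reviewer_accuracy for v in verdicts if v.violates)
    return total > 0 and flagged / total >= threshold
```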
More information about the Xbox Enforcement United program is available in a blog post from Xbox Live policy and enforcement team member Glenn Kaleta.
Microsoft also today gave new details on the Xbox One's reputation system that promises "no more cheats or jerks."
Xbox One reputation system promises "no more cheats or jerks"
Microsoft details community-powered reputation system for next-gen console that aims to help "filter out" users gamers don't want to play with.
Microsoft's new reputation system for Xbox Live on Xbox One promises "no more cheats or jerks." In a blog post, Xbox Live program manager Michael Dunn explained that the new community-powered system will help "filter out" users that gamers do not want to play with.
"No question that Xbox Live is a distinct community of passionate gamers. We love that. But just like in life, there are all types of people--some shy, some polite, some aggressive, some snarky, some annoying, and some that can’t avoid swearing at #$%^ happens to them," Dunn said. "Most Xbox Live players are polite online and know how to socially adjust to people they're playing with. But not everyone does this. And, it can be challenging to pick up on social cues when you are connected online and not face-to-face in the same room."
This is why Microsoft conceived the new reputation system for Xbox One. The new model will "expose" people who "aren't fun to be around" and will implement "real consequences" for gamers who harass others.
To do this, Microsoft will include actions like "block" or "mute player" in the feedback model. This model will take a player's online ratings and feed them into a system with a "crazy algorithm" created by Microsoft and "validated" by a Microsoft Research PhD.
A player's reputation score will determine which category they are assigned to: "Green = Good Player," "Yellow = Needs Improvement," or "Red = Avoid Me." Gamers will be able to view this data by looking at someone's gamer card.
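Microsoft has only disclosed the three labels, not the scoring behind them. A minimal sketch of how negative feedback actions might lower a score that then maps onto those labels is below; the penalty values and the 75/40 cutoffs are invented for illustration:

```python
# Hypothetical feedback-to-category pipeline. Penalties and cutoffs are
# assumptions; only the three label strings come from Microsoft's post.
FEEDBACK_PENALTY = {"block": 5.0, "mute player": 2.0}

def apply_feedback(score: float, actions: list[str]) -> float:
    """Subtract a penalty per negative action, clamped to a 0-100 range."""
    for action in actions:
        score -= FEEDBACK_PENALTY.get(action, 0.0)
    return max(0.0, min(100.0, score))

def reputation_category(score: float) -> str:
    """Map a reputation score onto the gamer-card labels Dunn described."""
    if score >= 75:
        return "Green = Good Player"
    if score >= 40:
        return "Yellow = Needs Improvement"
    return "Red = Avoid Me"
```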
"And, your reputation score is ultimately up to you. The more hours you play online without being a jerk, the better your reputation will be," Dunn said. "Similar to the more hours you drive without an accident, the better your driving record and insurance rates will be."
Overall, Dunn said most players will have good reputations, as the algorithm is designed to identify players who are "repeatedly disruptive" on Xbox Live. Dunn described the algorithm as "sophisticated" and one that won't penalize players for "a few bad reports."
"Even good players might receive a few player feedback reports each month and that is OK," Dunn said. "The algorithm weighs the data collected so if a dozen people suddenly reporting a single user, the system will look at a variety of factors before docking their reputation. We'll verify if those people actually played in an online game with the person reported--if not, all of those player's feedback won't matter as much as a single person who spent 15 minutes playing with the reported person. The system also looks at the reputation of the person reporting and the alleged offender, frequency of reports from a single user and, a number of other factors."
Dunn did not detail any of the "real consequences" gamers will suffer if they are found to be harassing other players.