This gist intends to clear up some of the misinformation surrounding signed chat and the reporting feature Mojang added to Minecraft 1.19.1. Here you can find both technical information and a general explanation of how these work.
After joining a server, clients now send a profile key used for verifying a message's authenticity. This key, and thus the whole signing process, is optional, but by default, servers enforce secure profiles for clients to send chat messages. Whenever a player with an associated key sends a chat message, the message is signed using their private key; the server then verifies the signature using the public key sent on join. Assuming signature, timestamp, and message contents line up, the message goes through.
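To illustrate the verification step, here is a minimal sketch in plain Java - not Mojang's actual implementation. The `SHA256withRSA` algorithm matches how profile key signatures are generally described, while the exact byte layout of the signed payload is version-specific and left out here.

```java
import java.security.PublicKey;
import java.security.Signature;

final class ChatSignatureCheck {

    /**
     * Sketch of the server-side check: was `signatureBytes` produced over
     * `signedPayload` by the private key matching the sender's profile key?
     */
    static boolean isValid(PublicKey profilePublicKey, byte[] signedPayload, byte[] signatureBytes) throws Exception {
        Signature verifier = Signature.getInstance("SHA256withRSA");
        verifier.initVerify(profilePublicKey);   // public key sent on join
        verifier.update(signedPayload);          // reconstructed signed content
        return verifier.verify(signatureBytes);  // signature from the chat packet
    }
}
```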
On the other end, clients can also require all broadcast player messages to be signed, discarding any without a verified sender signature.
Every signed message's signature includes the sender's UUID matching the profile key's identity, a randomly initialized session UUID, a timestamp (though that cannot be fully verified with untrusted clients/servers), the signed message, and a random salt.
A message's signature also covers a message index that the client increments with each message. Because the index is verified by both the server and other clients, messages cannot be reordered or dropped without it being apparent that something happened between two messages (although you cannot determine when, to whom, or whether it was commands and/or messages in between).
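As a rough illustration of the index check, here is a hedged sketch - the class and method names are hypothetical and not the real packet handling:

```java
/** Illustrative only: reject signed messages whose index did not move forward. */
final class ChatChainTracker {
    private int lastIndex = -1;

    /** Returns true if the index is strictly increasing, i.e. nothing was reordered or replayed. */
    synchronized boolean accept(int messageIndex) {
        if (messageIndex <= lastIndex) {
            return false; // duplicated, reordered, or rolled back
        }
        lastIndex = messageIndex; // gaps are allowed, but they show that something happened in between
        return true;
    }
}
```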
Another important part of the signature is the list of last seen messages. The signatures of the last 20 messages the client has seen are cached and packed into each outgoing chat message and its signature. This is used to verify that no messages from other players have been omitted in a report (up to a certain point, to guarantee faithful context) and that no messages have been added to the given context after the fact either.
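A similarly hedged sketch of that client-side bookkeeping: cache the signatures of the 20 most recently seen player messages and attach a snapshot of them to the next outgoing message.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

/** Illustrative only: the "last seen" cache described above. */
final class LastSeenMessages {
    private static final int CAPACITY = 20;
    private final Deque<byte[]> signatures = new ArrayDeque<>(CAPACITY);

    /** Remember the signature of a player message that was just displayed. */
    synchronized void track(byte[] signature) {
        if (signatures.size() == CAPACITY) {
            signatures.removeFirst(); // drop the oldest entry
        }
        signatures.addLast(signature);
    }

    /** Snapshot to embed in the next outgoing chat message and its signature. */
    synchronized List<byte[]> snapshot() {
        return List.copyOf(signatures);
    }
}
```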
With signed messages, Mojang finally introduced a packet servers can use to retroactively remove already sent out messages. So if you want to clear chat or remove individual messages without having to spam empty messages that only push up the previous ones, you can now properly remove them using the `ClientboundDeleteChatPacket` - the only requirement is that the message to be removed is a properly signed player message. Since 1.19.3, messages are displayed for at least 3 seconds before they can be removed, and a removal still leaves a stub behind (saying that a chat message has been removed where the message previously was).
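On servers exposing the Adventure API (such as Paper), a plugin can trigger the same removal; this sketch assumes the `Audience#deleteMessage` method that was added alongside chat signing and a signature obtained from the original player message.

```java
import net.kyori.adventure.audience.Audience;
import net.kyori.adventure.chat.SignedMessage;

final class ChatRemoval {

    /**
     * Ask the clients behind this audience to retract an already delivered,
     * signed player message. Since 1.19.3 the client shows the message for
     * at least 3 seconds and leaves a "message removed" stub behind.
     */
    static void remove(Audience viewers, SignedMessage.Signature signature) {
        viewers.deleteMessage(signature);
    }
}
```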
Since 1.19.3, you can freely cancel signed chat messages or send them to a limited number of players without breaking the chat chain. Clients instead use the signed index to make sure messages are sent and included incrementally.
As for how to handle chat previews: you don't anymore; they have been removed entirely in 1.19.3. With this, changed messages look a lot less scary (as opposed to unsigned messages), and you can still see the original message's content.
Since commands such as `/say`, as well as custom commands that broadcast messages or send them to a certain group of players, also result in "player messages" that you would want verified, text arguments in commands are also signed by the client. With the given signature, you can then distribute the message yourself and still have it show up as a signed player message.
In the wild, you can see this being used in Vanilla's `say`, `me`, `msg`, `teammsg`, `ban`, `ban-ip`, and `kick` commands.
There are two different kinds of chat messages now: player chat and system chat. Player chat is accompanied by the message signature; system chat has no special format or signature attached.
You can optionally attach an unsigned component to any player chat message, which gives the message a light gray indicator on the left; hovering over the message reveals the original message's content. If only the styling of the message changed and the actual plain text is still the same, the message appears without the indicator.
You can also apply filter masks to messages without making them appear as unsigned messages, where bad words are replaced with a string of `#` characters.
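To see where these display rules matter in practice, here is a hedged Paper plugin sketch using its `AsyncChatEvent`/`ChatRenderer` API: it restyles the broadcast around the original message component. As long as the signed plain text stays untouched, viewers see no indicator; if the text itself were altered, the gray indicator described above would appear.

```java
import io.papermc.paper.event.player.AsyncChatEvent;
import net.kyori.adventure.text.Component;
import net.kyori.adventure.text.format.NamedTextColor;
import org.bukkit.event.EventHandler;
import org.bukkit.event.Listener;

/** Sketch: change how a signed player message is displayed without touching its signed content. */
public final class ChatFormatListener implements Listener {

    @EventHandler
    public void onChat(AsyncChatEvent event) {
        event.renderer((source, sourceDisplayName, message, viewer) ->
                Component.text("[Chat] ", NamedTextColor.GRAY)
                        .append(sourceDisplayName)
                        .append(Component.text(": "))
                        .append(message)); // the original (signed) content, appended unmodified
    }
}
```

The listener still has to be registered with your plugin as usual; only the surrounding format changes, which the server is free to define anyway.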
If you go as far as sending a player chat message with an invalid signature, it will look like this:
System chat messages also have a gray indicator.
While the message content always needs to be verified as coming from the player that sent it, the player's display name, team name, and surrounding format can be freely defined by the server.
One of the default chat types looks like this when serialized:
```json
{
  "name": "minecraft:team_msg_command",
  "id": 3,
  "element": {
    "chat": {
      "translation_key": "chat.type.team.text",
      "parameters": [
        "team_name",
        "sender",
        "content"
      ]
    },
    "narration": {
      "translation_key": "chat.type.text.narrate",
      "parameters": [
        "sender",
        "content"
      ]
    }
  }
}
```
The decoration format for the chat display here resolves as `%s <%s> %s`, then using the 3 parameters `team_name`, `sender`, and `content`. Even though the decoration element only takes a translatable argument, you can simply enter a plain string as the key that will be displayed; you can try this out using the following command: `/tellraw @s {"translate":"Hello [%s]", "with":["world"]}`
Chat type formats can easily be customized, e.g. by turning the translatable into plain text like `🚩 Broadcast by %s: %s 🚩` and only taking the sender and content parameters, to give just one example. In addition to the text display, you can also define the message to be narrated (also using a different number of arguments and a different surrounding format) and/or displayed in the actionbar as "game info". In the style field, you can also apply custom formatting (color, font, italics, hover/click events, etc.) to the entire message, or rather until the sender or content component changes the format again.
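To make the parameter substitution more concrete, the same decoration can be expressed server-side with Adventure's translatable components - a hedged illustration of the idea, not the game's internal resolution code:

```java
import net.kyori.adventure.text.Component;

final class ChatDecoration {

    /** Illustrative: resolve a decoration like "%s <%s> %s" with team_name, sender, and content. */
    static Component teamMessage(Component teamName, Component sender, Component content) {
        // The arguments fill the %s slots in order, just like the chat type's "parameters" list above.
        return Component.translatable("chat.type.team.text", teamName, sender, content);
    }
}
```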
Custom chat types can be added using datapacks or by modifying the chat_type registry on the server (which modded servers such as Paper will need to add API for in the future). Custom chat types are then sent to each player once when they join. With this, you can in theory also send the same message using different formats to different players; only the actual content is always fixed as part of the signed message.
You can find a full list of the Vanilla chat types here.
Before we part ways again, here are answers to some of the more common questions. Mojang's FAQ has been updated to answer more of the pressing questions, so it's definitely worth taking a look at.
No, only reported messages are sent away for processing.
Mostly non-issues: guardian always leaves a trace when you're reported, gaslightv2 usually leaves a trace or just becomes silly when you report someone else, and gaslightv3 falls into the same category, where "yes" or "I hate them" are nothing that will reasonably be acted upon (also see below for more info). Basically, this Tweet.
Additionally, since 1.19.3, messages can only be removed 3 seconds after first appearing, and will leave a stub instead of fully removing the line.
No, Mojang have made clear they only intend to hunt down the worst of the worst (suicide threats, racial slurs, doxing, etc.). All reports are handled in human review (aside from malicious reports most likely being pre-filtered before the final decision is made). See here for a detailed list of punishment reasons. You can still dick around with your friends.
Then they get temporarily or permanently banned; the number of reports does not matter.
No, signing messages as coming from your account requires the private key that only you and Mojang have. You cannot be impersonated unless you download a stupidly malicious client/mod, and even then you can still appeal.
Reports require, and automatically send, a handful of messages around the selected ones to be included as context. You cannot omit messages from or add messages to a report without making it look fishy. No examples have yet been given of messing with context in a way that would realistically get you banned, even just temporarily.
That's simply not going to happen considering how different the underlying tech of filtering vs. reporting/chat signing is and the general nature of 3rd party servers.
No, and if you think you were banned without reason, you can make an appeal.
Yes, very easily. However, considering this comes at the cost of effectively taking power away from your users, making them more vulnerable to repeated bullying, it's not as merciful a move as you might think.
Players may also opt in to only display signed (and thus reportable) messages.
A lot of people have voiced concerns about Mojang possibly outsourcing message moderation and thus having poor quality of report processing. While it is a somewhat reasonable fear, this is still based on an extreme amount of speculation. Looking at the facts, Microsoft already has well-working chat moderation on Xbox Live, where no such drama of false bans or bans for speaking out negatively about Microsoft has occurred - and the rules regarding Minecraft chat are a lot more lenient in comparison.
With this in mind, such speculation does not make for a good argument, and I implore you to wait and see what actually happens. If your worst fears do end up coming true and false bans occur alongside a lack of appeal processing, I myself will be sure to join the riot as well and provide easy-to-use means to disable reporting.
You're using Mojang's client, Mojang's server, and Mojang's services on a massive social platform they still have the responsibility to moderate; they're very much in their right to do that. You won't be banned if either your friends don't feel attacked by your messages or you just disable reporting with a plugin or mod.
However, opinion time: everything you do or say has consequences, even towards friends, and even if you don't realize they exist. You're not going to be banned for a playful and harmless insult, but considering the large number of children and young adults playing the game, such a reporting feature was long overdue.
Someone who is toxic on one server is likely to behave the same on other servers. You might be capable of handling simple disputes and insults, but Mojang is better equipped than you are to properly deal with people making personal threats, child predators, and the like. This also includes smaller or even private servers.
Proper moderation takes time, and a lot of servers aren't able to provide that or willfully neglect it. Nevertheless, you can still easily circumvent reporting on your server if you wish to do so.
Whatever you do, don't join the angry mob; instead, provide constructive and useful feedback either on Minecraft's feedback site or open a ticket on their bug tracker - and remember to keep it civil.
You already have one. Ignoring the technical side of it, just imagine what Mojang would look like if they gave bad people the option to disable industry-standard player safety features.
If you know what exactly is broken, why not PR a fix?