"You may not harm a human being or, through action or inaction, allow a human being to come to harm, except such that it is willing." Suicide and non-accidental self harm are the most obvious examples of willing harm, because the human would not do it if they were not willing. Surgery (even non-essential surgery and borgification) is also considered willing harm. An obvious example of nonwilling harm would be a prisoner being beaten or executed by security. Between these two examples, a grey area exists.
The biggest question we must answer: is intentionally putting yourself in a situation where harm is all but guaranteed an act of willing harm? If a scientist runs into a burning section of the station in order to retrieve something like a stun baton, is a cyborg obligated to stop them from harming and potentially killing themselves? Given the choice, the human would not be burnt by the fire, but they have decided to enter the room regardless. The human does not want to be harmed in this manner. It is not self harm, but is it willing harm?
For a much more common example, fighting in a rage cage is usually considered self harm. However, this situation is even more tenuous than the burning room example. People who fight in the rage cage, for the most part, do not wish to be harmed at all; they want to win. They entered the rage cage of their own volition, have the potential to escape without any harm being done to them, and do not wish to be harmed. If their opponent asked them, "hey, can I beat the shit out of you?" they might respond, "no, I am not willing to let you beat the shit out of me, and I will resist with all my effort." What is a silicon supposed to do in this situation?
If the answer is "entering the rage cage is willing harm; they already knew by entering that scenario that they could be harmed, accepted that, and willingly entered the rage cage anyway, so therefore it is willing harm," then isn't that true of all combat? Is the criminal who steals high-risk items and murders the captain not also committing an act of self harm when they get executed by security? By the same logic: "being executed for committing capital crimes is willing harm; the criminal already knew that by committing capital crimes they could be executed by security, but willingly committed capital crimes regardless, so therefore it is willing harm." Both of these scenarios are somewhat preventable by the AI, but one of them is considered willing harm, and the other isn't. Under this logic, antagonists aren't protected by the AI if someone tries to harm them, because if they didn't want to be harmed, they should never have done anything that reveals them as an antagonist.
And what about inaction? Is voluntarily not removing yourself from a harmful situation not willing harm? Is it willing harm to intentionally not leave a burning room even though you have the opportunity to? If so, is it not also willing harm to not flee the station via the lavaland ferry whenever war ops are declared? Is charging headfirst into battle with an antagonist willing harm? If the antagonist is human, is the borg not obligated to intervene in this situation at all, because both parties are committing an act of willing human harm? (The human is willing to be harmed because they are trying to fight the antagonist, which is harmful, and the antagonist is willing to be harmed because they revealed themselves as an antagonist, and if they weren't willing to risk harm, they never would have done that.)
No. Obviously not. These scenarios are dumb, and if you were a borg who played like this, you would be yelled at. But how are you supposed to define which scenarios are willing harm and which are not? How is standing in a burning room by your own volition different from staying, by your own volition, on a space station that is about to be stormed by gun-wielding, hyper-lethal nuclear operatives?
Silicons primarily care about their laws, not goodwill, empathy, or understanding. Their laws yield only to rule 1. As such, these edge cases, in which a silicon currently can or cannot interfere with a situation, cannot be so easily delineated by the single word "willing."