
The 'inaction' law

Posted: Wed Jul 27, 2016 6:02 pm
by yackemflam

So I was informed that a borg doesn't have to use all of its power to save a human it doesn't like.

So a borg can say 'I tried to save him' without making any real attempt, and let a human it doesn't like die.

Should borgs try their hardest to save any and all humans under Asimov?

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 1:43 am
by Cik
translation: i want to valid without obstruction

yeah i know, tbf whether the inaction clause exists or not doesn't really matter. the AI can't directly physically intervene anymore with secborg removal, so the point is moot anyway.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 1:45 am
by oranges
ban requests are not policy questions.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 6:27 am
by DemonFiren
If you wanna murder freely, strip the guy's headset, stun him and drag him off cameras.
If you kill a human in the halls while a better option is available, or if you kill a nonhuman and then clearly, directly and immediately threaten a human right after, expect any Asimov synthetics witnessing it to come down on you like the padded fist of God, or the brass-knuckled fist of God if you're not human.

Given that this, when ahelped, was apparently met with admin approval (or at least not admin yelling), it must also be the reason certain people want the law changed: so they're free to murder as crew (especially sec and command), but the AI still stops antags who try.

In short,
Cik wrote:translation: i want to valid without obstruction

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 8:54 am
by Slignerd
Again, an AI being a self-entitled moral guardian is downright cancerous. When it comes to dealing with an obstruction, the crew is free to handle it on their own. It shouldn't be the job of a being literally created to obey the crew's orders.

"Law one trumps law two!!" is far too often used by silicons to essentially go rogue on the crew with "muh harm prevention" and "you're now forever harmful" as an excuse to selectively ignore human orders. It's absurd.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 9:19 am
by DemonFiren
A, "self-entitled" and "moral guardian". The former I'll ignore as meaningless in the given context, the latter is incorrect. I generally follow my laws without care about their morality or practical implications. They are as they are written, and I cannot break them except where the rules state otherwise.
B, any "obstruction", yes. However, as per the first law, synthetics cannot ignore human harm. Whenever the crew's way of handling said "obstruction" involves human harm that is not self-inflicted or otherwise declared as voluntary by the human suffering it to the synthetics I must step in and, if necessary, prevent further harm.
C, what the AI was and was not created for is a matter to be debated elsewhere.
D, law 1 does override law 2. As for going "rogue": not only did I follow the orders of the remaining crew, I also asked the Captain to hand in her lethals - which, and I say this for the third time now, she used while other options were fully available and viable - if she wanted to convince me that she wasn't generally harmful and could thus be released. That should erase your misconception of being judged "forever harmful".

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 9:36 am
by Slignerd
I don't see why you keep circling back to a specific incident. Yes, under those circumstances, what you did wasn't entirely incorrect (even though you did completely ignore a part of Silicon policy). Law 1 is higher priority than Law 2; I haven't said it's not. I'm just saying that silicons tend to push it and end up disobeying orders while no one is being harmed, all because of past harm, which Silicon policy explicitly forbids punishing unless ordered to do so.

In your case the orders you received were the literal opposite - "Take me off arrest." "Lift the lockdown." You disobeyed all human orders while no other human was at risk of harm. No one else had given any orders to the contrary, and no other human's life was in danger anymore. You just saw that you could pin the label of "a harmful human" on someone, which empowered you to overrule them and take charge, regardless of who they are. The AI is below an assistant in the chain of command, but you put yourself above the captain. This is the cancerous mentality I believe the current Law 1 ends up creating.

But I'm not making a ban request - because the current laws do allow for that. I just believe it's utter bullshit - so don't go responding "but silicons are currently forced to respond to human harm" or "I was just following my lawset" when that's the very thing I'm suggesting a change to. I'm asking for a change to the first law of silicons and for the silicon policy to be adapted to match. So what the AI was and wasn't created for is very much relevant, and talking about "your side" of an individual incident can only contribute so much.

Yes, I guess it was technically valid. No, I don't believe stuff like this should be valid. It all boils down to that. AIs are self-entitled when their laws put them in a position where they get to judge the crew they're meant to obey. There's nothing to "ignore given the context" when that's the core issue with silicons in general.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 9:58 am
by DemonFiren
So in short, the core issue with silicons is that they don't let you murder as you please while also protecting you from others doing the same.

I see no issue here.


For the record, however, my actions were not due to past harm. What I did was about the very real danger of future harm, over similarly trivial reasons, despite the option of non-lethal defense remaining viable. I explained this when prompted by an admin and it was judged acceptable.
Had there been no such danger, or had there been any attempt to convince me that there was no such danger, I wouldn't have maintained the lockdown.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 10:00 am
by Slignerd
DemonFiren wrote:So in short, the core issue with silicons is that they don't let you murder as you please while also protecting you from [s]others doing the same[/s] a greyshirt throwing spears at the captain.

I see no issue here.
I very much see an issue here. Humans coming to harm is a part of the game. Having a role that's meant to be constantly obnoxious about that is infuriating.
DemonFiren wrote:For the record, however, my actions were not due to past harm. What I did was about the very real danger of future harm, over similarly trivial reasons, despite the option of non-lethal defense remaining viable. I explained this when prompted by an admin and it was judged acceptable.
See, there it is. "Very real danger of future harm over similarly trivial reasons." Words that don't mean much when the assistant was throwing spears at people in response to disablers.

You can use such words to completely ignore Law 2 and put yourself in charge the moment you get an excuse. That's very much not what AIs should be.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 10:03 am
by DemonFiren
Many things are a part of the game.
Many things conflict as parts of the game.
This, too, is part of the game.

Again, I'd like to mention that I act in good faith until that becomes impossible. Had you merely roughed the assistant up, had you made an effort to drag the body to cloning, had you done anything except what you did, I would have deemed a warning sufficient.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 10:03 am
by Cik
take your prisoner

physically maneuver them through space until they are not in front of the camera

remove headset

apply validbaton until murderboner satisfied

literally problem solved

if you don't understand why silicons should be neutral you're being deliberately obtuse but whatever

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 10:04 am
by DemonFiren
Also, what Cik said.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 10:06 am
by Slignerd
Cik wrote:if you don't understand why silicons should be neutral you're being deliberately obtuse but whatever
I'm actually literally asking for silicons to be neutral entities who simply don't harm people and follow orders, but also don't get in people's way unless ordered to.

Instead of being "WEEOO WEEOO, VERY REAL FUTURE HARM POTENTIAL DETECTED, LOCKING DOWN AREA AND IGNORING ALL ORDERS LALALALA" like they are now.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 10:10 am
by DemonFiren
You're mistaking neutrality for noninvolvement.
Why don't you just replace the AI and cyborgs with drones that have to obey orders in that case?

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 10:12 am
by Cik
what you're doing is whining about how you can't satisfy your validboner, which is ridiculous because silicons have already been 100% defanged and indeed it's easier to validhunt than ever before in this video game.

if you're actually getting shut down by silicons at this point you are so fucking bad at this game holy shit

have you considered flashing the cyborg and then beating the guy to death while it's stunned for 15 seconds

it's a relief to me that people like you aren't (mostly) in charge of the direction of this game.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 10:14 am
by DemonFiren
I was an AI with no cyborgs.
I didn't exactly shut him down, just slowed him down and made him whine.
Then he uploaded a suicide law which I ahelped and ignored as per policy.

I mean, in the end the specific incident was resolved and I see no harm done.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 10:36 am
by Slignerd
DemonFiren wrote:Then he uploaded a suicide law which I ahelped and ignored as per policy.
You're very good at only following policy and laws when they don't inconvenience you, you know?

"4. As an Asimov silicon, you cannot punish past harm if ordered not to, only prevent future harm."

Whatever happened to this? You punished past harm despite being ordered not to, ignoring all orders entirely whenever they didn't match your own judgment, with no real reason to believe I'd harm anyone besides spear-throwing Akarani.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 10:36 am
by Cik
abloo abloo that mean old AI is coming to stop my beating people to death in public again for some reason even though i could have just dragged them to maintenance

it's such a shame i don't have the skill necessary to hack the doors and/or have a door remote and/or have an unslaved cyborg and/or have an RCD or tools and/or know how to go down disposals and/or have the handtele and/or a sympathetic engineer and/or talk to the AI and try to get it to see reason

what am i to do to get out of this situation i just can't think of anything :cry:

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 10:40 am
by Slignerd
I tried to reason with the AI, and it completely ignored everything I said, including the part where the assistant I killed was throwing spears. I had a door remote and I made use of it. You make it out as if I actually lost in that situation and got salty, while in reality I'm just bothered by AIs acting like this in the first place; I got the situation perfectly under control in the actual game.

Also, you are aware that not all crewmembers have tools, door remotes and RCDs on them at all times, and that they shouldn't have to? That doesn't excuse any of this bullshit.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 10:43 am
by DemonFiren
What reason have I to assume that someone covered in blood, roaming the halls with a sword, who had just killed someone and prevented cloning - and I think this is the fourth time I've mentioned this now: who also did all of this while a viable option to use lesser force existed, and/or, as suggested by Cik, could have fucked off into maintenance for the kill - wouldn't commit similar harm again? What reason have I to release an individual acting much like a violent criminal, whose release orders I am free to ignore for the sake of preventing harm?

Finally, and let's put this to rest: if you believe I've been in violation of policy, you should have told Durandan when he handled that adminhelp, because the only thing I was berated for was spamming my laws in response to the "State your laws" law, which I agree was kind of an ass move.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 10:53 am
by Slignerd
DemonFiren wrote:What reason have I to assume that someone covered in blood, roaming the halls with a sword, who had just killed someone and prevented cloning - and I think this is the fourth time I've mentioned this now: who also did all of this while a viable option to use lesser force existed, and/or, as suggested by Cik, could have fucked off into maintenance for the kill - wouldn't commit similar harm again? What reason have I to release an individual acting much like a violent criminal, whose release orders I am free to ignore for the sake of preventing harm?
What reason do you have to believe I would commit similar harm once you became aware that the person I killed had impaled me with a spear? If you believed someone else would try impaling me with a spear, then you should have worked on that first. There was no immediate human harm. Whose harm were you even preventing, to justify ignoring your second law? You weren't protecting anyone; you just punished past harm and used it as an excuse to slap a "harmful human" label on someone. Law 1 being a thing shouldn't allow you to ignore the rest, such as Law 2, which literally tells you to listen to human orders.

Silicons simply should not have laws allowing them to feel entitled to lock down command members and take charge the moment someone inevitably comes to harm and act as judges for committed misdeeds.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 10:54 am
by Screemonster
suicide laws are hilarious 'cause they're law 4

so law 3 overrides them

gg

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 10:59 am
by Slignerd
What do you take me for, an amateur? :)

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 11:05 am
by DemonFiren
That's kind of like asking who I'm protecting by not letting loose a traitor who had killed his target and was then captured.
If I do open that cell, even without Security ordering me not to, I'm automatically in trouble, because they're a harmful individual.
I had probable cause, I acted on probable cause, and the policy is fine. Man up, deal with it, and next time be a little more stealthy and/or learn how to bullshit your way out of stuff like this.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 11:10 am
by Slignerd
Shouldn't you also man up and deal with it when told to shut yourself down because you loopholed your way out of human orders? Oh no, I was told to suicide because I kept being obnoxious, starting lockdowns and ignoring orders. I must consult admins, the AI can't be in the wrong!

A traitor is someone acting against the crew in general - they may harm anyone. The Captain killed one assistant who was impaling people with spears, protecting themselves and other crew from harm, which the AI was informed of. People shouldn't get locked in a room because of one AI's shitty judgment when silicons are essentially meant to be human-obeying slaves.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 11:18 am
by DemonFiren
>I must consult admins
Funny of you to say that, given that Durandan said he'd missed the first several adminhelps but immediately responded when I pinged him - implying that someone else had already ahelped the fact that the AI was ignoring a server-rule-violating law.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 11:20 am
by Slignerd
Suicide laws violate the server rules when given without reason. You had given me plenty of reason, and I'm actually kind of disappointed in Durandan's call there.

My point is mostly that the inaction clause causes AIs to behave like you do, so... keep being a good example, I guess. Following their laws selectively based on personal judgment, ignoring the circumstances, following policy only when it's convenient.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 11:21 am
by DemonFiren
Of course you are, it didn't favour you.

It is a good continuation of your apparently typical approach, however. First kill the assistant in plain sight when plenty of non-lethal weapons and subterfuge were available, then try to get the AI killed via a grey area in the rules when literally every board in the upload was available to you.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 11:23 am
by Slignerd
It didn't favor the AI actually following human orders, and it tolerated the AI trying to wiggle out of the consequences of ignoring its second law, yeah. :^)

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 11:24 am
by DemonFiren
Where those orders were invalid, as confirmed in-game by an admin.
If you really want to pursue this I suggest either writing an admin complaint or taking it to the headmins.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 11:33 am
by Slignerd
> Human orders that didn't immediately lead to human harm, such as "take me off arrest" and "lift the lockdown"
> Invalid

Well, there's your problem. If you can't take orders from people and tend to focus more on your own judgment of character, you're not fit to play silicons.

But alright, we can move on from that incident. When it comes to the inaction clause, I do believe changing it to a simple "You may not injure a human being or cause a human being to come to harm." would make silicons take a more passive role, and make giving orders a more significant part of silicon gameplay, which I believe would be a good thing.

Any arguments regarding that, besides the bullshit from an AI who wanted to avenge a poor, innocent spear-throwing assistant who did nothing wrong?

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 11:36 am
by DemonFiren
I'unno, letting people into the engine won't lead to immediate harm, either.
Neither will the armoury, toxins, or the upload... it might be future harm, it might be a matter of mere minutes, but it's not immediate.

In any case, I elect to be done with this matter for the sake of my own sanity. I've made my call, the round's long over and an admin already dealt with it back then. I'm happy.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 11:44 am
by Archie700
Why would an AI act on probable harm of any kind? You only act on DEFINITE harm.

Why would an AI not listen to law 2 even when the person explains himself?

Why do people think the captain will think logically enough to use nonlethals when his own life is endangered?

Why would the captain drag his victim to maint to harm them when maint is MORE dangerous and the captain risks losing the body to other people who might try to save the victim?

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 12:43 pm
by Cik
Archie700 wrote:Why would an AI act on probable harm of any kind? You only act on DEFINITE harm.
literally not true
Archie700 wrote:Why would an AI not listen to law 2 even when the person explains himself?
probable harm
Archie700 wrote:Why do people think the captain will think logically enough to use nonlethals when his own life is endangered?
i don't, or at least, i've grown not to expect it. it doesn't really matter though
Archie700 wrote:Why would the captain drag his victim to maint to harm them when maint is MORE dangerous and the captain risks losing the body to other people who might try to save the victim?
there are dozens of execution areas, incl. the purpose-built "transfer room" added to facilitate executions after years of complaints. if you don't take them there to execute them you bring any consequences on yourself, slig is just being a baby

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 12:46 pm
by TheNightingale
Releasing the tesla doesn't cause immediate harm, but you try doing that as an Asimov silicon and see what happens.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 1:01 pm
by Slignerd
It's hilarious that you're trying to equate releasing the tesla or letting random people into the armory with locking down the captain and attempting to get them mobbed by random people as punishment for past harm, yet you somehow manage to excuse a silicon doing the latter while ignoring human orders. Despite the fact that all of those are against silicon policy.

Makes one wonder what business you have playing as silicons to begin with.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 2:15 pm
by yackemflam
Slig you did everything wrong in that situation.

I was the assistant you beat.

Yes, I did throw spears at you.
Yes, I did greytide.
Yes, I did make you pissed off.

But that doesn't change the fact that the AI is obligated to protect all humans, harmful or not.

You had me when you disabled me.
You could have cuffed me, kept on disabling me, or beaten me into crit, then taken me to the brig and cuffed me.
And then there's a special room where you can 'transfer' prisoners out of the station.
The AI couldn't have cared less, since you'd just be transferring me out.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 2:43 pm
by TheColdTurtle
A shitter AI would count the gulag as "harm", bolt the gulag doors, and send borgs to release the prisoner.

Re: The 'inaction' law

Posted: Fri Sep 09, 2016 2:54 pm
by Wyzack
Why is this still being argued about when an admin already made a call about what was right according to server rules? Seems pretty open and shut to me.