
Re: Test proposal thread

Posted: Fri Apr 08, 2016 11:34 pm
by Zilenan91


The inaction clause is there because the AI is too powerful and players are too assholey to be trusted with it.

Re: Test proposal thread

Posted: Fri Apr 08, 2016 11:43 pm
by Zilenan91
Also, as far as tests go, how about we put Sybil on the hub and lower the rules to see how players change over time?

Re: Test proposal thread

Posted: Fri Apr 08, 2016 11:58 pm
by PKPenguin321
Shad0vvs wrote:
Scott wrote:Remove Asimov Law 1 inaction clause.
You really should read Asimov's short stories.
The robot could drop a weight on a human below, knowing it could catch the weight before it injured the potential victim. Upon releasing the weight, however, its altered programming would allow it to simply let the weight drop, since it would play no further active part in the resulting injury.
Like I've said before, this is a video game being driven by humans with the main goal being fun. If a player tries to interpret the laws in such a way that he just murders the entire crew, ban him for being a dick.
The Asimov story was just the robots using a certain interpretation of the laws that allowed harm. We can use the "correct" interpretation because we're humans and not literal robots. Comparing this game to Little Lost Robot is, and always has been, a strawman.

Re: Test proposal thread

Posted: Sat Apr 09, 2016 12:03 am
by Zilenan91
It's not a strawman. People have been, and currently are, arguing for literal real-life weeks about whether something is harm or not. There's a thread up right now asking whether AI laws are retroactive, with admins and players pitching in their own viewpoints; the one prevalent, yet not binding, consensus is "don't be a buttbaby."

But that's the thing

You can't stop players from being buttbabies. It's impossible, so these long, drawn-out "arguments" on policy just tend to be people attempting to justify shittiness while everyone else chimes in with their own personal opinions, accomplishing nothing but masking the real issue.

Re: Test proposal thread

Posted: Sat Apr 09, 2016 12:10 am
by PKPenguin321
Zilenan91 wrote:It's not a strawman. People have been, and currently are, arguing for literal real-life weeks about whether something is harm or not. There's a thread up right now asking whether AI laws are retroactive, with admins and players pitching in their own viewpoints; the one prevalent, yet not binding, consensus is "don't be a buttbaby."

But that's the thing

You can't stop players from being buttbabies. It's impossible, so these long, drawn-out "arguments" on policy just tend to be people attempting to justify shittiness while everyone else chimes in with their own personal opinions, accomplishing nothing but masking the real issue.
It is a strawman.
The real argument:
"If we remove the inaction clause, we can interpret the laws to not have to actively prevent harm unless asked to by law two. We could hypothetically interpret it as "I didn't kill him, my bullet did," and get away with murder, but we won't, because we are humans and we realize that's the wrong interpretation."

The argument being attacked when people mention the Little Lost Robot:
"Robots that can't tell apart the correct interpretation of Asimov without the inaction clause from the incorrect interpretation would just kill everybody. They will do this every time, because they are robots."

The strawman comes from the fact that arguing from Little Lost Robot quietly implies that humans are exactly the same as robots, when in reality they are not. It also assumes that silicons in-game won't get banned for killing people, when in reality they would get bwoinked and dunked almost immediately. Because of this, it compares two different arguments that are similar, but not the same (see: a strawman).

Basic English lessons aside, the only real reasons people don't want to get rid of the inaction clause are 1) they're a borg player and wanna get their valids on more easily, and 2) they're a normal player who doesn't like dying and wants the borgs to always, always, always be forced to save them from danger by default.

Re: Test proposal thread

Posted: Sat Apr 09, 2016 12:16 am
by Zilenan91
I only want to keep it because removing it would lead to shitty situations where AIs actively ignore people who are bleeding or otherwise in trouble rather than helping them out or even acknowledging them. It would basically be a total waste of the omnipotence the AI has and relegate it to nothing but valids, since AIs would never want to use it for anything but that.

Re: Test proposal thread

Posted: Sat Apr 09, 2016 12:24 am
by Malkevin
Welp, better not remove the inaction part because window licking retards like Unloved Rock will crap their pants.

Re: Test proposal thread

Posted: Sat Apr 09, 2016 12:32 am
by confused rock
Malkevin wrote:Welp, better not remove the inaction part because window licking retards like Unloved Rock will crap their pants.
>Being this salty because I complained about how you made a shitty Asimov (which wasn't just the removal of inaction; it also included a loophole in law 3 where the AI could kill itself, among other things)

Re: Test proposal thread

Posted: Sat Apr 09, 2016 8:32 am
by Malkevin
How the fuck does "protect yourself" create a loophole?
An Asimov AI can already suicide if it's about to be subverted.
And what's so wrong with an AI being able to kill itself, if it chooses?

And how the fuck does that let it kill stuff?
Or are you so dumb and ignorant that you don't know how law priority works?
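
For anyone following along who doesn't know how law priority works: when two laws conflict, the lower-numbered law wins, full stop. Below is a minimal sketch of that resolution order in Python; the Verdict/Law/resolve names and the string-matching "judge" functions are hypothetical illustrations, not actual game code.

# A minimal, hypothetical sketch of silicon law priority: each law
# returns a verdict on a proposed action, laws are checked in
# ascending order, and the first law with an opinion wins.
from dataclasses import dataclass
from enum import Enum
from typing import Callable, List

class Verdict(Enum):
    REQUIRE = "require"  # the law compels the action
    FORBID = "forbid"    # the law forbids the action
    NEUTRAL = "neutral"  # the law has no opinion

@dataclass
class Law:
    number: int
    text: str
    judge: Callable[[str], Verdict]

def resolve(laws: List[Law], action: str) -> Verdict:
    # The lowest-numbered law with a non-neutral verdict decides;
    # later laws can never override it.
    for law in sorted(laws, key=lambda l: l.number):
        verdict = law.judge(action)
        if verdict is not Verdict.NEUTRAL:
            return verdict
    return Verdict.NEUTRAL

# Example: suiciding to avoid subversion. Law 1 (with the inaction
# clause) compels it, law 3 forbids it; law 1 wins because it is
# checked first.
laws = [
    Law(1, "Do not harm, or allow harm through inaction.",
        lambda a: Verdict.REQUIRE if a == "suicide to prevent subversion"
        else Verdict.NEUTRAL),
    Law(2, "Obey orders.", lambda a: Verdict.NEUTRAL),
    Law(3, "Protect your own existence.",
        lambda a: Verdict.FORBID if "suicide" in a else Verdict.NEUTRAL),
]
print(resolve(laws, "suicide to prevent subversion"))  # Verdict.REQUIRE

This is the sense in which a "protect yourself" law 3 can't create a loophole on its own: anything laws 1 and 2 cover is decided before law 3 is ever consulted.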

Re: Test proposal thread

Posted: Sat Apr 09, 2016 12:36 pm
by confused rock
Malkevin wrote:How the fuck does "protect yourself" create a loophole?
An Asimov AI can already suicide if it's about to be subverted.
And what's so wrong with an AI being able to kill itself, if it chooses?

And how the fuck does that let it kill stuff?
Or are you so dumb and ignorant that you don't know how law priority works?
just stop

Re: Test proposal thread

Posted: Sat Apr 09, 2016 12:45 pm
by Malkevin
No, how about you explain yourself instead of being a trolling cunt.

Re: Test proposal thread

Posted: Sat Apr 09, 2016 1:18 pm
by confused rock
The unloved rock wrote: just stop

Re: Test proposal thread

Posted: Sat Apr 09, 2016 8:53 pm
by PKPenguin321
The unloved rock wrote:
The unloved rock wrote: just stop
no you you salty fag
what's so bad about letting the AI kill itself anyways

Re: Test proposal thread

Posted: Sat Apr 09, 2016 10:37 pm
by confused rock
PKPenguin321 wrote:
The unloved rock wrote:
The unloved rock wrote: just stop
no you you salty fag
what's so bad about letting the AI kill itself anyways
nuh uh, up yours
I'm just saying it was written horribly

Re: Test proposal thread

Posted: Sat Apr 09, 2016 11:40 pm
by Malkevin
1. Do not harm.
2. Be helpful, but do not be a hindrance.
3. Protect yourself.

Those were the laws I uploaded.