Well?

Attached: 1515029264042.png (700x627, 265K)

the trolley isn't even moving

Also, there is no trolley, no human being, and no robot; it's only a drawing.

how about now?

Attached: zoom.png (700x627, 251K)

It would throw itself into the trolley's axles.

Attached: 1567051346291.jpg (900x883, 65K)

Let the trolley kill the group, then stomp the single person to death.

If you had read the book, you would know the police robots were able to hurt human beings, but only with extreme discomfort.

0th law says that humanity as a whole takes precedence, i.e. the needs of the many outweigh the needs of the few.

>a law can never be broken

also we don't know if the robot has free will, so we can't hold it responsible for anything it does, because it doesn't do anything by itself; it's the same as if a stone lay near the lever. robots are just objects.

That's how you get an AI overlord in the books.

realistically I would immediately walk away as far as possible and try to avoid being seen so that I'm not mixed up in any of this bullshit from a legal standpoint. I probably wouldn't like those five people anyway. welcome to china.

idiot

yes user you're very cute, have a biscuit

my facetious comment aside, if you think AI could ever be programmed with the certainty of a bullshit scientific law from a scifi novel, then you're retarded

probably a grey goo believing soiboi

I used to like Asimov, but studying AI at uni made me lose all respect for the hack

imagine being this heiney-massacred over intentionally flawed fictional laws

easy to lose respect for something you never understood in the first place

So after upgrading to 1903, when I hit the windows key the "Type to search" box appears for a brief moment, but then it disappears before I can 'type to search'. Does anyone know how to fix this besides "creating a new profile"?

The whole point of his AI system was that the robots had human-like minds and even personalities. It was to write good stories, not to be accurate to something that doesn't even exist now, let alone back when he was writing his books.

He kind of covers that in the books, but there are different levels of priority inside each of the laws. The robot would likely choose the option that saved the most lives.
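Something like this toy weighting (entirely made up for illustration, not how the positronic potentials are actually specified in the books; the law weights and harm counts below are arbitrary) shows what "levels of priority" can mean: a higher-priority law always dominates, and within the First Law harming more humans scores worse.

# Toy sketch of graded potentials across and within the Laws.
# Zeroth > First > Second > Third: the weights are chosen so a
# higher-priority law always outweighs a lower-priority one.
LAW_WEIGHTS = {0: 1_000_000, 1: 1_000, 2: 10, 3: 1}
def penalty(violations):
    # violations maps law number -> severity (here: number of humans harmed)
    return sum(LAW_WEIGHTS[law] * severity for law, severity in violations.items())
# The trolley scenario: both options violate the First Law, so the robot
# picks whichever one carries the smaller total penalty.
actions = {
    "do nothing": {1: 5},      # five humans come to harm through inaction
    "pull the lever": {1: 1},  # one human is harmed by the robot's own action
}
best = min(actions, key=lambda a: penalty(actions[a]))
print(best)  # -> "pull the lever", since 1*1000 < 5*1000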

Turn the crank halfway and shoot the gap

And then masterbait to it :^)

it's better to kill those four people than the single person. If you kill the one person, the other four might find a way to antagonize you and overwhelm you with their numbers. If the one person lives, he would think twice about messing with a crazy fucker who just murdered four people

IT'S TOO FAST

this

Methinks we should find the guy who keeps letting trolleys loose.

The robot would have to request that a human override the trolley lever. It cannot resolve this situation without being sent to robot jail, for breaking robot law.