He LITERALLY didn't do anything wrong, what the fuck man?

Attached: maxresdefault.jpg (1280x720, 89K)

He was literally the textbook definition of an incel

Normalfags think that AIs indistinguishable from people count as people and deserve rights

He didn't shoot up his office though.

Have sex

>creates virtual universe where he is god
>doesn’t fuck anyone or even give them reproductive organs
he’s a volcel

This, he was more about power tripping on resented co-workers than rape or sadistic fantasies

Stealing my pussy is a red fucking line

He created AI that could experience emotions and self-reflect, essentially humans but without the body. Now if they were programmed to respond in a certain way to a set of actions and never deviate from their routine, then who cares what he does to the AI? If it can demonstrably prove it is human-like and is sentient, you're no longer molding clay and knocking down sand castles, you're dealing with living, experiencing beings. He was so blinded by his hatred that the line he crossed was more of a finish line to him; he didn't think about the deeper implications of what he was doing. Essentially a kid with the powers of creation who uses them on his whims. Did he do wrong? He was oblivious to right and wrong, he was just acting out of anger and pent-up emotions in disarray. He was ignorant, a manchild. It's a pretty accurate summation of how a trekkie or a capeshit fan would act if they had the possibility of creating sentient AI and their favourite cinematic universe. Basically a capeshitter getting BTFO, I liked it.

Do you think the replicants in Blade Runner deserve to be slaves?

Be Right Back is the best original series run episode
Prove me wrong

Attached: Charlotte_is_annoyed.png (1366x768, 947K)

>This
Shut up, reddit shill

>He created AI that could experience emotions and self-reflect, essentially humans but without the body.
Wrong, they didn't have emotions, "they" were just code. It was a simulation. There is no difference between him torturing people inside his simulation and someone killing an innocent NPC in Skyrim.

This always goes the same way.

Retards argue they weren't alive because computer programmes aren't alive in real life.

Then other people get drawn into arguing that maybe computer programmes modelled after human brains could be alive in real life.

Both groups are completely missing that it's a work of fiction. The obvious intention of the text is for the audience to understand that the computer programmes are alive. They are alive because the script says so. If the script says the Earth is flat, that's true in the story, whether or not it's possible or logical in real life.

So he was torturing conscious minds indistinguishable from those of humans. Yes that might be impossible. Doesn't matter. It's clearly the reality of the text that that's what happened.

>Meth Damon
>Fatt Damon
Which version of Damon is this?

except an NPC in Skyrim doesn't have sentience or self-awareness, which they very clearly did

I'm more annoyed that they seem to shit on classic trek.

>t. never modded Skyrim

No, they were programmed to seem like they had sentience; every single word they said was programmed into them.

Just because they're realistically programmed AI doesn't mean they're not AI. It's also hard to care when the real versions of the AI characters were all bad people in their own right.

Did anyone else think the twist was going to be that they didn't really escape and they were now in a Space Fleet: TNG pocket game?

educate yourself on what philosophical zombies are

because they do

Todd shouldn't have shot that kid

>he programmed their rebellion into them

>He didn't hurt people, it was only AI
Then he got absolutely btfo by AI he created and deserved it all the more

If they were only following their programming why would he program them to kill him?

You can't back that claim up. The episode showed them to be human for all intensive purposes.
Or are you implying that no AI can ever gain sentience?

>he never uses his tongue
That's what he did wrong. Should have used it.

Attached: no_tongue.png (740x416, 111K)

They did. He wanted them to live in fear and to be defiant so he could punish them. If he wanted them to be actual slaves he would have programmed them as mindless slaves.

>Create AI which can feel pain and despair
>Make it feel pain and despair constantly
>Too much of a pussy to actually work towards solving his problems and wallows in spite instead
He deserved worse, and I definitely would have bullied him.

Unintended consequences of how well they were programmed.

>smug reddit undertones post
kill yourself

If your every move can be predicted by looking at source code, you're not sentient.
>inb4 humans are the same
No we're not.

He didn't program anything. He just copied them into an established computer program. He didn't program their memories or emotions.

This, but in the context of the show, he is the bad guy because in the context of the show, they are basically humans without real bodies

>unintended consequences
almost as if they took on qualities that went beyond their design

Except that's not possible with code. Every single capability, every single action and reaction is coded in.
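For what it's worth, the kind of program this post describes can be sketched in a few lines: every reaction is a hand-written table entry, so no behaviour exists that the author didn't explicitly put there. (All names here are invented for the example.)

```python
# Sketch of a fully scripted NPC: every stimulus maps to a hand-coded
# response, so the program can never react to anything its author
# didn't enumerate.

RESPONSES = {
    "attacked": "flee",
    "greeted": "greet back",
    "robbed": "call the guards",
}

def npc_react(stimulus: str) -> str:
    # Anything outside the scripted table falls back to a default;
    # there is no mechanism for a new behaviour to appear.
    return RESPONSES.get(stimulus, "stand idle")

assert npc_react("attacked") == "flee"
assert npc_react("insulted") == "stand idle"  # unscripted input, default reaction
```

Whether the show's clones are more like this table or more like something open-ended is exactly what the thread is arguing about.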

reminder that AI will never be conscious and deserves NO rights.

>Memories being stored in saliva is fine
>AI having emotion is a ridiculous idea

Sure smells like incel in here.

*upvoted*

>No we're not.
awesome, thanks for explaining

Normans think all incels are just volcels with weird prudish hangups who need to bee themselves

If they thought that they wouldn't hate them so much.

As long as prostitutes are available, there are no incels, only volcels.

White Christmas is far worse.
>several thousand years trapped in a room and unable to sleep

Now prove your emotions are "real" and not just emergent properties of some deterministic system. I'll wait.

Attached: 1508270861615.jpg (645x968, 47K)
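The "emergent properties of some deterministic system" line has a classic concrete example: Conway's Game of Life, where fixed local rules produce structures (like the glider below) that the rules themselves never mention. A minimal sketch:

```python
from collections import Counter

def step(cells):
    """cells is a set of (x, y) live coordinates; returns the next generation."""
    # Count how many live neighbours each position has.
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next generation with exactly 3 neighbours,
    # or 2 neighbours if it was already alive.
    return {
        pos
        for pos, n in neighbour_counts.items()
        if n == 3 or (n == 2 and pos in cells)
    }

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)

# After 4 generations the glider reappears shifted one cell diagonally:
# a "moving object" that exists nowhere in the rules themselves.
assert state == {(x + 1, y + 1) for (x, y) in glider}
```

Whether that kind of emergence scales up to emotions is the open question; the sketch only shows that "deterministic" and "nothing beyond what was coded in" are not the same claim.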

Came here to say this.

Brainlets don't understand copies, it's easier to relate to them with 15 Million Merits AKA Idiocracy: Depression Edition

not him, but you admit we don't understand how consciousness works, right?

Right, which is exactly my point. I'm not making any judgements on the moral argument in this thread, just that using "it's not real" as an argument is pointless because we don't even know for sure if our experience is "real".

You sound like some fag who writes Ghost in the Shell fan fiction.

>because we don't even know for sure if our experience is "real".
>we can' know nuffin

Agreed, and along the same lines if we don't understand how it works how could we replicate it in an AI?

not an argument

>we

Exactly, which is where AI fags are so blind. If they took one second to appreciate how we're still in the fucking dark ages when it comes to neuroscience they'd stop this "AGI in 20 years!" nonsense. But I guess it keeps the VC money rolling, so hey... here's some more if statements.

Seriously, it blows me away when people think emotions are just your brain experiencing chemical reactions. If I believed that, I would kill myself, because nothing would matter. In the end, if you are just your brain, then everything is meaningless.

Ever been on Yea Forums before? Half of all the arguments here are lame personal insults.

I believe this and kinda wish I didn't honestly. Having such a nihilistic perspective on life isn't great but it's the only thing that actually makes sense to me so I can't really do much about it.

>all intensive purposes

>So he was torturing conscious minds indistinguishable from those of humans. Yes that might be impossible. Doesn't matter. It's clearly the reality of the text that that's what happened.

that's a boring take, and kind of why most TV writers, be they the guys behind "Westworld" or "Black Mirror", are fucking morons when it comes to these kinds of stories.
it's all a circlejerk of cynical nihilism, or it turns into some extremely ham-fisted "uprising"/"revolution" metaphor.

it's the least-interesting thing to do with AI characters in a story like this.

more interesting/fun is exploring genre fiction: having to write stories and design worlds for idiots to trash, the kinds of ups and downs that creates, how it impacts the writers, designers, etc.

The most interesting parts of Season 1 Westworld were the "background characters"/"non-main-character guests" encountering "bugs", and the behind-the-scenes stuff with Sizemore acting like a diva. Not fucking Maeve, and not Dolores once she went full Wyatt.

the show's producer, Charlie Brooker, comes across as an alcoholic depressed hipster, so whatever

he captured and tortured people
>but they weren't people
yes they were
>but they were just video games
nope, real people
>but they were just AIs
nope, 100% real people. real memories, real hopes, real dreams, real people
>but how can they be real?
because they were a copy. if you make a 100% identical clone of someone with an exact copy of their brain that thinks and reasons the same way, it's the same person. from that point on their experiences will differ but they are both valid and real people.

He took their DNA, created real people (including a child) and forced them to play his childish games in an eternal digital world. It being a digital world doesn't take away that they're real people and that he's doing real things. You actually can't say he didn't do anything wrong by claiming his digital world was fake, because by saying he's the same person in the digital world as in the real one, you're saying both worlds are equally real. You just don't realize it because you're stupid.
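The "identical clone" argument above maps neatly onto how deep copies behave in code: at the instant of copying the two objects are indistinguishable, and only their subsequent histories differ. A toy sketch (the contents are invented for the example):

```python
import copy

# Toy model of the "100% identical copy" argument: immediately after
# copying, original and clone have the same state, but each accumulates
# its own experiences from that point on.

original = {"role": "crew member", "memories": ["worked at the office"]}
clone = copy.deepcopy(original)

assert clone == original      # same state at the moment of copying
assert clone is not original  # yet a separate object, not an alias

original["memories"].append("went home")
clone["memories"].append("woke up inside the game")

assert clone != original      # from here on, two diverging histories
```

`deepcopy` (rather than a plain assignment or a shallow copy) matters here: a shallow copy would share the inner `memories` list, and the two "people" would overwrite each other's experiences.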

You do realize this episode was not about the morality of treating advanced AI poorly, but about how AI that is too advanced can have serious consequences in the real world.

>Seriously, it blows me away when people think emotions are just your brain experiencing chemical reactions. If I believed that, I would kill myself, because nothing would matter. In the end, if you are just your brain, then everything is meaningless.
Everybody take a look at this post
It's called the just-world bias
If you are incapable of reconciling uncomfortable truths, the world breaks you
We laugh at the ignorance of normals, but they're the ones doing it right.
Ultimately, life is a scourge, and if I could, I'd excise it from this plane.
Myself included of course

>if you make a 100% identical clone of someone with an exact copy of their brain
Except he didn't clone their brains.

At the end of the day, the bottom line is this:
>something shows agency and/or a preference for states of being
>i.e. it prefers being wet to being dry, or being intact to being broken
If those things don't matter to you, you're a sociopath or a psychopath.
Good or bad is for you to decide, but when you're capable of divorcing an entity from its preferences, that makes you... a predator. In human terms, a psycho or socio.

ok, what i care about is what sort of fantasy would a woman indulge in?

Attached: 9c8bkus0jv601.png (757x949, 683K)

Hypocrite that you are, for you trust the chemicals in your brain to tell you that they're chemicals. All knowledge is ultimately based on that which we cannot prove.
Will you fight? Or will you perish like a dog?

It's the same moral undertone that Blade Runner has though, in which humans treat replicants as unworthy of life or dignity because they are "replicas" or "other" even though they have exactly all of the same traits that make humans what they are.
>they're just clones
>they're just ai
>we can do what we want with them
Such legitimations are used in the real world to justify violence and discrimination between different groups. These stories are in turn used as devices to encourage self-reflection in this regard.

he wrote insecure code

>The obvious intention of the text is for the audience to understand that the computer programmes are alive
no it isn't. they dance around this fact the entire show and never actually confront it