What's up, anons. I started thinking about life today and I want to hear what some of you seriously think. I'd like to start devoting myself to productivity and to helping human society increase its longevity in any form. This thread is for discussing how, minus the typical "donate to charity and pick up trash" bullshit. I feel we are on the precipice of either guaranteed security or utter failure, and the attitudes of future generations will decide what happens from here. Some thoughts I've had: we obviously must establish ourselves in space fairly soon (a solar-system presence first, the galaxy later, once we find ways to survive without our normal resources), and I've been thinking of ways to support the most capable part of the population. The biggest issue with society, in my eyes, is how much we coddle the weak. Evolution calls for the strongest to survive, but our genes are becoming muddled now that we have the ability to shield the disabled, the diseased, the lazy, and the criminal. The SJW phenomenon boggles my mind, as does the hardcore conservative. Our hardest-working men and women, along with our most intelligent people, should be lifted up and should receive the fruits society has to offer, to encourage better productivity. Surprisingly, that's not how it is; we reward the sick, snake-like behavior of our two-party system and "film celebrities" instead. How do we get on track toward real action for helping the human race, b/?

Attached: sexWrgcU5ksehByoeTf4bQ-320-80.jpg (320x219, 7K)

Other urls found in this thread:

en.wikipedia.org/wiki/Last_man
youtube.com/watch?v=HOJ1NVtlnyQ
youtube.com/watch?v=tcdVC4e6EV4
youtube.com/watch?v=UQibteKp5kc

We must make it to the stars

You're not smart, bro, you just smoke weed and have Internet access.

I think you're looking in the wrong direction. AI is the answer. It is also a potential land mine. But it is the only answer to increasing human longevity, if that is what you're after.

Stay on topic.

Haven't smoked in ten years, brother. I've kept a job my whole adult life but don't quite feel I'm doing my best.

You don't feel we should be concerned about the possibility of AI seeing humans as a threat?

Get off Yea Forums lol

Of course, hence the land mines. But if we CAN harness AI safely, we could become gods.

Paradigm

Attached: A7X-Stage.jpg (680x680, 94K)

Stfu kid

Wut

Shit band.
Also, what?

Have you seen any actions being taken toward harnessing AI for use in space travel?

That song by Avenged Sevenfold where they discuss the possibility of immortality through merging cybernetics with our bodies, and the resulting loss of humanity.

We're not discussing cybernetics or shitty music

I wonder if that's the key or if it will kill us. If we lose a lot of the emotion that makes us bond, can we still decide what has to be done for society to succeed, or will we be unable to work as a team anymore?

To be honest, my understanding of AI is purely superficial. However, if we can construct a system capable of performing thousands of years of work in a single day, then applying that system to the problems of space travel should be no problem. Same for world hunger, disease, climate change, the energy crisis, etc.

There is nothing anyone can really do to help others. Focusing on your own worth and self discovery is the best possible support you can offer anything or anyone. Nothing will better help humanity and other animals than your own fulfillment.

Strive for compassion most of all.

We need to work on nuclear energy. Point. Blank. Period.

Make the hardest decisions for us and work through all possible consequences so we don't have to contemplate them. Not a bad idea.

That's a question that we should all be considering right now, but one that no one is.
Imagine if humans were contacted by extraterrestrials, and all they told us of their intentions was that they would be coming to earth sometime within the next fifty years.
Now realize that the exact same thing is already true for AI, but no one is doing anything to prepare for it.

Do you think we should just kill the weak?

No. No, no, no. That's the worst possible thing to do. There should always be human oversight of implementation. It's the actual research and grunt work that the machine would be doing.

We seem to have a huge issue with not preparing for real threats until they're at our doorstep. For example, we have 50 backup plans if a missile comes our way, but we have yet to start focusing on real backup plans for a mass-extinction event.

probably the best way to make it happen is to make as much money as possible and bankroll the studies you think are important.

ultimately, we are in the stage Nietzsche referred to as "the Last Man"

en.wikipedia.org/wiki/Last_man

ethics and policy will be forever outpaced by scientific discovery, and we are limited by societal acceptance, not by the tools we have available.

today, even 10 years ago, we "could" have full-scale genetic modification of human embryos. we could have no more genetic diseases, and artificially select (as well as implant) for beneficial traits on a global scale.

at the current rate, we will "get there" eventually. maybe 200 years? it's very difficult to predict the near future.

the only thing you can do is make as much money as possible and buy your way to the future you want. as an individual, working with your hands, there is nothing you can do outside of that. it's too vast of a problem. there is no invention you can make, nothing you can individually do to make it happen, beyond just making a shitload of money and paying for it.

No, but I feel they shouldn't have so much say in a society they contribute little to.

>we coddle the weak so much. Evolution calls for the strongest to survive, but our genes are becoming muddled now that we have the ability to shield the disabled, the diseased, the lazy, and the criminal. The SJW phenomenon boggles my mind, as does the hardcore conservative. Our hardest-working men and women, along with our most intelligent people, should be lifted up and should receive the fruits society has to offer, to encourage better productivity.

Welcome to the Nazi party, OP.

That was an answer I feared. I wish there were ways to reach younger people more easily and get us all on the same page early on. So much time is squandered on B.S. If you don't have money to invest heavily, then educating the masses is the next strongest asset.

i don't think you understand "general artificial intelligence"

it would be able to self improve beyond human control. you give up the ability to stop it the moment you turn it on. it is a literal pandora's box.

we're working hard to understand how to make AGI correctly, but currently we just don't have answers to a lot of the questions, and if we end up making (or making something that can make) an AGI before we have those answers, then there's just no going back. the consequences would be unbelievable. you can't even really predict what would happen, as you're doing it from an anthropomorphic outlook.

the AGI need not resemble a human mind in any way, nor operate in any way like a human mind does.

this guy, robert miles, does a lot of research on this and makes very good videos explaining it:

youtube.com/watch?v=HOJ1NVtlnyQ

Why does what's best for society seem so taboo? If I say a bunch of gangbangers will die tomorrow, no one cares, but if I suggest we stop wasting resources on the weakest, everyone loses their fucking mind.

Is it able to infect other technology, or does its power come strictly from the internet?

Only if the box isn't contained in a bigger box. If the machine has no way to influence the world on its own, then it can't destroy you.
I'll take a look, thanks.

People like pandas.

Y'all gonna pay my motherfucking welfare check or we about to have problems.

That's my question, though: can we predict its ability to escape our "box"?

Kek

But I'm agreeing with you. I used the Nazi thing as a meme/joke (today's Nazi parties are full of subhumans), but you're right in what you said.

So what are your thoughts on actions the general population can take without having a lot of cash to invest?

the implications there are enormous.

as a quick example, to simplify, you could imagine that the AI has some "magic" internal world where it can perfectly predict the outcome of every action it takes. like, if it sends you an email, it can predict exactly how you would respond (or run the simulation as many times as it needs to find out).

just accept that it's magic for now. inside this "bigger box" it is effectively a rat in a cage. it is doing everything it can to figure out a way to escape, while deceiving you so you do not know it is trying. since it knows how you will respond, it can deceive you perfectly. it is also smarter than you, since it can perfectly predict what will happen.

it's like a child making a box to contain a human, except many orders of magnitude more than that.

you can say "yep im pretty sure there is no way it can get out" but if you miss one single place, it will find it, and it will find it instantly, and you will be in a lot of trouble.

Not really, no. But isn't it a moot point? The box IS going to be opened.

Attached: 555d6deac2ae5a610d17b681d3802087d670a95d7530554b9dd05a3c4b782ab4.jpg (502x284, 43K)

We just have to hope they can't figure it out, I guess.

Does anybody know how they determined how long our sun has been pumping out energy, and how they decided when the star will be finished?
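
Not OP's question's main topic, but the short version: the Sun's current age (about 4.6 billion years) comes mainly from radiometric dating of meteorites, and the total lifetime estimate comes from dividing the Sun's usable fusion energy by how fast it radiates it away. A rough sketch of that second calculation, using standard textbook values (the 0.7% mass-to-energy efficiency of hydrogen fusion and the ~10% of the Sun's mass in the fusing core are the usual approximations, not exact figures):

```python
# Back-of-envelope "nuclear timescale" for the Sun:
# total fusable energy divided by the current rate of energy loss.
M_SUN = 1.989e30      # solar mass, kg
L_SUN = 3.828e26      # solar luminosity, W
C = 2.998e8           # speed of light, m/s

EFFICIENCY = 0.007    # fraction of mass converted to energy in H -> He fusion
CORE_FRACTION = 0.1   # rough fraction of the Sun's mass hot enough to fuse

energy_joules = EFFICIENCY * CORE_FRACTION * M_SUN * C**2
lifetime_seconds = energy_joules / L_SUN
lifetime_years = lifetime_seconds / 3.156e7   # seconds per year

print(f"~{lifetime_years:.1e} years")  # on the order of 1e10 years
```

That lands around ten billion years on the main sequence, which matches the accepted figure, so we're roughly halfway through.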

piggybacking on this,

this "stamp collector" video on AI is a very good example that rephrases my argument in a much better way

youtube.com/watch?v=tcdVC4e6EV4

You mean creating AI? I don't think it's a question of if. It's a question of when.
Intelligence is simply a matter of information processing, and human intelligence is not the plateau. Humans will continue to increase information processing bandwidth until we create AI. When, not if.

I've heard that same thing framed with a paperclip maximizer.

i think it's a very important example because it forces you to separate real AI from the "HAL" or whatever other "i can't let you do that" supercomputer you instinctively envision from media.

the reality of AI is so far beyond a talking computer, or a bot in a video game, and it's difficult to understand the real dangers involved because we have this inherent assumption of what "we" assume that an AI would be like.

Attached: dunno.gif (250x251, 999K)

You're not wrong, but I think my original point stands. I was warning user about the dangers of letting the AI drive us to the stars.

youtube.com/watch?v=UQibteKp5kc

i get what you mean by that. you know, "we shouldn't rely on the AI to do this for us". but i think that if we even had the capacity to create an AI that can actually predict and operate on that sort of scale, we really wouldn't be in a position to "control it".

i'm just repeating myself, and i'm not trying to hammer you and say you're wrong or anything like that. your point is valid. it's just that where we're arguing from, it's almost meaningless to speculate, because no one can really have a concrete answer of how to do "safe AI spaceship" or whatever correctly.

i think that we will eventually have the answer, in some amount of time. it's just a necessary progression. you can imagine it as a progression of self driving cars. we're just kissing the threshold of "losing control" in an attempt to make a better and better tool.

I understand. The key would be to attempt (obviously I have no idea how) to control output.

>we're just kissing the threshold of "losing control" in an attempt to make a better and better tool.
I think a lot of technology is getting that way.
>He says as he posts from his smart phone.

look, anon, if you wanna find something that gives you a little bit more meaning, clean your room.

If you do have your act together, find something you care about where you think you can actually be a positive influence. Then do something about it.

Doesn't need to be big; it just needs to be something you care about that you think you can change for the better. Progress from there.