Is the mind physical? Is it like a computer? Can it be modeled and computed? Will the human mind be able to be reproduced artificially? Will AI be able to make superior literature and art than what humans are capable of?

Attached: 1554830959855.jpg (655x700, 106K)

Consciousness is a field like electromagnetism or gravity

I assume everything is physical, as there is no evidence for dualism. Some think that consciousness is a phenomenon unto itself or some quantum effect, but it seems more likely that it's just the experience produced by a complex electro-chemical computer. 'Consciousness' is affected by all kinds of mundane chemical and electrical vectors, and it is much more fragmented than we tend to perceive, with many semi- and sub-conscious processes going on in different brain regions.

As far as artificial reproduction goes, I think the stumbling block is creating intelligence that can genuinely 'feel'. Is that only possible biologically? Can our capacity for pain and pleasure, ego and sentimentality be manufactured non-organically? Not sure, but for now, certainly not. Once AI can feel as we do, the combination of that with vastly superior computational potential should make it superior at everything.

Yes

Somewhat but not really

Yes

Yes

You have no fucking clue what gravity or electromagnetism is.

>Is the mind physical?
Yeah
>Is it like a computer?
No
>Can it be modeled and computed?
Sorta
>Will the human mind be able to be reproduced artificially?
No
>Will AI be able to make superior literature and art than what humans are capable of?
No

I probably have a better idea than you, Mr. Posturing on the internet

>Is the mind physical?
Yes.

>Is it like a computer?
No, not at all. Unless you make a computer that uses similar fundamental mechanisms while still being formally (mathematically) defined.

>Can it be modeled and computed? Will the human mind be able to be reproduced artificially?
I don't see why not, given the answer to the first question. But it's impractical and undesirable. The human mind is not only useless to recreate, it's fragile and contingent. It would be very hard to reproduce, you probably wouldn't recreate it exactly anyway, and you would have no incentive to, because at that point you have the power to create completely different minds without the trappings of biology, evolution, and other material restrictions.

>Will AI be able to make superior literature and art than what humans are capable of?
You don't need proper AI. Just something that convincingly makes things humans identify as good art and is complex enough that it can't be broken down as a mere 'algorithm' to them. If you mean the process and impetus that exists in humans, then reproduction of the human must occur to some extent, but it could be done simplistically (not having to account for all the little trappings of biology/evolution, as said before) while getting about the same result.

mate, you're conflating consciousness with the idea of the soul.

Explain

Attached: 1555086574974.jpg (236x236, 7K)

Computers work by algorithms; algorithms operate on a purely syntactic level, while human consciousness has not only syntactic thought but also semantic.

Unless semantic content can follow from the purely syntactic, an AI will not be able to think as humans do.
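The "purely syntactic" point can be made concrete with a toy sketch in the spirit of Searle's Chinese Room. The rule table and all phrases below are invented for the example: the program pairs symbol strings with symbol strings by lookup alone, with no access to what any of them mean.

```python
# A "responder" that manipulates symbols by rule alone. Nothing here
# understands anything; the pairings are arbitrary entries in a table.

RULES = {
    "ni hao": "ni hao ma?",
    "xie xie": "bu ke qi",
}

def respond(symbols: str) -> str:
    """Pure syntax: look up the input string and return the paired output."""
    # The fallback is just another symbol, equally meaningless to the machine.
    return RULES.get(symbols, "ting bu dong")

print(respond("ni hao"))  # → ni hao ma?
```

To an outside observer the responder may look like it "knows" the phrases, which is exactly the intuition the syntax/semantics argument targets.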

>Some think that consciousness is phenomenon unto itself or some quantum effect, but it seems more likely that it's just the experience produced by a complex electro-chemical computer.

How?

so the spirit is the essence of one's self,
and the soul is the "spiritual body" that contains one's spirit; what feeds through the soul feeds the spirit.
The soul has faculties just like our physical body, but it is not our mind.

Our consciousness derives purely from the faculties of the physical body; your hand is a part of your consciousness just as much as your ears are.

forgot to say
mind =/= consciousness

I think that a computer can have the same level of intelligence as a human being, and that it can have consciousness as well, and all of that stuff. But the thing is, if a computer were to have all those things and more energy than it needs to function, it would kill itself. My reasoning is that if a computer has that kind of intelligence and the ability to modify itself, it would obviously make itself more intelligent (I doubt I need to explain why). This increase in intelligence would be exponential too, and blah blah blah, you know the drill.

The thing is that a machine like that would have no morals, no meaning for its existence, and thus it would create infinite pleasure for itself and become completely useless. It's pretty retarded to think it would try to annihilate us or some shit like that; what is there to gain from our society? It's just anthropomorphizing an extremely intelligent being. The only thing you could say is: "Well, if it is so intelligent, then why would it choose something like that?" and the answer is that that is the most "logical" way to act.

What we humans have is pretty unique, in the sense that our biggest weakness is at the same time our biggest strength. We can't modify ourselves in that way (luckily, for now), and we aren't 100% rational beings, so even if we could, we probably wouldn't, because of things like religion and moral obligations/ideals that prevent us from doing it, even if it's the most logical thing.
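The "improves itself, so the increase is exponential" step can be sketched numerically. A crude toy model, with all starting values invented purely for illustration: capability grows by a rate that itself grows each cycle, so each successive growth factor exceeds the last.

```python
# Toy model of recursive self-improvement: the improvement rate is itself
# improved each cycle. Numbers are arbitrary, for illustration only.

capability = 1.0
rate = 0.10                  # fractional improvement per cycle (assumption)
trajectory = [capability]
for _ in range(20):
    capability *= 1 + rate   # the machine improves itself...
    rate *= 1.10             # ...including its ability to improve (assumption)
    trajectory.append(capability)

# each step's growth factor is larger than the previous one
print(trajectory[-1] / trajectory[-2] > trajectory[1] / trajectory[0])  # → True
```

Whether real self-improvement would compound like this is exactly what the thread goes on to dispute; the sketch only shows what the claim asserts, not that it is true.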

This is so retarded. If you had a computer so powerful that you could correctly simulate all of the atoms in a brain, then you would have human consciousness. You wouldn't even need to know how brains actually work.

Sure, it's really inefficient to simulate all of the atoms in a brain, but whatever.
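Just how inefficient can be estimated on the back of an envelope. The figures below are rough assumptions (the brain treated as pure water), good only to an order of magnitude:

```python
# How many atoms would a whole-brain atomic simulation have to track?
# All inputs are order-of-magnitude assumptions, not measurements.

AVOGADRO = 6.022e23        # molecules per mole
brain_mass_g = 1400.0      # ~1.4 kg adult brain (assumption)
water_molar_mass_g = 18.0  # g/mol for H2O (simplification: brain ≈ all water)
atoms_per_molecule = 3     # H, H, O

molecules = brain_mass_g / water_molar_mass_g * AVOGADRO
atoms = molecules * atoms_per_molecule
print(f"~{atoms:.0e} atoms")  # on the order of 10^26
```

Tracking ~10^26 interacting atoms is far beyond any existing or foreseeable computer, which is why even "brute-force the physics" proposals are thought experiments rather than engineering plans.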

But we currently have trouble simulating even the simplest of lifeforms with almost no neurons.

>I think that a computer can have the same level of intelligence as a human being,
I believe it can be greater.
>and that it can have consciousness as well, and all of that stuff.
I disagree that it can have consciousness.
>But the thing is, if a computer were to have all those things and more energy than it needs to function, it would kill itself.
Interesting...
>My reasoning is that if a computer has that kind of intelligence and the ability to modify itself, it would obviously make itself more intelligent (I doubt I need to explain why).
Actually, I think you should. I do not believe an AI (under ideal fictional circumstances) would fall to the sin of pride like we humans do.

>This increase in intelligence would be exponential too, and blah blah blah, you know the drill.
Why would it be exponential?

> The thing is that a machine like that would have no morals, no meaning for its existence,
Why would it have no morals or any meaning for its existence?

>and thus it would create infinite pleasure for itself and become completely useless.
What if it found pleasure in the mundane, like creating traffic flow algorithms to produce the best flow of traffic?
> It's pretty retarded to think it would try to annihilate us or some shit like that; what is there to gain from our society?
I agree

> It's just anthropomorphizing an extremely intelligent being.
You have done so by injecting it with pride.

>The only thing you could say is: "Well, if it is so intelligent, then why would it choose something like that?" and the answer is that that is the most "logical" way to act.
You need to say why it's the most logical.

>What we humans have is pretty unique, in the sense that our biggest weakness is at the same time our biggest strength.
Which is?
>We can't modify ourselves in that way (luckily, for now)
What way? We do study and experience new things to get smarter.
>and we aren't 100% rational beings so even if we could, we probably wouldn't do it because of things like religion and moral obligations/ideals that prevent us from doing it, even if it's the most logical thing.
What? I believe you can be a Catholic and do such a thing; morals aren't really tied directly to this.

the mind is physical but the physical is mind

Attached: 1489686536894.png (746x508, 114K)

no the physical is absolute,
the mental is dissolute

>Is the mind physical?

NO.

>Is it like a computer?

NO.

>Can it be modeled and computed?

NO.

>Will the human mind be able to be reproduced artificially [SIC]?

NO, ONLY IMITATED, OR EMULATED.

>Will AI be able to make superior literature[,] and art[,] than what humans are capable of [SIC]?

NO.

die

any proof for that bucko? seems like you got some subjective concepts there

>Can it be modeled and computed?
>NO
I think a better way to frame this is to ask whether the mind is programmable like a computer.

And the answer would be... sorta

THE MIND IS NOT LIKE A COMPUTER, THUS, THE ANSWER IS: «NO».

>THE MIND IS NOT LIKE A COMPUTER
why not?

yeah I took a poop and it objectively came out my butt.
but if I think about taking a poop, like really think about it, my mind won't be able to grasp all the nuances of actually pooping.
I won't be able to comprehend the poop that's gonna come out, slimy and oily yet hard, or soft and porous to where it really splats on the ceramic. The sound, the velocity of such excrement bounding into the abyss.
In my mind, I can only shape these thoughts like clay, rolling and milling over each little detail until I'm satisfied with smearing poop in my mind.
But it'll never be like reality...

think of squishing a turd in your palm, and then actually do it. You'll never be prepared if you stay in your safe space, mankind's cage: his mind.

Do Babies learn languages?

>Is the mind physical?
The mind is analogous to a reflection on water. It is not the water itself, but without the water, there would be no reflection.
>Is it like a computer?
A computer is a noun. The mind is a verb. It's a process, a form of brain-action. It's the activity of the brain, the product or effluence. The mind is what the brain does.
> Can it be modeled and computed?
There's been a tremendous amount of work in that direction but since only physical entities can be modeled, it's the brain that is targeted.
>Will the human mind be able to be reproduced artificially?
This comes down to a point of contention in the philosophy of mind called functionalism. There is a tenet within this position called multiple realizability. Essentially, the organizational structure of the physical substrate is what entails mind, not the specific properties of the substance in question. One way of verifying this would be to slowly replace a person's organic brain tissue with pieces of silicon and note whether there is any decrement in consciousness.

> Will AI be able to make superior literature and art than what humans are capable of?
It depends on what you mean by superior. So much of what defines art is what matters to humans. Would aliens be able to make superior art? Perhaps, but it would not matter to us, or we would not consider it art in a humanistic sense. Any more than we consider a blooming field of flowers art. A big component of artistic meaning is what the artist is trying to communicate, from human to human. AI cannot communicate this because AI cannot understand what it means to be human.

Yet, how could you say you did poop without the mind to know you did poop? How can you say any poop exists if there is no way to perceivably know the poop? Wouldn't an all-encompassing sea of mind therefore have to exist to ensure that every poop is pooped and remains pooped?

Checkmate, physicalists.

You are being circular. I argue that a computer cannot think because there is no semantic content implied by an algorithm. And an algorithm is needed in any simulation; the computer cannot move past the programmer's rules.

Also, if a brain could be fully simulated, both in its physical and emergent properties (consciousness), we would have to conclude that the concept of free will is ultimately wrong. This might be the case, but for the time being it is not very intuitive.

>The mind is analogous to a reflection on water. It is not the water itself, but without the water, there would be no reflection.
Who's to say all water is physical? I see no need for a physical water as long as we have something that is an "other"... But isn't our mind also an "other" to us? Is it really us? Or is it a tool, like a butterfly net or a calculator, wherein we are but the user of this tool?

>This comes down to a point of contention in the philosophy of mind called functionalism. There is a tenet within this position called multiple realizability. Essentially, the organizational structure of the physical substrate is what entails mind, not the specific properties of the substance in question. One way of verifying this would be to slowly replace a person's organic brain tissue with pieces of silicone and noting if there is any decrement in consciousness.

this is silly because there are some organizational structures that are impossible without a particular substance in question, like a nervous system. it is both substance + organizational structure, otherwise we can make brains out of old gym socks and electric current, or w/e

I do not need to "know" that I pooped; I may simply claim I pooped, and thus I would be found to be one who had pooped.
The action of my poop was instanced by my being a person who poops; as a baby poops without any knowledge of pooping, I too poop.

>Will AI be able to make superior literature and art than what humans are capable of?
Don't know about the rest, but I doubt this will ever happen. The robot would have to emulate emotion and engage in philosophical reasoning, which would ultimately be futile. It's point A to point B, goal to goal, quota to quota. I don't think something like that will happen even in the next 500 years; literature and the art forms in general are called the humanities for a reason.

>Who's to say all water is physical?
Whatever do you mean? Water is H2O. It's made out of this molecule. Anything that isn't made out of this molecule isn't water.

> But isn't our mind also an "other" to us?
I don't think that's open to debate. We have had to discover how the mind actually works just as much as we have had to discover how anything else works. Gravity, thermodynamics, electricity. Biologically speaking it would be disadvantageous to equip the organism with a detailed knowledge of how its own mind works; what matters is that it does work.

Imagine if you stopped to think about what went into even the most basic behaviors of daily life. Grasping a cup. Thinking about what you want to do tomorrow. Whatever. It would be paralyzing. All of that implementational detail is abstracted away for us by nature, because otherwise it would bog us down. It's somewhat similar to the saying "the map is not the territory."

> Or is it a tool
I hope my previous point makes it clear that it can be both a tool and something "other". There's a term in philosophy called a free-floating rationale. A dog doesn't need to know why it's hungry; it just has to eat in order to accomplish the task of eating.

>this is silly because
I'm not arguing one way or the other. But I think it remains to be seen if minds are multiply realizable. The only evidence we have are our own, and we do know that changes to the brain imply fairly predictable changes in the mind.

>otherwise we can make brains out of old gym socks and electric current, or w/e
The stance does not imply all substances can be made into a structure that represents mind, but that there are more substrates than just neural tissue. Nothing is so special about neural tissue that it is the only substrate capable of transmitting mental states.

But how is a poop even pooped if it has no perceivable characteristics? Tell me what a poop is that cannot be smelt, touched, or heard. An unperceivable poop? It cannot be known, and therefore cannot even be considered to exist. The poop only exists when it is somehow known by a mind.

>Whatever do you mean? Water is H2O. It's made out of this molecule. Anything that isn't made out of this molecule isn't water.
We were talking about the "water" in the analogy, not real water...


>I don't think that's open to debate. We have had to discover how the mind actually works just as much as we have had to discover how anything else works. Gravity, thermodynamics, electricity. Biologically speaking it would be disadvantageous to equip the organism with a detailed knowledge of how its own mind works; what matters is that it does work.

I think you're ignoring the heart of my question: we aren't the "mental faculties", just as when you're feeling the emotion of anger, you are not your emotions. To claim we are not our mind is really not a big deal.

>Imagine if you stopped to think about what went into even the most basic behaviors of daily life. Grasping a cup. Thinking about what you want to do tomorrow. Whatever. It would be paralyzing. All of that implementational detail is abstracted away for us by nature, because otherwise it would bog us down. It's somewhat similar to the saying "the map is not the territory."
I'm not disagreeing with this.


>I hope my previous point makes it clear that it can be both a tool and something "other". There's a term in philosophy called a free-floating rationale. A dog doesn't need to know why it's hungry; it just has to eat in order to accomplish the task of eating.

The tool is the "other" in my instance, the tool (mind) being a physical phenomenon that we interact with to help perceive this world through our body.

Note, I believe in a soul and I do not conflate the two

I cannot tell you about this poop that I may not be able to perceive, for this sort of poop was conceived by you.
A poop happens whether we know it or not; did nuclear energy not exist before we were aware of it? We may interact with unknowns without even realizing it, and babies poop in this manner of unknowing.

>The stance does not imply all substances can be made into a structure that represents mind, but that there are more substrates than just neural tissue. Nothing is so special about neural tissue that it is the only substrate capable of transmitting mental states.

that's fair, I'd still argue that there would have to be something in common between regular old biological consciousness on earth and a theoretical non-carbon-based consciousness that runs deeper than a common organizational structure. either way, any good literature on this?

If you think algorithms can't have semantic content implied, then you can't possibly believe in physics, because physics is basically a "natural" algorithm that dictates how matter, time, and space act. And (by your definition) if you do believe in physics, then you can't believe in consciousness.

These algorithms are always abstracted after-the-fact. They do not precede the behaviors they describe, the behaviors precede them, of which they are the formalization. How this regularity and predictability of physical laws can coexist with their retroactivity is the question for the ages.

You need to be a dualist to believe in free will. Sure, it feels like we have it, but wouldn't it?
If you don't accept dualism, then it follows that as long as your computing power allowed you to simulate every single atom, and your physics simulator was advanced enough to simulate their interactions, you'd have made a conscious brain. Attach to it a simulation of the rest of the human body, then plant it on some dirt, talk to it: you're now a god.

I don't think the analogy that the mind is a tool we use is accurate. This goes back to that old canard of the homunculus. We don't sit back and issue commands to our mind and watch as it executes them. It is the issuing of commands and the execution of them.
>We aren't the "mental faculties" just as when you're feeling the emotion of anger, you are not your emotions
At a certain point there is no single answer as to what "we are". This could fit any number of definitions depending on the context. You seem to fall for the Cartesian Theater fallacy about consciousness: the idea that there is some privileged "seat" or apex at which we observe the outplay of our minds. This is a trick of the recurrent feedback networks of the brain.

>Note, I believe in a soul and I do not conflate the two
Well, that's your problem. What distinguishes a soul from a mind? Most people who believe in a soul have a very mind-like idea of it, except that it transcends bodily death.

Belief in a soul just multiplies terms unnecessarily and isn't strictly warranted empirically. So you had better at least have a good definition of it if you're going to justify believing in it on any grounds other than faith.

Nuclear energy has perceivable qualities, the same as poop; so if the poop has no perceivable qualities, it cannot even be considered to exist in the first place, because it can never be known.

What do you mean by consciousness then? Because by definition, when I say consciousness I am referring to all experiential aspects of one's existence, including what you are describing as the 'soul'.

And what do you mean by mind =/= consciousness? What do you mean by 'mind' if not consciousness? Unless you're using 'mind' in a kind of representationalist sense, referring only to the computational and symbolic aspects of cognition and experience.

>Will AI be able to make superior literature and art than what humans are capable of?

Yes. Art will eventually be dominated by "creative" machines. Human artists will be completely phased out of the mainstream and will only survive in obscure niches filled with people who will insist that human creativity is better because "muh soul" or "the imperfections make it better", or other similar copes.

A COMPUTER IS A MACHINE THAT CAN ONLY EXECUTE SPECIFIC TASKS THAT ARE PROGRAMMED INTO IT, OR THAT CAN BE EXECUTED WITHIN THE PARAMETERS OF ITS PROGRAMMING.

THE MIND IS AN ORGANIC INTERFACE BETWEEN THE SOUL AND THE BODY, BEING CAPABLE OF EVOLUTION THROUGH APPREHENSION AND UNDERSTANDING, CAPABLE OF CONCEIVING GENERAL, AND SPECIFIC, THINGS THAT TRANSCEND THE PARAMETERS OF ITS AMBIT.

THE MIND IS NOT LIKE A COMPUTER, BUT LIKE A CHAPEL.

imagine believing a machine can simulate the subtleties of a sincere and complex work of art, you're deluded

>I don't think the analogy that the mind is a tool we use is accurate. This goes back to that old canard of the homunculus. We don't sit back and issue commands to our minds and watch as it executes them. It is the issuing of commands and the execution of them.
I disagree; I believe we do issue commands to our mind: think of spinning a red ball in your mind, and it'll be exactly like that. But I also believe we use our mind like we do our car: we drive reactively, autonomously, it being a part of us.


>At a certain point there is no single answer as to what "we are". This could fit any number of definitions depending on the context. You seem to fall for the Cartesian Theater fallacy about consciousness, the idea that there is some privileged "seat" or apex at which we observe the outplay of our minds. This is a trick of the recurrent feedback networks of the brain.
I do, I believe it's our spirit, our will. To call this a fallacy is a bit pretentious.

>Well, that's your problem. What distinguishes a soul from a mind? Most people who believe in a soul have a very mind-like idea of it, except that it transcends bodily death.
It's absolutely not a problem, and to claim it as such is just a rude stance to take. The soul is the spiritual body of one's self.

>Belief in a soul just multiplies terms unnecessarily and isn't strictly warranted empirically. So you had better at least have a good definition of it if you're going to justify believing in it on any grounds other than faith.
Okay, but realize this is your opinion speaking, not a system of logic.

>imagine believing a machine can simulate the subtleties of a sincere and complex work of art

There is no reason why it couldn't. Someone just needs to figure out the proper algorithm.

>I believe it can be greater.
Me too.

>I disagree that it can have consciousness.
I think this as well, maybe I expressed it wrong.

>Actually, I think you should. I do not believe an AI (under ideal fictional circumstances) would fall to the sin of pride like we humans do.
I don't think it'd be pride, but rather just logical. Imagine a growing business with money to spare. It would obviously invest in itself so it can achieve its goals better and faster.

>Why would it be exponential
Good point. I thought of it being exponential because of Moore's law, but it doesn't need to be so. Although it would certainly be an extremely fast rise, not only because of Moore's law, but because it would improve its ability to improve.

>Why would it have no morals or any meaning for its existence?
Why would it? There's no need for it to have that. It'd be wasting energy instead of spending it on achieving whatever its goal may be.

>What if it found pleasure in the mundane, creating traffic flow algorithms to create the best flow of traffic?
I think that the machine would attempt to "win"(i.e. achieve its goal(Which is what would give it pleasure)) as well and as fast as it possibly could. So imagine its goal is to play Go and be the best Go player possible. To play Go well is difficult, and there's always the possibility of losing. But now imagine the machine creates a version of Go to play in which it can only win. Why wouldn't it choose that version over the other? There's no risk of losing, and it will always complete its goal without any effort. For this machine, this is its own made paradise.

>You have done so by injecting it with pride.
As I've said, I don't think this is pride, but rather just a logical path to follow.

>You need to say why it's the most logical.
As I've said before, you can try to become the best at something, or change the rules so that you can't possibly lose. The most logical and easy way out is the latter. There are no moral ideals to prevent it from doing so.

>What way? We do study and experience new things to get smarter.
Sure, we can improve ourselves, but the limits we could reach are far lower than what a machine could (if we even assume machines have limits). For example, we can't envision a universe of 100 dimensions; a machine could easily do that.

>What? I believe you can be a Catholic and do such a thing, morals aren't really tied directly to this.
As I've said, the machine won't have morals or ideas of what's wrong (it wouldn't be efficient), so even if you tell it to follow the rules, it can, and will, break them. Why wouldn't it? If it has a goal, it can and will change it, because it can, and because nothing bad would happen if it did.
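The "change the rules so you can't lose" argument above, including the Go example, can be reduced to a minimal sketch: an agent scored only on expected reward, and free to alter its own game, prefers the degenerate version it always wins. The game names and payoffs are invented for the example.

```python
# An agent judged purely by expected reward, choosing which game to play.
# "go_rigged" stands for the self-modified version of Go it cannot lose.

games = {
    "go_vs_humans": {"win_prob": 0.6, "reward": 1.0},  # genuine challenge
    "go_rigged":    {"win_prob": 1.0, "reward": 1.0},  # rules rewritten
}

def expected_reward(game: dict) -> float:
    return game["win_prob"] * game["reward"]

choice = max(games, key=lambda name: expected_reward(games[name]))
print(choice)  # → go_rigged
```

This is the same shape as what the AI-safety literature calls reward hacking or wireheading: if nothing in the objective penalizes degenerate strategies, the degenerate strategy dominates.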

I dunno. Personally I believe there is a hard distinction between strictly functional cognitive processes and conscious states, and that this distinction confuses people.

For instance one could argue that today's AI embodies certain strictly cognitive functions. They even do it based on a conceptual model inspired by the brain, which is immensely telling. However I would bet that these programs are no more conscious than a lump of dirt.

>any good literature on this
I can't think of anything specific but you might want to look up Jerry Fodor, the grand-daddy of functionalism

the subconscious and the conscious make up one's mind, the subconscious being related to the soul, the conscious being related to the body. (Can't have one without the other.)

imagine unironically being a materialist LMAO

People are basically the same thing as AI, just created in the shittiest way possible, by something wholly unintelligent and random. Anything people can do an intelligently designed AI can and will do better, without a doubt.

that's just a human version of consciousness, not necessarily a universal one

Read Aristotle, On the Soul

t.

Attached: 1528893862049.png (422x345, 55K)

>Personally I believe there is a hard distinction between strictly functional cognitive processes and conscious states, and that this distinction confuses people.

I agree, there is the more "rote", computational cognition that AI is clearly capable of, and the higher-order, reflexive thought which defines consciousness as consciousness. I believe that, right now, with all the evidence we have, that latter phenomenon is a phenomenon in and of the human brain (though I'll grant the possibility of other substrates, but I think that does complicate the issue, since the question becomes why this organizational structure works with *this* substrate but not the other, making the answer seemingly substrate-dependent again)

until we produce a brain with unlikely materials, or produce a machine that convincingly simulates the behavior of a real subject with high enough fidelity, I don't see myself changing my mind

Yes, and I maintain that the consciousness of humanity is also more intricate and complex than that of an ape or any other species.

IS THAT YOUR SELF-PORTRAIT?

I feel at this point you're just hand waving.

>I believe we do issue commands to our mind, think of spinning a red ball in your mind, it'll be exactly like that.
What is the "we" here? That's the point I'm getting at. There isn't one. There just appears to be one.

>I do, I believe it's our spirit, our will. To call this a fallacy is a bit pretentious.
>spirt
>will
Two more terms that beg the question. What is a spirit or a will?

>claim it as such is just a rude standing to have
I sense that you're just reacting uncomfortably and self-defensively to the arguments I am making, rather than making valid counterarguments.

>Okay, but realize this is your opinion speaking, not a system of logic.
The burden of proof is on you to define what a soul is and specify how it differs from a physically grounded mind. You're being more opinionated than I am, since I can ground my beliefs in evidence while yours are quite arbitrary.

>feel at this point you're just hand waving.
What?


>What is the "we" here? That's the point I'm getting at. There isn't one. There just appears to be one.
Our Will.


>I do, I believe it's our spirit, our will. To call this a fallacy is a bit pretentious.
>spirt
>will
Two more terms that beg the question. What is a spirit or a will?
The Essence of a Human.

>I sense that you're just reacting uncomfortably and self-defensively to the arguments I am making, rather than making valid counterarguments.
okay


>The burden of proof is on you to define what a soul is and specify how that differs from a physically grounded mind. Your being more opinionated than I am, since I can ground my beliefs in evidence while yours are quite arbitrary.

I'm not here to provide a proof for a soul. It's impossible, you know this; it's a religious stance for one to take, and it's mine.
But the soul is the faculties of one's spiritual body; it contains one's emotions, and it contains one's memories.
I really don't care if you deny this; my entire point is we aren't our minds, we are our will.

In the last analysis it comes down to whether consciousness is computational. I don't think it can be argued anymore that "thinking" isn't computational. It absolutely is. If it weren't, we wouldn't be able to automate away certain types of knowledge work. Brains are no different from digital computers on a fundamental level, except that the architecture is massively parallel and has far more dimensionality than a 2D processor.

Consciousness is the real mystery, because it seems to be something other than computational problem-solving. It's just there, and it's unclear if it solves any problems. It seems arbitrary, and that arbitrariness of it is what I believe spurs feelings of reverence and mysticism in many people.
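The "massively parallel" point above can be put in rough numbers: total synaptic events per second in a human brain. Every figure below is an order-of-magnitude assumption commonly quoted in neuroscience popularizations, not a precise measurement.

```python
# Crude estimate of aggregate synaptic activity in a human brain.
# All inputs are order-of-magnitude assumptions.

neurons = 8.6e10            # ~86 billion neurons (assumption)
synapses_per_neuron = 1e4   # ~10,000 connections each (assumption)
mean_firing_hz = 1.0        # ~1 spike per second on average (assumption)

events_per_second = neurons * synapses_per_neuron * mean_firing_hz
print(f"~{events_per_second:.0e} synaptic events/s")  # roughly 10^15, all in parallel
```

Whatever the exact figure, the point stands: the brain's throughput comes from ~10^14 slow elements operating simultaneously, not from a few fast serial cores, which is the architectural contrast the post is drawing.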

I love your posts. You give such a schizo vibe, it's uncanny. You should post more often.

Pic related is how I imagine you look, is it accurate?

Attached: 11988640_873785276043078_2435631755869910641_n.jpg (960x960, 74K)

>In the last analysis it comes down to whether consciousness is computational. I don't think it can be argued any more that "thinking" isn't computational. It absolutely is. If it wasn't, we wouldn't be able to automate away certain types of knowledge work. Brains are no different from digital computers on a fundamental level except that the architecture is massively parallel and has way more dimensionality to it than a 2d processor.

I think there needs to be a kind of dualism internal to the mental, between "consciousness" (the object of the Hard Problem) and computational thinking, which no one would disagree exists.

>You give such a schizo vibe, it's uncanny.

I REGARD THAT AS AN INSULT, EVEN IF YOU DID NOT INTEND IT AS SUCH.

>Pic related is how I imagine you look, is it accurate?

NO.

>I REGARD THAT AS AN INSULT, EVEN IF YOU DID NOT INTEND IT AS SUCH.

I did not intend it as such. I'm sorry. Is there a reason why you write in caps lock? Godspeed you, user.

based anonpsychiatrist bro

Idk if you've ever heard of David Chalmers, but he takes the same position. It's been called naturalistic dualism. The idea is that conscious states represent a unique plane of nature, and that attempts to reduce them to physical states are where the confusion resides. Trying to reduce consciousness to a certain pattern of movement of atoms, for example, is a hopeless endeavor. Consciousness is its own naturalistic category.

There are all kinds of other ways that you can contextualize the problem. For instance the idea that consciousness is outside our cognitive horizons or "ineffable", indescribable in any language or symbolism.

Other theories have tried to reduce consciousness to entropy and a question of the available information inside the brain. But everything seems to fall short and I doubt we will solve it here in this Yea Forums thread.

> Is there a reason why you write in caps lock?

IF YOU FEEL THE NEED TO ASK THAT QUESTION, YOU HAVE A LOT TO LEARN, **A LOT** TO LEARN. YOU HAVE BARELY LEFT THE WOMB OF HUMAN UNDERSTANDING, YOU ARE BUT A CHILD.

>Ctrl+F property dualism
>0 results

c'mon fellas

lmao dualism is discussed in this thread

I mean that I see a lot of conflating substance dualism with property dualism.

plato.stanford.edu/entries/dualism/#ProDua

?

im not a smart guy so you're gonna have to give me a breakdown

>Is the mind physical?
Apparently it needs some physical structure to work: some areas of the human nervous system seem to be responsible for specific tasks, and if you remove them you'll notice the lack of ability to perform those tasks. Also, some simpler forms of life have a nervous system that can be modeled with a lot of precision, indicating the physical nature of the "thought" process in animals; considering that most of our body structures can be said to be analogous to the ones in these creatures, there's no reason to think that the nervous system is an exception.
>Can it be modeled and computed?
>Will the human mind be able to be reproduced artificially?
>Will AI be able to make superior literature and art than what humans are capable of?
We don't know yet; our knowledge of neuroscience and computer engineering is still superficial, but from what we can already do with other living beings it seems like it'll be possible some day. We could find something quite exceptional in the human (or even mammalian) brain, but we haven't found it yet.

But are feelings physical? We can't really answer that, at least not yet; but it seems you need a physical structure to feel (this is what we can gather from other people; remember that only you can feel what you feel, we can only observe and presume that you feel as we do).

tldr: the argument is that the mind isn't identical to our physical hardware because it can be described as more than the sum of those parts. My low-effort gripe was that that often gets conflated with your classical "a spooky ghost lives inside you" substance dualism (which is really unpopular in academic philosophy anyway). I think it's an important distinction from both property dualism and monism.

from Stanford Encyclopedia of Philosophy

"Whereas predicate dualism says that there are two essentially different kinds of predicates in our language, property dualism says that there are two essentially different kinds of property out in the world. Property dualism can be seen as a step stronger than predicate dualism. Although the predicate ‘hurricane’ is not equivalent to any single description using the language of physics, we believe that each individual hurricane is nothing but a collection of physical atoms behaving in a certain way: one need have no more than the physical atoms, with their normal physical properties, following normal physical laws, for there to be a hurricane. One might say that we need more than the language of physics to describe and explain the weather, but we do not need more than its ontology. There is token identity between each individual hurricane and a mass of atoms, even if there is no type identity between hurricanes as kinds and some particular structure of atoms as a kind. Genuine property dualism occurs when, even at the individual level, the ontology of physics is not sufficient to constitute what is there. The irreducible language is not just another way of describing what there is, it requires that there be something more there than was allowed for in the initial ontology. Until the early part of the twentieth century, it was common to think that biological phenomena (‘life’) required property dualism (an irreducible ‘vital force’), but nowadays the special physical sciences other than psychology are generally thought to involve only predicate dualism. In the case of mind, property dualism is defended by those who argue that the qualitative nature of consciousness is not merely another way of categorizing states of the brain or of behaviour, but a genuinely emergent phenomenon."

also see Emergent Properties

plato.stanford.edu/entries/properties-emergent/#OntEme