
Bakker LIII - Sranc and File

Recommended Posts

8 hours ago, The Prince of Newcastle said:

I can't see him scaling the heights of the first trilogy. I don't know if I am down for a third, tbh. The Unholy Consult hurt me.

I loved bits of the preceding books as well.

Same. I barely even visit these threads anymore, and I have no desire to re-read the books at this point. It's surprising how TUC proved so disappointing, in hindsight. I didn't even mind the revelations that were given--they just weren't even close to what could have been, given the potential explored beforehand. 

I hardly even care about the third series, and I was a huge fan up to TGO. 


The Unholy Consult was straight-up painful and rage-inducing for me, despite being very well written (at least in parts). Whether or not I read the third series depends on what happens.

 

Quote

Essentially - depending who you ask and when you ask Bakker - it is about the idea that the central concepts of meaning - truth, values, points of view - will be so subjectivized that no one will be able to 'know' anything, or even understand any context outside their own ingroup. Everything will be so partisan, so required to have specific cachet, that people will break into non-communicating clades and tribes. And those tribes will be able to justify anything - any crime, any issue - because there will be no particular moral framework that anyone agrees on.

So, pretty much how it's ALWAYS been. Hooray!

6 hours ago, Kalbear said:

Essentially - depending who you ask and when you ask Bakker - it is about the idea that the central concepts of meaning - truth, values, points of view - will be so subjectivized that no one will be able to 'know' anything, or even understand any context outside their own ingroup. Everything will be so partisan, so required to have specific cachet, that people will break into non-communicating clades and tribes. And those tribes will be able to justify anything - any crime, any issue - because there will be no particular moral framework that anyone agrees on.

Here's Bakker's blog post on it back in 2011, though I suspect he's changed this a lot since then.

Yikes. This is horrifying and all too believable.

1 hour ago, Ghjhero said:

Yikes. This is horrifying and all too believable.

Well, he goes further - hypothesizing using science to re-engineer humans to remove, well, any semblance of shared humanity. I personally tend to think that this is likely to turn out to be a good thing as much as it might be a bad one, because as humans go we kind of suck at a whole lot of things, and I suspect strongly that removing a whole lot of those things might make us inhuman - but would make us better. Bakker believes that humans will always go for the worst of all things, and so he envisions a group of people that happily removed all compassion and sympathy in order to improve their pleasure; I envision creating humans that increase their compassion because compassion (such as gratitude, or parental love, or love of pets) has been shown to be the most enduring and lasting form of continuous happiness humans have. 

My suspicion is that the engineering of humans one way or another is going to be a far more difficult lift than Bakker thinks, so it won't happen any time soon. 


'Better' in what way, when the very things in the brain that would call something 'better' have been removed?

That'd be the same as making robots which behave in certain ways and saying 'Those are better humans'.

13 hours ago, Dora Vee said:

So, pretty much how it's ALWAYS been. Hooray!

That was my reaction too. I never really looked into what Bakker means by Semantic Apocalypse but I think his story Crash Space (?) depicts what Kalbear is talking about. 

On 1/16/2018 at 3:13 PM, Ghjhero said:

What is the Semantic Apocalypse? I’ve heard Homo Deus is about humanity’s future so I assume Sapiens is about our past?

Well, everyone else pretty much answered that first question.  Here are the "talking points" from each of the three parts Homo Deus is divided into.

Part 1: Homo sapiens Conquers the World

Quote

What is the difference between humans and all other animals?

How did our species conquer the world?

Is Homo sapiens a superior life form, or just the local bully?

Part 2: Homo Sapiens Gives Meaning to the World

Quote

What kind of world did humans create?

How did humans become convinced that they not only control the world, but also give it meaning?

How did humanism – the worship of humankind – become the most important religion of all?

Part 3: Homo Sapiens Loses Control

Quote

Can humans go on running the world and giving it meaning?

How do biotechnology and artificial intelligence threaten humanism?

Who might inherit humankind, and what new religion might replace humanism?

Each part addresses a question in a chapter.  If those seem like something you'd be interested in reading about, it's probably a book for you.

Keep in mind, Harari's writing is far, far more readable than Bakker's philosophy stuff.

On 1/17/2018 at 0:28 AM, Kalbear said:

Well, he goes further - hypothesizing using science to re-engineer humans to remove, well, any semblance of shared humanity. I personally tend to think that this is likely to turn out to be a good thing as much as it might be a bad one, because as humans go we kind of suck at a whole lot of things, and I suspect strongly that removing a whole lot of those things might make us inhuman - but would make us better. Bakker believes that humans will always go for the worst of all things, and so he envisions a group of people that happily removed all compassion and sympathy in order to improve their pleasure; I envision creating humans that increase their compassion because compassion (such as gratitude, or parental love, or love of pets) has been shown to be the most enduring and lasting form of continuous happiness humans have. 

My suspicion is that the engineering of humans one way or another is going to be a far more difficult lift than Bakker thinks, so it won't happen any time soon. 

I have to agree with @Callan S. here. Your description is very dystopian and frightening. I’d much rather humanity remain the way it is than turn into a soulless husk passed off as “compassion”.

4 hours ago, .H. said:

Well, everyone else pretty much answered that first question.  Here are the "talking points" from each of the three parts Homo Deus is divided into.

Part 1: Homo sapiens Conquers the World

Part 2: Homo Sapiens Gives Meaning to the World

Part 3: Homo Sapiens Loses Control

Each part addresses a question in a chapter.  If those seem like something you'd be interested in reading about, it's probably a book for you.

Keep in mind, Harari's writing is far, far more readable than Bakker's philosophy stuff.

This does seem pretty interesting, I’ll have to give it a shot. I’m glad to hear it’s pretty readable; I appreciate philosophy, but I need it dumbed down to understand it. 

2 hours ago, Ghjhero said:

I have to agree with @Callan S. here. Your description is very dystopian and frightening. I’d much rather humanity remain the way it is than turn into a soulless husk passed off as “compassion”.

I don't understand this argument at all. It's Bakker's viewpoint that humans would remove compassion to better further their pleasure, but that isn't remotely a certainty or even the most likely option.

Here's an example: humans have the capacity to genuinely think about and care about 100 or so people at a time. When the numbers get bigger than that, they don't trigger compassion. This is one reason why people freak out about seeing one war-ravaged Syrian child, but don't freak out when they hear 50,000 people died.

What if we changed that so you could care about every single human? That all of their lives and cares and dreams actually mattered to you? Forever Peace did something like this, and it was an interesting concept. This seems far from a soulless husk to me.

Another one: what if we removed the tendencies of tribalism in humans? The innate desire to protect those in our tribe and harm others? To justify the atrocities of our in-group and punish the others, even though they are the same actions?

What if we could engineer humans to deal less with short-term gains and desires in favor of longer-term values? Before you say this is cultural: monkeys were recently given money in an experiment, and the monkeys behaved virtually identically to humans in a marketplace - bad short-term decisions, impulse and waste, etc.

And that doesn't get into the medical issues that could be fixed: depression, anxiety, psychosis, trauma.

How is this dystopic?

 


To me it is the means rather than the ends that I find dystopic. Sure, being able to care about more than 100 people at a time and removing tribalism sounds great, but if this greater degree of compassion is reached through artificial means, how real is the compassion? It sounds more like a computer program being inserted into us than any great leap to the next level of enlightenment. I'd much rather own my moral behavior than have it attributed to whatever technology ends up being developed.

2 minutes ago, Ghjhero said:

To me it is the means rather than the ends that I find dystopic. Sure, being able to care about more than 100 people at a time and removing tribalism sounds great, but if this greater degree of compassion is reached through artificial means, how real is the compassion? It sounds more like a computer program being inserted into us than any great leap to the next level of enlightenment. I'd much rather own my moral behavior than have it attributed to whatever technology ends up being developed.

But here's the thing - you don't own your moral behavior. Your moral behavior is a product of the environment you live in and the genetic happenstance of millions of years of bizarre social mammalian codings, along with random stuff that just happens to work out vaguely okay-ish. 

I will once again link to my favorite video of all time - unfairness in Capuchin monkeys.

The point is that things like fairness, care, obeying authority, disgust, fear, joy of acceptance - these aren't things that are just learned - they're innate traits of humans as mammals, tied intrinsically to us at a very basic level. 

Now, it's cool that you want to become super awesome and enlightened on your own. That's great. That's also an obvious problem, because you're only caring about your personal achievement, not how well it works for humans everywhere or how it works in the long term. And you'd rather choose that way even if it means everyone else fails and the world sucks and there's absurd amounts of suffering because it is somehow more 'legit', even though it's very obvious that the objectively 'better' version is 7 billion humans who have a long-term existence largely free of suffering, pain and hate towards each other. 

And you're telling me that you prefer the 'natural' way while talking to me thousands of miles away on a computer. 

2 hours ago, Kalbear said:

What if we changed that so you could care about every single human?

What if we removed you and replaced you with a robot that acts better than you? But we called it 'you'. Wouldn't that be great?

I mean, I'll grant that killing off all humans and replacing them with robots the former humans might often call 'better behaved' would result in that better behavior.

But the idea of swapping components and modifying the brain while 'you' somehow remain there - it's magical. It's not even Ship of Theseus stuff, since we aren't even replacing components but removing them and adding in whole new ones.

Brain modification is like solving a Rubik's Cube by pulling off all the stickers and sticking them back on the 'right' sides, rather than actually solving the moral problem we present ourselves. (Actually it's worse than pulling off the stickers; it's more like 3D printing a solved cube and throwing out the old one, but w/e.)

 

Quote

- humans have the capacity to genuinely think about and care about 100 or so people at a time. When the numbers get bigger than that, they don't trigger compassion. This is one reason why people freak out about seeing one war-ravaged Syrian child, but don't freak out when they hear 50,000 people died.

Actually, I think people do freak out, but more in a general sense. If people cared about MANY the same way they cared about those closest to them, then people would go crazy. 


It occurs to me that the reaction that a process is 'unnatural' is precisely your purity/disgust values triggering, which is itself part of your innate wiring. You think of something as interfering with you 'naturally' and object to it.

But if you talk about it like therapy - something like antidepressants, or physical fitness - probably fine. If you talk about it like surgery? Nope, totally wrong. Huh.

1 minute ago, Dora Vee said:

Actually, I think people do freak out, but more in a general sense. If people cared about MANY the same way they cared about those closest to them, then people would go crazy. 

Nah. In Yemen, right now, hundreds of people are dying of hunger each day. 2.1 million people are malnourished. Even saying it that way is too big a number, but people are still not 'freaking out'. If people even vaguely cared, they'd be telling their congresspeople right now that this must be fixed ASAP. They aren't, and it's not being reported, and it's not being fixed, because it's just too far outside people's general circle of concern for it to matter to them.

And I can say this reasonably because this has been going on for well over a year, and little has been done. 


Well, if it's not reported, then how are people supposed to do anything? And honestly, people in, say, the US are a bit more concerned with their own problems in their own country than with Yemen. Even if they DID care a great deal, what would the US in general do?

1 hour ago, Dora Vee said:

Actually, I think people do freak out, but more in a general sense. If people cared about MANY the same way they cared about those closest to them, then people would go crazy. 

Some type of Nonman-style erraticness?

1 hour ago, Dora Vee said:

Well, if it's not reported, then how are people supposed to do anything? And honestly, people in, say, the US are a bit more concerned with their own problems in their own country than with Yemen. Even if they DID care a great deal, what would the US in general do?

That's a very different question - what the US CAN do - but the notion that people actually care one way or another is obviously flawed. And the idea that the US can't do much to dissuade one of its allies from blockading a port when they rely heavily on US assistance to do things like, well, blockade the port and actually launch airstrikes is somewhat laughable. 

And people in the US aren't that concerned about problems in their country, either. What has happened since the Las Vegas shooting as far as laws go? How about the Texas shooting? How many people still don't have power in Puerto Rico? How many homeless are there in Seattle? People just don't care that much unless it affects their personal lives heavily or it is something super duper scary. 

3 hours ago, Kalbear said:

It occurs to me that the reaction that a process is 'unnatural' is precisely your purity/disgust values triggering, which is itself part of your innate wiring. You think of something as interfering with you 'naturally' and object to it.

But if you talk about it like therapy - something like antidepressants, or physical fitness - probably fine. If you talk about it like surgery? Nope, totally wrong. Huh.

And here is the semantic apocalypse - i.e., when I have a good idea, it's a good idea. When you don't agree, you're being triggered in some sort of way I know is just a caveman response. For I am Kellhus.

In the semantic apocalypse there's always an excuse to be found not to treat someone as a peer. The other guy is always being triggered.

Then again, it's actually a natural instinct to dismiss others by some assertion of incapacity - sorry to trigger ya, Kalbear!

[That was dark humour, btw, to make the tone clear. It'd be pretty hypocritical of me to play the 'triggered' game.]

Again, as already said, surgery is not solving the puzzle; it's carving it up and building something else out of the pieces. If you want to give up on the puzzle, just say so. But jumping to surgery without saying you've given up - that's just denial. And yeah, it's a caveman instinct to think denial is bad. Let's cut that instinct out and then we'll be fine with it!

9 hours ago, Kalbear said:

It occurs to me that the reaction that a process is 'unnatural' is precisely your purity/disgust values triggering, which is itself part of your innate wiring. You think of something as interfering with you 'naturally' and object to it.

But if you talk about it like therapy - something like antidepressants, or physical fitness - probably fine. If you talk about it like surgery? Nope, totally wrong. Huh.

Surgery is the perfect analogue: it was largely illegal or profane in the Western world until about 250 years ago. And surgery became more socially acceptable because prior to that there was a methodical 200-year campaign to document the human body, largely starting with Vesalius and Harvey. Even after the revolution they started, many physicians were still educated on Galen for centuries.

In a way, it’s funny that we compare bad surgeons to butchers, when a few hundred years ago an illiterate butcher was likely to know more about anatomy and how muscles, bones, and ligaments fit together than any educated doctor. But it’s the doctor who would cut you up and probably kill you in the process, since your doctor didn’t believe blood circulated but thought it was a more or less magical fluid. (Given some people’s prognosis, it seemed a fair risk to try the doctor, considering the alternative was certain death anyway.) At least the butcher would know how to bleed out an animal. ;)

Application of systematic methodology (and a whole lot of heretical grave robbing) led to the establishment of Germanic schools like Jena. (IIRC the Italians who started most of it were more reliant on patronage, like that of the Medicis, and since surgery was “unnatural”, naturally the Catholic Church was ardently against this line of inquiry, meaning progress fled the inhospitable environs there.) France and England had a lot of catching up to do, but their rentier systems of aristocracy happened to unexpectedly spit out several crops of idle rentiers interested in investing large amounts of time and resources into studying the new sciences. So these countries didn’t fall terribly far behind and contributed some knowledge to mankind in the process.

Still, many scientists were uncomfortable with the quantification of the human body and of the natural world, leading to centuries of extensive studies into the mechanical aspects of the metaphysical; Fichte, Schelling, etc. were all certain their scientific methods and hypotheses would eventually bear fruit proving the existence of the soul, for example.

Remember that microbiology still isn’t really 200 years old, and persuading humanity (through proof) that invisible pathogens cause disease has been one of the greatest triumphs ever pulled off by humankind, though it was basically done through transference. (Not demons or supernatural intervention; no, the equally invisible pathogens are the cause, yup, trust us. Invisible pathogens are totally different from invisible sin causing disease.)

So, long way of saying: we will be totally okay with what Kalbear outlines once a transference strategy emerges to convert the disgust response into an in-group embrace.

Edited by lokisnow

This topic is now closed to further replies.
