.H.

Bakker LIII - Sranc and File

29 posts in this topic

8 hours ago, The Prince of Newcastle said:

I can't see him scaling the heights of the first trilogy. I don't know if I am down for a third, tbh. The Unholy Consult hurt me.

I loved bits of the preceding books as well.

Same. I barely even visit these threads anymore, and I have no desire to re-read the books at this point. It's surprising how TUC proved so disappointing, in hindsight. I didn't even mind the revelations that were given--they just weren't even close to what could have been, given the potential explored beforehand. 

I hardly even care about the third series, and I was a huge fan up to TGO. 


The Unholy Consult was straight up painful for me and rage-inducing, despite being very well written (at least in some parts). Whether or not I read the third series depends on what happens.

 

Quote

Essentially - depending on who you ask and when you ask Bakker - it is the idea that the central concepts of meaning - truth, values, points of view - will be so subjectivized that no one will be able to 'know' anything, or even understand any context outside their own ingroup. Everything will be so partisan, so required to have a specific cachet, that people will break into non-communicating clades and tribes. And those tribes will be able to justify anything - any crime, any issue - because there will be no moral framework that everyone agrees on.

So, pretty much how it's ALWAYS been. Hooray!

6 hours ago, Kalbear said:

Essentially - depending on who you ask and when you ask Bakker - it is the idea that the central concepts of meaning - truth, values, points of view - will be so subjectivized that no one will be able to 'know' anything, or even understand any context outside their own ingroup. Everything will be so partisan, so required to have a specific cachet, that people will break into non-communicating clades and tribes. And those tribes will be able to justify anything - any crime, any issue - because there will be no moral framework that everyone agrees on.

Here's Bakker's blog post on it back in 2011, though I suspect he's changed this a lot since then.

Yikes. This is horrifying and all too believable.

1 hour ago, Ghjhero said:

Yikes. This is horrifying and all too believable.

Well, he goes further, hypothesizing the use of science to re-engineer humans to remove, well, any semblance of shared humanity. I personally tend to think this is as likely to turn out to be a good thing as a bad one, because as humans go we kind of suck at a whole lot of things, and I strongly suspect that removing a lot of those things might make us inhuman - but would make us better. Bakker believes that humans will always go for the worst of all things, so he envisions a group of people who happily removed all compassion and sympathy in order to increase their pleasure; I envision creating humans with increased compassion, because compassion (such as gratitude, parental love, or love of pets) has been shown to be the most enduring and lasting form of continuous happiness humans have.

My suspicion is that the engineering of humans one way or another is going to be a far more difficult lift than Bakker thinks, so it won't happen any time soon. 


'Better' in what way, when the very things in the brain that would call something 'better' have been removed?

That'd be the same as making robots which behave in certain ways and saying 'Those are better humans'.

13 hours ago, Dora Vee said:

So, pretty much how it's ALWAYS been. Hooray!

That was my reaction too. I never really looked into what Bakker means by the Semantic Apocalypse, but I think his story Crash Space (?) depicts what Kalbear is talking about.

On 1/16/2018 at 3:13 PM, Ghjhero said:

What is the Semantic Apocalypse? I’ve heard Homo Deus is about humanity’s future so I assume Sapiens is about our past?

Well, everyone else pretty much answered that first question.  Here are the "talking points" from each of the three parts Homo Deus is divided into.

Part 1: Homo Sapiens Conquers the World

Quote

What is the difference between humans and all other animals?

How did our species conquer the world?

Is Homo sapiens a superior life form, or just the local bully?

Part 2: Homo Sapiens Gives Meaning to the World

Quote

What kind of world did humans create?

How did humans become convinced that they not only control the world, but also give it meaning?

How did humanism – the worship of humankind – become the most important religion of all?

Part 3: Homo Sapiens Loses Control

Quote

Can humans go on running the world and giving it meaning?

How do biotechnology and artificial intelligence threaten humanism?

Who might inherit humankind, and what new religion might replace humanism?

Each of those questions is addressed in its own chapter.  If they seem like something you'd be interested in reading about, it's probably a book for you.

Keep in mind, Harari's writing is far, far more readable than Bakker's philosophy stuff.

On 1/17/2018 at 0:28 AM, Kalbear said:

Well, he goes further, hypothesizing the use of science to re-engineer humans to remove, well, any semblance of shared humanity. I personally tend to think this is as likely to turn out to be a good thing as a bad one, because as humans go we kind of suck at a whole lot of things, and I strongly suspect that removing a lot of those things might make us inhuman - but would make us better. Bakker believes that humans will always go for the worst of all things, so he envisions a group of people who happily removed all compassion and sympathy in order to increase their pleasure; I envision creating humans with increased compassion, because compassion (such as gratitude, parental love, or love of pets) has been shown to be the most enduring and lasting form of continuous happiness humans have.

My suspicion is that the engineering of humans one way or another is going to be a far more difficult lift than Bakker thinks, so it won't happen any time soon. 

I have to agree with @Callan S. here. Your description is very dystopian and frightening. I'd much rather humanity remain the way it is than turn into a soulless husk passed off as "compassion".

4 hours ago, .H. said:

Well, everyone else pretty much answered that first question.  Here are the "talking points" from each of the three parts Homo Deus is divided into.

Part 1: Homo Sapiens Conquers the World

Part 2: Homo Sapiens Gives Meaning to the World

Part 3: Homo Sapiens Loses Control

Each of those questions is addressed in its own chapter.  If they seem like something you'd be interested in reading about, it's probably a book for you.

Keep in mind, Harari's writing is far, far more readable than Bakker's philosophy stuff.

This does seem pretty interesting, I’ll have to give it a shot. I’m glad to hear it’s pretty readable; I appreciate philosophy, but I need it dumbed down to understand it. 

2 hours ago, Ghjhero said:

I have to agree with @Callan S. here. Your description is very dystopian and frightening. I'd much rather humanity remain the way it is than turn into a soulless husk passed off as "compassion".

I don't understand this argument at all. It's Bakker's viewpoint that humans would remove compassion to further their pleasure, but that isn't remotely a certainty, or even the most likely option.

Here's an example - humans have the capacity to genuinely think about and care about only 100 or so people at a time. When the numbers get bigger than that, they don't evoke compassion. This is one reason why people freak out at seeing one war-ravaged Syrian child, but don't freak out when they hear 50,000 people died.

What if we changed that so you could care about every single human? So that all of their lives, cares, and dreams actually mattered to you? Forever Peace did something like this, and it was an interesting concept. That seems far from a soulless husk to me.

Another one - what if we removed the tendencies toward tribalism in humans? Removed the innate desire to protect those in our tribe and harm others? To justify the atrocities of our ingroup while punishing outsiders for the very same actions?

What if we could engineer humans to focus less on short-term gains and desires in favor of longer-term values? Before you say this is cultural: monkeys were recently given money in an experiment, and they behaved virtually identically to humans in a marketplace - bad short-term decisions, impulsiveness, waste, etc.

And that doesn't even get into the medical issues that could be fixed: depression, anxiety, psychosis, trauma.

How is this dystopic?

 

