
What's the worth of an uploaded mind vs AI?


Sci-2

Recommended Posts

AI could be more suited to the environment it is running on, and so more likely to develop beyond the task/become too valuable/become too dangerous.

The heterogeneity of a lot of different mind uploads might also increase safety; a monoculture, or even a well-understood AI, could be more vulnerable to attacks and hacks.
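That monoculture argument can be sketched as a toy simulation (purely illustrative; the single shared flaw, the numbers, and the function names are all assumptions for the sketch, not anything from the setting): identical copies all fall to one working exploit, while uploads with unique "wiring" each have to be cracked separately.

```python
import random

random.seed(0)  # fixed seed so the sketch is repeatable

def compromised(fleet, exploit):
    """Count how many minds in the fleet share the exploited weakness."""
    return sum(1 for weakness in fleet if weakness == exploit)

# A monoculture: 1000 identical copies, all with the same flaw.
monoculture = [42] * 1000
# Heterogeneous uploads: each has its own effectively unique "wiring".
uploads = [random.randrange(10**6) for _ in range(1000)]

exploit = 42  # the attacker finds the monoculture's one flaw
print(compromised(monoculture, exploit))  # 1000 -> total compromise
print(compromised(uploads, exploit))      # almost certainly 0
```

The asymmetry is the whole point: one exploit scales across identical copies for free, but against heterogeneous minds the attacker's cost grows with each target.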

 

Unless there's an argument for why we can upload a brain but are unable to edit it?

Well, it's like how you don't seem to be considering that hey, why even wait to upload the brain - just edit the meat!

 

Why wouldn't they do that?

 

Well, that's why the future society doesn't duplicate uploaded brains - because of some unstated reason.


Ethics? Also, we don't need to understand a brain print in order to upload it, just provide a suitable medium for it to operate in. That is, we only need to understand the hardware it runs on to create a virtual machine onto which an operating system can be cloned; the OS might run fine on the virtual machine, but that doesn't give us the source code, and editing software without the source code by directly patching the binary files is... not easy, if you want to do anything other than just break it.
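The virtual-machine analogy can be made concrete with a toy stack machine (entirely made up for illustration; the opcodes and byte values are assumptions of the sketch): to *run* an opaque program blob we only need to implement the instruction set it targets, and blindly patching one byte of the blob without the source tends to just break it.

```python
def run(blob):
    """Tiny stack machine: 0x01 = PUSH next byte, 0x02 = ADD, 0xFF = HALT."""
    stack, pc = [], 0
    while True:
        op = blob[pc]
        if op == 0x01:            # PUSH: put the next byte on the stack
            stack.append(blob[pc + 1]); pc += 2
        elif op == 0x02:          # ADD: pop two values, push their sum
            b, a = stack.pop(), stack.pop(); stack.append(a + b); pc += 1
        elif op == 0xFF:          # HALT: return top of stack
            return stack.pop()
        else:
            raise ValueError(f"unknown opcode {op:#x} at {pc}")

# We can host and run this blob without knowing anything about its "source".
program = bytes([0x01, 2, 0x01, 3, 0x02, 0xFF])   # computes 2 + 3
print(run(program))                                # -> 5

# "Editing the binary": flip one byte and the program simply breaks.
corrupted = bytes([0x01, 2, 0x01, 3, 0x07, 0xFF])
try:
    run(corrupted)
except ValueError as e:
    print("patched blob failed:", e)
```

The host only implements the "hardware" contract (the opcodes); it has no insight into what the program means, which is exactly the post's point about uploading without understanding.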

 

Let's forget about ethics, assume we're all on the boards of varied dystopian future hypercorporations for this discussion. :-)

 

I was thinking about this too: could the virtual brain have a kind of complex-beyond-transhuman-intelligence set of properties to it? While most humans can't read machine code, I'd assume an augmented meatspace human intelligence could, or an uploaded mind that had itself been augmented via its relationship to the computer it's hosted on.

 

AI would be smarter and thus more dangerous?

 

That's a good point. Eclipse Phase has a distrust of AIs, at least the general-intelligence AIs, because of an AI-vs-humanity war that nearly wiped out humans. Yet there are AIs with general intelligence still around, just unable to self-improve.

 

AI could be more suited to the environment it is running on, and so more likely to develop beyond the task/become too valuable/become too dangerous.

The heterogeneity of a lot of different mind uploads might also increase safety; a monoculture, or even a well-understood AI, could be more vulnerable to attacks and hacks.

 

Maybe part of my problem is I've never understood how AIs become so smart they easily exceed the bounds of their programming and end up as threats.

 

I do like the idea that heterogeneity of minds makes it more difficult for hackers to exploit any weaknesses. Since we know human minds of different shapes and sizes can function, this does make sense. An uploaded brain could be incredibly hard to figure out [and thus hack] if utilized for varied tasks.

 

Nice one Seli!

 

Well, it's like how you don't seem to be considering that hey, why even wait to upload the brain - just edit the meat!

 

Why wouldn't they do that?

 

Well, that's why the future society doesn't duplicate uploaded brains - because of some unstated reason.

 

I think editing brains happens in just about all transhumanist fiction? The exception might be Mindjammer, but they also say mind uploading is impossible in the setting.


The separation of consciousness from its physical architecture annoys me - this conflation of software/hardware with consciousness/brain structure.

 

Not saying this is the case here, but there seems to be some sort of distinction drawn between the two - as if they are fundamentally separate. As I understand it, the physical structure of the brain and consciousness are co-dependent.

 

Anyway, interesting reading; just adding my example from literature:

 

In Gary Gibson's Shoal Sequence, a member of the Shoal (a water-based species) called "Trader in Faecal Matter of Animals" is put into the body of a human. This causes a huge amount of trauma, as the alien is used to living in three-dimensionally navigable space and "breathing" water. This is a punishment by members of its own race, and apparently one of the worst you could get.


Felice,

 
There could be laws against copying brain prints. That's certainly the case in the Richard Morgan books (or at least, you're not allowed to download the same brain print into more than one body at a time).


Isn't it fairly well established that if it can be done, it will be done, unless the law against it is vigorously enforced or there is significant social stigma attached to being caught doing the thing that has been declared illegal or taboo?

I also recently started Penrose's book Shadows of the Mind. He argues that what you are talking about is not possible, because the human mind is not simply a complex computational device but something much more than that. Even if it could be copied and downloaded, it would not be the same thing in a silicon context.

Isn't there at least one (somewhat older) sci-fi story where a human brain is transformed into a silicon-based (or otherwise electronic) "brain" and the person goes quite insane during the process?

 

In any case, isn't the idea of punishment by involuntary embodiment a techno-secularized version of some traditional karmic-reincarnation punishment? So the immediate "appeal" (as a plausible and rather horrid punishment) is quite understandable, and might overrule technological plausibility in those stories.


 

...Maybe part of my problem is I've never understood how AIs become so smart they easily exceed the bounds of their programming and end up as threats.

 

I do like the idea that heterogeneity of minds makes it more difficult for hackers to exploit any weaknesses. Since we know human minds of different shapes and sizes can function, this does make sense. An uploaded brain could be incredibly hard to figure out [and thus hack] if utilized for varied tasks.

 

Nice one Seli!

 

...

I don't know if I fully believe the logic myself, but I'm throwing out some possible (in-universe) reasons for the practice.

 

It might depend on how the AI came to be: assuming it was evolved or trained, one can imagine it will keep doing so once deployed, perhaps in unwanted or unpredictable ways. If the bounds of its programming are effectively non-existent due to the nature of the AI, it might be wise not to use such AIs in certain circumstances.

 

Another option is AI rights movements :) Perhaps a society has qualms about using AI for the menial tasks that convicted uploaded humans can do.


I don't know if I fully believe the logic myself, but I'm throwing out some possible (in-universe) reasons for the practice.
 
It might depend on how the AI came to be: assuming it was evolved or trained, one can imagine it will keep doing so once deployed, perhaps in unwanted or unpredictable ways. If the bounds of its programming are effectively non-existent due to the nature of the AI, it might be wise not to use such AIs in certain circumstances.

 

I guess I can see this idea of rogue AIs if we expand our notion of computing to something more than Turing machines.

 

I do like the idea that every brain has certain unique wiring (at least somewhat true, IIRC) as a form of cryptography. With that idea, and the one Felice had about meatspace navigation, I think we at least have some rationale for the uploading of minds and their use in labor. (Add in Relic's fear of true-born AIs as well for a bonus!)
 

Another option is AI rights movements :) Perhaps a society has qualms about using AI for the menial tasks that convicted uploaded humans can do.

"Who is that cleaning up the glass? A human child?! I *wish*!"
-Futurama
 

Felice,


Isn't it fairly well established that if it can be done, it will be done, unless the law against it is vigorously enforced or there is significant social stigma attached to being caught doing the thing that has been declared illegal or taboo?

I also recently started Penrose's book Shadows of the Mind. He argues that what you are talking about is not possible, because the human mind is not simply a complex computational device but something much more than that. Even if it could be copied and downloaded, it would not be the same thing in a silicon context.

 

Well, I don't think uploading a mind into a Turing machine is possible, but I'm avoiding that discussion here because in transhumanist fiction it's assumed to be possible most of the time. That said, Hameroff - Penrose's partner in formulating Orch-OR - does think uploading is still possible; it would just be different from the usual idea of computer code, and would instead require some kind of lattice structure.

 

(Another possibility is McFadden's idea of an EM field in a feedback loop with firing neurons. That would definitely be an interesting case of uploading, different from an "infomorph" in virtual space... though perhaps the vulnerabilities, and the desire to stay in or leave virtual space, would come out the same.)

 

Probably best to avoid this debate here, though, as we've spilled a lot of ink over it in the now-dead consciousness threads. :cool4:


Archived

This topic is now archived and is closed to further replies.
