Volume 1, Issue 4
4th Quarter, 2006


Indirect Mind Uploading:
Using AI to Avoid Staying Dead

Paul Almond


The process of mind uploading would be as follows:

1. A human brain is scanned in enough detail to capture a digital representation of its structure.
2. A computer model of the brain is constructed from that representation.
3. The model is run on a computer, producing a mind with the same memories and personality as the original person.

The suggestion here is that it is not important that your brain itself survives: your identity should be associated with the information describing your personality rather than with any particular collection of matter. In this view, if something continues to exist with the same memories and personality that you had, then you can be regarded as still being alive.

Image 3: A graphic representation of mind uploading. Copyright http://www.transtopia.org

There are two main philosophical issues with mind uploading:

1. The strong AI issue: whether a computer running such a model could genuinely be conscious.
2. The personal identity issue: whether something that merely has your memories and personality can really be regarded as being you.

For the purposes of this article, mind uploading has an obvious weakness: neither the scanning technology to capture a digital representation of a human brain with acceptable accuracy nor the computing power to actually run a computer model derived from it is available now. If you are planning on dying in the near future, unless there is a surprising technological breakthrough, you will not have this process at your disposal.

The Strong AI Issue
Mind uploading relies on the idea that computers can be conscious (whatever that means) and can experience things as human beings do. This is important: there would be little point in trying to continue your consciousness using a machine that could not actually be conscious.

The term strong AI was coined by the American philosopher John Searle for the proposition that machines can be conscious, and he described it as follows:

"...according to strong AI, the computer is not merely a tool in the study of the mind; rather, the appropriately programmed computer really is a mind, in the sense that computers given the right programs can be literally said to understand and have other cognitive states. In strong AI, because the programmed computer has cognitive states, the programs are not mere tools that enable us to test psychological explanations; rather, the programs are themselves the explanations." [2]

The strong AI case would seem to imply that:


Footnotes
1. Chislenko, A. (n.d.). Drifting Identities. Retrieved June 22, 2003, from http://www.aleph.se/Trans/Cultural/Philosophy/identity.html

2. Searle, J. R. (1980). Minds, brains, and programs. The Behavioral and Brain Sciences, 3, 417-457.
