Tuesday, June 23, 2015

The psychological theory of personal identity

Let's suppose that personal identity over time is secured by continuation of psychological states. Now imagine that Jim and Sally are robots who are persons (if you don't think robots could be persons, just suspend disbelief until I get back to the issue) and have almost all of their psychological states on their hard drives. According to the psychological theory, if you swap Jim's and Sally's hard drives, Jim and Sally will go with the hard drives rather than with the rest of their hardware. But here is something odd. When you unplug Jim's and Sally's hard drives during the swap, either Jim and Sally continue existing or they don't. If they do continue existing, then by the psychological theory they are surely located where the hard drives are, since that's where the memories are. They are basically reduced to hard drives.

There is a case to be made that they do continue existing, at least given the psychological theory of personal identity. First: To kill an innocent person, even temporarily (after all, many people, including me, believe that all our deaths are temporary!), is seriously wrong. But swapping hard drives doesn't seem problematic in this way. Second: There is some reason to think temporally gappy existence is impossible, and if gappy existence is impossible, then if Jim and Sally exist before and after the swap, they exist during it. Third (and this consideration is specific to the psychological theory): It is plausible that if the identity of a person across time is secured by a part of the person, then the person can exist reduced to that part. Thus, if the identity of a person comes from the soul, then the person can survive reduced to a soul.

So we have this: Given the psychological theory, Jim and Sally exist reduced to hard drives. But that's absurd! For we can replace the hard drives with cruder mechanisms. We can imagine a computer whose memory is constituted by writing in a large book. It is absurd to think a person can exist reduced to a book. So we should reject the psychological theory.

Well, the argument assumed that robots could be persons. Maybe they can't. It is also true that our memories do not sit on a convenient isolated piece of hardware in the brain. But surely agents could have evolved whose memories are stored on a convenient isolated piece of hardware, and such agents could be persons. And the argument could be run for them.

7 comments:

Heath White said...

I don't think the psychological theory of PI is most plausibly identified with the memory theory of PI ... psychology encompasses more than memory. So nobody should think a person could be reduced to a book or, in fact, any other storage device. Surely the relevant part of 'psychology' includes both active and passive aspects of a person.

Alexander R Pruss said...

Just make sure the software is on the hard drive, and so the personality comes along with the drive.

Heath White said...

My thought was, so to speak, that the software needs to be running somehow, which argues for some kind of hardware (embodiment), though perhaps no particular hardware is important.

Anonymous said...

"First: To kill an innocent person, even temporarily, is seriously wrong. But swapping hard drives doesn't seem problematic in this way."

Well, it's certainly wrong if you cannot undo the killing. (That's one reason why it isn't problematic for God to kill someone; cf. Heb. 11:19.) But under this scenario, "resurrecting" a robot is as easy as plugging the hard drive back in.


"But surely agents could have evolved whose memories are stored on a convenient isolated piece of hardware, and such agents could be persons."

Hm, I wonder whether this may be begging some questions about the difference between an actual memory (personality, etc.) and a representation of one. But then, maybe that's the point.

"My thought was, so to speak, that the software needs to be running somehow, which argues for some kind of hardware (embodiment), though perhaps no particular hardware is important."

But how can we make a principled distinction between "not running" and "running in a comatose state"?

Austin said...

I guess this presupposes some sort of functionalist or broadly materialistic account of the mind. Given that assumption, it makes sense.

Are you familiar with MacIntyre's narrative theory of personhood? He devotes a chapter of After Virtue to explaining it, and I thought it was really interesting, if not the most robust explanation.

Michael Gonzalez said...

"But surely agents could have evolved whose memories are stored on a convenient isolated piece of hardware"

I wouldn't be so sure, Dr. Pruss. If memory, as it occurs in agents, is necessarily not a matter of storage but of re-creation of experience, then no such agent could exist. Moreover, even if we expand this discussion to include more aspects of psychology (as Heath suggests, and I agree), it could be that NONE of those is the kind of thing that is "stored" (regardless of medium). It could be that thinking, remembering, and acting in ways characteristic of your personality are all just activities which we naturally engage in. Activities cannot be stored. They are performed. The being which has causal continuity from moment to moment, and about which it can be said that it did all these "personality-activities" in just such a way, is a continuing identity, and a story/biography can be told coherently about this entity. If its causal continuity were ever interrupted, it would cease to exist.

So, your robots are not agents if their "personality" is a matter of stored information. On the other hand, they could be agents in this enactive way that I'm suggesting, and then the only thing that could kill them is the cessation of causal continuity from moment to moment. But then, causal continuity of which parts? On this view, I think we'd have to say it's whatever parts do the control-panel sort of operation. In our case, that's the brain. In the robots' case, it is probably something on the motherboard. But removing the hard drive may be more like giving a person severe amnesia. The person does not cease existing. On the other hand, if you cause her control panel to cease functioning, then she does indeed die and cease existing.

Joshua said...

"It is absurd to think a person can exist reduced to a book."

Why? Isn't that what John's gospel is about?
