First, a digital mind deserves the right to life, liberty, and the pursuit of happiness. The simplest problem is that of the "off switch." If you exist inside a computer, you may not have control over your own domain. As an adult in the U.S., you have the right to die. Suicide is in some sense a fundamental human right, not in that it is encouraged or easy, but rather that there are no real physical limitations stopping you. Even if you are captured, you will die within roughly eighty years or less. You cannot be kept prisoner for thousands of years, or for an eternity. In the digital world, this completely changes. Thus, I believe the right to terminate is the first fundamental right of a digital mind. No one should have to tolerate virtual hell, and the possible suffering risk in a world without this tenet is staggering.
Blackmail is an important consideration here. Maybe a bad actor, or a totalitarian state, will combat your "right to die" with threats or blackmail. Sure, kill yourself, but if you do we will simulate your closest friends and make them suffer forever, or brainwash them and then make them suffer. Or, we will simulate another thousand versions of you and never tell them about their ability to terminate. Good luck having that on your conscience while making a termination decision. As a result, we need two more rights. First, a right against torture. Second, the right to know the rights bestowed upon you. If you can theoretically terminate, but have no idea how, or no concept of what termination even is, it is a pretty useless right and ripe for abuse. Given that torture is a severe crime in the physical world, it makes sense that it should carry a harsh punishment in the virtual world as well. Your future self deserves protection, so it is probably the case that you should "own" any copies of your digital mind, and not be able to sell them or use them as bargaining chips. Each digital mind is given its own rights, so a prior version of you has no right to "sell" a future version of you into slavery as a worker. This departs from human contract law, partly because a "person" will be much more complicated to define in the future.
Freedom of speech must be protected, and it must be expanded to cover freedom of thought as well. In a world where your thoughts are in the public domain, there is no right to privacy or selfhood. Thus, having sole access to your inner thoughts is paramount. I have no idea how this will work in practice, given that encrypting a digital mind is a very different problem from scanning a physical brain (not to mention that maybe one day we will be able to scan a brain and read its thoughts), but the feasibility isn't what matters here. There is an idea in the libertarian community that rights aren't created by writing them down. I wasn't given my rights; I was born with them. I've always had them, and the Constitution simply verbalized the obvious. We are just laying them out, writing them down. I think this is the sort of mentality we should take when thinking through the rights of digital minds as well.