Friday, May 26, 2023
Utopia Now
Monday, May 22, 2023
The Future of Freedom
Thursday, May 11, 2023
Mind Crime
Humans are really, really bad at planning in advance to not be monsters. We have a pretty horrible ethical track record. Genocide and slavery seem to come pretty easily to most of us, given the right time period and circumstances. If there are internalized morals, we sure took our sweet time finding them. Generally, I don't think humans are in a position to make rational, ethical choices involving other conscious beings. Regardless of your take on factory farming, it is pretty clear we didn't spend decades deliberating the ethical issues in advance. Have you fully thought through the moral implications of factory farming, or are you just along for the ride? I am very worried that unaligned superintelligence will kill all of humanity, or enslave us, or torture us, or become authoritarian and lock in terrible values for eternity. Still, I am also worried about mind crime.
Look at our track record with slavery. Read about the Rwandan genocide, still within living memory. Look at the various authoritarian regimes and staggering human rights abuses across the planet. But don't worry, we will somehow care a lot, in advance, about the moral rights of artificial intelligences. From the industry that brought you social media (and don't worry, they totally thought through and predicted every negative ramification of that technology, and they have your best interest at heart) comes the new god! And don't worry, we will treat it well, and we totally won't be enslaving a morally significant being.
If we gain the ability to generate millions of digital minds, we gain the capacity for horrors worse than any genocide or slavery in humanity's past. We might not even do it on purpose, but just through sheer ignorance. It took a long time for people to treat other humans as morally significant. And by a long time I mean basically until fifty years ago in the U.S., and in many other countries this is still not the case. It isn't crazy to imagine that we will treat "computers" much worse. Mind crime will have to be legislated against early. If you knew slavery was about to become legal again in twenty years in the U.S., what policies would you put in place? How would you get ahead of the problem and ensure that morally significant beings aren't put in virtual hell? These are the questions we should all be asking.
The World Will End Because Math is Hard
I am a newbie to this field and Robert is the OG (someone who understands the entire stack). His take is entirely fair: companies will only be incentivized to curb the short-term risks that affect them directly. The elephant in the room is obviously the end of humanity, or worse. People who don't see this as feasible simply need to read "The Doomsday Machine" by Daniel Ellsberg. All this talk of nanotechnology makes us miss the obvious problem: we are a hair's breadth away from worldwide thermonuclear war at every moment. I wonder how things will change when a powerful, unaligned AI starts increasing its hold on such a world. Longtermists drastically undervalue the terror of events that kill 99% of people instead of 100%. As for long-term AI alignment, I think the number of researchers will matter, and I hope people in the AI safety industry would be incentivized to study long-term alignment outside of work hours. Maybe I'm wrong and there's not a strong impact, but I haven't managed to find many negative impacts of such a pursuit.
Wednesday, May 10, 2023
Company Thoughts: Part One
Mind Crime: Part 10
Standing atop the grave of humanity, smugly looking down, and saying "I told you so," is just as worthless as having done noth...
-
The treatment of digital minds will become the most important ethical dilemma of not only the next century, but of the remaining lifes...
-
I would structure a rough listing of digital mind rights as follows, completely off the cuff and spur of the moment: 1. The ability ...