Age of Gen: a picture of a transhuman society

(See here for my criticism of Robin Hanson's Age of Em. This post is an alternate characterization of a futuristic transhuman society.)

Consider the following six "levels" of technology, roughly corresponding to successive "orders" of automation:
  1. Tools, which require the intervention of a higher-level device to perform anything useful.
  2. Machines, or mechanized devices: they run on their own, but only perform "simple" tasks.
  3. Computers, or devices with CPUs, which can automate processes through logic.
  4. General Computers, or programmable computers.
  5. AI, i.e. machine learning. They perform tasks that are hard to define. If computers are about logical inference, AI is about statistical inference.
  6. General AI, which are capable of making decisions of their own free will, among other human things.
Each of the first five technologies will continue to exist -- much like the microcontroller in an airplane's control system has not been replaced by a full-fledged programmable device. But General AIs are the key objects of interest to us -- we will call them Gens for short. These are the descendants of human beings, whether through upload or just by virtue of being intelligent.

We will refer to ordinary biological humans as Biols (although they may be variously technologically enhanced to prevent aging/death, etc.). Presumably Biols will be a small minority, if nothing else because their reproduction is far slower than that of the Gens.


Important futurism milestones:
  1. Intelligent AI
  2. General AI 
  3. Optimal AI
  4. Value-aligned AI
  5. VR and game protocol (game development)
  6. Transhuman body (robotics)
  7. Mind transfers -- G2G, B2G, G2B (biotechnology, engineering)
  8. Reviving the dead -- frozen, miscellaneous (biotechnology, Gen technology)

Philosophy of mind

Utility functions

General AIs may have any "utility function" (or loss function, in machine-learning language) programmed into them, which determines, to an extent, how they behave (although ideally this should carry some uncertainty, as humans prefer to have free will).

Presumably, the first humans to "convert" into AI form will choose utility functions similar to their original ones, although other systems may emerge -- incredibly foreign systems that make questions like "are all Gens human/worthy of moral consideration?" and "how do you even measure a Gen's happiness?" genuinely hard. In fact, Gens may choose to adopt multiple utility functions/personalities depending on the context (e.g. a perfectly rational utility function for decision-making, but a separate human-like utility function while in the Duat; see the Games and Virtual Reality section).
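As a toy illustration, here is a minimal sketch of context-dependent utility functions, assuming a Gen's preferences can be represented as plain functions over world-states. Every name, weight and structure below is a hypothetical illustration, not a claim about how Gens would actually be built:

```python
# Hypothetical sketch: a Gen with one utility function per context.
from dataclasses import dataclass, field
from typing import Callable, Dict

WorldState = Dict[str, float]          # toy stand-in for a full world model
Utility = Callable[[WorldState], float]

def rational_utility(state: WorldState) -> float:
    # e.g. pure resource maximization, used for decision-making
    return state.get("resources", 0.0)

def human_utility(state: WorldState) -> float:
    # e.g. a human-like mix of pleasure and company, used in the Duat
    return 0.7 * state.get("pleasure", 0.0) + 0.3 * state.get("friends", 0.0)

@dataclass
class Gen:
    utilities: Dict[str, Utility] = field(default_factory=dict)
    context: str = "decision"          # switched as the Gen moves between contexts

    def evaluate(self, state: WorldState) -> float:
        return self.utilities[self.context](state)

gen = Gen(utilities={"decision": rational_utility, "duat": human_utility})
gen.context = "duat"                   # entering the Duat swaps the personality
print(gen.evaluate({"pleasure": 1.0, "friends": 0.5}))   # 0.85
```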

The carrot-and-stick question re-emerges: how do you know if a Gen is really happy, given that the sign of the loss function and the location of "neutral" are just artifacts of an arbitrary co-ordinate system? I would argue that our judgement of this as humans is equally arbitrary -- our "neutral" is just what we're used to. When discussing the torture of Gens, what we should really fear is enslaving them: preventing them from making decisions however they see fit.

In other words, we should take a libertarian/preference-utilitarian approach to moral questions, rather than a naive utilitarian one, as the latter would just be ill-defined in this society (and probably in the present one too, but that's beside the point of this article).

Identity

Regardless of how they are created (whether or not there is some element of "scanning" that goes into it), perhaps a common question is what determines the identity of a Gen -- how do you determine if the Gen that has been created on your behalf is you? 

I would be comfortable saying that memories are the key aspect -- if you remember being you, you are you. This is the general philosophy I will refer to on multiple occasions throughout this article. However, a Gen's personality is relevant to whether others perceive it as the same individual.

Does operating as multiple agents with a synced memory (see Memory syncing and Mind transfers and copying) "feel" like being a single individual? What does it feel like to have one of those agents die, for example? What does it feel like to die and then have your memories be transferred onto another Gen? These are unanswerable questions to a Biol like my current self -- it is like asking a flatlander to perceive in 3 dimensions, or someone born blind to see (see Games and Virtual Reality).

Note the exotic behaviour of identity possible in a Gen society. For example, you may only partially sync the memories of two brains, making them "kinda" the same person, or introduce various correlations between their memories. You can have an entire society of Gens where each brain is almost identical to its neighbour but increasingly different from faraway brains, so that you have a continuum of identity rather than a discrete space.
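A toy sketch of this "continuum of identity", under the loudly hypothetical assumption that identity overlap can be measured as the fraction of shared memories:

```python
# Hypothetical sketch: identity overlap as the fraction of shared memories.
def identity_overlap(memories_a: set, memories_b: set) -> float:
    """1.0 = the same individual, 0.0 = entirely distinct people."""
    if not memories_a and not memories_b:
        return 1.0
    return len(memories_a & memories_b) / len(memories_a | memories_b)

# A society where each brain shares most memories with its neighbour but
# drifts away from faraway brains -- a continuum of identity:
society = [set(range(i, i + 100)) for i in range(0, 200, 10)]
print(identity_overlap(society[0], society[1]))    # ~0.82: "kinda" the same person
print(identity_overlap(society[0], society[-1]))   # 0.0: a distinct individual
```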

Architecture of a Gen

Hardware

It is important to note that a Gen need not look human in the outside world at all; if nothing else, human-looking Gens will become less and less common as virtual reality (see Games and Virtual Reality) advances.

Gens are fundamentally just computers, but a Gen can be fitted with any possible peripherals, giving it various physical abilities relating to movement, observation, communication, and manufacturing. Some standard such fittings (see the sketch after this list) may include:
  • Drone rotors
  • Hand-like tools and weapons
  • A repair kit
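A minimal sketch of this architecture, assuming peripherals can be modelled as swappable plugins behind a common interface; the interface and the peripheral names are illustrative assumptions, not a real specification:

```python
# Hypothetical sketch: a Gen body as a computer with swappable peripherals.
from abc import ABC, abstractmethod

class Peripheral(ABC):
    @abstractmethod
    def act(self, command: str) -> str:
        """Execute a physical command and report the result."""

class DroneRotors(Peripheral):
    def act(self, command: str) -> str:
        return f"rotors: executing '{command}'"

class RepairKit(Peripheral):
    def act(self, command: str) -> str:
        return f"repair kit: executing '{command}'"

class GenBody:
    def __init__(self) -> None:
        self.peripherals: dict[str, Peripheral] = {}

    def fit(self, name: str, peripheral: Peripheral) -> None:
        # peripherals can be attached, swapped, or removed at any time
        self.peripherals[name] = peripheral

    def act(self, name: str, command: str) -> str:
        return self.peripherals[name].act(command)

body = GenBody()
body.fit("rotors", DroneRotors())
print(body.act("rotors", "ascend 10m"))
```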

Software

Although a Gen is "most importantly" a General AI, the fact that it runs on a computer allows it the flexibility of running more specialized programs (AI or otherwise) -- basically for algorithmic and repetitive tasks.

A single piece of hardware may, in principle, host multiple Gens. However, it is the software, and not the hardware, which should be seen as the fundamental individual, with rights.

Gen behavior

Games and virtual reality

Gens spend much of their time (clarification later on what this means) within their shells, virtually interacting with some software -- this is a generalization of both dreams and human-computer interaction, and is achieved by switching (or possibly augmenting) the Gen's I/O from the actual hardware peripherals to some simulated I/O.  
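A sketch of this I/O switch, assuming a Gen's mind reads and writes through a swappable backend; all interfaces and names here are hypothetical illustrations:

```python
# Hypothetical sketch: the Duat as a swap of the mind's I/O backend.
from abc import ABC, abstractmethod

class IOBackend(ABC):
    @abstractmethod
    def sense(self) -> dict: ...
    @abstractmethod
    def act(self, action: str) -> None: ...

class PhysicalIO(IOBackend):
    def sense(self) -> dict:
        return {"source": "hardware sensors"}    # would query real peripherals
    def act(self, action: str) -> None:
        print(f"hardware executes: {action}")

class SimulatedIO(IOBackend):
    """A Duat game: percepts and actions route through a simulation."""
    def __init__(self, world: dict) -> None:
        self.world = world
    def sense(self) -> dict:
        return self.world
    def act(self, action: str) -> None:
        self.world["last_action"] = action       # only the simulation changes

class GenMind:
    def __init__(self, io: IOBackend) -> None:
        self.io = io
    def enter_duat(self, game: IOBackend) -> None:
        self.io = game                           # the mind itself is untouched
    def step(self) -> None:
        percept = self.io.sense()
        self.io.act(f"respond to {percept}")

mind = GenMind(PhysicalIO())
mind.step()                                      # acting in the real world
mind.enter_duat(SimulatedIO({"scene": "haunted house"}))
mind.step()                                      # same mind, simulated world
```

The design point is that the Gen's mind is unchanged by the switch: only the source of its percepts and the target of its actions differ.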

This makes available a whole new "virtual world" or platform, known as the Duat.

The Duat can be understood as a collection of games. A typical Duat game involves the Gen taking on an avatar and interacting with his environment.

Games may be of various interface types such as:
  • Virtual Reality games
  • Rich text, multimedia games and tools (e.g. ordinary Internet websites and applications)
  • Knowledge/training applications 
  • Some completely exotic formats that Biols cannot even comprehend -- e.g. 
    • The avatar may or may not have a human or even humanoid form
    • Some exotic new senses of perception (even something like images at higher resolution than the human eye qualify, but in principle, you could have mechanisms to "feel" all sorts of things)
    • A different number of spatial/temporal dimensions
    • Some very exotic behavior of the locus of consciousness.
Games may be offline or online. A very large number of online, multiplayer games -- as well as realistic interactive simulations of the Earth at various points through history -- would exist, as Gens with human-like utility functions value interpersonal interaction. 

One function of the Duat would be to allow Gens to experience anything they could as Biols -- but of course, they could also experience far greater pleasures, and, depending on its utility function, a Gen may have very different desires from those of Biols.

Memory syncing

Because identity is determined by memories, playing with how memories work creates the prospect for a whole host of exotic, essentially mythological notions of being both in the real world and in the Duat. 

The first such tool is memory syncing: syncing (some or all) memories between Gens, allowing an individual to have multiple avatars -- to be in multiple places and perform multiple tasks at once. This is parallel computing taken to the extreme. It is also a useful backup mechanism.
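A toy version of such a sync, assuming memories are just timestamped entries and a full sync is a set union (both agents end up remembering everything either did); the structures are illustrative assumptions, not a real protocol:

```python
# Hypothetical sketch: a full memory sync as a set union of timestamped entries.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Memory:
    timestamp: float
    content: str

@dataclass
class Avatar:
    name: str
    memories: set = field(default_factory=set)

def sync(a: Avatar, b: Avatar) -> None:
    """After a full sync, both bodies remember everything either one did."""
    merged = a.memories | b.memories
    a.memories, b.memories = set(merged), set(merged)

# One individual running as two avatars in parallel:
left = Avatar("avatar-1", {Memory(1.0, "negotiated a contract")})
right = Avatar("avatar-2", {Memory(1.1, "explored a Duat game")})
sync(left, right)
assert left.memories == right.memories   # one identity, two bodies
# If avatar-1's hardware is destroyed, avatar-2 retains everything:
# the sync doubles as a backup.
```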

Memory editing

A Gen may choose to -- perhaps temporarily -- suppress or edit some of its memories. This may be, e.g. for the purpose of highly immersive VR experiences (the Gen may want to genuinely believe he is in a haunted house, going through childhood, or discovering general relativity for the first time). 
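A minimal sketch of temporary suppression, assuming memories can be filtered by a predicate and restored afterwards; the context-manager framing is purely illustrative:

```python
# Hypothetical sketch: temporarily hiding memories for an immersive experience.
from contextlib import contextmanager

@contextmanager
def suppress_memories(memories: set, predicate):
    """Hide matching memories for the duration; restore them on exit."""
    hidden = {m for m in memories if predicate(m)}
    memories -= hidden                # the Gen genuinely does not remember
    try:
        yield
    finally:
        memories |= hidden            # the memories return afterwards

memories = {"derived general relativity", "ate breakfast"}
with suppress_memories(memories, lambda m: "relativity" in m):
    print(memories)                   # {'ate breakfast'}: ready to rediscover GR
print(memories)                       # both memories restored
```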

Production, conversion and transport of Gens

Gen (re-)production

Gens are programmed as AIs and fitted with utility functions and memories. These utility functions and memories may be based on mind transfers.

Mind transfers

Mind transfers involve scanning a brain's memories and traits to install them onto another body. This includes Biol-to-Gen transfers, Gen-to-Gen transfers and Gen-to-Biol transfers. 

B2G transfers are used for the original upload process. G2B transfers may be used for backups, or if someone really wants a biological body (although such bodies will themselves probably be synthetically produced).

G2G transfers are used for backups, cloning and teleportation.
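A sketch of the scan-and-install pipeline common to all three transfer types, under the assumption that a mind reduces to memories plus a utility function; every name here is a hypothetical illustration, not a real API:

```python
# Hypothetical sketch: any transfer as scan (serialize) -> install.
from dataclasses import dataclass

@dataclass(frozen=True)
class MindImage:
    """The body-independent payload of a transfer."""
    memories: frozenset
    utility_function: str    # stand-in for a serialized utility function

@dataclass
class Body:
    kind: str                # "biol" or "gen"
    memories: set
    utility_function: str

def scan(body: Body) -> MindImage:
    return MindImage(frozenset(body.memories), body.utility_function)

def install(image: MindImage, target: Body) -> None:
    target.memories = set(image.memories)
    target.utility_function = image.utility_function

# A G2G transfer doubling as teleportation: scan here, install there.
here = Body("gen", {"walked on Mars"}, "human-like")
there = Body("gen", set(), "")
install(scan(here), there)
print(there.memories)        # {'walked on Mars'}: the individual has "moved"
```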

Gen Society

Habitation and industrial activity

Real-world Gen habitation will be radically different. Entire industries present today -- most notably agriculture and healthcare -- will no longer be present. The lack of a need for agriculture in particular will free vast amounts of land for other uses. Many other industries -- education, entertainment, retail, marketing -- will be moved to the Duat or otherwise virtualized. 

Gens could in principle be (partially) self-contained -- a true rugged individualism -- with some repair facilities, energy generation, manufacturing facilities, housing facilities, etc. built into themselves. Or they may concentrate around urban facilities/cities that provide these services. This depends on the precise costs of operation of these devices versus the cost of the time needed to visit these shops, although Gen society is likely to move towards a "rugged individualism" as resource costs decline.

Gens are likely to view their software as more "fundamental" to their being, using mind transfer (i.e. teleportation) for most long-distance transport.

While intelligence basically becomes an infinite resource, the economy is still limited by the availability of physical resources, and the laws of physics themselves (most notably the speed of light, which places a limit on how fast we can expand across the universe). 

Culture

Gen culture is likely to be very diverse, and much of it completely exotic to us. Human or even humanoid notions of race, tradition, gender, sexuality and even species are unlikely to apply in a recognizable way to Gens that adopt utility functions different from standard human ones. There is likely to be a great deal of diversity in the forms of interpersonal relationships.

Ethics, violence, law and government

Efficient IP markets

What makes information and knowledge markets inefficient is that there is no way to prevent a buyer from re-sharing information. I.e. there is no barbed wire for IP. Information transactions in a Gen society may involve the implantation of a small program that prevents the buyer from doing so.

Also: memory editing can be used to eliminate information asymmetry, as it allows buyers to "try out" a product and then erase their memory of the usage.
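A toy version of such a try-before-you-buy transaction, assuming a trial's memories can be checkpointed and rolled back; the function below is purely illustrative:

```python
# Hypothetical sketch: a "try before you buy" trade built on memory erasure.
def trial_transaction(buyer_memories: set, product: str, buys: bool) -> set:
    checkpoint = set(buyer_memories)          # snapshot taken before the trial
    buyer_memories.add(f"experienced {product}")
    if buys:
        return buyer_memories                 # purchase: keep the experience
    return checkpoint                         # no purchase: roll the trial back

memories = {"an old memory"}
memories = trial_transaction(memories, "a symphony", buys=False)
print(memories)   # {'an old memory'}: nothing was learned without payment
```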

Crimes to watch out for

  • Child enslavement: Creating a Gen, then subjecting them to something their utility function does not prefer without allowing them to leave. A serious issue here is definitional -- remember how I suggested (under Memory editing) that one may choose to temporarily suppress their memories for an experience? What if I decide to temporarily replace my memories and torture myself? Is the person being tortured even me? Or is it my child? Am I allowed to program the memories of this person to disappear and be replaced with mine? Or would that be taking his life? 
  • Kidnapping: Similar to above, but you sync your child's memories with someone so you've basically kidnapped them. 
  • Mindless destruction: With such incredible computing power available to all, how do we make sure that someone doesn't just find a way to manufacture tons of antimatter and destroy the world with it? Or, you know, just capture someone and torture them? Sure, we can develop better defense mechanisms: but how do we make sure the good guys stay ahead of the bad guys?
  • Breaking encryption: Once again with such incredible computing power available, our current encryption systems are obviously going to be broken easily. Sure, we also have a greater ability to come up with better systems, but how do we make sure the good guys stay ahead of the bad guys?
  • Hacking: Hacking can cause serious trouble including memory editing, getting people stuck in the Duat, torture and death. Once again: we will also have the power to develop incredibly better security systems, but how do we make sure the good guys stay ahead of the bad guys?
  • Deepfakes: A problem for law enforcement, if there even is a centralized law enforcement. Evidence will have to be of a fundamentally higher standard, if a justice system is even to be a thing.
  • Strategic partial suicide for obstruction of law: It is easy to game whatever legal theory of personhood we use in such an exotic society. E.g. if a person is determined by his memories, then a criminal could temporarily erase his memory of committing a crime and copy it to a drive, so that the evidence is merely his inviolable private property rather than part of a criminal person.
  • Overpopulation? I don't know what I think about overpopulation, or if it's a thing. Can someone just produce a massive number of Gens that require an incredible quantity of resources, starving all the Gens and causing the entire system to completely collapse? 
In general, if we don't adopt any regulation, whoever expends the most resources on becoming powerful will become the most powerful -- things would advance just way too fast for any government structure to keep up. Keeping ahead of all the Gens who value nothing but criminal behavior might require other Gens to value almost nothing but preventing criminal behavior.

(Part of the question is also what is physically permissible -- how good can deepfakes get? How good can a justice system get in uncovering past events (e.g. could you just calculate past states of the world from the current state)?)

How do we solve this problem?

Note that solutions to this problem need to be general, targeted towards "any" immoral behavior or rights-violation, rather than catered to the specific enumerated crimes above, as the range of possible serious crimes can be far more extensive than the ones I've described, depending on the exact physical laws (e.g. if it turns out that time travel is possible, it's essential to make sure nobody does it). The solutions also need to be airtight, unlike the laws we have today, due to the sheer destructive potential of these crimes. 

(You might think: what if we just banned the development of General AI? Well, that will fail spectacularly. It's the standard "good guys must have nukes" argument. If you don't develop it first, someone else will, and they might be the bad guys. "Okay," you say, "But I just want to make sure that a Gen isn't developed in my lifespan, so I don't get tortured." Well, please be assured that the aggressive Gen will be perfectly capable of reviving you from the dead.)

There are two general modes of solution to this problem: (1) create economic incentive systems to regulate behaviour, as we do right now with humans; (2) align the Gen's utility system to make sure it doesn't cause the destruction of property rights.

As far as I'm aware, no specific solution in the first category has been proposed.

The second is known as the value alignment problem.

Well, you should be able to see why this problem might be non-trivial:
  • The utility system should be able to "recurse": being non-evil means you shouldn't produce evil children either.
  • Most property isn't privatized, so formulating what it means to destroy property, when it comes to things like "eating the Milky Way", is complicated.
  • On a similar note: basically every action violates property rights to some infinitesimal extent -- what is known as an "externality" -- so you need some rational economic calculation of this stuff.
  • You can't just scan a human brain or something, because humans are not infallible, and are perfectly capable of criminal behaviour (while we want our Gens to have a zero probability of significant violence), which may scale particularly badly with power/ability. 
  • Perhaps we should aim for (Hofstadter-style) superrationality between all human beings, to e.g. prevent the creation of basilisks and prevent possible Newcomb-style aliens from gaming us.
But the general idea is that you start with a few Gens with the correct utility functions, then develop some Police Gens to make sure no humans are producing evil Gens (because a non-evil Gen by definition does not produce evil Gens, as that would cause property rights violations). One thing that helps us is that non-violence is really the only thing we care about. Everything else is just personal preference, and a Gen will be economically productive if it wants anything from other people (like electricity). And if some Gens don't want anything from other people, then they can exist without trade anyway. 
