It seems like what Tononi is getting at is that human experience cannot be modeled on a computer precisely because we can't erase it willingly.
I'd be interested to know what he makes of H.M. (Henry Molaison) and other patients with various kinds of amnesia.
EDIT: Because I feel compelled to go a little more in depth:
Today, Phil Maguire at the National University of Ireland and a few pals take this mathematical description even further. These guys make some reasonable assumptions about the way information can leak out of a consciousness system and show that this implies that consciousness is not computable. In other words, consciousness cannot be modelled on a computer.
Is 'leaking out' really the best way to describe it, though? There are a few interesting theories of forgetting such as trace decay and interference theories. Given that we're not sure exactly how forgetting works, is Tononi jumping the gun?
Tononi's a big shot and well established so I'll tread carefully, but that doesn't mean he's bulletproof.
Maguire and co begin with a couple of thought experiments that demonstrate the nature of integrated information in Tononi’s theory. They start by imagining the process of identifying chocolate by its smell. For a human, the conscious experience of smelling chocolate is unified with everything else that a person has smelled (or indeed seen, touched, heard and so on). This is entirely different from the process of automatically identifying chocolate using an electronic nose, which measures many different smells and senses chocolate when it picks out the ones that match some predefined signature.
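The electronic-nose half of that contrast is easy to make concrete. Here's a toy sketch of signature matching; the sensor channels, the "chocolate" signature values, and the tolerance are all invented for illustration:

```python
# Hypothetical electronic nose: compares raw sensor readings against a
# stored signature. Note that nothing about the match depends on any
# other smell the device has ever encountered -- the opposite of the
# integrated human experience described above.

CHOCOLATE_SIGNATURE = [0.82, 0.10, 0.45, 0.07]  # invented channel values

def matches_chocolate(reading, tolerance=0.05):
    """True if every sensor channel is within tolerance of the signature."""
    return all(abs(r - s) <= tolerance
               for r, s in zip(reading, CHOCOLATE_SIGNATURE))

print(matches_chocolate([0.80, 0.12, 0.44, 0.08]))  # True
print(matches_chocolate([0.30, 0.60, 0.90, 0.50]))  # False
```

The point of the sketch is that the "experience" of chocolate here is a standalone record, compared channel by channel, with no entanglement with anything else the device has sensed.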
The electronic nose has no connections to an electronic thalamic dorsomedial nucleus (which would in turn project to an electronic association cortex) or to an electronic amygdala, both of which receive projections from the olfactory bulb. Which raises an odd question: can we program an amygdala?
A key point here is that it would be straightforward to access the memory in an electronic nose and edit the information about its chocolate experience. You could delete this with the press of a button. But ask a neuroscientist to do the same for your own experience of the smell of chocolate—to somehow delete this—and he or she would be faced with an impossible task since the experience is correlated with many different parts of the brain.
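For the machine, the "press of a button" deletion is literal. A minimal sketch, with an invented storage layout:

```python
# A machine's smell memory is just an addressable record; removing one
# "experience" leaves every other entry completely untouched.
nose_memory = {
    "chocolate": [0.82, 0.10, 0.45, 0.07],
    "coffee":    [0.15, 0.77, 0.30, 0.22],
}

del nose_memory["chocolate"]          # the whole experience is gone

print("chocolate" in nose_memory)     # False
print("coffee" in nose_memory)        # True -- unaffected by the deletion
```

The neuroscientist's "impossible task" is precisely that no such addressable record exists for the brain's chocolate experience.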
I'm going out on a limb here, but consumer operating systems, for example, ship with deletion mechanisms built in. Our brains don't, because targeted erasure was never evolutionarily advantageous.
The brain has a hilarious amount of integration compared to computers, too, and has had a whopping 4 billion years to evolve from our single-celled ancestors. Computers don't need to be able to learn that that smell over in that direction is a lion and that the lion needs to be fled from. Computers have also been around a few hundred years at most by the broadest definitions of the word, acted on by human minds and not the indifferent, undirected forces of natural selection.
The brain, say Maguire and co, must work like this when integrating information from a conscious experience. It must allow the reconstruction of the original experience but without storing all the parts.
The reconstruction of the original experience is almost always distorted to a degree.
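One way to caricature that kind of storage is a superposition memory. This is a toy analogy of my own, not Tononi's formalism or Maguire's proof: every experience is summed into one shared trace, recall is approximate resonance against that trace, and there is no separate record left to subtract if you wanted to "delete" just one experience.

```python
import random

random.seed(42)

DIM = 256
trace = [0.0] * DIM  # one shared store for every experience

def store(exp):
    """Add an experience into the shared trace (no separate record kept)."""
    for i, x in enumerate(exp):
        trace[i] += x

def familiarity(cue):
    """Normalized dot product: how strongly a cue resonates with the trace."""
    return sum(c * t for c, t in zip(cue, trace)) / DIM

# Five random "experiences" folded into the same trace.
experiences = [[random.choice((-1.0, 1.0)) for _ in range(DIM)]
               for _ in range(5)]
for e in experiences:
    store(e)

novel = [random.choice((-1.0, 1.0)) for _ in range(DIM)]

print(familiarity(experiences[0]))  # near 1: recognized, but noisy
print(familiarity(novel))           # near 0: unfamiliar

# Deleting experiences[0] would require subtracting its exact vector --
# but the system never kept that vector separately, only the entangled
# sum. Recall is reconstruction from the whole, and it is lossy.
```

The stored experiences interfere with one another, so recall is distorted by the other contents of the trace, which loosely mirrors the point about reconstruction being approximate.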
The laws of physics are computable, as far as we know. So critics might ask how the process of consciousness can take place at all if it is non-computable. Critics might even say this is akin to saying that consciousness is in some way supernatural, like magic. But Maguire and co counter this by saying that their theory doesn’t imply that consciousness is objectively non-computable, only subjectively so.
I think this is the key argument.
There is something of a card trick about this argument. In mathematics, the idea of non-computability is not observer-dependent so it seems something of a stretch to introduce it as an explanation.

What’s more, critics might point to other weaknesses in the formulation of this problem. For example, the proof that conscious experience is non-computable depends critically on the assumption that our memories are non-lossy. But everyday experience is surely the opposite—our brains lose most of the information that we experience consciously. And the process of repeatedly accessing memories can cause them to change and degrade. Isn’t the experience of forgetting the face of a known person well documented?

Then again, critics of Maguire and co’s formulation of the problem of consciousness must not lose sight of the bigger picture—that the debate about consciousness can occur on a mathematical footing at all. That’s indicative of a sea change in this most controversial of fields.
This is pure speculation on my part, but maybe the mathematics required to compute consciousness are simply beyond our world's capabilities at the moment and the discoveries are lurking around the corner after we make a few intermediate ones.