Well, 100% certainty? I'm not sure we can be 100% certain of almost anything. And to be sure, it is an area of study and inquiry filled with speculation, so you have to be careful to weed out the sense from the nonsense. But the rationale for stating that other organisms live entirely in the present comes from the fact that no other animal possesses language, and therefore no other animal runs a narrative. And it is narrative that allows you to leave the present; otherwise you are simply experiencing percepts (perceptions). It is true that these percepts will accumulate over time and give you a type of "biological memory" - "last time I touched that stove I burned myself, so I shouldn't touch it again" or "last time I encountered the smell of this human he was nice, so it's OK to go up to him". But the organism isn't thinking about these experiences in narrative form. So you can think of biological memory as a sort of intelligent guide built from past experience - but the guide isn't letting you revisit that past. I wrote about this here. In contrast, humans can come up with concepts/abstractions, interesting cognitive tricks that help us recall past experiences in detail. Human language also helps us explore the vast realms of the future. As Jonathan Marks stated:
And in terms of dreams: of course nonhumans dream, but their dreams probably do not have a narrative arc - they are probably just flashes of sensation (sights, smells, sounds, etc.) that never get a cohesive, character-based narrative explanation.
Yes, chimpanzees really are the most impressive species in terms of "death awareness". Take, for example, this observation from a recent Current Biology study:
But it is clear from observations that they do not have a symbolic death awareness (again, no language).
What I'm trying to do here - and I hope to explore the concept in detail in an academic paper (not ready for that yet) - is make the term singularity more easily definable. Obviously the term "technological singularity" is attempting to capture a similar phenomenon. It is a term borrowed from mathematics and physics to describe a point in time when seemingly infinite technological change (at least to the human mind) starts to occur on finite time scales. But the problem with this term is that technological change is always relative to your time and place, so it is hard to rigorously quantify when we have "reached singularity". (I discussed this a bit in a previous post.) Surely it wouldn't be a biological human making the declaration, but even so, the term seems to lack a certain objectivity.
Another popular definition of singularity is a time when a greater-than-biological human intelligence emerges. But again, this would be difficult to quantify, as biological intelligence varies to such a profound degree. After all, Albert Einstein and Paris Hilton are both biological humans. And it's possible that biological humans could massively enhance their intelligence with synthetic biology or nanotechnology... so is that the singularity? Or not? Hard to say.
Futurists George Dvorsky and Ramez Naam (both very respectable thinkers and theorists) recently had a conversation about the usefulness of the term technological singularity and raised similar concerns. Dvorsky prefers the term "Intelligence Explosion" to refer to a machine superintelligence. I agree with him that machine superintelligence is the key concept for singularity, but that term doesn't seem like a useful one either, considering that an "Intelligence Explosion" could be said to have occurred in the past (e.g., the Renaissance, the Enlightenment, the Scientific Revolution... heck, even now, in the "Information Age").
What I think we need is a term borrowed not from physics and mathematics, but from chemistry and biology. Only once in the history of evolution has a new form of evolution emerged and become an independent process: abiogenesis. I think you're aware that I did a video on abiogenesis, as well as an extensive review of a 2013 paper on the phenomenon. This was an event in which chemical replicators managed to produce entities that could grow and replicate on their own (biology).
Since then, biological evolution has been the only form of independent evolution on the planet (because evolution works on the basis of differential replication, and there have been no non-biological replicators). Of course, the human mind has set forth cultural evolution, which is carried by the linguistic code and produces complex technology. Its primary replicator would be the meme, or the idea. But cultural evolution still lives in a hybrid world, and it creates an organism that is a hybrid of codes. This is why the human animal is such a weird creature. Over time, however, cultural evolution is getting much stronger. As I state in a recent publication:
This "atechnogenesis" would be a time when cultural replication - "idea sex" - replaces biological sex as the means of constructing novel complexity. Culture could leave biological evolution and finally gain its independence. Idea sex (i.e., conversation) is already getting very strong, and it is close to producing new minds - mind children, as roboticist Hans Moravec would put it. But in order for mind to become self-sufficient, it will need to exist within a substrate of its own making (good thing we're working on the Human Brain Project and the Connectome Project). So atechnogenesis is A) easily definable, B) an evolutionary process, and C) a unique description of the event futurists are trying so hard to understand.
The implications deserve their own book. But let's imagine a future hubski. It would be a place where, instead of good minds coming together and typing to each other in written language (an imperfect way to communicate thoughts/feelings/ideas), we could share our brain patterns and form higher levels of consciousness. Hubski would become an "orgy of minds" (as would everything else for the subjective consciousness of technologically-based life). We would be sharing brain patterns directly instead of communicating the surface of our thoughts with language. We would be able to combine and re-combine minds in new ways, creating new "theaters of consciousness". Ben Goertzel has tried to describe this world, if you're interested. Also, two of my colleagues at the Global Brain Institute wrote a great piece about the nature of a cultural evolutionary world where "worldviews", not "genomes", are in competition/cooperation with one another.