Methinks the lady Epstein doth protest too much.
Senses, reflexes and learning mechanisms – this is what we start with, and it is quite a lot, when you think about it. If we lacked any of these capabilities at birth, we would probably have trouble surviving.
But here is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers.
Sorry, pal, but encoders and decoders are very much, absolutely, 100% something we're born with. Pressure waves hit the tympanum, which excites the cochlea, which is lined with hair cells tuned to narrow frequency bands. Ears are analog-to-digital encoders. Light is refracted by the cornea and lens onto the retina, which is coated in photoreceptors sensitive to light/dark, blue, green and red. Eyes are analog-to-digital encoders. The processes by which the brain interprets this discrete, particulate data are lossy algorithms. I'm no neuroscientist, but the terminology listed in italics is a mishmash of philosophy, biology and signal design, and if a dumbass mechanical engineer can pick three off the list with only two senses, the onus is on the author to disprove the computing metaphor, not to assert as fundamental fact that it's wrong because our brains are sugar-powered mush.
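The cochlea-as-frequency-analyzer point maps directly onto the discrete Fourier transform. Here's a minimal sketch (pure standard-library Python, a naive O(N²) DFT with parameter values I made up for illustration, not how you'd do it in production): a two-tone signal decomposed into frequency bins, the rough analogue of band-tuned hair cells each responding only to their slice of the spectrum.

```python
import cmath
import math

def dft(signal):
    """Naive discrete Fourier transform: project the signal onto each
    frequency bin, like a bank of band-tuned detectors."""
    N = len(signal)
    return [sum(signal[n] * cmath.exp(-2j * math.pi * k * n / N)
                for n in range(N))
            for k in range(N)]

N = 64
# Two pure tones: a loud one at 5 cycles per window, a quieter one at 12.
signal = [math.sin(2 * math.pi * 5 * n / N)
          + 0.5 * math.sin(2 * math.pi * 12 * n / N)
          for n in range(N)]

mags = [abs(x) for x in dft(signal)]

# Energy lands almost entirely in bins 5 and 12: the "detectors" tuned
# to those bands fire, the rest stay quiet.
peaks = sorted(range(N // 2), key=lambda k: mags[k], reverse=True)[:2]
print(sorted(peaks))  # -> [5, 12]
```

The biological version is messier (overlapping bands, nonlinear compression, a lossy readout downstream), but "continuous pressure wave in, per-band activations out" is encoding in exactly the engineering sense.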
In his book In Our Own Image (2015), the artificial intelligence expert George Zarkadakis describes six different metaphors people have employed over the past 2,000 years to try to explain human intelligence.
Hydraulics, automata and electronics each represented the zenith of technology at the time it was pressed into service as a metaphor. More importantly, all three obey the same system dynamics - a spring is a water column is a capacitor. That's why engineers study dynamic systems together; even the ebb and flow of predator/prey populations can be modeled with the same family of equations. The medium is not the message here: the interactions are governed by the same physics and the same math despite being taught on opposite ends of the campus.
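The "a spring is a water column is a capacitor" point is easy to make concrete: a mass-spring-damper and a series RLC circuit both obey a*x'' + b*x' + c*x = 0, so one integrator serves both. A minimal sketch (semi-implicit Euler with parameter values I picked for illustration, not drawn from any source):

```python
def simulate(a, b, c, x0, v0, dt=0.01, steps=1000):
    """Integrate a*x'' + b*x' + c*x = 0 with semi-implicit Euler.
    x is displacement (mechanical) or charge (electrical)."""
    x, v = x0, v0
    out = []
    for _ in range(steps):
        v += (-(b * v) - c * x) / a * dt
        x += v * dt
        out.append(x)
    return out

# Mass-spring-damper: m*x'' + damping*x' + k*x = 0
spring = simulate(a=1.0, b=0.5, c=4.0, x0=1.0, v0=0.0)

# Series RLC: L*q'' + R*q' + (1/C)*q = 0, with L=1 H, R=0.5 ohm, C=0.25 F
rlc = simulate(a=1.0, b=0.5, c=1 / 0.25, x0=1.0, v0=0.0)

# Same coefficients, same math, identical trajectories.
print(max(abs(s - r) for s, r in zip(spring, rlc)))  # -> 0.0
```

Swap in the hydraulic analogue (fluid inertance, resistance, capacitance of a tank) and the same code runs again; the underdamped oscillation and decay fall out of the coefficients, not the medium.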
I'm no fan of singularity cults. But telling a bunch of information-processing scientists they can't discuss human behavior using terminology from information processing doesn't make you insightful; it makes you an asshole. Stating unfounded theory as if it were fact doesn't make you clever when you then poke holes in it. And arguing that people draw better when there's a model in front of them doesn't make the entire metaphor null and void.