Progress usually seems to be a kind of punctuated equilibrium. The amount we know about the fundamental processes behind these technologies has accumulated just as quickly as it ever has (quite a bit faster, actually). The problem is that paradigm shifts, huge leaps forward in viable applications of theoretical understanding, require a relatively full set of knowledge. If you're working towards the next big thing, knowing 95% of what you need to know is, to the general public, as good as knowing 5% of what you need to know. Ultimately, it's not enough. So you build your body of knowledge, learn more and more and more about the world, keep refining what you can refine, and maybe 5% of what you learn is applicable at the time. And then, after a long, gradual process, the applications come about in abundance.

If it takes, say, 100 kJ/mol of energy for some chemical reaction to take place, the difference between adding 10 kJ/mol to the system and adding 90 kJ/mol is:

1. 80% of the total required energy
2. Completely indistinguishable as far as the result goes (both fail completely to initiate the reaction)

The difference between 99 kJ/mol and 100 kJ/mol is:

1. 1% of the total energy required
2. The difference between a failed reaction and a successful reaction

Assuming a constant rate of change, it'll take 80 times as long to get from 10 kJ/mol to 90 kJ/mol as it does to get from 99 kJ/mol to 100 kJ/mol, and the outcome looks identical over that entire stretch, but the progress isn't being made any more slowly.
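The threshold behavior the analogy relies on is easy to make concrete. Here's a minimal sketch in Python using the made-up figures above (real reaction kinetics are of course far more involved than a hard cutoff):

```python
# Illustrative threshold only, not real chemistry: the analogy's made-up activation energy.
ACTIVATION_ENERGY = 100.0  # kJ/mol required before the reaction proceeds at all

def reaction_occurs(energy_kj_per_mol: float) -> bool:
    """The observable outcome is a step function of the energy supplied."""
    return energy_kj_per_mol >= ACTIVATION_ENERGY

for energy in (10, 90, 99, 100):
    status = "reaction" if reaction_occurs(energy) else "no reaction"
    print(f"{energy:>3} kJ/mol -> {status}")

# 10, 90, and 99 kJ/mol all print "no reaction"; only 100 kJ/mol crosses the threshold,
# even though 90 kJ/mol is already 90% of the required input.
```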
Edit: Also, while the author of this article is a "science journalist," he's still a layperson with a profoundly shallow understanding of the technologies he's writing about. Just because a layperson doesn't see progress (or doesn't recognize the progress they do see as significant) doesn't mean progress isn't being made.