Moore's Law spoiled us. Back in 1965 Gordon Moore (then at Fairchild Semiconductor, later a co-founder of Intel) predicted that the density of transistors on integrated circuits would double each year, while fabrication costs would fall roughly 50% per year. By the 1980s this rate of advance had reached personal computers, and we all got used to seeing a faster computer each year for the same price as last year's PC - or last year's PC at half the price.
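The compounding in that claim is easy to underestimate, so here is a minimal sketch of the arithmetic - annual doubling of density and halving of cost. The 10-year horizon is my illustrative assumption, not a figure from the prediction:

```python
def moores_law(years, density=1.0, cost=1.0):
    """Project relative density and cost after `years` of annual
    doubling (density) and halving (fabrication cost)."""
    for _ in range(years):
        density *= 2   # density doubles each year
        cost *= 0.5    # cost per function falls ~50% each year
    return density, cost

density, cost = moores_law(10)
print(density)  # 1024.0 -- roughly a thousandfold gain in a decade
print(cost)     # 0.0009765625 -- cost falls by the same factor
```

That thousandfold-per-decade curve is what trained everyone to expect that waiting solves hard problems.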
This was a good thing - it drove the IT industry to live on that kind of curve. The security industry is just now getting hit with Moore's Law expectations, which is another good thing.
However, it has also led to the naive expectation (or flabby analysis) that any hard problem will be solved in a few years simply because processors will get faster and cheaper. The latest place I saw this was an article in MIT Technology Review on using chaotically fluctuating lasers to encrypt messages. The last line of the piece says: "Mirasso estimates that using lasers to keep information private is roughly five years away from commercial viability."
There are a lot of things (speaker-independent, connected-speech voice translation; artificial intelligence; cars that drive themselves; VCRs that figure out the time and don't lose it on power blips; Windows PCs that do only what you want them to do...) that require big ahas! - step-function advances where someone thinks up an entirely new way of doing things - not just CPU price/performance advances to do the old things faster and cheaper.