"But that’s pretty much it – we currently know of no other major ways to exploit Moore’s Law for compute performance, and once these veins are exhausted it will be largely mined out."
This is an awfully big post on Moore's Law not to include any mention of memristors. [1]
The man is discussing paradigm shifts. While memristors will be pretty awesome if they ever come online in commercial production, I'm not aware of any major paradigm shifts they will cause (or prevent)...?
I've only a casual interest in the area, but these are some things I read around the time of the announcement:
"Memristive devices could change the standard paradigm of computing by enabling calculations to be performed in the chips where data is stored rather than in a specialized central processing unit. Thus, we anticipate the ability to make more compact and power-efficient computing systems well into the future, even after it is no longer possible to make transistors smaller via the traditional Moore’s Law approach."
– R. Stanley Williams, senior fellow and director, Information and Quantum Systems Lab, HP
"Since our brains are made of memristors, the flood gate is now open for commercialization of computers that would compute like human brains, which is totally different from the von Neumann architecture underpinning all digital computers."
– Leon Chua, professor, Electrical Engineering and Computer Sciences Department, University of California at Berkeley.
It just seems odd to go into such depth on transistor density and CPU/memory architectures (and potential future architectures) without mentioning memristors.
I agree that utilization of cloud resources will be an increasingly fundamental component of modern device architecture, but - at the risk of sounding hyperbolic - if memristors live up to their promise, we're talking about supercomputers the size of the human brain. [1]
[1] http://www.hpl.hp.com/news/2010/apr-jun/memristor.html