The parallel computing/manycore initiatives may be missing the point. The challenge of manycore computing is to burn up as many CPU cycles as possible doing things we don’t do today because the computational cost is too great. Making existing apps go faster is secondary.

Today’s focus on creating manycore development platforms like Mac OS X 10.6 Server’s Grand Central may be a subset of where the real action will be. Maybe current levels of parallelization are good enough for most apps. So what does that leave?
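For flavor, here’s a minimal sketch of the Grand Central style of parallelism, assuming a 10.6-era clang toolchain with blocks support: hand independent chunks of work to a system-managed concurrent queue and let the OS fan them out across cores. The chunk count and the printf are placeholders for real work.

```c
// Minimal Grand Central Dispatch sketch: dispatch_apply() runs the block
// once per index, in parallel, on a system-managed concurrent queue.
// Compile on Mac OS X 10.6+ with clang (blocks are a clang extension).
#include <dispatch/dispatch.h>
#include <stdio.h>

int main(void) {
    dispatch_queue_t q =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    // 8 chunks is an arbitrary example; real work replaces the printf.
    dispatch_apply(8, q, ^(size_t i) {
        printf("processing chunk %zu\n", i);
    });
    return 0;
}
```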

How else can we use manycore computing?
Some thoughts:

Application speed up. That won’t be the big win for current apps – most people feel current processors are fast enough; look at the popularity of the Eee PC. But I’d love Handbrake to rip my DVDs faster.

Advanced UI capabilities, such as voice recognition, that run as loosely coupled, independent processes. Your application won’t run any faster, but it will be easier to use. This is an area Microsoft is looking at. Historically, the UI has been a major consumer of improved CPU and display capability.
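To make the loose coupling concrete, here’s a sketch of one way to structure it – my illustration, not Microsoft’s design: the recognizer runs as its own process, potentially on its own core, and feeds commands to the UI over a pipe. The “recognition” here is a hard-coded stand-in.

```c
// Sketch: a loosely coupled UI helper as a separate process. The child
// plays the role of a voice recognizer; the parent plays the UI loop.
#include <stdio.h>
#include <string.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void) {
    int fd[2];
    if (pipe(fd) != 0) return 1;

    if (fork() == 0) {                   /* child: the "recognizer" */
        const char *cmd = "open file";   /* stand-in for real recognition */
        write(fd[1], cmd, strlen(cmd) + 1);
        _exit(0);
    }

    char buf[64];                        /* parent: the UI */
    read(fd[0], buf, sizeof buf);
    printf("UI received command: %s\n", buf);
    wait(NULL);
    return 0;
}
```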

New forms of communication and entertainment, such as 3D virtual worlds. This is an extension of the video editing market. And just think of the storage requirements!

Communities of cellular automata. One core, one or a few automata. For example, Brian Tung and Leonard Kleinrock’s 1996 paper Using Finite State Automata to Produce Self-Optimization and Self-Control discusses using automata to guide a group of agents cooperating on a task in a distributed systems environment.
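As a toy illustration of the one-core-one-automaton idea – not the Tung/Kleinrock mechanism itself – here each POSIX thread owns one cell of an elementary cellular automaton (rule 110), and the community advances in lockstep:

```c
// Toy "community of automata": one thread per cell, synchronized by a
// POSIX barrier each generation. On a manycore chip, each thread could
// be pinned to its own core. Compile with: gcc -pthread
#include <pthread.h>
#include <stdio.h>

#define CELLS 16
#define GENS  8

static int cur[CELLS], nxt[CELLS];
static pthread_barrier_t bar;

static void *cell(void *arg) {
    long i = (long)arg;
    for (int g = 0; g < GENS; g++) {
        int l = cur[(i + CELLS - 1) % CELLS];   /* left neighbor, wrapped */
        int c = cur[i];
        int r = cur[(i + 1) % CELLS];           /* right neighbor, wrapped */
        nxt[i] = (110 >> ((l << 2) | (c << 1) | r)) & 1;  /* rule 110 */
        pthread_barrier_wait(&bar);   /* all cells finish computing */
        cur[i] = nxt[i];
        pthread_barrier_wait(&bar);   /* all cells finish publishing */
    }
    return NULL;
}

int main(void) {
    pthread_t t[CELLS];
    pthread_barrier_init(&bar, NULL, CELLS);
    cur[CELLS / 2] = 1;                         /* one seed cell */
    for (long i = 0; i < CELLS; i++)
        pthread_create(&t[i], NULL, cell, (void *)i);
    for (int i = 0; i < CELLS; i++)
        pthread_join(t[i], NULL);
    for (int i = 0; i < CELLS; i++)
        putchar(cur[i] ? '#' : '.');
    putchar('\n');
    return 0;
}
```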

Optimistic computing. Defined by David Jefferson in a 1990 ACM paper titled Virtual Time II: Storage Management in Distributed Simulation as:

An optimistic simulation mechanism is one that takes risks by performing speculative computation, which, if subsequently determined to be correct, saves time, but which, if incorrect, must be rolled back.
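Stripped to its bones, the pattern looks something like this sketch: checkpoint, speculate past a not-yet-known input, then either keep the savings or roll back. Real Time Warp systems do this per event, with anti-messages; the ledger and inputs here are invented purely for illustration.

```c
// Sketch of Jefferson-style optimism: save state, compute speculatively
// on a guessed input, and roll back to the checkpoint if the guess was
// wrong when the real input finally arrives.
#include <stdio.h>
#include <string.h>

typedef struct { long ledger[4]; int step; } State;

int main(void) {
    State s = {{100, 200, 300, 400}, 0};
    State checkpoint;

    memcpy(&checkpoint, &s, sizeof s);   /* save state before taking the risk */

    int guess = 1;                       /* speculate: assume the input is 1 */
    s.ledger[guess] += 50;               /* speculative computation */
    s.step++;

    int actual = 2;                      /* the real input arrives late */
    if (actual != guess) {
        memcpy(&s, &checkpoint, sizeof s);   /* rollback to the checkpoint */
        s.ledger[actual] += 50;              /* redo with the correct input */
        s.step++;
    }
    printf("ledger[2] = %ld after step %d\n", s.ledger[2], s.step);
    return 0;
}
```

If the guess is right, the speculative work is free; if not, the cost is one state copy and a redo. Manycore makes that gamble cheap: idle cores can place the bets.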

Update: Rethinking virtualization. Once a core costs $3 and you’ve got 32 or 64 of them in a $2k server, why would you spend hundreds of dollars on software to create virtual machines when you’ve got dozens of real ones?

There’s value in easy migration of virtual machines from one physical server to another. A “thin” virtualization layer atop a manycore OS – Windows 7? – could enable Microsoft to take back VMware’s market cap and reassert control of the entire OS stack.
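One hedged sketch of that thin layer, using nothing fancier than Linux processor affinity: give each workload a dedicated physical core instead of a software-sliced virtual one. The core number below is an arbitrary example.

```c
// Sketch: claim a dedicated physical core for this process with Linux's
// sched_setaffinity(2), instead of carving up cores with a hypervisor.
#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>

int main(void) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(3, &set);                    /* core 3: an illustrative choice */
    if (sched_setaffinity(0, sizeof set, &set) != 0) {
        perror("sched_setaffinity");
        return 1;
    }
    printf("pinned to core 3; no hypervisor required\n");
    return 0;
}
```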
End update.

High desert optimist
Many performance enhancements already use optimistic concepts. But the ability to throw the massive compute of networks-on-a-chip at problems – oh, and how about reconfiguring those on-chip networks on the fly? – could take us in directions we, or at least I, can’t imagine.

The StorageMojo take
The first effort with any new technology is to recreate what you could do with the old technology. It is only with the second generation that the truly innovative stuff enabled by the new technology gets built.

Consider this an effort to short-circuit that historical process.

Comments welcome, of course. Thanks to Prof. West for pointing out the Jefferson paper to me.