I recently talked with someone in the computing industry; I'll forgo names as I'm so forgetful. In any case, his work is in neuroscience, and he developed an algorithm called MSC. This algorithm allows for advances in machine vision, based on how the brain sees images, and, more importantly, is very computationally expensive, so it requires parallel programming, and not that shifty Intel parallel programming, but real piece-by-sub-piece parallelism.
So he assumed that his work would be all for naught for a good few years, possibly a decade, as the academic world (which he holds in great disdain) has been unable since the '60s to come up with a solution to the problems within parallel programming. That is, until industry beat them to it. Nvidia has been working on this, and it hasn't been long since some basic programming breakthroughs were made; they now support (on any 4xx or Tesla GPU, AFAIK) full, unadulterated general-purpose (GPGPU) parallel programming, allowing his work to be carried through.
Needless to say, this has many implications. The military already has a huge budgetary plan that they thrust at him, and they want to design some sort of defensive installation that includes MSC. I speculated at this point that it could have to do with spotting missiles, a typical use I'd imagine; however, David wasn't sure how much he was allowed to talk about, so the subject changed.
We talked for a while about academia, and basically David had already reached his conclusion: academia was full to the brim with bullshit, and incapable of making anything, as it was too far removed from real-world application to do anything. We discussed Stanford and MIT, both of which he said were having trouble, as they hadn't published any real work in the field for a long time, just more bullshit.
In the long run my opinions have changed, and my understanding too. I also really want to try out some CUDA programming now, so I'll be purchasing a GTX 460 sooner or later.
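To give a sense of what that piece-by-sub-piece model looks like in practice, here's a minimal sketch of the kind of CUDA "hello world" I'd start with: a vector add where every element of the array gets its own GPU thread. This is purely illustrative (it has nothing to do with MSC itself, and the sizes and names are made up); it uses the standard CUDA runtime calls that work on the 4xx-series cards.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// One thread per element: each thread computes a single index of the result.
// This is the "piece by sub-piece" parallelism the post is talking about.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)              // guard: the last block may overshoot the array
        c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;              // 1M elements (arbitrary example size)
    const size_t bytes = n * sizeof(float);

    // Host-side data.
    float *ha = new float[n], *hb = new float[n], *hc = new float[n];
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device-side buffers, copied over explicitly.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough blocks of 256 threads to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);       // 1.0 + 2.0 = 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    delete[] ha; delete[] hb; delete[] hc;
    return 0;
}
```

The whole trick is in the launch line: instead of looping over a million elements, you ask the hardware for a million threads and let each one do its tiny piece.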
What I have learned is that ATI doesn't really have driver issues per se; in fact, their programming pipeline, and how they handle things on the development end, is what's killing them and their drivers. If they would shift over and meet Nvidia's standards, they would in fact do much better. (However, this could be financial suicide in the business world.) I also learned that Nvidia is really smashing academia with its research: its grounded-in-reality approach, as industry tends to take, found the solution in months, beating the 50 years of academic research on the topic.
I voted for Nvidia on this poll originally, but now I feel my vote has something behind it.