Something I recently learned while watching some MIT open courseware lectures on algorithms: for a lot of the simulations scientists and game developers create, raw computing speed isn't the be-all and end-all.
That's something a lot of the authors covering this stuff seem to gloss over.
It really depends on the math and engineering principles applied when writing one. If the algorithm is well thought out, with scalability and other constraints in mind, it will handle rapidly growing inputs far better than a naive algorithm running on a supercomputer. You could be on a Pentium II and still beat the other machine once the input gets large enough.
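To make that concrete, here's a rough back-of-the-envelope sketch (my own made-up numbers, not from the lectures): give the "supercomputer" a 1,000x speed advantage but a naive O(n^2) algorithm, and give the slow machine a smarter O(n log n) one, then compare the work each has to do as the input grows.

```python
import math

# Hypothetical sketch: a supercomputer running an O(n^2) algorithm
# vs. a Pentium II-class machine running an O(n log n) algorithm.
# Assume the fast machine executes 1,000x more operations per second.
SPEED_ADVANTAGE = 1000

def cost_fast_machine(n):
    # O(n^2) operations, divided by the hardware speedup
    return n * n / SPEED_ADVANTAGE

def cost_slow_machine(n):
    # O(n log n) operations at base speed
    return n * math.log2(n)

for n in (10**3, 10**5, 10**7):
    fast = cost_fast_machine(n)
    slow = cost_slow_machine(n)
    winner = "slow machine wins" if slow < fast else "fast machine wins"
    print(f"n={n:>10,}: O(n^2)/1000 = {fast:.3g}, O(n log n) = {slow:.3g} -> {winner}")
```

At n = 1,000 the raw speed still carries the naive algorithm, but by n = 100,000 the Pentium II with the better algorithm is already ahead, and the gap only widens from there. No fixed hardware speedup can keep up with a worse growth rate forever.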
Developers pushing the PS3 beyond what were once thought to be its limits is a good example.
Last edited by Easy Peasy Lemon Squeezy; 12-09-2012 at 06:01 PM.