Sorting is actually a good example of what I mean. It's incorrect to assume that an algorithm with better asymptotic complexity is always faster than a simpler one. The implementation of a simple algorithm can be optimised more and so wins in most cases (unless the number of elements is huge). QuickSort is the best-known example of this: while it is theoretically slower than the guaranteed O(N log N) sorting algorithms, in practice it is much faster.
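To sketch what that looks like in practice (illustrative C, not from the original discussion): insertion sort is O(N^2), yet its tiny constant factor and trivial inner loop make it the faster choice for small inputs, which is why production QuickSorts typically hand small partitions over to something like this.

#include <stddef.h>

/* Insertion sort: O(N^2) in theory, but the simple inner loop and
   sequential memory access make it faster than asymptotically better
   sorts for small N. Production quicksorts commonly switch to it once
   a partition drops below a small threshold (roughly 8-32 elements). */
static void insertion_sort(int *a, size_t n)
{
    for (size_t i = 1; i < n; i++) {
        int key = a[i];
        size_t j = i;
        while (j > 0 && a[j - 1] > key) {
            a[j] = a[j - 1];   /* shift larger elements right */
            j--;
        }
        a[j] = key;            /* drop the element into place */
    }
}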
There is easily a factor of 2 to be had in implementation efficiency - after you've chosen the most appropriate algorithms. So optimizing the implementation makes sense - I consider it an important aspect of writing high-quality code. It doesn't mean spending a lot of time micro-optimizing code. It means putting more effort into the design of the implementation, making it as simple as possible. The simplicity translates into fewer lines of code, and thus smaller and faster code.
A bonus is that the extra time spent on the design usually results in a better understanding of the problem and so fewer bugs. In my experience there is a high correlation between the amount of code someone writes for a given problem and its efficiency and quality. As a real-world example, how many lines of code do you need to calculate the number of bits used by a value (i.e. integer log2)? I've seen an experienced programmer write 20 lines of code with 4 mistakes (a few of which were not obvious or easily found by testing). The really worrying bit is that this was in software destined for a well-known US attack helicopter...

Wilco
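For reference, a compact answer fits in a handful of lines. A minimal C sketch (assuming a 32-bit unsigned input; the function name is just illustrative):

#include <stdint.h>

/* Number of bits needed to represent x, i.e. integer log2(x) + 1.
   Returns 0 for x == 0. */
static unsigned bit_width(uint32_t x)
{
    unsigned n = 0;
    while (x != 0) {
        n++;
        x >>= 1;
    }
    return n;
}

On GCC or Clang the loop can be replaced by 32 - __builtin_clz(x) for non-zero x.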