While this is true, there is a point to having dynamically linked libraries. The problem is that the dominant desktop OS has only a very limited mechanism for versioning libraries (and packages in general; MSI is a step in the right direction, but it still doesn't handle dependencies, versioning, etc.).
A few years ago, back in the NT4 era, it was quite common that installing application Y - which overwrote some shared libraries in the system directory - disabled application X because of a version conflict.
The solution is for the installer to put the shared libraries in the application's own directory. But then what is the point of having shared libraries at all?
(once again, this is out of scope for this ng)
Right. At least I was taught (though I also had prior work experience) to write a custom memory allocator for a system that simply cannot be allowed to crash because a memory allocation failed.
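For the curious, here is a minimal sketch of that idea, assuming a fixed-size block pool is acceptable for the system in question; the class name and template parameters are made up for illustration, not taken from any real project. All memory is reserved up front, so a later allocation can only fail deterministically (by returning null), never by exhausting the heap mid-flight:

#include <cstddef>

// Fixed-size block pool: everything is reserved at construction time,
// so allocate() can never fail because the heap ran out at runtime.
template <std::size_t BlockSize, std::size_t BlockCount>
class FixedPool {
    static_assert(BlockSize >= sizeof(void*),
                  "each block must be able to hold a free-list link");

    struct Node { Node* next; };

    // Raw storage, reserved once; aligned for any ordinary type.
    alignas(std::max_align_t) unsigned char storage_[BlockSize * BlockCount];
    Node* free_head_ = nullptr;

public:
    FixedPool() {
        // Thread every block onto the free list once, at startup.
        for (std::size_t i = 0; i < BlockCount; ++i) {
            Node* n = reinterpret_cast<Node*>(&storage_[i * BlockSize]);
            n->next = free_head_;
            free_head_ = n;
        }
    }

    // Returns nullptr on exhaustion instead of throwing, so the caller
    // can handle the condition gracefully rather than crashing.
    void* allocate() {
        if (free_head_ == nullptr) return nullptr;
        Node* n = free_head_;
        free_head_ = n->next;
        return n;
    }

    void deallocate(void* p) {
        Node* n = static_cast<Node*>(p);
        n->next = free_head_;
        free_head_ = n;
    }
};

With something like FixedPool<64, 1024> you pay for the whole 64 KB at startup, which is exactly the trade such systems want: all the allocation risk is moved to initialization, where failure is still recoverable.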
For instance, many people think that garbage collection in Java must be very inefficient; actually it is quite the opposite: the garbage collector has global information about memory usage, fragmentation and so on. It is likely able to do a better job than the average programmer doing manual memory allocation/deallocation.
I have a four-year-old computer, and I'm happy with it running Ubuntu 7.10. I bought it with 1 gigabyte of memory from the start, since memory was quite cheap at the time. XP is definitely heavier than Ubuntu with the same number of concurrent applications. Vista I wouldn't mention in the same sentence..
Actually, there are applications that really do require huge amounts of memory: image processing, CAD, etc. It is not just M$ "bloat" (not that it doesn't exist..).