For years I've had the graph below in my head. I came up with it independently, but it's too screamingly obvious not to have been discovered, named, and expounded upon in the project management literature. The question about software architectures made me think of it.
So -- anyone know if it has a common name in the project management world?
Basically, it exemplifies the notion that a simple tool requires little effort to start doing useful work on a simple task -- but as the task gets more complex, the incremental effort required to get more results climbs steeply. Contrast this with a more complex tool that requires a large up-front investment in effort (and perhaps training) before you can use it successfully at all, but whose incremental effort grows much more slowly as the task gets more complex.
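A minimal sketch of the idea, assuming each tool's total effort is roughly linear in task complexity (all of the numbers here are invented purely for illustration):

```python
# Hypothetical model: total effort = startup cost + slope * task complexity.
# The startup and slope values below are made up for illustration only.

def effort(startup, slope, complexity):
    """Total effort to handle a task of the given complexity with one tool."""
    return startup + slope * complexity

def simple_tool(c):
    # Cheap to pick up, but effort climbs steeply with the task.
    return effort(startup=1, slope=4, complexity=c)

def complex_tool(c):
    # Big up-front investment, shallow slope afterwards.
    return effort(startup=10, slope=1, complexity=c)

# The crossover is where the two lines meet: 1 + 4c = 10 + c  ->  c = 3.
crossover = (10 - 1) / (4 - 1)
print(crossover)  # 3.0

# Below the crossover the simple tool wins; above it, the complex one does.
for c in (1, 3, 5):
    print(c, simple_tool(c), complex_tool(c))
```

The interesting decision point is that crossover: if you're confident the task stays left of it, the simple tool is the right call; if the task will grow past it, the up-front investment pays off.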
I use this concept to help me make decisions between C vs. C++, sometimes assembly vs. C (if it's a _really small_ processor), task loop vs. RTOS, polled vs. interrupt, buying modules vs. building my own, etc.
I also see this blithely ignored, often pathologically, by individuals or management who misunderstand the gulf between getting a lab-pig prototype working and getting a finished product out the door. A tool gets chosen because it spins something up that looks hopeful in the lab, and then calls to dispense with that tool of limited scope are ignored because "it just works so well".
So -- what is the graph _called_?
  e ^             ,´
  f |           ,´  (simple tool)
  f |         ,´
  o |       ,´          __--''  (complex tool)
  r |     ,´     __--''
  t |   ,´__--''
    +-,´------------------------->
            task complexity