The Rise of Developeronomics
I have some thoughts about Venkatesh Rao's Forbes article, "The Rise of Developeronomics". The article, in brief, argues that "software is now the core function of every company, no matter what it makes," and that, as "software eats the world," maintaining relationships with excellent software developers is a prerequisite for every firm's survival.
One of the article's insights is, "while other industries have come up with systems to (say) systematically use mediocre chemists or accountants in highly leveraged ways, the software industry hasn’t." This is certainly true, and the most successful firms realize it. Again and again, I've worked for companies that try to save money, or accelerate development, by adding teams of mediocre (typically offshore) developers to a staff of great hackers. It almost never works, either because very few managers know how to use mediocre developers efficiently, or because it's impossible.
But I'm not sure that will always be so. As I read the article, I kept thinking, "each year we write software that prevents us from having to write more software." WordPress means we don't have to build CMSes anymore. Hadoop means we don't need to spend months writing ETL pipelines the way we did a few years ago. MongoDB makes it much easier to create and deploy a scalable data store. The list goes on. Won't there come an inflection point when we've made so much software that the need for new code levels off?
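To make that concrete, here's roughly what "MongoDB makes it easier" looks like in practice. This is a minimal sketch using the pymongo driver; it assumes a MongoDB server running locally, and the database and collection names are made up for illustration. Not long ago, even this much persistence typically meant designing a schema and writing storage code by hand.

```python
# Minimal sketch: a working, queryable data store in a handful of lines.
# Assumes a local MongoDB server; "demo" and "events" are hypothetical names.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["demo"]

# No schema to design up front: just insert documents.
db.events.insert_one({"user": "alice", "action": "login"})
db.events.insert_one({"user": "bob", "action": "signup"})

# Index and query with one call each.
db.events.create_index("user")
print(db.events.find_one({"user": "alice"}))
```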
And yet each time we discover a new thing software can do (mobile apps, social networks, big data, ...), it accelerates the growth of demand for software. I think the article is probably roughly right about the trend for the foreseeable future. Carlo Cabanilla pointed out to me on Facebook that "as more and more software exists to solve common problems, Ops will become more and more valuable, because you'll always need a scalable, cost-efficient way to manage these things. You can have the best app in the world, but if it's always going down, it's like it doesn't exist." He should know: he works at DataDog, which is trying to solve exactly this problem.
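Carlo's point is easy to see in miniature. Below is a toy uptime check using only Python's standard library; the health-check URL is hypothetical, and a real monitoring platform layers alerting, retries, history, and scale on top of this primitive loop.

```python
# Toy version of the uptime monitoring that ops platforms automate at scale.
# The health-check URL is hypothetical.
import time
import urllib.request

def is_up(url: str) -> bool:
    """Return True if the service answers with an HTTP 2xx status."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return 200 <= resp.status < 300
    except OSError:  # covers URLError, HTTPError, timeouts, refused connections
        return False

for _ in range(3):  # a real monitor would loop forever and page someone
    print("UP" if is_up("http://example.com/health") else "DOWN")
    time.sleep(60)
```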
Ken Young, who's solving big-data problems over at Mortar Data, thinks that "until the world has been faithfully modeled in software to the last degree, there will be new need for software to predict and manipulate the real world in all its complexity. And since we are no closer to understanding the world than we were in Newton's time (or so it seems)...."
Right, and even if we did model the whole world, we'd need yet another system to model all the software we've written, so we know whether it's running correctly and can keep it running correctly. And as Turing showed with the halting problem, no program can decide in general whether another program will even halt, let alone run correctly, so we can never finish that job; we can only get asymptotically close.
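For the curious, here is the shape of Turing's argument as a Python sketch. The `halts` function is hypothetical by construction: the whole point of the proof is that no such function can exist.

```python
# Sketch of Turing's diagonal argument. `halts` is hypothetical:
# the proof shows no correct, always-terminating version can be written.

def halts(program, arg):
    """Hypothetical oracle: True iff program(arg) eventually halts."""
    raise NotImplementedError("Turing proved this cannot be implemented")

def troublemaker(program):
    """Do the opposite of whatever the oracle predicts about
    running `program` on itself."""
    if halts(program, program):
        while True:  # predicted to halt, so loop forever
            pass
    else:
        return  # predicted to loop forever, so halt immediately

# Now ask about troublemaker(troublemaker). If halts returns True, it
# loops forever; if halts returns False, it halts. Either answer is
# wrong, so no such halts can exist, and "is this software running
# correctly?" can never be fully decided by more software.
```

Which is why monitoring, testing, and verification are approximations by necessity, not by laziness.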