Perspectives April 1, 2015

The geek shall inherit the earth

The age of developer-defined infrastructure

If “software is eating the world,” then the meal will be prepared by developers.

Over the past several years, much has been written about the primacy of software engineers. The numbers bear it out: graduates with technical majors earn more out of college than their classmates, and the average developer salary has risen dramatically over the past few years. In fact, developers will soon be among the highest-paid employees in a company, and I mean every company, not just in Silicon Valley.

We are entering the age of developer-defined infrastructure (DDI). Historically, developers had limited say in many application technologies. Through the late 1990s and early 2000s, we effectively lived in a two-platform world of Microsoft .NET vs. Java, and Oracle was the default database. In the past several years, we have seen a renaissance in developer technologies and application infrastructure, from a proliferation of languages and frameworks (Go, Scala, Python, Swift) to new data infrastructure (Hadoop, MongoDB, Kafka, etc.). With the power of open source, developers can now choose the language, runtime, and database that make sense for their application.

Developers are not only making application infrastructure decisions, however; they are also making underlying cloud infrastructure decisions. They are determining not only where their applications will run (private or public clouds) but also how storage, networking, compute, and security should be managed. This is the age of DDI, and the IT landscape will never look the same again.
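To make the DDI idea concrete, here is a minimal sketch of a developer provisioning compute and storage directly in code rather than filing a ticket with an operations team. It assumes the AWS SDK for Python (boto3) with credentials already configured; the AMI ID, instance type, and bucket name are hypothetical placeholders, not details from this article.

```python
# Developer-defined infrastructure in miniature: compute and storage
# are created by the application developer, in code.
# Assumes boto3 is installed and AWS credentials are configured;
# the AMI ID and bucket name below are hypothetical placeholders.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")
s3 = boto3.client("s3", region_name="us-east-1")

# Compute: launch one small instance for the application tier.
instances = ec2.create_instances(
    ImageId="ami-12345678",   # hypothetical image ID
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)

# Storage: create a bucket for the application's data.
s3.create_bucket(Bucket="example-ddi-app-data")

print("Provisioned instance:", instances[0].id)
```

The point is less the specific API than who holds it: decisions about compute, storage, and the rest of the stack that once required an IT department are now a few lines in the developer's own codebase.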

Read more on VentureBeat.