The geek shall inherit the earth: The age of developer-defined infrastructure

If “software is eating the world,” then the meal will be prepared by developers.

Over the past several years, much has been written about the primacy of software engineers. The numbers bear this out: technical majors make more money coming out of college than their classmates, and the average salary for a developer has risen dramatically over the past few years. In fact, developers will soon be among the highest-paid employees in a company, and I mean every company, not just in Silicon Valley.

We are entering the age of developer-defined infrastructure (DDI). Historically, developers had limited say in many application technologies. During the 1990s, we effectively lived in a bilateral world of Microsoft .NET vs. Java, and we pretty much defaulted to Oracle as the database. In the past several years, we have seen a renaissance in developer technologies and application infrastructure, from a proliferation of languages and frameworks (Go, Scala, Python, Swift) to new data infrastructure (Hadoop, MongoDB, Kafka, etc.). With the power of open source, developers can now choose the language, runtime, and database that make sense for their application. But developers are not only making application infrastructure decisions; they are also making underlying cloud infrastructure decisions. They are determining not only where their applications will run (private or public clouds) but also how storage, networking, compute, and security should be managed. This is the age of DDI, and the IT landscape will never look the same again.

The roles of developer and system administrator used to be separate and well defined: developers made decisions about developer tools like source code management and issue tracking (Git, Jira), while system administrators (server admins, storage admins, network admins) managed production and set infrastructure standards. However, the move to clouds like Amazon Web Services (AWS) has given developers a choice in which infrastructure services they use.

How did we get here? Let’s take a quick look at the evolution of the data center.

The three ages of the data center

Physical-defined infrastructure (~1985 to 1999)
During the rise of the client/server era, corporations moved from mainframes to minicomputers to powerful server computers coupled with personal computers. This was the age when hardware design and hardware vendors drove IT strategy. Engineers debated CPU architectures (RISC vs. CISC, Power vs. x86 vs. SPARC), and the strategic vendors of this generation were companies like Sun Microsystems, IBM, and HP.

Software-defined infrastructure (~2000 to 2014)
In physical-defined infrastructure, software was usually paired with hardware. In the early 2000s, the Intel x86 architecture won out at the CPU layer, allowing servers and systems to standardize. Once hardware was standardized, the software ecosystem that grew up around these servers started to decouple the logical from the physical. Operating systems like Windows and Linux became the layer through which software interacted with hardware. Eventually, VMware pioneered the idea of software virtualization, which enabled IT administrators to render virtual computers, disks, and networks entirely in software. Riding the power of Moore's Law, VMware turned physical-defined infrastructure into software-defined infrastructure. You can trace the evolution from physical to virtual just by following how profit margins flowed from system vendors like Sun to the winners of the software-defined infrastructure age: VMware, Microsoft, Red Hat, and Intel as the de facto CPU standard. VMware, in turn, moved from virtualized compute to storage (vSAN) and then networking (NSX).

If VMware pioneered the idea of software-defined infrastructure, the web-scale giants like Google and Facebook perfected it. Living by the adage “software will eventually work, hardware will eventually fail,” they saw the value of using commodity hardware and software to make unreliable hardware reliable.

The impact of this software-defined data center can be seen in the fate of some of the leaders of the physical-defined infrastructure age: Dell went private, IBM sold its x86 server line to Lenovo, and HP is undergoing an identity crisis right now.

Developer-defined infrastructure (2015 to ????)
Welcome to the age of DDI, where developers decide how, what, and where their applications should run. DDI is the natural evolution of software-defined infrastructure. The power of turning hardware into software lies partly in separating the logical from the physical, but mainly in the fact that once hardware is represented in software, you can treat it like any other piece of code: you can move it, you can program it, you can write programs for it. For example, on AWS everything has an application programming interface (API) and can be programmed: storage, compute, networking, security, etc. Today, developers need to think like IT and operations, and IT administrators must enable developers to make these infrastructure choices rather than constrain them. With the rise of DevOps and cloud, developers are looking for technologies to build, run, and manage their applications that support DDI. If VMware was the platform for the last 15 years, then companies like Docker could be the platform for the next 15. In particular, Docker supports:

  • Programmability and portability: Infrastructure should be treated like any other piece of software. No longer tied to a physical location, Docker containers can easily be moved anywhere, from a laptop in a coffee shop to a cloud. Even more importantly, DDI like Docker can be programmed through its own API (see the sketch after this list).
  • Consolidation: This is the attribute of Docker that wins over most IT and system admins. Just as virtualization consolidated physical infrastructure by turning 10 physical servers into 10 virtualized servers on a single machine, Docker can further increase the consolidation ratio of server workloads while improving speed and performance.
  • Move to microservices: Companies like Twitter, Google, and Facebook have adopted microservices to build their next generation of applications. Microservices are easier to scale, update, and develop with a large engineering team.
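
To make the programmability point concrete, here is a minimal sketch, in Python, of what developer-defined infrastructure looks like in practice: compute is provisioned through the AWS API and an application is launched through the Docker API, entirely from code. It assumes the boto3 and docker Python packages, configured AWS credentials, and a running Docker daemon; the AMI ID, instance type, and container image are placeholders, not recommendations.

    # A sketch of developer-defined infrastructure: servers and application
    # runtimes are driven through APIs rather than tickets to an admin.
    import boto3
    import docker

    # 1. Infrastructure as code: provision a server through the AWS API.
    ec2 = boto3.resource("ec2")
    instances = ec2.create_instances(
        ImageId="ami-12345678",   # placeholder AMI
        InstanceType="t2.micro",  # placeholder instance type
        MinCount=1,
        MaxCount=1,
    )
    print("Launched instance:", instances[0].id)

    # 2. Application as code: run a container through the Docker API.
    client = docker.from_env()
    output = client.containers.run("alpine", "echo 'hello, DDI'", remove=True)
    print(output.decode().strip())

The specific calls matter less than the shape of the workflow: a developer stands up compute and ships an application in a few lines of code, with no storage, network, or server admin in the loop.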

The geek shall inherit the earth

What are the implications for vendors, developers, and IT professionals in the DDI era? Software vendors that don't understand or embrace developers will fail to stay relevant. Incumbents that sell to IT administrators, whether virtualization, storage, network, or security admins, will need to learn how to sell to developers. IT professionals need to think about how to enable developer choice, not restrict it. Finally, developers need to expand their definition of an application. Code isn't just a program or an app on a smartphone; code becomes everything from metal to management to the final pixel.

Jerry Chen is a partner at Greylock Partners. He sits on the board of Docker. Previously he was vice president of cloud and application services at VMware.
