In celebration of Earth Day, we’d like to talk about the emerging concept of application sustainability. Application sustainability looks at the overall energy consumed both during the development of an application and its execution in production. The thesis is that we can help the environment by minimizing energy-intensive development and deployment models.
The Green Software Foundation maintains the Principles of Green Software Engineering, first published in 2019. These outline principles such as, “Build applications that are carbon efficient.” But what factors really determine whether an application is sustainable? Some of you may have read about the sustainability benefits of programming in Rust, but language choice is only part of the formula.
Hardware vs. code
Back in the mainframe days, a big machine sat on a concrete floor in the middle of a room. Concrete flooring was the only real facility requirement back then. COBOL, a mainframe programming language, was widely used; in fact, every time you withdraw money at an ATM, a COBOL program might still be running somewhere to make it happen.
By the time we got to client-server deployments, machines were getting faster and smaller, and we engineered the hardware to compensate for code that taxed processing power, which drove up companies’ electricity costs. Rack density was a big change, too: where we previously fit a few machines in a rack, we now fit hundreds, making the cooling of those machines the top-line cost in data centers.
Nowadays, if you’re a cloud-first or cloud-only customer, the cloud provider is everything. And it has the unenviable task of proving that it’s driving energy consumption (measured in terawatt-hours, or TWh) down. Yet data center energy consumption is up slightly, even though hardware efficiency has improved steadily over the years. So if hardware vendors are making their machines more efficient (and they are), why doesn’t that translate into decreased energy usage? Put another way: hyperscalers have kept energy consumption growth modest despite enormous growth in computation, so why are we still talking about TWh numbers that are worrying from an energy consumption and sustainability perspective? The answer is that we have to look at the application itself.
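To see why, consider a back-of-the-envelope calculation, shown here as a minimal Python sketch. The growth and efficiency figures are illustrative assumptions, not data from this post; the point is simply that when compute demand grows faster than hardware efficiency improves, total energy consumption still rises.

```python
# Back-of-the-envelope sketch with hypothetical numbers.
# Energy consumed = compute demand / hardware efficiency.
demand_growth = 10.0     # assume compute demand grows 10x over a decade
efficiency_gain = 8.0    # assume hardware delivers 8x more work per watt

energy_multiplier = demand_growth / efficiency_gain
print(f"Total energy changes by {energy_multiplier:.2f}x")  # 1.25x: up slightly
```

Efficiency gains offset most, but not all, of the growth in demand; whatever the software wastes on top of that shows up directly in the TWh numbers.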
Higher abstraction = higher energy cost, but there are trade-offs
So, what makes a sustainable application? The early promoters have focused on the code, pointing out that lower-level programming languages like Rust and C are more efficient than Java and much more efficient than Perl, Python, or Ruby. That’s bad news for low-code platforms, which bring an ever-greater level of abstraction as they bypass professional developers entirely and cater to citizen developers. The reality is that by bringing the application closer to the business user (good), we’re making an implicit choice to lose control over how that application uses energy (not so good). But the sustainability of low code or no code will depend on the high-code developers at the vendors building the platforms themselves; their libraries and abstractions could be more or less efficient.
Indeed, low code isn’t going anywhere. When skills shortages make experienced developers hard to find, low-code developers can quickly step in and build apps to meet business requirements. We’re not going to teach citizen developers Rust. And even if we tried, the energy costs of doing so, and of having a lot of inexperienced Rust developers trying to code, may very well cancel out any sustainability improvements from the language alone. There are a lot of ways to develop energy-hogging applications, and choosing a highly abstracted language is only one of them, as the sketch below illustrates.
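To make that concrete, here is a minimal sketch, deliberately written in Python since it sits at the “less efficient” end of the language spectrum above. It compares two implementations of the same task, finding duplicate records, using a quadratic list scan versus a linear set lookup. The function names and data sizes are hypothetical; the point is that an algorithmic choice swings CPU time, and therefore energy, by orders of magnitude within a single language.

```python
import time

def find_duplicates_slow(records):
    """O(n^2): for each record, linearly scan everything seen so far."""
    seen, dupes = [], []
    for r in records:
        if r in seen:          # linear scan of a list
            dupes.append(r)
        else:
            seen.append(r)
    return dupes

def find_duplicates_fast(records):
    """O(n): constant-time membership checks against a set."""
    seen, dupes = set(), []
    for r in records:
        if r in seen:          # hash lookup
            dupes.append(r)
        else:
            seen.add(r)
    return dupes

if __name__ == "__main__":
    data = list(range(10_000)) * 2   # every value appears twice

    for fn in (find_duplicates_slow, find_duplicates_fast):
        start = time.perf_counter()
        fn(data)
        elapsed = time.perf_counter() - start
        # Elapsed CPU time is a rough proxy for energy on the same machine.
        print(f"{fn.__name__}: {elapsed:.3f}s")
```

On a typical laptop, the quadratic version takes seconds while the linear one finishes in milliseconds. Rewriting the slow version in Rust would buy a constant-factor speedup, but it would still be doing quadratic work.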
This blog post is part of Forrester’s Earth Day 2022 series. For more Forrester insights on sustainability, see the full set of Forrester’s climate action blogs.
This post was written by Principal Analyst Sandy Carielli and it originally appeared here.