
Monday, 1 September 2025

Software Efficiency Matters as Much as Hardware for Sustainability

When we talk about making computing greener, the conversation often turns to hardware. Data centres have become far more efficient over the years. Power supply units that once wasted 40% of energy now operate above 90% efficiency. Cooling systems that once consumed several times the power of the servers themselves have been dramatically improved. The hardware people have delivered.

But as Bert Hubert argues in his talk “Save the world, write more efficient code”, software has been quietly undoing many of those gains. Software bloat has outpaced hardware improvements. What once required careful optimisation is now often solved by throwing more cloud resources at the problem. That keeps systems running, but at a significant energy cost.

The hidden footprint of sluggish software

Sluggish systems are not just an annoyance. Every loading spinner, every second a user waits, often means CPUs are running flat out somewhere in the chain. At scale, those wasted cycles add up to megawatt-hours of electricity. Studies suggest that servers are responsible for around 4% of global CO₂ emissions, on par with the entire aviation industry. That is not a small share, and it makes efficient software a climate issue.

Hubert points out that the difference between badly written code, reasonable code, and highly optimised code can easily span a factor of 100 in computing requirements. He demonstrates this with a simple example: generating a histogram of Dutch house numbers from a dataset of 9.9 million addresses.

  • A naïve Python implementation took 12 seconds and consumed over 500 joules of energy per run.
  • A straightforward database query reduced this to around 20 joules.
  • Using DuckDB, a database optimised for analytics, the same task dropped to just 2.5 joules and completed in milliseconds.

The user experience also improved dramatically. What once required a long wait became effectively instantaneous.
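
To make the gap concrete, here is a minimal sketch of the two extremes in Python, assuming a hypothetical CSV file addresses.csv with a house_number column (not Hubert's actual dataset or code):

    import duckdb
    from collections import Counter

    # Naive approach: stream the file through Python and count in a loop.
    # On ~10 million rows this keeps a CPU core busy for seconds.
    counts = Counter()
    with open("addresses.csv") as f:
        header = f.readline().rstrip("\n").split(",")
        col = header.index("house_number")
        for line in f:
            counts[line.rstrip("\n").split(",")[col]] += 1

    # Analytics-optimised approach: let DuckDB scan and aggregate the file.
    # The same histogram completes in milliseconds.
    histogram = duckdb.sql(
        "SELECT house_number, COUNT(*) AS n "
        "FROM 'addresses.csv' GROUP BY house_number ORDER BY n DESC"
    ).fetchall()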

From data centres to “data sheds”

The point is not just academic. If everyone aimed for higher software efficiency, Hubert suggests, many data centres could be shrunk to the size of a shed. Unlike hardware, where efficiency can be bought, software efficiency has to be designed and built. It requires time, effort and, crucially, management permission to prioritise performance over simply shipping features.

Netflix provides a striking example. Its custom Open Connect appliances deliver around 45,000 video streams at under 10 milliwatts per user. By investing heavily in efficiency, they proved that optimised software and hardware together can deliver enormous gains.

The cloud and client-side challenge

The shift to the cloud has created perverse incentives. In the past, if your code was inefficient, the servers would crash and force a rewrite. Now, organisations can simply spin up more cloud instances. That makes it too easy to ignore software waste and too tempting to pass the costs into ever-growing cloud bills. Those costs are not only financial, but also environmental.

On the client side, the problem is subtler but still real. While loading sluggish web apps may not burn as much power as a data centre, the sheer number of devices adds up. Hubert measured that opening LinkedIn on a desktop consumed around 45 joules. Scaled to hundreds of millions of users, even modest inefficiencies start to look like power plants.

Sometimes the situation is worse. Hubert found that simply leaving open.spotify.com running in a browser kept his machine burning an additional 45 watts continuously, due to a rogue worker thread. With hundreds of millions of users, that single design choice could represent hundreds of megawatts of wasted power globally.

Building greener software

The lesson is clear. Early sluggishness never goes away. If a system is slow with only a handful of users, it will be catastrophically wasteful at scale. The time to demand efficiency is at the start of a project.

There are also practical steps engineers and organisations can take:

  • Measure energy use during development, not just performance (see the sketch after this list).
  • Audit client-side behaviour for long-lived applications.
  • Incentivise teams to improve efficiency, not just to ship quickly.
  • Treat large cloud bills as a proxy for emissions as well as costs.
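
On Linux with an Intel CPU, for example, the first point can be approximated by reading the kernel's RAPL energy counter around a workload. A minimal sketch, assuming the standard intel-rapl powercap path (reading it may require root) and a placeholder workload:

    import time

    # Cumulative package energy in microjoules, exposed by Intel's RAPL driver.
    RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"

    def read_uj():
        with open(RAPL) as f:
            return int(f.read())

    def workload():
        # Placeholder for the code whose energy cost you want to measure.
        sum(i * i for i in range(10_000_000))

    e0, t0 = read_uj(), time.time()
    workload()
    e1, t1 = read_uj(), time.time()

    joules = (e1 - e0) / 1e6  # counter wrap-around ignored for brevity
    print(f"{joules:.1f} J in {t1 - t0:.2f} s (avg {joules / (t1 - t0):.1f} W)")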

As Hubert says, we may only be able to influence 4% of global energy use through software. But that is the same impact as the aviation industry. Hardware engineers have done their part. Now it is time for software engineers to step up.

You can watch Bert Hubert's full talk below, where he shares both entertaining stories and sobering measurements that show why greener software is not only possible but urgently needed. The PDF of the slides is here and his LinkedIn discussion is here.


Thursday, 17 October 2024

TechKnowledge Technology Stories (Series 1)

TechKnowledge is a series of Technology Stories looking at how technology has evolved over the years and how it will continue to evolve in the future. The series is targeted at young people looking to understand how technology has been evolving and how it will evolve further. It is our intention to make a ten-part series, but as yet only four parts are complete.

Part 1: 'Smaller, Faster, Cheaper and More…' looks at how technology has evolved by things getting smaller, faster, cheaper and much more. It investigates Moore’s law and how it has helped create a future technology roadmap.

Part 2: 'Connecting Everything Everywhere…' discusses different connectivity options available to connect various devices, gadgets and appliances to the internet. It highlights the fact that this is just the beginning, and everything that can be connected will eventually get connected.

Part 3: 'Satellites - Our Friends In The Sky…' explains how satellites act as our friends and helpers in the sky. It discusses how satellites are useful as a connectivity option, how they help us map and navigate, how we can use location-based services, how we can watch broadcast video or listen to broadcast radio, and, last but not least, how satellites are helping us observe and monitor the earth.

Part 4: 'Devices and Gadgets - Our Companions and Life Savers…' looks at the variety of electronic devices and gadgets we use in our everyday lives to make them more convenient and efficient, and to keep us connected. From smartphones and laptops to smart home appliances and wearable tech, these devices simplify tasks, enhance productivity, and provide instant access to information and communication. They help us manage work, stay in touch with loved ones, and access entertainment on the go. Gadgets like fitness trackers promote healthier lifestyles, while others automate household chores, saving time and energy. Overall, connected devices and gadgets have become essential tools in modern life, blending seamlessly into our routines and transforming how we live and interact.

The playlist of the videos is embedded below:

The slides can be downloaded from here.


Friday, 3 August 2012

Tech Laws we should all know about - #TechLaws

In many different events and conferences, these laws get quoted, so I decided to collect them all in one place.

Moore's law: The law is named after Intel co-founder Gordon E. Moore, who described the trend in his 1965 paper.

Moore's law is the observation that over the history of computing hardware, the number of transistors on integrated circuits doubles approximately every two years. The period often quoted as "18 months" is due to Intel executive David House, who predicted that period for a doubling in chip performance (being a combination of the effect of more transistors and their being faster).
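
As a back-of-the-envelope illustration (my own arithmetic, not from the original sources), here is what doubling every two years compounds to:

    # Doubling every two years: capability grows by 2**(years / 2).
    for years in (2, 10, 20, 40):
        print(f"{years:>2} years -> x{2 ** (years / 2):,.0f}")
    # 10 years ~ x32, 20 years ~ x1,024, 40 years ~ x1,048,576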

Gordon Moore himself predicts that Moore's Law, as applied to integrated circuits, will no longer be applicable after about 2020, when IC geometry will be about one atom thick. However, recent technology announcements about 3-D silicon, single-atom and spin transistors give another twenty years of conventional doublings before the electronics limit is reached. Inevitably, other technologies, such as biochips and nanotechnology, will come to the forefront to move the equivalent of Moore's Law inexorably forward.

See Also: Transistor Wars: Rival architectures face off in a bid to keep Moore's Law alive


Koomey's Law: The number of computations per joule of energy dissipated has been doubling approximately every 1.57 years. This trend has been remarkably stable since the 1950s (R² of over 98%) and has actually been somewhat faster than Moore's law. Jonathan Koomey articulated the trend as follows: "at a fixed computing load, the amount of battery you need will fall by a factor of two every year and a half."
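
A quick check (illustrative arithmetic, not Koomey's own figures) of what that doubling period implies:

    # Computations per joule double every 1.57 years (Koomey's law).
    print(f"Efficiency gain over a decade: x{2 ** (10 / 1.57):.0f}")  # ~x83
    # Equivalently, the battery needed for a fixed load halves every ~1.5 years:
    print(f"Battery needed after 6 years: {100 * 0.5 ** (6 / 1.5):.2f}%")  # 6.25%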

See Also: A New and Improved Moore's Law


Metcalfe's Law: Attributed to Robert Metcalfe, originator of Ethernet and founder of 3Com: the value of a network is proportional to the square of the number of nodes; so, as a network grows, the value of being connected to it grows quadratically, while the cost per user remains the same or even falls.

Within the context of social networks, many, including Metcalfe himself, have proposed modified models using n × log(n) proportionality rather than n² proportionality.
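
A quick comparison (illustrative numbers only) shows how differently the two models value a large network:

    from math import log

    # Value of an n-node network under the two proportionality models.
    for n in (100, 10_000, 1_000_000):
        print(f"n = {n:>9,}: n^2 = {n**2:.1e}, n*log(n) = {n * log(n):.1e}")
    # n^2 races ahead of n*log(n), which is why the latter was proposed
    # as a more realistic valuation for very large social networks.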

See Also: Wikipedia


Gilder's Law: proposed by George Gilder, prolific author and prophet of the new technology age - the total bandwidth of communication systems triples every twelve months (some refer to the period as eighteen months). New developments seem to confirm that bandwidth availability will continue to expand at a rate that supports Gilder's Law.
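
For a sense of how steep that curve is compared with Moore's, a small illustrative calculation (my own, not Gilder's):

    # Gilder: bandwidth triples every year; Moore: doubling every two years.
    for years in (1, 5, 10):
        bw, cpu = 3 ** years, 2 ** (years / 2)
        print(f"{years:>2} years: bandwidth x{bw:,} vs transistors x{cpu:,.0f}")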

See Also: Technology Needs for 40G–100G Network-Centric Operations & Warfare


Nielsen's Law: Network connection speeds for high-end home users would increase 50% per year, or double every 21 months. As a corollary, he noted that, since this growth rate is slower than that predicted by Moore's Law of processor power, user experience would remain bandwidth-bound.
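
The 21-month figure follows directly from the 50% annual growth rate; a one-line derivation:

    from math import log

    # 50% growth per year: doubling time = log(2) / log(1.5) years.
    t = log(2) / log(1.5)
    print(f"Doubling time: {t:.2f} years ~ {t * 12:.0f} months")  # ~21 months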


Cooper's Law: Cooper found that the ability to transmit different radio communications at one time and in the same place has grown at the same pace since Guglielmo Marconi's first transmissions in 1895. The number of such simultaneous communications theoretically possible has doubled every 30 months since then, for 104 years. This observation has been dubbed Cooper's Law.
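
Doubling every 30 months for 104 years compounds to an astonishing factor; a quick check of the arithmetic:

    # Doubling every 30 months from 1895 onwards, over 104 years.
    doublings = 104 * 12 / 30
    print(f"{doublings:.1f} doublings -> x{2 ** doublings:.2e}")  # ~3 x 10^12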

See Also: ArrayComm: Cooper’s Law


Edholm's Law of Bandwidth: Edholm sets out three categories of communications – wired, wireless and nomadic. Nomadic is a form of wireless where the communicator is stationary during the period of communications. According to Edholm’s Law, data rates for these three telecommunications categories increase on similar exponential curves, the slower rates trailing the faster ones by a predictable time lag.



[Chart: data rates for wired, nomadic and wireless communications plotted logarithmically against time]

The chart above shows data rates plotted logarithmically against time. When drawn like this, it is possible to fit straight lines to each of the categories. The lines are almost parallel, although nomadic and wireless technologies gradually converge at around 2030. For example, in 2000 2G delivered around 10 kbit/s, W-LANs connected to dial-up delivered 56 kbit/s, and the typical office local area network (LAN) provided 10 Mbit/s. Today, 3G delivers 100 kbit/s, a home wireless LAN with DSL or cable broadband access is about 1 Mbit/s, and typical office LAN data rates are 100 Mbit/s. Edholm's Law predicts that in 2010 3G wireless will deliver 1 Mbit/s, Wi-Fi connected via a faster backhaul 10 Mbit/s, and office networks 1 Gbit/s.

Edholm's Law overlaps with Gilder's on the fixed-bandwidth side and to some degree with Cooper's on the wireless side. But perhaps key is its prediction that wired and wireless will maintain a near-constant differential in data rate terms.


Shannon's Law (Shannon–Hartley theorem): In information theory, the Shannon–Hartley theorem gives the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise.

Considering all possible multi-level and multi-phase encoding techniques, the Shannon–Hartley theorem states that the channel capacity C, meaning the theoretical tightest upper bound on the information rate (excluding error-correcting codes) of clean (or arbitrarily low bit-error-rate) data that can be sent with a given average signal power S through an analog communication channel subject to additive white Gaussian noise of power N, is:

    C = B log₂(1 + S/N)

where
  • C is the channel capacity in bits per second;
  • B is the bandwidth of the channel in hertz (passband bandwidth in the case of a modulated signal);
  • S is the average received signal power over the bandwidth (in the case of a modulated signal, often denoted C, i.e. the modulated carrier), measured in watts (or volts squared);
  • N is the average noise or interference power over the bandwidth, measured in watts (or volts squared); and
  • S/N is the signal-to-noise ratio (SNR) or carrier-to-noise ratio (CNR) of the communication signal to the Gaussian noise interference, expressed as a linear power ratio (not as logarithmic decibels).
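
A worked example with assumed figures (a 20 MHz channel at 20 dB SNR, chosen purely for illustration):

    from math import log2

    B = 20e6                   # channel bandwidth in hertz (20 MHz)
    snr_db = 20                # signal-to-noise ratio in decibels
    snr = 10 ** (snr_db / 10)  # convert dB to a linear power ratio

    C = B * log2(1 + snr)      # Shannon-Hartley channel capacity
    print(f"C = {C / 1e6:.0f} Mbit/s")  # ~133 Mbit/s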


Finally,

Murphy's Law: Anything that can possibly go wrong, does.



Please feel free to add any others you may know of in the comments and if they are popular I will add them in the blog post.