Low-Power Computing Is a New Trend!

Because I cover the intersection of IT and energy, I go to many meetings and conferences. I cover software, hardware, applications, middleware, operating systems, and services, as long as they relate to my coverage area. Recently, I was at ARM TechCon in Santa Clara, CA. ARM has become a dominant standard chip architecture, used mostly in embedded systems, but in recent years it has been moving into the server market in direct competition with x86. AMD plans to ship a server based on the 64-bit ARM architecture sometime next year, following its acquisition of SeaMicro earlier this year. In any event, the conference attracted a lot of participants.

Although I was involved in software development before and touched the embedded world a little, I am not an expert in the field. However, when I found out that Prof. Jonathan Koomey was one of the keynote speakers at the conference, I decided to attend. He is a well-known researcher on the energy used by computing, and his work was cited in the 2007 EPA report on power usage by US data centers.

Prof. Jonathan Koomey

His point was to show how computing has improved in terms of the number of computations per kWh. This is defined as:

computations per kWh = [number of computations per hour at full load] / [measured electricity consumption per hour at full load, in kWh]

Note that this metric applies only at full load; in response to a few questions from the audience, he made clear that he did not measure anything other than full load.
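
To make the metric concrete, here is a minimal sketch in Python with made-up numbers (the workload and power figures are purely illustrative, not from Koomey's measurements):

```python
# Hypothetical illustration of the computations-per-kWh metric.
# The workload and power figures below are invented for the example.

def computations_per_kwh(computations_per_hour: float, kwh_per_hour: float) -> float:
    """Computations per kWh at full load = work done / energy consumed."""
    return computations_per_hour / kwh_per_hour

# Example: a server completing 3.6e12 computations in one hour at full load
# while drawing 400 W, i.e. consuming 0.4 kWh in that hour.
work = 3.6e12    # computations completed in one hour at full load
energy = 0.4     # electricity consumed in that hour, in kWh

print(f"{computations_per_kwh(work, energy):.2e} computations per kWh")
# -> 9.00e+12 computations per kWh
```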

In contrast to this measure, Moore’s law states:

the number of transistors on integrated circuits doubles approximately every two years

Koomey’s metric may not cover every case, but it is a bit more scientific than Moore’s. Using this metric, he analyzed the historical data and concluded that the doubling time for performance per computer was about 1.5 years in the PC era; that is, performance doubled every 1.5 years, which translates to roughly 100 times better performance per decade, a trend that has held from the 1940s to the present. This ushered us into the laptop/tablet/mobile era, and then into the age of low-power computing and sensors, where low power matters more than high efficiency. He showed a few interesting examples of devices that harvest the power they need from digestive fluid (in the case of a tiny device embedded in a medicine capsule) or from stranded electric signals such as TV and radio broadcasts, wireless and cellular signals, and other electromagnetic waves (this works in urban areas only). Other power sources for these new tiny devices include heat, light, motion, and blood sugar. Of course, personal computing will not go away, but this is exciting.
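
The "100 times better per decade" figure mentioned above is just the compounding of a 1.5-year doubling time; a quick sketch of the arithmetic (assuming the doubling time holds steadily over the decade) looks like this:

```python
# How a 1.5-year doubling time compounds into improvement per decade.
doubling_time_years = 1.5
years = 10

factor = 2 ** (years / doubling_time_years)
print(f"Improvement over {years} years: {factor:.1f}x")
# -> 101.6x, i.e. roughly 100 times better per decade
```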

Now the next question is: how far can we push this trend? What is the limit? Is the sky the limit? Koomey referred to Prof. Richard Feynman’s work and said that, at this rate, the theoretical limit would not be reached until the year 2041. That is another 30 years! In processor development, we seem to have hit a limit on the number of transistors we can put on a chip because of the heat-dissipation problem, but that is not the theoretical limit discussed here; a different way to dissipate the heat will be found, and the theoretical limit still holds. How much more progress will we make before then? I do not think I will be around to see it, but it is really intriguing.
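
To get a feel for what a limit around 2041 implies, here is a back-of-the-envelope sketch. It assumes the talk took place around 2011 and that the roughly 1.5-year doubling continues unbroken; both are my assumptions for illustration, not figures from the talk:

```python
# Back-of-the-envelope: how much headroom a 2041 limit implies.
# ASSUMPTIONS: talk given around 2011, steady 1.5-year doubling time.
talk_year = 2011
limit_year = 2041
doubling_time_years = 1.5

doublings_left = (limit_year - talk_year) / doubling_time_years
headroom = 2 ** doublings_left
print(f"{doublings_left:.0f} doublings left, about {headroom:.1e}x more improvement")
# -> 20 doublings, i.e. roughly a millionfold improvement still ahead
```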

He summarized his talk in the following slide.

Zen Kishimoto

About Zen Kishimoto

Seasoned research and technology executive with varied functional expertise, including roles as analyst, writer, CTO, VP of Engineering, and general manager, as well as in sales and marketing, across diverse high-tech and cleantech industry segments including software, mobile and embedded systems, Web technologies, and networking. His current focus and expertise are in applying IT to energy, such as the smart grid, green IT, building and data center energy efficiency, and cloud computing.
