Power management in embedded software
Power consumption by embedded devices is a critical issue. There is constant pressure to extend battery life and/or reduce the environmental impact of a system. Historically, power management was purely a hardware issue, but those days are past. In modern embedded systems, software takes increasing responsibility for power management. This article reviews how power management is achieved while a device is operating and looks at the techniques employed to minimize power consumption when a device is inactive.
There are broadly two contexts in which a device's power consumption may be considered: when it is in use and when it is idle. In the former, active power management is the key requirement; in the latter, the deployment of low power CPU modes may be advantageous.
Software Power Management
During use, there are a couple of measures that software can take to keep power consumption to a minimum:
- Switch off peripherals when they are not in use.
- Adjust the frequency and voltage of the CPU according to the current performance requirements (this is "Dynamic Voltage and Frequency Scaling" - DVFS).
It is quite obvious that the best way to save energy with any electrical or electronic device is to simply switch it off. So, it is logical to design electronic systems so that peripherals and subsystems may be switched on and off by the software, as required.
This facility is not as simple as it sounds, as some types of peripheral – like a network interface, for example – take a period of time to configure when they are switched on. This delay may be unacceptable if the peripheral is constantly switched on and off. There are also situations where a peripheral may continue transferring (sending or receiving) data after the CPU (i.e. the software) has finished addressing it; a premature power down would result in data loss. Of course, these are all details that would be accommodated in a power-aware device driver.
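A power-aware driver typically handles both concerns with reference counting and a deferred power-down. The sketch below is purely illustrative – the flags stand in for real clock-gating and power-domain registers, and the names (`uart_power_up` and so on) are invented for this example:

```c
#include <assert.h>
#include <stdbool.h>
#include <stdio.h>

/* Stand-ins for hardware state; a real driver would write to the
   peripheral's clock-gating / power-domain control registers. */
static bool uart_powered  = false;
static int  uart_refcount = 0;
static bool uart_tx_busy  = false;   /* transfer still in flight? */

static void uart_power_up(void)
{
    if (uart_refcount++ == 0) {
        uart_powered = true;         /* enable clock and power domain */
        printf("UART powered up\n");
    }
}

static void uart_power_down(void)
{
    assert(uart_refcount > 0);
    if (--uart_refcount == 0) {
        /* Defer the actual power-down until any outstanding transfer
           has drained, so no data is lost. */
        while (uart_tx_busy) {
            /* in a real driver: block on a transfer-complete interrupt */
        }
        uart_powered = false;        /* gate clock, drop power domain */
        printf("UART powered down\n");
    }
}
```

The reference count lets several clients share the peripheral: it is only powered down when the last user releases it, and only after any in-flight transfer completes.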
Dynamic Voltage and Frequency Scaling
To a software engineer, it is not immediately obvious how CPU voltage and clock frequency affect power consumption. Broadly speaking, the lower the frequency of operation, the lower the power consumption. This can be thought of in terms of how much "work" a given piece of software can do. For example, imagine that a CPU needs to execute 100,000 instructions of some software to get a job done and this needs to be performed every second. If the CPU were running at a clock frequency that enabled it to execute a million instructions per second, it would be capable of doing ten times the required work. So, lowering the clock frequency by a factor of ten – to deliver 100,000 instructions per second – matches performance to requirements and optimizes power consumption.
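The matching of throughput to requirement can be sketched as a simple table lookup. The frequency table below is invented for illustration, and the code assumes, for simplicity, one instruction per clock cycle:

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical set of clock frequencies the hardware supports,
   ordered lowest first (in kHz). */
static const unsigned freq_table_khz[] = { 100, 500, 1000, 24000 };
#define NUM_FREQS (sizeof freq_table_khz / sizeof freq_table_khz[0])

/* Pick the lowest frequency whose instruction throughput still covers
   the workload (required instructions per second). */
static unsigned pick_frequency_khz(unsigned required_ips)
{
    for (size_t i = 0; i < NUM_FREQS; i++) {
        if (freq_table_khz[i] * 1000u >= required_ips)
            return freq_table_khz[i];
    }
    /* Workload exceeds even the top speed; run flat out. */
    return freq_table_khz[NUM_FREQS - 1];
}
```

For the article's example of 100,000 instructions per second, this selects the 100 kHz entry rather than a clock ten times faster; a DVFS framework would then also drop the core voltage to the lowest level valid at that frequency.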
Low Power Modes
When a device is not in use, it may be switched off entirely. This requires little in the way of software support, though some devices might need some status information saved on power down. The only problem is that a fully powered down device may take some time to start up. Even with a lightweight RTOS, modern large applications can take many seconds to boot. The alternative to power down is some kind of sleep mode.
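Saving that status information might look like the following sketch, where a byte buffer stands in for battery-backed RAM or flash, and the state structure, checksum, and function names are all invented for illustration:

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Illustrative device state that must survive a full power-down. */
struct saved_state {
    uint32_t sample_count;
    uint16_t last_reading;
    uint16_t checksum;       /* simple integrity check */
};

static uint8_t nvram[64];    /* stand-in for battery-backed RAM or flash */

static uint16_t state_checksum(const struct saved_state *s)
{
    /* Trivial additive checksum over the payload fields. */
    return (uint16_t)(s->sample_count + s->last_reading);
}

static void state_save(const struct saved_state *s)
{
    struct saved_state copy = *s;
    copy.checksum = state_checksum(s);
    memcpy(nvram, &copy, sizeof copy);
}

/* Returns 1 if the restored data passes the integrity check. */
static int state_restore(struct saved_state *s)
{
    memcpy(s, nvram, sizeof *s);
    return state_checksum(s) == s->checksum;
}
```

The checksum guards against restoring garbage after a first-ever boot or a corrupted write, in which case the device would fall back to default settings.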
We are all familiar with the two sleep modes used by most laptops:
- Standby - The CPU and peripherals are shut down, but power is maintained to RAM. This mode has the advantage that wake-up is very fast, but the downside is that power continues to be drawn, so there is a limit on how long a device can be in standby.
- Hibernate - Data is written out to (disk) storage and the system is shut down. This mode has the advantage that there is no ongoing power drain, so the hibernated state can be sustained indefinitely. However, wake-up takes longer, as data needs to be copied back into RAM, but this is still much faster than a cold boot.
Increasingly, embedded processors incorporate similar sleep modes.
Power Management Implementation
With all aspects of software development (and hardware too), it is never really wise to "reinvent the wheel". If someone already has an effective solution to a problem or an implementation of an algorithm, it makes no sense to code it yourself; reusing existing intellectual property is much more cost effective. Power management is well understood, so implementations are readily available.
Real Time Operating Systems
Although this functionality may be implemented in application code, doing so is cumbersome and not very logical. It makes much more sense for the operating system to include a power management framework: the correct operation of drivers, in particular, may be drastically affected by power saving measures, and the OS can readily manage this. This is how power management is implemented in a modern power-aware RTOS.
Use Cases and Operating Points
An RTOS cannot simply power-optimize application code on its own, as it is unaware of the code's performance requirements, and it is execution performance that primarily influences energy consumption.
A common approach is for the developer to analyze the application and define a number of “use cases” – specific situations where a device is used, which might be considered “states”. For example, a handheld measuring device might have a list of use cases like this:
- Making a measurement
- Displaying data
- Uploading data
Each use case demands a specific level of performance and a certain combination of available resources (peripherals etc.). Performance may be defined by a particular voltage/frequency combination; these combinations are called "operating points". Each use case is also assigned an operating point.
The RTOS power management framework takes the use case information as input and, hence, can determine how to regulate power at any given moment.
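For the handheld measuring device above, that input might be a table like the following sketch. The enum values, bitmasks, and figures are all invented for illustration, not the API of any particular RTOS:

```c
#include <assert.h>
#include <stdint.h>

/* Use cases for the hypothetical handheld measuring device. */
enum use_case { UC_MEASURE, UC_DISPLAY, UC_UPLOAD };

struct operating_point {
    unsigned freq_mhz;       /* CPU clock */
    unsigned millivolts;     /* core voltage */
    uint32_t peripherals_on; /* bitmask of powered peripherals */
};

#define PERIPH_ADC     (1u << 0)
#define PERIPH_LCD     (1u << 1)
#define PERIPH_NETWORK (1u << 2)

/* One operating point per use case, chosen by the developer at design
   time and handed to the RTOS power management framework. */
static const struct operating_point op_table[] = {
    [UC_MEASURE] = {  50,  900, PERIPH_ADC },
    [UC_DISPLAY] = {  20,  850, PERIPH_LCD },
    [UC_UPLOAD]  = { 100, 1100, PERIPH_NETWORK },
};

static const struct operating_point *set_use_case(enum use_case uc)
{
    /* A real framework would program the clocks, the voltage regulator
       and the peripheral power domains here; this sketch just returns
       the selected operating point. */
    return &op_table[uc];
}
```

When the application signals a use-case transition, the framework looks up the corresponding operating point and reconfigures the hardware accordingly.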
There are a variety of reasons why the power consumption of a device needs to be optimized, but among the most common incentives is the requirement to meet user expectations. For example, a wearable medical monitoring device would need to run for more than 18 hours on a single charge, as a user would not find it acceptable to have to recharge the device during the course of the day.
User expectations are not always obvious and can cause surprises. It is common to hear smart phone users complaining that their device is too power hungry and they would like the battery life that their simple handset of the 1990s offered. However, here is a transcript (almost word for word) of a real conversation, which shows that functionality can trump convenience (for some people, anyway):
"Hey guys - I got a new phone."
"What's it like?"
"It's just like an iPhone, but much cheaper and even better."
"Better? How?"
"Yes. It can take two SIM cards, so I can use the phone for both business and personal calls. No more carrying two phones."
"Cool. Where did you get it?"
"On eBay. It came from China."
"Sounds good. Any downsides?"
"Well ... it is a bit power hungry."
"How bad? What's the battery life?"
"About three hours."
"But it's OK. They supply two additional batteries. It's really great."
Clearly, some research into user experiences might be fruitful when setting the power budget for a design …
Colin Walls has over thirty years' experience in the electronics industry, largely dedicated to embedded software. A frequent presenter at conferences and seminars and author of numerous technical articles and two books on embedded software, Colin is an embedded software technologist with Mentor Embedded [the Mentor Graphics Embedded Software Division], and is based in the UK. His regular blog is located at: http://blogs.mentor.com/colinwalls. He may be reached by email at firstname.lastname@example.org