Researchers from North Carolina State University (NCSU, Raleigh) say they have created software that manages all voltage regulation in an embedded system solely on the applications processor unit (APU), without resorting to expensive smart switch-mode power supplies (SMPSes) that carry their own microcontroller, or to application processors running at higher speeds than necessary just to guarantee proper performance.
Today's embedded systems use a dedicated microcontroller—or an excessively fast applications processor—to guarantee that the supply voltage changes properly for under- and over-clocking. Using interrupt-driven software on the APU, the researchers claim to have increased reliability at a lower bill-of-materials (BOM) cost and with lower power consumption—90 to 95 percent efficiency with component costs under 50 cents—according to professors Alexander Dean and Subhashish Bhattacharya of NCSU and Avik Juneja, now a power management architect at Intel Corp. (Hillsboro, Ore.).
“We show how to integrate switch-mode power supply control software into the application processor's software to reduce system cost and decrease circuit board space using real-time software algorithms,” the authors summarize in their free paper. “Our methods apply to a wide range of software task schedulers, from simple interrupt-based foreground/background systems to sophisticated preemptive real-time kernels to real-time operating systems [RTOSs] resulting in power savings.”
Using the application processor to manage system voltage levels—and thus clocking speed—is nothing new to embedded system designers. However, in the past there has been no proven, standardized technique for getting the job done, according to Dean, resulting in over-provisioned embedded systems “just to make sure.” Now NCSU claims to have a foolproof method for achieving such power and space savings in any embedded system design, without wasting money or using excess power.
“The challenge here was understanding how much the application software can delay the SMPS control loop(s), and how such delays affect the voltage regulation,” Dean told EE Times. “Our methods reduce the 'margin of ignorance' which forces designers to over-provision with excessively fast processors.”
By following the blueprint offered in their free paper, embedded design engineers can be assured of lower cost and the lowest possible power consumption for their embedded systems.
“In our system we run the power controller software on the app processor reliably and predictably, for example, with timing errors of under 20 microseconds,” Dean told us. “Sure, you could use a 2-GHz processor without our methods and make this work most of the time, but with no guarantees. With our methods you could use a much more affordable and energy-friendly 32-MHz processor and be guaranteed the system would always work.”
Their design methodology works by replicating the scheduling techniques available in any RTOS, but without requiring an RTOS, which is convenient for small-scale designs such as wearables.
“Our advance is that we’ve used design principles from real-time systems and incorporated the power converter software into the embedded system processor,” Dean told us. “These methods guarantee that the other software on the embedded system’s processor will not disturb the power converter’s correct operation. This eliminates the need for a separate processor or controller circuit on the power converter itself, which in turn makes the overall system less expensive.”
Dean also believes that microcontroller makers should burn these algorithms into their firmware, not only to save development time (shortening time-to-market) but also to add value to their microcontrollers by lowering overall system cost.