Power vs. energy usage in embedded systems

Hi Folks,

From my understanding, power is the rate at which work is done and energy is the total work done by a system over a period of time.

Designing a processor for low power makes sense in desktop systems, where lower power results in lower heat dissipation, giving cheaper designs and more reliable processors. By lowering the frequency at which a processor is clocked, the power can be reduced, but the system still does the same work over a longer period of time, thus consuming the same amount of energy as the version running with a faster clock. In this scenario there's a low-power vs. high-performance tradeoff.

Frequently it's claimed that low power consumption is important in embedded systems where devices are run off batteries. Are designs which minimise energy consumption rather than power not what's really of interest in embedded systems? Why are reductions in power (which do not necessarily reduce overall energy consumption) the focus of concern for so many people?

Are people using the term "low power" in place of "low energy consumption" (since many techniques which reduce power also reduce energy consumption), or if they really do mean low power, why is power an issue in embedded systems?

Many thanks, John

Reply to
divilment

... snip ...

Please keep your lines under 72 chars (67 is better) to avoid the above type of line wrap.

In many cases the consumption of the single chip is the overriding consideration. If that system has to work for two months in a remote location, power is a critical issue.

--
                            cbfalconer at maineline.net
Reply to
CBFalconer

In article , snipped-for-privacy@gmail.com writes

You just don't understand! For desktops they want as fast as possible because of the OS and the games. Also who cares if it is less reliable as they will be upgraded in 2-3 years.

Games players want HIGHER performance. Just like boys do with their fast cars.

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
/\/\/ chris@phaedsys.org      www.phaedsys.org \/\/\
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Reply to
Chris Hills

That sounds correct.

I would say that the usual concern is energy efficiency: the energy used per amount of work done. For most embedded applications, it is a given that the system runs fast enough to do the work in real time. Given that, the lower the average power, the more efficient the energy usage. Absolute power might come into play with a fixed budget, say running from USB power, but most of the time it is an energy efficiency concern.

--
Thad
Reply to
Thad Smith

That's correct for the original meanings of those terms in physics. It may not be quite as correct when extended to other kinds of "work", such as some amount of computation being completed. Mixing up these meanings of the term will cause confusion.

No. Batteries limit energy, not power. I.e. whatever the job is, if you can't do it with the amount of energy the battery packs, then you're toast. This turns into a power requirement only if the job is of the type "do [...] for at least [...] hours before recharge is necessary".

But yes, there are reasons why the physical power available to a device might be limited. The supply providing electricity to the CPU may simply not be able to deliver more than a certain power (e.g. a car's generator has a maximum rated power output). Or maybe it's impossible to get rid of the waste heat.

Because it's not typically an option to just let the same job take longer. Embedded systems tend to have real-time constraints: you _have_ to complete a given job in a given amount of time. That means you have to have enough computational power to pull that off. Now, the less electrical power it takes to provide that level of computational power, the longer you can run off the same set of batteries (or tank full of gas), and the cheaper the device will be to operate. Customers like that.

Reply to
Hans-Bernhard Bröker

Actually, they limit both. If you try to draw high power out of a battery its terminal voltage will drop. Make the draw large enough and the terminal voltage drops too low to be useful. When you do this you lose energy due to the internal IR losses in the battery. The available energy in a battery diminishes with increasing power draw.

Robert

--
Posted via a free Usenet account from http://www.teranews.com
Reply to
Robert Adsett

A nice big capacitor parallel to the battery helps with that, as far as power limitations are concerned. In cases where that is not sufficient, the system is most likely limited by energy, not power.

Reply to
Hans-Bernhard Bröker

Correct.

I thought you said you understood the concepts above. It appears that you understand the words and how to make sentences out of them, but fail to understand what they mean and imply.

Energy = Power * Time.

So power is just a way of saying how fast you're using energy. When you want to specify energy consumption, it makes sense to specify it in terms of power. When you want to specify energy storage capacity, then it makes sense to specify it in terms of energy.

Why? Think about it. What if you have a 10J CPU? What does that mean? Does it mean that after consuming 10J of energy the CPU is no longer usable? What, it turns to dust or something? Perhaps there is such a thing but for normal CPUs this is certainly not the case.

Normally you'll see the CPU's energy consumption specified more sensibly, for example 10J/s. Hey, 10J/s is just another way of saying 10W. That's power, not energy. So that's why people describe energy consumption in terms of power. Typically you'll see minimum, maximum, typical and sleep power ratings specified.

To wrap up, "low power" == "low rate of consuming energy". Come on, this is high-school physics where I come from.

Note: going further off-topic, battery capacity should be specified in terms of energy. Usually manufacturers do this in units of Wh. This is especially handy for calculating how long the batteries will last in your system. A 100Wh battery driving a 0.1W CPU: 100Wh / 0.1W (cancel the units) = 1000h = approx 41.7 days.

Reply to
slebetman

Depends on how long the sustained peak is. If it's more than momentary, space limitations may keep you from having a sufficiently large capacitor to keep peak power draw from being a consideration.

Probably not an issue with a lot of handheld instruments, power tools on the other hand...

Robert

Reply to
Robert Adsett
