Ok - let’s see how you did…
1) True or false: A specific battery’s capacity in mAh is affected by the environment.
True. Higher temperatures produce higher capacity, whether the heat is present during charge or during discharge. The standard temperature for rating is 20C/68F. As temperature increases, the faster chemical reaction lets the battery deliver more energy, which is particularly noticeable at low discharge rates. So a laptop battery in a machine with bad ventilation and its fan off (50C/122F) that is idling at a low power draw (0.05-0.1C) can potentially demonstrate over 110% of the labeled mAh capacity.
2) When comparing two batteries the best value to compare is
C) Watt hours
Since batteries can have different voltages due to the number of cells in series or the cell chemistry, a comparison has to take voltage into account. Older laptop batteries list only voltage and amp hours, but newer ones have started listing watt hours or kilowatt hours (kWh), since modern systems have an easier time using whatever voltage is convenient and can potentially take batteries with different numbers of cells.
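As a quick illustration (the voltages and capacities below are made-up example packs, not from any specific product), converting each pack to watt hours is what makes the comparison fair:

```python
def watt_hours(voltage_v, capacity_mah):
    """Energy stored = voltage * amp hours."""
    return voltage_v * capacity_mah / 1000.0

# Hypothetical packs: B has fewer mAh but more cells in series (higher voltage).
pack_a = watt_hours(10.8, 4400)  # 47.52 Wh
pack_b = watt_hours(14.8, 3600)  # 53.28 Wh
print(pack_a, pack_b)  # pack B stores more energy despite the smaller mAh number
```

Comparing the raw mAh figures here would pick the wrong battery.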
3) A fully charged lithium ion battery from a laptop at room temperature (25C/77F) loses how much capacity per year?
While it’s true that lithium batteries have no "memory effect", they still degrade over time, and at a pretty significant rate: a battery at room temperature typically loses about 20% of its capacity per year.
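If that 20% figure holds steady (a simplifying assumption; real decay curves aren't perfectly exponential), the remaining capacity compounds down year over year:

```python
def remaining_capacity(years, annual_loss=0.20):
    """Fraction of original capacity left, compounding the annual loss."""
    return (1 - annual_loss) ** years

for y in range(4):
    print(y, round(remaining_capacity(y), 3))  # 1.0, 0.8, 0.64, 0.512
```

After three years on the shelf you're already down to about half the original capacity.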
4) The battery from #3 is left in a plugged in laptop that keeps it near a temperature of 40C/104F all the time, how much capacity per year does it lose?
If you think that's bad, consider that at 60C/140F the loss is more like 80% in six months! So if you constantly leave your laptop, cellphone, or iPod in a hot car, start saving up for a new battery. We have several MacBook Pro and HP laptops, and within two years their batteries were toast - the nearby components simply heat them up too much.
5) A fully charged battery at a higher temperature will produce _____ energy compared to one with the same charge that is cooler.
More heat means faster chemistry means more power. See the graph up by the answer to question 1.
6) If a battery has its capacity in mAh on the label, the standard rate of discharge used in determining that rating was
E) 20 hours
This is a real killer, isn't it? The numbers on the label of a battery are based on tests done at room temperature (20C/68F) with a 20-hour discharge (0.05C)! Does your laptop have a 20-hour runtime?
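To see what 0.05C means in practice (the capacity figure here is a hypothetical example): the rating current is just the labeled capacity divided by 20 hours, a far gentler load than any running laptop presents.

```python
def rating_current_ma(capacity_mah, rate_c=0.05):
    """Discharge current implied by the label test: C-rate * capacity."""
    return capacity_mah * rate_c

cap = 4400  # hypothetical label capacity in mAh
print(rating_current_ma(cap))  # 220.0 mA -- a 20-hour discharge
# A laptop pulling, say, 2.2 A (0.5C) discharges 10x faster than the label test.
```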
7) A battery that is discharged intermittently at high and low rates will appear to have ____ capacity compared to one discharged continuously at the average rate.
Way more. Constant discharge is a nightmare for most batteries; they prefer short bursts of discharge followed by rest. So if your "real world" test has brief intermittent activity followed by lots of "think time", the battery is going to love it. The people who buy the laptop might not, though.
8) A lithium ion laptop battery would suffer the least permanent capacity loss if stored
C) in a refrigerator
Ok, yes, a freezer would be colder, and many batteries might be fine with it, but it can be a bit risky, so storing them below freezing is generally not recommended. Oh, and if you do store them someplace cold, a plastic freezer bag to keep condensation out is a really good idea.
9) The battery from #8 would be best stored with what amount of charge?
C) about half
The guides say about 40% is the best charge for long-term storage; it can reduce the permanent capacity lost per year quite a bit for lithium ion batteries. On the chart we can see that a battery at 25C/77F and 40% charge loses less capacity than one kept cool at 0C/32F with 100% charge.
10) If a battery lasts for 20 hours at a discharge rate of 1/20th its rating (0.05C), about how long will it last at the rated discharge rate (1C)?
B) 30 minutes
Not what your calculator said, is it? But it's true.
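The behavior behind this answer is usually modeled with Peukert's law: runtime t = H * (C / (I * H)) ** k, where H is the rated discharge time, C the rated capacity, I the discharge current, and k the Peukert exponent. A sketch (the exponent k = 1.23 is chosen here purely to reproduce the quiz's numbers; real lithium-ion cells typically sit lower, around 1.05-1.1, and the pack figures are hypothetical):

```python
def runtime_hours(rated_hours, rated_capacity_ah, current_a, k):
    """Peukert's law: t = H * (C / (I * H)) ** k."""
    return rated_hours * (rated_capacity_ah / (current_a * rated_hours)) ** k

H, C = 20.0, 4.4  # hypothetical pack rated 4.4 Ah over 20 hours (0.05C)
t_1c = runtime_hours(H, C, current_a=C, k=1.23)  # discharge at 1C
print(round(t_1c * 60))  # about 30 minutes, not the naive 60
```

The naive calculator answer (4.4 Ah / 4.4 A = 1 hour) ignores the exponent entirely.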
So, what can we take away from all this? For one thing, the label on a battery is not an accurate indicator of real-world performance. Higher temperatures can be used to make a laptop look good on the tests, especially tests with intermittent power draw. But this comes at a huge price, namely much faster decay of the battery in the long term and a need to replace it early. Just a few degrees can change the expected lifetime massively, and this makes us wonder just how tempting it is to set up a "disposable razor" model where significant revenue comes not from the devices but from the need to replace the batteries every few months.

The 1-2 watts a fan draws is not very significant to battery run time, but the temperature decrease it provides is very significant to battery lifetime. So why are fans ever turned off or throttled down? Even if fan power is 10% of the total draw in the worst case, within just a few months a laptop that runs its fan all the time will run longer per charge than one with the fan off, because of the difference in battery decay rate. So unless you want to buy a new battery every 6 months, turning the fan on high is a net win. Buy some headphones.
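A back-of-the-envelope sketch of that argument, using the decay figures quoted earlier (20% per year at room temperature, 80% per six months at 60C) and assuming the fan costs 10% of total draw; all inputs are illustrative, and the 60C case is the worst-case extreme:

```python
def months_until_fan_wins(fan_power_fraction=0.10,
                          cool_loss_per_year=0.20,
                          hot_loss_per_6mo=0.80):
    """First month at which the cooler, fan-on laptop runs longer per charge.

    Runtime per charge is taken as proportional to remaining capacity
    divided by total power draw: the fan-off laptop draws less power,
    but its battery decays much faster at high temperature.
    """
    cool = (1 - cool_loss_per_year) ** (1 / 12)  # monthly capacity retention
    hot = (1 - hot_loss_per_6mo) ** (1 / 6)
    for month in range(1, 25):
        runtime_on = cool ** month                       # full draw = 1.0
        runtime_off = hot ** month / (1 - fan_power_fraction)
        if runtime_on > runtime_off:
            return month
    return None

print(months_until_fan_wins())                       # extreme 60C fan-off case
print(months_until_fan_wins(hot_loss_per_6mo=0.20))  # a much milder fan-off decay
```

With the article's extreme 60C numbers the fan-on laptop pulls ahead almost immediately; even with a far gentler assumed fan-off decay rate, it wins within about half a year.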
Also, the eyebrow-raising claims coming from MobileMark 2007 suddenly make sense. A combination of much lower power draw (60-nit screens, no wireless?) and intermittent power demand (with lots of "think time" in between) makes batteries perform almost as well as the label says they should. In real-world use, though, they deliver more like half that capacity, and they will continue to until we see 10-20 hour battery times (probably 50 hours on the benchmarks).
At the current rate of progress, this will happen in approximately never!
The real question is, was "think time" really about accurate simulation of real-world use? Or was someone staring at a Peukert curve and figuring out the best way to design a benchmark that makes laptops look good, with the intention of selling lots and lots of copies once the flattering results earned it the preference of manufacturers?
Perhaps the most important revelation for the consumer is that batteries should probably be taken out of laptops while the laptops are plugged in, to distance them from the hot CPUs (suddenly much hotter now that they can come out of power-saving mode) and avoid needing a new battery every year. Yet when have we ever seen this practice recommended by manufacturers? Who includes a "dummy battery" or cover to use when the system is on AC power? Does anyone even make such things? Should we really expect them to bundle an extra dollar's worth of plastic when it means you're less likely to buy another hundred-dollar battery in a year - heaven forbid anyone ask why the battery should be in the circuit at all when the system is running on AC power? Why not use the UPS concept of operation, which lets NiMH and Li-ion batteries last for a decade?
So, now that we see the problem, what can we do to fix it? What should a battery benchmark consist of? Is just one result going to be able to convey a useful picture to people, or will there need to be multiple parts for different types of activity? With the huge difference between battery rating conditions and the average laptop, is there any point to putting numbers like kWh in the mix, even if just to compare different battery options on the same product?
Should 3D gaming be part of testing, even though many products like netbooks and ultralights are not designed for or well suited to it? If only some laptops should be tested for gaming, what should be the deciding factor? Feel free to comment - we've enabled anonymous commenting, though we'd recommend registering on our site.
© 2009 - 2014 Bright Side Of News*, All rights reserved.