If you're anything like me, you have a number of gadgets within arm's reach that are either on charge right now, or will need charging at some point during the day. Maybe it's a smartphone or tablet, or a laptop, or a wearable. They all have one thing in common - they need power.
And that power costs you money. But how much money?
Over the years I've seen a lot of estimates of how much power smartphones, tablets, laptops, and other devices consume, but it has always concerned me that these numbers are either pulled out of the air, copied from another website that pulled them out of the air, derived from published battery capacity figures, or produced by lab testing rather than real-world usage.
As regular readers will know, I'm a big fan of real-world testing. The only drawback of real-world testing is that my "real world" is going to be different to your "real world," which means that your mileage can, and probably will, vary. But, as long as a few variables are nailed down, these differences shouldn't be too great.
I also have to make some assumptions. I'll outline those as I go.
Finally, I have a lot of ground to cover, and that means I might use terms that are unfamiliar to you. I'll provide links to information where you can learn at your leisure.
I'm going to be using the following kit:
- Watts Up? PRO power meter (primary measuring device)
- Drok USB power meter (secondary measuring device)
- Genuine chargers - no third-party devices
What I'm looking for
Electricity is priced per kilowatt-hour (kWh), a unit of energy equal to 3.6 million joules. A device rated at 1,000 W running for one hour will use 1 kWh, while a device rated at 100 W will take 10 hours to consume 1 kWh.
Because consumer electronics draw so little power, I will also be using watt-hour (Wh), where 1,000 Wh equals 1 kWh.
According to figures published by the US Energy Information Administration for January 2016, the average cost per kWh in the US was $0.12. This is the figure I will be using.
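All of the dollar figures in this article come from the same simple conversion. Here's a minimal sketch of that arithmetic, using the $0.12 per kWh rate above (the 100 Wh example device is mine, for illustration):

```python
# Sketch of the cost math used throughout this article.
# Assumes the January 2016 US average rate of $0.12 per kWh (from above).
RATE_PER_KWH = 0.12

def annual_cost(wh_per_day, days=365):
    """Convert a daily energy figure in Wh into a yearly dollar cost."""
    kwh_per_year = wh_per_day * days / 1000  # 1,000 Wh = 1 kWh
    return kwh_per_year * RATE_PER_KWH

# Example: a hypothetical device consuming 100 Wh per day.
print(round(annual_cost(100), 2))  # 36.5 kWh per year -> 4.38
```

Every per-device figure below follows this same pattern: daily Wh, times 365, divided by 1,000, times the rate.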
I've taken different approaches depending on the device I'm testing.
For smartphones and tablets, rather than measure how much power it takes just to charge the battery from 0 percent to 100 percent and fudge that into some real-world figure, I did what most people do: I put the devices on charge overnight and measured the power consumed. I chose this method for two reasons:
- It's a usage pattern that matches the way many people use their devices
- It is more real-world, since when the device is on charge overnight, not only is power being used to charge the battery, but also to run the device (remember, your device is doing stuff in the background like checking email), so this goes beyond just measuring the power used to charge the battery
I know that some people make heavy use of their devices while they are on charge (I'm usually one of those people), but that didn't feel representative. I'm not looking at how much power I can use; I'm looking for a more typical usage pattern.
When it comes to laptops, I want to know two things:
- How much power does it take to charge the battery under "normal" usage conditions
- How much power does a laptop draw at peak load when it's connected to the power outlet
Again, I'm trying to replicate the "real world" (or my "real world") as best I can.
None of this is perfect, but overall I'm happy with the approach. If readers have any suggestions, I'm open to feedback and commentary.
OK, with all that out of the way, let's answer those burning questions you have about charging devices.
How much does it cost to charge an iPhone for a year?
My test subject is the iPhone 6 Plus, which has the biggest battery that Apple offers. I'm also a pretty heavy user, and this meant that going all day was sometimes tricky (the things I do for you).
During this time I chose not to charge the device during the day or to charge it while in the car.
Here's what I found.
During an overnight charge, the iPhone consumed an average of 19.2 Wh. That's a minuscule amount, but over a year it translates into 7 kWh, which will set you back $0.84.
Smaller iPhones will consume less, while heavier users will see that bill rise, but you'd have to be a really heavy user for your iPhone to cost you more than a few dollars a year.
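For anyone who wants to check the math, here's the iPhone figure spelled out step by step (a sketch; the 19.2 Wh measurement and $0.12 rate are the figures from this article):

```python
# iPhone overnight-charge figures from above, step by step.
overnight_wh = 19.2                       # measured average per overnight charge
kwh_per_year = overnight_wh * 365 / 1000  # 1,000 Wh = 1 kWh
cost_per_year = kwh_per_year * 0.12       # US average rate per kWh

print(round(kwh_per_year, 1))   # 7.0
print(round(cost_per_year, 2))  # 0.84
```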
What about an iPad?
Following a similar methodology, an iPad Air 2 consumes around 35.3 Wh during an overnight charge. Over a year that works out at 12.9 kWh, costing $1.55.
Heavy users might find they need to charge the battery more than once a day (or they might keep it tethered to the power adaptor for longer), while lighter users might get away with only charging it every few days, reducing the cost of ownership.
What about a MacBook Pro?
The test subject here is a 15-inch MacBook Pro with Retina display, which has a pretty beefy battery.
On average, I found that an overnight charge of the battery consumed 128.5 Wh. That works out at 46.9 kWh per year, or about $5.63.
But, remember that laptops pull more power when they're connected to mains power.
I found that an hour of usage consumed an average of 65.2 Wh. At five hours of usage per day, that translates into 119 kWh over a year, costing you $14.28.
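The MacBook Pro has two separate figures in play: overnight charging and plugged-in usage. Here's how they combine, as a sketch using the measured numbers above (the five-hour day is the assumption stated in the text):

```python
# MacBook Pro: overnight charging plus plugged-in usage, per the figures above.
rate = 0.12                # $ per kWh (US average, January 2016)
charge_wh = 128.5          # measured overnight charge
usage_wh_per_hour = 65.2   # measured draw per hour of plugged-in use
hours_per_day = 5          # assumed usage pattern

charge_cost = charge_wh * 365 / 1000 * rate
usage_cost = usage_wh_per_hour * hours_per_day * 365 / 1000 * rate

print(round(charge_cost, 2))  # 5.63
print(round(usage_cost, 2))   # 14.28
```

Note that the plugged-in usage cost dominates; the battery charging itself is the smaller part of the bill.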
What about a Kindle Paperwhite?
Anyone who owns one of these will know how long they go on a single charge. I find that I only charge mine overnight once a month, consuming only 7.7 Wh. Over a year that's less than 0.1 kWh, which costs a little over a cent.
Do wall warts/power adapters consume power while plugged in with no load?
Yes, but only a small amount. My testing suggests that a genuine Apple iPhone charger consumes about 130 Wh a month when left plugged in with no load, or about 1.5 kWh a year, equaling around $0.18.
Doesn't seem like a lot, does it? But take the following into account:
- How many chargers do you have plugged in?
- Non-genuine chargers can draw a lot more power.
Factor in the total environmental cost of these chargers. Millions of chargers left plugged in 24/7 consume millions of kilowatt-hours every year, and each kilowatt-hour results in about a pound of CO2 being released into the atmosphere.
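To get a feel for the scale, here's a back-of-the-envelope sketch using the per-charger and CO2 figures from this article. The 100 million charger count is a purely hypothetical illustration, not a measured figure:

```python
# Back-of-the-envelope scale estimate, using the article's figures:
# ~1.5 kWh/year per idle charger and ~1 lb of CO2 per kWh generated.
idle_chargers = 100_000_000   # hypothetical count, for illustration only
kwh_per_charger = 1.5
lbs_co2_per_kwh = 1.0

total_kwh = idle_chargers * kwh_per_charger
total_co2_lbs = total_kwh * lbs_co2_per_kwh

print(int(total_kwh / 1_000_000))      # 150 (million kWh per year)
print(int(total_co2_lbs / 1_000_000))  # 150 (million lbs of CO2 per year)
```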
Putting this into perspective
Are these figures high? Are they low? What do they mean?
On the whole, these figures are pretty low, especially when you consider that a gaming PC can eat through about 3 kWh a day, and a regular desktop about 0.5 kWh per day.
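At the same $0.12 rate, those desktop figures work out as follows (a sketch; the daily kWh estimates are the rough figures from the paragraph above):

```python
# Yearly cost of the desktop comparison figures, at $0.12 per kWh.
rate = 0.12
gaming_pc_cost = 3.0 * 365 * rate   # ~3 kWh per day
desktop_cost = 0.5 * 365 * rate     # ~0.5 kWh per day

print(round(gaming_pc_cost, 2))  # ~131.40 per year
print(round(desktop_cost, 2))    # ~21.90 per year
```

That's roughly 150 times the yearly cost of charging an iPhone, which puts the mobile figures in perspective.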
But remember, it's not nothing either. Every device you have needs power, and the more devices you have, the more it adds up: every smartphone, every tablet, every laptop, every TV, every IoT thermostat or smoke alarm, every router, every piece of home entertainment gear.
It all adds up.
And don't forget the chargers. They add up too. And when you think about how many devices are sold every year - last quarter Apple sold 75 million iPhones alone - that's a lot of devices sucking on the power grid, which in turn makes a lot of CO2.