iPhone MagSafe Charging?

I may be getting an iPhone 15 soon, so I'm contemplating magnetic charging. Curious what is considered a safe mag charging wattage: I see ones that sport a 25 W brick but only pass 15 W through the magnetic connection (I think I'm reading that correctly). Pretty sure I can use a far higher wattage brick if not mag… the question is, is 15 W the max for the highest speed of charging?

Put another way, the one I looked at came with a brick and the mag pad. The brick is 25 W, the mag pad is 15 W. Will it charge faster on the brick by itself or on the brick plus mag pad?

The maximum MagSafe wattage for the iPhone 15 is 15 W. If you get a quality charger, even if its wattage far exceeds that, you have nothing to worry about. The phone controls the power and throttles MagSafe charging based on, e.g., heat. Even if the brick offers plenty more than 15 W, that presents no hazard (assuming it’s not some garbage charger).

1 Like

More to the point: any hazard will be unrelated to its power capacity.

The nature of electricity is that power supplies (at least any normal kind used outside of a lab environment) are “constant voltage” devices. They will put out a particular voltage (5v, 9v, whatever) and will attempt to maintain that voltage no matter what is connected to their outputs.

It is the load (that is, whatever is connected to the power supply’s output) that determines how much current (and therefore power) is drawn.

The capacity of a power supply is the maximum that it is capable of supplying, not some fixed constant amount. There is only a potential for danger if the load demands more than the supply’s maximum, for instance, if your phone tries to draw 50W from a 30W power supply. If that happens, one of three things will occur:

  • The power supply detects the overload and shuts off. This is what I’d expect any reasonable unit to do.
  • It may lower the voltage, to reduce the amount of current/power drawn by the load. Bench power supplies (like you may find in an electronics lab) do this when you configure a current limit. But it’s not appropriate for consumer devices, because low voltage may damage your device, or at least cause it to malfunction.
  • It may have no protection circuitry and overheat until something burns out (annoying) or catches fire (dangerous). El-cheapo power supplies built without proper safety circuitry, like you find from no-name and counterfeit manufacturers, often fall into this category.

But in the opposite situation, where the load is lower than the supply’s capacity (e.g. the phone draws 30W from a 50W power supply), there’s no problem.
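To make that concrete, here is a toy illustration in Python (all numbers made up): the supply holds its voltage, the load determines the current, and the supply’s rating is only a ceiling.

```python
# Toy numbers: a constant-voltage supply holds V fixed; the load
# (here modeled as a simple resistance) determines the current drawn.

V = 5.0              # supply output, volts
R_LOAD = 10.0        # load resistance, ohms (made up)
SUPPLY_MAX_W = 50.0  # the supply's rated ceiling, not a fixed output

current = V / R_LOAD           # Ohm's law: 0.5 A
power = V * current            # 2.5 W actually drawn
assert power <= SUPPLY_MAX_W   # the other 47.5 W of capacity simply goes unused
print(f"{power} W drawn from a {SUPPLY_MAX_W} W supply")
```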

Of course, modern smart devices (like phones and laptops) have fairly sophisticated power management circuitry that includes two-way communication with power supplies. USB Power Delivery, which most supplies sold these days implement, is the common example. So your phone/laptop will determine the power supply’s capabilities (available voltages and power limits for each) and will use that data to tell the power supply to send the correct voltage, and to limit its own current draw so as not to overload it.
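As a rough sketch of that negotiation (this is not the real PD message format; the profiles and the helper function are invented for illustration):

```python
# The supply advertises (voltage, max_current) profiles; the device picks
# one and caps its own draw. Numbers and names are invented for the sketch.

SOURCE_CAPS = [(5.0, 3.0), (9.0, 3.0), (15.0, 3.0), (20.0, 1.5)]  # volts, amps

def negotiate(device_wants_watts):
    """Pick the first advertised profile that covers the device's needs."""
    for volts, amps in SOURCE_CAPS:
        if volts * amps >= device_wants_watts:
            return volts, min(amps, device_wants_watts / volts)
    # Nothing covers the request: take the biggest profile and charge
    # more slowly rather than overload the supply.
    return max(SOURCE_CAPS, key=lambda va: va[0] * va[1])

print(negotiate(27.0))  # -> (9.0, 3.0): 9 V at 3 A covers a 27 W request
```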

So a power supply that burns out or catches fire when plugged into a phone/laptop is defective in that way as well: it failed to communicate accurate data about its capabilities to the load.

4 Likes

Thanks guys… the one I was looking at was an Anker… should be as good quality as anything on the market; I DO have a few of their charger bricks. I am still curious how fast charging is with the mag puck vs. a 25 W brick by itself… will the brick alone charge it any faster?

You’re welcome.

Charging from the brick over USB-C will always be faster than over MagSafe (the latter’s purpose is convenience). Wired charging is inherently more efficient, and the difference is palpable: MagSafe charging will result in a warmer iPhone than charging over USB-C (at the same brick wattage) due to losses at the MagSafe coil via inductive charging (think of a transformer without an iron core). Those losses have to come from somewhere: they are brick power that ends up heating your phone rather than charging your battery (on top of the heat created in the battery through its chemistry during charging).

If there is one thing a Li-ion battery doesn’t like, it’s heat. That is why MagSafe power is capped at 15 W (resulting in <15 W actually reaching the battery). If you connect a USB-C charger, you can get up to 20 W of charging (assuming the brick supports it), and almost all of that goes to charging the battery (ohmic losses in the USB cable are far lower than the coupling losses in wireless charging).
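Some back-of-the-envelope numbers to illustrate (the efficiency figures below are assumptions for the sketch, not measurements):

```python
BRICK_W = 25.0
WIRED_CAP_W = 20.0    # roughly what the iPhone will take over USB-C, per the above
WIRED_EFF = 0.95      # assumed: small ohmic losses in cable and electronics
MAGSAFE_CAP_W = 15.0  # the MagSafe ceiling discussed above
COIL_EFF = 0.70       # assumed coil-to-battery efficiency for wireless

wired = min(BRICK_W, WIRED_CAP_W) * WIRED_EFF
magsafe = MAGSAFE_CAP_W * COIL_EFF
magsafe_heat = MAGSAFE_CAP_W - magsafe  # ends up warming the phone

print(f"wired:   ~{wired:.0f} W into the battery")
print(f"MagSafe: ~{magsafe:.1f} W into the battery, ~{magsafe_heat:.1f} W as heat")
```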

Of course, by the same token, if you have lots of time to charge (as in my case: I charge my 15 overnight on my nightstand), you’d probably prefer a low-wattage brick. The iPhone will charge slowly and stay cooler, which is the battery-friendliest way to charge.

4 Likes

Cool, thanks. Makes perfect sense. So the question becomes: when does a larger-capacity charger drop off in efficiency and time to charge? Looking for a rough idea of how much power never gets used or speeds up charging. More of a just-in-case scenario: my bedside has a cable from my lamp to charge from, and I doubt it’s more than 5 or 10 W.

See this:

Apple iPhone 15 Pro Max Charging Test - ChargerLAB Compatibility 100 - Chargerlab.

Your iPhone 15 will charge as quickly as it can (around 20 W via USB-C, meaning a 25 W charger would be plenty) until it reaches about 80% charge, and then it will slow down to ~5 W until it hits full charge. So there’s no benefit to attaching a 65 W charger, neither for USB-C nor MagSafe. OTOH, if you have a 5 or 10 W charger, it will charge at the maximum that brick can deliver until it reaches 80% and then trickle at ~5 W until it’s full. The only difference is how long it takes to charge (and perhaps the resulting battery aging due to heat). A 5 W charger will take ~3 hrs to charge the iPhone 15 battery (just shy of 13 Wh) from empty to full, assuming no trickle slowdown.
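Here is that arithmetic as a quick sketch (approximate numbers from the description above: a ~13 Wh battery, full rate to 80%, then a ~5 W trickle):

```python
BATTERY_WH = 13.0  # roughly the iPhone 15 battery

def hours_to_full(brick_w):
    fast_w = min(brick_w, 20.0)        # the phone won't take more than ~20 W wired
    fast = 0.8 * BATTERY_WH / fast_w   # 0% -> 80% at the available rate
    trickle = 0.2 * BATTERY_WH / 5.0   # 80% -> 100% at ~5 W
    return fast + trickle

for w in (5, 10, 20, 65):
    print(f"{w:>2} W brick: ~{hours_to_full(w):.1f} h")  # 65 W is no faster than 20 W
```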

1 Like

The absolute maximum is going to be the limit of your device. You should be able to look up the specs for your phone/tablet/laptop to determine the maximum.

According to Apple’s tech specs for the iPhone 15 (scroll down to the “Power and Battery” section), fast charging tops out with a 20W adapter or higher.

In other words, although you should be able to use any quality adapter, anything more than 20W will be wasted on an iPhone 15. (You might still want a large adapter if you plan on also using it to charge other devices, of course).

When you talk about “drop off in efficiency & time to charge”, are you referring only to higher capacity adapters or the fact that charging slows down as the battery approaches full?

If it’s the latter, this is deliberate, in order to keep the battery from getting too hot, which can shorten its lifespan. For a technical explanation of why, keep reading; otherwise, you can stop here.

Technical reason

In the following discussion, “charger” refers both to your external power brick and the charging circuitry in your phone/tablet/laptop. Both work together to implement charging. This is necessary for many reasons, including the fact that a USB charger outputs a constant 5v (possibly higher, if the power delivery protocol is used), while your phone’s battery is different (typically 3.6v). The charging circuitry in the phone will transform the USB voltage to something higher or lower depending on the requirements of the charging algorithm.
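Apple doesn’t document the exact circuitry, but the typical building block for this kind of step-down conversion is a buck converter, where the output voltage is roughly the input voltage scaled by the switching duty cycle. For example, to derive a 4.2v charging rail from a 9v USB-PD input (illustrative numbers only):

$$
V_{\text{out}} \approx D \cdot V_{\text{in}}, \qquad D = \frac{4.2\ \text{V}}{9\ \text{V}} \approx 0.47
$$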

That having been said…

Most battery chargers operate in one of two modes: constant voltage or constant current.

A constant-voltage charger is a pretty straightforward concept. It outputs a fixed voltage until the battery is charged. For example, to charge a 3.6v battery until it is full, you might push charging current at a constant 3.6v. The amount of current flowing into the battery will go down, approaching zero when the battery is full, because the charging voltage “presses” against the battery’s output voltage. The net voltage (what’s actually going into the battery) is the difference between the charger’s output voltage and the battery’s output voltage. So an empty battery will charge at some rate, while a mostly-full battery will charge much slower.
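A rough way to see that taper in this simple model (the numbers are invented for illustration):

$$
I_{\text{charge}} = \frac{V_{\text{charger}} - V_{\text{battery}}}{R_{\text{internal}}}
$$

With, say, a 4.2v charger, a battery currently at 3.8v, and 0.15 Ω of internal resistance, that is (4.2 − 3.8) / 0.15 ≈ 2.7 A; as the battery climbs toward the charger’s voltage, the current tapers toward zero.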

You can, however, charge a battery with higher voltage. For instance, you might push 6v into that 3.6v battery. This will charge faster, because it will always be providing more voltage than the battery’s output. But once the battery’s output voltage reaches 3.6v (meaning it’s full), the charger needs to sense this and stop, because overcharging a battery can damage it (and in extreme cases, cause it to catch fire).

The alternative to this is constant current. A constant-current charger varies its voltage output in order to maintain a constant rate of charge. As the battery starts filling, the charger increases its voltage in order to keep pace with the back-pressure from the battery’s output.

In theory, a constant-current charger’s voltage would go up to infinity as the battery approaches full (and becomes unable to hold more charge). In practice, they have limiters, based on their design and the charging current limit of the battery pack. So constant-current chargers do slow down as the battery fills up, but not nearly as much as a constant-voltage charger. And, of course, they must have circuitry to detect when the battery is full and turn off at that point, in order to prevent overcharging and damaging the battery.

The problem with constant current is that it makes batteries get hot, especially as they approach full. So, in order to keep the heat down and maximize battery life, modern chargers work in both modes. When the battery is mostly empty, they operate in constant-current mode with a fairly high current (close to the battery’s maximum), the so-called “fast charge” mode. Then as the battery fills up (maybe at 50%, maybe 80%, maybe some other threshold), the charger switches over to constant-voltage mode until the battery is full.
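Here is a minimal toy simulation of that two-stage (constant-current, then constant-voltage) behavior. The battery model and every constant are invented for illustration; real chargers are dedicated hardware with safety interlocks.

```python
R_INT = 0.05          # internal resistance (ohms), made up
CAPACITY_AH = 3.3     # cell capacity, roughly phone-sized
CC_CURRENT = 3.0      # stage-1 constant current (amps)
CV_SETPOINT = 4.2     # stage-2 constant-voltage setpoint (volts)
CUTOFF = 0.15         # stop when the taper current drops below this (amps)
DT_H = 1 / 60         # simulation step: one minute, in hours

def ocv(soc):
    """Crude open-circuit voltage curve: 3.5 V empty -> 4.2 V full."""
    return 3.5 + 0.7 * soc

soc, t = 0.05, 0.0    # start nearly empty
while True:
    if ocv(soc) + CC_CURRENT * R_INT < CV_SETPOINT:
        i = CC_CURRENT                         # constant-current ("fast") stage
    else:
        i = (CV_SETPOINT - ocv(soc)) / R_INT   # constant-voltage taper
        if i < CUTOFF:
            break                              # considered full; stop to avoid overcharge
    soc = min(1.0, soc + i * DT_H / CAPACITY_AH)
    t += DT_H

print(f"charged in {t:.2f} h, final state of charge {soc:.0%}")
```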

And, of course, the charger circuitry can be more sophisticated than this, selecting a variety of currents in the constant current mode and a variety of voltages in the constant voltage mode, in order to strike a balance between charging time and battery life. And they can even interact with operating system software (e.g. Apple’s Optimized Battery Charging) in order to implement algorithms too complicated for a basic battery charging chip.

3 Likes

Real-world testing shows that the iPhone 15 series uses up to 27 watts for charging, and that a 30 watt charger does charge about 10 minutes faster on a 15 Pro Max than a 20 watt charger does. FWIW.

See iPhone 15 charging slowly? How to fast charge iPhone with USB-C - 9to5Mac and iPhone 15 Pro Max battery charge test shows why 20W power adapter is ideal [Video] - 9to5Mac

1 Like

Interesting… I have no idea what my lamp can deliver, I kinda assume 5w. I know stuff charges slower there than my chargers out in the living room. BUT if it’s ~3 hrs. from empty to full, that works as I usually put the phone on the lamp when I go to bed.

WOW, very exhaustive treatise, I sincerely appreciate it. I kinda like to at least have a passing understanding of what’s going on under the hood.

One last thing… many folks with better qualifications than I say that to really preserve the life of a lithium battery, you should charge it when it reaches between 20 and 30%. Is this fair to say?

I don’t know. But here’s what I do know…

It is well known that if a lithium battery drains all the way to zero, it becomes impossible to recharge safely. Battery packs almost always have safety circuitry to lock out chargers if this should happen, and devices typically shut down automatically when the battery level gets low, before it reaches zero, in order to prevent this situation.

Of course, if you run your phone until auto-shutdown, and you then put it away in a drawer for a while without charging it, the batteries could self-discharge to zero, and then they’ll never charge again.

I’ve also read that repeatedly letting it run all the way down to auto-shutdown can shorten its life, but I’m not so certain of that. As I understand it, the lifespan of a lithium battery is primarily a function of time since manufacturing (which you can do nothing about) and how many “charge cycles” it goes through. If you let it drain to shutdown and then charge it to full, that’s one cycle. If you let it drain to 50% before charging, that’s half a cycle, etc. A chip inside most batteries tracks the number of cycles, although not all devices provide an easy way for you to see it (macOS does via the System Information app, and the iPhone 15 models show it in iOS 17 under Settings > General > About).
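The cycle accounting is simple enough to show with made-up numbers:

```python
# A "cycle" is 100% of capacity discharged, however it is split up.
sessions = [0.5, 0.3, 0.2, 1.0, 0.5]   # fraction of capacity used each session
cycles = sum(sessions)                  # 2.5 equivalent full cycles
print(f"{cycles:.1f} charge cycles consumed")
```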

So if you let it drain all or most of the way before charging, you will go through more cycles. But I don’t know if there may be additional factors shortening its life if you let it drain that far on a regular basis.

It has also been established that charging a battery to 80% prior to storage works best. But I’ve never been able to understand why charging it to 100% (where it will self-drain to and past 80% over time) isn’t just as good.

And it has been established that fast-charging to 100% can shorten a battery’s life. Which is why Apple’s Optimized Battery Charging tends to fast-charge to 80%, then pause charging, resuming at a low rate such that it will be at 100% when you typically remove it from the charger.

But I can’t give you any concrete answers. Modern batteries live at the cutting edge of chemistry and physics, so for people who are not experts, it seems like a whole lot of voodoo. :slight_smile:

3 Likes

FWIW, as soon as I got my 15 Pro, I changed it to optimized charging to 80%. So I’ll see over time if there really is any difference, because until now, I have always, almost every single day, charged my phone to 100% as I slept overnight. My 13 Pro was set for its own optimized charging setting, which charged to and held at 80% and then charged to 100% a little while before I usually took the phone off power in the morning. As it turned out, my long-term battery health was no better than my iPhone X, which did not have optimized charging at all. Battery health fell at almost exactly the same rate.

I’m interested to see if optimized charging to 80% leads to better battery health over time, only negligible improvements, or even no change at all.

I’m not sure if 80% is truly a good limit, or if Apple is just being safe by setting the limit for fast-charging and optimized charging there.

But I believe that the reason is that as the battery pack approaches full charge, resistance in the battery pack grows, which causes heat, and that heat can affect the long-term ability of the battery cells to hold a charge going forward.

I was a bit surprised to see this. This Apple article says:

“If your Mac uses USB-C to charge, you can charge your Mac laptop with any USB-C power adapter or display. You can safely use a power adapter or display with higher or lower wattage than the adapter that’s included with your Mac”.

The article is not talking about phones, and it does recommend using the correct adapter, but it also says “safely”.

This article is about the iPhone and implies that any of the 5W, 18W, or 20W chargers can be used.

There is a school of thought that slower charging (which is what happens if a lower-rated charger is used) is a good thing for battery longevity. I think it is not uncommon practice to charge phones with a lower-rated charger.

Indeed. Because this is not a problem at all. Your Mac will not “ask” for more than the charger can deliver. If the charger is low wattage, the Mac will just take what it can get, and it will simply take longer to charge. But neither your Mac nor your charger (assuming it’s not garbage) will be in any danger.

Charging a Mac more slowly is good for the same reason as on the iPhone. The Mac uses Li-polymer batteries vs. the Li-ion in an iPhone 15, but both battery types suffer from heat and age more slowly if exposed to less of it.

This makes perfect sense. USB-C chargers (I think) are required to implement the Power Delivery spec. So the Mac will ask what the charger is capable of; it will then pick a voltage, request it from the charger, and limit its own charging to some amount less than the charger’s maximum.

Of course, “safely” assumes that the charger is complying with the PD spec. If it lies about its capabilities (e.g. a 20W charger claiming to support 50W), as some knock-off devices have been shown to do, then the Mac is going to try to draw 50W and make the charger overheat and probably fail.

Well, slower charging will generate less heat. If your phone can’t dissipate the heat (bad design, or maybe wrapped in a heavy silicone case), then faster charging may cause the battery to get hot enough to shorten its life.

But under normal circumstances, I am skeptical of broad sweeping claims like this. Companies making phones and batteries (including Apple) don’t want their recommended practices to result in premature failure. They’re going to design their products to operate as documented for at least as long as a typical refresh cycle (which is 2-3 years in the cell phone industry).

If Apple’s batteries started failing prematurely as a result of using the published fast charging capabilities in accordance with published instructions, the PR backlash would be a nightmare for them. They know this and they don’t want to invite that kind of reputation.

Being extra careful about charging might extend your battery’s life from 3 to 4 years, but I also think that once you get that far along, you’re on borrowed time no matter how careful you are, and you should expect to spring for a new battery if you don’t want to replace the phone at that point.

Because, at least with the battery technology in use in the mid-twenty-teens, battery life degraded as a function of time spent at or near full charge. See, for example, Keil, Peter, “Aging of Lithium-Ion Batteries in Electric Vehicles” (2017). Sorry that my reprint doesn’t have a DOI, but it shouldn’t be hard to find with a search. It has to do with lithiation/delithiation of the electrodes, which occurs near 0% and near 100% SoC and isn’t readily reversible.

That said, when it comes to the $20,000 battery in my EV, I never charge above 80% unless I’m planning on discharging it right away. When it comes to my small electronic devices, however, I just try to avoid egregious excursions in charge (like leaving it on a charger for weeks at a time without using a Chargie). To me, it isn’t worth getting stressed and compulsive about optimum charging routines for a device that will likely be obsolete in 5 years anyway.

1 Like