This post is to share my results from testing a number of batteries under high-amp loads. I question whether buying a Thunder Power is worth its cost if an alternative brand consistently performs better at half the price. I wanted to know if 60 and 65C ratings were just hype. I've considered different ways to test batteries and evaluated what to look for to determine if a battery is actually putting out "60C", for example. I could find no standard definition for measuring "C" rating, and battery manufacturers never, ever talk about how they arrive at it, so there's probably a lot of hype going on.
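For reference, the rule of thumb (and it is only a rule of thumb, since no manufacturer publishes a measurement standard) is that the C rating multiplied by the capacity in amp-hours gives the advertised maximum continuous current. A quick sketch of that arithmetic, with illustrative numbers:

```python
# Rule-of-thumb conversion: advertised max current = C rating x capacity (Ah).
# Illustrative only -- manufacturers publish no standard for measuring "C".

def max_current_amps(capacity_mah, c_rating):
    """Advertised max continuous current implied by a C rating."""
    return (capacity_mah / 1000.0) * c_rating

# A 5000 mAh pack rated 65C nominally implies:
print(max_current_amps(5000, 65))  # 325.0 A
```

So a "65C" claim on a 5000mAh pack is a claim of 325 amps continuous, which is exactly the kind of number these tests put under scrutiny.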
We decided the only way to get some answers was to put a few batteries under a serious high load and see what squeezes out. Regardless of what the manufacturers imply with their high C ratings, even a gold-plated battery pushing 200 amps through 10-gauge wires will melt the solder off the junctions. Case in point: the first test of Thunder Power's flagship 5000mAh, 65C, 6S, $250 Lipos showed the 4mm bullet split-pack link connectors melting off their wires at the equivalent of 32C.
To be fair, TPs have the reputation of running strong long after the cheapies have turned to junk. However, I take that on faith because I've never put 100 cycles on any one battery, so I have no other way to tell. What I do think is that if I reeeeaallly load up a battery a few times it will start to show its true colors: it will run hotter than the higher-quality cells, get puffy when maxed out, etc.
Lipo break-in is another issue. I've seen that the better batteries tend to get stronger as more cycles are run, but have seen little change with the cheapies. I made sure all the batteries had a minimum of 6 light-duty break-in cycles on them before testing.
Voltage available under high load was another open question. I've seen ESCs cut out when the throttle is opened past 75% because a crap battery couldn't keep up with a motor's current demand, so the ESC shuts down on low voltage. So what's a "respectable" percentage of voltage a battery will hold under load? Does one hold a 5% voltage reserve at WOT while another holds 25%?
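To make that comparison concrete, one way to read "voltage reserve" is the loaded pack voltage as a percentage of its resting voltage. This is my own framing, and the function and numbers below are illustrative, not measured data:

```python
# Hypothetical illustration of "voltage reserve" under load: the loaded
# voltage as a percentage of the resting (open-circuit) voltage.
# Numbers are made up for the example, not measurements.

def voltage_reserve_pct(resting_v, loaded_v):
    """Percent of resting voltage the pack still holds under load."""
    return 100.0 * loaded_v / resting_v

# A fully charged 6S pack rests at 25.2 V (4.2 V/cell). If it sags to
# 21.0 V under load, it's holding about 83% of its resting voltage:
print(f"{voltage_reserve_pct(25.2, 21.0):.1f}% of resting voltage held")
```

A pack that sags past the ESC's low-voltage cutoff under load is the failure mode described above, regardless of what its label claims.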
To get a start at some answers I built a box that puts a constant resistance across the battery's terminals. Using Ohm's law and 6S as my standard, I calculated what the nominal current should be and provided a number of different "taps", or resistances, I could plug into. The resistors are large spiral types that act like a powerful heater, so 1200 SCFM of air is blown over them during testing. While the batteries are under load, the voltage, current, and temperature are recorded. A common lipo tester is plugged into the balance taps so the actual percentage of voltage remaining is displayed.
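The tap calculation is just Ohm's law (I = V / R). Here is a minimal sketch, assuming a 6S nominal voltage of 22.2 V (6 × 3.7 V per cell); the tap resistances shown are hypothetical examples, not the actual values in the test box:

```python
# Sketch of the resistive-tap calculation, assuming 6S nominal = 22.2 V.
# Tap resistances below are hypothetical examples.

NOMINAL_6S_V = 6 * 3.7  # 22.2 V nominal for a 6S pack

def tap_current(resistance_ohms, voltage=NOMINAL_6S_V):
    """Nominal current drawn by a resistive tap (Ohm's law: I = V / R)."""
    return voltage / resistance_ohms

def c_equivalent(current_amps, capacity_mah):
    """Express a current as a C rate for a given pack capacity."""
    return current_amps / (capacity_mah / 1000.0)

for r in (0.5, 0.25, 0.15):  # example tap resistances, in ohms
    i = tap_current(r)
    print(f"{r} ohm tap -> {i:.1f} A ({c_equivalent(i, 5000):.1f}C on 5000 mAh)")
```

This is also where the "equivalent of 32C" figure comes from: 160 amps drawn from a 5000mAh pack works out to a 32C rate.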
There are caveats: this is "resistive" loading. When we run our motors we are putting them under an "inductive" load, which is a different kind of load. A resistive load will tell you a lot, but there will be differences in the results. There are also differences depending on whether the load is pulsed or sustained, etc. Therefore the second half of this posting will use an inductive-load tester.
