This article is part of a series dealing with building best-in-class lithium battery systems from bare cells, primarily for marine use, but a lot of this material finds relevance for low-voltage off-grid systems as well.
Batteries are about voltage, current and capacity first and foremost. This article discusses the performance characteristics of lithium iron phosphate cells in service and the key concepts associated with them. It is very important in the context of setting up lithium battery systems, but also useful when living with and operating one. The chemistry and internal construction of the cells is detailed in a separate article of a more fundamental nature.
What is a Battery?
A battery stores electricity, and the question may appear trivial, but it is not. An ideal battery would supply any current at a voltage dependent purely on its state of charge. Real batteries don't: their voltage drops under load and jumps up while being charged. The reason for this behaviour is that they have an internal resistance, and the higher the current flow, the higher the voltage lost across it. The electrical symbol for a single battery cell looks like this:
In order to represent the variation in voltage caused by changes in current and understand the behaviour of batteries, we need to add internal resistance to this ideal battery:
If no current is flowing, the internal resistance has no effect on the output voltage; this is why it is important to measure cell voltages at rest if the objective is to obtain an idea of the state of charge. Otherwise, the internal resistance skews the voltage in proportion to the current, according to the relation:
ΔV = R x I
Upon discharge, we can now observe the following effect, which does model the reality of a battery discharging at a steady rate:
As a consequence, the voltage measured at the terminals of the battery no longer reflects its state of charge. This is why the state of charge of any battery can only be deduced from a stabilised voltage measurement taken at rest: this is called the stabilised open-circuit voltage (OCV). A similar situation arises when charging:
Now, the internal resistance of the battery makes the charging voltage at the terminals look higher than the actual state of charge of the battery warrants. Here is a real-world illustration of this behaviour:
We were in the process of building a brand new lithium iron phosphate battery bank on a sailing catamaran, charging 400Ah of cells for the first time with both engines running. The charging current had been a solid 180A for almost an hour. The cell voltages, which had initially jumped up around 3.40V, were gradually rising. When they reached 3.60V, we shut one engine down in order not to exceed this value, reducing the current by half, down to 90A.
The cell voltages instantly dropped down to 3.45V.
We therefore lost 0.15V in cell voltage by reducing the current by 90A. We can use these figures to calculate the internal resistance of the cells using the relation presented earlier, ΔV = R x I:
In this case, we have ΔV = 0.15V and I = 90A. As a result, we can write R = ΔV / I = 0.15 / 90 ≈ 1.67mΩ.
1.67 milliohms is a very small resistance figure, typical of lithium battery cells, but it is nevertheless enough to significantly skew the voltage reading at high amperage. At a current of 10A, its contribution shrinks to ΔV = R x I = 0.00167 x 10 = 0.0167V = 16.7mV, but this is still enough to be measured. We will refer to this again when discussing alternator voltage for charging, low voltage cut-off limits and cell balancing boards, amongst other topics.
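This sort of calculation is easy to script. Below is a minimal Python sketch (the function name is illustrative only) applying ΔV = R x I to the figures above:

```python
# Estimate cell internal resistance from a step change in current,
# using the relation dV = R x I from the text.

def internal_resistance(delta_v, delta_i):
    """Internal resistance (ohms) from a voltage change (V)
    caused by a current change (A)."""
    return delta_v / delta_i

# The cell voltage dropped 0.15V when the charging current was reduced by 90A:
r = internal_resistance(0.15, 90)   # ~0.00167 ohm, i.e. about 1.67 milliohm

# Voltage skew caused by that resistance at a 10A current:
skew_10a = r * 10                   # ~0.0167V = 16.7mV
```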
In a bank, cells don't all share exactly the same internal resistance, so their voltages don't automatically read the same when current is flowing, even when their states of charge are identical. This becomes increasingly true as cells age.
Before moving on, let’s point out that the battery model we used above featuring the cell internal resistance is correct as long as the current is steady and the voltage at the terminals has had a few seconds to stabilise. A more complex electrical model would need to be used if the transitions when the current varies were of interest, because of capacitance effects.
Current measurements related to batteries in general are expressed in relation with their capacity rather than in absolute terms: a 100Ah battery operated at 100A is said to be charging or discharging at 1C: one time its capacity rating. A 10A current would only amount to 0.1C; a full charge at a rate of C/5 would represent a 5-hour (approximately) charge, etc.
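The C-rate convention can be captured in a couple of small helper functions; a Python sketch with illustrative names:

```python
# Convert between absolute currents and C-rates, as defined above.

def c_rate(current_a, capacity_ah):
    """Express a current (A) as a multiple of the capacity rating (Ah)."""
    return current_a / capacity_ah

def nominal_hours(rate_c):
    """Nominal time (hours) for a full charge or discharge at a given C-rate."""
    return 1.0 / rate_c

c_rate(100, 100)     # 1C: a 100Ah battery at 100A
c_rate(10, 100)      # 0.1C
nominal_hours(0.2)   # a C/5 charge takes roughly 5 hours
```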
Charge and Discharge Ratings
Prismatic LiFePO4 battery cells are normally conservatively rated for charge at up to 1-2C and for discharge at up to 3C, which implies they could theoretically be charged in 30 minutes and discharged in 20 minutes.
Practically, even 1C is often quite a formidable figure when related to the size of a house bank on a yacht and recharging a near-empty battery in one hour is seldom achievable. It is not entirely desirable either for reasons that were developed earlier.
The maximum recommended routine charge and discharge rate is normally about 0.3C for long-term, sustained operation, but higher currents are obviously acceptable at times, if achievable at all.
The short-circuit current capacity of LiFePO4 cells can easily exceed 20-30C, which is far more than needed to cause catastrophic heat damage. The greatest of precautions must be taken when working around cell connections as dropping a non-insulated tool onto any battery bank can result in molten metal flying around, a fire, disastrous burns or any combination of the three.
The practical difference between working near the common deep-cycle lead-acid batteries on board and working around lithium cells is that there are a lot more exposed connections in much closer proximity and even small tools or metallic objects can be long enough to cause a short-circuit. Furthermore, in the event of short-circuit, even relatively small lithium cells are capable of delivering extremely intense and sustained currents.
Incidentally, manufacturer tests have repeatedly shown that a healthy LiFePO4 cell can be bluntly short-circuited to complete destruction without reaching ignition temperature: this is due to the fact that its internal resistance is very low. The same may not hold for a previously damaged cell with an elevated internal resistance and the outcome could then be extremely different.
This image was extracted from a video released by Sinopoly Battery Ltd, China, where other common battery failure modes were investigated, such as when a crew shoots into the battery with an automatic pistol.
Typical Cell Operating Limits
Manufacturers' ratings for LiFePO4 battery cells have become more conservative in recent years as more experience was gained with the practical operation of these cells. Nowadays, the typical operating specifications for LiFePO4 prismatic cells look as follows:
| Mode | Parameter | Rating |
|---|---|---|
| Charging | Maximum charge voltage | 3.65V |
| Charging | Recommended charge current | 0.3C |
| Charging | Maximum charge current | 1-2C |
| Charging | Charging temperature range | 0°C to 45°C |
| Discharging | Minimum discharge voltage | 2.5V |
| Discharging | Recommended discharge current | 0.3C |
| Discharging | Maximum discharge current | 2-3C |
| Discharging | Discharge temperature range | -20°C to 55°C |
| State of Charge | Recommended operating window | 10-90% SOC |
Back in 2007, Thundersky, a manufacturing company later absorbed by Sinopoly Battery Ltd, was advertising its prismatic cells for charge up to 4.25V using a current of 3C and their maximum rated discharge current was 10C. Those who followed these guidelines quickly came to a great deal of grief, first and foremost with charging, destroying cells left, right and centre while charging up to the 4.25V “target” at low current.
Today’s charging specifications may still appear as being on the high side, but they must be understood in the context of a constant current/constant voltage (CC/CV) charge regime with charge termination and charging to maximum capacity as the aim. The recommended upper SOC limit is 90% however, not 100%, and charging to 100% SOC in this context means absorbing the cells at 3.65V until the residual current is C/30. Anything short of this will not – by definition – achieve 100% SOC.
All maximum ratings must be understood as absolute limits, not standard operating values, which is why the simplistic reasoning suggesting that 4 cells in series can be charged at 4 x 3.65V = 14.6V couldn’t be more wrong. Just as wrong as the suggestion that any old lead-acid charging system is fine for operation with lithium cells “because the voltage range is compatible”. The voltage range can be quite close, but the charging process required is very different because it needs to provide for charge termination.
The specifics of charging lithium cells on board will be the subject of a separate article due to the extent of the subject, but the essential charging characteristics of LiFePO4 cells are discussed further below.
Peukert’s Law and Lithium Batteries
The capacity of a battery is not a constant figure: it depends on the charge and discharge current. The phenomenon was first documented by Wilhelm Peukert in 1897, while working with lead-acid batteries, and is known as Peukert's Law. In simple terms, Peukert's Law states that the available capacity shrinks as the current increases.
The answer to the question of whether Peukert's relation can really be applied to lithium chemistries is essentially negative, but the capacity of Li-ion batteries does also vary with discharge current and Peukert's Law is all we have at present. Peukert's Law was only ever formulated to be valid at constant temperature, and we do know that the Peukert effect in LiFePO4 batteries becomes increasingly noticeable as temperature drops below 15°C.
Peukert’s relation is characterised by a supposedly constant exponent k and, in the case of LiFePO4 batteries in house bank applications, experimental data at modest temperatures has suggested a value of k=1.04. An exponent of k=1.00 would indicate no dependency between storage capacity and current, i.e. an ideal battery, and lead-acid batteries often score around k=1.25, with the figure getting worse as they age.
Configuring Battery Monitors
This value of k=1.04 can make for a useful starting point when configuring battery monitors, but temperature variations (which are never accounted for) can easily throw the calculation out, especially when large swings from winter to summer are involved. In the tropics, with batteries at 25°C or over, a value of k=1.02 for the exponent may be more appropriate.
Trying to configure battery monitors designed for lead-acid batteries (where they already perform questionably at the best of times) to operate with lithium cells is fraught with uncertainty: the supposedly constant exponent k has been shown to be anything but constant with lithium-ion chemistry. Provided the temperature doesn't change significantly and the operating currents are reasonably consistent, a set of parameters can be derived to obtain seemingly sensible readings.
Rated Capacity and Actual Usage
Lithium cells are usually capacity-rated at much higher currents than lead-acid batteries, and the battery is deemed discharged when it can no longer supply the discharge current. Capacity ratings for discharge at 0.5C (2-hour discharge) or 0.3C are common for prismatic lithium cells, while lead-acid cells are normally rated at C/20 (20-hour discharge). The practical consequence is that lithium batteries commonly appear to exceed their capacity ratings at the average currents normally run on board a yacht.
Peukert’s Law can be formulated as: C2 = C1 x [ C1 / (I2 x T1) ]^(k-1), where:
C1 is the battery capacity when discharged in T1 hours at a current I1, and
C2 is the calculated capacity when discharged at a current I2. k is the Peukert exponent discussed earlier.
What can we expect from a 100Ah lithium battery rated at 0.5C = 50A when used as a house bank and discharged at C/20 = 5A instead?
We have C1 = 100Ah, I1 = 50A, T1 = 2 hours, I2 = 5A and we will use k = 1.04:
C2 = 100 x [ 100 / (5 x 2) ]^(1.04 - 1) = 100 x 10^0.04 = 109.6Ah
The same 10% gain stands for a 200Ah battery discharged at 10A, etc.
| Rated lithium battery capacity at 0.5C | Discharge current at C/20 | Effectively available capacity |
|---|---|---|
| 100Ah | 5A | 109.6Ah |
| 200Ah | 10A | 219.3Ah |
| 400Ah | 20A | 438.6Ah |
These differences can become quite significant in larger banks: following the same formula, a 400Ah battery discharged at only 10A would exhibit a theoretical capacity of about 451Ah. Such calculations are fraught with uncertainty, however, due to the temperature dependency of the value of k, but matching results have been demonstrated experimentally. At very low temperatures, some of the battery capacity simply becomes inaccessible altogether.
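Peukert's relation translates directly into code. A Python sketch, assuming k = 1.04 as discussed above (the function name is illustrative):

```python
# Peukert's relation: C2 = C1 * [ C1 / (I2 * T1) ]**(k - 1)

def peukert_capacity(c1_ah, t1_h, i2_a, k=1.04):
    """Available capacity (Ah) at discharge current i2_a (A), for a
    battery rated c1_ah (Ah) over a t1_h (hours) discharge."""
    return c1_ah * (c1_ah / (i2_a * t1_h)) ** (k - 1)

# 100Ah battery rated at 0.5C (2-hour discharge), discharged at C/20 = 5A:
peukert_capacity(100, 2, 5)   # ~109.6Ah, the 10% gain computed above
```

With k = 1.00 (an ideal battery), the function returns the rated capacity unchanged, as expected.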
Capacity is also quite sensitive to temperature effects. Lithium cells offer more capacity and higher performance at higher temperatures, including at excessive temperatures causing accelerated ageing. At freezing temperatures, the available capacity upon discharge shrinks quite significantly, but it is recovered once the cell warms up again.
This phenomenon highlights the fact that lithium ions become more and more difficult to extract from the graphite matrix of the anode as temperature drops and only relatively superficial charge carriers are available at low temperatures; the balance of the capacity effectively becomes “locked-in” out of reach. This loss of available capacity also translates into a lower discharge voltage, with the low voltage cut off point being reached earlier.
Accepting a lower low voltage cut-off threshold would be a way of regaining access to some of this locked-in capacity in sub-freezing conditions, but the matter is only of real interest for automotive applications.
| Temperature | Cut-off voltage = 2.5V | Cut-off voltage = 2.0V |
|---|---|---|
| 25°C | C = 100% | |
| 15°C | C = 98% | |
| 0°C | C = 90% | |
| -10°C | C = 74.5% | C = 87% |
| -20°C | C = 56% | C = 72% |
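The discharge capacity figures above (2.5V cut-off column) can be turned into a rough temperature correction. A Python sketch, where values between the tabulated points are linear interpolations, not measured data:

```python
# Available capacity fraction vs cell temperature, 2.5V cut-off,
# from the table above: (temperature in degrees C, capacity fraction).
CAP_VS_TEMP = [(-20, 0.56), (-10, 0.745), (0, 0.90), (15, 0.98), (25, 1.00)]

def capacity_fraction(temp_c):
    """Linearly interpolate available capacity between tabulated points,
    clamping outside the tabulated range."""
    pts = CAP_VS_TEMP
    if temp_c <= pts[0][0]:
        return pts[0][1]
    if temp_c >= pts[-1][0]:
        return pts[-1][1]
    for (t0, c0), (t1, c1) in zip(pts, pts[1:]):
        if t0 <= temp_c <= t1:
            return c0 + (c1 - c0) * (temp_c - t0) / (t1 - t0)

capacity_fraction(0)    # 0.90, straight from the table
capacity_fraction(30)   # 1.00, clamped at the warm end
```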
For all practical purposes on marine vessels, battery temperatures below freezing should be uncommon unless the water also freezes around the hull. Capacity reduction is then limited to about 10% only in the worst case, which should be negligible. While discharge at low temperature yields both reduced power and capacity, it is harmless to the cell. The same cannot be said of low temperature charging.
Cold Temperature Charging
Cold temperatures are known to be detrimental to the cells if they are exposed to charging. Cycling performance tests at varying temperatures showed the apparent existence of a threshold below which capacity fade with cycling suddenly accelerated. This threshold appeared to be above the temperature of 0°C often suggested as limit for recharging, but the data available was limited and the exact details of cell manufacture are likely to influence this value.
The intercalation of lithium ions into the graphite matrix of the anode becomes more difficult as well at low temperatures and lithium ions ejected out of the cathode and unable to soak into the anode instead plate its surface and edges; this lithium is then irreversibly lost.
This suggests that fast charging in particular becomes increasingly harmful to the cells as temperature drops.
A LiFePO4 cell has a rated nominal voltage of 3.2V. In practice, 3.2V is only reached when heavily discharged (or under significant load) and the normal operating voltage is about 3.3V. This implies that a 12V nominal lead-acid battery, made up of six cells in series for a total of about 12.7V in operation, can be substituted with four LiFePO4 cells, for a resulting voltage of about 13.2V.
On-board power from a lithium bank shows an improved and much more constant system voltage; most of the equipment runs noticeably better, from pumps to SSB transceivers. Lights don't dip when a load is turned on either, because the bank's low internal resistance translates into much less voltage sag.
The state of charge (SOC) of a lead-acid battery can normally be deduced from its voltage, but only as long as the battery has been at rest long enough for the reading to stabilise. Lead-acid batteries have significant internal resistance, especially when no longer in their prime, and drawing current from them immediately skews the reading to the downside.
Lithium batteries are similar, other than for their much lower internal resistance and a more complex relation between state of charge and voltage, which exhibits a prolonged flat region when the cells are in the 40% to 65% SOC range. Outside of this region, voltage readings do provide very useful indications of the state of charge.
The cell voltage also differs depending on whether the cell was being charged or discharged before the voltage was allowed to stabilise. In nearly all instances on board yachts, small loads quickly bring the voltage back in line with the discharge curve.
If this higher resting voltage following charging appears to dissipate very quickly, it is a tell-tale sign that the cells have been abused and suffered electrochemical damage.
At rest, or for low charge and discharge currents, the above plots are extremely useful for estimating the state of charge, even just by glancing at the voltmeter:
| Resting voltage | Condition | State of charge |
|---|---|---|
| 13.3V or more | Near full | Over 80% |
| Above 13.2V | Plenty of reserve | At least 70% |
| Below 13.15V | Getting on the low side | Less than 40% |
| Below 13.0V | Definitely getting low | Less than 25% |
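For a 4-cell, 12V-nominal bank, this at-rest voltage guide can be sketched as a simple lookup in Python. The wording for the middle band is an assumption based on the 40-65% flat region mentioned earlier; the thresholds come from the table above:

```python
# Rough state-of-charge hint from a stabilised, at-rest bank voltage
# (4 cells in series, 12V nominal LiFePO4 bank).

def soc_hint(volts):
    """Return a human-readable SOC hint for a resting voltage reading."""
    if volts >= 13.3:
        return "near full (over 80%)"
    if volts > 13.2:
        return "plenty of reserve (at least 70%)"
    if volts < 13.0:
        return "definitely getting low (less than 25%)"
    if volts < 13.15:
        return "getting on the low side (less than 40%)"
    return "mid-range (flat region, SOC ambiguous)"

soc_hint(13.35)   # "near full (over 80%)"
soc_hint(13.05)   # "getting on the low side (less than 40%)"
```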
The owners of installations cycling moderately who can refrain from making an automatic beeline to the nearest marine electrical retail store can be pleasantly surprised to discover that the addition of a battery monitor (often little more than a random number generator) to the system can be completely superfluous with lithium, as long as a simple voltmeter and a little knowledge are available.
Current and Power Efficiency
Lithium batteries in general are near 100% current-efficient: charging 1Ah yields a typical discharge of 0.997Ah at a similar current. This is vastly better than what lead-acid chemistry can achieve and often results in gains of 30-50% in charging efficiency when a lead-acid house bank is replaced by LiFePO4 cells on a yacht.
The net effect with solar arrays is as if the size of the array had suddenly become significantly larger and a change to a LiFePO4 bank can be a more sensible answer to energy issues than adding more panels or running an engine.
Power efficiency, on the other hand, sits around 95%, but varies with current: expend 100Wh charging and you will retrieve about 95Wh on discharge. The difference stems from the fact that the charging voltage needs to be a little higher than what is available afterwards during discharge.
In marine use, current efficiency is what matters, because finding a little more voltage is never an issue.
With regard to charging, lithium cells are both far simpler to charge than lead-acid cells and totally different from them. As a consequence, they should also be managed differently. Another important aspect is that recharging a fresh, new cell can be very different from, and much easier than, recharging a cell which has just seen a large number of partial charge and discharge cycles, due to memory effects which are discussed further below.
The most commonly documented charging regime for lithium cells is constant current, constant voltage (CC-CV). It is also one that is essentially never achieved with marine installations: on-board systems deliver variable current and limited voltage, mixed with partial charge/discharge cycles.
As a result, the only parameters that actually matter are the maximum voltage the battery is allowed to reach during charging and the way the charge is terminated, because those determine the outcome of the charging process.
What is Charging Voltage?
The charging voltage is simply the voltage at the battery terminals during charging. The battery user essentially has no control over this voltage for most of the charging process: the battery absorbs all the current provided and the voltage rises at its own pace as the state of charge increases.
The voltage can only be controlled, by reducing the charging current, once it starts to exceed a limit.
I remember once reading a senseless post about an alternator. The author was complaining that the regulator was “useless” because “it limited the voltage instead of charging at the desired setpoint”.
What this person didn’t understand is that the voltage reaches a value that depends on the state of charge of the battery and, with the alternator at full output already, there is nothing more the regulator can do until the voltage naturally rises enough to warrant limiting it.
The parameter the user has control over is the end-of-charge voltage: simply the voltage limit used by the charging system before the charge is terminated. Because of the higher internal resistance of lead-acid batteries, their charging voltage rises both earlier and a lot more rapidly than what is observed with lithium cells.
Lithium cells commonly charge at 3.4V or less for very long periods of time while soaking up full current and, when the voltage finally begins to increase, the battery is already significantly charged.
The Relation Between End-of-Charge Voltage and State of Charge
The relation between the end of charge voltage and the state of charge eventually achieved by a LFP cell can be explored by charging battery cells using a range of maximum voltage limits until the current has reduced down to a very small value each time before discharging them again to assess capacity.
Such an experiment was conducted by Powerstream in 2014 with four different brands of LiFePO4 cells of the same size, which were charged until the current had reduced down to about 0.013C. This is quite a low charge cut-off current and it must have resulted in extended absorption times.
I used their published experimental data to plot a more interesting graph showing the state of charge reached against the absorption voltage limit.
LFP cells barely charge at all at voltages up to 3.3V, yet charge fully at 3.4V and upwards. The transition is so abrupt that claiming to control the charging process by adjusting the voltage is purely and simply bound to fail.
Charging at reduced voltages, down to 3.4V/cell, only increases the absorption time and therefore the overall charging time; it achieves strictly nothing in terms of preventing the battery from becoming fully charged and then overcharged. It only takes longer to happen. Furthermore, low-voltage charging opens the door to severe longer-term performance issues which arise from memory effects in the cells.
Memory effects in LiFePO4 cells were discovered and studied by Sasaki et al. and the results published in Nature Materials in 2013. The authors showed that, under specific circumstances, the prior cycling history of a cell alters the voltage curve during charging by causing the voltage to increase faster and earlier than expected.
For a memory effect to appear, an incomplete charge cycle followed by a discharge must have taken place earlier (memory-writing cycle). In this case, an abnormal increase in voltage can be observed afterwards as the charging process approaches the point where charging had stopped earlier; this creates a bump in the charging curve. Partial charging of all common types of lithium cells (with the notable exception of lithium titanate oxide Li4Ti5O12) leaves the cell with divided lithium-rich and lithium-poor phases which persist during and after discharge. In order to erase the cell memory of the previous interrupted cycle(s), a full charge must be performed (memory-releasing cycle) and this requires overcoming the bump caused by past partial cycles.
The memory effect was found to strengthen with the number of incomplete charge cycles performed before the erase cycle. It was also strengthened when a partial charge was followed by a shallow discharge, rather than a deep discharge.
These latter aspects have proved to be of key significance for the longer-term performance of LiFePO4 batteries in house bank applications, because incomplete charge cycles are common when relying on renewable energy sources and shallow discharge cycles are also frequently experienced. Together, these have the potential to render battery banks near unusable after as little as 2-3 years in regular service in the absence of memory-releasing cycles. Ineffective memory-releasing cycles are very common in DIY installations where the charging process is not properly controlled, or is configured incorrectly out of fear of overcharging or due to widespread mythology.
An absence of memory-release cycles caused by ineffective charging allows the voltage bump caused by the memory effect to grow over time. If the absorption voltage and/or the absorption time are insufficient to overcome it, the charging process gradually terminates earlier and earlier. This has a compounding effect as memory-writing begins to occur at lower and lower values of SOC over time and the available capacity of the battery can disappear almost completely without any loss of lithium or chemical degradation as such. Recovering battery banks in this state can be challenging and require many memory-release charging cycles using high absorption voltages, followed by deep discharge. For these reasons, LiFePO4 batteries should be charged properly whenever the opportunity arises, so the effects of unavoidable previous partial cycles can be wiped out while it is still relatively easy to do so. This calls for a robust absorption voltage and a charging strategy providing adequate charge absorption. Anything else falling short of this will eventually result in significant performance and capacity issues.
While we showed earlier that voltages as low as 3.4V/cell were able to fully charge and even overcharge a LFP cell, this must now also be considered in the context of memory effects altering the charging curve of the cells. My experience so far has been that any termination voltage below at least 3.5V/cell should be considered as inadequate if the installation experiences incomplete charge cycles. Any charging system that is unable to provide an adequate absorption down to at least C/20 or less when required should also be considered as unfit for purpose, because it will fail to deliver charge cycles capable of erasing the cell memory.
Overcharging
Overcharging means applying a charging voltage to an already fully charged battery. As we just highlighted, given enough time, lithium batteries always fully charge at 3.4V/cell or above, so any voltage from 3.4V up can most definitely overcharge and damage a lithium battery.
How quickly this happens certainly depends on how high this voltage is, but – unlike what is observed with lead-acid chemistry – there is no such thing as a safe charging voltage that can be maintained continuously with lithium cells. All charge cycles must end when or before the battery becomes full.
A lead-acid battery benefits from what is known as a shuttle reaction, which does (within reason) allow excess energy to be absorbed and dissipated. This mechanism is not present in lithium batteries and it makes them very intolerant to overcharging.
A lithium battery that is being held at an elevated voltage with zero current flowing in has been overcharged and is getting damaged. This situation commonly arises with many marine charge controllers, including and especially some supposedly designed for lithium banks.
The “lithium” versions of the Genasun GV-5 and GV-10 MPPT solar charge controllers are prime examples of this, as they maintain 14.2V on the battery indefinitely (based on units inspected in 2015).
Since absorption voltage can’t practically be used to limit charging, it becomes a matter of determining when to stop. Charge termination ideally needs to occur before the battery is completely full, because most of the stress on the battery happens when it runs out of lithium to transfer, or when it can’t transfer lithium ions fast enough, such as when the charge rate is very high and the voltage is allowed to rise excessively.
The tell-tale sign of a fully charged (or overcharged) battery is that it is no longer capable of absorbing any significant current.
If charging at very low currents, such as 0.05C, where internal resistance doesn’t meaningfully skew the voltage reading, termination can be implemented based on a voltage threshold on the basis that the current is then known to be low. A small solar system charging a sizable bank can fall in this category. In this case, charging must stop when the target voltage is reached and not resume until the voltage has dropped to a level indicating that the battery can and needs to be recharged again.
At higher currents, this strategy would err on the safe side by leaving an undercharged battery, but it is unsatisfactory, because charge absorption is still essential with lithium cells in order to erase the memory from previous partial cycles and make a good use of the capacity installed.
Schemes involving a timed absorption period perform an approximate charge termination only. If the battery requires bulk charging and the duration of the absorption period has been determined wisely, a good charge cycle may result. If the battery is already full when charging begins, it will invariably suffer throughout the undesirable absorption phase; using a lower absorption voltage limits the stress placed on the cells, but fails to properly address the issue, increases the overall charging time and opens the door to long-term capacity problems resulting from memory effects.
Nearly all so-called “smart” alternator controllers typically implement a time-based absorption strategy to provide a charge termination that is anything but smart… any charge termination is still a lot better than none however.
Absorption times with lithium iron phosphate batteries should typically not exceed 30-35 minutes in most situations, and much less if the battery is being charged at low current. If a time-based termination is going to be implemented, then the absorption time should be determined experimentally by monitoring the current taper.
Optimal Charge Termination
In all instances where significant charging currents are present, achieving proper termination requires monitoring both current and voltage to make an informed decision.
The voltage must be up at the absorption setpoint while the current is down at the charge termination limit; this indicates that the ability of the battery to absorb further charge is near its end. The final state of charge achieved depends on the combination of maximum voltage and minimum current, but changing the termination current is the only reliable way of altering the state of charge obtained and the voltage must always be sufficient to ensure memory effects from previous partial cycles can be overcome.
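The current-and-voltage termination test described above can be sketched in a few lines of Python. The setpoints here are assumptions drawn from this article (absorption at 3.55V/cell, termination once the current tapers to C/30); the function name is illustrative:

```python
# Charge termination decision: the voltage must be up at the absorption
# setpoint AND the current must have tapered down to the termination limit.

def should_terminate(cell_volts, current_a, capacity_ah,
                     absorb_v=3.55, term_fraction=1 / 30):
    """True when every cell has reached the absorption setpoint and the
    charging current has tapered below the termination threshold."""
    at_absorption = min(cell_volts) >= absorb_v
    tapered = current_a <= capacity_ah * term_fraction
    return at_absorption and tapered

# 400Ah bank absorbing at ~3.56V/cell with 25A still flowing: keep charging.
should_terminate([3.56, 3.55, 3.57, 3.56], 25, 400)   # False, current too high

# Same bank once the current has tapered to 12A (below 400/30 ~ 13.3A):
should_terminate([3.56, 3.55, 3.57, 3.56], 12, 400)   # True, terminate
```

Note that the decision deliberately requires both conditions: a low current at a low voltage simply means the charge source is weak, not that the battery is full.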
Charging equipment intended for lead-acid batteries is hardly ever able to perform a proper charge termination: overcharging lead-acid cells (with the exception of gel cells) is acceptable to some extent, no real safety considerations arise and the batteries are relatively inexpensive. The functionality required is simply not present, and the addition of the word “lithium” in the product brochure typically does exactly nothing to remedy this situation. While battery voltage is always available, battery current is either not measured or the information is not exploited by the equipment. For this reason, the only place where charge termination can realistically be determined in a lithium battery system is at the BMS, and the BMS should supervise the charging process.
References

[1] CALB CA180FI and Sinopoly LFP200AHA cell datasheets.
[2] D. Doerffel and S.A. Sharkh, “A critical review of using the Peukert equation for determining the remaining capacity of lead-acid and lithium-ion batteries”, Journal of Power Sources, 155 (2006), 395-400.
[3] N. Omar, P. Van den Bossche, T. Coosemans and J. Van Mierlo, “Peukert Revisited—Critical Appraisal and Need for Modification for Lithium-Ion Batteries”, Energies 2013, 6, 5625-5641; doi:10.3390/en6115625.
[4] L. Lu, “LiFePO4 battery performance testing and analysis for BMS”, Department of Automotive Engineering, Tsinghua University (2011).
[5] T. Sasaki, Y. Ukyo and P. Novák, “Memory effect in a lithium-ion battery”, Nature Materials, Vol. 12, June 2013; doi:10.1038/nmat3623.