21 February 2021
 
Charging Marine Lithium Battery Banks

Last Updated on 26 February 2021 by Eric Bretscher

This article is part of a series dealing with building best-in-class lithium battery systems from bare cells, primarily for marine use, but a lot of this material finds relevance for low-voltage off-grid systems as well.

Lithium iron phosphate (LiFePO4) battery banks are quite different from lead-acid batteries and this is most apparent when it comes to charging them. Lithium battery banks charge much more easily and overcharge just as easily. They degrade gradually when kept full for extended periods and can develop memory issues when cycled inadequately.

On the other hand, lead-acid batteries resist charging, are tolerant of – and even require – a degree of overcharging, and degrade rapidly when not fully charged regularly. This has given rise to a range of technology to meet these needs: it delivers aggressive charging, always errs on the side of overcharging and tries to keep batteries full. Using this lead-acid charging technology on lithium cells certainly charges the battery, but it also damages it; in other words, it doesn’t and can’t actually work properly. No amount of searching for the Holy Grail of Settings can offset inadequate charging system design or the use of inadequate equipment.


Disclaimer

A good understanding of DC electrical systems is needed to build and commission a lithium battery installation. This article is aimed at guiding the process, but it is not a simple blind recipe for anyone to follow.

The information provided here is hopefully thorough and extensive. It reflects the knowledge I have accumulated building some of these systems. There is no guarantee that it will not change or grow over time. It is certainly not sufficient or intended to turn a novice into an electrical engineer either. You are welcome to use it to build a system, but at your own risk and responsibility.


Lithium Battery Charging

At a glance, a lithium battery charges much like a lead-acid one: its voltage rises as it absorbs current until reaching a limit that must be respected. This means it follows the well-known bulk and absorption pattern of lead-acid batteries. Because lithium batteries accept charge much more readily, they reach a higher state of charge before absorption begins; no forced charging, “boost” or other gimmicks are required or even desirable, and the absorption phase is comparatively short. Its duration depends on the absorption voltage, the charge rate, the cycling history, and the age and condition of the cells. At low charge rates, the absorption time can amount to very little or nothing; when charging at high currents, such as with powerful alternators or chargers, absorption is very significant. It also becomes more significant as the cells age and their internal resistance increases. The consequence is that it is impossible to pin down how long the absorption stage should last. There is no correct setting.

A typical charging cycle for a lithium battery on a marine vessel can be summed up as follows: the battery is charged with whatever current can be produced, with an upper limit on the battery voltage. The exact absorption voltage limit isn’t critical, because of the inherent trade-off between absorption time and absorption voltage. As previously illustrated, given sufficient time, any voltage from 3.40V/cell up will eventually fully charge and then overcharge a lithium iron phosphate battery, while lower voltages will simply fail to charge it fully. As a result, voltage cannot be used as a means to control the outcome of charging, and the absorption voltage alone has no bearing whatsoever on the final state of charge reached. Absorption voltages above about 3.55V/cell quickly exacerbate small differences in cell balance and become impractical to operate at. In my experience, 3.50V/cell has been a very good conservative charging voltage for LiFePO4 battery cells; it is just high enough to perform some automatic cell balance corrections when needed and also sufficient to properly recharge cells that haven’t seen a full charge in a long time, or cells no longer in their prime.

Charging must stop when the absorption current falls below the termination current threshold, because this means the cells’ ability to absorb current has reduced to the point where they must be considered full. Any further charging constitutes overcharging and leads to a point where the cell completely runs out of free lithium and no current can flow any more. The battery current is therefore a critical piece of information and it must be known and used to control charging.

While lithium battery cells are capable of charging extremely fast and can absorb current at rates of 1C and more during the bulk stage, this is not desirable and can progressively cause irreversible damage, as developed here. Newer generations of LiFePO4 cells are rated for regular charging at up to 0.5C and older ones usually 0.3C; in other words, even for modern cells a full charge can’t be achieved sustainably in less than about 2.5 hours once absorption is taken into account. This only becomes a concern when the capacity of chargers or alternators is very significant in relation to the size of the bank, but it also means that ridiculously large alternators are not as usable as some hope on small vessels. This can prompt using current-limiting chargers or regulators in some cases. Current acceptance capability also reduces with temperature, usually once outside the window of 5°C to 55°C (40°F to 130°F).
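To make the arithmetic concrete, here is a minimal sketch in Python of checking a charging source against the cell rating and estimating the minimum realistic full-charge time. The bank size, alternator output and absorption allowance are illustrative assumptions, not recommendations:

    # Minimal sketch of the charge rate arithmetic above; figures are
    # illustrative assumptions only.

    def max_charge_current(capacity_ah, c_rating=0.5):
        """Maximum sustained charging current (A) for the bank.
        c_rating: ~0.5C for newer LiFePO4 cells, ~0.3C for older ones."""
        return capacity_ah * c_rating

    def min_full_charge_hours(c_rating=0.5, absorption_hours=0.5):
        """Rough minimum full-charge time: bulk at the rated current plus
        an allowance for absorption (assumed; varies with age and rate)."""
        return 1.0 / c_rating + absorption_hours

    bank_ah = 200.0        # hypothetical 200Ah house bank
    alternator_a = 120.0   # hypothetical alternator output

    limit_a = max_charge_current(bank_ah)          # 100A at 0.5C
    if alternator_a > limit_a:
        print(f"Limit the charging current to {limit_a:.0f}A")
    print(f"Minimum sustainable full charge: ~{min_full_charge_hours():.1f} h")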

Charge Termination Condition

The charge termination parameters are specified by the cell manufacturer as a pair of values for cell voltage and residual charging current: typically 3.65V and C/30 = 0.033C (C/20 = 0.05C for some of the newer generation cells). This means that if the charging voltage is limited to 3.65V, then charging must end when the charging current has reduced to the specified level; the cell is then said to be “full”. However, we are not after the fastest possible charge rate, but rather long battery life, and we don’t need or want to charge to 3.65V/cell, so we need to interpolate the termination condition down for lower voltages. A fully charged LiFePO4 cell at rest has an Open-Circuit Voltage (OCV) of about 3.37V, so the termination current at that voltage would be zero. Since a charging cell at a given SoC can essentially be seen as a voltage source in series with a resistive element, we can easily calculate the residual termination current for intermediate absorption voltages up to 3.65V. These are presented in the table below for both older and recent cells:

Manufacturer-specified termination condition: 3.65V @ 0.033C (older cells) or 3.65V @ 0.05C (newer cells)

Cell voltage (V)   Termination current (C)   Termination current (C)
                   3.65V @ 0.033C cells      3.65V @ 0.05C cells
3.370              0.000                     0.000
3.400              0.004                     0.005
3.425              0.006                     0.010
3.450              0.009                     0.014
3.475              0.012                     0.019
3.500              0.015                     0.023
3.525              0.018                     0.028
3.550              0.021                     0.032
3.575              0.024                     0.037
3.600              0.027                     0.041
3.625              0.030                     0.046
3.650              0.033                     0.050

Example:

Charging an older generation cell up to 3.500V, we need to terminate the charge when the current is at most 0.015C. When there is no guarantee that the charge current will be at least as high as the termination current at the chosen absorption voltage, adaptive termination still works: if we were charging the same cell with solar power on a dull day and the current was only 0.012C, the above table shows that the termination condition should be deemed to be hit when the cell voltage reaches 3.475V.
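For illustration, the interpolation behind this table and the adaptive termination example can be written down in a few lines. This is a sketch only; the constant names are mine, and the 3.37V full-cell OCV and the manufacturer endpoints are the figures quoted above:

    OCV_FULL = 3.37    # resting voltage of a fully charged LiFePO4 cell
    V_SPEC = 3.65      # manufacturer-specified termination voltage

    def termination_current(v_absorb, i_spec=0.033):
        """Residual termination current (in C) at a given absorption voltage,
        treating the charging cell as a voltage source in series with a
        resistive element (linear interpolation between 3.37V and 3.65V)."""
        return i_spec * (v_absorb - OCV_FULL) / (V_SPEC - OCV_FULL)

    def termination_voltage(i_charge, i_spec=0.033):
        """Adaptive termination: the cell voltage at which a given (weak)
        charging current already meets the termination condition."""
        return OCV_FULL + (V_SPEC - OCV_FULL) * i_charge / i_spec

    print(f"{termination_current(3.500):.3f}C")  # 0.015C at 3.500V, older cells
    print(f"{termination_voltage(0.012):.3f}V")  # ~3.47V for a 0.012C solar day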

Terminating the charge at a higher residual current simply equates to stopping the charge short of 100% SOC; this is commonly done as well, as it is perceived to be easier on the cells, but the effective difference only amounts to a few minutes of charging in most cases, and only a tiny fraction of capacity. Lithium batteries just charge too easily.

Lithium Iron Phosphate Cell Charging Diagram

The green shaded area depicts the charging envelope of a LiFePO4 cell. Below 2.000V, the cell is not rechargeable as its chemistry becomes damaged. Once the voltage rises and the current drops to the point where the termination limit is reached, the cell must be deemed fully charged and charging must stop. If the charging process is allowed to progress into the lower right corner, the cell is being overcharged. The upper and right edges correspond to increasingly aggressive charging regimes, forcing current and/or voltage.

Battery Cycling Management

Once a lithium battery has been charged, not only must charging stop, but the battery should be allowed to discharge meaningfully before being recharged again. Charge controllers designed for the lead-acid chemistry implement algorithms that recharge periodically whether the battery needs it or not, because this strategy delivers far more benefits than drawbacks with cells that sulfate and deteriorate as soon as they are not kept full. Solar charge controllers restart charging in bulk every morning and there is typically nothing the user can do about it; it is not a configurable setting. Alternator regulators will restart a new cycle every time they are turned on, even if the battery is full. Mains-powered chargers blindly initiate new charging cycles periodically to make sure the battery stays full. Unfortunately, this kind of treatment is very detrimental to lithium cells, especially when associated with high-availability energy sources like solar, and it becomes disastrous for unused installations with little or no load at all on the battery. A lithium battery that is not in use should be at a low State of Charge (SOC) and able to spend months or more without being charged at all.

Lastly, partial charging of a lithium battery should not be followed by a period of rest, especially if this happens repeatedly at the same point on the charge curve, because this constitutes a memory writing cycle. Incomplete charging cycles are very common in marine house bank applications, simply because the energy runs out before the bank is full; these are of no concern because discharge normally follows right away and charging ends randomly. On the other hand, systematic weak charging followed by a holding period, as easily occurs when charging systems are misconfigured, gradually leads to near-complete loss of usable capacity through cumulative memory effect. Conversely, the battery must be charged properly from time to time to reset the state of the chemistry and erase any traces left by memory writing cycles.

House Banks and the Charge Termination Problem

Let’s consider a simple application like a battery-powered tool: its battery is either being charged, or the tool is being used and the battery is being discharged; the tool is not used while the battery is also connected to the charger. Similarly, we can’t plug an electric vehicle into the mains and drive it around town. This is not true in marine house bank applications: a lot of the time we simultaneously produce and consume energy and the battery acts as a buffer; it can either be charging or discharging at any time. Charging sources often supply current into loads instead of charging the battery. This is not a simple application and it creates a more complex operating context for the charging equipment.

Tool or EV-type application

In the case of a battery-powered tool or electric vehicle, the battery is either connected to one charger and not in use, or it is in use and not charging. In this case, the output current of the charger equals the battery current and a good charger can terminate the charge when the battery is full.

In the case of a power tool or EV, there is one charger and the total charging current (i.e. battery current) is equal to the charger output current:

IBat = ICharger

The charger can regulate its output voltage and measure its output current and it has all the information it requires to terminate the charge when the battery is full.

Typical lead-acid charging configuration with multiple charging sources and loads

While this diagram represents the typical installation, including a lead-acid battery, multiple chargers and loads, such a system is totally incapable of charging the battery correctly. None of the chargers can determine when the battery is full and charging should stop, because the battery current is an unknown quantity.

In the case of a marine house bank, there are often several chargers and loads all operating simultaneously; the battery current is the difference between the sum of all the charging currents and the sum of all the load currents:

IBat = ∑ ICharger – ∑ ILoad

We can immediately see that measuring the output current of any charger (or even all the chargers) yields no usable information whatsoever for controlling charging, because loads can rob some of this current. Yet this is what regular lead-acid charge controllers do when they measure their own output current. Many, like alternator regulators, don’t even do that and blindly apply fixed or “cooked up” absorption times; the algorithm has no idea of the battery current. This kind of gear is completely inadequate in itself for charging lithium batteries, regardless of what the manufacturer claims. There are no correct charge control settings for it. The only practical way of knowing whether the battery is charging or discharging, and whether the termination current threshold has been reached, is directly measuring the battery current IBat using a dedicated sensor at the battery itself. Because knowing the battery current is so essential for charging lithium cells without damaging them, there are only two valid system topologies to control charging correctly:

  1. Each charge controller must have a dedicated input (like a shunt input) for sensing the battery current and terminate charging based on a residual current condition; or
  2. Each charge controller must be enslaved to a “master” that controls the charging process by measuring the battery voltage and current and tells the controllers when to stop based on a residual current condition. This master is typically the Battery Management System (BMS).
Battery charging with external current sensing

If all the battery chargers are capable of measuring the battery current and of performing correct charge termination on their own, then a capable distributed charging system can be built.

We will note here that a current measurement shunt can be shared by multiple measuring devices without issues, because it is an extremely low-impedance voltage source.
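As a sketch of what such a termination algorithm amounts to, here it is for a hypothetical 12V (4-cell) bank charged at 3.50V/cell; the reader functions are placeholders for the actual shunt and voltage measurement hardware:

    BANK_AH = 200.0                   # hypothetical bank capacity
    V_ABSORB = 4 * 3.50               # 14.0V for a 4-cell (12V) bank
    I_TERMINATE = 0.015 * BANK_AH     # 3A, from the table at 3.50V/cell

    def charge_terminated(read_battery_voltage, read_battery_current):
        """True once the residual battery current condition is met.
        The current MUST come from a shunt at the battery itself:
        positive = charging, negative = discharging."""
        v_bat = read_battery_voltage()
        i_bat = read_battery_current()
        at_absorption = v_bat >= V_ABSORB - 0.05   # regulation tolerance
        return at_absorption and 0.0 <= i_bat <= I_TERMINATE

    # Example with fixed readings: 14.0V on the bus, 2.5A into the battery
    print(charge_terminated(lambda: 14.0, lambda: 2.5))   # True -> stop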

Lithium Battery Charge Control

From the above, we can see that obtaining correct charge termination imposes very strong constraints on equipment selection: either the charge controller must be equipped with an external current sensing input and implement residual current termination, or it must be controllable externally by a “master” using at least some kind of “remote enable” signal, and such a master must exist in the system and it must have the required charge control capability.

Battery charging under BMS control

A battery management system (BMS) measures both the battery voltage and battery current to determine the state of the battery. There are no “chargers” any more. The charging process is supervised by the BMS, which ensures that correct charge termination takes place. The BMS controls voltage regulators and those ensure that the battery voltage doesn’t exceed the required value.

In most situations, it is necessary to use a BMS that offers one or more charge control outputs, simply because charging equipment using external current sensing and a correct charge termination algorithm is not available or cannot be sourced. Using a charge control output also has other benefits: the BMS can disable charging ahead of disconnecting the chargers from the battery in case of a problem, an essential aspect discussed under the subject of electrical design for lithium battery systems. The ability to turn charging sources off arbitrarily also makes it possible to prevent charging when it is unnecessary or undesirable, such as when the installation is not being used, and this prevents the battery from being abused by being held at a high state of charge indefinitely. It also allows the BMS to pause charging while re-balancing the cells if necessary: if the cells happen to have drifted too far out of balance between two full charges, there comes a point where the balancing circuit can’t handle enough current to keep up when finally recharging. If pausing charging can’t be achieved, the system is left with no other option besides tripping on a cell high voltage condition.

In some situations, we don’t need or want to charge the battery, but we would like to take advantage of available free renewable energy, because we know we will be making use of the stored energy later. Powering loads without meaningfully charging or discharging the cells is achievable by lowering the “charging” voltage below 3.37V/cell for LiFePO4 chemistry, which is the resting voltage of a fully charged cell. Because keeping cells full is detrimental and they should be allowed to discharge to some extent after a charging cycle, a reasonable practical voltage for supplying loads while preserving reserve capacity for short periods of time is 3.325V/cell, which equates to 13.3V and 26.6V for typical 12VDC and 24VDC systems respectively. This however requires an additional degree of charge control as we now either want to charge, hold, or let the battery discharge.

Whenever reserve capacity is not required, the bank should be kept at a very low state of charge and the loads can be powered by supplying current into the installation at a voltage corresponding to that low state of charge, like 3.2V/cell. This situation is encountered when shore power is available continuously and a mains “charger” is used. In this case, charging is the last thing we want and the equipment is best configured to operate as a constant-voltage DC power supply. Some chargers can be persuaded to supply a desired constant output voltage by configuring identical absorption and “float” voltages, and the better ones actually offer a constant-voltage power supply mode. Others are simply intractable and must be thrown out. Another circumstance where capacity is not needed is when the vessel is laid up; a BMS can then maintain a very low state of charge in the bank through charge control.
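The per-cell setpoints for these regimes can be summarised in a small sketch; the values are the ones discussed above, the mode names are mine:

    CELLS = 4                 # typical 12VDC bank; use 8 for 24VDC

    SETPOINTS_PER_CELL = {
        "charge":  3.500,     # conservative absorption voltage
        "hold":    3.325,     # power the loads, preserve reserve capacity
        "storage": 3.200,     # keep an unused bank at a low state of charge
    }

    for mode, v_cell in SETPOINTS_PER_CELL.items():
        print(f"{mode:>8}: {v_cell * CELLS:.1f}V")
    #  charge: 14.0V,  hold: 13.3V,  storage: 12.8V for a 12V system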

This sums up the necessary and sometimes desirable ways of managing the charging process for a lithium house bank on board. The other aspect of battery management is deciding whether the battery should be charged or not, and/or how much; such strategies come with a view of preserving and extending battery life. Fully-engineered commercial solutions simply ignore this and charge the battery to full whenever possible. This maximises reserve capacity (which the end-user notices and appreciates), reduces battery life expectancy (which the end-user only discovers too late down the track) and eventually brings in more business, because it means re-purchasing the proprietary battery with its integrated BMS. This is why the very high cost of these systems tends to translate more into superior performance while they last than into actual value over their lifetime.

A wise system designer will ensure there are ways of keeping the cells at a low state of charge when capacity is not needed, and will charge the battery wisely. When energy availability is plentiful, there usually is no need to recharge to full, or the battery can be allowed to cycle much more deeply between full charges, which reduces the frequency of cycling and the time spent at a high state of charge. A battery that spends its life only as charged as it needs to be, rather than full, will last considerably longer.

Overcharging, Power Quality and Cell Destruction from Charging

The most common misconception about charging lithium batteries is believing that the State of Charge, and by extension overcharging, have anything to do with voltage. They don’t. A cell is being overcharged once its free lithium is becoming depleted. The telltale that the cells are becoming full and charging must stop is reduced current acceptance, which is why using battery current for charge termination is mandatory. When a battery is being overcharged, its ability to absorb current trends towards zero and its apparent resistance to charging becomes increasingly high. The result is that the battery can no longer clamp the voltage down if it spikes. This has catastrophic consequences when the charger output is not well filtered, because an overcharged battery becomes increasingly exposed to the peak ripple voltages. Eventually, the battery can no longer absorb any current at all and it is exposed to the full fluctuations in the supply voltage. If these exceed about 4.20V/cell for the LiFePO4 chemistry, the electrolyte is broken down into gaseous products and pressure starts to build up in the cells.

We sometimes see claims that charging at 3.60 or 3.65V/cell ruined a bank and caused the cells to swell, but a smooth DC voltage at that level is insufficient to decompose the electrolyte. The problem comes from overcharging and poor power quality.

The worst ripple voltage is produced by solar PWM charge controllers, followed by old-style transformer/rectifier battery chargers, which should not be associated with lithium batteries. In the case of a solar PWM charge controller, the solar array is connected and disconnected from the battery at a fixed frequency. The open-circuit voltage of a solar array charging a battery in a 12VDC installation typically reaches up to about 22V (36-cell panel). Once the battery can no longer accept enough current to keep the voltage down, every time the controller sends a pulse to the battery, the cell voltages are gradually driven towards 22 / 4 = 5.5V. If the pulse voltage reaches 4.2V, there is sufficient energy for the electrolyte decomposition reaction to take place and the cells get rapidly destroyed, even if the average battery voltage as measured by a multimeter appears acceptable.
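The arithmetic of this failure mode is worth spelling out; a back-of-the-envelope sketch using the figures from the text:

    ARRAY_VOC = 22.0   # open-circuit voltage of a 36-cell panel (12V system)
    CELLS = 4

    # Once the full battery can no longer clamp the PWM pulses, each cell is
    # driven towards its share of the array open-circuit voltage:
    peak_cell_v = ARRAY_VOC / CELLS
    print(f"Worst-case pulse voltage: {peak_cell_v:.1f}V/cell")   # 5.5V/cell

    if peak_cell_v > 4.20:   # approximate electrolyte breakdown threshold
        print("Electrolyte decomposition possible, even if the average "
              "battery voltage on a multimeter looks acceptable")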

LiFePO4 cells destroyed by overcharging. The pressure in the cell casings was sufficient to push the cells apart.

Some of the power sources we use for charging batteries, like alternators, do not produce clean, filtered DC power. As long as the peak ripple voltage can’t reach dangerous levels and correct charge termination always takes place, the situation is acceptable.

System Architecture and Topology

In a typical small-scale marine lead-acid battery system, multiple charge controllers follow independent algorithms and charge the battery in an approximate way while erring on the “safe” side for achieving battery life, which means overcharging it mildly. Overcharging, which causes gassing and recombination, stirs up the electrolyte and promotes voltage equalisation across the cells. The result is a distributed charge control architecture which provides at best a roughly acceptable outcome. When it doesn’t, the battery gets damaged and the blame simply goes to whoever configured the myriad of “charge control settings” available.

In a lithium battery system, overcharging cannot be allowed to happen because it damages the cells, so correct and accurate charge termination is needed. This requires an algorithm with knowledge of both battery voltage and battery current. As common charge controllers are incapable of performing this function, charge control must be implemented by the BMS. This leads to a different charge control architecture where the BMS is the only true charge controller and the slave devices only perform a voltage regulation (and sometimes power conversion) function: they really are just voltage regulators in this context, because their only function is limiting the output voltage and waiting for the BMS to signal the end of the charge.

As a result, the architecture and topology of a lithium charging system is very different from what is typically found in lead-acid systems. However, if lead-acid batteries were charged properly and with all due care, as they are in large stationary installations where the battery cells represent a very large investment that must last, the charge controller(s) would also feature an input for sensing the battery current, a lot of obscure programmable “charge control settings” would become pointless, and this difference would not exist. It is the result of the industry selling easy-to-install garbage equipment into a DIY marine/RV market, and most of the difficulties with building good lithium battery solutions on board trace back to having to try and integrate garbage consumer-grade products. This situation has only been improving very, very slowly over many years and is still far from satisfactory.

The problems arise at the interface between the central BMS and the voltage regulators:

  1. Ideally, the BMS should be able to transmit the desired voltage setpoint to all the regulators. A lack of standardisation, compatible communication interfaces and product capability makes this impossible; only one-brand, proprietary systems can achieve it. Al Thomason advocated using the defunct RV-C standard over CANbus to control distributed charging systems in an open architecture and it would have merit. Victron Energy published some details about its communication protocols and interfaces, which is very commendable, but there is a lack of commonality across products and, while remote configuration would be possible over a VE.Direct port, that information has not been released; VE.Direct is also a point-to-point data link between a master and a slave, so the devices are not addressable.
  2. The next option is accepting a lower degree of control and flexibility and configuring the regulation voltages at regulator level; this we can generally do using programmable devices, but it leaves the matter of enabling and disabling the regulator. Some regulators feature a digital “enable” input: use it! When they don’t, problems grow, because the “charger” must either be disconnected from the battery (when feasible without risking damage) or have its power feed interrupted in order to disable it. Neither option is very attractive, as both require interrupting a high-power path using the likes of solid-state relays, but sometimes it is possible and there is no other way.
  3. Short of being able to transmit the voltage regulation setpoint to the regulators, we would still like to be able to control whether they should charge or hold, i.e. supply current to the loads without charging. As most programmable charge controllers have voltage setpoints for absorption and “float”, we could achieve this quite simply using the “float” voltage setting if there were a way to force them into “float”. Unfortunately, there generally isn’t. They go into float when they feel like it, and the victim installer is left to play the Game of Settings, trying to approximate a desired outcome without ever getting there reliably.

Besides being externally controllable by a BMS, the charging sources must also be able to cope with a battery disconnect event. A controllable charger will normally be disabled by the BMS ahead of such an event and this normally takes care of most issues. Some voltage regulators however resist just about all attempts at integration in lithium battery systems. Wind generators are notorious for this, as well as for sloppy, horrible voltage regulation and surging. Any disconnection under load usually destroys them and they must see a battery at all times in order to operate, which unfortunately defeats the strategies discussed here. They are among the worst charging devices to integrate safely and properly with lithium battery systems.

The Losing “Game of Settings”

Most DIY lithium installations fail to terminate charging correctly because their design and the hardware employed make them incapable of doing so. They hold together by relying on a precarious balance between “charging parameters” and energy consumption. A lot of the time, the battery gets overcharged to various degrees, sometimes every day. Any meaningful change in the operating conditions of the installation throws the balance out: in the absence of consumption, the battery gets slammed to 100% SOC; start the engine when the battery is already charged up and it gets abused by the alternator.

Solar charge controllers are notorious for overcharging lithium batteries. First, they initiate a new charging cycle every morning as the light comes up, so in the absence of sufficient overnight consumption, the bank cannot cycle properly. Configuring a “float” voltage low enough, as discussed earlier, does create some kind of charge termination, for sure… after a fixed absorption time, which is wrong most of the time. Many models won’t allow absorption times short enough to even be hopefully realistic, so they overcharge every time. Some units offer optional termination based on the residual (or “tail”) current as an improvement, but as they measure their own output current, they can’t tell whether the current is going to the battery or to a load… This only works if consumption is zero. Let’s say that the average background consumption on board at anchor is 2A and charging should terminate when the battery charge current reduces to 6A. We configure a tail current setting of 8A to account for this and terminate the charge, and now it works… until the background load is suddenly higher, because the vessel is under way and a whole lot of instruments are on all the time, and from there on the cells get overcharged because the “controller” is fooled by the extra load.
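A few lines make it obvious why this fails; the numbers are the ones from the example above:

    TAIL_SETTING_A = 8.0      # charger-side "tail current" setting
    I_BATTERY_FULL_A = 6.0    # correct residual battery current for the bank

    def charger_output(i_battery, i_loads):
        """A charger measuring its own output sees battery current + loads."""
        return i_battery + i_loads

    # At anchor, 2A background load: 6 + 2 = 8A, termination happens (by luck)
    print(charger_output(I_BATTERY_FULL_A, 2.0) <= TAIL_SETTING_A)    # True

    # Under way, 10A of instruments: 6 + 10 = 16A, the charger never sees
    # the tail condition and keeps overcharging the full battery
    print(charger_output(I_BATTERY_FULL_A, 10.0) <= TAIL_SETTING_A)   # False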

The Game of Settings doesn’t work. These strategies are not solutions, only hopeful attempts at damage minimisation.

A compounding factor is that people who want to install a lithium battery bank also want to reuse the gadgetry they already own. Reprogramming a typical lead-acid charging system differently, when this is possible at all, doesn’t lead to any solution, because it is conceptually inadequate for charging a lithium bank: it can’t possibly operate the way it should.

Approximating the Solution

One of the reasons why this article has been a very long time in the making is that the only way to build a truly acceptable lithium charging system for a marine vessel was (and still is, to a very large extent) by using custom-built electronics. In the past few years, this state of affairs has started to evolve very, very slowly, but it is far from satisfactory. It leads people towards trying to approximate the solution, and it nearly always falls short of the mark.

Considering that the biggest issue is the absence of correct charge termination, one approach can be giving up altogether on absorbing the battery when it is not possible to do it properly. If:

  1. at least one charging source is capable of performing a full charge with correct termination based on residual battery current every time; and
  2. this source (which can be the engine alternator for example) is used occasionally; and
  3. all the other charging sources are programmable and can be configured to skip absorption, so they just switch to “float” immediately when the absorption voltage gets hit; and
  4. there is nearly always a load on the installation, so the bank doesn’t get partly charged and then rested; and
  5. the “float” voltage can be configured low enough to ensure the battery will nearly always discharge; and
  6. these sources do not restart charging before the bank has been able to discharge meaningfully,

then all the “bad” chargers will perform partial charges only, without leaving any memory in the cells because of the immediate subsequent discharge, and the one “good” charger will reset the cell chemistry and erase any trace of memory if needed, as well as allow cell balancing to operate during absorption, whenever it is used. Such a system can hold together without tripping on cell high voltage and offer good battery life, provided its user has also ensured that the depth of cycling is sufficient by matching charging capacity and battery size to the average consumption, as illustrated below. This means coupling relatively small battery banks to good charging capacity. If the average consumption reduces too much, this fragile equilibrium suffers, and human supervision and intervention become essential at such times.
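As a rough check of this matching exercise (all figures illustrative):

    bank_ah = 200.0                  # hypothetical bank capacity
    daily_consumption_ah = 60.0      # hypothetical average consumption
    days_between_full_charges = 2

    depth = daily_consumption_ah * days_between_full_charges / bank_ah
    print(f"Depth of cycle between full charges: {depth:.0%}")    # 60%
    if depth < 0.2:
        print("Bank oversized or charging too frequent: the cells would "
              "sit nearly full and cycle too shallowly")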

At this point, the hopeful system builder will discover that most charging sources cannot be prevented from restarting charging whether the battery needs it or not, and many cannot be controlled externally either; in other words, they simply constitute lithium battery overchargers. Most DIY systems in operation today use overchargers, and the amount of damage they cause varies with the load patterns experienced by the system.

Many “charge controllers” sold with the word “lithium” in the accompanying pamphlet are simply unusable, some to the point of being purely destructive. Some of the most infamous examples are the “lithium” versions of the Genasun GV-5 / GV-10 solar MPPT controllers: these little marvels of engineering and efficiency respectively deliver up to 5A and 10A at a steady 14.2V output… forever! There is no way to adjust anything and no way to turn them off without seriously hacking the circuit boards. These are the best lithium plating controllers on the market. Genasun exited the lithium arena many years ago now, but they keep marketing some of the garbage technology that was destroying their batteries.


Summary

Charging lithium batteries requires precise control, because no overcharging can be tolerated. If the topology of the charging system is incorrect, the system is not capable of charging the battery correctly and no amount of “programming” will ever change that.

We need good charging equipment, and that means:

  1. Charge controllers with a battery current measurement input and a residual current termination algorithm; or
  2. Programmable voltage regulators interfaced to a BMS using an external enable input and (ideally) another input to switch them to a lower holding voltage when the charge is complete; or
  3. Voltage regulators we can control with a BMS by sending a setpoint via a digital bus.

In all cases, we need regulators that don’t engage in rogue, arbitrary recharging for no reason, so we can allow battery banks to discharge and keep them at a low state of charge when this is desirable.


  4 Responses to “Charging Marine Lithium Battery Banks”

  1. Hi Eric, very tough matter, thanks for the explanation. I have a question though. Say I measure amps at the LiFePO4 battery and use that to terminate charging @ 0.033C. Say I’m charging with solar, so the charging amps depend on the sunlight and might be only a few amps. How can I distinguish that from the 0.033C termination current? As far as I can tell, both situations seem the same!

    • Rob,

      You seem to be referring to the case where the solar charging current is below 0.033C. In this case, 100% SOC is effectively reached at a voltage lower than the specified termination voltage for 0.033C, see the example provided in the text. It is not the same and only a good BMS programmed to deliver a smart charge termination can help you there.

      Kind regards,

      Eric

  2. Hi Eric,

    thanks for your detailed articles.

    I have 2 Victron smart lithium batteries in series for a 25.6V battery, in a solar system that is cycled down to 70% every day (sometimes down to 40%, but rarely). I have monitored this for almost 18 months with the Victron VRM system and am pretty familiar with its behaviour.

    As you may know, the Victron active balancing on the batteries only works above 28V (3.50V/cell), so I currently have the system configured to charge to 28.4V every four days for 1 hour to achieve cell balancing (if Venus could be configured to use the tail current from the BMV as its switch between bulk/float, I’d use that instead of a fixed time, but at the moment that’s not possible).

    The rest of the time the controllers switch to float straight away.
    Victron recommend a float voltage of 27V (3.375V/cell) – this is in the “charging” voltage range. This is evidenced by the batteries continuing to take current after the fast switch from bulk to float.

    I have found that setting the float voltage to 26.7V (3.33V/cell) stops this – i.e. the batteries don’t accept any current.

    Would be interested in your thoughts…..

    • Jason,

      The problem comes from the fact that the people at Victron have been marketing lithium batteries for quite some time, but they still don’t fully understand the technology. 3.375V/cell is excessive as a “float” voltage. Your installation cycles regularly and that is fortunate, because holding the batteries full as if they were lead-acid would otherwise shorten their life. It is unfortunate, but when companies are in the business of selling batteries, there is an immediate conflict of interest between maximising life for the end-user and maximising revenue.

      Kind regards,

      Eric
