Jun 16, 2016
 

This article is part of a series dealing with building best-in-class lithium battery systems from bare cells, primarily for marine use, but a lot of this material finds relevance for low-voltage off-grid systems as well.

Integrating a lithium battery bank on board a vessel introduces a few additional constraints and challenges that don’t exist with lead-acid batteries. Let’s consider two key statements:

A key difference between a lead-acid and a lithium battery is that the former can be damaged safely

While this may come across as provocative, it is nevertheless very true. Overcharging or flattening of a lead-acid battery is detrimental to its life. That’s about it. A lithium battery quickly gets totally destroyed and becomes a fire risk in the same circumstances.

Another main difference between a lead-acid and a lithium electrical system is that, in the second instance, the battery may become entirely disconnected from the installation, which can result in considerable damage

Protecting a lithium battery from damage may ultimately require isolating it from the system following a dangerous adverse event, such as a charge regulation failure or a complete discharge. Unfortunately, the charging sources found in marine DC electrical systems are typically not designed to operate, or even cope, without a battery in circuit: disconnecting the battery has a strong potential for causing malfunctions and sometimes considerable and very expensive collateral damage.

The battery is the base load in the charging system and is required to prevent the voltage from spiking up, sometimes considerably; many charge regulators cannot function or regulate properly without it.

In this article, we will discuss some avenues and options for designing systems that address these two aspects.

Disclaimer

A good understanding of DC electrical systems is needed to build and commission a lithium battery installation. This article is aimed at guiding the process, but it is not a simple blind recipe for anyone to follow.

The information provided here is hopefully thorough and extensive. It reflects the knowledge I have accumulated building some of these systems. There is no guarantee that it will not change or grow over time. It is certainly not sufficient or intended to turn a novice into an electrical engineer either. You are welcome to use it to build a system, but at your own risk and responsibility.

Basic Electrical System Design for Lithium

Due to the above considerations, the electrical system on board needs to conform with a model that allows battery disconnection without creating additional problems. In nearly all instances, alterations need to be made to the existing installation before a lithium battery bank can be considered. This assessment should take place before anything else.

There are absolutely no issues with electrical consumers on board: the voltage out of a lithium battery bank is not only within the range experienced with lead-acid systems, but also exhibits less variation. A typical lead-acid system operates between 11.5V and 14.4V (less for gel cells). While the practical voltage range of a lithium system extends from 12.0V to 14.2V at the very most, the bulk of the cycling takes place between 13.0V and 13.4V only.

The challenge resides with charging sources and the risk of seeing them being disconnected, including under load, or even worse, directly feeding into the boat’s electrical system without a battery present.

Dual DC Bus Systems

Dual DC bus systems represent the optimal solution in reliability, resilience and functionality with lithium batteries:

  • Power on board is not lost if an issue is detected with a cell reading excessive voltage. This can happen if a charger regulates poorly, cell imbalance is developing, or there is a system setup issue.
  • A low-voltage disconnect doesn’t compromise recharging and the system has a chance to recover by itself.

This makes the dual DC bus topology very desirable on board marine vessels, but it also comes with higher engineering requirements.

The conversion of an existing installation to use a lithium battery bank with a dual bus system first entails segregating charging sources from electrical loads. Skipping this step is not really possible unless another (lead-acid) battery remains in circuit after the lithium bank is disconnected.

Lithium battery disconnector relays

Twin battery disconnectors are at the heart of all dual DC bus lithium systems. Those are top-quality Tyco Electronics latching relays that offer zero standby consumption and a 260A continuous current capacity. The battery bank connects on the middle post, while the load and charge DC buses tie on the sides.

Creating a separate charge bus and load bus normally requires some rework of the heavy current cabling. Choosing a judicious location for the disconnector relays goes a long way towards minimising the impact of the changes. Electrical distribution is normally either carried out close to the battery compartment, or a feeder cable runs from the batteries to a distribution panel where the main positive and negative busbars are located.

Occasionally, marine electrical systems conform to another topology known as a Rat’s Nest. These need to be pulled out before any further consideration.

In essence, the positive busbar must be duplicated to separate charging sources from loads; the negative busbar normally stays as it is. The battery disconnectors are inserted close to this point to tie the bank into the system and any feeder cables normally remain unaffected.

The split DC bus architecture offers the highest level of reliability and great simplicity, but it can be demanding in terms of engineering and design.

The split DC bus configuration is the gold standard in terms of reliability and functionality for lithium battery installations. It is the preferred pathway for engineering elaborate lithium-only systems and for critical applications as it allows for specific and optimal responses to both excessive charge and discharge situations. It requires capable equipment and good system design to achieve this result.

Controlling a dual DC bus system requires a BMS offering suitable outputs: this is not commonly found on systems intended for electric vehicle (EV) conversions, which tend to rely on a single “disconnect all” contactor.

Attempting to build a dual bus system with an inadequate BMS all too often results in installations where both buses can (and therefore will, sooner or later) end up connected with no battery to charge; at this point, an unregulated charging voltage usually gets fed straight through into the boat’s electrical system, leading to a memorably expensive wholesale fry-up. The ultimate, in terms of the depth of thought the incident affords, is when it happens at sea.

Key Challenge with Dual Bus Lithium Systems

It is fair to say that, today, a majority of DIY dual DC bus lithium systems contain critical design flaws their owners are often unaware of, or have decided to ignore because they could not solve them properly.

This is often related to the use of a junk-grade BMS solution, carefully selected for no other reason than that others have used it, coupled with a lack of design analysis.

A system is not good just because it works; it is only good if it is also not going to malfunction under specific circumstances

Dual DC bus systems come with one main challenge: in case of a high-voltage event causing a charge bus disconnection, charging sources can end up:

  1. Disconnected under load, which can destroy some charging devices by causing their output voltage to spike; and
  2. Subsequently linked together with no battery to charge, which can also result in damage due to excessive voltages for some devices. Many charge controllers require the presence of a large capacitive load (the battery) to operate correctly.

These two situations need to be analysed carefully and mitigated if required.

Typical examples:

  1. A simple PWM solar charge controller switches the panels on and off rapidly to keep the battery voltage at a setpoint. The voltage varies very little because the battery absorbs the current while the panels are turned on. If the battery is removed, the open-circuit voltage of the panels is directly transferred to the output and injected into the charge bus: this means about 22V at times with standard 36-cell panels.
    While it doesn’t really matter in itself and the controller can always take it, if other charging devices are also connected to the charge bus, they suddenly get exposed to a voltage that may prove excessive.
  2. Many simple wind generators can be disconnected under load without getting damaged (as long as they don’t reach excessive speeds afterwards), but a very significant voltage spike can result, high enough to damage other electronic charge controllers that would happen to share the charge bus.
    High voltages also keep being produced at the output afterwards if the unit spins up. This is generally completely unacceptable.
  3. Some modern wind generators can’t be disconnected at all under load, or their charge controller will be destroyed by the resulting voltage surge.
  4. Some, but not all, MPPT charge controllers can fail from an output voltage spike if disconnected under (heavy) load.
  5. Alternators nearly always fail with considerable damage to the rectifiers and regulator if disconnected under load. Interrupting the current causes a collapse of the magnetic field in the stator, which induces an intense surge, sometimes in excess of 100V.

The best and simplest avenue, by far, is retaining charging equipment that can be disconnected under load without issues and won’t output wildly unregulated voltages when there is no battery to charge. Unfortunately, this is not always practical (as in the case of alternators), and economics can favour trying to keep pre-existing gear.
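
To illustrate the kind of what-if analysis this calls for, here is a minimal sketch, in Python and purely for illustration, that checks every device sharing a charge bus against the worst-case voltage the other sources could present once the battery has disappeared. All the device names and voltage figures are invented placeholders, not measurements or recommendations.

```python
# Hedged sketch: worst-case voltage exposure on a shared charge bus after
# the battery has been disconnected. All figures are illustrative only.

# For each source: the highest voltage it can push onto the bus with no
# battery present (e.g. panel open-circuit voltage, wind generator surge).
worst_case_output = {
    "pwm_solar": 22.0,       # ~open-circuit voltage of a 36-cell panel
    "wind_generator": 60.0,  # unloaded surge, placeholder figure
    "alternator": 100.0,     # load-dump spike, placeholder figure
}

# For each device: the maximum voltage it tolerates at its terminals (placeholders).
max_rating = {
    "pwm_solar": 50.0,
    "wind_generator": 28.0,
    "alternator": 25.0,
    "mppt_controller": 150.0,
}

for device, rating in max_rating.items():
    # Worst case seen by this device = highest output of any *other* source.
    exposure = max(v for name, v in worst_case_output.items() if name != device)
    status = "OK" if exposure <= rating else "AT RISK"
    print(f"{device:16s} rated {rating:5.1f} V, worst-case exposure {exposure:5.1f} V -> {status}")
```

Any device flagged as being at risk needs one of the mitigation strategies described in the next section.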

Possible Solutions to High-Voltage Disconnect Issues

Typical solutions to address these problems fall into three categories.

Disabling the Device in Advance

This involves turning off the charging device before it gets disconnected:

  • Alternators can be disabled by interrupting the field circuit with a relay.
  • Shore power chargers can be disconnected on the mains side.
  • Wind generators often need to be diverted into a dump load or a short-circuit, which stops them.
  • If concerns exist with solar systems, disconnecting the panels before the charge controller is an effective measure and normally always safe to do.
  • Many externally-regulated wind generators are best disconnected (and short-circuited) before the charge controller as well.

In all cases, having to power a relay or other disconnection device in order to disable a charging source is completely unacceptable. These systems must be fail-safe and must not charge by default in the absence of a control signal, so that disabled charging sources can’t resume producing power after the battery has been disconnected; this creates an additional layer of protection. It requires, for example, using relays with normally open (NO) contacts or bistable latching relays, so that even a loss of control power can’t lead to a reconnection.

The best approach is often using fail-safe solid-state switching devices, to minimise the current consumption while held on and maximise reliability.

In order to implement an advanced disconnection scheme, the BMS must support it and provide an adequate signal to act upon at least a fraction of a second before the DC charge bus gets isolated.

This can take the form of an “OK to charge” control signal and/or some kind of dedicated “charger enable” output, both of which would get turned off well before a high-voltage (HV) protection event occurs.
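
To make the required sequencing concrete, here is a minimal, hypothetical sketch of the ordering a BMS or supervising controller has to enforce: the charger-enable outputs are dropped first, a short settling delay is allowed, and only then is the charge bus disconnector opened. The object, the method names and the delay value are assumptions made for the sake of illustration, not features of any particular BMS.

```python
import time

CHARGER_SETTLE_DELAY_S = 0.5  # assumed margin for chargers to wind down


def drop_charge_bus(bms):
    """Illustrative high-voltage response: disable the sources *before* isolating.

    `bms` is a hypothetical driver object exposing two outputs:
      - charge_enable(bool): fail-safe signal; chargers only run while asserted
      - charge_bus_relay(open): latching disconnector between battery and charge bus
    """
    # 1. De-assert the "OK to charge" signal so every charging source shuts
    #    down while the battery is still in circuit to absorb any residue.
    bms.charge_enable(False)

    # 2. Give the charging sources a moment to actually stop producing power.
    time.sleep(CHARGER_SETTLE_DELAY_S)

    # 3. Only now open the charge bus disconnector; no source should be
    #    delivering current at this point, so there is nothing left to spike.
    bms.charge_bus_relay(open=True)
```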

Here again, junk-grade BMS products typically never offer such functionality and are therefore completely unsuitable to build such systems.

Individual Disconnection

If damage to other charge controllers is the main concern, disconnecting a device on its own is effective. This equates to giving it its own charge bus and disconnector. This can work very well for some unregulated wind generators, which are notorious for producing voltage surges and very high open-circuit voltages. Units featuring external charge controllers (in contrast with those equipped with built-in regulators) can be disabled by intervening upstream of the controller.

The drawback is the cost of an additional disconnector.

Absorbing/Deflecting the Surge

Another very effective option is ensuring that the current has somewhere to go following a disconnection: the output of a charge controller can be split over an isolator (diodes) and shared between the lead-acid starting battery and the lithium battery charge bus.

In this case, the presence of the lead-acid starting battery becomes essential to the safe operation of the system.

Not all charge controllers accept being wired this way, however, because it effectively “hides” the battery voltage until charging begins. Some controllers draw on the battery to power themselves and operate in standby before starting to charge. Many wind generators fall into this category and simply refuse to operate when cabled this way.

More relevant information can be found further below under charge splitting, because the strategy can be, partially or wholly, applied to the charge bus of a dual bus system.

Voltage Sensing

As long as a power source only charges the lithium bank, the reference voltage can normally be obtained from the bank.

The alternative is getting it from the DC charge busbar, which is the same, but upstream of the feed line and disconnector. The benefit is that it keeps reflecting the charger output voltage after a disconnection and can prevent over-voltage on the charge bus; the drawback is that it ignores the losses in the feeder cable and disconnector relay.

Many charging devices fall back on regulating their own output in the absence of a signal at the voltage sensing input, but this usually needs to be tested on a case-by-case basis if the installation is going to rely on it for proper operation.

These two strategies can be mixed and matched as required by charging devices, but the analysis needs to be carried out.

If a charge splitting strategy is used, then the corresponding guidelines apply to the chargers featuring a split output.

Simplistic Alternatives to the Dual DC Bus Topology

Building and commissioning a dual DC bus system can be demanding. It requires a good understanding of the behaviour and capabilities of the equipment used on board, and some kind of “what-if” analysis must be carried out to ensure that simple but unusual events are not going to result in serious malfunctions.

For these reasons, there appears to be no shortage of dangerous and irresponsible advice to be found under the KISS moniker when it comes to building lithium battery banks and installations. Let’s just say that, provided the cells have first been balanced, it always “works” – until something suddenly goes very wrong. Badly engineered lithium battery systems are still causing enormous amounts of electrical damage on board vessels, which typically doesn’t get reported back. I do, however, hear about those quite regularly.

System design doesn’t lend itself to browsing around and averaging; it needs to be consistent and robust

Here, we will try to explore a couple of genuinely valid avenues to “simplify” the construction of a lithium system without creating additional risks.

The simplest way of resolving the issue of the disappearance of the battery in the electrical system following a safety disconnect event is… ensuring that a battery remains afterwards.

Two examples of simplistic, but safe and functional, topologies are provided below. In each case, we deflect and negate the problems instead of eliminating them at the source. While these schemes can easily be implemented successfully, they remain workarounds with some drawbacks and limitations.

There is no simplification down to the point of just dropping some lithium battery cells in a battery box

Regardless of the system design retained, all the charging voltages still need to be adjusted in order to stay clear of over-voltage problems at cell level and due care still needs to be taken not to overcharge the lithium cells.

The new battery also needs to be protected just the same, because of its different electrochemical nature.

Alternative 1 – Lead-Lithium Hybrid Bank

The simplest way of resolving all the challenges mentioned at the beginning of this article is running the lithium bank in parallel with some standard lead-acid capacity. If any issue arises with cell voltages or temperatures, the lithium bank can be disconnected and the installation will revert to a simple lead-acid system. In some instances, this lead-acid capacity could get damaged or destroyed if the event that resulted in the disconnection of the lithium cells was severe, like an alternator regulation failure.

The simplest lithium battery installation

The simplest safe lithium installation: leaving a sealed lead-acid battery in parallel with the lithium bank at all times allows disconnecting the lithium capacity in case of a problem without any issues. The additional SLA doesn’t contribute any meaningful capacity; its function is ensuring charging sources always see a battery in circuit.

The practical result of such an arrangement is that the lithium battery ends up doing virtually all the work, because it is the first to discharge due to its higher operating voltage. The charging voltages are no longer high enough to provide effective charging for the lead-acid cells, but as those are being trickle-charged above 13V all the time, they can be expected to remain essentially full and it hardly matters.

The lead-acid battery needs to be able to absorb whatever “unwanted” current may come its way if the lithium bank gets disconnected due to a high-voltage event, for example. In some instances, a single sealed lead-acid (SLA) battery can be sufficient. SLAs are the best choice for this application as they don’t consume water and are very inexpensive; gel cells should be avoided as they are costly and far less tolerant of overcharging, and AGMs would be a complete waste of money in this role.

The drawbacks are:

  • Some charge gets lost trickling continuously into the SLA, more so if the lead-acid battery is in poor condition.
  • It doesn’t fully eliminate the lead and associated weight.
  • Removal of the SLA from the system, at some point in the future, would create an unexpected liability.

Some advantages are to be found as well:

  • Disconnection of the lithium bank can be managed with a single contactor; there is no need to implement a split bus. This can allow using some small BMS solutions incapable of managing a dual DC bus.
  • The lithium bank is literally added to the installation in place, normally without cabling alterations required, but not without voltage and regulation adjustments.

With this in mind, it certainly is the simplest fully functional design one can build, as long as protection and automatic disconnection are still very properly implemented for the lithium bank.

Should the lithium bank ever become heavily discharged, the additional lead-acid capacity can start contributing, but this would also leave it at a reduced state of charge for a time afterwards and cause it to start sulphating. This is not automatically much of a concern, because it may not happen (this depends on the BMS low-voltage disconnect threshold) and it doesn’t actually result in much harm if it does. The SLA needs to remain in a reasonable condition, however, both to be able to absorb any transients if the lithium bank gets dropped off due to excessive voltage and to avoid continuously discharging the lithium cells at an excessive rate.

Voltage Sensing

NEVER, EVER, SENSE THE CHARGING VOLTAGE DIRECTLY AT THE LITHIUM BANK TERMINALS IN THIS CONFIGURATION

The sensing voltage required for charge control must be sourced upstream of the lithium battery disconnector, or in other words from the SLA battery, so it remains valid even after a disconnection of the lithium capacity. This is very important, otherwise uncontrolled, unlimited charging of the lead-acid battery will occur after the lithium capacity gets isolated.

Alternative 2 – Split Charging

Considering that, in most instances, good system design practices lead to keeping a separate SLA battery for starting the engine, one can be tempted to derive similar benefits from it, instead of carrying one or more additional SLAs as required by the Lead-Lithium Hybrid topology.

Charge isolator

Charge isolators are extremely useful devices for building lithium battery systems and can be found in a variety of configurations, 1 or 2 inputs connected to 2 or 3 outputs. They are extremely rugged and robust. The best ones all seem to be manufactured in the USA: Sure Power Industries, Hehr and Cole Hersee are all excellent sources for quality units. Inferior products generate considerably more heat.
If efficiency is a key concern, isolators using MOSFET transistors instead of diodes are available, albeit at significantly higher cost.

Using a charge isolator (also known as blocking or splitting diodes) can provide at least a partial solution, depending on the nature of the charging devices present. It is a good option with alternators and any chargers that don’t need a voltage originating from the battery to begin operating.

A charge isolator is another option for keeping a lead-acid battery in the charging circuit at all times if a lithium bank must be disconnected.

Since most of the electrical issues with the integration of lithium batteries in traditional marine systems arise with battery disconnection, splitting and sharing a common charge bus with the engine starting SLA battery is a very simple and effective way of addressing the matter.
Unfortunately, some battery charging devices refuse to operate behind an isolator; this prevents adopting this configuration as a universal solution, but it is nevertheless valuable.

Alternators and unregulated/crudely-regulated wind/tow generators are usually happy to function this way behind a diode. Internally-regulated generators commonly refuse to start unless they can “see” the battery voltage, because they require a small amount of power to first “release the brake”.

If this configuration can be achieved, then again the lithium bank can simply be dropped using a single disconnector without any ceremony, should some adverse event occur. One side-benefit is that the charging systems feed into both the lithium bank and the start battery, even though the voltage isn’t quite high enough to be ideal for the latter. This can be remedied by the addition of a small dedicated charger for the lead-acid battery, either solar or through step-up DC/DC conversion from the lithium bank.

In such a configuration, it is very important that the lead-acid battery always remains present in the charging path. A battery switch to isolate the engine circuit is fine and desirable, but the charge isolator(s) should remain directly connected to that battery at all times to provide a pathway to dissipate any surge, as well as a nominal base load for the charge regulators.

Voltage Sensing with Charge Isolators

Any serious charge controller comes with a battery voltage sensing input. When the charger output is split to charge multiple banks, this becomes even more important as any losses over the charge isolator must be compensated for and a quandary always arises as to where to source the charging reference voltage.

Accurate battery voltage control is only going to be achieved for the battery being sensed, because there are voltage losses proportional to the current in charging systems. With a lithium bank in the system, sensing should reflect the voltage of the lithium bank and this will result in best performance for charging it; this is usually the desired outcome.

NEVER, EVER, SENSE THE CHARGING VOLTAGE DIRECTLY AT THE LITHIUM BANK TERMINALS IN THIS CONFIGURATION

Voltage sensing for a lithium battery in a split-charging topology must be performed at the output of the charge isolator, upstream of the battery disconnector, so disconnection of the battery doesn’t dissociate the sensed voltage from the charging voltage altogether: this would otherwise lead to uncontrolled, unlimited overcharging of the remaining lead-acid batteries in the system.

Voltage sensing can sometimes be performed at the input terminal of the charging isolator instead, for some equipment such as alternators typically. In this case, the charging voltage adjustment must be made for the lowest voltage drop that can be experienced over the isolator. This is normally about 0.3-0.4V for Schottky diode type units and essentially zero if a MOSFET-based isolator is used instead.
The difference in system performance is subtle and yields a less aggressive charging characteristic with lithium cells in particular.
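
As a simple numerical illustration of this adjustment, the sketch below derives a regulator setpoint when the voltage is sensed at the isolator input; the target voltage and diode drop are placeholder figures, and only the arithmetic matters.

```python
# Hedged example: compensating a charge setpoint for the isolator drop when
# the regulator senses its voltage at the isolator *input*. The figures are
# illustrative placeholders, not charging recommendations.

target_at_battery_v = 13.8   # desired voltage on the lithium charge bus
min_isolator_drop_v = 0.3    # lowest expected drop over a Schottky isolator
                             # (use ~0.0 V for a MOSFET-based isolator)

# Using the *lowest* expected drop keeps the battery-side voltage at or
# below the target even when the isolator drop is at its smallest.
regulator_setpoint_v = target_at_battery_v + min_isolator_drop_v
print(f"Regulator setpoint (sensed at isolator input): {regulator_setpoint_v:.2f} V")
```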

General Electrical Installation

Fusing & Feeder Cables

A heavy-duty fuse should normally be found very close to the bank to protect the feeder cables. This fuse should be sized so it will never blow unless an intense short-circuit occurs; otherwise it creates the potential for accidentally destroying at least the alternator, and often much more.

ANL fuse

ANL fuses are cost-effective, easy to source and can offer interrupt ratings up to 6kA at 32V, but some are only good for 2kA.

The nominal current capacity of a fuse reflects the current it can conduct indefinitely without blowing. Currents above this value will cause the fuse to heat and eventually blow; the time it takes for this to happen is related to the ratio of the over-current and can range from minutes or more to milliseconds.

The interrupt rating of a fuse is considerably higher than its current capacity and defines how much current the fuse can successfully interrupt by blowing; values beyond this figure may result in continued arcing over the fuse after it has blown. The interrupt rating is very voltage dependent, for obvious reasons, and increases significantly at lower voltages.

Unless the feeder cable leaving the battery compartment is of an exceptional size and the battery bank is very large, a common low-voltage ANL fuse with an interrupt rating of 6kA at 32VDC is normally adequate. There is too much resistance in the cells, connections and cables to sustain the hypothetical currents (and associated apocalyptic predictions) that would supposedly arise from a short-circuit.

For a 13.3-volt source to supply in excess of 6000A, the total circuit resistance would need to be below 2.2 milliohms. Small lithium battery systems of interest for pleasure crafts normally fall short of such capability simply due to the size of the cabling used and number of bolted connections involved.

In the case of larger installations, a proper prospective fault current calculation should be carried out and the fusing should be selected to match the required interrupt rating.
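
A first-pass prospective fault current estimate is a simple Ohm’s law exercise: add up the resistances around the short-circuit loop (cells, interconnects, feeder cable, bolted joints) and divide the battery voltage by the total. The resistance values in this sketch are invented placeholders meant only to show the method; real figures must come from datasheets and the actual installation.

```python
# Hedged sketch of a first-pass prospective short-circuit current estimate.
# All resistance figures below are invented placeholders.

battery_voltage_v = 13.3

resistances_ohm = {
    "cells_internal": 0.0010,      # combined internal resistance of the bank
    "interconnect_links": 0.0004,  # cell-to-cell links and bolted joints
    "feeder_cable": 0.0008,        # positive and negative feeder runs
    "fuse_and_terminals": 0.0003,
}

total_r_ohm = sum(resistances_ohm.values())
prospective_fault_a = battery_voltage_v / total_r_ohm
fuse_interrupt_rating_a = 6000  # e.g. a common ANL fuse at 32 VDC

print(f"Total loop resistance: {total_r_ohm * 1000:.2f} mOhm")
print(f"Prospective fault current: {prospective_fault_a:.0f} A")
if prospective_fault_a <= fuse_interrupt_rating_a:
    print("Interrupt rating adequate")
else:
    print("Interrupt rating exceeded: a fuse with a higher rating is required")
```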

Class T fuse

Class T fuses offer much higher interrupt ratings (20kA) than the common ANL fuses and can become necessary to protect the feeder cables in large lithium battery bank installations.

The feeder cables should be sized according to the maximum acceptable voltage drop they can induce under normal operation. Quite often, alternator charging currents and inverter loads represent the maximums the installation can be expected to see.

Using unreasonably heavy cables or seeking negligible voltage drops at peak current also increases the maximum prospective short-circuit current the installation can produce and results in a higher level of risk. The cables need to be able to hold until the fuse blows and, until then, their resistance is precisely a good part of what limits the fault current: it pays to keep this in mind and take advantage of it.
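
A quick way of checking a candidate feeder size is to compute the round-trip voltage drop at the maximum expected current. The short sketch below does this for copper cable; the length, cross-section and current are placeholder inputs to be replaced with the installation’s own figures.

```python
# Hedged sketch: round-trip voltage drop of a copper feeder at a given current.
# Length, cross-section and current are placeholder inputs.

COPPER_RESISTIVITY_OHM_M = 1.72e-8  # annealed copper at ~20 degrees C

length_one_way_m = 3.0     # battery to busbar distance
cross_section_mm2 = 50.0   # conductor cross-section
max_current_a = 200.0      # e.g. peak inverter draw or alternator output

# Resistance of the positive and negative runs combined (round trip).
resistance_ohm = COPPER_RESISTIVITY_OHM_M * (2 * length_one_way_m) / (cross_section_mm2 * 1e-6)
voltage_drop_v = resistance_ohm * max_current_a

print(f"Feeder resistance: {resistance_ohm * 1000:.2f} mOhm")
print(f"Voltage drop at {max_current_a:.0f} A: {voltage_drop_v:.2f} V")
```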

Common Negative

In the case of a system with more than one battery bank – a very common configuration due to the presence of at least a starting battery – it is usually wise and sensible to tie all the negatives together, because it simplifies the integration of any device connected to more than one bank.

If charge splitting is to be used one way or another, then a common negative to these battery banks is mandatory.

Battery Sensing

Battery Voltage

If not already present, a dedicated battery voltage sensing cable with its own small fuse at the battery end should be run from the source of the sensing voltage, which often is not at the battery itself, to wherever the charging equipment is/will be located. All voltage sensing can then be consolidated onto a dedicated terminal block rather than having multiple wires all running back to the same location for an identical purpose.

A great deal of damage and destruction can result from sourcing the charging reference voltage inadequately in an installation with a lithium bank

Where the voltage sensing cable should be connected in the system depends on the topology of the installation; the subject was discussed on a case-by-case basis earlier.

Battery Current

Many systems also include a current measurement shunt associated with a battery monitor (all too often a glorified random number generator) or an ammeter. The shunt is almost always found on the negative side, because it is technologically simpler and cheaper to measure the current there. Run a twisted-pair cable from the shunt block directly to the measuring instrument.

Other than the negative voltage sensing core and any BMS wiring, nothing but the lithium bank should be connected to the battery side of the shunt. This includes the negative of other batteries, such as a starting battery: failure to observe this will result in the current of the other batteries also being measured, when it shouldn’t be.

Temperature Sensors

Any battery temperature sensors associated with charge controllers and pre-existing lead-acid cells must be disconnected from all charge controllers and removed altogether. Some controllers may signal a fault as a result, but normally keep operating assuming a default constant battery temperature: this is exactly what we want. Occasionally, an ill-tempered controller may refuse to operate without its temperature sensor. Most temperature sensors are 2-wire negative temperature coefficient (NTC) thermistors (resistors whose value is temperature-dependent). Measure it at ambient temperature with a multimeter and replace it with an approximately equivalent fixed resistor (the nearest standard value will do) at the controller terminals.
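
If it helps, the small sketch below picks the nearest E24 standard resistor value for a resistance measured at ambient temperature; the 10.3 kΩ reading used here is just a placeholder for whatever the multimeter shows.

```python
# Hedged helper: nearest E24 standard resistor value for a measured NTC
# resistance at ambient temperature. The measured value is a placeholder.

E24 = [1.0, 1.1, 1.2, 1.3, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.7, 3.0,
       3.3, 3.6, 3.9, 4.3, 4.7, 5.1, 5.6, 6.2, 6.8, 7.5, 8.2, 9.1]

def nearest_e24(resistance_ohm):
    """Return the E24 value (across common decades) closest to resistance_ohm."""
    candidates = [base * 10 ** exp for exp in range(0, 7) for base in E24]
    return min(candidates, key=lambda r: abs(r - resistance_ohm))

measured_ohm = 10300  # example multimeter reading at ambient temperature
print(f"Measured {measured_ohm} ohm -> fit a {nearest_e24(measured_ohm):.0f} ohm resistor")
```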

This aspect is in fact part of the integration of lithium batteries with other equipment, but as the task of removing the sensors takes place within the battery compartment, it seemed logical to include it here.

Temperature sensors have their place in a lithium battery bank, but they are part of the battery protection circuitry and completely unrelated to the charging voltage. Lithium batteries in marine installations should always operate within a degree or two from ambient temperature, without exhibiting meaningful differences between cells.

Battery Switches

Single-pole battery switch

Simple heavy-current battery switches are a much better choice than combining (1/2/Both) switches with lithium batteries, as paralleling batteries is usually most undesirable.

On dual DC bus systems, it is highly inadvisable to leave or install a battery master switch in the feed line between the batteries and the bus disconnectors. The correct way of achieving battery isolation is by opening both the charge and load bus disconnectors, which is a function normally provided by the BMS; failing to observe this point would again result in removing the battery while leaving both buses linked together, as described earlier.

The only acceptable function for a manual battery isolator switch is turning the power off to the vessel, i.e. disconnecting the load bus.

If complete manual battery disconnection is desired, then either two single-pole battery switches or a 2-pole switch must be used to isolate both positive buses. Some analysis must be carried out to determine whether leaving the charging sources tied together at the “floating” charge bus with nothing to charge could result in equipment damage or not.
While the BMS may be able to provide “advanced notice” of a charge disconnect and turn the chargers off, a manual disconnect typically won’t.

Paralleling Switches

Paralleling batteries is a concept that evolved from trying to crank diesel engines with proverbially flat lead-acid batteries. One good engine starting battery is all it takes to do the job: unless the engine is truly large, a single battery is normally ample, and more is just dead weight.

If either the lithium or the lead-acid battery is heavily discharged, closing a paralleling switch can initially result in an intense discharge current, putting the cabling at risk and, in the case of the lead-acid battery, creating a risk of explosive gas formation.

Systems including isolated banks of each type normally also include provisions for charging the lead-acid capacity properly (i.e. at higher voltages, using a temperature-compensated voltage and float-charging) and this makes the paralleling switch a very dubious proposition, because it exposes the lithium cells to a completely inadequate charging system. The fact that you “won’t leave the paralleling switch on” only means that it will happen anyway, sooner or later, because it can.

On a dual DC bus system, there is also the question of where to connect the switch: the paralleled battery can typically both consume and supply energy, yet the switch can only be cabled to either the charge or the load bus, leaving the system vulnerable to discharge through the charge bus, or overcharge through the load bus, afterwards.

I personally prefer having the option of using jumper cables if ever warranted, rather than creating an unnecessary and permanent liability by having a paralleling switch in a dual DC bus installation.

Simple systems that don’t feature a dual DC bus can actually be designed with a paralleling switch, but it must join past the lithium bank disconnector relay, not on the battery side. This ensures that the BMS can break the parallel link if trouble is coming from there. Regardless, it is still a bad idea.

Voltage-Sensitive Relays (VSR)

Voltage Sensitive Relay

Voltage Sensitive Relays (or VSRs) are always poor solutions in marine electrical systems and, at best, next to useless with lithium batteries. The one depicted above, with a cut-in voltage of 13.7V and a cut-out threshold of 12.8V, would essentially remain closed until deep discharge has occurred.

Voltage-sensitive relays are another plague of modern marine electrical systems. They gained ground after people experienced issues with diode-based charge isolators due to the voltage drop they induce and because VSRs are seemingly easier to deal with and understand.

Each battery bank has its own state of charge and its own needs in terms of charging profile. Paralleling banks together is never a great idea, even when the batteries are of the same type and require the same voltages.

Some VSRs sense the voltage on one side only, others on both; some offer adjustable thresholds and others not. Unless the unit is fully adjustable and includes both low and high voltage disconnection points, it is normally completely useless (and equally harmful) around lithium batteries.

Forwarding a charging voltage from a lithium bank to a lead-acid battery won’t result in a good charge characteristic. Doing the opposite requires observing both a connection and a disconnection voltage threshold, because lead-acid battery charging reaches excessive voltages. The resulting charge characteristic for the lithium battery is typically not good either, because no absorption time can be provided. It keeps getting worse: should one of the banks become heavily discharged, closing the VSR can easily result in a sustained discharge current way beyond its current capacity, leading to a catastrophic failure.

On dual DC bus systems, VSRs normally bring all the same issues as paralleling switches: there is no correct place to wire them in and they have no place there.

Regardless of brand or type, VSRs never seem to lead to any good solutions in systems with both lithium and lead-acid cells. Fortunately, there seems to be an endless queue of ill-inspired people keen to buy them and this makes them very easy to get rid of.

The best answer to charging auxiliary engine starting SLA batteries is using a battery isolator, if an alternator is present, and DC/DC chargers from the lithium bank (or an auxiliary solar panel) to ensure full charge can be reached. The installation can then simply be configured to charge the lithium bank optimally.

Engine Starting Batteries

Internal combustion engines can be cranked with LiFePO4 batteries, very successfully at that, and even when the battery is low on charge, within reason: a lithium bank down to 3.0V/cell can struggle to crank a diesel. There are however a number of good reasons for not doing it when the vessel is large enough to sustain a dual bank installation:

  • Redundancy and the ability to still start the motor with a discharged house bank are lost.
  • Unless the lithium bank is huge and a current of some 100A means little, engine cranking still causes the voltage to sag at the battery and creates transients in the system.
  • Lithium batteries are harder on engine glow plugs, because they supply a higher voltage under load.

Unless low weight is everything, using a lithium battery as a separate starting battery is possible, but usually not sensible:

  • A SLA purely used as a starting battery is very easy to keep at full charge and commonly lasts 8 years or more on a marine vessel. A very small solar panel can be dedicated to floating that battery at the appropriate voltage if needed.
  • The comparatively very high cost (and added complexity) of a lithium battery in this application cannot be justified.
  • A lithium starting battery should be kept at about 50% SOC in order to age well; it introduces a new lithium charge control regime in the system.
  • As highlighted earlier, there are often technical benefits to be found in still having a SLA in the system and dedicating one to cranking the engine is a good use for it.

Next Steps

Once the new battery bank has been balanced, assembled, protected and installed in an electrically correct configuration, it needs to be integrated with existing charging equipment.

Due to the large variety of gear found on the market, with hardly any of it ever intended or properly designed to charge lithium batteries, chargers require a lot of attention in order to function without tripping the high voltage protection limit or overcharging the bank over time.

The subject is extensive enough to be treated separately.

  2 Responses to “Electrical Design For a Marine Lithium Battery Bank”

  1. This is a really good article. Thanks for the time you took to write it and I hope it helps a lot of cruisers!

    • Bob,

      Thank you for your kind words, you have been in this field for quite a while… Late last year I saw a lithium battery fiasco of such a magnitude that it prompted me to start writing this material. At the time, the owner didn’t even understand how and why it had suddenly gone so wrong.
      The electrical engineering component present in these systems is too often not identified properly or discounted, but it really is the backbone of the installation.

      Best regards,

      Eric
