Assessing lithium-ion battery life for implantable medical devices

Battery life is a vital factor in many of today’s applications. For implantable medical devices, patients need confidence that the battery will give them a long time between recharges, known as the charging interval.

Almost as important, a battery’s usable capacity, and hence the time between charges, will gradually decrease over its life. This determines the battery’s longevity: how many years, and how many useful charge/discharge cycles, it can deliver before it needs replacing. Longevity can be a key buying criterion when choosing a battery, because once the battery has reached the end of its useful life, replacing it in an implantable device will involve some kind of surgical procedure.

Introduction

Generally, a 20% decrease in the time that the battery can be used before it needs recharging is seen as the point where this capacity loss becomes an issue. As a result, the useful life of a rechargeable battery is typically defined as the number of charge-discharge cycles before the capacity falls to 80% of its original value.
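This definition can be applied directly to a measured capacity-fade curve. The sketch below finds the cycle at which capacity first drops below 80% of its initial value; the fade curve used here is a synthetic, assumed example for illustration, not data for any specific cell.

```python
def cycle_life(capacities_mah, threshold=0.80):
    """Return the first cycle index at which capacity drops below
    `threshold` times the initial capacity, or None if it never does."""
    initial = capacities_mah[0]
    for cycle, cap in enumerate(capacities_mah):
        if cap < threshold * initial:
            return cycle
    return None

# Synthetic fade curve: a 200 mAh cell losing 0.02% of its initial
# capacity per cycle (a hypothetical, roughly linear fade model).
curve = [200.0 * (1 - 0.0002 * n) for n in range(2000)]
print(cycle_life(curve))  # -> 1001: first cycle below 160 mAh
```

Real fade curves are usually non-linear, and manufacturers measure them under tightly controlled charge/discharge conditions, which is why the test conditions on a datasheet matter so much.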

For designers of medical devices, it’s important that they get accurate information on the different rechargeable batteries available on the market. They need to be confident that they are comparing the same parameters when they look at batteries from different manufacturers, and that the numbers on a datasheet are reflected in likely real-world behavior.

In this article, we’ll look at how designers can ensure they get the right information on which to base their decisions. We’ll review the factors that can affect lithium-ion battery lifetime. These are particularly important because lithium-ion batteries are more susceptible to variations than other technologies, and their performance is strongly affected by the conditions under which they are tested, used and stored.

How battery lifetime can be affected

For any rechargeable battery, how often users will need to find a charging point depends on multiple factors.

Firstly, there are environmental factors, such as temperature and vibration. These can make a big difference to battery lifetime: for lithium-ion cells, degradation is typically lowest at temperatures around 25°C. This is why thermal management can be important in some applications, such as electric cars, to ensure the heat generated by charging or discharging does not raise the temperature of the batteries too high.

But for batteries in implanted medical devices, these factors do not usually have a significant impact. Once implanted, medical devices stay at an approximately constant temperature of 37°C, and experience only minor shock and vibration.

For medical devices, the major influence on the charging interval comes from what are referred to as ‘operational factors’. These include the rate of charging and discharging, and the percentage of full capacity to which the battery is charged and discharged. Storage also matters: the state of charge at which a battery is kept during lengthy storage periods can affect its behavior once it’s installed in a device and put to use.

Charging voltage is key

In lithium-ion batteries, the potential difference between the positive and negative terminals increases as the battery is charged, and energy is put into the battery. It then decreases as the battery is discharged in use, and energy is taken out.

This means that the voltage that can be measured across the terminals is a reliable indication of how fully the battery is charged, and therefore how much energy remains in it. Such voltage measurements are how your smartphone or laptop, for example, can determine the percentage of its battery life that’s remaining, and from that can then make an estimate of how much time you’ve got before the battery loses all charge.
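A minimal sketch of this idea is to interpolate state of charge (SoC) from the measured terminal voltage. The voltage-to-SoC table below is a hypothetical open-circuit curve for a nominal 3.7V cell, assumed for illustration; real fuel gauges also correct for load current and temperature.

```python
import bisect

# Assumed open-circuit voltage curve (not manufacturer data)
VOLTAGE = [2.7, 3.0, 3.3, 3.6, 3.8, 4.0, 4.1]    # volts
SOC     = [0.0, 0.1, 0.3, 0.5, 0.75, 0.95, 1.0]  # fraction of full charge

def soc_from_voltage(v):
    """Linearly interpolate state of charge from terminal voltage."""
    if v <= VOLTAGE[0]:
        return 0.0
    if v >= VOLTAGE[-1]:
        return 1.0
    i = bisect.bisect_right(VOLTAGE, v)
    frac = (v - VOLTAGE[i - 1]) / (VOLTAGE[i] - VOLTAGE[i - 1])
    return SOC[i - 1] + frac * (SOC[i] - SOC[i - 1])

print(round(soc_from_voltage(3.6), 2))  # -> 0.5, i.e. roughly half charged
```

From the SoC fraction and the known cell capacity, the device can then estimate remaining run time at the current discharge rate.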

For medical applications, a lithium-ion battery might typically be rated at a nominal voltage of 3.6V or 3.7V. In practice, standard procedure is to charge the battery to a maximum of about 4.1V and to allow it to discharge to a low point of 2.7V. This maximum voltage, at which charging is stopped, is called the End-of-Charge Voltage (EoCV) level.

But what happens if we change those parameters? If, instead, the battery is only discharged to a minimum voltage that is higher than 2.7V, then it will only discharge part of its capacity. For example, if we stop running the battery down when it’s reached a voltage of 3V, that might mean that the battery is only discharged to 40% or 50% of its capacity.

This low voltage point defines what is called Depth-of-Discharge (DoD). Fully discharging the battery corresponds to 100% DoD, while cycling over a narrower voltage window gives a smaller DoD percentage. Another change that can have an impact on lifetime is reducing the upper voltage endpoint from 4.1V to a lower value.
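The relationship can be expressed very simply: the DoD per cycle is the fraction of full capacity actually used between the two voltage limits. The SoC values below are assumptions matching the 40-50% example in the text, not measurements.

```python
def depth_of_discharge(soc_at_top, soc_at_cutoff):
    """DoD as the fraction of full capacity used in one cycle."""
    return soc_at_top - soc_at_cutoff

# Charged to 4.1 V (~100% SoC), discharged only to 3.0 V
# (assumed here to leave ~50% SoC in the cell):
print(depth_of_discharge(1.0, 0.5))  # -> 0.5, i.e. 50% DoD
```

Shrinking this window trades shorter run time per cycle for a longer cycle life, which is the engineering trade-off the following paragraphs describe.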

The reason these voltage limits matter is the different chemical reactions taking place in the lithium-ion cells: for example, reactions that degrade the electrolyte, or that deposit insoluble compounds on the anode, decreasing its efficiency.

These voltage changes make a surprisingly big difference in practice. If we change the upper and lower voltage limits, even by a little, the number of charge-discharge cycles in a battery’s life can reduce to only 20% of what it was before, or even less.

Although we are only talking about small variations in charge and voltage here, it is worth remembering that a rechargeable battery may end up being fully discharged in the field. For example, a patient may simply forget to charge the battery at the correct time, allowing its voltage to drop too low, or the battery may be stored for an extended period.

This is a different issue than the variations in the lifetime we have just discussed, but for many lithium-ion batteries, this full discharge can cause damage which greatly reduces their usability. EnerSys® has addressed this problem with its Zero-Volt™ technology, which ensures batteries can still function at peak capacity, even after they have been discharged to zero volts (figures 1 and 2).


Figure 1: Usable battery capacity


Figure 2: Quallion batteries

Testing it out

In testing our own Quallion® batteries, we have demonstrated very low capacity fade while cycling the batteries at Depth-of-Discharge (DoD) values from 100% right down to 20%. In other words, the loss in battery capacity is minimized, even after many charge-discharge cycles.

Capacity fade during cycling can be reduced even further by lowering the End-of-Charge Voltage (EoCV). Reducing the EoCV from the maximum recommended value of 4.1V to 4.0V helps the battery retain more of its usable capacity over many cycles.

For 100% DoD, a common use case in medical applications, we measure the cycle life of the battery as the number of cycles before it reaches 80% of its initial retained capacity. Most medical applications specify this operational cycle-life value, and the required figure varies with the intended use of the medical device.

Typically, medical applications require a battery to deliver 500 to 1,000 cycles under a 100% DoD cycling condition while maintaining 80% of its initial retained capacity. The Quallion® chemistry used in its medical batteries exceeds these cycle requirements with 80% retained capacity or better.
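The 500- and 1,000-cycle figures above translate into service life once a charging interval is assumed. The 7-day interval below is purely an illustrative assumption; actual intervals depend on the device and its power draw.

```python
def service_life_years(cycle_count, days_per_cycle):
    """Rough service life, assuming one full discharge per charging interval."""
    return cycle_count * days_per_cycle / 365.0

# Cycle-life figures from the text; 7-day charging interval is assumed.
for cycles in (500, 1000):
    print(round(service_life_years(cycles, 7), 1))  # -> 9.6, then 19.2 years
```

This kind of back-of-the-envelope calculation shows why cycle life at realistic DoD, rather than nominal capacity alone, determines how long an implanted battery lasts before surgical replacement.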

Conclusions

Even fairly small differences in operating conditions, such as charge and discharge voltages, can have a big impact on battery lifetime, or useful life. This means that equipment designers should ensure they are comparing batteries in a like-for-like way, and should check the testing conditions specified by battery manufacturers on their datasheets.

>> This article was originally published on our sister site, Power Electronics News.


Kevin Schrantz is Director of Global Medical at EnerSys.

