Battery Assessment: Page 5 of 5


Inside this Article

Filthy battery bank.
Take stock of your battery bank’s health with this step-by-step assessment guide.
Using a battery room for storage
Using a battery room for storage can lead to poor accessibility to the batteries for maintenance, and can be a safety hazard.
Class C fire extinguisher and eyewash station
Besides eye protection, acid-proof gloves, baking soda, and distilled water, a Class C fire extinguisher and an eyewash station are important to have near the batteries.
Battery terminal corrosion
Battery terminal corrosion can lead to poor system performance and/or reduced battery life.
Improperly installed temperature sensor
Only properly installed temperature sensors can adjust the charge regime correctly. This one should have been installed two-thirds of the way up on the side of the battery, and between two batteries.
Testing the battery voltage
Testing the voltage of each battery is a quick way to identify a weak or failing cell.
Checking electrolyte level
The electrolyte level in this battery is very low, exposing the plates to air, which causes permanent damage.
Testing specific gravity of the electrolyte
The specific gravity of the electrolyte can be tested with a hydrometer.
Testing specific gravity of the electrolyte
The specific gravity of the electrolyte can also be tested with a refractometer.
Checking cell temperatures
Cell temperatures that are too high can impact the number of available cycles.
Battery Temperature Sensor
The BTS communicates the battery cell temperature to the inverter/charger and the charge controller to adjust the charging setpoints.

Some components, like charge controllers and system monitors, can display the battery’s temperature by using a battery temperature sensor (BTS) mounted on the case of one of the batteries. The BTS’s cable plugs into the inverter/charger or charge controller, which automatically adjusts the charging voltages in relation to the battery’s temperature. As the battery temperature rises above 77°F, the charging voltage is automatically lowered; at lower temperatures, it is automatically raised.
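This compensation is usually linear around the 77°F (25°C) reference. A minimal sketch, assuming a slope of -5 mV per °C per cell (a common flooded lead-acid figure; the actual coefficient comes from the battery manufacturer):

```python
def compensated_voltage(base_v, temp_c, cells=6, coeff_mv=-5.0):
    """Adjust a charging setpoint for battery temperature.

    base_v:   setpoint at the 25°C (77°F) reference temperature
    cells:    number of 2 V cells (6 for a 12 V battery)
    coeff_mv: compensation slope in mV per °C per cell
              (battery-specific; -5 mV/°C/cell is assumed here)
    """
    return base_v + cells * (coeff_mv / 1000.0) * (temp_c - 25.0)

# A 14.4 V absorb setpoint drops as the battery warms past the reference,
# and rises when the battery is colder
print(round(compensated_voltage(14.4, 35.0), 2))  # 14.1
print(round(compensated_voltage(14.4, 15.0), 2))  # 14.7
```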

Proper positioning and adhesion of the BTS to the battery’s case are important, since the BTS determines the battery’s charging voltage levels based on the battery temperature. If the BTS reads a temperature lower than the actual battery temperature, the charge controller or inverter/charger will overcharge the battery, leading to excessive gassing, increased water usage, and excessive battery temperatures, substantially shortening battery life. Note that a BTS may not work well on batteries with steel or plastic double cases; the installer may need to consult the manufacturer and make provisions to attach the BTS to the inner battery case.

During a battery assessment, a more accurate measurement of each cell’s temperature can be made by inserting a glass thermometer directly into the electrolyte of each cell. Measuring each cell’s temperature also allows a cell-to-cell comparison to determine whether some cells are operating at higher temperatures than others. The resulting average temperature can then be compared to the BTS reading to gauge its accuracy.
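For example, the per-cell readings can be averaged and compared against the BTS value (all numbers below are hypothetical):

```python
# Hypothetical glass-thermometer readings from each cell, in °F
cell_temps_f = [82.1, 81.7, 83.0, 82.4, 81.9, 82.6]
bts_reading_f = 79.5  # temperature reported by the BTS (hypothetical)

avg_f = sum(cell_temps_f) / len(cell_temps_f)
spread_f = max(cell_temps_f) - min(cell_temps_f)  # a large spread flags hot cells
print(f"average {avg_f:.1f}°F, spread {spread_f:.1f}°F, "
      f"BTS error {bts_reading_f - avg_f:+.1f}°F")
```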


It is also important to check and record all of the inverter and charge controller settings, such as the charging voltages (bulk, float, and equalize), charging time (hours), and charging rate (charge amps). These values dictate how the battery will be charged: at what voltage, for what duration, and at what amperage. These parameters must be set for the specific battery type and size used (see the “Case Study” sidebar for a detailed example).

Depending on the application, other setpoints may also need to be evaluated to see how the battery is being managed on a daily and monthly basis. Understanding how settings for the low-voltage disconnect (LVD), load shedding, alarms, and backup charging sources impact the operation of the battery is often also required when completing a detailed battery assessment.
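One way to organize this check is to record each observed setting next to the battery manufacturer’s recommendation. The values below are hypothetical placeholders, not recommendations:

```python
# Hypothetical recorded settings vs. manufacturer recommendations
# (voltages in volts, absorb time in hours)
observed    = {"bulk_v": 29.2, "float_v": 26.4, "equalize_v": 31.0, "absorb_hours": 1.0}
recommended = {"bulk_v": 29.6, "float_v": 26.8, "equalize_v": 31.0, "absorb_hours": 2.0}

for name, rec in recommended.items():
    obs = observed[name]
    note = "" if obs == rec else "  <-- differs from recommendation"
    print(f"{name:13s} observed {obs:5.1f}   recommended {rec:5.1f}{note}")
```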

Data Logging

Performing an assessment several years after a system has been installed is easier if the system has a data logger that has been recording information. If there is no online data-logging system, the data will need to be downloaded from the system’s components using memory cards or a connected laptop computer. Some system monitors can provide the daily minimum and maximum battery SOC and voltages, and some can tell you how long it has been since the battery was equalized. This information can be helpful in understanding how a system has been operated, and can also be used to estimate the remaining life of the battery based on the number of DOD cycles or the cumulative amp-hours that have been removed from the battery over the system’s life.
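As a sketch of the amp-hour method (every figure below is hypothetical; real cycle-life numbers come from the battery’s datasheet):

```python
# Hypothetical battery ratings and logged data
rated_capacity_ah = 400.0     # capacity at the 20-hour rate
rated_cycles = 1500           # cycle life at the rated depth of discharge
rated_dod = 0.50              # 50% depth of discharge

# Total amp-hours the battery can deliver over its life at that DOD
lifetime_ah = rated_cycles * rated_capacity_ah * rated_dod

logged_ah_removed = 180000.0  # cumulative Ah removed, from the data logger
fraction_used = logged_ah_removed / lifetime_ah
print(f"approximately {fraction_used:.0%} of cycle life consumed")  # 60%
```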

The data collected and its format vary between manufacturers. Assistance from the inverter and controller manufacturers is usually required to analyze the collected data and draw conclusions from it.

If the system does not include a data logger, manually written operator logs are sometimes available. They may reveal how often the battery was watered, when the generator was run, or how frequently the battery was equalized. This information can be useful in developing recommendations for the system’s future management.


Carol Weis is a NABCEP-certified PV installer and ISPQ Master PV trainer. She writes curricula and teaches national and international PV classes to technicians and end users. She has worked as a licensed electrician and solar installer in Colorado, and was part of Solar Energy International’s PV technical team for 15 years.

Christopher Freitas is an engineer and project manager for international RE projects. He was a cofounder of OutBack Power Systems and was the director of engineering at Trace Engineering.

The Improving Health Facility Infrastructure project is funded by the U.S. Agency for International Development (USAID) and implemented by Tetra Tech ES. The data used in this article was made possible through support provided by the USAID Office of Economic Growth, Trade and Agriculture under the terms of Contract No. EPP-I-00-03-00008-00. The opinions expressed herein are those of the author(s) and do not necessarily reflect the views of the USAID or Tetra Tech.

Comments (8)

Brian Ellul


Good article!

Regarding the load testing, can you give an indication of what the voltage under load should be for a 2 V cell? I would like to test a set of brand-new (however, left unattended) forklift batteries before purchasing them, but I'm finding it difficult to arrive at a definite conclusion as to their actual state/capacity.

The batteries are Fulmen brand, 680 Ah, 20 cm wide x 12 cm deep x 46 cm tall. I've charged one cell; the voltage reached 2.11 V and SG = 1.28.
I discharged this cell with a small load: 85 hours at 3.3 amps (not easy to find loads at such a low voltage). At the end of the discharge test, the voltage came down to 2.01 and SG = 1.19.
During discharge, the voltage read 1.95 V. Is this OK? I'm finding it a bit low. How can I determine the battery capacity based on my tests?

Thanks a lot for your help.


Christopher Freitas

Yes - that was my error. I had changed the text during editing to show the watts (which decrease by a factor of 4 when the voltage is halved) and then switched it to amps (which only halves when the voltage is halved) and forgot to switch the values. I apologize for this oversight and appreciate you bringing it up.


Richard Bosse

I believe there is a mistake in your calculations under the heading "load testing." For the 12 Volt battery, R=E/I. R=12V/500A, R=0.024 Ohms. For the 6 Volt battery, I=6V/0.024 Ohms, I=250 Amps not 125 Amps as stated in the article. For the 2 Volt battery, I=2/0.024, I=83 Amps not 30 Amps as stated in the article.
Rich

Hi Christopher,
Thanks for the quick reply! The setpoints are 29.6 V absorb (2 hours), 26.4 V float, and 31.0 V EQ. I believe those are pretty standard for this application, but I could be wrong. The batteries were using a bit of water, about normal, I'd say, and I was aware that the OutBack is set up for AGM out of the factory so as not to inadvertently overcharge them if the default were FLA.
It's true that I could no doubt make the RTS work, but what would be the point? I'd really like to understand the chemistry more and know what I should expect from using the BTS.
Maybe the reduced SpG at elevated temperatures is what results in longer battery life, albeit at a reduced SOC. Or maybe the BTS really provides the most benefit at reduced temperatures, where the setpoints would increase to drive the SpG higher than without the BTS. The Trojan tech was not impressed with my particular results!

The house location experiences very little temperature fluctuation during the year.

Although I can't imagine what I could be doing wrong, it would be interesting to have others verify (or refute) my findings!

Thanks for your time. This battery technology still seems like a trial-and-error implementation, even after all these years of experience with it.

PS: If this thread is getting a bit "over the top" for the "Comments" section, let me know!!

Christopher Freitas

Hello Marty - Interesting problem. You didn't mention the setpoints you are using for the Trojan T-105s. I've found that with PV systems, the settings may need to be different from what the manufacturers recommend, since they are typically thinking of applications that charge 24 hours a day continuously (such as the utility grid), not a source that floats the battery for only six or so hours a day.

The bottom line is what you are observing - the specific gravity is better with the higher setpoint. By defeating the RTS, you are essentially raising the setpoint a couple of tenths. This could also be done by leaving the RTS connected and adjusting the float (and probably the absorb as well) up a couple of tenths. That way, the setting will better compensate for temperature changes over the year.

BTW - the default settings in the OutBack products are intended for sealed-type batteries, which always use lower setpoints than those recommended for flooded batteries.

You also didn't mention whether your batteries are using any water - all flooded deep-cycle batteries should use some. If they don't, then they most likely are not being charged fully and may become sulfated.

Let me know what your setpoints are currently and how much water you've been adding.

Christopher

After many posts and phone calls, I've never been able to find the answer to a BTS (OutBack calls theirs RTS) problem I had. Never having used an RTS before, I properly placed and connected a new one via an OutBack MX60 to a new bank of 4x2 Trojan T-105s. I noted that it was operating correctly, reducing the absorb setpoint a few tenths to compensate for a slightly elevated ambient temperature, in accordance with a table of values I had (temperature vs. change in voltage).
Everything was looking fine until I measured the temperature-compensated specific gravity (at float), which showed a consistent undercharge at about 1.230 (SOC 80%). Nothing I could do (e.g., lengthening absorb time) would bring these new batteries up to 100% SOC or 1.27 SpG. An EQ worked, but the RTS is not used during equalization.
Trojan couldn't explain it but was adamant that the SpG be brought up to 1.27, whatever it took. So, my new RTS is disconnected and the batteries happily float at 1.26+ after 3 years of use.
I'd like to find out how to implement the RTS correctly but have yet to discover what to expect from the RTS implementation.
Ideas? Next issue?

robert wimberly

Good article, and it focuses on some of the most important checks. I would only add that when checking specific gravity, you may be slightly misled if you do not correct for the electrolyte's temperature. The normal correction factor is 0.004 points of gravity for every 10 degrees above or below 80°F. For example, if you are outside in a cool area at 40°F, then a reading of 1.265 is really approximately 1.249.

Justine Sanchez

Hi Robert,
Good point! And timely as our next issue of Home Power includes an article specifically focused on checking specific gravity, where we do touch on the need to adjust for the electrolyte temperature. Thanks for posting!
Justine Sanchez
Home Power Magazine
