1 Introduction

Hybrid energy systems refer to combinations of two or more energy sources integrated for power generation. These sources collectively ensure reliability, efficiency, scalability, and energy resilience while offering environmental benefits. Among the most common configurations are solar and wind hybrids, which offer low carbon footprints and renewable availability. However, renewable sources are inherently intermittent. Wind energy is not viable at all sites due to variable speeds and unpredictability, while solar energy is only available during daylight and may be hindered by cloudy conditions. Ideally, a hybrid system leverages the strengths of one source to compensate for the limitations of another [1]. In remote or high-altitude regions, the combination of energy sources with batteries as storage or with a generator as a backup offers improved reliability [2,3,4].

A key objective in hybrid system design is to reduce the size and cost of energy storage. Although batteries are essential to buffer inconsistencies in renewable supply, increasing energy generation capacity often proves more cost-effective than scaling up storage [5]. Devices like PV modules or wind turbines tend to have longer service lives and lower environmental footprints than batteries, especially when considering transportation and installation in remote regions [5, 6].

Hybrid systems have been consistently shown to be more reliable and cost-effective than conventional setups [3, 7,8,9], offering both short- and long-term economic advantages [4, 10], along with reductions in greenhouse gas emissions and fossil fuel dependency.

1.1 Hybrid system architectures

Recent literature emphasizes the growing role of hybrid systems in diverse application settings. Come et al. [2] and Ijeoma et al. [4] highlighted the role of solar-wind-battery-generator combinations in off-grid electrification for developing regions, demonstrating improved reliability and sustainability. Similar studies by Alturki and Awwad [5] and Yimen et al. [10] evaluated combinations of wind, PV, diesel generators, and pumped hydro, focusing on improved system performance under resource variability.

Dalwadi et al. [1] and Kaladgi et al. [6] discussed the optimization and fabrication of small-scale hybrid systems, proposing simple, modular architectures for distributed generation and portability. Al-Rawashdeh et al. [11] and White et al. [9] further explored multi-scenario planning for hybrid integration in green buildings and rural contexts.

1.2 Sizing, optimization, and economic feasibility

Proper component sizing remains crucial for system efficiency and cost-effectiveness. Bansal [12] and Khan et al. [13] underlined the importance of optimization techniques to size each subsystem (PV, WT, battery) based on load forecasting and resource availability. Huang et al. [14] reported energy gains up to 54% by substituting multiple small WTs with a single larger turbine. Neubauer and Pesaran [15] and Kusakana [16] examined the economic impact of batteries and hydrokinetic combinations, respectively. However, Maheshwari and Guptha [17] cautioned that in some cases, hybrid systems may not be economically favourable compared to traditional energy sources.

1.3 Battery management and smart control systems

Battery health, charge balancing, and real-time control are central to hybrid system performance. Several algorithms have been proposed to enhance SOC estimation and active balancing in lithium-ion batteries, showing improvements in thermal uniformity and energy retention [18,19,20]. While the current study does not incorporate such algorithms, it experimentally evaluates voltage and temperature behavior across different input modes, providing baseline data for future controller integration.

1.4 Emerging digital and decentralized technologies

Smart energy systems are also being enhanced through digital platforms. For example, SCADA-, IoT-, and LSTM-based optimization frameworks tailored for semi-grid and mining-sector operations have been demonstrated successfully [21,22,23]. Blockchain solutions have been proposed to secure energy transactions and improve load coordination [24,25,26]. Ahmed et al. [27] introduced real-time monitoring tools using FFT-based detection for power quality improvement. While these approaches primarily target interconnected systems, their decentralization principles are relevant for remote and stand-alone applications.

1.5 Gap in literature and present work’s contribution

Most previous studies are simulation-based or aimed at large-scale installations. Very few works experimentally validate compact, low-load hybrid systems (10–100 W range) tailored for inaccessible, high-altitude locations. The present study addresses this gap through real-world testing of a solar–wind–battery hybrid prototype with integrated AC backup. Key contributions include thermal profiling of LFP batteries under real loads, source-switching logic for autonomous operation, and a tower-integrated, portable structural design suitable for energy-resilient deployment in challenging terrains.

2 Experimental set up and working procedure

2.1 System requirement specifications

A modular system that works autonomously was conceived. It generates its own power for the surveillance-cum-communication devices, i.e., a CCD camera and a radio transceiver, whose combined power consumption is 10 W. The overall system is required to be designed such that the battery supports 3–4 days of uninterrupted operation even under the harshest environmental conditions, when neither solar nor wind power is available.

2.2 Site conditions and resource availability

The system is designed for deployment in remote, high-altitude regions such as Leh, Ladakh (34.1° N, 77.6° E, elevation ~ 3500 m). To validate environmental feasibility, publicly available satellite-modeled climate data were obtained from the NASA POWER database (https://power.larc.nasa.gov/).

Table 1 presents the monthly average solar irradiance and wind speed over a typical year. The region receives 4.5–6.0 kWh/m²/day of solar energy and average wind speeds of 5.5–6.2 m/s, with moderate seasonal variation. These inputs justify the use of PV–wind hybridization with battery backup for consistent load support.

Table 1 Average monthly solar irradiance and wind speed in target areas

2.3 System arrangement

We proposed a hybrid energy system in the form of a tower (Fig. 1), where a vertical axis wind turbine (VAWT) is placed at the top of the axis, and the solar panel(s) are attached to the pole/axis of the tower. The battery and other components of the hybrid energy system were placed in a suitcase attached to the tower. Thus, the system comprised two PV arrays, a wind turbine (WT), a battery bank, an inverter/charger, and a maximum power point tracking (MPPT) controller. The configuration and switching logic of the proposed hybrid PV/wind/battery system is illustrated in Fig. 2. A flowchart of the working of the system is shown in Fig. 3. The DC power generated was converted to AC by the inverter to supply the load. The MPPT controller protected the battery system from damage and ensured the PV system operated at its maximum power point. At present, a CCD camera and a radio transceiver, both drawing AC power, were co-located on the tower.

Fig. 1

Assembled view of solar/wind/battery hybrid energy system

Fig. 2

Block diagram and switching logic of the solar/wind/battery hybrid system. As the system switches on, the battery SOC is checked first. If the battery SOC > 50% and the load is connected, the battery immediately starts discharging. If the battery SOC < 50%, the battery draws current from the available source to charge. The default source for charging the battery is solar. If irradiance < 400 W/m², priority no. 2, i.e., the wind turbine, is used to draw current as long as the wind speed > 1 m/s. When neither the solar nor the wind source is available, the available AC source (diesel generator) is switched on to charge the batteries. When an AC source is available, it can also directly supply current to the AC load

Fig. 3

Operational algorithm of the hybrid solar–wind–AC–battery energy system

The system continuously monitors the battery’s state of charge (SOC) and follows a prioritized logic for energy input selection. When the load is active and battery SOC is below the threshold (typically 50%), the system sequentially checks for availability of solar, wind, and finally AC sources. Upon detecting an available source, the system begins charging the battery until a predefined upper SOC limit (~ 90%) is reached. If no source is available, the system enters a low-power protection mode to prevent deep discharge. This flowchart represents the autonomous decision-making logic embedded in the system’s controller for uninterrupted off-grid operation.
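The prioritized source-selection logic described above can be sketched in a few lines. This is a minimal illustration, not the actual controller firmware; the thresholds (SOC 50%/90%, 400 W/m² irradiance, 1 m/s wind) are taken from the text, while all function and variable names are our own illustrative choices.

```python
# Minimal sketch of the prioritized source-selection logic of Figs. 2-3.
# Thresholds are the ones quoted in the text; names are illustrative.

SOC_LOW = 50.0          # %, below this the battery must be charged
SOC_HIGH = 90.0         # %, charging stops at this upper limit
MIN_IRRADIANCE = 400.0  # W/m2, below this solar is skipped
MIN_WIND_SPEED = 1.0    # m/s, below this wind is skipped

def select_source(soc, irradiance, wind_speed, ac_available):
    """Return the charging source chosen by the priority chain,
    None when no charging is needed, or 'protect' when no source exists."""
    if soc >= SOC_LOW:
        return None                      # battery serves the load directly
    if irradiance >= MIN_IRRADIANCE:
        return "solar"                   # priority 1
    if wind_speed >= MIN_WIND_SPEED:
        return "wind"                    # priority 2
    if ac_available:
        return "ac"                      # priority 3: AC backup (DG set)
    return "protect"                     # low-power protection mode

def charging_complete(soc):
    """Charging is terminated once the upper SOC limit is reached."""
    return soc >= SOC_HIGH
```

For example, `select_source(40, 100, 2.0, False)` returns `"wind"`, matching the second-priority branch: the SOC is below threshold, irradiance is too low for solar, and the wind speed exceeds 1 m/s.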

2.4 Modelling

2.4.1 Battery storage

In off-grid renewable energy systems, batteries are the most commonly used devices for storing energy. The intermittent nature of energy produced by wind and photovoltaic systems makes it crucial to choose the right size (in terms of capacity) for the battery bank, which supports the energy system and ensures continuous supply to meet demand [28]. The overall battery size largely depends on the required duration of load support without recharging. Additionally, the allowable depth of discharge (DOD) is a key factor in battery sizing. For this reason, deep-cycle energy storage units in battery form are widely used in off-grid renewable energy systems [29]. To determine the required battery size for a hybrid system, the following equation [29] was used:

$$C_{BR}=\frac{L \times D_{atn}}{DoD_{max} \times D_{f}}$$
(1)

Here, CBR = Battery bank capacity (Wh).

L = Load consumed through the day (Wh/day).

Datn = Number of autonomous days.

DoDmax = Maximum depth of discharge.

Df = Derate factor.

In the present case, L, as explained above, is 240 Wh/day (10 W × 24 h). The required Datn was 3–4 days, as described above; for the worst-case scenario, we considered it to be 4 days. DoDmax was taken as 80%, because batteries with such a depth of discharge are available, and Df was set to 0.5 (maximum stress) based on the datasheet of the Lithium Smart Battery of Victron Energy, Netherlands.

With the above values, CBR was calculated as follows:

CBR = (240 × 4)/(0.80 × 0.5) = 2400 Wh

It is pertinent to mention that the 10 W baseline load represents the continuous operation of a compact video surveillance module and a radio communication device typically used in remote military posts. Peak power consumption may reach 25–35 W, which is accounted for in the battery sizing and energy source prioritization logic. For higher requirements, as well as for systems with more than one camera, the system can easily be scaled up and the battery capacity enhanced.
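The battery-bank sizing of Eq. (1) can be reproduced with a short script. This is a sketch using the values stated in the text (240 Wh/day, 4 days of autonomy, 80% DoD, derate factor 0.5); the function name is a hypothetical choice for illustration.

```python
# Battery-bank sizing per Eq. (1): C_BR = (L * D_atn) / (DoD_max * D_f).
# Input values are those used in the text; the function name is illustrative.

def battery_capacity_wh(load_wh_per_day, autonomy_days, dod_max, derate_factor):
    return (load_wh_per_day * autonomy_days) / (dod_max * derate_factor)

c_br = battery_capacity_wh(
    load_wh_per_day=240,   # 10 W continuous load * 24 h
    autonomy_days=4,       # worst-case autonomy requirement
    dod_max=0.80,          # maximum depth of discharge
    derate_factor=0.5,     # maximum-stress derate from the battery datasheet
)
# c_br is ~2400 Wh, matching the hand calculation
```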

Among the battery types, the lithium iron phosphate (LFP) battery was identified as suitable because of its high current rating and long cycle life (up to 5000 cycles), besides good thermal stability, enhanced safety, and abuse tolerance. LFP is more tolerant of full-charge conditions and is less stressed than other lithium-ion chemistries if kept at high voltage for a prolonged time. LFP cells use iron in place of cobalt or nickel and are therefore considered the most suitable for battery energy storage system (BESS) applications. LFP batteries have a lower energy density, but this characteristic is less important for the present application.

2.4.2 Solar PV

Photovoltaic (PV) energy systems generate DC electricity from solar radiation through the use of PV modules. These modules can be connected in series or parallel configurations to form strings or a PV array. To supply electrical loads, an inverter is required to convert the DC output from the PV system into AC power, which is needed to power devices. The power generated by a PV system is influenced by factors such as solar irradiation, temperature, and cell type [29]. Notably, the output power of a PV system decreases as temperature increases, though other factors may also play a role.

Solar power generation depends primarily on the surface area of the solar panel (A, in m²), the average solar irradiance at the location (H, in kWh/m²/day), and the panel’s efficiency (PR). In the present study, the panel efficiency (PR) is assumed to be 20%, as typical for monocrystalline silicon modules [30].

The total power output (PS) is calculated using:

$$P_S=A \times H \times PR$$
(2)

Where:

  • PS = average daily energy output [kWh/day].

  • A = panel area [m²].

  • H = mean daily solar radiation [kWh/m²/day].

  • PR = panel efficiency (assumed = 0.20).

For remote high-altitude locations, the mean daily solar radiation is assumed to be H = 5.54 kWh/m2/day based on regional satellite data. To convert energy to instantaneous power (W), the equivalent daily generation is distributed over peak sun hours, typically 4 h/day in such locations.

If the target is a 100 W panel, to ensure the battery is replenished daily under minimum insolation, we use:

100 = A (m²) × (5.54 × 1000) (Wh/m²/day) × 0.2

Therefore, A ≈ 0.09 m² = 900 cm²
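The panel-area arithmetic above can be checked with a few lines; a sketch assuming the figures used in the text (H = 5.54 kWh/m²/day, PR = 0.20, 100 W target), with illustrative variable names.

```python
# PV area per Eq. (2), rearranged: A = target / (H * PR).
# H is expressed in Wh/m2/day, as in the in-text calculation.

H_wh = 5.54 * 1000     # mean daily solar radiation, Wh/m2/day
PR = 0.20              # assumed panel efficiency (monocrystalline)
target = 100           # figure used in the text for a 100 W panel

area_m2 = target / (H_wh * PR)   # ~0.09 m2
area_cm2 = area_m2 * 1e4         # ~900 cm2, as quoted in the text
```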

A single PV cell produces relatively low voltage and current; typically, a PV cell generates about 0.5 V, with current output varying based on sunlight intensity and cell surface area [31]. To increase the power output, PV cells are connected in series (to boost voltage), in parallel (to increase current), or in a series–parallel configuration (to achieve the desired current and voltage), forming a PV panel or module. Likewise, PV panels can be interconnected in series and/or parallel to create a PV array tailored to the application’s specific voltage and current requirements.

2.4.3 Wind turbine

The output power of a wind turbine at a particular location is influenced by several factors. For instance, as wind speed increases, the available energy also increases, following a cubic relationship. Additionally, wind tends to blow at higher and more consistent speeds at greater heights. Air temperature plays a role as well; colder (denser) air contributes to higher energy levels. The maximum energy that can be extracted from a wind turbine is roughly proportional to the swept area of the rotor [32]. This power can be calculated using the following equation:

$$P=\frac{1}{2}C_P\rho A v^3$$
(3)

Where, ρ is the air density (kg/m³),

A is the rotor swept area (m²),

v is the wind speed (m/s), and

CP is the power coefficient of the wind turbine (Betz limit: 0.593).

Because the wind blows irregularly and for shorter periods, a higher-power turbine is required compared to solar PV. A 500 W turbine was planned. This seemingly oversized turbine was intentionally chosen due to the highly intermittent nature of wind in high-altitude regions. Given that usable wind occurs in short bursts, often at night or during inclement weather when solar is unavailable, a higher-capacity wind turbine helps maximize energy capture during brief favourable windows. It compensates for the short duty cycle with a higher instantaneous yield, ensuring the battery can recover faster during wind events. While this may appear oversized relative to the average 10 W load, it aligns with the design philosophy of ensuring redundancy and rapid recharge in power-scarce, solar-deficient scenarios. This trade-off was also influenced by portability constraints: a single 500 W unit with a smaller swept area offers better structural and deployment efficiency than multiple smaller turbines.

For high-altitude locations, such as at 10,000 ft, the air density (ρ) is lower than at sea level. Based on standard atmospheric models, the air density is assumed to be ρ = 0.90 kg/m³. The average peak wind speed at such an altitude is estimated as v = 6 m/s, based on IndianClimate.com data.

Substituting the assumed values to determine the rotor area required for a 500 W turbine:

500 = ½ × 0.593 × 0.9 × A × 6³

Thus, A = 8.67 m2
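Rearranging Eq. (3) for the swept area reproduces the figures above; a sketch using the assumed values from the text (Cp = 0.593, ρ = 0.90 kg/m³, v = 6 m/s).

```python
import math

# Rotor sizing per Eq. (3), rearranged: A = 2P / (Cp * rho * v^3).

P = 500.0     # W, planned turbine rating
Cp = 0.593    # Betz limit, as used in the text
rho = 0.90    # kg/m3 at ~10,000 ft altitude
v = 6.0       # m/s, estimated peak wind speed

swept_area = 2 * P / (Cp * rho * v**3)    # ~8.67 m2
radius = math.sqrt(swept_area / math.pi)  # ~1.66-1.67 m
```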

The radius of the wind turbine was thus calculated to be 1.67 m. Because the wind turbine is the second-choice source for charging the battery, the calculations were carried out for the best-case scenario (peak wind speeds). Another reason for these calculations was the ultimate objective of keeping the equipment portable and easy to handle while minimizing its signature. Nevertheless, it is recognized that the wind would not always blow at the peak speed. Therefore, for academic purposes, data on year-round wind speed in Ladakh were derived from various databases (NASA POWER database and Wanderlog; https://wanderlog.com/) and published literature [33], and an average value of 3.30 m/s was arrived at. A probabilistic analysis using the Weibull wind speed distribution was carried out; the Weibull distribution is a widely adopted probability function in wind energy studies for modelling variable wind speed patterns and avoiding overestimation resulting from theoretical peak speeds [34, 35].

The wind speed distribution \(f(v)\) is given by:

$$f(v)=\frac{k}{c}\left(\frac{v}{c}\right)^{k-1}e^{-\left(\frac{v}{c}\right)^{k}}$$
(4)

where:

v is the wind speed (m/s),

k is the shape parameter (here assumed 2.0) (Shaban et al. 2000),

c is the scale parameter, derived from the mean wind speed (3.30 m/s for Ladakh) as c = v̄/Γ(1 + 1/k) ≈ 3.72 m/s.

The expected wind power density Pavg​ per unit area was then calculated as:

$$P_{avg}=\int_{0}^{\infty}\left(\frac{1}{2}\rho C_{p} v^{3}\right)f(v)\,dv$$
(5)

Here, ρ is the air density (0.90 kg/m³) at an altitude of 10,000 ft asl, and

Cp is the power coefficient, taken as 0.35 for a small wind turbine [36].

These calculations give an expected power density of 10.81 W/m². If 200 W is required from the wind turbine, the required swept area A is the required wattage divided by the expected power density, i.e., 200/10.81 = 18.50 m². The rotor diameter determined from this circular area comes out to be 4.85 m, which clearly does not support the wind turbine as a sustainable primary or main source of energy. Therefore, the rest of the calculations and analysis consider only the peak wind speed of 6 m/s.
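The Weibull-based estimate can be reproduced numerically. The sketch below uses the closed-form moment E[v³] = c³Γ(1 + 3/k) rather than evaluating the integral in Eq. (5) directly, with the scale parameter derived from the 3.30 m/s mean; this is our reading of how the 10.81 W/m² figure is obtained.

```python
import math

# Expected wind power density per Eqs. (4)-(5). For a Weibull distribution,
# the integral reduces to P_avg = 0.5 * rho * Cp * c^3 * Gamma(1 + 3/k),
# with the scale parameter c obtained from the mean: c = v_mean / Gamma(1 + 1/k).

k = 2.0            # shape parameter
v_mean = 3.30      # m/s, year-round average for Ladakh
rho = 0.90         # kg/m3 at ~10,000 ft altitude
Cp = 0.35          # power coefficient for a small wind turbine

c = v_mean / math.gamma(1 + 1 / k)                     # ~3.72 m/s
p_avg = 0.5 * rho * Cp * c**3 * math.gamma(1 + 3 / k)  # ~10.81 W/m2

area_200w = 200 / p_avg                        # ~18.5 m2 swept area for 200 W
diameter = 2 * math.sqrt(area_200w / math.pi)  # ~4.85 m rotor diameter
```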

2.4.4 Other major components

MPPT solar charge controllers were used for charging the batteries through solar energy. MPPT controllers operate at the maximum power voltage, maximizing the amount of power produced; this becomes significant in colder conditions (including high-altitude cold climates), when the array voltage rises increasingly above the battery voltage. MPPT controllers can also operate with much higher voltages and lower array currents, which can mean fewer strings in parallel and smaller wire sizes, since there is less voltage drop. For the present application, an MPPT controller with a charge current of 15 A was used, specifically designated for managing the photovoltaic (PV) input, which under peak conditions (100 W PV at ~25 V) would typically produce around 4–5 A, well within the controller’s capacity. The wind input is independently managed through a dedicated 600 W wind charger unit and does not route through the MPPT. This separation prevents overload and allows each renewable source to be optimized via its own charge regulation circuitry. Moreover, simultaneous peak output from both sources is highly unlikely due to environmental conditions in high-altitude regions: high irradiance typically coincides with low wind speeds, and vice versa. As such, the controller ratings were chosen to balance cost, weight, and real-world operating profiles without significantly compromising energy harvesting potential.

DC-DC converters were used for obtaining outputs in the 12 V and 5 V DC ranges; they switch at a fixed frequency, using a forward topology with magnetic feedback. A 500 W inverter/charger with two main AC outputs, adhering to the EN-IEC-60335-1 & 2 standards, was used in the present application. An inverter/charger is necessary to balance the energy flow between AC and DC components: the inverter converts electrical power from DC to AC, while the charger rectifies AC to DC. Standalone inverters and converters are specifically designed for independent, utility-free power systems, making them suitable for remote hybrid applications. Most modern inverters achieve a DC-to-AC conversion efficiency of approximately 90% or higher.

All electronic components used in the prototype — including the inverter/charger, MPPT controller, battery management system, wind charge controller, and DC-DC converters — were sourced as commercial off-the-shelf (COTS) modules. This approach was intentionally chosen to ensure ease of integration, replicability, and rapid field deployment. No component-level customization or embedded hardware development was undertaken. The mechanical assembly, such as the support frame and integrated tower housing, was custom-fabricated to accommodate environmental constraints and portability needs.

2.5 Losses and derating

Known efficiency values of the various components were used to derive the internal losses and the associated de-rated values (Table 2).

Table 2 Component-wise derating and overall system efficiency estimate

Overestimation was thus avoided by accounting for this efficiency factor. That is, in order to meet the required usable power (Preq), the generation capacity (Pgen) has to be enhanced, and is obtained by dividing Preq by ηtotal. Because a 100 W solar panel would yield only about 72% of its rating, correct sizing would call for a 100/0.72 ≈ 139 W panel. Because such a denomination is not available in the market to be integrated into the system, we continued using a 100 W panel and accounted for the overall 72% system efficiency. In other words, as estimated above, in four hours of peak irradiance the system would generate about 288 Wh of energy.

The battery capacity and autonomy have also been re-calculated. From the above discussion, a 10 W load is expected on a single system. For continuous support over four days, the total energy required is 10 × 24 × 4 = 960 Wh. We have used a 2560 Wh battery, considering 50% derating at −20 °C. Considering the losses in battery charging and discharging and the associated derating factor of 0.9, the requirement becomes 960/0.9 = 1067 Wh. Now accounting for inverter and wiring efficiencies, with derating factors of 0.92 and 0.95 respectively, the battery requirement becomes 1067/(0.92 × 0.95) ≈ 1221 Wh.

The existing choice of an LFP battery of 25.6 V, 100 Ah (2560 Wh) is still good enough to meet this level of derating.
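The derating chain of this section can be summarized in a few lines; a sketch using the factors quoted in the text (battery 0.9, inverter 0.92, wiring 0.95), with illustrative variable names.

```python
# Derated battery-energy requirement, Sect. 2.5. The usable-energy need is
# divided by each efficiency factor in turn; factors are those from the text.

load_w = 10                 # continuous load, W
autonomy_h = 24 * 4         # four days of autonomy, hours

energy_wh = load_w * autonomy_h               # 960 Wh delivered at the load
after_battery = energy_wh / 0.9               # ~1067 Wh, battery round-trip losses
required_wh = after_battery / (0.92 * 0.95)   # ~1221 Wh, inverter + wiring losses

# A 25.6 V, 100 Ah LFP pack stores 2560 Wh; even at 50% cold derating
# (~1280 Wh usable) it covers the requirement computed here.
```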

2.6 Testing

A basic functionality test (BFT) was conceptualized and is detailed in Fig. 4. The equipment was tested exclusively on DC sources (2000 W resistive load via a Rigol module) and AC sources (2000 W resistive load via a Chroma AC load unit, USA).

Fig. 4

Basic Functionality Test (BFT) used to test the proposed equipment

2.7 Preliminary life cycle impact estimation

A first-order estimation of the environmental impact of the system was conducted in terms of embodied carbon dioxide equivalent (CO₂e) emissions for key components: battery, photovoltaic (PV) panels, wind turbine (WT), inverter, and structural frame. This screening-level Life Cycle Assessment (LCA) follows a cradle-to-gate boundary — covering raw material extraction, manufacturing, and transportation. Emission factors were drawn from published LCA databases and literature values, including Ecoinvent v3.8 (via OpenLCA), the IEA-PVPS Task 12 reports on PV life-cycle impacts [40], and published studies on lithium iron phosphate battery production [41, 42]. Usage-phase emissions were considered negligible due to the renewable-driven operation of the system. The functional unit was defined as a single operational hybrid system delivering 10 W continuous power over a 10-year life span. A more detailed cradle-to-grave LCA is proposed for future work.

2.8 Cost estimation approach

A high-level capital expenditure (CAPEX) estimate was prepared to assess the economic relevance of the proposed system. Costs were compiled for key components — including the battery, solar panels, wind turbine, controllers, and supporting structures — based on commercial vendor listings and supplier quotations obtained during the prototype’s procurement phase. Operational costs (OPEX) were considered minimal due to the passive and maintenance-light nature of the design. No financing or depreciation models were applied, as the intent was to provide a ballpark figure relevant to similar deployment scenarios.

3 Results

3.1 Basic functionality testing of the equipment

Running the equipment for 24 h on the projected load reduced the battery state of charge only minutely, as expected (Fig. 5). The temperature of the battery also did not show any significant change during this period, rising from 26 °C to 27.3 °C in the first 24 h, when the load was < 1%. For accelerated testing, the load was increased to 2000 W, i.e., approximately 78% of the battery capacity. The battery reached its cut-off voltage in about an hour and fifteen minutes thereafter (Fig. 5). The temperature of the battery rose in these 75 min from 27.3 °C to 30.2 °C.

Charging the battery using the AC source, i.e., 230 V AC, 6 A, through the 500 W inverter/charger took 4 h 30 min to raise the battery voltage from 21.8 V to 27.5 V. The battery temperature during the process increased from 26 °C to 31 °C.

When the equipment was tested in hybrid mode, the default source of power selected was the battery. When the battery charge dropped to 50%, DC charging through solar would begin. If solar was not available, power from the wind turbine would be drawn. If both solar and wind were unavailable, AC charging would take over. The battery would be charged to 90% and then switched off. As shown in Fig. 6, this scheme worked seamlessly both for the first 24 h, when the load was 10%, and in the subsequent 1 h, when the load was increased to 100%. During this testing procedure, the temperature of the battery rose to 40 °C within one hour and stayed between 40 and 49 °C throughout the test period of 25 h. Charging was switched on between hours 15 and 17. During that part of the test period, neither solar nor wind was available; therefore, the AC source was switched on. Later, when the load was increased to 100%, both the solar and the AC source were activated.

Fig. 5

Battery voltage profile under mixed load discharge. From 0 to 24 h, a 10 W continuous load was applied, showing a gradual voltage decline. After 24 h, the system was subjected to a 500 W load, leading to rapid discharge. A clear inflection point at hour 24 indicates load switching. Voltage values are recorded at hourly intervals

In order to test the DC sources, the battery, close to its cut-off voltage (22 V), was put on a 2000 W load with the DC sources switched on. The system supported the load and charged the batteries to 100% SOC in about 33 h (Table 3).

Fig. 6

Battery charging profile under hybrid operating mode. The system supported a 10% base load for 24 h, followed by a full-load burst for 1 h. An increase in voltage observed between hour 15–16 indicates effective multi-source charging (solar, wind, and AC backup). The sharp voltage drop after hour 24 corresponds to the high load phase

Table 3 Operation of the equipment under solar/wind/battery hybrid mode

A similar study relying on the AC source was completed with battery charging in 21 h, but the battery temperature in this case went higher than with DC-source charging, as shown in Table 4. Starting from an initial battery voltage of 22.0 V and a temperature of 26.9 °C, the voltage steadily increased over a period of 21 h, reaching 27.5 V, indicating a complete charging cycle close to 100% SOC. Notably, the battery temperature also exhibited a progressive increase, peaking at 54.4 °C after 18 h.

The charging rate is initially moderate but stabilizes near the end, indicating appropriate voltage regulation by the inverter/charger. The rise from 24.0 V to 27.5 V over 17–18 h corresponds with the gradual charging of a lithium iron phosphate (LFP) cell nearing full capacity, which typically shows a voltage plateau in the final phase of charge. Further, compared to DC-based charging (Table 3), the battery exhibited slightly higher temperatures under AC-based charging. This is likely due to internal conversion losses during AC-DC rectification and the additional load on the inverter-charger circuitry [43]. Despite this, the temperatures remained within the safe operating range for LFP chemistry, which is known for its thermal stability and abuse tolerance.

A temperature of 54.4 °C, as observed during prolonged AC-based charging, does approach the upper thermal threshold for lithium iron phosphate (LFP) batteries. Most commercial LFP cells are rated for continuous operation up to 60 °C, beyond which accelerated aging and slight performance degradation may occur over repeated cycles [30]. Nevertheless, the cell chemistry is considered thermally stable and non-flammable up to 100–120 °C, offering a wide safety buffer. In our setup, the temperature rise was attributed to internal losses during rectification and ambient test conditions without active cooling. While cycle life may reduce marginally with repeated high-temperature exposure, the thermal response observed in this trial remained within datasheet-defined safety margins. Future iterations of the system may include passive heat sinks or improved ventilation, particularly for enclosed or high-altitude installations where ambient cooling is limited.

The results here support the viability of AC-based charging as a reliable fallback in absence of renewable sources, but also underscore the need to monitor thermal trends, especially for prolonged usage in confined enclosures at high altitude where cooling options may be limited.

Table 4 Operation of the equipment under AC source-Battery hybrid mode

To enable comparative understanding of the system’s performance across different operating modes, a summary of battery behaviour is presented in Table 5. Key variables such as voltage drop, recharge efficiency, and thermal behaviour are tabulated with reference to operational context. These values are representative of repeatable trends observed during controlled testing and reflect both baseline and stress-testing conditions.

Table 5 Summary of battery performance metrics across test scenarios

3.2 Preliminary estimation of LCA

In addition to functional validation, a preliminary estimation of the embodied carbon footprint was conducted for key components of the system. This analysis considers cradle-to-gate emissions associated with manufacturing and material sourcing of the battery, solar panels, wind turbine, inverter, and structural elements. The objective was to provide a first-order assessment of environmental tradeoffs associated with deploying such systems in remote regions. The emission factors were compiled from established literature [41, 42, 44] and LCA databases (IEA-PVPS, https://iea-pvps.org/). Table 6 summarizes the estimated CO₂-equivalent emissions for each component.

Table 6 Estimated embodied carbon footprint (Cradle-to-Gate co₂e Emissions) of major components

3.3 CAPEX/OPEX estimates

A basic cost estimate for the prototype system is presented in Table 7. The total estimated capital cost is ₹1.75 Lakhs (~ USD 2050), depending on component selection and vendor pricing. The LFP battery constitutes the largest share of CAPEX, followed by the wind turbine and inverter. Operational costs are nominal, excluding optional maintenance, due to the off-grid and passive configuration of the system.

Table 7 Estimated capital costs (CAPEX) for prototype system components

4 Discussion

4.1 Relevance of the study

The discussion is based on the operation of the experimental system under controlled laboratory conditions and on its short-term performance. The purpose of the study was to size the hybrid energy system to ensure uninterrupted power supply for surveillance and transmission equipment in unmanned locations and to support autonomous operation. The present set of experiments validates this objective. Previous studies strongly indicate PV-WT-Battery or PV-WT-DG Set-Battery to be suitable forms of hybrid energy system [29, 45], both in terms of technical feasibility and techno-economics. Halabi and Mekhilef [46] proposed a Solar PV-DG Set-Battery hybrid as a suitable system for a remote village in Malaysia, based on simulation techniques. Table 8 outlines key limitations identified in selected prior works and how the present study addresses them with a unique experimental setup tailored to high-altitude, remote deployment. In summary, our work combines portability, autonomy, real-environment validation, and component-level optimization, particularly for challenging terrains such as Ladakh in India.

Table 8 Research gaps in hybrid energy systems and present work’s positioning

PV: Photovoltaic; WT: Wind Turbine; DG: Diesel Generator.

Given the area’s topography, available energy resources, and specific energy needs, hybrid energy systems can be developed and optimized to meet local requirements. Properly sizing these renewable energy-based hybrid systems can greatly enhance both the economic and technical efficiency of power supply, while also encouraging the adoption of these environmentally friendly energy sources. A number of algorithms and software tools are now available for sizing energy sources, and their performance can be studied using simulation software [29, 45, 47,48,49,50,51,52]. In the targeted area of Ladakh in the present study, the larger share of renewable generation is likely to come from PV, with WT playing a largely supplementary or alternative role. The battery unit is an economical choice for remote areas. Many studies have been conducted in this area, such as those by Mishra et al. [47], Zhou et al. [54], Ma et al. [55], and Hemeida et al. [29].

4.2 Performance evaluation of the system

Although the system is designed to ensure four days of uninterrupted power supply even when no power source is available, it can readily meet up to 2.5 kW of power demand with the battery supported by solar PV, WT, and an AC source, when available. The battery bank was sized to support 3–4 autonomous days of operation at a 10 W load, accounting for worst-case weather scenarios typical of high-altitude, cold-weather, remote sites. The Depth of Discharge (DoD) was assumed to be 80%, a conservative yet practical choice for LFP batteries based on their commercial availability and performance under deep-cycle conditions. A derating factor of 0.5 was applied to simulate harsh environmental stress, taken from the technical datasheet of Victron Energy’s lithium smart battery range, which covers operation in cold and fluctuating climates [56].
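The autonomy sizing described above can be sketched as a short calculation. The figures (10 W load, 4 autonomous days, 80% DoD, 0.5 derating, 2560 Wh installed bank) are taken from the text; the function itself is an illustrative reconstruction, not the authors' exact procedure.

```python
# Sketch of the battery-autonomy sizing logic described above.
# Input figures are from the text; the function is illustrative only.

def required_battery_wh(load_w: float, autonomy_days: float,
                        dod: float, derating: float) -> float:
    """Installed capacity needed, scaling the load's energy draw up by
    depth-of-discharge and an environmental derating factor."""
    energy_needed_wh = load_w * 24 * autonomy_days  # Wh drawn by the load
    return energy_needed_wh / (dod * derating)      # installed Wh required

capacity = required_battery_wh(load_w=10, autonomy_days=4, dod=0.8, derating=0.5)
print(f"Required capacity: {capacity:.0f} Wh")          # 2400 Wh
print(f"Installed bank: 2560 Wh, margin {2560 - capacity:.0f} Wh")
```

On these assumptions, the 2560 Wh bank in the prototype covers the 2400 Wh requirement with a modest margin.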

The solar irradiance value of 5.54 kWh/m²/day was derived from historical meteorological data for remote regions in northern India and validated against sources such as the India Meteorological Department (IMD), MNRE, and NASA-SSE datasets [57, 58]. The 20% panel efficiency reflects the use of commercially available monocrystalline PV modules under optimal tilt. This assumption aligns with the region’s average clear-day duration in winter months, which informed the conservative sizing of the solar array.

A wind speed of 6 m/s was used to model the VAWT output, based on climatological averages reported for high-altitude sites at 10,000 ft elevation from IndianClimate.com and validated against empirical wind maps. The air density was adjusted to 0.90 kg/m³, reflecting the lower atmospheric pressure at such altitudes. The power coefficient (Cp) was set to 0.593, the Betz limit, which represents the theoretical maximum efficiency of wind energy conversion and serves as an upper bound in design calculations [59].
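The wind modelling above follows the standard actuator-disc relation, which can be evaluated directly with the stated site figures. The swept area is left as a parameter since the turbine geometry is not restated here; per square metre of swept area, the Betz-capped capture works out to roughly 58 W.

```python
# Standard wind power relation, P = 0.5 * rho * A * v^3 * Cp, with the
# document's site figures. Swept area is a free parameter here.

def wind_power_w(rho: float, v: float, area_m2: float, cp: float) -> float:
    """Captured wind power (W) at air density rho, speed v, coefficient cp."""
    return 0.5 * rho * area_m2 * v**3 * cp

p_density = wind_power_w(rho=0.90, v=6.0, area_m2=1.0, cp=0.593)
print(f"Betz-limited capture: {p_density:.1f} W per m^2 of swept area")
```

Since practical small VAWTs operate well below the Betz limit, this figure is an upper bound, consistent with its role in the design calculations above.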

The performance of the system was slightly lower than the theoretical output, which could be attributed to the choice of sub-components, internal consumption, and minor losses. Nevertheless, the observed battery discharge behaviour under a 10 W load showed a minimal voltage drop (from 27.5 V to 26.8 V over 24 h), indicating stable performance and low self-discharge. The battery used in the system shows 27.5 V at 100% SOC, although LFP batteries can reach up to 29 V at 100% SOC. Conversely, the observed cut-off voltage was 21.8 V, as against the 20 V stated in the datasheets of reputed brands. Under a 2000 W load, the system exhibited complete discharge within 75 min, reaching the lower cut-off voltage. These results demonstrate good energy retention but also reflect the limitations of passive charge control. Load variations can also cause voltage dips at the DC bus [60].

The use of programmable resistive loads allows precise current and power control, simulating field-representative peak load conditions such as power-hungry surveillance transmitters or heating elements under worst-case scenarios. The test loads were purely resistive, avoiding reactive components, which is acceptable given the constant power nature of expected applications in remote defence outposts.

Compared to the intelligent SOC estimation methods proposed by Qays et al. [18] and the dynamic balancing techniques in Qays et al. [19], our system represents a hardware-verified baseline without algorithmic intervention. Integration of such strategies in future iterations may help reduce thermal gradients during fast charging (AC mode) and improve battery longevity in field deployments. All calculations were made in a conventional manner by pen and paper, and the equipment was assembled in a straightforward way. Despite the simplicity of the methods, the assembled equipment performed to expectations. The term “modest assembly” refers to the system’s simplified architecture: a plug-and-operate design using compact, commercially available modules such as Victron’s inverter and MPPT controller. These were selected not for cost minimization but for reliability, ease of integration, and rapid prototyping in fieldable formats. While this may increase unit cost compared to fully custom low-cost solutions, the trade-off ensures performance predictability in harsh conditions. Moreover, the modular architecture enables replication with minimal training, which is advantageous for deployment in infrastructure-scarce or logistically sensitive locations. The design intent prioritizes energy resilience and operational continuity over cost-efficiency for mass rural electrification, which was not the focus of this study.

Importantly, the initially proposed energy model did not incorporate inverter, controller, and wiring losses, which can cumulatively reduce usable energy by 25–30%. Integrating typical derating factors into the system model yields more conservative and realistic performance projections. This correction affects battery sizing and supports the design decision to include AC charging as a backup in low-resource conditions. In summary, the present work successfully validates a functional prototype of a PV–WT–AC–Battery hybrid system under simulated load and environmental conditions. While these tests confirm basic operational readiness and source-switching logic, extended field trials are underway to assess performance under real-world stressors such as wide temperature swings, dust ingress, high winds, and seasonal variability. These ongoing evaluations will be essential in determining long-term system reliability for high-altitude, off-grid deployment.

While the present study validates the feasibility of a compact hybrid solar–wind–battery system for remote deployment, several limitations must be acknowledged. First, the experimental setup was tested under controlled laboratory conditions and simulated load scenarios. Although these trials reflect high-altitude use cases, real-world deployment, especially in extreme terrains, may introduce additional variability such as snow load, icing on PV panels, wind intermittency, or seasonal solar shifts, which were not fully modelled. Second, the energy management logic relies on predefined thresholds and source priority rules; it lacks adaptive control based on real-time weather forecasts, predictive load demand, or dynamic optimization.

It is acknowledged that the current validation is based on short-duration testing under simulated conditions. Long-term or seasonally varied field trials — particularly covering winter extremes, extended low-radiation periods, and real autonomy cycles — are essential for complete system qualification. As of submission, extended field testing is ongoing, and modifications are being introduced iteratively based on performance feedback. These include refinements to passive cooling, energy prioritization logic, and enclosure design. Results from these extended evaluations will be reported in a follow-up study.

Thermal behaviour was monitored, but no active cooling or thermal protection strategy was evaluated, which could become critical under prolonged charging (particularly from AC sources) or enclosed installations. Communication and remote monitoring capabilities (e.g., SCADA, IoT) were not implemented in the current prototype but could enhance responsiveness and diagnostics.

Another shortcoming of the present research is that component sizing relied on deterministic calculations using known environmental inputs. The original sizing of the wind turbine in this study was based on peak wind speeds, which, while sufficient for system prototyping, may not reflect true long-term energy yield. A Weibull-based estimation provides a more statistically grounded perspective by accounting for wind speed variability, and will be included in future versions of the system. Future work will also integrate simulation-based optimization tools such as HOMER Pro or SAM to refine performance under variable climatic and load conditions. Future research should also focus on field deployment of the system under real environmental conditions over multiple seasons to assess long-term reliability and degradation. Incorporating machine learning-based energy management, dynamic source prioritization, and blockchain-enabled remote authentication may further improve resilience and autonomy. Scaling the system for medium-load applications (e.g., remote health units, sensor hubs) while maintaining portability also represents a promising direction.
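The value of the Weibull treatment mentioned above can be illustrated with a short calculation: because turbine power scales with v³, the expected cube of a Weibull-distributed wind speed, E[v³] = c³Γ(1 + 3/k), exceeds the cube of the mean speed. The shape parameter k = 2 (a Rayleigh distribution) is assumed here purely for illustration; the site's actual k would come from measured data.

```python
import math

# Why Weibull-based estimation matters: for a Weibull wind-speed
# distribution, E[v^3] = c^3 * Gamma(1 + 3/k) > (mean speed)^3.
# Shape k = 2 (Rayleigh) is an assumed value for this sketch.

def cube_mean_ratio(mean_speed: float, k: float) -> float:
    """Ratio of E[v^3] to (mean speed)^3 for a Weibull distribution."""
    c = mean_speed / math.gamma(1 + 1 / k)   # scale parameter from the mean
    ev3 = c**3 * math.gamma(1 + 3 / k)       # expected cube of wind speed
    return ev3 / mean_speed**3

ratio = cube_mean_ratio(mean_speed=6.0, k=2.0)
print(f"E[v^3] exceeds (mean v)^3 by a factor of {ratio:.2f}")  # ~1.91
```

Under this assumed distribution, a design anchored on the cube of the 6 m/s mean would understate the available wind energy by nearly a factor of two, while anchoring on peak speeds overstates it; a fitted Weibull model sits between the two.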

The use of COTS modules ensures that the system can be replicated with minimal technical intervention, making it suitable for field applications where on-site custom development is not feasible. Innovation in this study lies in the system-level integration, operational logic, and the environmental resilience of the assembled solution rather than in component-level invention.

With its strengths and shortcomings, the present research directly addresses the challenges of energy reliability in remote and inaccessible regions, making the system suitable for defence, disaster relief, and environmental monitoring applications.

4.3 Preliminary LCA

A preliminary estimation of embodied carbon (Table 6) indicates that the system contributes approximately 974 kg CO₂e during its cradle-to-gate phase, with the largest share coming from the battery and wind turbine. Over a 10-year operating life, even assuming a modest offset of 0.5 kg CO₂e/day by avoiding diesel generation, the system could prevent emission of ~ 1800 kg CO₂e — yielding a net positive climate impact. While not exhaustive, this screening-level analysis supports the assertion that hybrid solar–wind–battery systems can offer long-term environmental advantages, particularly when designed with energy-dense but stable battery chemistries (like LFP) and low-maintenance renewables. Further work will involve full life-cycle modelling, including transportation, end-of-life recycling, and scenario-based comparisons with fossil-fuel alternatives.
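The arithmetic behind this screening-level estimate is straightforward and worth making explicit. The embodied total (974 kg CO₂e) and daily offset (0.5 kg CO₂e/day) are the figures stated above; the implied carbon break-even point is a derived quantity, not a reported result.

```python
# Screening-level carbon arithmetic using the figures stated in the text.

embodied_kg = 974.0        # cradle-to-gate total (Table 6)
offset_per_day = 0.5       # assumed avoided diesel emissions, kg CO2e/day
lifetime_years = 10

offset_total = offset_per_day * 365 * lifetime_years    # ~1825 kg avoided
net_benefit = offset_total - embodied_kg                # ~851 kg CO2e
breakeven_years = embodied_kg / (offset_per_day * 365)  # carbon payback

print(f"Avoided over {lifetime_years} y: {offset_total:.0f} kg CO2e")
print(f"Net benefit: {net_benefit:.0f} kg CO2e")
print(f"Carbon break-even after ~{breakeven_years:.1f} years")
```

On these assumptions the system repays its embodied carbon in roughly the first half of its service life, consistent with the net-positive conclusion above.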

4.4 Economic relevance and payback outlook

From a practical engineering standpoint, the prototype demonstrates a feasible trade-off between reliability and affordability. While premium components such as LFP batteries and branded inverters elevate the initial capital investment to ₹1.75 lakh, this ensures performance consistency and reduces OPEX in remote or infrastructure-scarce regions. Assuming a use-case where diesel transport or generator runtime is minimized, the payback period could range from 3 to 5 years, especially in solar-rich conditions. Moreover, the modular nature of the system allows cost scaling based on power demand. Future studies may explore cost optimization using indigenous components or integrating second-life batteries.
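The 3–5 year range quoted above corresponds to a simple, undiscounted payback calculation. The CAPEX figure is from Table 7; the annual-savings values below are hypothetical placeholders for avoided diesel fuel and transport costs, not measured data, chosen only to show which savings levels bracket the stated range.

```python
# Simple (undiscounted) payback sketch. CAPEX is from Table 7; the
# annual-savings figures are hypothetical placeholders, not measured data.

capex_inr = 175_000  # Rs. 1.75 lakh

def simple_payback_years(capex: float, annual_savings: float) -> float:
    """Undiscounted payback period in years."""
    return capex / annual_savings

for savings in (35_000, 45_000, 58_000):  # assumed annual savings, Rs.
    years = simple_payback_years(capex_inr, savings)
    print(f"Savings Rs.{savings}/y -> payback {years:.1f} y")
```

Annual savings between roughly ₹35,000 and ₹58,000 reproduce the quoted 3–5 year window; a discounted-cash-flow treatment would lengthen these figures somewhat.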

5 Conclusions

The hybrid energy system developed in this study was designed to ensure an uninterrupted 10 W power supply for surveillance and communication equipment in high-altitude, remote, or disaster-prone locations. Experimental results validated the system’s autonomous operation for up to 4 days without solar or wind input, supported by a 2560 Wh LFP battery bank. During low-load (10 W) operation, the battery showed a minimal voltage drop from 27.5 V to 26.8 V over 24 h, with a corresponding temperature rise of only 1.3 °C. In contrast, under high-stress testing with a 2000 W load, battery discharge completed within 75 min and temperatures rose to 30.2 °C. AC-based charging was completed in 4.5 h with the peak battery temperature reaching 54.4 °C, while renewable-only hybrid charging required 33 h with the temperature remaining under 49 °C. The system’s internal logic prioritized energy sources based on SOC and environmental availability, switching seamlessly among PV, wind, and AC. A preliminary life-cycle estimate indicates that, despite the embodied emissions of high-reliability components, the system achieves a net environmental benefit over its operational lifespan, reinforcing its potential as a sustainable off-grid energy solution. Overall, the results demonstrated the system’s effectiveness in meeting energy reliability targets while operating within thermal and electrical safety margins. This compact and modular solution is well-suited for deployment in off-grid field installations, defence outposts, and emergency response applications.