
A 320x240 array (76,800 pixels) has medium resolution, and a 640x480 array (307,200 pixels) is the highest. These arrays also do not require scanning, reducing their complexity. Early thermal imagers also required cryogenic cooling of the detectors, which increased size, weight, and cost while adversely affecting reliability. The development of uncooled arrays operating at room temperature has resulted in more compact imagers, including handheld devices and infantry weapon sights. Uncooled 640x480 thermal binoculars and weapon sights are available at well under 2.2lb (1kg) and have become more affordable. Actively cooled detector arrays continue to offer the highest sensitivity and resolution and remain preferred where size, weight and power constraints are less critical.
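
As a quick sanity check on those figures, the short sketch below works out the pixel counts for the two array formats mentioned (the 76,800-pixel figure corresponds to a 320x240 array) and the resulting resolution ratio. The Python is purely illustrative arithmetic, not tied to any particular imager.

```python
# Pixel counts for the uncooled detector array formats discussed above.
# 76,800 pixels corresponds to a 320x240 array; 640x480 gives 307,200.
ARRAY_FORMATS = {
    "320x240 (medium resolution)": (320, 240),
    "640x480 (high resolution)": (640, 480),
}

for name, (width, height) in ARRAY_FORMATS.items():
    print(f"{name}: {width * height:,} pixels")

# The 640x480 array carries four times the pixels of the 320x240 array,
# which is what buys the longer recognition ranges described below.
print(f"Pixel ratio: {(640 * 480) / (320 * 240):.0f}x")
```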

The rapid advance in thermal imager technology is illustrated by comparing the US Army Thermal Weapon Sight (TWS), first fielded in 2006, with its current replacement, the Family of Weapon Sights – Individual (FWS-I). At 6.7x3.6x3.9in (170x91x99mm) and 1.85lb (0.839kg) it is significantly more compact while offering increased performance. The latter permits not simply detecting a target signature but recognising it at around 4,600 feet (1,400 meters). The stand-alone, clip-on FWS-I is manufactured by firms including L3Harris and Leonardo DRS, which was awarded the most recent US Army contract in October 2022.

The Enhanced Night Vision Goggle (ENVG) combines thermal imaging technology with white phosphor I2 to provide dual-waveband augmented vision to the soldier. The thermal imager used has a larger field-of-view and is ideally suited to detecting targets in low visibility, including in smoke or fog and when concealed in vegetation. The I2 binocular can see through glass and allows maps to be read, complementing the thermal imaging. With a fused presentation, the soldier receives the benefits of both technologies. In addition, the ENVG full-colour display is compatible with wireless communications, rapid target acquisition and augmented reality. This includes the ability to input the image and reticle of a weapon-mounted thermal sight into the goggle, permitting heads-up rapid engagements. The US Army has awarded contracts to both L3Harris and Leonardo DRS for delivery of ENVGs. It is intended to be employed primarily by dismounted infantry, with initial fielding having begun in 2022.
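
The ENVG's actual fusion processing is proprietary, but the idea of a fused presentation can be illustrated with a minimal sketch: assuming two already co-registered 8-bit grayscale frames, one from the I2 channel and one from the thermal channel, a simple weighted blend produces a single combined image. The function and frame names below are hypothetical.

```python
import numpy as np

def fuse_frames(i2_frame: np.ndarray, thermal_frame: np.ndarray,
                thermal_weight: float = 0.4) -> np.ndarray:
    """Blend an image-intensified (I2) frame with a thermal frame.

    Both inputs are assumed to be 8-bit grayscale arrays of the same shape,
    already co-registered so that pixels line up between the two channels.
    """
    i2 = i2_frame.astype(np.float32) / 255.0
    thermal = thermal_frame.astype(np.float32) / 255.0
    fused = (1.0 - thermal_weight) * i2 + thermal_weight * thermal
    return np.clip(fused * 255.0, 0, 255).astype(np.uint8)

# Synthetic example: a dim low-light scene plus a warm target in the thermal band.
rng = np.random.default_rng(seed=0)
i2_scene = rng.integers(0, 60, size=(480, 640), dtype=np.uint8)   # faint I2 imagery
thermal_scene = np.zeros((480, 640), dtype=np.uint8)
thermal_scene[200:240, 300:340] = 255                             # hot spot (e.g. a person)
fused = fuse_frames(i2_scene, thermal_scene)
print(fused.shape, fused.max())
```

Fielded systems use far more sophisticated approaches, such as feature-level fusion, but the weighted blend conveys why the combined image retains both the I2 scene context and the thermal target cue.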

A step beyond the ENVG is the US Army Integrated Visual Augmentation System (IVAS). An adaptation of the Microsoft HoloLens, it not only incorporates night/thermal vision but can present a wide range of information to the wearer. This includes passive targeting, navigation, friendly positions, and the ability to share and view data from outside sources. Referred to as ‘mixed reality’, it can combine real views with overlaid simulated objects, terrain or characters. This allows it to serve as a training aid as well as an operational tool.

IVAS faced a number of technology challenges, including the low performance of its current low-light sensor as well as image distortion, its limited FOV, and soldier form-factor concerns. Although a contract was awarded in 2020, the procurement had been placed on hold pending further testing. However, in September 2022 the acquisition of 5,000 of both the IVAS 1.0 and 1.1 versions was approved by the Secretary of the Army. BGen Christopher Schneider, Program Executive Office – Soldier commander, stated that “work is continuing to correct the issues identified in the current IVAS with a version 1.2 already in the works.” Part of this is development of an Advanced Low Light Sensor (ALLLS), an effort for which Elbit Systems received a contract in September 2022. The objective is to have this new sensor available sometime after 2025 for an IVAS 2.0.

Vehicle 360 Degree

The greatest weakness in a tactical vehicle, particularly armoured vehicles, is the restricted visibility of the crew, which can hinder mobility. The introduction of very compact cameras opened the possibility of placing them to cover blind spots. This was initially done with specialty protected mine clearance and explosive ordnance disposal (EOD) vehicles and later expanded to other combat vehicles. With onboard digitalisation it became possible to increase the number of cameras linked in an onboard network.

Airboss Defense Systems' 360 SA, for example, includes four fixed 102-degree wide-angle cameras plus a single 36X zoom pan-tilt-zoom (PTZ) camera, with controls allowing the operator to switch views. Copenhagen Sensor Technology’s (CST) Cortex Resolve Platform uses seamless stitching, combining images from multiple Citadel high-definition, low-latency cameras to provide all-around views for combat vehicles. Elbit Systems' Iron Vision takes a similar approach with both perimeter cameras and other surveillance and targeting sensors, but displays the information on a soldier’s helmet-mounted visor, drawing on the company's experience with aircrew displays. As Tom Carlisle, director of Army Solutions, added: “It also provides for integration of external surveillance information such as that from a tethered UAS.” Rheinmetall’s SAS incorporates both day and night sensors while employing automatic target recognition and threat sensors like Acoustic Shooter Localisation (ASLS) and laser warning to reduce crew task load. Linking the SAS with onboard active protection and fire controls facilitates response to an identified threat.
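
As a rough illustration of why a four-camera, 102-degree layout like the 360 SA example can cover a full perimeter, the sketch below totals the combined field of view and the overlap left over for stitching between adjacent cameras. It is simple geometry assuming evenly spaced cameras, not based on any vendor documentation.

```python
# Coverage check for a four-camera perimeter layout with 102-degree wide-angle optics,
# as in the 360 SA example above (evenly spaced cameras assumed).
NUM_CAMERAS = 4
HORIZONTAL_FOV_DEG = 102

total_fov = NUM_CAMERAS * HORIZONTAL_FOV_DEG       # 408 degrees combined
overlap_total = total_fov - 360                    # 48 degrees of surplus coverage
overlap_per_seam = overlap_total / NUM_CAMERAS     # about 12 degrees at each camera seam

print(f"Combined field of view: {total_fov} degrees")
print(f"Total overlap available for stitching: {overlap_total} degrees")
print(f"Average overlap at each seam: {overlap_per_seam:.0f} degrees")
```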

The introduction of wide-FOV driver vision enhancers, such as Leonardo DRS’s ESA with 107x30-degree coverage using three uncooled thermal cameras augmented by side and rear cameras, expands on previous drivers’ viewers by offering perspectives covering not only road edges but other hazardous areas.

The benefits of integrated 360-degree surveillance go beyond simply offering a wider view of a vehicle’s surroundings. It also enhances safety, particularly when moving in tight quarters such as urban areas and forests. In a networked system all members of the crew and, in the case of infantry carriers and fighting vehicles, the embarked squad members are able to actively monitor the situation outside the vehicle. The latter are able to prepare themselves and ensure they are ready to deploy appropriately prior to disembarking. The latest Situational Awareness (SA) systems, taking advantage of advances in image processing, are able to offer automatic target detection. This offers the possibility of providing the crew an alert to a potential hazard or threat. The performance of an SA system is partly based on the sensors, especially the cameras, utilised.

Ryan Edwards, business development director for Soldier and Vehicle Electronics at BAE Systems, explained: “Our 360 MVP uncooled thermal cameras provide 120 degree horizontal by 75 degree vertical field of view with a 1920x1200 pixel pitch delivered at 60Hz. The advantages include superior detection range and coverage (fewer cameras needed). The higher frame rate also enables reduced false alarm rate for threat detection applications.”
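
Taking the figures quoted above at face value, a little arithmetic shows what they imply: how few cameras are needed for full horizontal coverage, the angular resolution available for detection, and the pixel throughput each camera generates. This is a back-of-the-envelope calculation only, not a description of BAE Systems' actual configuration.

```python
# Back-of-the-envelope figures derived from the camera parameters quoted above.
HFOV_DEG, VFOV_DEG = 120, 75        # horizontal and vertical field of view
H_PIXELS, V_PIXELS = 1920, 1200     # sensor resolution
FRAME_RATE_HZ = 60

cameras_for_360 = -(-360 // HFOV_DEG)    # ceiling division: 3 cameras for full coverage
h_px_per_deg = H_PIXELS / HFOV_DEG       # 16 pixels per degree horizontally
v_px_per_deg = V_PIXELS / VFOV_DEG       # 16 pixels per degree vertically
throughput_mpx_s = H_PIXELS * V_PIXELS * FRAME_RATE_HZ / 1e6

print(f"Cameras needed for 360-degree horizontal coverage: {cameras_for_360}")
print(f"Angular resolution: {h_px_per_deg:.0f} x {v_px_per_deg:.0f} pixels per degree")
print(f"Pixel throughput per camera: {throughput_mpx_s:.0f} megapixels per second")
```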

It is the integration of the various onboard sensors, cameras, and other detection assets that offers the optimum tactical return in a vehicle situational awareness system. Simply adding cameras and displays risks complicating the crew's ability to perform their functions in fighting the vehicle. Too much information can result in task overload. Addressing this requires a “smarter” system that manages and potentially processes these various sensor inputs.

One approach is Vegvisir from Defensphere, an Estonian-Croatian firm. Vegvisir combines inputs from on-board sensors and innovative cameras, synthesising them to display not only objects of immediate interest but those as much as six miles (10km) distant. Company CEO Ingvar Pärnamäe stated that it “is a modular system that combines four complementary layers of sensors to ensure awareness in a range of tactical situations”. Information is displayed on helmet-mounted displays. In fact, the US Army has similarly explored the application of its IVAS as an option for integrating and presenting information to combat vehicle-mounted infantry. In such an application, the embarked troops can not only access the vehicle's SA assets while on board but then smoothly transition to dismounted squad combat. The concept has been demonstrated by Army Stryker units in field experiments.

BAE Systems' BattleView 360 concept, as Dan Lindell, BAE Systems’ platform manager for the CV90 Infantry Fighting Vehicle, explained, “is focused on a way of helping soldiers on the battlefield understand their environment, to quickly identify hazards and react to rapidly evolving scenarios.” It not only provides soldiers with a 360-degree, real-time view outside their combat vehicles but stitches together a complete picture of the battlefield. It further allows plugging into a computer to digitally collate, map, and classify various features on the battlefield, allowing crews to track their environment. In addition, soldiers can share what they are seeing with other crew members or their commanders.

Future Integration

Developments in the past focused on improving the soldier’s ability to see better. Technology advances have pushed this to include aiding vision at night and in low visibility, and detecting in wavelengths beyond those of the human eye. Generally, each of these was a stand-alone capability, such as I2 or thermal, each with its own advantages and weaknesses.

The current direction is towards integrating these into a common fused display. Beyond that is the possibility of capitalising on digitalisation, processing power, and memory storage to further augment the soldier’s vision. Advances in these fields have the potential not only to detect but to classify and identify what is being viewed. In addition, seamlessly sharing that image with others in near-real time will expand broader awareness at multiple levels. Battlefield ‘vision’ will no longer be an individual task but rather a community effort.
