Imagine this land-war scenario: One enemy fighter is several hundred yards away, another attacks from a mile out, and a third fires from a nearby room in close-quarters urban combat. U.S. Army soldiers detect, integrate, and quickly map the locations of all of these targets at once in 3D, while tracking the range of each enemy force.
How could something like this be possible, one might wonder, given the complications of perspective, range and navigation, and the limitations of the human eye?
These complexities form the conceptual basis upon which the Army is fast-tracking its Integrated Visual Augmentation System, or IVAS, a soldier-worn combat goggle engineered with advanced sensors that can overcome some of the limitations of human vision and quickly organize target data.
“We take all soldiers who have IVAS and turn them into a sensor collecting data to share with a greater network. The screen can chart a path and tell you where a reported adversary is. You can see through heat and augment existing light,” Gen. Joseph Martin, vice chief of staff of the Army, told an audience during an event at the Foundation for the Defense of Democracies. “If you have been dismounted, you know it can be lonely. You want to have a link to your fellow soldiers. This is what IVAS is delivering to our formation.”
Martin explained that the IVAS system is being improved and upgraded with new software by virtue of a “soldier touchpoint” collaborative process in which soldiers exercise with the goggle and offer feedback to developers.
Dr. Bruce Jette, assistant secretary of the Army, acquisition, logistics and technology, told Warrior Maven in an interview earlier this year that engineers created IVAS with an ability to compensate for what might otherwise be some of the limitations of the human eye.
Operating IVAS involves a degree of what could be described as “human-machine interface,” because it integrates some of the neurological processes of human vision with software engineered to process, organize and display otherwise challenging factors such as depth perception, surrounding peripheral objects and other elements of human visual orientation.
“We don’t perceive distance with one eye, we just see larger or smaller, but if I can put it in both eyes I can get the object in 3D. To do that I need to have the sensing system to know where the eye is looking and focusing,” Jette said.
The IVAS does that. The system determines what a soldier is looking at, identifies the type of object, and focuses on it to generate a 3D image in front of the wearer.

“The good part about this is I don’t need all those heavy optics on my face,” Jette said.
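Jette’s point about needing both eyes to perceive distance reflects the general principle of binocular disparity: two offset views of the same object let a system triangulate range, where a single view only registers apparent size. The sketch below is purely illustrative (not IVAS code); the focal length, sensor baseline, and disparity values are hypothetical numbers chosen for the example.

```python
# Illustrative only: depth from binocular disparity in a rectified stereo
# pair, using the standard pinhole-camera relation Z = f * B / d.

def depth_from_disparity(focal_length_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Return triangulated depth in meters for a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_length_px * baseline_m / disparity_px

# One sensor ("one eye") only sees objects grow larger or smaller; two
# offset sensors ("both eyes") yield a disparity, and with it a range.
depth = depth_from_disparity(focal_length_px=1000.0,  # hypothetical
                             baseline_m=0.065,        # ~human eye spacing
                             disparity_px=13.0)
print(depth)  # 1000 * 0.065 / 13 = 5.0 meters
```

The same geometry explains why the system must track where the eye is looking: disparity is only meaningful for the point both views are fixated on.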
As designed, the IVAS system is built to lessen the hardware footprint, reduce weight and, perhaps of greatest combat relevance, streamline time-sensitive combat data.
“The sensor is seeing where my eyes are looking and preserving it based upon certain measurements. Then if I fly a UAV up there, IVAS can show the UAV coming into the scene - and converge the two onto each other so I can put the UAV right where you want it,” Jette said.
Part of the soldier feedback process, interestingly, involved requests to build even more data, icons, detail and combat information into the display. Developers had deliberately limited the amount of information shown on IVAS to avoid overloading soldiers; the soldiers, however, liked the system and asked for an even more integrated display.
“Soldiers asked if they could see more things on there. The 20-year-olds have done this their entire lives and they said we can use more information,” James E. McPherson, undersecretary of the Army, said at the event.