TRANSFORMING BIG DATA INTO SMART DATA

JANUS develops and delivers customized data analysis paired with visualization tools that transform big data into smart data, communicating information clearly through graphical representations in both static and dynamic outputs. Our Subject Matter Experts (SMEs) create tools that allow our team to distill large amounts of data into meaningful insights, presented as easy-to-read graphics accessible on personal dashboards, portals, and mobile computing devices on a need-to-know basis. With capabilities for real-time business intelligence, decision support, what-if scenarios, and advanced analysis, JANUS always keeps data security at the forefront.

JANUS Research Group's staff of talented scientists and engineers works on a wide variety of data-centric efforts designed to promote actionable decision making. We specialize in data mining and analysis, providing our customers with products and services that drive data-driven decisions and tangible outcomes.

In the following use cases, we illustrate how JANUS collects and analyzes data to give our customers the ability to rapidly assess Soldier marksmanship performance and estimate proficiency. We then show how JANUS uses common data analysis tools to estimate Soldier poses and stances so that behavior in a live training event can be assessed automatically.

Use Case 1 - Marksmanship Sensor Data Analysis for Shooter Performance Assessment

Tools: Anaconda, Orange, Python, Neural Networks

Marksmanship is a critical competency for all combat arms Soldiers, and every Soldier must demonstrate marksmanship proficiency annually. Effective marksmanship training, which includes accurate diagnosis and instructor feedback, is essential for maintaining this proficiency. However, the quality of feedback is constrained by the limits of human perception: instructors cannot track the small movements of the rifle that determine whether a shot hits the target. Such movements can, however, be detected by sensors mounted on the weapon.

JANUS Research Group provides advanced data analysis capabilities for our customer's sensor system, a device mounted on the Picatinny rail of a weapon that compiles data about muzzle movement and shot location in real time. The system also includes analysis and data visualization capabilities designed to facilitate decision making and shooter feedback. Data from the system provides information about shooter mechanics and motor control, including aim, stability, and precision. This information can help marksmanship instructors diagnose issues, provide feedback, and recommend corrective action.
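To illustrate the kind of mechanics metrics such a system can derive, the sketch below computes simple aim, dispersion, and stability measures from a muzzle trace. The data format (a list of (x, y) aim-point offsets sampled during the shot window) and the metric definitions are illustrative assumptions, not the customer system's actual output.

```python
import math

def aim_metrics(trace):
    """Compute simple aim metrics from a muzzle trace.

    trace: list of (x, y) aim-point offsets from target center
    (e.g. in milliradians), sampled at a fixed rate during the
    shot window. Returns (mean_offset, dispersion, stability).
    """
    n = len(trace)
    # Mean point of aim: average offset from target center.
    mx = sum(x for x, _ in trace) / n
    my = sum(y for _, y in trace) / n
    mean_offset = math.hypot(mx, my)
    # Dispersion (precision): RMS distance of samples from the mean point of aim.
    dispersion = math.sqrt(
        sum((x - mx) ** 2 + (y - my) ** 2 for x, y in trace) / n
    )
    # Stability: mean sample-to-sample muzzle movement.
    stability = sum(
        math.hypot(x2 - x1, y2 - y1)
        for (x1, y1), (x2, y2) in zip(trace, trace[1:])
    ) / (n - 1)
    return mean_offset, dispersion, stability
```

A perfectly steady hold produces zero dispersion and zero stability values, while a shooter who wanders before settling shows high stability (movement) early in the window.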

Our customer and its partners have performed pilot testing with the sensor system at various Army training sites and collected a significant amount of data from shooters with a wide range of skill and experience. This data collection and preparation sets the effort up for more ambitious applications: future collection activities will grow the data lake and increase the reliability of shooter performance forecasting, intelligent tutoring opportunities, and assistive technologies for instructors and senior leaders seeking to gauge a unit's readiness.

Figure 1. Data Mining Pipeline from SQL Data Lake to Prediction Modeling

Figure 1 shows a screenshot of a graphical data mining tool used in this effort. Orange, a data mining suite of open-source Python libraries and visualization tools managed through Anaconda Navigator, allows JANUS scientists and engineers to move raw sensor data quickly from a shooter database through outlier culling and analysis to projection and prediction. Most of the data manipulation is handled by the Orange tools, allowing JANUS to concentrate on writing a small amount of Python code to perform the analysis. Figure 2 shows the aim trace of the sensor sampling points plotted in a five-second shot window. The plot shows how the shooter steadily zeroed in on the center point, with a concentration of points where the shooter dwelled immediately before taking the shot.

Figure 2. Soldier Aim Trace Data Points
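The outlier-culling stage of a pipeline like the one above can be sketched in plain Python (standing in for the Orange toolchain) as a simple z-score filter; the threshold and one-dimensional data shape are illustrative assumptions.

```python
import statistics

def cull_outliers(samples, z_max=3.0):
    """Drop sensor samples that lie more than z_max standard
    deviations from the mean -- a minimal stand-in for the
    outlier-culling stage of the data mining pipeline."""
    mean = statistics.fmean(samples)
    sd = statistics.pstdev(samples)
    if sd == 0:
        # All samples identical: nothing to cull.
        return list(samples)
    return [s for s in samples if abs(s - mean) / sd <= z_max]
```

In the production pipeline this step runs on multi-channel sensor records rather than scalars, and Orange's own preprocessing widgets handle it interactively; the sketch only shows the underlying idea.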

This effort differs from similar development activities in that the entire system has been designed from the ground up around an Enterprise Data Management Strategy, from a pipeline and cloud workflow that produces shooter data lakes all the way through analysis, decision making, display, and actionable outcomes. This kind of data analysis lends itself to detecting a shooter's skill level quickly and efficiently, allowing more skilled shooters to move along to qualification faster and freeing up resources for the Warfighters who need more attention from marksmanship trainers.

Use Case 2 - Soldier Pose Data Analysis

Tools: Jupyter Notebooks, Intel OpenVINO, Python

The US Army's new Synthetic Training Environment will consist of an ecosystem of new simulations and Training Aids, Devices, Simulators, and Simulations (TADSS), including augmented reality (AR) hardware. Most AR work to date has focused on drills for the Warfighter in a live training context, but JANUS is taking a different look at how AR can be used as an assistive technology for trainers.

JANUS would like to extend AR technology to perform additional automated Warfighter performance assessments by first identifying Soldiers in the incoming video stream, overlaying a pose skeleton on each Soldier, and then updating the skeleton in real time. Figure 3 shows how JANUS engineers have constructed a machine vision prototype that estimates the poses of Soldiers in video and photographs. The prototype was built quickly using Jupyter notebooks, Intel's OpenVINO machine vision libraries, and Python.

Figure 3. Machine Vision Estimates a Soldier's Pose, Direction of View

The system can track multiple Soldiers simultaneously and estimate each Soldier's pose and direction of view in real time (Figure 4). A situational training exercise (STX) course coordinator or trainer wearing AR gear such as the Microsoft HoloLens 2 can see overlays of the pose estimator skeleton while a training exercise is ongoing. The next step for this prototype is to add further analysis to the Python code so that Soldier performance estimation can take place automatically to assist training managers. For example, the system can detect when a weapon is in the 'up' position and when the weapon is pointed in the direction of another Soldier. Alerts can be built in to notify the user, and time-stamped flags for performance-related events can be sent to a Learning Management System (LMS) for further review. Using machine vision and real-time data analysis, JANUS is leaning forward with practical applications of artificial intelligence in automated human performance assessment, ensuring that modernization in training technology is achieved with measurable outcomes.

Figure 4. Multiple Soldiers Identified and Simultaneous Pose Estimations
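The weapon-pointing check described above can be sketched as a simple geometric test on pose-derived positions. The `weapon_pointed_at` helper, the ground-plane coordinate format, and the angular threshold are all hypothetical assumptions for illustration, not the prototype's actual implementation.

```python
import math

def weapon_pointed_at(muzzle, direction, soldier, max_angle_deg=10.0):
    """Flag whether an estimated weapon direction points at another Soldier.

    muzzle:    (x, y) weapon position in ground-plane coordinates
    direction: (dx, dy) estimated muzzle direction (any length)
    soldier:   (x, y) position of the other Soldier
    Returns True if the angle between the muzzle direction and the
    line to the Soldier is within max_angle_deg degrees.
    """
    vx, vy = soldier[0] - muzzle[0], soldier[1] - muzzle[1]
    dist = math.hypot(vx, vy)
    dlen = math.hypot(*direction)
    if dist == 0 or dlen == 0:
        # Degenerate geometry: no meaningful direction to compare.
        return False
    # Angle between the muzzle direction and the vector to the Soldier.
    cos_a = (vx * direction[0] + vy * direction[1]) / (dist * dlen)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    return angle <= max_angle_deg
```

Run per video frame against every tracked Soldier, a check like this could raise the alert and emit the time-stamped LMS flag described above.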

Author

Dr. Douglas Maxwell, Chief Engineer
