
Playing games with the development of

machine vision algorithms using ViPER


Presented by:

Benjamin Wheeler
NAVAL SURFACE WARFARE CENTER DAHLGREN DIVISION
UNMANNED AND ROBOTIC SYSTEMS INTEGRATION BRANCH (G82)
540.653.6030 (office phone)
540.642.2973 (mobile phone)
benjamin.wheeler@navy.mil

OUTLINE: What are we talking about?

Introduction to Concept, and Motivation


Dev. of machine vision for UxS is often dependent on data from real systems.
This work investigates the use of virtual environments to create synthetic data
to feed perception system algorithms in robotic systems.

Background and Virtual Prototyping


Utilization of open-source, freeware, and government-owned simulation tools.
Demonstrates integration with existing machine vision code used in the
SUMET UGV program (Office of Naval Research Code 30).

Experiment Design and Setup


Built a real and virtual experiment setup to generate stereo disparity images.
Compared real and synthetic imagery.

Results and Application


Created stereo disparity images with an average disparity difference of 3.5
pixels between real and synthetic data for 656x492 pixel images.
Use of simulation can be effective when the world is sufficiently modeled.

Utilizing M&S for computer vision

INTRO to CONCEPT: Virtual imagery for dev. of machine vision for UxS
Current development of machine vision for UxS is highly
dependent on access to the platform and real data.
The size and complexity of UxS are ever increasing.
Test and development of algorithms for these systems is a technical and
resource challenge due to the multi-disciplinary nature and complexity of
current state-of-the-art robotic systems.

Progress is often bottlenecked by resource limitations,


making synthetic data an attractive alternative.
A simulated system would reduce resource requirements, speed
development time, and lower overall development cost.
However, it is difficult to quantify the true benefit of this approach, and the
ease with which high-fidelity simulated data may be created.

To test feasibility, using a real UxS perception system, we


attempted to create identical real and virtual data sets

Limited resources bottleneck development

INTRO to CONCEPT: Virtual Prototyping Environment for Robotics (ViPER)

VIRTUAL PROTOTYPING ENVIRONMENT for ROBOTICS


The ViPER simulation was created under ONR's SAF-T program
to establish a new paradigm in fire control for RWS.
It was designed to integrate with Software-in-the-Loop
capability, and implements a preliminary design of the
autonomous target detection, integrated with a Linux-based
embedded system implementing semi-autonomous fire control.
In Sept. 2013, the initial baseline of ViPER for SAF-T was used
to conduct a user experiment comparing human performance
between tele-operated and semi-autonomous fire-control modalities.

ViPER enables software-in-the-loop dev.

INTRO to CONCEPT: Virtual imagery for dev. of machine vision for UxS
This work demonstrates the ability to create near identical
disparity maps using an existing UGV perception system.
Average of 3.5 pixel difference per disparity calculation

Real Imagery

Synthetic Imagery

Investigate use of fake imagery for real algo.

VIRTUAL PROTOTYPING: Why use a video game?


Serious Gaming is a focus area for training and system dev., often
leveraging the commercial gaming industry
Examples include: VBS II, DSTS, OneSAF, Army Gaming Studio,
Delta3D
Dev. examples include: RIVET, MODSIM (FCS), EODRS

Many open-source, freeware, and low-cost game engines have been


developed for aspiring game creators
For warfighter-centric systems, today's games share many of the
features necessary for rapid virtual prototyping and virtual
demonstration of user-intensive systems.
Real-enough physics make interaction with the world believable
High fidelity graphics create an immersive environment for free
Easy to manipulate game environments enable creation of tailor-made
experiments, and the ability to rapidly test different system designs.

Allows creation of system-in-a-box

VIRTUAL PROTOTYPING: Why use a video game?


Utilize M&S to perform RWS vs.
supervised autonomous RWS (saRWS)
experiments to generate requirements and
inform system design
In the simulated world, information is known a priori, allowing precise
modeling of varying levels of object detection, and creation of
training and test data.
Algorithms transition into ROS from
simulation, allowing concurrent
development of the real system

Record MoEs to generate KPPs

BACKGROUND: UGV Perception Systems


SUMET and ROS

Small Unit Mobility Enhancement Technology (SUMET)


developed an advanced perception system (APS) for
unmanned ground vehicles based on stereo perception algorithms.
The SUMET system is implemented in the Robot Operating System
(ROS). ROS is a meta-operating system enabling effective
integration and re-use of software elements common to robotic
systems.

Autonomous off-road navigation
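The stereo perception algorithms above rest on the standard pinhole-stereo relationship: disparity (in pixels) = focal length (in pixels) x baseline / depth. A minimal sketch of that relationship; the 480 px focal length is an illustrative assumption, while the 0.115 m baseline matches the experiment described later in this deck:

```python
# Standard stereo relationship: closer objects produce larger disparity.
# The 480 px focal length is illustrative, not SUMET's calibration.

def disparity_px(focal_length_px, baseline_m, depth_m):
    """Expected disparity (pixels) for a point at the given depth."""
    return focal_length_px * baseline_m / depth_m

print(round(disparity_px(480, 0.115, 5.0), 2))   # 11.04 px at 5 m
print(round(disparity_px(480, 0.115, 20.0), 2))  # 2.76 px at 20 m
```

Inverting the same formula gives depth from a measured disparity, which is what the perception pipeline ultimately needs.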

BACKGROUND : Advanced Target Detection and Tracking


Purpose of SAF-T
Weaponization of unmanned systems,
and increased effectiveness of RWS
Autonomously identify, track, and
compute firing solutions for targets

Simulation-based approach used to define


parameters of desired future system

Auto ID, track & fire solution, but the operator decides to engage

Leveraging commercial video game


software and existing tools
Focus on Warfighter-System interaction

Improved use of RWS requires a new C2 paradigm


Systems suffer from limited SA and latency.
Technologies such as shot detection provide
recognized value-added to the warfighter.

Shoot more stuff, faster, using a computer

BACKGROUND : Tools and Related Works


Unity3D
Unity3D is a video game engine developed by Unity Technologies,
providing a convenient content-oriented editor.
Freeware and low-cost licensing options are available.
Unity3D is being used by the Supervised Autonomous Fires Technology (SAF-T)
program to virtually prototype an unmanned weapon system, and through this
effort has integrated ROS with Unity3D.

Related Works
Virtual Environments for Cognitive Architecture Development (VECAD - ONR)
Robotics Interactive Visualization and Experimentation Toolkit (RIVET - GDRS)

Leverage open source and freeware

EXPERIMENT DESIGN & SETUP


Created test-board in the lab, and virtual world within
Unity3D using the Unity Editor.
Created textures for the virtual world with still photographs of the real scene.

Created real and virtual world

EXPERIMENT DESIGN & SETUP


Created test-board in the lab, and virtual world within
Unity3D using the Unity Editor.
Cameras were implemented with a 115 mm baseline, a resolution of 656x492
pixels, and a 3.5 mm lens (68.82 degree FOV) matching the real system.
Stereo calibration parameters were shared between the real and virtual systems.
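The quoted FOV follows from the pinhole camera model. In this sketch the 3.5 mm focal length and ~68.8 degree FOV come from the slide, while the 4.8 mm sensor width is an assumed value chosen to be consistent with those two numbers:

```python
import math

def horizontal_fov_deg(focal_mm, sensor_width_mm):
    """Pinhole model: FOV = 2 * atan(sensor_width / (2 * focal_length))."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm)))

# 3.5 mm lens; 4.8 mm sensor width is an assumption, not a measured value.
print(round(horizontal_fov_deg(3.5, 4.8), 2))  # ~68.88, close to the quoted 68.82
```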

Created real and virtual world


EXPERIMENT DESIGN & SETUP


Real and Simulated Systems run the same ROS code to
calculate stereo disparity from imagery
The real system uses imagery generated by the stereo camera system.
The simulated system uses imagery generated by the Unity3D simulation.
Unity3D communicates with the ROS system via a custom ROS driver node
over the RTP protocol, using a UDP client/server relationship.
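A minimal sketch of the Unity3D-to-ROS image link described above: a UDP "server" receives one frame the way a custom driver node might before republishing it as a ROS image topic. The 4-byte width/height header is a made-up layout for illustration, not the program's actual RTP framing:

```python
import socket
import struct
import threading

HOST = "127.0.0.1"

# Driver-node side: bind a UDP socket on a free port and wait for a frame.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind((HOST, 0))        # let the OS pick a free port
server.settimeout(5)
port = server.getsockname()[1]

result = {}

def receive_one():
    data, _ = server.recvfrom(65535)
    width, height = struct.unpack("!HH", data[:4])  # hypothetical header
    result["shape"] = (width, height)
    result["pixels"] = data[4:]

t = threading.Thread(target=receive_one)
t.start()

# "Unity" side: send one fake 4x2 grayscale frame over UDP.
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.sendto(struct.pack("!HH", 4, 2) + bytes(range(8)), (HOST, port))
t.join()
client.close()
server.close()
print(result["shape"])  # (4, 2)
```

In the real system the receiving node would convert each payload into the same ROS image message type the hardware driver publishes, which is what lets the downstream code run unchanged.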

Real or virtual cameras produce the same ROS topics

Real and Simulated Systems run the same ROS code to
calculate stereo disparity from imagery.

[Figure: real-system and synthetic-system disparity maps with their difference image]

Disparity difference, real vs. synthetic:
Average diff. over 100 images: 3.5 per pixel
Std. dev. of diff. over 100 images: 0.006 per pixel

Temporal diff. over 100 images: 1.2 per pixel (real) vs. ~0 per pixel (synthetic),
reflecting uncharacterized noise in the real-world data.

Image resolution = 656x492 pixels: 322,752 total pixels per image
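The 3.5-per-pixel figure above is a mean absolute per-pixel difference between the real and synthetic disparity maps. A toy sketch of that metric, using illustrative values rather than the experiment's imagery:

```python
# Mean absolute per-pixel difference between two disparity maps,
# here flattened to lists of per-pixel disparities. Toy data only.

def mean_disparity_diff(real, synthetic):
    assert len(real) == len(synthetic)
    return sum(abs(r - s) for r, s in zip(real, synthetic)) / len(real)

real_map      = [10.0, 12.5, 11.0, 9.5]
synthetic_map = [11.0, 12.0, 10.5, 13.5]
print(mean_disparity_diff(real_map, synthetic_map))  # 1.5
```

Averaging the same metric over 100 image pairs of 322,752 pixels each gives the per-pixel figures reported on this slide.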

RESULTS & CONCLUSIONS

APPLICATION: ViPER usage in the SAF-T program


See notes for slide explanation

ViPER acts as a synthetic sensor

APPLICATION: ViPER usage in the SAF-T Program


Warfighter Workshops
Used the simulator to motivate
discussions with Marine RWS operators.
Marines played the simulator in TO
and SA modes, commented
on the concept and design,
informing algorithm requirements

User-based Experimentation
IRB protocol for participants to play scenario in TO and SA mode.
Recording overall measures of performance at different error levels
to determine necessary false positive and false negative rates of the
detection and tracking pipeline to maintain system capability

Inform system level design requirements

APPLICATION: ViPER usage in the SAF-T Program

Generate synthetic data for algorithm dev.

QUESTIONS: What else would you like to know?

Playing games with the development of


machine vision algorithms using ViPER
Presented by:

Benjamin Wheeler
540.653.6030 (office phone)
540.642.2973 (mobile phone)
benjamin.wheeler@navy.mil
