International Conference
for Vision Guided Robotics
Proceedings
We hope you will find great ideas from the presenters as well as your fellow attendees.
We also encourage you to meet with the vendors in the tabletop exhibit area, who can
offer products that meet your specific needs.
In addition to this conference, RIA and AIA offer a host of valuable resources that can
help you when you return to your company. We recommend visiting our websites
(www.robotics.org and www.machinevisiononline.org) to find free technical papers, case
studies, and information on upcoming events such as The Vision Show (end of March
2009 in Phoenix, Arizona) and the International Robots, Vision & Motion Control Show
(June 2009 in Chicago, Illinois).
Your feedback is very important to us, so please take the time to complete your
evaluation form and submit it to us onsite (or send it in to our office after the conference).
If you prefer, you can always talk to our staff in person, either here this week or by
calling 734/994-6088 to share your ideas.
Sincerely,
DISCLAIMER
The papers presented in this Proceedings book are the personal
expressions and positions of the respective authors and presenters. These
views are not those of the Association, nor are they necessarily endorsed
by the Association or its members. This conference was presented by the
Association to allow robot and vision topics to be openly discussed and
diverse views disseminated. Comments about the contents will be
forwarded to the authors by the Association.
Conference Speakers
International Robots, Vision & Motion Control Show June 9 – 11, 2009
Donald E. Stephens Convention Center Rosemont (Chicago), Illinois
USA
11:00 am to 11:30 am Vision Guided Robot Applications for Packaging & Flexible Feeding
Mark Noschang, Applications Engineer, Adept Technology
3:00 pm to 3:30 pm Case Study: Robots & Vision in the Automated Pharmacy
David Arceneaux, Business Development & Marketing, Stäubli Robotics
9:45 am to 10:15 am Vision Guided Part Loading/Unloading from Racks for Automotive
Applications – Lessons Learned
Robert Anderson, New Technology Manager, Chrysler LLC
Presented by:
Bob Rochelle
Kawasaki Robotics USA
International Conference for Vision Guided Robotics
September 30 – October 2, 2008
Bob Rochelle
North American Sales Manager
Kawasaki Robotics USA
Bob Rochelle
Kawasaki Robotics USA
28140 Lakeview Drive
Wixom, Michigan 48393
Phone: 248-446-4211
Fax: 248-446-4200
Email: bob.rochelle@kri-us.com
Bob Rochelle has a Bachelor’s and Master’s degree in Engineering from Virginia Tech
and holds numerous US and International patents in the automation and food packaging
fields. He has been in the Automation Industry for over 25 years and has held positions
as Design Engineer, Project Manager, R & D Engineer, Engineering Manager, Sales
Engineer and Sales Manager.
Bob is a veteran seminar speaker and has taught General Engineering, Project
Management and Robotics for Baker College in Southeast Michigan. He is also the
Chair for the RIA’s New Markets Committee.
The Basics of Robotics
Bob Rochelle
North American Sales Manager
Kawasaki Robotics
References
References:
Robotic Industries Association www.robotics.org
Kawasaki Robotics (USA) Inc. www.kawasakirobotics.com
Denso Robotics www.densorobotics.com
Advance Products Corp www.advanceproductscorp.com
Practical Robotics Services www.prsrobots.com
TDI Covers www.tdicovers.com
Adept Technology www.adepttechnology.com
PAR Systems, Inc www.par.com
Conveying Industries Inc www.conveyind.com
ANSI / RIA Standard R15.06 - 1999
Handbook of Industrial Robotics Edited by Shimon Y. Nof
The Top 10 Application Mistakes Article by George Martin
Outline
Flexible Automation
The Robot Industry
Yesterday, Today, Tomorrow
Tooling
Robots in Systems
Robot Based Systems
Vision
Final Thoughts
Robotics = Flexible Automation
Manual
Quick product change
Breaks
Monotonous tasks
Health claims
Dedicated Automation
High volume
Requires set-up time
More maintenance
Air cylinders / actuators
Rigid conveyors / fixtures
Repeatability by fixtures
Flexible Automation
Quick product change
Programmable systems
Higher initial cost
Changeable cell configuration
Responds to part changes
The Robot Industry
History
Today
Tomorrow
General Terms
Types of Industrial Robots
First “Robots”
Steam Man and Electric Man
Robota
Czech word for “forced labor” or “serf”
Karel Capek - Rossum’s Universal Robots
Written in 1920, Premiered in Prague in
1921
Translated into English and performed in
New York in 1923
Isaac Asimov
Coined the word Robotics
1950’s wrote the Robot Series - part of the
Foundation Series
Drafted the Three Laws of Robotics.
Today’s Industrial Robots
People
George Devol
Joseph Engelberger – “Father of Robotics”
History
1956
George Devol & Joseph Engelberger met
Began development work on the first commercial robot
First Working Model late 1956
1961 - First Installation
GM - Die Cast Part Extractor
Patented in 1961
Formed Unimation
Early Industrial Robots
Unimation
Universal Automation
Unimate Robot
4,000 lb Arm
Step by Step Commands
stored on a magnetic drum
Hydraulic Actuators
$100,000 Plus Price
Puma
Programmable
Universal
Machine for
Assembly
Robot Industry - Today
Coordinates
Base or World - Origin is in the robot base
Tool Coordinates - Origin is the Tool Center Point
Multiple Axis System
Axis 7 - Turntable
Axis 1 to 6 - Robot
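The base/tool coordinate relationship above can be sketched numerically. A minimal 2D illustration (the helper name `tool_to_base` is hypothetical; real controllers use full 6-axis transforms):

```python
import math

def tool_to_base(tcp_pose, point_tool):
    """Transform a point from tool coordinates to base (world) coordinates.

    tcp_pose: (x, y, theta) of the Tool Center Point in the base frame
    point_tool: (px, py) of a point expressed in the tool frame
    """
    x, y, theta = tcp_pose
    px, py = point_tool
    # Rotate by the tool orientation, then translate by the TCP position.
    bx = x + px * math.cos(theta) - py * math.sin(theta)
    by = y + px * math.sin(theta) + py * math.cos(theta)
    return bx, by
```

With the TCP at (100, 50) and rotated 90 degrees, a point 10 units along the tool X axis lands at roughly (100, 60) in base coordinates.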
Cartesian / Gantry
SCARA
Telescopic
Parallel
Articulated
Modular
Cartesian / Gantry Robots
Arm or Manipulator
Controller
Arm or Manipulator
Arms
Base
Counter Balance
Joint 2 Motor
Joint 3 Motor - in rear
Wrist Joint 4, 5 & 6 Motors
Mounting and Environment
Mounting
Floor, Ceiling or Walls
Proper Fasteners - no Casters
Tracks or Traverse Units
Two Components
Controller
Teach Pendant
Design
Microprocessor based
Programmable
Generally One Controller per Robot
Multi Controllers available
Teach Pendant
Design
Hand Held
Programmer's Interface to Robot Controller and Programs
LCD Display
Hard keys for Functions / Keyboard
Functions
Communicates with Controller
Dead man Switches
E - Stop
Monitor
Teaching / Programming
User Interface to robot
Operator’s System Interface Possibility
Communication and Networks
Discrete I/O
Photocoupler, relays, transistors
Relay modules add on
Remote I/O to PLC’s
DeviceNet
Master, Slave, Master & Slave
Profibus
Master, Slave, Master & Slave
Interbus
Ethernet
TCP / IP, I/O adaptor
RS232 / RS485
Internet
Intranet
Programming
Teach Pendant
Programmer holds the teach pendant
Manually teaches the robot
Off Line Programming
Program written remotely
Higher level language
Touch up required
Check Programs
Slow speed operation
Program Storage
Flash RAM
PC Hard Drive
Other media
PC Programming
Basic Robot Motion Teaching
Motion Instruction
Defines a target position
Interpolation Instruction
Defines how to get to the position
Joint Move - Robot articulates any axis to accomplish the move
Linear Move - Maintains the tool in the orientation specified
Circular Move - Generated by defining three points through which the tool scribes a circular arc
Speed
Expressed in percent of full speed or as a software-settable maximum speed.
Termination Instruction
Defines the approach to the target position
Expressed as a number [1 - 9], most to least accurate.
Additional Programming Activities
Activities to be completed before moving to the next target position
I / O switching
Data acquisition
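Put together, a taught program is a list of target positions with interpolation type, speed, termination accuracy, and I/O steps between moves. A sketch using a made-up mini-API (the `Program` class and its method names are illustrative, not any vendor's robot language):

```python
class Program:
    """Toy container for a taught robot program (illustrative only)."""
    def __init__(self):
        self.steps = []

    def joint_move(self, target, speed=100, accuracy=9):
        # Joint interpolation: the robot articulates any axis to reach the target.
        self.steps.append(("JMOVE", target, speed, accuracy))

    def linear_move(self, target, speed=100, accuracy=1):
        # Linear interpolation: tool orientation is maintained along the path.
        self.steps.append(("LMOVE", target, speed, accuracy))

    def set_output(self, channel, state):
        # Discrete I/O switching between motion steps (e.g. close a gripper).
        self.steps.append(("SIGNAL", channel, state))

pick = Program()
pick.joint_move("above_pick", speed=80)               # speed in % of maximum
pick.linear_move("pick_point", speed=20, accuracy=1)  # 1 = most accurate termination
pick.set_output(1, True)                              # close gripper
pick.linear_move("above_pick", speed=50)
```

The approach move uses joint interpolation for speed; the final approach is a slow linear move with the tightest termination setting.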
Repeatability
Repeatability
Ability of the robot to return to a preprogrammed position.
Closeness of agreement of repeated position movements under
the same conditions to the same location.
The robot can position anywhere within a 0.008” diameter circle around the taught point and still fall within its repeatability specification.
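Repeatability can be estimated from logged positions as the radius of the smallest circle, centered on the mean, that contains every return to the taught point. A sketch (the sample coordinates are invented):

```python
import math

def repeatability(positions):
    """Radius of the smallest circle, centered on the mean position,
    containing every repeated move -- a simple repeatability estimate."""
    n = len(positions)
    cx = sum(p[0] for p in positions) / n
    cy = sum(p[1] for p in positions) / n
    return max(math.hypot(px - cx, py - cy) for px, py in positions)

# Five returns to the same taught point (inches):
samples = [(10.003, 5.001), (9.998, 4.999), (10.001, 5.002),
           (10.000, 4.998), (9.999, 5.000)]
r = repeatability(samples)
# A robot specified at +/-0.004" passes if r stays at or below 0.004
```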
Robots In Systems
Who’s Who in Robot System Industry
Tooling
Control Systems
Systems
Vision
Safety
Who’s Who in the Robotics Industry
Robot Manufacturers
Manufactures the robot
Provides robot training, maintenance and service
System Integrator [System Builder]
Integrates the robot into a system to perform a specified task
Independent business, industry specific, some allegiance to robot manufacturer
Has knowledge of End User’s business
Designs and builds the robot based system
Purchases robot and all peripheral equipment
Designs and builds systems, writes and maintains programs
Trained on entire cell / provides training on system
Provides system components, installation, training, service and support
End Users
Uses the robotic-based system in production or processing
Knows what is required to accomplish tasks
Ultimate user - needs training, service, maintenance, spare parts
Tooling / End Effectors / E.O.A.T
The tool attached to the robot manipulator or arm that actually performs the work.
Examples
Vacuum Cups
Grippers
Spatulas / Fingers
Spray Nozzles
Dispensers
Buffing Wheels
Machine Tools
Water Jets
Welding Torches / Resistance Welding Guns
Saws
Laser Cutters
Ladles
Adds to the Work Envelope
Adds to the Payload / Torque / Inertia
Tooling Considerations
Parts Fixtures
Repeatable and Positive
Sensors
Part locators / verification of action / QC
Tool Changers
Quick change / machine set-up
Environmental Considerations
No Parts Fixture?
Vision can locate the part
Philosophy 1
Robot Controller does all
System I/O, Tooling Control, Motion Control, Operator Interface
Philosophy 2
Robot Controller
Tooling Control, Motion Control
PLC or PC
System I/O, Operator Interface
Philosophy 3
Robot Controller
Motion Control only
PLC or PC
System I/O, Tooling Control, Operator Interface
Robot System Safety
Responsibility
Robot Manufacturer
Integrator / System Builder / Installer
User
Refer to Resources
ANSI / RIA R15.06-1999
OSHA Standards
cUL / UL [Underwriters Laboratories]
Hazardous materials requirements
Local Codes
Good manufacturing practices
Plant Standards
Robotic Industries Association
Ann Arbor, Michigan 48106
(734) 994-6088
www.robotics.org
Safety Components
Fence, Gates, Interlocks, Light Curtains, Barriers, Awareness Beacons
Selecting a Systems Integrator
Determine if the Integrator has experience in your industry
Transferable knowledge
Evaluate the Integrator’s background and capabilities
Full Service
Commercial Issues
Check references
The Integrator’s references
The Robot Manufacturer’s references
Prepare for disaster
What happens?
After sale maintenance
Integrator / Robot manufacturer
Cost
Is the lowest bid the best?
Vision Systems
Peripheral Equipment
Camera
Camera Controller
Light Source
Calibration Check Means
Robot Components
Robot and Controller
Interface to Camera Controller
Software
Applications
Part Location
Inspection
Bin Picking
Real Time Feedback
Bin Picking
Locating or Orientating Parts
Cameras
Camera
Parts
Rack
Robot Guidance
Real Time
Welding
Seam Sealing
Dispensing
10 Reasons to Invest in Robotics
10+ Mistakes in Robot Integration
Underestimating Payload and Inertia.
Expecting the robot to do too much.
Underestimating Cable Management Issues.
Not considering all current and future application needs.
Misunderstanding accuracy and repeatability.
Focusing on the robot alone.
Not planning for disaster.
Overlooking the need for options or peripheral equipment for a system.
Not fully utilizing the capabilities of a robot.
Choosing a robot or system solution solely on price.
Thinking that robots are too complicated.
Failure to consider using robotic technology.
Bob Rochelle
North American Sales Manager
Kawasaki Robotics
28140 Lakeview Drive
Wixom, Michigan 48393
USA
Telephone: 248-446-4211
Email: bob.rochelle@kri-us.com
Web: www.kawasakirobotics.com
The Basics of Machine Vision
Presented by:
David Dechow
Aptúra Machine Vision Solutions
International Conference for Vision Guided Robotics
September 30 – October 2, 2008
David Dechow
President
Aptúra Machine Vision Solutions
David Dechow
Aptúra Machine Vision Solutions
3130 Sovereign Drive, Suite 5A
Lansing, Michigan 48911
Phone: 517-272-7820, x11
Email: ddechow@aptura.com
David Dechow is president and founder of Aptúra Machine Vision Solutions, LLC. Mr.
Dechow has worked in the field of machine vision for over 25 years as a programmer,
engineer, and manager. He served 14 years on the AIA board of directors, and was a
two term president of that board. Mr. Dechow is the 2007 recipient of the AIA
Automated Imaging Achievement Award honoring industry leaders for outstanding
contributions in industrial and/or scientific imaging. Mr. Dechow is a regular speaker at
conferences and seminars, and a frequent contributor to industry trade journals and
magazines and has served on the editorial boards of Vision Systems Design magazine
and Quality Magazine’s Vision and Sensors.
The Basics of Machine Vision
David Dechow
President
Aptúra Machine Vision Solutions
Session Outline
• Machine Vision
– Machine vision is the substitution of the human visual sense and
judgment capabilities with a video camera and computer to
perform an inspection task. It is the automatic acquisition and
analysis of images to obtain desired data for controlling or
evaluating a specific part or activity.
– Key Points:
• Automatic – self-acting
• Acquisition and analysis – machine vision uses both
• Non-contact
• Data acquisition – value of the technology
• Control – necessary for reasonable ROI
Overview
• Inspection sequence
– Initiate Inspection - external event
– Acquire Image - camera and strobe trigger (if applicable); hardware execution
– Analysis - software execution of inspection program
– Determine Results - part status
– Process / Communicate Result - external event
• Supporting capabilities: part tracking, multiple results, other data, recipe changeovers, multiple images/lights
System Architectures
• Computer
• Optional computer for operator interface
• Connected monitor for operator interface
System Architectures
• Key Issues
– Imaging
• Application requirements will dictate image space and camera resolutions
– Lighting
• The purpose of lighting for machine vision is to create the highest level of
contrast between features to be inspected relative to the background or
other features
• Competent lighting technique contributes over 80% to the success of an
application
– Optics
• Most machine vision applications use “off the shelf” optics
• Select proper machine vision quality lenses
Imaging Basics
• Image Acquisition
– Performed by a light-gathering silicon device
• CCD, CID, CMOS …
– The imaging chip comes in a variety of physical layouts
• Area
• Line
– Size of the chip varies widely as does the number of individual
picture elements (pixels)
• Typical area chip for machine vision: from .3 to 4+ Mpix
• Physical sizes from ¼" diag. up
• Typical line scan array: from 1K to 12K+
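Chip resolution only matters relative to the field of view. A quick sketch relating the two (helper names are hypothetical; the three-pixel minimum is a common rule of thumb, echoed in the flaw-detection guidance later in this talk):

```python
def pixel_size(fov_width, pixels_across):
    """Real-world size of one pixel given the field-of-view width
    and the number of pixels across the sensor (same length units)."""
    return fov_width / pixels_across

def min_feature(fov_width, pixels_across, pixels_needed=3):
    """Smallest feature reliably detectable, assuming a feature must
    span at least `pixels_needed` pixels (a common rule of thumb)."""
    return pixels_needed * pixel_size(fov_width, pixels_across)

# 100 mm field of view on a 1000-pixel-wide sensor:
# each pixel covers 0.1 mm, so a 3-pixel feature is about 0.3 mm
```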
Imaging Basics
• Cameras
– Image sensor supported by electronic circuitry and
packaged for industrial use
– Final output may be analog or digital
• RS170, CCIR, NTSC, PAL, USB, FireWire (1394 a/b),
Camera Link, GigE
Camera
Lens
Electronics
Imager
Power/Control In
Signal Out
Imaging Basics
(Figure: grid of grayscale pixel values in the range 41–68, illustrating an image as an array of numbers)
• Color Images
– Color images commonly are acquired and internally
represented as three planes of digital data – one each
for Red, Green, and Blue
– Difference between 3-chip color and Bayer Filter
– Other representations such as HSI, LAB are derived
from the RGB data
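Deriving other representations from the three RGB planes can be sketched with a grayscale conversion. A minimal example (the helper name is hypothetical; the weights are the standard ITU-R BT.601 luminance coefficients):

```python
def to_grayscale(r_plane, g_plane, b_plane):
    """Combine the three color planes into one grayscale plane using
    the common ITU-R BT.601 luminance weights."""
    return [[0.299 * r + 0.587 * g + 0.114 * b
             for r, g, b in zip(rr, gg, bb)]
            for rr, gg, bb in zip(r_plane, g_plane, b_plane)]

# A 1x2 image: one pure-red pixel, one pure-white pixel
r = [[255, 255]]
g = [[0, 255]]
b = [[0, 255]]
gray = to_grayscale(r, g, b)  # red maps to a mid-dark gray, white stays bright
```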
Lighting Basics
• Lighting Techniques
– The goal of lighting for machine vision applications
usually is to maximize the contrast (grayscale
difference) between features of interest and
surrounding background
– Techniques are categorized generally by the direction
of the illumination source
• Most may be achieved with different sources
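The "maximize contrast" goal can be made concrete with a contrast metric. A sketch using Michelson contrast between mean gray levels (the function name and sample values are illustrative):

```python
def contrast(feature_gray, background_gray):
    """Michelson contrast between a feature and its background,
    each given as a mean grayscale value (0-255)."""
    return abs(feature_gray - background_gray) / (feature_gray + background_gray)

# A bright feature (200) on a dark background (50) is high contrast;
# a feature at 120 on a 100 background is low contrast and is a cue
# to rework the lighting technique rather than the software.
```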
Lighting Basics
• Direct bright-field
illumination
– Sources: high-angle ring
lights (shown), spot-lights,
bar-lights (shown); LED’s
or Fiber-optic guides
– Uses: general illumination
of relatively high-contrast
objects; light reflection to
camera is mostly specular
• Diffuse bright-field
illumination
– Sources: high-angle
diffuse ring lights (shown),
diffuse bar-lights; LED’s or
fluorescent
– Uses: general illumination
of relatively high-contrast
objects; light reflection to
camera is mostly diffuse
• Direct dark-field
illumination
– Sources: low-angle ring
lights (shown), spot-lights,
bar-lights; LED’s or Fiber-
optic guides
– Uses: illumination of
geometric surface features;
light reflection to camera is
mostly specular
– “dark field” is misleading –
the “field” or background
may be light relative to surface objects
Images: CCS America
Lighting Basics
• Diffuse dark-field
illumination
– Sources: diffuse, low-
angle ring lights (shown),
spot-lights, bar-lights;
LED’s or fluorescent
– Uses: non-specular
illumination of surfaces,
reducing glare; may hide
unwanted surface features
• Diffuse backlight
– Sources: highly diffused
LED or fluorescent area
lighting
– Uses: provide an accurate
silhouette of a part
• Structured light
– Sources: Focused LED
linear array, focused or
patterned lasers
– Uses: highlight geometric
shapes, create contrast
based upon shape, provide
3D information in 2D
images
• On-axis (coaxial)
illumination
– Sources: directed, diffused
LED or fiber optic area
– Uses: produce more even
illumination on specular
surfaces, may reduce low-
contrast surface features,
may highlight high-contrast
geometric surface features
depending on reflective
angle
• Collimated illumination
– Sources: collimated LED or
fiber-optic backlights
– Uses: accurate, sharp-edged
silhouettes for backlight
gauging applications
• Specialty Lenses
– Telecentric
– Microscope stages
– Macro, long WD
The Basics of Machine Vision
• Inspection Concepts
– What are the capabilities and limitations of machine
vision technology for the target application
• Requirement: specify a processing direction to take with
respect to system architecture, and the ability to specify
deliverables, performance, and acceptance criteria
– Analysis of the inspection concept can be subdivided
by general type of inspection
• Assembly Verification/Recognition
• Flaw Detection
• Gauging/Metrology
• Location/Guidance
• OCR/OCV
Machine Vision – Getting Data out of Images
• Defect/Flaw Detection
– A flaw is an object that is different from the normal
immediate background
– Imaging Issues
• Must have sufficient contrast and geometric features to be
differentiable from the background and other “good” objects
• Typically must be a minimum of 3x3 pixels in size and
possibly up to 50x50 pixels if contrast is low and defect
classification is required
• Reliable object classification may not be possible depending
upon geometric shape of the flaws
Machine Vision – Getting Data out of Images
• Gauging/Metrology
– Measurement of features
– There are physical differences between gauging
features in an image produced by a camera, and the
use of a gauge that contacts a part. These
differences usually cannot be reconciled
Machine Vision – Getting Data out of Images
• Gauging/Metrology
– Gauging concepts
• Resolution, repeatability, accuracy
• Sub-pixel measurement
• Measurement tolerances
• Resolution must be approximately 1/10 of required accuracy
in order to achieve gauge reliability/repeatability
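The 1/10 rule above translates directly into a camera-selection calculation. A sketch (helper names are hypothetical):

```python
def required_pixel_size(tolerance, rule=10):
    """Pixel size needed so that resolution is ~1/10 of the required
    accuracy (the rule of thumb from the slide)."""
    return tolerance / rule

def required_sensor_pixels(fov_width, tolerance, rule=10):
    """Pixels across the sensor needed to gauge to `tolerance` over a
    field of view of `fov_width` (same length units)."""
    return fov_width / required_pixel_size(tolerance, rule)

# Gauging a 50 mm field to +/-0.05 mm needs ~0.005 mm pixels,
# i.e. a sensor on the order of 10,000 pixels wide -- often the cue
# to shrink the field of view or consider a line-scan camera.
```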
Machine Vision – Getting Data out of Images
• Gauging/Metrology
– Imaging Issues
• Lighting to get a repeatable edge
– Backlighting, collimated light
• Telecentric lenses
• Calibration
– Correction for image perspective/plane
– Calibration error stack-up
Machine Vision – Getting Data out of Images
• Location/Guidance
– Identification and location of an object in 2D or 3D
space
– May be in a confusing field of view
– Imaging Issues
• Measurement tolerances and accuracies as described for
gauging/metrology applications
• Sub-pixel resolutions may be better than discrete gauging
results
• For guidance applications, the stack-up error in robot motion
may be significant
Machine Vision – Getting Data out of Images
• OCR/OCV
– Optical Character Recognition/Verification – reading
or verifying printed characters
– Can be fooled by print variations
– Verification is difficult depending upon the application
– Imaging Issues
• Consistent presentation of the character string
• May require extensive pre-processing
The Basics of Machine Vision
• Business Issues
– Scope of supply/deliverables; who is responsible for
what
• Engineering: design, integration, shipping, installation
• Hardware components
• Warranties
• Documentation and training
– Contractual items
• Performance guarantees
• Terms
• IP ownership
Application Analysis and Specification
• Acceptance Criteria
– How to prove the machine is functioning properly
– How to resolve differences in opinion regarding
machine function
– Clearly state acceptance criteria AND methodology in
quantifiable terms
– Acceptance will be based on stated performance
criteria
Application Analysis and Specification
• Acceptance Criteria
– Analysis of system performance must be done using a verifiable
sample or challenge set of parts
• Verifiable: All parties agree that each specific challenge part meets the
stated criteria, either reject criteria or feature size if a gauging application
– Static testing is done with challenge parts
– A gauge R&R is appropriate for gauging applications
– Production testing can be done with parallel visual inspection
• Rejected parts will be judged against the set of challenge parts
– The acceptance criteria will list false accept and false reject rates
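False accept and false reject rates fall out of a challenge-set run directly. A sketch of the bookkeeping (the data layout and counts are invented for illustration):

```python
def error_rates(results):
    """Compute (false_accept, false_reject) rates from a challenge-set
    run. `results` is a list of (truth, decision) pairs, where each
    value is 'good' or 'bad'."""
    bad = [r for r in results if r[0] == 'bad']
    good = [r for r in results if r[0] == 'good']
    # False accept: a known-bad part the system passed.
    false_accept = sum(1 for t, d in bad if d == 'good') / len(bad)
    # False reject: a known-good part the system failed.
    false_reject = sum(1 for t, d in good if d == 'bad') / len(good)
    return false_accept, false_reject

# 100 good parts (5 wrongly rejected), 20 bad parts (2 wrongly accepted):
run = ([('good', 'good')] * 95 + [('good', 'bad')] * 5 +
       [('bad', 'bad')] * 18 + [('bad', 'good')] * 2)
fa, fr = error_rates(run)
```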
Contact Information
David Dechow
President
Aptúra Machine Vision Solutions
3130 Sovereign Drive, Suite 5A
Lansing, Michigan 48911
USA
Telephone: 517-272-7820, x11
email: ddechow@aptura.com
www.aptura.com
Successfully Integrating
Vision Guided Robotics
Presented by:
David Dechow
Aptúra Machine Vision Solutions
International Conference for Vision Guided Robotics
September 30 – October 2, 2008
David Dechow
President
Aptúra Machine Vision Solutions
David Dechow
Aptúra Machine Vision Solutions
3130 Sovereign Drive, Suite 5A
Lansing, Michigan 48911
Phone: 517-272-7820, x11
Email: ddechow@aptura.com
David Dechow is president and founder of Aptúra Machine Vision Solutions, LLC. Mr.
Dechow has worked in the field of machine vision for over 25 years as a programmer,
engineer, and manager. He served 14 years on the AIA board of directors, and was a
two term president of that board. Mr. Dechow is the 2007 recipient of the AIA
Automated Imaging Achievement Award honoring industry leaders for outstanding
contributions in industrial and/or scientific imaging. Mr. Dechow is a regular speaker at
conferences and seminars, and a frequent contributor to industry trade journals and
magazines and has served on the editorial boards of Vision Systems Design magazine
and Quality Magazine’s Vision and Sensors.
Presentation not available at time of production.
International Conference for
Vision Guided Robotics
Frank Maslar
Ford Motor Company
International Conference for Vision Guided Robotics
September 30 – October 2, 2008
Frank Maslar
Technical Specialist
Ford Motor Company
Frank Maslar
Ford Motor Company
36200 Plymouth Road
Livonia, Michigan 48150
Phone: 313-805-3904
Email: fmaslar@ford.com
Key Responsibilities:
Work with universities and key suppliers to develop and implement advanced
manufacturing technology in the manufacturing of powertrain systems. Areas of focus
include vision systems and traceability.
Degrees:
B.S.M.E. Penn State
Technology Advances in
2D Vision Guided Robotics
Presented by:
John Keating
Cognex Corporation
International Conference for Vision Guided Robotics
September 30 – October 2, 2008
John Keating
Principal Product Marketing Manager
Cognex Corporation
John Keating
Cognex Corporation
1 Vision Drive
Natick, Massachusetts 01760
Phone: 508-650-3000
Fax: 508-650-3338
Email: john.keating@cognex.com
John Keating is a Principal Product Marketing Manager for In-Sight® vision systems at
Cognex Corporation. He holds a B.S. in Electrical Engineering from Boston University
and an MBA from Babson College. Since joining Cognex in 1994, he has held roles in
applications engineering management, as well as a variety of positions in industry and
product marketing.
Technology Advances in 2D Vision
Guided Robotics
John Keating
Principal Product Marketing Manager
Cognex Corporation
Types of Robotic-Vision Applications
• Robotic-assisted Inspection
– Robot presents part to vision system for inspection
– End-of-arm vision system maneuvered around part
Types of Robotic-Vision Applications
• Palletizing/Depalletizing
– Place/Remove parts on pallets
• Conveyer Tracking
– Locate unfixtured parts on conveyer
and place them in package
• Component Assembly
– Locate unfixtured parts and assemble
to other components
• Machine Tending
– Locate unfixtured parts on conveyer
and place into CNC work cells
• Robotic Inspection
– Use robot to manipulate part or
camera to inspect critical features of
part
Types of Industries Using VGR
• Consumer Electronics
• Pharmaceutical
• Medical Devices
Applications Examples for Vision Guided Robotics
• Application:
– Stacked pallets of juice boxes need to be de-
palletized for distribution
• Challenges:
– Various juice box sizes and configurations
– Parts move slightly when on pallet
– High speed of production line → must minimize
robot movement
• Solution:
– High resolution, robot-mounted vision system
with 6 mm lens and large (6 feet) field of view
– Non-linear calibration algorithm to ensure
accurate placement
– 30% reduction in robot cycle time
Enabling Technology: Non-linear Calibration
Distorted (caused by lens) vs. undistorted (after non-linear calibration)
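The flavor of correction a non-linear calibration applies can be sketched with a one-parameter radial distortion model. This is a simplified illustration (function names are hypothetical; real calibrations fit several coefficients plus perspective):

```python
def distort(x, y, k):
    """Apply a simple one-parameter radial lens distortion model
    (normalized image coordinates, distortion center at the origin)."""
    r2 = x * x + y * y
    return x * (1 + k * r2), y * (1 + k * r2)

def undistort(xd, yd, k, iters=20):
    """Invert the model by fixed-point iteration -- the kind of
    correction applied before converting pixels to robot coordinates."""
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        x = xd / (1 + k * r2)
        y = yd / (1 + k * r2)
    return x, y
```

Round-tripping a point through `distort` then `undistort` recovers the original coordinates to well within sub-pixel error for moderate distortion.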
• Application:
– Locate holes and assemble rivnuts
into automotive frame
• Challenges:
• Solution:
– Wrist-mounted, high resolution
vision system
Assembly Cross-Section
Enabling Technology: High Accuracy Gauging
• Non-Linear Calibration
– Removes lens and perspective distortion
(Figure: standard-resolution 768 × 480 image)
• Application:
– Bagged food packet
– Pick-and-Place from conveyor into
shipment boxes
– Packets vary in size and can
sometimes overlap – need flexible
solution to provide exact location
• Challenges:
– Patterned background
– Non-uniform lighting
– Overlapping parts
– Specular reflection from bags
• Solution:
– Fixed Mount Vision System with
geometric pattern finding
Enabling Technology:
Advanced Pattern Finding Algorithm
Trained Part
• Challenges:
– Large variety of wheel types
– Part finish varies due to part processing
– Part cannot be shrouded → resulting in
variable lighting
– Part type changes
– Part is loosely placed in bin
• Solution:
– Fixed-mount vision system communicating to
6-axis robot
– 4 Month Project Payback
Robotic Inspection Application
in the Durable Goods Industry
• Application:
– Inspect washing machine
– Inspect controls, LEDs, and labels for
correct placement and surface finish
• Challenges:
– Large variety of panel colors and
configurations
– Large area to inspect for small defects
• Solution:
– Six-axis robot presents washer panels to
vision system for inspection
Conveyer Tracking Application
in the Pharmaceutical Industry
• Application:
– Pharmaceutical product tubes
need to be located on conveyer
and placed into package for
distribution
• Challenges:
– Tubes are loosely placed on
conveyer
– Range of product sizes
• Solution:
– Fixed-mount vision system
– Pick-and-Place Robot
Enabling Technology: Robot Communications
• Communication Flexibility
– Serial and Ethernet based
– Formatted strings, specific drivers,
and native mode commands
• Point-and-Click Configuration
– Build up formatted communication
strings quickly
• Code Samples
– Robot and vision system sample
code available
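A formatted-string exchange of the kind described above can be sketched in a few lines. The delimiter layout here is invented for illustration; real drivers and native-mode command formats vary by robot brand:

```python
def format_result(x, y, theta, found):
    """Build a simple comma-delimited result string of the kind a
    vision system might send to a robot over Ethernet or serial.
    (Hypothetical format: status,x,y,theta terminated by CR/LF.)"""
    status = 1 if found else 0
    return "%d,%.3f,%.3f,%.3f\r\n" % (status, x, y, theta)

def parse_result(msg):
    """Robot-side parse of the same string."""
    status, x, y, theta = msg.strip().split(",")
    return int(status) == 1, float(x), float(y), float(theta)
```

A part found at (125.5, −40.25) with a 12-degree rotation would be sent as `1,125.500,-40.250,12.000` followed by CR/LF.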
Robotic Inspection Application
in the Electronics Industry
• Application:
– Consumer electronics stereo
components are assembled in a flexible
automation cell
• Challenges:
– Verify that the correct components are
being assembled
– Match model number with database
description in computer system
• Solution:
– Robot presents part to fixed-mount
vision system
– OCV algorithm “reads” part number
– OPC communications to SCADA
system
Enabling Technology: OPC & ActiveX
• Application Description
– Robot locates parts on conveyer and
places them into machine press
• Challenges:
– Safety concerns prohibit manual
intervention
– Parts are in random orientation on
conveyer
• Solution:
– Fixed mount vision system suited to a
rugged, industrial environment
Enabling Technology: Rugged Hardware
• Application:
– Inserts need to be loaded
into an injection mold
housing
• Challenges:
– Need flexible solution to
accommodate a wide range
of parts
– Heavy Industrial
Environment
• Solution:
– Industrially-hardened fixed
mount vision system
Robotic Inspection Application
in the Automotive Industry
• Application:
– Need multiple inspections on variety
of parts
• Challenges:
– Need to achieve 70 inspections in
under 1 minute
– Small lot-size production and need
to minimize machine changeover
• Solution
– Robot-mounted vision system
– High speed & high accuracy
system for multi-point inspection
Enabling Technology: DSP Performance
Advancements
• Higher Performance
– Embedded image processing
– Ability to run more powerful
algorithms
(Chart: DSP performance advancement by year, 2003–2011)
Advantages of Vision Guided Robotics
• Communications
– Availability of Robot Protocols
– Making the application code work
together
• Return on Investment
ROI = (Gain from Investment – Cost of Investment) / Cost of Investment
• Product Quality
Cost of Quality = Internal Failure Cost + External Failure Cost +
Inspection Cost + Prevention Cost
Improve Return on Investment
ROI = (Gain from Investment – Cost of Investment) / Cost of Investment
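The two formulas above are simple enough to sketch in code. The example figures are invented for illustration:

```python
def roi(gain, cost):
    """ROI as defined on the slide: (gain - cost) / cost."""
    return (gain - cost) / cost

def cost_of_quality(internal_failure, external_failure, inspection, prevention):
    """Cost of quality as the sum of its four components."""
    return internal_failure + external_failure + inspection + prevention

# A $150,000 cell that returns $375,000 over its payback window
# yields roi(375000, 150000) = 1.5, i.e. 150%.
```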
• Elimination of costly mechanical fixtures
Cost of Quality
John Keating
Principal Product Marketing Manager
Cognex Corporation
1 Vision Drive
Natick, MA 01760
USA
Telephone: 1-508-650-3000
email: john.keating@cognex.com
www.cognex.com
Top Lessons Learned in
Vision Guidance Applications
Presented by:
Eric Hershberger
and
David Wyatt
Eric Hershberger
Senior Engineer
Applied Manufacturing Technologies
Eric Hershberger
Applied Manufacturing Technologies
219 Kay Industrial Drive
Orion, Michigan 48359
Phone: 248-409-2000
Fax: 248-409-2027
Email: ehershberger@appliedmfg.com
Eric has a degree in Computer Science from Michigan Tech. He enjoys working with
vision systems and loves robot calibration and performance testing.
International Conference for Vision Guided Robotics
September 30 – October 2, 2008
David Wyatt
Staff Engineer
Applied Manufacturing Technologies
David Wyatt
Applied Manufacturing Technologies
219 Kay Industrial Drive
Orion, Michigan 48359
Phone: 248-409-2073
Fax: 248-409-2027
Email: dwyatt@appliedmfg.com
David was educated as an Electrical Engineer at the University of Missouri, is a Charter
Member of the Machine Vision Association of SME, the founder of Midwest Integration
and is a Staff Engineer at Applied Manufacturing Technologies. David started in vision
guidance at Delco Electronics, where he was involved in writing many of the core vision
algorithms in use today and served as the Machine Intelligence Chairman for General
Motors. At Midwest Integration, David performed
hundreds of automation projects earning the Vendor of the Year Award from Day and
Zimmerman Inc. as well as the US Small Business Administration’s Award for
Excellence.
Top Lessons Learned
in Vision Guidance Applications
David R. Wyatt
Staff Engineer
Applied Manufacturing Technologies
Fixture Repeatability
• Yes, vision relaxes fixture
requirements
• A camera is a 2D sensor
• We can get 3D info from cameras
• Make sure we don’t make 3D decisions on
2D data
– Avoid using shadows to gage height
– Avoid using reduced feature sizes as an
indication of distance
– Perspective can be valid in certain applications
Global vs. Local Calibration
• It is possible to improve
robot accuracy in a
smaller work envelope
– Accuracy will decrease
outside of that smaller
envelope
• It is always best to
calibrate the vision
system as close to the
work area as possible
Contact Information
David Wyatt
Staff Engineer
Applied Manufacturing
Technologies
219 Kay Industrial Drive
Orion, Michigan 48359
USA
Telephone: 248-409-2000
email: dwyatt@appliedmfg.com
www.appliedmfg.com
How Advancements in Vision Guidance are
Making Flexible Feeding Applications Desirable
Presented by:
Eric Lewis
Flexomation
Eric Lewis
President
Flexomation, LLC
Eric Lewis
Flexomation, LLC
586 Northland Boulevard
Forest Park, Ohio 45240
Phone: 513-825-0555
Fax: 513-825-1870
Email: eric.lewis@flexomation.com
Presented by:
Mark Noschang
Adept Technology
Mark Noschang
Manager of Applications Engineering for North America
Adept Technology, Inc.
Mark Noschang
Adept Technology, Inc.
11133 Kenwood Road
Cincinnati, Ohio 45242
Phone: 513-792-0266, x106
Fax: 513-792-0274
Email: mark.noschang@adept.com
Mark Noschang was appointed Manager of Applications Engineering for North America
in July of 2008. He joined Adept in October 1997 as an applications engineer in the
company’s Cincinnati, Ohio office. In his tenure, he served as a senior applications
engineer, as well as fulfilling roles in the training department. Mr. Noschang holds a
Bachelor’s Degree in Electrical Engineering from the University of Cincinnati.
Vision Guided Robot Applications for
Packaging and Flexible Feeding
Mark Noschang
Manager of Applications Engineering
Adept Technology, Inc.
Agenda
• Introduction
• Motivation
• System Components
• Pitfalls for system implementation
Packaging and Flexible Feeding?
Flex Feeding???
Packaging vs. Feeding
• Packaging:
– A method by which products are enclosed to
provide containment and protection, allowing
for easier shipment, distribution, and sale
• Flexible Feeding:
– A method by which parts are taken from bulk
storage to a known orientation, typically for
assembly operations
What is Flexible Material Handling?
• “A method of taking parts from bulk to a
known orientation that can handle multiple
part sizes and styles.”
• Abilities:
– Handle a wide variety of part types
– Perform frequent model changeovers quickly / easily
– Process multiple parts and models simultaneously
– Respond quickly to part design changes
Market Business Drivers
1. Higher Throughput per Factory Space
Factory Space is at a Premium
[Chart: cost per part, Manual vs. Robot, 1990–2006 – manual cost per part rises while robot cost per part falls]
• Results:
– Saves engineering time
– Saves start-up time
– Allows equipment to be re-deployed
– Saves money
Trends in Packaging and Feeding
• Food, Consumer & Household Products, and Personal Care Products are becoming more specialized
• Companies desire to keep manufacturing close
to consumers
• The use of contract packagers is increasing
• Labor & worker liability increasing
• Handling randomly oriented product from
conveyors is required for many companies
• Flexible automation enables companies to
compete and flourish in a global economy
Contact Information
Mark Noschang
Manager of Applications Engineering
Adept Technology, Inc.
11133 Kenwood Road
Cincinnati, Ohio 45242
USA
Telephone: 513-792-0266, x106
email: mark.noschang@adept.com
www.adept.com
High Accuracy Robot Calibration, Wireless
Networking, and Related Technical Issues
Presented by:
Eric Hershberger
and
David Wyatt
David Wyatt
Staff Engineer
Applied Manufacturing Technologies
David Wyatt
Applied Manufacturing Technologies
219 Kay Industrial Drive
Orion, Michigan 48359
Phone: 248-409-2073
Fax: 248-409-2027
Email: dwyatt@appliedmfg.com
David was educated as an Electrical Engineer at the University of Missouri, is a Charter
Member of the Machine Vision Association of SME, the founder of Midwest Integration
and is a Staff Engineer at Applied Manufacturing Technologies. David started in vision
guidance at Delco Electronics. At Delco Electronics, David was involved in the writing
of many of the core vision algorithms in use today and served as the Machine
Intelligence Chairman for General Motors. At Midwest Integration, David performed
hundreds of automation projects earning the Vendor of the Year Award from Day and
Zimmerman Inc. as well as the US Small Business Administration’s Award for
Excellence.
Eric Hershberger
Senior Engineer
Applied Manufacturing Technologies
Eric Hershberger
Applied Manufacturing Technologies
219 Kay Industrial Drive
Orion, Michigan 48359
Phone: 248-409-2000
Fax: 248-409-2027
Email: ehershberger@appliedmfg.com
Eric has a degree in Computer Science from Michigan Tech. He enjoys working with
vision systems and loves robot calibration and performance testing.
Wireless Vision Systems and
High Accuracy Vision Guidance
Eric Hershberger
Senior Engineer
Applied Manufacturing Technologies
Wireless Vision – A Reality
• New Gigabit Ethernet (GigE Vision™) camera
standard
• New IEEE 802.11n draft protocol
• Routers available on the market
• Fewer wires, less expensive high flex cables
• Easy to integrate new cameras with older vision
systems
• Problems and issues
• Other options
GigE Vision™ Standard
• New GigE Vision™ camera standard
– 1000 megabits per second (Mbps)
• Lots of camera manufacturers
– DALSA, Basler, Prosilica
• Fast image transfer
• 100m cable runs
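A back-of-the-envelope check of what a 1000 Mbps link supports: the sketch below divides usable bandwidth by image size. The 90% efficiency factor is an assumption standing in for protocol overhead, not a figure from the standard.

```python
def max_frame_rate(width, height, bits_per_pixel, link_mbps=1000, efficiency=0.9):
    """Rough upper bound on frames/sec over a GigE Vision link.

    `efficiency` accounts for protocol overhead; the exact figure depends
    on packet size and the camera, so treat this as an estimate only.
    """
    bits_per_frame = width * height * bits_per_pixel
    usable_bps = link_mbps * 1_000_000 * efficiency
    return usable_bps / bits_per_frame

# 1024x768 8-bit monochrome camera on a 1000 Mbps link
print(round(max_frame_rate(1024, 768, 8), 1))  # on the order of 140 fps
```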
IEEE 802.11n draft protocol
• Not approved until ~December 2009
– Hardware already for sale as Draft-n
compliant
– Backwards compatible, but not
recommended
• MIMO – Multiple-input Multiple-output
– More antennas
– Theoretical 600Mbps
• 5.0 GHz recommended
– Older protocols use the 2.4 GHz band
Routers Recommended
• Linksys WRT600N
– Best implementation of the 5.0 GHz draft-n
– 12 V power
– Easy to mount to the EOAT
– Built-in GigE switch
– Best to use
Fewer Wires, Less Expensive High-Flex Cables
• Bluetooth 3.0
– Up to 480 Mbps
– Ultra-wideband (UWB)
• Nanny cams, or wireless Ethernet-based cameras
– Typically a CMOS imager
– Lower resolution
– 2.4 GHz band
High Accuracy Vision Guidance
• 2D systems typically use a dot calibration grid – a manual process
• 3D systems, auto
calibration available
• Portable CMM’s can
be used for very high
accuracy calibration
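The manual dot-grid calibration mentioned above boils down to fitting a pixel-to-world mapping from matched grid points. A minimal affine sketch with synthetic data follows; a real calibration would also model lens distortion:

```python
import numpy as np

def fit_pixel_to_world(pixels, world):
    """Fit an affine map world ~ A*pixel + b from matched dot-grid points.

    pixels, world: (N, 2) arrays of corresponding coordinates.
    Returns the 2x3 matrix [A | b].
    """
    pixels = np.asarray(pixels, float)
    world = np.asarray(world, float)
    X = np.hstack([pixels, np.ones((len(pixels), 1))])  # homogeneous [u, v, 1]
    M, *_ = np.linalg.lstsq(X, world, rcond=None)
    return M.T  # rows give world x and world y

# Synthetic dot grid: 0.2 mm per pixel, world origin offset (10, 5) mm
px = np.array([[0, 0], [100, 0], [0, 100], [100, 100]], float)
wd = px * 0.2 + np.array([10.0, 5.0])
M = fit_pixel_to_world(px, wd)
print(np.round(M, 3))
```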
Global vs. Local Calibration
• It is possible to improve
robot accuracy in a
smaller work envelope
– Accuracy will decrease outside of that smaller work envelope
• Always best to calibrate
the vision system as
close to the work area as
possible
High Accuracy Vision Guidance
• Combination of robot
calibration, high accuracy
TCP, and vision optics can
improve your VGR project
• Robot calibration alone can
help increase the accuracy
of off line programming for
downloads
Contact Information
Eric Hershberger
Senior Engineer
Applied Manufacturing Technologies
219 Kay Industrial Drive
Orion, Michigan 48359
USA
Telephone: 248-409-2000
email: ehershberger@appliedmfg.com
www.appliedmfg.com
Vision Based Line Tracking
Presented by:
Frank Maslar
Ford Motor Company
Frank Maslar
Technical Specialist
Ford Motor Company
Frank Maslar
Ford Motor Company
36200 Plymouth Road
Livonia, Michigan 48150
Phone: 313-805-3904
Email: fmaslar@ford.com
Key Responsibilities:
Work with universities and key suppliers to develop and implement advanced
manufacturing technology in the manufacturing of powertrain systems. Areas of focus
include vision systems and traceability.
Degrees:
B.S.M.E. Penn State
Vision Based Line Tracking
Frank Maslar
Technical Specialist
Ford Motor Company
Background
• Interactive directed research project
between Ford Advanced Manufacturing
Technology Development and Purdue
University Robot and Vision Lab
• Team Leaders
– Ford – Frank Maslar
– Purdue University – Professor Avinash Kak
Current Robotic Applications
• Spot Welding: 31%
• Material Handling: 27%
• Painting: 18%
• Arc Welding: 13%
• Material Removal: 4%
• Assembly: 3%
• Dispensing: 3%
• Inspection: 1%
Source: Robotic Industries Association
Opportunities
[Timeline images: 1910s, 1940s, 1990s]
Moving Line Assembly
• Enhanced accuracy
– 3 mm
• Multi-loop control
– Enhanced robustness
• Visual tracking systems
– Geometric model-based approach
– Appearance-model-based approach
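The multi-loop idea (a coarse encoder-driven loop trimmed by a fine visual-feedback loop) can be sketched as a toy example. This is illustrative only, not the Ford/Purdue controller; the gain and positions are made up:

```python
def fine_correction(correction, vision_error, k=0.5):
    """Integrate the vision-measured residual into the fine-loop offset."""
    return correction + k * vision_error

encoder_pred = 100.0   # coarse loop: encoder-based prediction of part position (mm)
true_pos = 102.0       # actual position: the encoder prediction is 2 mm off
corr = 0.0
for _ in range(8):
    target = encoder_pred + corr                     # commanded pose = coarse + fine
    corr = fine_correction(corr, true_pos - target)  # vision measures the residual
print(round(encoder_pred + corr, 2))                 # converges toward 102.0
```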
Control Architecture
Control
Arbitrator
Robot
Controller
Coarse Control
[Plot: coarse-control pose estimates; perspective projection onto the current image]
Avinash C. Kak
Professor of Electrical and Computer Engineering
Electrical and Computer Engineering
Purdue University
EE Building
West Lafayette, Indiana 47907
765-494-3551
kak@purdue.edu
http://rvl4.ecn.purdue.edu/~kak/
Contact Information
Frank Maslar
Technical Specialist
Advanced Manufacturing
Technology Department
Ford Motor Company
36200 Plymouth Road
Livonia, Michigan 48150
USA
Telephone: 313-390-2132
email: fmaslar@ford.com
www.ford.com
Case Study:
Robots & Vision in Life Sciences
and Automated Pharmacy
Presented by:
David Arceneaux
Stäubli Robotics
David Arceneaux
Business Development & Marketing Manager
Stäubli Robotics
David Arceneaux
Stäubli Robotics
201 Parkway West
Duncan, South Carolina 29334
Phone: 864-486-5416
Fax: 864-486-5467
Email: d.arceneaux@staubli.com
Stäubli Robotics and its employees are corporate members of the Robotic Industries
Association (RIA), the Association for Laboratory Automation (ALA), Semiconductor
Equipment and Materials International (SEMI), the Society of the Plastics Industry (SPI),
and the Society of Manufacturing Engineers (SME).
Robots & Vision in Life Sciences and
Automated Pharmacy
David Arceneaux
Business Development-Marketing Manager
Stäubli Robotics
Agenda
Why Automate?
Applications
No cross-contamination (human/product
or product/human)
Miniaturization
Biocompatibility (Cleanroom)
Key Benefits
Elimination of Costly Fixtures
Reduced capital investment costs
Reliability
– Automated systems need to be able to work consistently for extended periods with minimal human intervention (24/7)
Scalability
Consideration to current equipment and future growth should include
automation that is modular and scalable.
Flexibility
Automation needs to be easily reconfigurable when the need arises.
Flexible robotic automation allows devices to be positioned closer to the robot for a compact
solution with all of the capacity and functionality of the larger systems offered in the industry.
Applications:
Bench Top Automation
Bench top robotic systems are inexpensive, flexible robotic solutions capable
of performing a wide array of applications.
Cells can be configured with two or more devices utilizing very little floor
space.
Applications:
Cell culture automation
• Cell culture is, and has historically been, an essential component of the drug discovery toolbox.
• Cell culture provides the proteins, membrane preparations and other raw materials required for biological research.
• In recent years, demand for cells and new cell lines has increased dramatically with the emergence of high-throughput screening, reinforcing the need for robotic automation.
Applications:
Ultra High Throughput Automation
Robots are capable of UHTS applications with a throughput of over 1 million assays per day.
These systems are built for industrial use and are capable of running 24 hours a day, 7 days a
week.
Case Study: Automated Pharmacy (RIVA)
ROBOTIC IV AUTOMATION
The new standard in IV admixture compounding.
– Safety
The Foundation acts as the primary fundraising body for the St.
Boniface General Hospital Inc., and promotes excellence in
health care research related to the prevention and treatment of
disease, the promotion of good health, and improvements in
patient care.
RIVA
Robotic IV Automation
The Product Overview
RIVA is an integrated system designed to automate the process
of preparing IV admixtures in the hospital pharmacy.
Up to 600 labeled, patient-specific or batch doses per 8-hour shift with one operator
– Resource savings (3 techs per shift – ROI in less than one year)
Just about any laboratory and hospital can take advantage of some of the
advances in automation; the questions are what to automate and to what
extent.
The options cover the spectrum from islands of automation, which retain some
manual processes, to fully automated integrated systems. The optimal degree
of laboratory automation depends on the laboratory setting and considerations
of cost, throughput, and flexibility.
Other considerations include the time that will be required to complete the
installation, the space available, the proportion of the tests that are routine, the
availability of skilled technicians, safety, and reliability.
David Arceneaux
Business Development-Marketing Manager
Staubli Robotics
201 Parkway West
Duncan, South Carolina 29334
USA
Telephone: 864-486-5416
email: d.arceneaux@staubli.com
www.staubli.com
Unmanned Systems Intelligence, Vision and
Automation Concepts for Combat Engineer and
Other Battlefield Missions
Presented by:
Jerry Lane
Applied Research Associates, Inc.
Jerry Lane
Director Great Lakes Office
Applied Research Associates
Jerry Lane
Applied Research Associates
48320 Harbor Drive
Chesterfield Twp., Michigan 48047
Phone: 586-242-7778
Email: glane@ara.com
Jerry joined ARA in 2005 to work on multiple unmanned systems projects including the
Modular Robotic Control System for Route Clearance and Combat Engineer missions,
the man-packable Nighthawk mini-unmanned aerial vehicle and the obstacle/stair
climbing robot, and the Lightweight Reconnaissance Vehicle (LRV). Jerry’s prior
assignment for 28 ½ years was at TARDEC leading various advanced vehicle
technologies and robotic vehicle projects. Jerry is also on the Board of Directors of the
Michigan Chapter of NDIA and First Vice President of the Association for Unmanned
Vehicle Systems International (AUVSI). Jerry is the Co-Founder and Co-Chair of the
Intelligent Ground Vehicle Competition (IGVC).
Robotic Control
System
+
Construction Equipment
Armor option
Supportable
Low cost
Optional equipment
High power
Hydraulic available
+
Multiple IED Disruptors
Rollers & Chains
Rakes & Cutters
Jammers
GPR/EM Detectors
Robotic Concept
with IED Disruptor Options
Rhino
[Diagram: robotic platform with detection launcher and neutralization launcher – side and front views]
UAV/MAV:
Recovery Solutions:
State Police Sedan (Ford Crown Victoria):
[Diagram: deployable capture system – side and front views]
Jerry Lane
Director Great Lakes Office
Applied Research Associates
Inc.
48320 Harbor Drive
Chesterfield Twp. Michigan 48047
USA
Telephone: 586-242-7778
Email: glane@ara.com
www.ara.com
International Trends and Applications in 3D
Vision Guided Robotics
Presented by:
Adil Shafi
SHAFI Innovation Inc.
Adil Shafi
President
SHAFI Innovation, Inc.
Adil Shafi
SHAFI Innovation, Inc.
803 Lakeshore Drive
Houghton, Michigan 49931
Phone: 734-516-6761
Email: adil.shafi@shafi-inc.com
Adil Shafi is founder and president of SHAFI, Inc. and SHAFI Innovation, Inc.
(www.shafiinc.com) He has worked in the robotics and vision industry for the last 20
years and his companies have pioneered more than 100 Software Solutions in the area
of Vision Guided Robotics. His company's RELIABOT software runs on equipment
worth $500 million and is ranked #1 in the world for AutoRacking and Bin Picking. Adil
received three degrees in Computer Science and Electrical Engineering from Michigan
Technological University and worked in Chicago, Manhattan and the Silicon Valley prior
to founding his companies in Michigan. He works closely with many industry, academic
and government organizations and has travelled to more than 90 states and countries in
the world.
Presentation not available at time of production.
Advances in 3D Vision Guided
Robotics at Fraunhofer IPA
Presented by:
Jens Kuehnle
Fraunhofer IPA
Jens Kuehnle
Research Associate
Fraunhofer Institute Manufacturing Engineering and
Automation (IPA)
Jens Kuehnle
Fraunhofer Institute Manufacturing Engineering
and Automation (IPA)
Nobelstrasse 12
70569 Stuttgart
Germany
Phone: 49 711 970 1861
Fax: 49 711 970 1004
Email: kuehnle@ipa.fraunhofer.de
Jens Kuehnle studied pure and applied mathematics at Ulm University, Germany and at
San Diego State University, California, USA. He completed his diploma in 2005 and his
Master of Science in 2006. Since 2007, he has been employed as a research associate in the
Information Processing department at the Fraunhofer Institute for Manufacturing Engineering
and Automation (IPA), Stuttgart, Germany. His areas of interest include software
development for 3D data evaluation, 3D metrology and computed tomography, 3D
perception in robot vision and dimensional inspection.
Advances in 3D Vision Guided
Robotics at Fraunhofer IPA
Jens Kuehnle
Research Associate
Fraunhofer IPA
Structure
• Motivation
• 3D Measurement Principles
• Applications:
– 3D Obstacle Avoidance for Service Robots
– 3D Object Recognition and Localization for
Automation and Handling Engineering –
Case Studies on Bin Picking
• Summary
© Fraunhofer IPA
Fraunhofer Society in Germany
58 institutes
at 40 locations
Fraunhofer IPA, Stuttgart
Focal Fields of Research
at Fraunhofer IPA
Corporate Management:
Life cycle management, product creation management, technical risk management, technical information systems, hazardous materials and recycling management, quality management, factory planning and design, work-flow and process planning, production system and plant management, production optimization, order processing management, inbound and outbound logistics, manufacturing logistics, supplier parks, network logistics and supply chain management, rapid product development
Automation:
Digital Factory, visualization and interaction, new fields of application, human-machine interaction, industrial and service robots, assembly systems, robot assistance systems, sensor technology, mechatronics and microsystems engineering, signal and power transmission, measurement and testing technology, ultraclean technology
Surface Technology:
Coating processes and plant for paints and powder coating, deposition of metals and metal compounds (electroplating, PVD/CVD), optimization / planning of processes and plants, generation of functional layers and nano layers, analysis and testing techniques for components, layers, surface engineering processes and sources of defects, simulation of coating processes and spatial flow
Department
Technical Information Processing
• 3D Object Recognition
• Robot Vision
• Industrial Image Processing
• Digital Signal Analysis
• Measurement and Testing Technology
• 3D Modeling and Reverse Engineering
• Rapid Prototyping, Rapid Product Development
• Industrial Computer Tomography
Motivation
Challenges of industrial production → effects on machine/robot vision:
• Zero-defect production → measurement in the manufacturing cycle
• Part complexity → intuitive man-machine communication; no expert knowledge required
• Robot usage in human environments → 1st Law of Robotics: collision avoidance
• Robot accomplishes complex tasks (e.g., parts handling) → perception
Motivation
Challenges of industrial production → effects on machine/robot vision:
• Zero-defect production → measurement in the manufacturing cycle
• Part complexity → intuitive man-machine communication; no expert knowledge required
• Robot usage in human environments → real-time modeling of the 3D workspace with memory
• Robot accomplishes complex tasks (e.g., parts handling) → object recognition and localization of industrial parts
3D Measurement Principles
Rough Classification
[Diagram: measurement principles grouped by physical principle]
• (De)modulation: time-of-flight, interferometry
• Triangulation: sheet of light (line projection), structured light, stereo (2D camera, movement)
• (De)focus determination: depth-from-focus
Moving Voxel Model with Memory
[Figure: voxel model following the robot through the workspace]
Experimental Platform
3D sensor SR3000
biaxial drive
Performance of Voxelization Algorithm
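To illustrate what the voxelization step does, here is a minimal sketch (assumed 5 cm voxels; not the IPA implementation):

```python
import numpy as np

def voxelize(points, voxel_size=0.05):
    """Map a 3D point cloud to the set of occupied voxel indices."""
    idx = np.floor(np.asarray(points, float) / voxel_size).astype(int)
    return {tuple(i) for i in idx}

def update_with_memory(occupied, new_points, voxel_size=0.05):
    """Merge a new scan into the persistent voxel model (the 'memory').
    A real implementation would also age out stale voxels; omitted here."""
    return occupied | voxelize(new_points, voxel_size)

scan = [[0.01, 0.02, 0.03], [0.06, 0.02, 0.03]]
occupied = voxelize(scan)                             # two distinct 5 cm voxels
occupied = update_with_memory(occupied, [[0.11, 0.0, 0.0]])
print(sorted(occupied))
```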
3D Object Recognition and
Localization for Automation and
Handling Engineering
Case Studies on Bin Picking
Object Recognition and Localization
[Legend: sorted parts; partly ordered (2D camera); randomly stored (3D sensor)]
Sorted parts:
• Parts stored in special carriers
• Parts supplied totally ordered
• Vision system redundant
• However, carriers are specifically adapted to the parts stored; thus, variations in the parts usually require the carriers to be changed as well
• Carriers are space-consuming
Object Recognition and Localization
Sorted parts
Object Recognition and Localization
Partly ordered:
• Parts in a known plane (e.g., belt, …) with only 3 DOF (translation x/y, rotation z)
– one 2D camera
– image processing
• Parts in arbitrary position (6 DOF) with identifiable features (e.g., corners, holes, …)
– one (or more) 2D camera(s)
– photogrammetry
Object Recognition and Localization
Randomly stored:
• Parts "thrown in a box"
• Parts supplied unordered
• E.g., bin picking
– 3D sensor
Problems like "Bin Picking"
• Applications
– handling of known parts with a robot
– e.g., supplying chaotically stored parts to the manufacturing chain
Different Approaches at Fraunhofer IPA
Approach 1: database-driven
• Bin Picking ≠ Bin Picking
– part features (e.g., geometry)
– form of supply
– etc.
Approach 1:
[Figure: scan data matched against an offline-generated database]
Approach 1:
Overview
Generation of a point cloud (e.g. with a rotated laser scanner) → object localization of workpieces with an offline-generated database → gripping point calculation, movement planning and gripping of a workpiece
Approach 1:
Status
• Current state of development:
– Tool to generate database implemented
– Flexible object localization and gripping-point calculation finished
– Prototype of bin picking system for shafts implemented
• Currently under development:
– Integration of robot cell at Hirschvogel Umformtechnik GmbH (Denklingen, Germany) with cycle time 8 s, until end of 2008
– Tests with different types of objects (e.g. ring, housing)
[Figures: gripper; found parts]
Approach 2 (basics):
Segmentation of Geometric Primitives
• Geometric primitives are:
– Plane, Sphere, Cylinder, Cone, Torus
• Best-fit principle (non-linear least squares) optimizes:
– orthogonal distance of each measurement point X_i to its base point X_i′ on the primitive: d_i = X_i − X_i′
– distance vector: d^T = (d_1, …, d_m)
– objective function: σ_0² = d^T P^T P d, with weighting matrix P^T P
– optimization problem: min over the primitive parameters a and base points {X_i′}_{i=1}^m ∈ F of σ_0²({X_i′(a)}_{i=1}^m)
Approach 2 (basics):
Best-Fitting of Geometric Primitives
• If the point cloud represents more than one geometric primitive, the iterative fitting can be complemented with an automatic segmentation into inliers and outliers:
– error of fit ≈ order of magnitude of the measurement error
[Figure: points classified as inliers and outliers]
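The best-fit-plus-segmentation idea above can be illustrated with an algebraic sphere fit, a linear stand-in for the orthogonal-distance formulation on the slide; the data and tolerance below are made up:

```python
import numpy as np

def fit_sphere(points):
    """Linear least-squares sphere fit; returns (center, radius).
    Minimizes an algebraic residual rather than the true orthogonal
    distances d_i, which keeps the problem linear."""
    P = np.asarray(points, float)
    A = np.hstack([2 * P, np.ones((len(P), 1))])
    b = (P ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

def segment_inliers(points, center, radius, tol):
    """Split points into inliers/outliers by residual against the fit."""
    P = np.asarray(points, float)
    d = np.abs(np.linalg.norm(P - center, axis=1) - radius)
    return P[d <= tol], P[d > tol]

# Noiseless points on a unit sphere at the origin, plus one outlier
sphere_pts = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
                       [0, -1, 0], [0, 0, 1], [0, 0, -1]], float)
cloud = np.vstack([sphere_pts, [[3.0, 0.0, 0.0]]])
center, radius = fit_sphere(sphere_pts)
inliers, outliers = segment_inliers(cloud, center, radius, tol=0.1)
print(np.round(center, 3), round(radius, 3), len(inliers), len(outliers))
```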
Approach 2:
Approach 2:
Overview
1. Scanning the unordered scene (scan data)
2. Recognition/localization (found cylinders)
Approach 2:
Status
Performance of Bin Picking Algorithms
• Computing time:
– Approach 1: about 0.5 sec. with database < 40 MB.
– Approach 2: about 0.25 sec.
• Accuracy of localization adaptable:
– Approach 1: dependent on the rotational resolution of the database (typically 2°)
– Approach 2: bounded by sensor inaccuracy (typically < 0.5 mm)
• Not dependent on 3D sensor used
• Successfully adapted to different parts
• Outstanding rate of recognition for parts tested
• Adaptable to different parts:
– Approach 1: definition of bounding volumes
– Approach 2: examination of geometric primitives (relation, symmetry, …)
Summary
Acknowledgments
Jens Kuehnle
Research Associate
Fraunhofer Institute
Manufacturing Engineering and
Automation (IPA)
Nobelstrasse 12
D-70569 Stuttgart
Germany
Telephone: +49 711 970 1861
email: kuehnle@ipa.fraunhofer.de
www.ipa.fraunhofer.de
Vision Guided Part Loading/Unloading
from Racks for Automotive
Applications – Lessons Learned
Presented by:
Robert Anderson
Chrysler LLC
Robert Anderson
New Technology Manager
Advance Manufacturing Engineering
Chrysler LLC
Robert Anderson
Chrysler LLC
800 Chrysler Drive, CIMS 482-04-16
Auburn Hills, Michigan 48326
Phone: 248-944-6076
Fax: 248-841-6272
Email: ra2@chrysler.com
Mr. Anderson received Bachelor’s and Master’s of Science degrees from the Ohio State
University and a Master’s of Engineering degree from the University of Michigan. Before
coming to Chrysler, Mr. Anderson worked for General Electric and Automatix, Inc.
Mr. Anderson has four patents in powder metallurgy and laser processing and has
published several technical articles.
Vision Guided Part Loading & Unloading from Racks for
Automotive Applications – Lessons Learned
Robert Anderson
New Technology Manager
Advance Manufacturing Engineering
Chrysler, LLC
Automotive Part Loading / Unloading
Topics
[Chart: 70% increase – from 181 in 2000 to 306 in 2008]
Flexible Processes
Robotic Vision Experience
• Early Experience
– First vision aided rack unload application in 1997 using stationary cameras.
– Robot mounted cameras for rack unloading used first in 2006.
• Current Applications
– Belvidere Assembly was the first Chrysler program to use comprehensive
vision rack unload strategy.
– All but one Chrysler assembly plant and all Stamping/Sub-Assembly
fabrication plants use rack load / unload vision systems.
– Approximately 75 robotic vision rack load / unload applications.
– Load and unload a wide range of body shop components: doors, floorpans,
body sides, rails, crossmembers. . . .
• Future
– Standardize systems, incorporate best practices, improve system integration
and robustness.
– Possibly reduce the number of assemblies loaded/unloaded with vision.
– Looking at bin picking but need solid business case.
Rack Load / Unload Applications
Why Use Vision ? – Reduce Cost
• 2½D
– Feature size or shape change is interpreted as a change in Z
– Requires a 3D object model
– Object tilt introduces measurement error
– Often used for parts which have little rotation in the rack
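The 2½D approach infers Z from apparent feature size through the pinhole relation Z = f·W/w. A minimal sketch, with hypothetical camera parameters:

```python
def depth_from_feature_size(focal_px, true_width_mm, image_width_px):
    """Pinhole estimate Z = f * W / w, as used by 2.5D guidance.

    Valid only while the feature stays parallel to the image plane;
    tilt shrinks the apparent width and corrupts the Z estimate,
    which is the measurement-error caveat noted above.
    """
    return focal_px * true_width_mm / image_width_px

# Hypothetical camera: focal length 1200 px; 50 mm feature imaged at 120 px
print(depth_from_feature_size(1200, 50.0, 120))  # 500.0 mm
```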
Vision Approaches
Brightness Range
Object surface
Stationary Mounted Cameras
Vision Approach Lesson Learned
• One solution will not likely be best for all applications
– Drives increased cost and complexity
• Custom solutions lack advantages of standardization
– Need consistent vision components, integration and user
interface to optimize operation and troubleshooting procedures.
• Standardize by application
– Flat, stacked sheet metal such as floor panels
– Vertical, 3D objects such as rails
– Completed assembly loading such as doors, hoods
– Small components such as cross-members
– Installation applications such as roof and glass fitting
– Can also group by positioning accuracy required
Image Processing
• Lighting
• Feature recognition
Search windows, field of view,
accuracy, redundancy,
accommodate part movement
Robustness for Production Conditions
Use 3D Vision
Hand-off table
• Parts shift in the rack and during transport – how much movement is acceptable / excessive?
• Problem with typically one style only: part repeatability problem; tends to exclude camera failure
• Fail with all features "0": no part, camera moved/failed, light changed/failed
Readiness Assessment:
• System start-up – connections, power-up, software navigation, …
• Operation – robot programming, lighting, reference images, diagnostics, …
• Fault recovery, restore, backup
• System maintenance – alignment, replacement, calibration, cleaning, …
• Engineering changes
Plant Considerations – Optimization
Search Window
Robert Anderson
New Technology Manager
Advanced Manufacturing Engineering
Chrysler, LLC
Auburn Hills, Michigan
Presented by:
Babak Habibi
Braintech Inc.
Babak Habibi
CTO
Braintech, Inc.
Babak Habibi
Braintech, Inc.
102 - 930 West 1st Street
North Vancouver, British Columbia V7P 3N4
Canada
Phone: 604-988-6440
Fax: 604-986-6131
Email: bhabibi@braintech.com
Presented by:
Steven West
ABB, Inc.
Steven West
Business Development Manager
ABB, Inc.
Steven West
ABB, Inc.
1250 Brown Road
Auburn Hills, Michigan 48326
Phone: 248-391-9000
Fax: 248-391-7390
Email: steven.w.west@us.abb.com
Bio not available at time of print.
Random Bin Picking
Applications and Solutions
Steven W. West
Development Manager
ABB Inc.
Random Bin Picking
The Robotic Bin Picking
Challenge:
• Extreme part
overlap and
occlusion
• Significant lighting
variability and
shadowing
• Shortage of
distinct features
on parts
• Collision
avoidance with
other parts, tools
and bins
• Factory ready and
RELIABLE!
Random Bin Picking
Multi Part Candidate Tracking, to offer a choice of identified pickable parts
Pick Candidate Re-verification, to enable efficient re-use of identified parts from previous pick cycles
Multi Grasp Point Selection, to provide multiple grasping points and scenarios for a given part, while also increasing speed and accuracy when grasping each part
Advanced 3D Range Data Analysis, to confidently move to and grasp parts without collisions
Simplified User Interface provides easy set-up for the most common operations required to develop a Robotic Bin Picking System
Robust Motion Planner identifies the optimal path for the robot to pick a part from the bin
Position Reachability checks to see if the robot can reach the image capture point and the pick point selected by the vision system
Collision Avoidance detects if the robot tool and arm will collide with the bin walls
RBP – System Features
Auto Recovery from Collision Detect / Motion Supervision faults: the robot is capable of automatically restarting should the robot gripper collide with a neighboring part.
Standardized Compliant Tooling handles minor collisions, making the gripper more likely to grasp the part.
• 6 DOF compliance
RBP – Optimal Applications
• Easily recognizable 2D features for initial detection of possible candidates
• Edges that can be consistently used to detect the part footprint, even if they are the silhouette – very important
• Bins that have angled side walls near the bottom
RBP – Optimal Applications
Complexity Factors
• Parts that link, hook or wedge together
• Multi-planar parts that rest in more than three positions
• Small parts, less than 10 cm
• Limited pick points
• Parts with deep cavities
• Parts that appear to have self-similar areas
• Bottom of the bin / side of the bin (deep bins with 90 degree angles – a "deep box")
RBP – Optimal Applications
Semi-Structured RBP
• 3D picks from exit conveyors
• RBP module solves part-overlapping challenges
• Does not require parts to be singulated
RBP – Gripper Design
Considerations
• End of Arm Tooling
– Narrow footprint for gripper fingers
– Rounded edges on gripper fingers and gripper body
– Hooks, suction cups and magnetic grippers
– Multi pick point grippers (e.g. ID and OD fingers)
– Cable and hose management
• Re-grip and flipper stands
• Part present switches, and sensors to detect multiple parts
picked, or parts that have slipped out of position prior to
placement
• Robot uses specialized tool to move parts away from bin walls
• EOAT design for RBP necessitates “out of the box” thinking
RBP – Gripper Design
Outputs:
• Simulation Model
• Equipment specification including robot model, robot risers, bin / box locators, safety system, etc...
• Gripper concept drawn in 3D model
• Cycle time study
• Proposal and budget for Step 2
Outputs:
• Design and build prototype gripper
• Perform 10,000 pick test run
• Measure pick rate quality, system availability, and cycle time
• Proposal and budget for Step 3
Outputs:
• 1st production ready unit
• Lessons learned from pilot installation
• Standardization of factory specific requirements
RBP – Business Case
The “Traditional” Business Case for Robot Automation
1. Reduce operating costs
2. Improve product quality & consistency
3. Improve quality of work for employees
4. Increase production output rates
5. Increase product manufacturing flexibility
6. Reduce material waste and increase yield
7. Comply with safety rules and improve workplace health &
safety
8. Reduce labour turnover and difficulty of recruiting workers
9. Reduce capital costs (inventory, work in progress)
10. Save space in high value manufacturing areas
Based on research carried out by the International Federation of Robotics (IFR)
Published in World Robotics 2005
RBP – Business Case
Random Bin Picking - Business Case “Adders”
1. Perfectly ergonomic
Eliminate noise from vibratory feeding systems
Eliminate potential for worker injury:
Repetitive motion
Workers hit by forklifts moving bins into position
Injuries resulting from dropped parts, pinched fingers, etc…
2. Reduce MRO costs
robotic bin picking cells are mechanically simple systems (robot and gripper)
much less complex than vibratory feeding and/or mechanical positioning systems
lift assists and hoists often require frequent maintenance and spare parts
3. Improve productivity
robots don’t stop to talk about the “big game” causing the line to starve for parts
                      Test 1     Test 2
Total Parts Picked    1125       3005
System Failures       2          5
System Availability   99.8222%   99.8336%
Total Faults          42         147
Pick Rate Quality     96.2667%   95.1082%
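The availability and quality figures in the table are consistent with simple pick-count ratios. A sketch, assuming those definitions (the slide does not state them explicitly):

```python
def system_availability(total_picked, system_failures):
    """Percentage of pick cycles not ended by a system failure."""
    return (total_picked - system_failures) / total_picked * 100

def pick_rate_quality(total_picked, total_faults):
    """Percentage of pick cycles completed without any fault."""
    return (total_picked - total_faults) / total_picked * 100

# Test 1: 1125 parts picked, 2 system failures, 42 faults
print(round(system_availability(1125, 2), 4))   # 99.8222
print(round(pick_rate_quality(1125, 42), 4))    # 96.2667
```

The Test 2 column follows the same way: (3005 − 5)/3005 ≈ 99.8336% and (3005 − 147)/3005 ≈ 95.1082%.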
Steven W. West
Development Manager, Vision Guided
Robotics
ABB Inc.
1250 Brown Road
Auburn Hills, Michigan 48326
USA
Telephone: 248-393-7120
email: steven.w.west@us.abb.com
www.abb.com
The Need for Generic 3D Bin Picking
Presented by:
[System diagram: Firewire camera, handling station, robot controller; EtherNet and RS232 connections]
• Cannot be trained in
a generic way from
real images.
Challenge 2: How to grip a part
• Must have multiple grip positions on the
parts. Two reasons preventing grips:
• Precision grip
Example of:
• Completely randomly placed parts
• Deep bin (4 pallet frames)
• Need for multiple grippers
Case Study: Picking pipes
Case Study: Split Cone and Nuts
Requirements:
• Bin-pick two different products from two
bins
• Compact
• Robot has many other tasks outside the
bins
Case Study: Split Cone and Nuts
Final Words
Things to come:
• Even more generic
• Improved path planning
• More automated training of parts
• Handling a bigger variety of part types
Contact Information
Presented by:
Jane Shi
and
James Wells
General Motors Corporation
International Conference for Vision Guided Robotics
September 30 – October 2, 2008
Jane Shi
Staff Researcher
GM R&D Center
Jane Shi
GM R&D Center
MC 480-106-359
30500 Mound Road
Warren, Michigan 48090
Phone: 586-986-0353
Fax: 586-986-0574
Email: jane.shi@gm.com
Jane Shi currently works at the General Motors R&D Center in Warren, Michigan as a staff researcher, focusing on the fundamental challenges in achieving reliable, robust, and capable autonomous and intelligent robotic systems for automotive assembly. Her delivered research results range from analytic models and innovative methods to data analysis and related practical tools that address a variety of automotive manufacturing challenges in order to improve flexibility, efficiency, and reliability. Dr. Shi joined the GM R&D Center in 2002. Prior to 2002, Dr. Shi’s work experience includes FANUC Robotics America, Inc. (1994-2002) and NIST (1988-1989). Jane earned her Ph.D. in robotics from Kansas State University in 1995. Dr. Shi is a member of the IEEE Robotics and Automation (RA) Society.
International Conference for Vision Guided Robotics
September 30 – October 2, 2008
James Wells
Senior Staff Research Engineer
GM R&D Manufacturing Systems Research
James Wells
GM R&D Manufacturing Systems Research
MC 480-106-359
30500 Mound Road
Warren, Michigan 48090
Phone: 810-602-9879
Fax: 586-986-0574
Email: james.w.wells@gm.com
Jim joined GM’s Manufacturing Engineering organization in 1979 and has been working
in the area of Robotics since 1982. Jim has held engineering and management
positions primarily responsible for robot application development and manufacturing
program support including simulation, paint operations, body assembly, robot
procurement and specifications. Jim joined the R&D MSR group in 2003 and is
currently Senior Staff Research Engineer working on developing advanced robotics with
low cost flexible tooling and equipment for vehicle assembly. Jim has served SME
Robotics International as Chairman for the RI Board of Advisors (1995) and is currently
on the board of the RIA (Robotic Industries Association). Jim holds a Bachelor's degree in Electrical Engineering from Rochester Institute of Technology and a Master's degree in Engineering from Purdue.
Presentation not available at time of production.
3D Robot Guidance for
Cosmetic Sealer Applications
Presented by:
Kevin Taylor
ISRA VISION SYSTEMS
International Conference for Vision Guided Robotics
September 30 – October 2, 2008
Kevin Taylor
Vice President
ISRA VISION SYSTEMS
Kevin Taylor
ISRA VISION SYSTEMS
3350 Pine Tree Road
Lansing, Michigan 48911
Phone: 517-887-8878
Fax: 517-887-8444
Email: ktaylor@isravision.com
Kevin has been with ISRA Vision since 1999. His first years with ISRA were solely in a sales capacity. Several years ago he assumed the position of Vice President, with responsibility for the North American Business Unit. Prior to that, he spent 8 years selling automation to the automotive industry.
3D Robot Guidance for Cosmetic
Sealer Applications
Kevin Taylor
Vice President
ISRA VISION
What is Cosmetic Sealer?
[Diagram: camera ray and light plane intersecting at a 3D point]
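The camera/plane geometry shown here reduces to a ray-plane intersection: a calibrated camera ray, intersected with the known laser plane, yields the 3D point. A minimal sketch, assuming the plane is given in the form n·p = d:

```python
def ray_plane_point(origin, direction, normal, d):
    """Intersect the camera ray origin + t*direction with the plane n.p = d."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    # Solve n.(origin + t*direction) = d for the ray parameter t
    t = (d - dot(normal, origin)) / dot(normal, direction)
    return tuple(o + t * di for o, di in zip(origin, direction))
```

For example, a ray from the camera center along +Z hitting the plane z = 5 returns (0, 0, 5).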
• Multi Line Sensors are Robust even with Disturbed
Features
• Software Requirements for Sealant Applications
Combination of 3D Measurement and Inspection
Software Platform is Identical to Other Accepted Robot
Guidance Systems
Improved Algorithms to Increase Robustness of
Measurement
Automated Calibration
Indicator
Bar
Action Area
Interactive Area
Information
Status Bar
Area
• Ideal Software - Diagnostics
Display of current
system messages
• Ideal Software - Diagnostics
Red indicator signals the last measurement failed. In the overview image the failed feature is indicated.
• Problem: Blooming
– Destruction of image
information by overexposed
pixels due to reflections in
the imaging area
• Software: Environmental light and color independent measurement
• Solution: Color Control Functionality
Images at each shutter time:
Completed Image:
• Software: Environmental light and color independent measurement
• Solution: Color Control Functionality
Offset Vector
(x,y,z,Rx,Ry,Rz)
• Bead Inspection
– Bead Presence/Absence
Kevin Taylor
Vice President
ISRA VISION
3350 Pine Tree Road
Lansing, Michigan 48911
USA
Telephone: 517-887-8878
email: ktaylor@isravision.com
www.isravision.com
Combining Machine Vision and Robotics to
Mimic Complex Human Tasks
Presented by:
Michael Muldoon
Averna Vision & Robotics
International Conference for Vision Guided Robotics
September 30 – October 2, 2008
Michael Muldoon
Business Solutions Engineer
AV&R Vision & Robotics Inc. (Averna Vision & Robotics)
Michael Muldoon
AV&R Vision & Robotics Inc.
(Averna Vision & Robotics)
269 Rue Prince
Montreal, Quebec H3C 2N4
Canada
Phone: 514-788-1420
Fax: 514-866-5830
Email: michael.muldoon@avr-vr.com
Mike studied at the University of Windsor, graduating with a Bachelor of Applied Science in Electrical Engineering. Since then his main focus has been expanding machine vision and robotics technologies for applications ranging from laser welding to de-palletizing to deburring and surface inspection. He spent the first part of his career in Windsor, Ontario, heavily involved in automotive production, where the volume and range of parts was always the challenge. Now, in Montreal, Quebec, he specializes in aerospace applications, which have different demands: low volume, complex, high precision production environments.
Combining Machine Vision and
Robotics to Mimic Complex Human
Tasks
Michael Muldoon, P.Eng
Business Solutions Engineer
AV&R Vision & Robotics
(Averna Vision & Robotics)
Presentation Agenda
• Automation Challenges
• Aerospace Industry
• Material Removal
• Surface Inspection
• Cad-to-path Strategies
• Performance Testing
• Case Study 1: Surface Inspection & Gauging System
• Case Study 2: Finishing & Inspection System
Automation Challenges
Some Areas Typically Left to Humans:
• Defect Classification
• NI Particle Analysis VI
Defect Examples:
• Dents
• Scratches
• Cracks
• Pits
• Ripples
• Die Marks
Inspection Algorithms
Future → Real-time analysis & adjustment of robot path/recipe (e.g. cast parts, MRO facilities)
Performance Testing
Statistical Analysis Requires:
• Defined performance requirements
• Test Plan
• Samples!
• Ability to present useful information from the data
[Table: Po, Pe and Kappa statistics – Overall Repeatability Kappa 0.95, good agreement]
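The Po/Pe/Kappa statistics above follow Cohen's kappa, which corrects the observed agreement Po for the agreement Pe expected by chance. A minimal sketch:

```python
def cohens_kappa(p_observed, p_expected):
    """Kappa = (Po - Pe) / (1 - Pe): agreement beyond chance."""
    return (p_observed - p_expected) / (1.0 - p_expected)
```

For example, 90% observed agreement against 50% chance agreement gives Kappa = 0.8; a Kappa near 0.95, as reported above, indicates very good repeatability.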
Case Study 1: Surface Inspection & Gauging
System
Description:
Final Quality
Part
Finish Inspect
Contact Information
Presented by:
Brian Windsor
SICK, Inc.
International Conference for Vision Guided Robotics
September 30 – October 2, 2008
Brian Windsor
Business Development Manager - Machine Vision
SICK, Inc.
Brian Windsor
SICK, Inc.
6900 West 110th Street
Minneapolis, Minnesota 55438
Phone: 810-923-1880
Fax: 248-997-1068
Email: brian.windsor@sick.com
Brian has been working at SICK, Inc. for 3 years and is currently a Business
Development Manager for SICK’s Machine Vision products. He has been involved in
technical sales and engineering of industrial sensor and machine vision products for the
past 15 years.
Using 3D Laser Scanning
for Robot Guidance
Brian Windsor
Business Development
SICK, Inc.
3D Imaging Technology
What is 3D Profiling?
Camera captures the laser light
resulting in a contour of the surface
Intensity-based contrast
Height-based contrast
Height Resolution Calculation
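One common first-order approximation for triangulation height resolution divides the lateral (per-pixel) resolution by the tangent of the triangulation angle. A sketch under that assumption — the slide does not give the exact formula used:

```python
import math

def height_resolution(lateral_resolution_mm, triangulation_angle_deg):
    """Approximate height resolution: dz = dx / tan(theta)."""
    return lateral_resolution_mm / math.tan(math.radians(triangulation_angle_deg))
```

At a 45 degree angle the height resolution equals the lateral resolution; shallower angles degrade it, steeper angles improve it at the cost of more occlusion.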
Scanning Methods
• Robot or other device moves camera, object is stationary
• Object is moving on a conveyor, camera is stationary
Occlusion
Occlusion
• Camera Occlusion - the height of an object can block the laser
creating areas of missing data
[Figure: 20oz crates range data – vision result: bottle groups can be separated, position of the trays can be found]
Palletizing Application
[Figure: 2L crates grey scale picture – vision result: bottle groups can be separated, position of the trays can be found]
Palletizing Application
Random Bin Picking
Brian Windsor
Business Development
SICK, Inc.
6900 West 110th St
Minneapolis, Minnesota 55438
USA
Telephone: 248-997-7618
email: brian.windsor@sick.com
www.sickusa.com
Vision Options for “Dual Arm” Robot Guidance
Presented by:
Greg Garmann
Motoman, Inc.
International Conference for Vision Guided Robotics
September 30 – October 2, 2008
Greg Garmann
Software & Controls Technology Leader
Motoman, Inc.
Greg Garmann
Motoman, Inc.
1050 Dorset Road
Troy, Ohio 45373
Phone: 937-440-2668
Fax: 937-440-2626
Email: greg.garmann@motoman.com
Greg holds a Computer Engineering degree from Wright State University, Dayton, Ohio, and has 21 years of experience in automation.
Vision Options for “Dual Arm”
Robot Guidance
Greg Garmann
Software & Controls
Technology Leader
Motoman Inc.
Robot Technology
• New developments in robot technology
require new ways of working with vision
systems. The “human-like” flexibility of
movement with Motoman’s new dual arm
robots provides unique solutions for the
automation world. This presentation will
show options in handling vision
opportunities with dual arm robots.
Ultimate Flexibility
Gear Assembly
Air Conditioner Assembly
Air Conditioner Assembly
Spool Handling
Packing
Packing
Window Glass Sealing
Sunroof Assembly (Bolting)
Vision (Ageria) Solution
Vision (Ageria) Solution: Bolt Picking 1
Vision (Ageria) Solution: Bolt Picking 2
Parts Picking with Dual Arm
Contact Information
Greg Garmann
Software & Controls Technology Leader
Motoman Inc.
805 Liberty Lane
West Carrollton, OH 45449
USA
Telephone: 937-440-2668
email: greg.garmann@Motoman.com
www.Motoman.com
Distance, Pitch & Yaw from a 2D Image
Presented by:
Steve Prehn
FANUC Robotics America
International Conference for Vision Guided Robotics
September 30 – October 2, 2008
Steve Prehn
Senior Product Manager - Vision
FANUC Robotics America, Inc.
Steve Prehn
FANUC Robotics America, Inc.
3900 W. Hamlin Road
Rochester Hills, Michigan 48309
Phone: 248-276-4065
Email: steven.prehn@fanucrobotics.com
Steve Prehn has worked in the machine vision market for over 20 years. In addition to
implementing over 200 vision systems, he has acted as product manager for VisionBlox
software at Integral Vision, and CorrectPlace and CorrectPrint products at ESI. He is
now product manager at FANUC, applying his knowledge of machine vision to extend
the reach of iRVision into the material handling market. He has a Bachelor of Science in Electrical Engineering from DeVry Institute in Columbus, Ohio.
Distance, Pitch & Yaw from a 2D Image
Understanding the Dynamics of the
Robotic / Vision Coordinate Interface
Steve Prehn
Senior Product Manager - Vision
FANUC Robotics America, Inc.
Image to Robot Relationship
• Knowns:
– Calibrated focal length of the lens
– Camera array size
• If the part size is known, calculate the distance of the part from the camera (known width, calculated height)
• Consistent part size
• Find parts at two known heights:
1) Calculate the scale change and correlate it to the height difference. (Delta to delta determines the lens magnification)
2) The part size at this trained distance is then known
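The relation behind this slide is the pinhole model: with a calibrated focal length (in pixels) and a known part width, distance follows from the part's apparent size in the image. A sketch with illustrative variable names:

```python
def distance_from_size(focal_length_px, known_width_mm, measured_width_px):
    """Pinhole model: Z = f * W / w (distance from apparent part size)."""
    return focal_length_px * known_width_mm / measured_width_px
```

For example, a 100 mm part imaged at 200 px through a 1000 px focal length lies 500 mm from the camera. The two-known-heights procedure estimates the same magnification empirically, from the scale change between the two trained distances.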
Multi-plane Calibration
Camera 2 Image
3D processing: detection of distance and pose by structured light
2D processing: detection of position and rotation of the object from the 2D image
Composition
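The composition step can be read as merging the two partial results: planar position and rotation from the 2D image, distance and tilt from structured light. A purely illustrative sketch — the exact channel split is an assumption:

```python
def compose_pose(pose_2d, pose_3d):
    """Merge 2D (x, y, rz) with structured-light 3D (z, rx, ry) into 6 DOF."""
    x, y, rz = pose_2d      # position and rotation from the 2D image
    z, rx, ry = pose_3d     # distance and pose from structured light
    return (x, y, z, rx, ry, rz)
```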
Steve Prehn
Senior Product Manager - Vision
FANUC Robotics America, Inc.
3900 W. Hamlin Road
Rochester Hills, Michigan 48309
USA
Telephone: 248-276-4065
email:
steven.prehn@fanucrobotics.com
www.fanucrobotics.com
VGR Panel Discussion