1.0 Problem
The computer mouse has gone through an impressive evolution—from the mechanical ball mouse
to the wireless laser mouse used today. The next step in improving the mouse is to make it work
in air, and even when it is not pointed at the computer screen! Companies have successfully
developed an air mouse, but this technology is currently expensive and complex.

One of the applications of a remote mouse is using it as a presentation pointer. An affordable
handheld mouse pointer could replace the existing laser pointer. The extensive features on the
current products, which are not needed for a presentation pointer, come at a high cost. The
challenge is to develop a new pointer that is affordable and easy-to-use, but has more
functionality than the standard laser pointer.

The goal of the P-Klik project was to create this pointer! By pointing the P-Klik handheld at a
projection of the computer screen, the user will be able to control the position of the mouse. The
P-Klik will be marketed as an alternative to the laser pointer: comparable in price, but with increased
reliability and functionality. In addition to controlling the position of the mouse, the pointer will
have two buttons, giving the user control over all mouse functions. This will allow the user to
draw on the projection screen, or to activate embedded links during a PowerPoint presentation.

This report describes the development and testing of the P-Klik prototype. It begins by
highlighting the competing technology and existing products. We will then describe our initial
product designs and justify why we chose the approach we did. This is followed with a detailed
look into how the prototype works, how it was built, and how it was tested. Finally, we present
future recommendations for the next generation of P-Klik pointers.

2.0 Competing Technology

For our device to be successful, it has to be an improvement on existing technology. Several
companies have developed computer mice that can control the cursor in the air, from anywhere in the
room. The gaming industry is also moving in this direction by developing motion sensing
controllers. This section describes the competing technologies for the P-Klik.

2.1 The Air Mouse


Gyration’s Optical Air Mouse [1] and Logitech’s MX Air [2] are computer mice that work in the air
without being in a direct line of sight to the computer. They determine the desired position of the
cursor using gyroscope and force sensor technology. The handheld uses radio frequency to
communicate with the computer. Extensive mouse functionality has been incorporated into these
designs. Both products cost around $150.00 CAD each. Gyration is coming out with a more
affordable air mouse, for $70.00 CAD, but this version has a shorter range of only 30 feet
compared to 100 feet in the original.

2.2 The Nintendo Wii Remote


The Nintendo Wii Remote [3] (Wiimote) is a motion sensing video game controller. By directly
pointing to the screen, the Wiimote can control an object in a video game. The Wiimote uses two
infrared LEDs attached at the bottom of the television screen, and a camera incorporated into the
wireless controller. By analyzing where the infrared LEDs are in the camera’s frame, the Wiimote
can determine where the user is pointing on the screen.

Johnny Chung Lee developed an interactive whiteboard using the Wiimote [4]. This project extends
the application of the Wiimote from video games to computers. The whiteboard uses infrared (IR)
light emitting diodes, held against the projection of the computer screen, to control the motion of
the mouse cursor.

2.3 Patents
We found no patents on controlling a mouse cursor by monitoring a laser beam with a stationary
camera.

2.4 Laser Safety Classification


One of the possible designs, which is described in the next section, uses an infrared laser. This
raises important safety issues because IR lasers have the potential to cause permanent eye
damage. We researched the regulations on laser safety, so we could take the appropriate
precautions in our design.

Only lasers of Class 2 or lower are considered safe for public use [5]. In general, the power of a
visible Class 2 laser is limited to 1 mW. If the laser is in the infrared range, the limiting power
is lower, because a person cannot see the light and therefore will not blink before permanent damage is
done. This means that if a laser with a power of more than several hundred microwatts is used,
safety becomes a concern. We found no comparable regulations for other infrared light sources, such as
LEDs.

3.0 Design
Four different designs were considered for the P-Klik. This section presents the details of these
designs.

3.1 Camera and Edge Detection


This method uses a camera in the handheld to image the projection screen. An edge detection
algorithm is used to determine the position of the pointer. The image of the screen is extracted
from the video stream captured by the webcam and imported into Matlab. This generates an m x n x 3
RGB matrix, where m and n depend on the camera’s resolution. The image can be scanned horizontally and
vertically to detect the edges where the colour changes sharply. The frame of the screen can be
generated by joining the horizontal and vertical edges together. The movement of the handheld
can be monitored by comparing the positions of the edges in two sequential frames.
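
A minimal Matlab sketch of this scan is shown below. The frame capture and the intensity-jump threshold are assumptions for illustration only, not the P-Klik code.

% Minimal edge-detection sketch (illustrative only; the threshold value is an assumption)
vid = videoinput('winvideo', 1);                  % webcam device
frame = getsnapshot(vid);                         % one RGB frame (m x n x 3)
gray = double(rgb2gray(frame));                   % collapse to intensity values
threshold = 40;                                   % assumed intensity-jump threshold
horizEdges = abs(diff(gray, 1, 2)) > threshold;   % sharp changes along each row
vertEdges  = abs(diff(gray, 1, 1)) > threshold;   % sharp changes along each column
edgeMap = false(size(gray));                      % join the two edge maps to outline the screen frame
edgeMap(:, 2:end) = horizEdges;
edgeMap(2:end, :) = edgeMap(2:end, :) | vertEdges;
imshow(edgeMap)                                   % visualize the detected frame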

3.2 Camera and Infrared LEDs


This system uses two stationary signal LEDs, located on the projection screen, and a handheld
camera to determine the position of the mouse. Infrared LEDs are used because they are invisible
to the human eye. The position and intensity of the LEDs, as seen by the camera, is used to
determine the pointer’s location. The location of the screen is calibrated so the position of the
LEDs can be directly translated into the desired position of the mouse cursor. To ensure that the
noise is minimized, an IR filter is introduced before the camera.

A modification to this approach is to flash the LEDs at the same frequency as the image capture
rate of the camera. This gives two images: one with the LEDs and one without. By
subtracting the two images, the location of the LEDs can be quickly and precisely determined.
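
As a rough sketch of the subtraction step (frameOn and frameOff are placeholders for the two captured frames, not variables from our code):

% Illustrative frame subtraction; frameOn and frameOff are assumed inputs
diffImage = imsubtract(rgb2gray(frameOn), rgb2gray(frameOff));  % only the LEDs remain bright
[maxVal, idx] = max(diffImage(:));                              % brightest residual pixel
[ledRow, ledCol] = ind2sub(size(diffImage), idx);               % LED location in the camera frame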

3.3 Array of Detection LEDs
The third method uses an array of receiving LEDs in the handheld to detect an LED attached to
the projection screen (the signal LED). The signal LED and the receiving LEDs emit light at the
same wavelength. The light from the signal LED will be focused onto the receiving LED array
so that a voltage bias is generated at each receiving LED. The magnitude of the voltage bias is
directly related to the intensity of light detected by the LED. By comparing the voltage
difference between each receiving LED in the array, we can calculate the relative orientation to
the signal diode and also to the screen itself.

The voltage measured at each receiving diode will be converted to a digital value using an
analog-to-digital (A/D) converter. A four-quadrant positioning algorithm will be used to interpret
the voltage values and calculate the orientation of the handset.
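
A minimal sketch of such a four-quadrant calculation is given below. The voltage values and the quadrant ordering are assumptions for illustration, since this design was not prototyped.

% Four-quadrant positioning sketch (illustrative only; this design was not built)
% v = [topLeft topRight bottomLeft bottomRight], digitized detector voltages
v = [0.8 1.2 0.6 1.0];                              % example A/D readings (assumed)
total  = sum(v);
xError = ((v(2) + v(4)) - (v(1) + v(3))) / total;   % right minus left, normalized
yError = ((v(1) + v(2)) - (v(3) + v(4))) / total;   % top minus bottom, normalized
% xError and yError indicate how far the handset is pointed from the signal LED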

3.4 Tracking an Infrared Laser


In this approach, the system is composed of a handheld infrared laser and a stationary webcam.
The webcam is pointed at the projection screen, and an IR filter is used to reduce the noise from
other light sources. The user points the laser, which is invisible to the human eye, at the screen, and the
computer mouse mimics the laser’s motion. Similar to the LED approach, the camera detects
changes in the position of the IR beam, and calculates where the handheld is pointed on the
screen. The difference in this approach is that the camera is stationary.

This method requires additional safety features. To prevent accidental eye damage, the laser
should never be on when it is not pointed at the screen. If the webcam does not detect the laser,
for instance if the user points off of the screen, the mouse will remain stationary and the laser will
turn off. To activate the mouse again, the user has to simply press the laser switch, while pointing
to the screen.

4.0 Approach
We evaluated the potential of each design, and decided to build our prototype to track an infrared
laser with a stationary webcam. This section describes the advantages and the disadvantages of
each design outlined in Section 3, and the reasons we chose the infrared laser approach.

4.1 Camera and Edge Detection


For this method to be successful, the LEDs must always be in the camera’s field-of-view.
Misaligned optics can be used to ensure that the LEDs, which are located at the bottom of the
projection screen, are always seen by the camera. The image recorded with the camera will be
optically manipulated, and this will need to be compensated in the analysis. Another challenge is
that the user will not typically stand directly in front of the screen and this requires an additional
calibration step.

The major disadvantage of this approach is that the algorithm is slow. In initial tests, we were
able to overcome the issue of speed, but reliability became a problem. Not every screen has a
distinct and detectable edge, which meant that we had to project markers onto the screen to
accurately locate the edges. The main goal of this project was to develop a presentation pointer
that controls the mouse cursor, rather than simply shining a red laser beam at the screen. Projecting
visible indicators onto the screen conflicted with this aim, so we abandoned the approach.

4.2 Camera and Infrared LEDs
This method is advantageous over the edge detection approach because of its speed. The
additional cost of the IR LEDs is insignificant, but they do add another component to the system.
Positioning the LEDs and calibrating makes the system more difficult to use. An additional
disadvantage of this system is the cost; the miniature, wireless camera and the IR filters increase
the overall cost of the pointer. The challenges in this method are distinguishing the LEDs’
signal from background light sources, and coordinating the LEDs to flash in time with the
camera’s image capture.

Originally, we chose to implement this design because it was faster than edge detection and
simpler than using an array of detection LEDs. We achieved a proof of concept for this idea, but
because of its similarity to the existing Wiimote technology, it was not developed into a prototype.

4.3 Array of Detection LEDs


This method is the cheapest option and has the fastest response time. By using an array of IR
LEDs, instead of a miniature camera, the cost of the pointer is significantly reduced. In addition,
no image recognition is needed to determine the position of the pointer, which improves the speed
of the calculations. Flashing the signal LED to reduce the noise negates the need for an IR filter
in the handheld. The disadvantage of choosing this method is that it is more complicated to build
than the camera-based approach.

4.4 Tracking an Infrared Laser


This design was developed as an alternative to the camera and infrared LED approach. The code
modifications to switch from detecting two IR signal LEDs to one IR laser beam were minor. The
infrared laser approach is cheaper, because the camera does not need to be miniature and wireless,
and the laser diode does not add a large cost to the system. This method has the advantages of
being long range, having a unique design, and not requiring signal LEDs to be attached to the
projection screen.

Switching from LEDs to an infrared laser raises important safety issues. The methods we used to
prevent accidental eye damage are discussed further in the next section.

5.0 Implementation

5.1 Image Processing


Matlab was used to analyse the image captured by the webcam, and to calculate the desired
position of the mouse cursor. This section describes how the image analysis code works. The
complete Matlab code can be found in Appendix A.

Locating the Laser Beam


To detect the location of the laser beam on the screen, Matlab analyzes the video stream captured
by a webcam. The webcam can see the IR dot reflected off the screen. To eliminate noise from
other light sources, an IR filter is placed before the webcam.

The algorithm to detect the laser beam in Matlab finds the point with the highest intensity in each
captured frame. This function returns the x and y coordinates of the brightest point.
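
This search is implemented in the FindMax routine of Appendix A; a simplified stand-alone version is sketched below, assuming I is a grayscale frame already captured from the webcam.

% Simplified brightest-pixel search (see FindMax in Appendix A for the prototype version)
% I is assumed to be a grayscale frame, e.g. I = rgb2gray(getdata(vid, 1));
[maxVal, idx] = max(I(:));               % index of the most intense pixel
[row, col] = ind2sub(size(I), idx);      % row and column of the brightest pixel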

Calibration
To calculate the relative position of the beam within the frame, a calibration is required at the
beginning of the program to indicate where the corners of the frame are. In this process, the user
will be asked to point at the four corners of the screen, and the program will store these locations. At this
stage, the positions are expressed in the webcam’s resolution of 320 x 240 pixels. Once the
calibration is finished, the user can move the laser, and the program will calculate the
cursor’s location from the beam’s position in each captured video frame.

Transformation
For each captured frame, the program will have five position vectors available to analyze. These
are the four corners and one laser beam. Depending on where the web camera resides, the shape
of the frame and beam may look similar to Figure 1 (a). In order to calculate the beam’s relative
position to the frame, it is necessary to apply transformations to these four points so that the
frame becomes a rectangle, as seen in Figure 1 (b). The transformation process can be done in
four steps: shifting, rotating, shearing, and scaling. These techniques are described in detail
below.

Figure 1. The laser beam and the screen frame: (a) the four corners and beam spot before transformation;
(b) the four corners and the beam spot after transformation.

Shifting
The entire figure is firstly shifted so that the first corner will be moved to the origin, as shown in
Figure 2.
Figure 2. Frame after shifting.

Rotating
After shifting, the figure is then rotated about the origin so that the bottom line aligns with the
x-axis, as shown in Figure 3.
Figure 3. Frame after rotating.

Shearing
After rotating, in order to align the left line with the y axis, the figure is then sheared along the
y-axis, as shown in Figure 4.
Figure 4. Frame after shearing.
Scaling
Scaling is first implemented along y-axis so that the top line becomes parallel to the x-axis. Then
the image is scaled along the x-axis, so that the entire frame becomes a rectangle. This is shown
in Figure 5.

Figure 5. Frame after scaling.
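
The shift, rotation, and shear steps above correspond to the following operations in the transf function of Appendix A, shown here in simplified form (v is the 2 x 6 matrix holding the four calibrated corners, the repeated first corner, and the beam point).

% Simplified view of the shift, rotate, and shear steps in transf (Appendix A)
v(1,:) = v(1,:) - v(1,1);       % shift: move the first corner to the origin
v(2,:) = v(2,:) - v(2,1);
th = -atan(v(2,4) / v(1,4));    % rotate so the edge through the fourth stored corner lies on the x-axis
R  = [cos(th) -sin(th); sin(th) cos(th)];
v  = R * v;
sx = -v(1,2) / v(2,2);          % shear so the edge through the second stored corner lies on the y-axis
v  = [1 sx; 0 1] * v;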

Cursor Location on the Computer Screen


After the frame is transformed into a rectangle, the location of the transformed beam is then
compared to the width and height of the frame to calculate its relative position. The point of
maximum intensity is then scaled to the resolution of the computer screen (typically 1280 x 800)
to determine the location of the mouse cursor. The user enters the resolution of their computer
screen before calibration, with the default set to 1280 x 800.
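
In other words, if the transformed frame has width Wf and height Hf, and the screen resolution is Ws x Hs (1280 x 800 by default), the cursor location is given by

\[ x_{cursor} = \frac{x_{beam}}{W_f}\,W_s, \qquad y_{cursor} = \frac{y_{beam}}{H_f}\,H_s \]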

Distribution
The distribution function converts the position vector of the detected beam from integers to
doubles. It first calculates the ratio of the intensity of the pixel to the right of the beam’s peak to the
intensity of the pixel to its left, and converts the beam’s x position into a double using the following
equation:

\[ x = x - 1 + \frac{\text{ratio}}{\text{ratio} + 1} \]

The same calculation is then applied to the y coordinate, using the pixels above and below the peak, to
obtain its position in double format. If the beam lies at the edge of the frame, the fraction is set to zero.

Optimizing the Cursor’s Motion


When the laser pointer is moved to draw a curve, the beam will not move in a smooth path,
because of the natural instability of the user’s hand. The actual trace will look similar to the solid path in
Figure 6, which has many sharp turns. This motion is undesirable and makes the cursor difficult
to control. To smooth the path, a motion optimization algorithm is employed. This algorithm will
store the last 6 calculated cursor positions and output their average as the current location of the
mouse cursor. In this way, the sharp turns will be cancelled out and the actual output trace will
become a smooth path, similar to the dotted line in Figure 6.
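
This is the moving-average step in the main loop of Appendix A; in simplified form, with n = 6 stored positions:

% Moving-average smoothing (from the main loop in Appendix A), with n = 6
xs = [xs(2:n) x];           % discard the oldest x position and append the newest
ys = [ys(2:n) y];
x  = round(sum(xs) / n);    % output the average of the last n positions as the cursor location
y  = round(sum(ys) / n);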

Figure 6. Motion optimization

Stopping the Laser
Since the IR laser is very dangerous to human eyes, the program must stop the laser when the
beam is out of the webcam’s field-of-view. To determine whether or not the frame has captured
the beam, the highest intensity point is compared with the intensity of the four calibrated corners.
If the brightest pixel has an intensity less than the threshold (i.e. half of the lowest intensity of the
calibrated corner points), the program will set the stop_laser variable to 1. This indicates that the
laser should be turned off.
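
In the main loop of Appendix A this check reduces to:

% Stop-laser check (simplified from the main loop in Appendix A)
if (intensity >= threshold)
    stop_laser = 0;         % beam detected on the screen
else
    stop_laser = 1;         % beam not detected; Visual Basic warns the user
end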

Output to Visual Basic


After analyzing each frame, there are three results to output to Visual Basic (VB): the cursor’s
horizontal location, the cursor’s vertical location, and the stop_laser value. These variables are
written to a text file called FromMatlab.txt after they are calculated for each frame.
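
The write itself is a single delimited-text call, as in the OutputToFromMatLab routine of Appendix A:

% Write the two cursor coordinates and the stop_laser flag for Visual Basic to read
coord = [XCoord YCoord stop_laser];
dlmwrite('FromMatlab.txt', coord, ',');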

5.2 Hardware
This section describes the circuitry in the handheld device.

Mouse Buttons
Figure 7 shows the computer mouse circuit board used in the handheld. The circuit board was
extracted from a standard USB Logitech mouse. No changes were made to the left or right mouse
buttons, but the middle mouse button was rewired to act as a push-button switch for the laser
driving circuit.

Figure 7. Mouse circuit board used in the handheld.

Laser Driver
Figure 8 is the schematic for the handheld laser driver circuitry. A Wilson current source is used
to provide a constant current to the laser diode. The 1 kΩ potentiometer is connected as a variable
resistor. The current to the load is calculated using:

\[ I_{load} = \frac{V_{cc} - 1.4\,\mathrm{V}}{R} \]

where Vcc is the positive 5 V supply and R is the series resistance of the 56 Ω resistor and the
1 kΩ variable resistor. With this setup, the current can be adjusted from 3.4 mA to 64.3 mA.
The current source was first tested with a blue LED, to ensure that the current would not damage
the laser. To set the value of the potentiometer, the resistance was adjusted to 1 kΩ, supplying a
current well below the laser’s threshold. The resistance was slowly decreased, and the current was
monitored on a multimeter, until the laser turned on. The potentiometer resistance was tweaked
until the output was 40 mA.
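
As a check of the quoted adjustment range, using the nominal component values above:

\[ I_{min} = \frac{5\,\mathrm{V} - 1.4\,\mathrm{V}}{56\,\Omega + 1000\,\Omega} \approx 3.4\ \mathrm{mA}, \qquad I_{max} = \frac{5\,\mathrm{V} - 1.4\,\mathrm{V}}{56\,\Omega} \approx 64.3\ \mathrm{mA} \]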

Figure 8. The schematic for the constant current source and the indicator LED.

The middle mouse button is a pushbutton switch to turn on the constant current source. When the
middle mouse button is pushed, the positive voltage supply is connected to the circuit, turning on
the laser and the indicator LED. Power to the circuit is from a 9V battery, which is connected
through a 5 V voltage regulator.

5.3 Optics
An infrared laser was chosen so that it is visible to the web camera, but invisible to human eyes.
This section describes the optical components of the system.

Collimator
Figure 9 shows the collimator package used in the handheld device. To collimate the laser beam,
a collimation tube (Part Number C230220P-B) from Thorlabs was used. The laser beam remains
collimated for at least 5 m with this setup. Once the laser diode is packaged in this housing, it is
mechanically very stable.

Figure 9. Collimator package

Web Camera
Figure 10 shows the Dynex PC web camera (Model WXWC100) used with the P-Klik handheld.
The webcam remains stationary and pointed at the projection screen. In general, any
commercially available webcam will work, since most webcam sensors are sensitive to near-infrared light.
The only modification to the webcam is an IR filter attached in front of its lens.

Figure 10. Dynex PC Web Camera used for the P-Klik prototype.

Infrared Filter
Coloured glass from Thorlabs was used for the IR filter. The exact passband of the filter used in the
prototype is unknown. However, by imaging LEDs of different wavelengths through the filter with the
camera, we found that the filter passes light from the red end of the visible range up to at least
850 nm.

Infrared Laser Diode


The first criterion for the IR laser is that it is bright enough to be clearly seen with a webcam a
few meters from the screen it is projected against. The second criterion is that the laser power
should be as low as possible, so that it is minimally harmful to human eyes.

It was not feasible for us to try every laser source on the market, so we tested the VCSELs and
laser diodes available in the UVic Optics Lab. The first device we tested was a VCSEL from
Honeywell (Part Number HFE4080-321). This VCSEL contains a laser diode and a monitoring
photodiode for feedback control. It has a typical driving current of 10 mA and a typical output power of
0.4 mW. In our design, a current source supplied a constant 10 mA to the laser diode. The
photodiode was not used, since feedback control is not needed when a constant current is maintained
through the laser diode. This VCSEL did not have enough power to be clearly imaged by a
webcam less than one meter away, and increasing the driving current to 15 mA did not significantly
increase the range.

Figure 11 shows the Fabry-Perot laser diode from Thorlabs (Part Number L780P010), which was
tested next. This laser is more powerful, having a typical power of 10 mW. The laser has a peak
wavelength of 780 nm, a typical driving current of 50 mA, and a threshold driving current of 30
mA. It was also driven with a constant current source. When the current reached threshold, the
laser spot from this diode could be very clearly seen by the webcam from three meters away.
This diode was used in the P-Klik handheld.

Figure 11. Fabry-Perot laser diode

5.4 Mouse Movement in Visual Basic


The Visual Basic (VB) code is responsible for moving the mouse to the position indicated by
Matlab, detecting when the laser is on, disabling the middle mouse button’s normal function while the
button is held down, and displaying the safety warning message.

The VB code is broken into two separate programs that communicate with each other. This is
necessary to properly handle both the button events and the mouse movement events
simultaneously. The two VB programs are entitled MouseMoveProject, which handles the
movement of the mouse cursor, and MouseHookServ, which handles mouse button events. The
VB code can be found in Appendix B.

Controlling Mouse Movement


MouseMoveProject is the main program: it is in charge of moving the mouse, and it receives
input from both MouseHookServ and from Matlab. When MouseMoveProject
receives the input from MouseHookServ indicating that the middle mouse button is down, it will
begin parsing the input from the Matlab code. The input from the Matlab code contains the x and
y coordinates that the mouse cursor should be moved to, and a third variable which denotes
whether or not the laser is within the camera’s view. The third variable, stop_laser, is used as a
safety precaution. When stop_laser is set to one, the MouseMoveProject control flow will break
from the regular cycle, displaying a warning message and forcing the user to reinitiate the
program. This is to help notify the user that the laser is off-screen, which is a safety concern.

The only other time the regular cycle of parsing Matlab input and moving the mouse is broken is
when the input from MouseHookServ indicates that the middle mouse button has been released. At
this point, the MouseMoveProject flow enters a wait period, during which the computer mouse works
as normal; it returns to the regular cycle when the input from MouseHookServ indicates that the
middle mouse button has been pushed again.

Handling Mouse Buttons


MouseHookServ is the mouse button handling service. It accomplishes the hooking of mouse
input by using the WindowsHookLib.dll library to globally capture button inputs and allow VB to
react to these as events. A specific event is triggered when any mouse button is pressed down, and
another when a button is released; the events also identify which button was pushed.
MouseHookServ will ignore all buttons, except the middle mouse button. The left and right
mouse buttons function normally. If the middle mouse button is detected on a “down” event, the
regular action of a middle mouse click is blocked and the message “down” will be sent to
MouseMoveProject. Likewise, if the middle mouse button is detected on an “up” event, the
regular action is released and the message “up” is sent to MouseMoveProject.

5.5 Safety
The laser diode employed by our product has a typical output power of 10 mW in the infrared range. Under
the current laser safety classification, this is a Class 3B laser [5]. A Class 3B laser is harmful to human eyes if
they are exposed directly to the beam, but scattered light from rough surfaces is safe. Figure 12
(a) and (b) shows the warning signs for a Class 3B laser. This means our product could be
considered safe if we can ensure that the laser beam is only pointed at the projected screen. This
section describes the safety feature integrated into the P-Klik to prevent accidental eye damage.

Figure 12. Warning labels for lasers: (a) warning label for lasers of Class 2 or higher; (b) warning sign
for a Class 3B laser.

There are three safety features in the P-Klik design to prevent accidental eye damage. These
features are:
• If the laser is on but not pointed at the projected screen, a warning message appears.
• The laser beam is only on while the power button is held down, and it turns off immediately
when the button is released.
• A red indicator LED turns on whenever the laser is on.

In addition to the safety features described above, the output power of the IR laser was tuned as
low as possible. While the typical driving current is 50 mA, only 40 mA was supplied to the
laser diode, which considerably reduces the output power and makes the laser less dangerous.

6.0 Testing the Prototype


Figure 13 is a picture of the P-Klik prototype. The prototype was successfully demonstrated as a
presentation pointer. It was used in PowerPoint to control the motion of the mouse, click
embedded links and move to the next slide. The pointer was also demonstrated in Paint, where it
was used to draw pictures on a projected screen.

When the middle mouse button was held down on the prototype, the laser and the indicator LED
turned on. The computer detected when the laser was on, and disabled the middle mouse button
function until the laser was turned off. If the laser was on, but the webcam could not see the
beam, a warning message appeared. If the laser was turned off, the mouse stayed stationary on the
screen. When the laser was not on, the user could control the computer mouse normally.

Figure 13. Prototype of the P-Klik, with the indicator LED, laser power button, mouse buttons, and
collimated laser labelled.

To test the calibration, a red laser pointer was projected onto the screen, instead of the IR beam.
When the calibration was done accurately, the mouse closely followed the path of the laser beam.
Due to the motion optimization, the mouse lagged slightly behind the laser pointer. If the pointer
was held still, the mouse would always catch up to the correct position. It was obvious if the
pointer was not properly calibrated, because the mouse would jump around or move at an angle
relative to the handheld’s motion.

Without the smoothing algorithm, the mouse position was noticeably jumpy. It was extremely
difficult to position the mouse accurately, and to have it stay in place long enough to click one of
the buttons. Adding the smoothing feature fixed this problem, but it made the system slower.

The laser driver circuit was tested before integrating the handheld with the code. A multimeter
measured a constant current of 40mA delivered to the laser. The red indicator LED always
illuminated when the laser was on. The code was also tested with a laser pointer before the
system was integrated.

The webcam could clearly distinguish the laser beam at a distance of more than 3m. However,
without the infrared filter, other light sources interfered with imaging the laser.

7.0 Conclusions
We have demonstrated the successful operation of the first prototype for the P-Klik presentation
pointer. The P-Klik controls the position of a computer cursor by pointing at a projection of the
computer screen. Two mouse buttons are integrated into the handheld to give the user control
over all mouse functions.

After considering several designs, we chose to track the position of an infrared laser with a
stationary webcam. Matlab code was developed for the image processing and Visual Basic was
used to control the position of the mouse, the mouse button capture, and to display a safety
warning message. The handheld was composed of a standard USB mouse circuit board, a
constant current source, a collimated infrared laser, an indicator LED, and a 9V battery. The P-
Klik was calibrated with a four-point calibration method, and a motion optimization algorithm was
developed to control jumpy mouse motion.
8.0 Recommendations

8.1 Calibration
Accurately calibrating the device was a large challenge in the prototype. The infrared laser is
invisible, so the user does not know exactly where they are pointing on the screen. Incorporating
a visible laser pointer into the device could solve this problem. To calibrate, the user would point at the
corner of the screen with the visible laser pointer, and activate the infrared laser when they click the
calibrate button.

8.2 Safety
In future generations of the P-Klik, the laser power will be automatically shut down if the laser
beam is not seen on the screen by the webcam. A microcontroller should be integrated into the
system to turn off the laser driver when the computer indicates that the laser is not pointed at the
screen.

8.3 Wireless
For the P-Klik to be marketable, it needs to be wireless. Currently, the mouse circuit board
connects to the computer via USB. The next prototype should integrate a wireless mouse in the
handheld. This will slightly increase the cost of the system. The new safety features would also
need to work wirelessly; the microcontroller could communicate with the computer via Bluetooth.

8.4 Aesthetics
Future prototypes will have a larger focus on the ergonomic design of the handheld. The current
device is large and it is difficult to press the mouse buttons and the laser power button
simultaneously. One way to solve this problem would be to position the laser power button as
shown in Figure 14. This way, the user can turn on the laser with their middle finger and control
the mouse buttons with their thumb.

Figure 14. Model of the next generation of P-Klik pointers, showing the proposed position of the laser
power button.

9.0 References

1. Gyration, www.gyration.co.uk
2. Logitech, www.logitech.com
3. Nintendo, www.nintendo.com/wii
4. J. C. Lee, Wii Remote projects, www.cs.cmu.edu/~johnny/projects/wii/
5. Wikipedia, “Laser safety”

Appendix A
Matlab Code for Image Processing
function newproject
%******************************
%This function basically captures the beam from a web camera
%and output its relative position for the LCD to an output file
%*******************************

%====== Initialization ========================================

seeLaser = 1; %camera sees the laser


stop_laser=0; %don't stop laser
OutputToFromMatLab (1, 1, stop_laser); %output to frommatlab.txt

n = input('Please enter the level of smoothness/delay (6): ');


if isempty(n)
n=6;
end
b = input('Please enter your threshold level (0.5): ');
if isempty(b)
b=0.5;
end
screenheight = input('Please enter your screen height (800): ');
if isempty(screenheight)
screenheight=800;
end
screenwidth = input('Please enter your screen width(1280): ');
if isempty(screenwidth)
screenwidth=1280;
end

vid=videoinput('winvideo',1); %define the web camera device


triggerconfig(vid,'manual');
set(vid,'FramesPerTrigger',1 );
set(vid,'TriggerRepeat', Inf);
start(vid);
% preview(vid)
xs=ones(1,n); %for motion optimization
ys=ones(1,n); %for motion optimization

%====== End Initialization ====================================

%====== some constants ========================================


vidheight = 240;
vidwidth = 320;
% screenheight = 800;
% screenwidth = 1280;
%====== end some constants ====================================

%====== Calibrating 4 corners =================================

ready = input('Please point to the UPPER LEFT corner.','s');
trigger(vid);
data= getdata(vid,1);
I=rgb2gray(data);
[uply uplx] = FindMax(I);
maxI = I(uplx, uply); % the highest intensity of this frame
[uply uplx] = distribution([uply uplx], I, [vidwidth vidheight]);
%calculate the distributed position in format double according to the
%intensity of the points next to it

ready = input('Please point to the UPPER RIGHT corner.','s');


trigger(vid);
data= getdata(vid,1);
I=rgb2gray(data);
[upry uprx] = FindMax(I);
maxI = [maxI I(uprx, upry)]; % the highest intensity of this frame
[upry uprx] = distribution([upry uprx], I, [vidwidth vidheight]);

ready = input('Please point to the LOWER LEFT corner.','s');


trigger(vid);
data= getdata(vid,1);
I=rgb2gray(data);
[lowly lowlx] = FindMax(I);
maxI = [maxI I(lowlx, lowly)]; % the highest intensity of this frame
[lowly lowlx] = distribution([lowly lowlx], I, [vidwidth vidheight]);

ready = input('Please point to the LOWER RIGHT corner.','s');


trigger(vid);
data= getdata(vid,1);
I=rgb2gray(data);
[lowry lowrx] = FindMax(I);
maxI = [maxI I(lowrx, lowry)]; % the highest intensity of this frame
[lowry lowrx] = distribution([lowry lowrx], I, [vidwidth vidheight]);

threshold = min(maxI);
threshold = b * threshold;

%====== End Calibrating 4 corners =============================

%======== Main Loop ===========================================


while true   %main loop; runs until the program is stopped (1:inf is not a valid for-loop range)

%=======getting data from webcam=================


trigger(vid);
data= getdata(vid,1);

I=rgb2gray(data);
%=======end getting data from webcam=============

[my mx] = FindMax(I);

intensity = I(mx, my);

if (intensity >= threshold)


%laser spot is successfully detected

[my mx] = distribution([my mx], I, [vidwidth vidheight]);

[y x] = transf([uplx uply], [uprx upry], [lowlx lowly], [lowrx lowry], [mx my], [screenwidth screenheight]);
% transfer the maximum intensity point coordinates to its relative coordinate on the real screen

[y x] = FormatPointBig2 (y, x, screenwidth, screenheight); % make sure no zeros or double numbers

%===========motion optimization===========
if n>1
xs=[xs(2:n) x];
ys=[ys(2:n) y];
x=round(sum(xs)/n);
y=round(sum(ys)/n);
end
%===========end motion optimization=======

if (seeLaser == 0)
%just toggled from NOT SEEING to SEEING
seeLaser = 1;
stop_laser=0;
else
stop_laser=0;
end

else
%laser spot is NOT detected, basically do nothing to x and y

if (seeLaser == 1)
%just toggled from SEEING to NOT SEEING
seeLaser = 0;
stop_laser=1;
else
stop_laser=1;

end

end
OutputToFromMatLab (y, x, stop_laser); %output to frommatlab.txt

end

%======== End Main Loop =======================================

function [y,x]=transf(a,b,c,d,t,resolution)
%********************************************************************
%This function will transfer the beam's position into the real cursor's
%location for the LCD
%a,b,c,d are coordinates of the 4 corners while t is the beam.
%resolution contains the width and height of the LCD's resolution
%********************************************************************

v=[a(1) b(1) d(1) c(1) a(1) t(1);a(2) b(2) d(2) c(2) a(2) t(2)];
width=resolution(1);height=resolution(2);
%plot-------------------------
% figure
% hold on
% axis([0 200 0 200])
% axis square
% grid on
% line(v(1,1:5),v(2,1:5));
% plot(v(1,6),v(2,6),'ro')
% hold off
%--------------------------

%shift back to origin---------------


v(1,:)=v(1,:)-v(1,1);
v(2,:)=v(2,:)-v(2,1);
%-----------------------------------

%plot-------------------------
% figure
% hold on
% axis([0 200 0 200])
% axis square
% grid on
% line(v(1,1:5),v(2,1:5));
% plot(v(1,6),v(2,6),'ro')
% hold off
%--------------------------

%rotation----------------
% L=length(v);
% vv=[v; ones(1,L)];
th=-atan(v(2,4)/v(1,4));
r=[cos(th) -sin(th);
sin(th) cos(th)];
v=r*v;
% v=vv(1:2,:);

%plot-------------------------
% figure
% hold on

% axis([0 200 0 200])
% axis square
% grid on
% line(v(1,1:5),v(2,1:5));
% plot(v(1,6),v(2,6),'ro')
% hold off
%--------------------------

%shear------------
sx=-v(1,2)/v(2,2);
s=[1 sx; 0 1];
v=s*v;
%-------------------

% %plot-------------------------
% figure
% hold on
% axis([0 200 0 200])
% axis square
% grid on
% line(v(1,1:5),v(2,1:5));
% plot(v(1,6),v(2,6),'ro')
% hold off
%--------------------------

%scale----------------------------
h=v(2,2); hb=v(2,3)-v(2,2); L=v(1,3);
v(2,:)=h./(hb.*v(1,:)./L+h).*v(2,:);

h=v(1,4); hb=v(1,3)-v(1,4); L=v(2,3);


v(1,:)=h./(hb.*v(2,:)./L+h).*v(1,:);
%------------------------------

% plot-------------------------
% figure
% hold on
% axis([0 200 0 200])
% axis square
% grid on
% line(v(1,1:5),v(2,1:5));
% plot(v(1,6),v(2,6),'ro')
% hold off
% --------------------------

%scale to LCD's resolution------------


y=v(2,6)/v(2,3)*width;
x=v(1,6)/v(1,3)*height;
%-------------------------------------

function [column,row]=distribution(max,I,originalResolution)
%max contains (column#,row#) of the maximum intensity point in I (320x240)
%I contains the original image
%originalResolution contains the resolution of the web camera
Owidth=originalResolution(1);
Ohight=originalResolution(2);
Ocolumn=max(1);
Orow=max(2);

if (Ocolumn==1)
column=1;
else if (Ocolumn == Owidth)
column=Ocolumn;
else if (I(Orow,(Ocolumn-1))~=0 && I(Orow,(Ocolumn+1))~=0)
ratio=double(I(Orow,(Ocolumn+1)))/double(I(Orow,(Ocolumn-1)));
fraction=ratio/(ratio+1);
else fraction=0;
end;
column=fraction+(Ocolumn-1);
end;
end;

if (Orow==1)
row=1;
else if (Orow == Ohight)
row=Orow;
else if (I((Orow-1),Ocolumn)~=0 && I((Orow+1),Ocolumn)~=0)
ratio=double(I((Orow+1),Ocolumn))/double(I((Orow-1),Ocolumn));
fraction=1/(1/ratio+1);
else fraction=0;
end;
row=fraction+(Orow-1);
end
end

function [ybar xbar] = FormatPointBig2(y, x,screenwidth,screenheight)

%This function formats any points so that it can be displayed


%in a screenwidth*screenheight gray level bitmap.

%inputs:
% x - the row number, i.e. the height of the bitmap.
% y - the column number, i.e. the width of the bitmap.

if (x < 1)
xbar = 1;
elseif (x > screenheight)
xbar = screenheight;
else

xbar = round(x); %round to the nearest integer
end

if (y < 1 )
ybar = 1;
elseif (y > screenwidth)
ybar = screenwidth;
else
ybar = round(y); %round to the nearest integer
end

function [maxy maxx] = FindMax(inputM)


% FindMax finds and outputs the position vector of the brightest point in the input matrix inputM
[maxcolumn columnindex]=max(inputM);
[maxrow rowindex]=max(maxcolumn);
maxy=rowindex;
maxx=columnindex(maxy);
function OutputToFromMatLab (XCoord, YCoord, stop_laser)
%This Function changes the content of the text file "FromMatlab.txt",
%if a coordinate is negative, change it to zero.

%cd X:\elec499\MatlabAndVBS
%cd Z:\elec499\Matlab\working
%cd D:\ELEC499Temporary\MainWorkingFile\MatLab
%cd X:\elec499\MainWorkingFile\MatLab

% NO negative positions!!
if (XCoord < 0)
XCoord = 0;
end
if (YCoord < 0)
YCoord = 0;
end

coord = [XCoord YCoord stop_laser];


dlmwrite('FromMatlab.txt',coord, ',');


Appendix B
Visual Basic code for Mouse Movement
Public Class MouseHookServ

Private Sub MouseHook_MouseDown(ByVal sender As System.IntPtr, ByVal e As WindowsHookLib.MouseEventArgs) Handles MouseHook.MouseDown

If (e.Button.ToString = "Middle") Then


e.Handled = True
Try
Dim oWrite As System.IO.StreamWriter
oWrite = IO.File.CreateText("./hook.txt")
oWrite.WriteLine("down")
oWrite.Close()
Catch ex As Exception

End Try
End If
End Sub

Private Sub MouseHook_MouseUp(ByVal sender As System.IntPtr, ByVal e As WindowsHookLib.MouseEventArgs) Handles MouseHook.MouseUp

If (e.Button.ToString = "Middle") Then


e.Handled = True
Try
Dim oWrite As System.IO.StreamWriter
oWrite = IO.File.CreateText("./hook.txt")
oWrite.WriteLine("up")
oWrite.Close()
Catch ex As Exception

End Try
End If
End Sub

Private Sub MouseHookServ_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load

Me.MouseHook.InstallHook()

End Sub

Private Sub MouseHookServ_FormClosing(ByVal sender As Object, ByVal e As System.Windows.Forms.FormClosingEventArgs) Handles Me.FormClosing

Me.MouseHook.Dispose()

End Sub

End Class

Imports System.Runtime.InteropServices
Public Class MouseMoveProject

' Declare routine to set the cursor position (user32 SetCursorPos takes 32-bit integers and returns a BOOL)
Declare Function SetCursorPos Lib "user32" _
(ByVal x As Integer, ByVal y As Integer) As Boolean

'Public Shared Property position() As Point


Public address As String

Private Sub Button1_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button1.Click

address = "./FromMatlab.txt"

Me.Hide()

Dim xvals As String


Dim yvals As String
Dim stop_lasers As String

Dim xval As Integer


Dim yval As Integer
Dim stop_laser As Integer

Dim EntireFile As String


EntireFile = "down"

Dim l As Integer = 0 'number of variables in the FromMatlab.txt


While True
While True
Dim oRead As System.IO.StreamReader
Try
oRead = IO.File.OpenText("./hook.txt")
EntireFile = oRead.ReadToEnd()
oRead.Close()
Catch ex As Exception
End Try

If EntireFile.Contains("down") Then
Exit While
End If
End While

While True

Dim oRead As System.IO.StreamReader


Try
oRead = IO.File.OpenText("./hook.txt")
EntireFile = oRead.ReadToEnd()
oRead.Close()
Catch ex As Exception
End Try

If EntireFile.Contains("up") Then
Exit While

End If

Try
Using MyReader As New _
Microsoft.VisualBasic.FileIO.TextFieldParser _
(address)

MyReader.TextFieldType = FileIO.FieldType.Delimited
MyReader.SetDelimiters(",")

Dim currentRow As String()


While Not MyReader.EndOfData
Try
currentRow = MyReader.ReadFields()
l = currentRow.Length
If l = 3 Then
xvals = currentRow(0)
yvals = currentRow(1)
stop_lasers = currentRow(2)

xval = Convert.ToInt32(xvals)
yval = Convert.ToInt32(yvals)
stop_laser = Convert.ToInt32(stop_lasers)
End If
Catch ex As Exception
MsgBox("Line " & ex.Message & _
"is not valid and will be skipped.")
End Try
End While
End Using
Catch ex As Exception
End Try

If stop_laser = 1 Then
Exit While
End If

Cursor.Position = New Point(xval, yval)

End While

If stop_laser = 1 Then
Exit While
End If

End While

Me.Show()
Dim aPoint As Point
aPoint = Me.Location
aPoint.X = aPoint.X + 50
aPoint.Y = aPoint.Y + 65
Cursor.Position = aPoint

End Sub

Private Sub Form1_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
Shell("./NewProject.exe", AppWinStyle.NormalFocus)
Dim aPoint As Point
aPoint = Me.Location
aPoint.X = aPoint.X + 50
aPoint.Y = aPoint.Y + 65
Cursor.Position = aPoint
End Sub

' Unused helper; wraps Convert.ToInt32 so the function returns a value on all code paths.
Public Shared Function ToInt32(ByVal value As String) As Integer
Return Convert.ToInt32(value)
End Function

End Class
