
INFRARED GAZE TRACKING in MATLAB


Abstract


One of mankind's most vital senses is eyesight.  The eye differs from the other parts of the human sensory array in that a great deal can be read from it about a person's expression.  For example, it can generally be assumed that a person's attention is focused on where they are looking, and because of this, tracking eye movement can be useful.  Fiction has long depicted people operating devices by sight rather than by hand, yet the technology has not reached the mainstream.  The major reason appears to be the cost of systems, with most applications of gaze tracking confined to specialist fields.  This project is an attempt to make optical sensor control more feasible.  The system is flexible enough to provide a basis for a variety of sensor applications, ranging from statistics and human-computer interfaces to safety control systems.  It is also much cheaper than similar off-the-shelf products.

A simple Logitech WebCam is used to capture images, with Infrared Light Emitting Diodes mounted around the lens.  Infrared light reflected off the pupil creates a high-intensity bright spot, known as a Purkinje image, which can be tracked.  Edge detection is used in conjunction with error and blink checking to derive an approximate x, y co-ordinate within the frame boundaries.  Because of the possible effects of infrared on the eye, the light source has been designed to Australian Standards.  To demonstrate the system, a MATLAB user interface has been designed that provides a sight-controlled interface for hands-free telephone dialling.  The system is also easily adaptable to other potential gaze tracking uses.  It has been written in MATLAB and operates in semi real-time, except during screen calibration, where single image captures are used.

There are a variety of ways this thesis may be extended.  A DirectShow implementation could be used in conjunction with speed improvements to allow smoother tracking.  The system is also reasonably intrusive, as one eye has to be covered by the camera.  Head tracking technology could be combined with gaze tracking to allow the camera to be placed further from the head, negating this problem.

1  INTRODUCTION

Given its possibilities, tracking eye motion in real time is not a new idea.  The most common method involves finding the centre of the pupil with respect to a Purkinje image; from these two co-ordinates a distance can easily be measured.  There are various other ways to track gaze, and these are highlighted later in this thesis.  Infrared is advantageous in pupil tracking because it causes strong gradients in image intensity along the pupil border and around Purkinje images.  It also reduces dependency on surrounding light, making it the preferred method.
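The distance measurement described above can be sketched in a few lines.  This is an illustrative Python fragment with made-up co-ordinates, not the thesis's MATLAB code:

```python
# Sketch of the pupil-relative-to-Purkinje measurement: the glint is roughly
# fixed for a stationary head and light source, so the offset from glint to
# pupil centre varies only with gaze direction. Coordinates are hypothetical.

def gaze_offset(pupil_xy, glint_xy):
    """Return the (dx, dy) displacement of the pupil centre from the glint."""
    px, py = pupil_xy
    gx, gy = glint_xy
    return (px - gx, py - gy)

offset = gaze_offset((130.0, 118.0), (122.0, 110.0))
print(offset)  # (8.0, 8.0)
```

In the thesis's system this offset, once calibrated against known screen positions, is what the interface maps to a gazed-at target.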

There are easier methods than pupil tracking for detecting a person's state of awareness; monitoring the head's outline, for example, is more convenient, because it is harder to make an eye gaze system non-intrusive.  New techniques are also constantly being designed to compensate for head movement in non-steady conditions.  Generally these systems need extensive calibration and are expensive to purchase off-the-shelf.

This thesis describes the development of a cheap gaze detection system.  The system is not quite fast enough to be considered real-time; however, for the purposes of a sight-controlled telephone interface it succeeds.  Major emphasis has been placed on the development of the detection scheme to ensure accuracy and repeatability.  With minimal further development this project could make cheap gaze detection viable.




A sight-controlled interface has been developed to demonstrate the gaze tracking system.  To enhance public accessibility, it is written to work with a simple computer camera.  Both the detection system and the Graphical User Interface have been developed in MATLAB.  Several methods of detection and calculation were experimented with during the design process; the current system improves on these earlier techniques in both speed and accuracy.

A large amount of research has been conducted in the field of eye detection.  In particular, IBM Almaden is one of the research leaders in the field [1].  While IBM was one of the first companies to research this field, many individuals have since contributed and improved the technology.  It is agreed that face recognition is the easiest way to detect gaze or attention, but its accuracy suffers; hence most commercial technology uses both eye and face detection.  Some current methods of gaze detection are outlined further in chapter four.

Personal work on this thesis involved creating an accurate and robust method of tracking the eye.  The information highlighted in the background literature and theory has been gathered from outside sources to help explain the nature of the thesis and lay the groundwork for the design.

In the future, speed increases combined with head tracking would enable cheap and accurate gaze tracking in a wide variety of applications.  Reducing the intrusiveness of the system is the first step towards commercial viability.

1.2  Benefits of Gaze Tracking 

There are many potential uses for gaze tracking, and there are already many products on the market.  However, these are generally for specialist applications, can be very costly to purchase, and mainly exist at the research and development level.


•  There is potential in the automobile industry to use gaze detection within control loops in cars.  As cars become increasingly computer controlled, the feasibility of using gaze detection in cars is growing.  Gaze detection lets the computer know where the driver's attention is focused; uses include predicting crashes and safely deploying air bags.  It can also tell whether the driver is checking the mirrors or even falling asleep.

•  Virtual Reality benefits from the integration of a vision tracking system.  By accommodating pupil motion, the experience is enhanced: gaze detection allows the display image to become much more lifelike, as the image can be compensated for eye movement.  Companies such as Applied Science Laboratories [2] specialise in eye tracking systems; however, as with most Virtual Reality systems, prices can be quite high.

•  Gaze tracking can aid people who are disabled and lack the capacity to communicate in other ways.  As long as a person's sight is active, communication can be achieved.  Generally this involves a keyboard displayed on a screen; the gaze detection system, once calibrated, tracks eye movement to discern the character to be chosen.  This is one of the most common uses of gaze detection, and companies such as LC Technologies [3] and Eye Response have products aimed at this market.  Eye Response's ERICA [4] technology is software based and allows disabled individuals to access all Windows-based software; LC Technologies provides a complete gaze detection package including hardware.

•  Gaze detection is often used for statistical purposes.  The Cognitive Ergonomic Research Facility web site [5] shows that the military is currently investing heavily in gaze detection technology.  Applications include attention statistics for console operators in order to improve the efficiency of programs: by tracking console operators' gaze movements during exercises, software GUIs and programs can be configured to best accommodate users, increasing efficiency and reaction times in emergencies.



A multitude of other control uses are possible for this type of system, provided the final product is flexible, non-intrusive and compact.  Experience has shown that flexibility is the key to most successful eye tracking systems.  New uses are being engineered at a fast rate, allowing more flexible, easily implemented systems to become very successful.  The main pitfall of most of these systems so far, however, is their high purchase cost.  By addressing this issue and providing flexibility, the eye gaze system developed here has a competitive advantage over common off-the-shelf products.

1.3  Functional Specification


As outlined in the scope, this project is an attempt to make optical sensor control more feasible.  Flexibility is the main aim, and the Gaze Tracking system is to provide a basis for a variety of sensor applications, ranging from statistics and human-computer interfaces to safety control systems.  It is demonstrated as a sight-controlled telephone interface where the user can key in a telephone number by sight.  The system is much cheaper than similar off-the-shelf products.

The product is designed using a USB camera and MATLAB.  The pupil and a Purkinje image are used to determine gaze: the Purkinje image provides a stationary reference point against which the movements of the pupil are measured.  Methods of detecting these image features have been derived and a Canny edge detection algorithm developed.
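The gradient-based edge detection at the heart of this approach can be illustrated with a short Python sketch using NumPy.  The synthetic image and the threshold value are assumptions chosen for demonstration; they are not the thesis's filter or constants:

```python
import numpy as np

# Sketch of the detection idea: gradients along x and y are combined into a
# gradient norm, which is thresholded so only the strong edges around a
# bright spot survive (mirroring steps 2-4 of the single-capture code).
img = np.zeros((64, 64))
yy, xx = np.mgrid[0:64, 0:64]
img[(xx - 32) ** 2 + (yy - 32) ** 2 <= 8 ** 2] = 1.0  # synthetic "glint" disc

gy, gx = np.gradient(img)          # y- and x-direction gradients
norm = np.sqrt(gx ** 2 + gy ** 2)  # gradient magnitude
edges = norm > 0.5 * norm.max()    # keep only the strongest gradients

print(edges.sum() > 0)  # True: edge pixels appear around the disc border
```

The thesis code uses oriented derivative-of-Gaussian filters (`D2gauss`) rather than plain finite differences, which smooths the image before differentiating and so is far more noise-tolerant.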

The majority of this project has been spent on software coding.  However, work has been completed to address these points:

 •  Development of a method to detect infrared reflected off the eye
 •  Researching the effects of infrared on the eye
 •  Edge detection of the reflected light
 •  Analysis of the reflections to determine an x, y co-ordinate
 •  Constructing the necessary hardware
 •  Providing a GUI for demonstration of the system
 •  Providing calibration mechanisms


1.3.2  Design Goals and Objectives

Through the use of MATLAB, the design of an infrared sensor is simplified.  Initial design focused on a stand-alone computer system that allows the user to request a number from a number pad on screen.  Five design phases were established in the Project Plan; the goals and objectives are outlined below:

1.3.2.1  Design Phase 1: Research

Initial research focused on Signal and Image Processing.  The best method of image capture was decided upon, and the implications of infrared on the eye were addressed to ensure product safety.

1.3.2.2  Design Phase 2: Testing and Setup

For initial testing, the construction of the system hardware was relatively simple.  First, several different-sized boards of LEDs were constructed, and the most effective distances, magnitudes and board sizes were determined from the intensities picked up by a Logitech QuickCam digital camera.  Next, the best methods of filtering for this sensor were researched; for example, whether it is better to use monochrome or colour decoding, and which types of filters are most efficient.

1.3.2.3  Design Phase 3: Filter Design

The two major tasks of this thesis were the design of the Filter and GUI in C++.  During the filter design stage, IR timing issues were investigated.  Through the designed filter, tracking of the pupil is achieved without the need for timing the LEDs.  As a result, detection became more complicated and this design phase took much longer than expected.  However, the system is accurate and provides functionality to the GUI.


1.3.2.4  Design Phase 4: GUI Design

As the major goal is to design a sight-controlled number system, the GUI has been designed to measure an x, y distance from the pupil to the reflected Purkinje image.  It uses this to select the number gazed at, which is then displayed on the GUI accordingly.  A calibration system has been developed to facilitate the number selection process.

1.3.2.5  Design Phase 5: Final Testing and Reporting

Reporting has been factored into the design plan.  This includes all assessable material, including the time to write this thesis itself.  In the weeks before the demonstration, further speed increases are hoped to be achieved.

2.1.2  Neural Networks

Due to the intrusiveness of other methods (i.e. the camera needing to be close), a method has been devised to track gaze from a distance.  This relies on corneal reflections and neural networks.  Baluja and Pomerleau [9] explain a method of using a wide-angled camera that picks up the glint surrounded by darkness, signifying an eye within the wide view; the eye portion of the image is then extracted and processed.  As the image is generally only about 15 x 40 pixels, a neural network is required to deduce a more accurate gaze reading from the previous few minutes' information.  The main advantage is the consideration of head movement; however, this method lacks accuracy with respect to other methods.



2.1.3  Electric Skin Potential Tracking (Electrooculography)

Shaviv [10] introduces the idea of Electrooculography (EOG).  Due to the higher metabolic rate at the retina compared to the cornea, the cornea maintains a roughly constant voltage with respect to the retina.  This potential is approximately aligned with the optical axis; it rotates with the direction of gaze and can be measured by surface electrodes placed on the skin around the eyes.  Compared with other methods, this system is easily mounted somewhere other than directly in front of the person.



Electrical skin potential tracking is often used in medicine and optometry to diagnose certain conditions.  For example, researchers from the Department of Ophthalmology at the University of Vienna [12] have used EOG to diagnose sixth nerve palsy.  Their research shows that while a clinical orthoptic examination is still the best method of diagnosis, Electrooculography provides a suitable replacement in the follow-up stage of treatment programs.  While these uses are beneficial, the use of electrodes makes this method of gaze tracking unsuitable for everyday applications.


2.1.4  Pupil Tracking 

Pupil tracking is a method of gaze detection that is commonly used, often in conjunction with other forms of tracking.  There are several reasons for this, but the main advantage is the notion of the "bright spot" [13].  Similar to the red-eye effect when taking flash photographs at night, infrared can be used in pupil detection to create a high-intensity bright spot that is easy to find with image processing.  This bright spot occurs when infrared light is reflected off the back of the pupil and magnified by the lens.

The main advantage of pupil tracking is that, as the border of the pupil is sharper than the limbus, a higher resolution is achievable.  Also, as the pupil is never really covered by the eyelid, x, y tracking is more feasible than with limbus tracking.  A disadvantage of this system is that the camera and light still require accurate on-axis alignment with the pupil.  Xiu, Fengliang and Fujimura [14] explain the basic concepts of pupil tracking and introduce new ideas for improving the flexibility of pupil tracking with infrared.  Several methods are also discussed to better accommodate head movement, pupil blinking and the wearing of sunglasses (for car applications).
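The bright-spot idea can be illustrated with a minimal Python sketch on synthetic data; this is an assumed demonstration, not the thesis's detection pipeline:

```python
import numpy as np

# The infrared reflection appears as a small cluster of near-saturated
# pixels, so a simple threshold followed by a centroid gives its
# approximate x, y position. Frame contents and threshold are made up.
frame = np.zeros((48, 48), dtype=np.uint8)
frame[20:24, 30:34] = 255                 # synthetic bright spot

mask = frame > 200                        # keep only near-saturated pixels
ys, xs = np.nonzero(mask)
cx, cy = float(xs.mean()), float(ys.mean())  # centroid of the bright region

print((cx, cy))  # centroid near the middle of the 4x4 spot
```

In practice (and in the thesis code) the raw threshold is followed by morphological clean-up and blink/error checks, since specular reflections elsewhere in the frame can otherwise masquerade as the bright spot.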


TASKS:


Design Phase 1: Research

 •  Research and choose camera
 •  Research and choose communications method
 •  Research LED's and LED board
 •  Research implications of IR on the eye
 •  Research best filter type

Design Phase 2: Testing and Set up

 •  Test camera input levels with MATLAB
 •  Timing issues research and system prototype
 •  PCB design and construction
 •  Minor adjustments to set up

Design Phase 3: Filter Design

 •  MATLAB imaging: picking up on-axis reflection in MATLAB
 •  Initial MATLAB design
 •  Application of research in Filter design with Direct Show
 •  Code alterations for integration with GUI

Design Phase 4: GUI design

 •  Application of eye tracking techniques
 •  GUI Design plan
 •  Front end design
 •  Calibration program design
 •  Code alterations for integration with filter
 •  Accuracy and repeatability improvements

Design Phase 5:  Final testing and Reporting

 •  Progress Report
 •  Oral Presentation research
 •  Thesis formulation
 •  Poster Design
 •  Final testing and performance analysis


PROGRAM LOOP CODE:


function varargout = Program(varargin)
% PROGRAM M-file for Program.fig
%
% Contains code for uicontrol and the manipulation of GUI
% Program loop determines gaze from iterations of the single capture process
%
% Written by Kim Kreutz
% for ELEC4801 Thesis Project 2003
%
%

gui_Singleton = 1;
gui_State = struct('gui_Name',       mfilename, ...
                   'gui_Singleton',  gui_Singleton, ...
                   'gui_OpeningFcn', @Program_OpeningFcn, ...
                   'gui_OutputFcn',  @Program_OutputFcn, ...
                   'gui_LayoutFcn',  [], ...
                   'gui_Callback',   []);
if nargin && ischar(varargin{1})
    gui_State.gui_Callback = str2func(varargin{1});
end 

if nargout
    [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
    gui_mainfcn(gui_State, varargin{:});
end 

% Initial Operations
function Program_OpeningFcn(hObject, eventdata, handles, varargin)

% Choose default command line output for Program
handles.output = hObject;

% Update handles structure
guidata(hObject, handles);

%setup globals
global tl
global tr
global bl
global br
global MaAp1
global MiAp1
global MaAg1

global MiAg1 

global MaAp2
global MiAp2
global MaAg2 
global MiAg2 
global MaAp3
global MiAp3
global MaAg3 
global MiAg3 
global MaAp4
global MiAp4
global MaAg4 
global MiAg4 

% --- Outputs from this function are returned to the command line.
function varargout = Program_OutputFcn(hObject, eventdata, handles)

% Get default command line output from handles structure
varargout{1} = handles.output;

% --- Executes on  button press in pushbutton1 (Close).
function pushbutton1_Callback(hObject, eventdata, handles)

close

% --- Executes on button press in pushbutton2 (Run).
function pushbutton2_Callback(hObject, eventdata, handles)

% test popup menu
popup_sel_index = get(handles.popupmenu1, 'Value');
switch popup_sel_index
    
    %find tl calibration xy
    case 1
    
        [MaAp1,MiAp1,MaAg1,MiAg1,Error,Blink,gx,gy,px,py,smooth,perode] = newmethod;
        figure
        imshow(smooth)
        title('Glint')
        figure
        imshow(perode)
        title('Pupil')
        if Error == 0
            tl = [gx,gy,px,py];
        end


%find tr calibration xy               

    case 2
        
        [MaAp2,MiAp2,MaAg2,MiAg2,Error,Blink,gx,gy,px,py,smooth,perode] = newmethod;
        figure
        imshow(smooth)
        title('Glint')
        figure
        imshow(perode)
        title('Pupil')
        if Error == 0
            tr = [gx,gy,px,py];
        end
        
    %find bl calibration  xy   
    case 3
        
        [MaAp3,MiAp3,MaAg3,MiAg3,Error,Blink,gx,gy,px,py,smooth,perode] = newmethod;
        figure
        imshow(smooth)
        title('Glint')
        figure
        imshow(perode)
        title('Pupil')
        if Error == 0
            bl = [gx,gy,px,py];
        end
        
    %find br calibration xy            
    case 4
        
        [MaAp4,MiAp4,MaAg4,MiAg4,Error,Blink,gx,gy,px,py,smooth,perode] = newmethod;
        figure
        imshow(smooth)
        title('Glint')
        figure
        imshow(perode)
        title('Pupil')
        if Error == 0
            br = [gx,gy,px,py];
        end

    % do program        

    case 5

    % Set error calibration
    map = [MaAp1 MaAp2 MaAp3 MaAp4];
    mip = [MiAp1 MiAp2 MiAp3 MiAp4];
    mag = [MaAg1 MaAg2 MaAg3 MaAg4];
    mig = [MiAg1 MiAg2 MiAg3 MiAg4];
    CalMaxg = max(mag);
    CalMing = min(mig);
    CalMaxp = max(map);
    CalMinp = min(mip); 

    programloop('start',CalMaxg,CalMing,CalMaxp,CalMinp,tl,tr,bl,br);

% --- Executes during object creation, after setting all properties.
function popupmenu1_CreateFcn(hObject, eventdata, handles)
if ispc
    set(hObject,'BackgroundColor','white');
else
    set(hObject,'BackgroundColor',get(0,'defaultUicontrolBackgroundColor'));
end 

set(hObject, 'String', {'Top Left Cal', 'Top Right Cal', 'Bottom Left Cal', 'Bottom Right Cal', 'Run Program'});

% --- Executes on selection change in popupmenu1.
function popupmenu1_Callback(hObject, eventdata, handles)

% executes on program button start
function programloop(action, CalMaxg,CalMing,CalMaxp,CalMinp,tl,tr,bl,br)
% This function loops the program to calculate the number being looked at
% Action can be 'start' or 'stop'
% If push button press, action is set to 'stop'

%Start program loop

if strcmp(action,'start')

    %setup outputs
    t = uicontrol('type','text','string','0','position',[]); % setup running number
    r = uicontrol('type','text','string','0','position',[]); % setup typed number
        
    % reset blinkcount and save old number
    blinkcount = 0;
    oldnumber = number;
    save = n3;
    %find xy1 
    [MaAp,MiAp,MaAg,MiAg,errora,Blinka,gx,gy,px,py,smooth,perode] = newmethod(CalMaxg,CalMing,CalMaxp,CalMinp);
    XY1 = [gx,gy,px,py];
    if Blinka == 1
        % incr blinkcount, use old xy vals
        blinkcount = blinkcount + 1;
    elseif errora == 0
        % output number found
        n1 = findnumber(gx,gy,px,py,tl,tr,bl,br);
    else
        n1 = save;
    end

    %find xy2 
    [MaAp,MiAp,MaAg,MiAg,errorb,Blinkb,gx,gy,px,py,smooth,perode] = newmethod(CalMaxg,CalMing,CalMaxp,CalMinp);
    XY2 = [gx,gy,px,py];
    if errorb == 0
        % output number found
        n2 = findnumber(gx,gy,px,py,tl,tr,bl,br);
    elseif Blinkb == 1
        % incr blinkcount, use old xy vals
        blinkcount = blinkcount + 1;
    elseif errora == 1
        n2 = oldnumber;
    else
        n2 = n1;
    end

    %find xy3 
    [MaAp,MiAp,MaAg,MiAg,errorc,Blinkc,gx,gy,px,py,smooth,perode] = newmethod(CalMaxg,CalMing,CalMaxp,CalMinp);
    XY3 = [gx,gy,px,py];
    if errorc == 0
        % output number found
        n3 = findnumber(gx,gy,px,py,tl,tr,bl,br);
    elseif Blinkc == 1
        % incr blinkcount, use old xy vals
        blinkcount = blinkcount + 1;
    elseif errorb == 1
        if errora == 1
            n3 = oldnumber;
        else
            n3 = n1;
        end
    else
        n3 = n2;
    end
    if blinkcount == 3
        % three blinks: type the held number
        set(t,'string',oldnumber);
    else
        c = ((n1+n2+n3)/3);
        if c == n1
            number = c;
        else
            avg = [XY1;XY2;XY3];
            xg = mean(avg(:,1));
            yg = mean(avg(:,2));
            xp = mean(avg(:,3));
            yp = mean(avg(:,4));
            number = findnumber(xg,yg,xp,yp,tl,tr,bl,br);
        end
        % update running display
        set(r,'string',number);
    end

    % At this time if button is pressed, loop exits
    z = uicontrol('type','pushbutton','string','Stop?','position',[]);

    % loop back
    programloop('start',CalMaxg,CalMing,CalMaxp,CalMinp,tl,tr,bl,br);

% if action = 'stop', exit loop
elseif strcmp(action,'stop')
return;             
end 

% this function finds the number gazed at, derived from the capture co-ordinates.
function n = findnumber(gx,gy,px,py,tl,tr,bl,br)

still to be completed……

 SINGLE CAPTURE CODE

function [MaAp,MiAp,MaAg,MiAg,Error,Blink,gx,gy,px,py,smooth,perode] = newmethod(CalMaxg,CalMing,CalMaxp,CalMinp)

if nargin > 4
    return;
else 

%-------------------------------
% Constants
%-------------------------------  
  
N               = 10;
Sigma           = 1;
Theta1          = pi/2;
Theta2          = 0;
glintprethresh  = 0.165;
glintthresh     = 1.30;
resizex         = 50:305;
resizey         = 25:280;
log2intconst    = 255;
bordclearconst  = 4;
pupilprethresh  = 0.06;
pupilthresh     = 0.3;
ErrorToleranceMaxg = 1.2; % applied to the CalMaxg input; these are worked out
ErrorToleranceMing = 0.8; % applied to the CalMing input; before singlefind in
ErrorToleranceMaxp = 1.2; % applied to the CalMaxp input; the program loop
ErrorToleranceMinp = 0.8; % applied to the CalMinp input
%CalMing = 7;
%CalMaxg = 10;
%CalMinp = 27;
%CalMaxp = 34;


%-------------------------------
% Structures
%---------------------------

s1 = [0 0 1 1 1 0 0;
      0 1 1 1 1 1 0;
      1 1 1 1 1 1 1;
      1 1 1 1 1 1 1;
      1 1 1 1 1 1 1;
      0 1 1 1 1 1 0;
      0 0 1 1 1 0 0];


s2 = [0 1 0;
      0 1 0;
      0 1 0];

s3 = [0 0 0;
      1 1 1;
      0 0 0];

s4 = [0 0 0 0 1 0 0 0 0;
      0 0 0 0 1 0 0 0 0;
      0 0 0 0 1 0 0 0 0;
      0 0 0 0 1 0 0 0 0;
      0 0 0 0 1 0 0 0 0;
      0 0 0 0 1 0 0 0 0;
      0 0 0 0 1 0 0 0 0;
      0 0 0 0 1 0 0 0 0;
      0 0 0 0 1 0 0 0 0];

s5 = [0 0 0 0 0 0 0 0 0;
      0 0 0 0 0 0 0 0 0;
      0 0 0 0 0 0 0 0 0;
      0 0 0 0 0 0 0 0 0;
      1 1 1 1 1 1 1 1 1;
      0 0 0 0 0 0 0 0 0;
      0 0 0 0 0 0 0 0 0;
      0 0 0 0 0 0 0 0 0;
      0 0 0 0 0 0 0 0 0];

s6 = [0 0 1 0 0;
      0 0 1 0 0;
      0 0 1 0 0;
      0 0 1 0 0;
      0 0 1 0 0];

s7 = [0 0 0 0 0;
      0 0 0 0 0;
      1 1 1 1 1;
      0 0 0 0 0;
      0 0 0 0 0];

s8 = [0 0 0 1 1 1 1 1 1 1 0 0 0;
      0 0 1 1 1 1 1 1 1 1 1 0 0;
      0 1 1 1 1 1 1 1 1 1 1 1 0;
      1 1 1 1 1 1 1 1 1 1 1 1 1;
      1 1 1 1 1 1 1 1 1 1 1 1 1;
      1 1 1 1 1 1 1 1 1 1 1 1 1;
      1 1 1 1 1 1 1 1 1 1 1 1 1;
      1 1 1 1 1 1 1 1 1 1 1 1 1;
      1 1 1 1 1 1 1 1 1 1 1 1 1;
      1 1 1 1 1 1 1 1 1 1 1 1 1;
      0 1 1 1 1 1 1 1 1 1 1 1 0;
      0 0 1 1 1 1 1 1 1 1 1 0 0;
      0 0 0 1 1 1 1 1 1 1 0 0 0];


%--------------------------------------
% x,y coordinate algorithm
%-----------------------------------
% Find Glint x,y         %
%------------------------%

% 1. grab image
In8 = vfm('grab');
% monochromise
Out8 = rgb2gray(In8);
        %Doub = double(In8);
        %Av = ((Doub(:,:,1)/3)+(Doub(:,:,2)/3)+(Doub(:,:,3)/3));
        %Out8 = uint8(Av);
Out8 = Out8(resizey,resizex);
%imshow(Out8);

% 2. X-axis filter
filterx = D2gauss(N,Sigma,Theta1);
Xf = conv2(Out8,filterx,'same');
%figure,imagesc(Xf),title('Xf');

% 3. Y-axis filter
filtery = D2gauss(N,Sigma,Theta2);
Yf = conv2(Out8,filtery,'same');
%figure,imagesc(Yf),title('Yf');

% 4. Norm of the gradient (note: this variable shadows MATLAB's norm function)
norm = sqrt(Xf.*Xf+Yf.*Yf);
%figure,imagesc(norm),title('Norm of Gradient');

% 5. Find max & min vals
normmax=max(max(norm));
normmin=min(min(norm));

% 6. set prethresh levels
level = glintprethresh*(normmax-normmin)+normmin;
d = level.*ones(size(norm));
prethresh = max(norm,d);
prethresh = uint8(prethresh);
%figure,imagesc(prethresh),title('prethresh');

% 7. Threshold 
thresh = IPLthreshold(prethresh,glintthresh*level);
%figure, imshow(thresh),title('threshold')

% 8. Fill in gaps
fill = imfill(thresh, 'holes');
%figure, imshow(fill), title('fill') 


% 9. Thin the edge by opening
thin = imopen(fill, s1);
        %thin = imerode(thresh, [s1 s2]);
        %thin = bwareaopen(thresh,4);
%figure, imshow(thin),title('thin')

% 10. Fill in gaps
%fill = imfill(thin, 'holes');
%figure, imshow(fill), title('fill')

% 11. Clear the border objects
clear = imclearborder(thin,bordclearconst);
clear = uint8(clear);
%figure, imshow(clear),title('clear')

% 12. Test for Blink
if clear == 0
    Blink = 1; % reset all the rest of the outputs
    return;
else
    Blink = 0;
end

% 13. Smooth line
smooth = imerode(clear,[s2 s3]);
%figure, imshow(smooth),title('smooth');

% 14. find x,y of glint
counter = 1;
for GX = 1:255
    for GY = 1:255
        if smooth(GY,GX) == 255
            c(counter) = GX;
            v(counter) = GY;
            counter = counter + 1;
        end
    end
end
gx = (sum(c)/(counter-1));
gy = (sum(v)/(counter-1));


% 15. Error checking for glint
if nargin == 4
    gstats = regionprops(smooth,'MajorAxisLength','MinorAxisLength');
    MaxTolg = ErrorToleranceMaxg*CalMaxg;
    MinTolg = ErrorToleranceMing*CalMing;
    MaAg = gstats(255).MajorAxisLength;
    MiAg = gstats(255).MinorAxisLength;
    if MaAg >= MaxTolg
        Error = 1;
        return;
    end
    if MiAg <= MinTolg
        Error = 1;
        return;
    end
    Error = 0;
end


% Optional: Demonstration
% find perimeter
%perim = bwperim(smooth);
%perim = uint8(perim);
%BW8 = immultiply(perim, log2intconst);
%figure, imshow(BW8), title('perim');
% add original and edge detected images
%Segout = imsubtract(Out8, BW8);
%figure, imshow(Segout),title('segout')

%-----------------------%
% find pupil            %
%-----------------------%

% 16. Prethresholding
level = pupilprethresh*(normmax-normmin)+normmin;
d = level.*ones(size(norm));
pprethresh = max(norm,d);
pprethresh = uint8(pprethresh);
%figure,imagesc(pprethresh),title('pprethresh');

% 17. Evaluate to strengthen pupil edge
prethresh = double(prethresh);
pprethresh = double(pprethresh);
z1 = min(min(prethresh));
pt1 = prethresh - z1;
z2 = min(min(pprethresh));
pt2 = pprethresh - z2;
check = imsubtract(pt2, pt1);
%figure, imagesc(check),title('strengthen');

% 18. Thresholding and Border Clear
z3 = max(max(check));
pt = z3*pupilthresh;
check = uint8(check);
pt = uint8(pt);
pthresh = IPLthreshold(check, pt);
%figure, imshow(pthresh), title('pthresh');


prem = imclearborder(pthresh, 4);

% 19. Dilate to ensure pupil edge is joined
prem = imdilate(prem, [s2 s3]);
%figure, imshow(prem), title('dilate');

% 20. Fill holes
pfill = imfill(prem, 'holes');
%figure, imshow(pfill), title('pfill');

% 21. Remove Glint from image and refill gap if in front of pupil
remglint = imsubtract(pfill, thin);
%figure, imshow(remglint), title('remglint');
refill = imfill(remglint, 'holes');

% 22. Erode image back and open 
perode = imerode(refill, [s4 s5]);
figure, imshow(perode), title('perode1');
perode = imopen(perode, s8);
figure, imshow(perode), title('perode3');
perode = imdilate(perode, s1);
figure, imshow(perode), title('perode4');
%perode = imdilate(perode, [s6 s7]); 
%figure, imshow(perode), title('perode5');

% 23. Find X,Y
counter = 1;
c = []; v = []; % reset accumulators left over from the glint search
for ZX = 1:255
    for ZY = 1:255
        if perode(ZY,ZX) == 255
            c(counter) = ZX;
            v(counter) = ZY;
            counter = counter + 1;
        end
    end
end
px = (sum(c)/(counter-1));
py = (sum(v)/(counter-1));

% 24. error checking
if nargin == 4
    pstats = regionprops(perode,'MajorAxisLength','MinorAxisLength');
    MaxTolp = ErrorToleranceMaxp*CalMaxp;
    MinTolp = ErrorToleranceMinp*CalMinp;
    MaAp = pstats(255).MajorAxisLength;
    MiAp = pstats(255).MinorAxisLength;
    if MaAp >= MaxTolp
        Error = 1;
        return;
    end
    if MiAp <= MinTolp
        Error = 1;
        return;
    end
end

% Optional: Demonstration
%find perimeter
%pperim = bwperim(perode);
%pperim = uint8(pperim);
%pBW8 = immultiply(pperim, log2intconst);
% add original and edge detected images
%pSegout = imsubtract(Out8, pBW8);
%figure, imshow(pSegout),title('psegout')

end 

% Filter for edge detection: oriented first-derivative-of-Gaussian kernel
function h = D2gauss(n,sigma,theta)
r = [cos(theta) -sin(theta);
     sin(theta)  cos(theta)];
for i = 1:n
    for j = 1:n
        % rotate the offset from the kernel centre by theta
        u = r * [j-(n+1)/2; i-(n+1)/2];
        h(i,j) = gauss(u(1),sigma) * dgauss(u(2),sigma);
    end
end
% normalise the kernel to unit energy
h = h / sqrt(sum(sum(abs(h).*abs(h))));


% Gauss function
function y = gauss(n,s)
y = exp(-n^2/(2*s^2)) / (s*sqrt(2*pi));


% First-order derivative of the Gauss function
function y = dgauss(u,s)
y = -u * gauss(u,s) / s^2;
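The D2gauss filter above builds an n-by-n kernel as a Gaussian along one rotated axis times the derivative of a Gaussian along the other, then normalises it to unit energy. A pure-Python sketch of the same construction (a port for illustration, not the project's code):

```python
import math

# Sketch of an oriented first-derivative-of-Gaussian kernel, mirroring
# the D2gauss construction: Gaussian along one rotated axis, derivative
# of a Gaussian along the other, normalised to unit energy.
def gauss(u, s):
    return math.exp(-u * u / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

def dgauss(u, s):
    return -u * gauss(u, s) / (s * s)

def d2gauss(n, sigma, theta):
    c, sn = math.cos(theta), math.sin(theta)
    h = [[0.0] * n for _ in range(n)]
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            # rotate the offset from the kernel centre by theta
            u0 = c * (j - (n + 1) / 2) - sn * (i - (n + 1) / 2)
            u1 = sn * (j - (n + 1) / 2) + c * (i - (n + 1) / 2)
            h[i - 1][j - 1] = gauss(u0, sigma) * dgauss(u1, sigma)
    norm = math.sqrt(sum(v * v for row in h for v in row))
    return [[v / norm for v in row] for row in h]

k = d2gauss(5, 1.0, 0.0)
# derivative kernels are antisymmetric, so the coefficients sum to ~0
print(abs(sum(v for row in k for v in row)) < 1e-9)  # -> True
```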

PURKINJE IMAGE DETECTION FLOW CHART


PUPIL DETECTION FLOW CHART
 
CROSS CORRELATION

function [pic5,glint,template,pic,error,find,edge,find2,mult,thresh] = crosscorrelation

% ------------------------------------------------
% Returns [x1,y1,x2,y2] co-ordinates of the template matches within
% the edge-detected image.
%
% Uses normalised cross correlation to match a glint template to the
% edge-detected image.
% ------------------------------------------------

% grab
%y = grab;

% gray
%G = gray(y);

% edge
%edge = IPLedge(G, 'c');
[edge,pic] = loadedge;
figure
imshow(edge)



% Templates
glint = edge(112:121,180:189);
glint2 = edge(119:128,189:197);
%figure;
%Imshow(glint)
%figure;
%Imshow(glint2)
glint3 =[0 0 0 1 1 0 0 0;
         0 0 1 0 0 1 0 0;
         0 1 0 0 0 0 1 0;
         1 0 0 0 0 0 0 1;
         0 1 0 0 0 0 1 0;
         0 0 1 0 0 1 0 0;
         0 0 0 1 1 0 0 0];

% -----------------------------------------
%
% Glint
%
% ----------------------------------------

% (note: the output variable 'find' shadows MATLAB's built-in find function)
find = normxcorr2(glint, edge);
find = find(1:256, 1:256);

% output
%pic5 = im2uint8(edge);
%pic6 = imadd(pic5,pic);
%figure
%imshow(pic6)

% test for error: threshold the scaled correlation surface; if nothing
% survives the threshold, the template was not found

multiply = find*255;
mult = uint8(abs(multiply));
thresh = IPLthreshold(mult, 100);

if all(thresh(:) == 0)
    error = 1;
else
    error = 0;
end

% This section uses a double process, as at the time two Purkinje images
% were being used.
% Outputs x1,y1 of the bottom right hand side of the template.
[findind,vecti] = max(abs(find(:)));
[y1,x1] = ind2sub(size(find),vecti(:));

% Delete maxpoint
find2 = find;
find2(y1,x1) = 0;

% Outputs x2,y2 of the bottom right hand side of the template.
[findind2,vecti2] = max(abs(find2(:)));
[y2,x2] = ind2sub(size(find),vecti2(:));

% recalibrate x,y for centre position
xx1 = x1 - (size(glint,2)-1)/2;
yx1 = y1 - (size(glint,1)-1)/2;
xx2 = x2 - (size(glint,2)-1)/2;
yx2 = y2 - (size(glint,1)-1)/2;
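The peak-picking and recalibration steps above can be sketched in Python: the strongest point of the correlation surface marks the bottom-right corner of the best template match, so half the template size is subtracted to recover the glint centre (the function name and the tiny example surface are mine, for illustration only):

```python
# Hypothetical sketch of peak picking on a 2-D correlation surface,
# then offsetting by half the template size to get the match centre.
def corr_peak_centre(corr, tmpl_h, tmpl_w):
    """Return (x, y) of the match centre from a correlation surface."""
    best, by, bx = -float('inf'), 0, 0
    for y, row in enumerate(corr):
        for x, val in enumerate(row):
            if abs(val) > best:  # strongest (absolute) correlation
                best, by, bx = abs(val), y, x
    return bx - (tmpl_w - 1) // 2, by - (tmpl_h - 1) // 2

corr = [[0.1, 0.2, 0.1],
        [0.2, 0.9, 0.3],
        [0.1, 0.3, 0.2]]
print(corr_peak_centre(corr, tmpl_h=3, tmpl_w=3))  # -> (0, 0)
```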

% function to load edge detected image
function [edge,pic] = loadedge
pic = imread('purkinjemon.bmp');
pic = rgb2gray(pic);
pic = pic(1:256,1:256);
edge = IPLedge(pic, 'c');
edge = bwareaopen(edge,10);

% capture camera still
function [In8] = grab
In8 = vfm('grab');

% grey scale
function [G] = gray(pic)
grayim = rgb2gray(pic);
G = grayim(1:256,1:256);   


SLOW, INACCURATE DETECTION METHOD


% -------------------------------------------------------
%
% Original slow and inaccurate method of finding the glint centre
%
% -------------------------------------------------------

% grabs image from camera
In8 = vfm('grab',2);

% converts to double for the monochrome transform
Doub = double(In8);

% monochrome by averaging the three colour channels
Av = (Doub(:,:,1) + Doub(:,:,2) + Doub(:,:,3)) / 3;
Out8 = uint8(Av);

% cuts image for use with IPL
G = Out8(1:256,1:256);
%Imshow(G);
%figure;

% canny edge detection
edge = IPLedge(G, 'c');
edge(1:256,1:130)=0;

%Imshow(edge);
% now we have to find centre

% starting from the top of circle
topmaxx=0;
topmaxy=0;
x=1;
y=1;
while topmaxx==0
    if edge(y,x)==255
        topmaxx=x
        topmaxy=y
    else x=x+1;
    end 
    if x==255
        x=1;
        y=y+1;
        if y==256
            topmaxx=1
            topmaxy=1
        end 
    end 
end 

% find left most bit of circle...corresponding to right side of pupil
leftmaxx=0;
leftmaxy=0;
x=1;
y=1;
while leftmaxx==0
    if edge(y, x)==255
        leftmaxx=x
        leftmaxy=y
    else y=y+1;
    end 
    if y==256
        y=1;
        x=x+1;
        if x==256
            leftmaxx=1
            leftmaxy=1
        end 
    end 
end 

% find bottom most part of circle
bottommaxx=0;
bottommaxy=0;
x=256;
y=256;

while bottommaxx==0
    if edge(y,x)==255
        bottommaxx=x
        bottommaxy=y
    else x=x-1;
    end 
    if x==1
        x=256;
        y = y - 1;
        if y==0
            bottommaxx=1
            bottommaxy=1
        end 
    end 
end 

% find right most part of circle...corresponding to left side of pupil
rightmaxx=0;
rightmaxy=0;
x=256;
y=256;
while rightmaxx==0
    if edge(y,x)==255
        rightmaxx=x
        rightmaxy=y
    else y=y-1;
    end 
    if y==1
        y=256;
        x = x - 1;
        if x==1
            rightmaxx=1
            rightmaxy=1
        end 
    end 
end 

% now to equate a centre x,y
% check if blink or not
if topmaxx == 1
    blink = 1
else
    centrex = ((rightmaxx - leftmaxx)/2) + leftmaxx
    centrey = ((bottommaxy - topmaxy)/2) + topmaxy
end
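The scanning method above finds the topmost, leftmost, bottommost and rightmost edge pixels and takes the midpoint of that bounding box as the centre. A compact Python sketch of the same idea (the function name and the blink convention returning `None` are mine):

```python
# Sketch of the original bounding-box method: the centre is the midpoint
# between the extreme edge pixels; a frame with no edge pixels at all is
# treated as a blink.
def bbox_centre(edge):
    ys = [y for y, row in enumerate(edge) for v in row if v]
    xs = [x for row in edge for x, v in enumerate(row) if v]
    if not xs:
        return None  # blink: no edge pixels anywhere in the frame
    cx = (max(xs) - min(xs)) / 2 + min(xs)
    cy = (max(ys) - min(ys)) / 2 + min(ys)
    return cx, cy

edge = [[0] * 7 for _ in range(7)]
for x in range(2, 6):               # top and bottom arcs of a "pupil"
    edge[1][x] = edge[4][x] = 1
print(bbox_centre(edge))  # -> (3.5, 2.5)
```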

INVERSION/THRESHOLD METHOD OF PUPIL DETECTION

% This method works sometimes, but not very accurately.

% threshold eye
threshtest = IPLthreshold(Out8, pupilthresh);
%figure, imshow(threshtest)

% invert intensities and open image borders
pupilcomp = imcomplement(threshtest);
pupilopen = imopen(pupilcomp, s1);
%figure, imshow(pupilopen), title('pupilopen');

% remove noise and clear border
pupilo = bwareaopen(pupilcomp, 4);
%figure, imshow(pupilo), title('op');
pupilclr = imclearborder(pupilopen, 4);
%figure, imshow(pupilclr), title('clr');
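The inversion/threshold idea is simple: the pupil is the darkest region of the image, so thresholding and inverting intensities turns it into the brightest blob. A minimal Python sketch (the threshold value and sample intensities are assumptions for illustration):

```python
# Minimal sketch of the inversion/threshold method: mark as 1 every pixel
# darker than the pupil threshold, producing a bright blob where the
# (dark) pupil was.
def threshold_invert(gray, pupil_thresh=60):
    """Binary image: 1 where the pixel is darker than the threshold."""
    return [[1 if p < pupil_thresh else 0 for p in row] for row in gray]

gray = [[200, 190, 210],
        [185,  30, 195],
        [205, 188, 198]]
print(threshold_invert(gray))  # only the dark pupil pixel survives
```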

MATLAB EDGE DETECTION USAGE

function [Segout] = findcentre  
I = imread('purkinje.jpg');
%DI = imadjust(I, [], [0 1]);
BWs = edge(I, 'canny', [graythresh(I) * .35]);
%imshow(BWs)
se90 = [0 0 0; 1 1 1;0 0 0]; 
se0 = [0 1 0; 0 1 0;0 1 0]; 
BWsdil = imdilate(BWs, [se90 se0]);
%figure, imshow(BWsdil)
BWdfill = imfill(BWsdil, 'holes');
%figure, imshow(BWdfill)
BWnobord = imclearborder(BWdfill, 4);
%figure, imshow(BWnobord)
seD = [0 1 0;1 1 1;0 1 0];
BWfinal = imerode(BWnobord,seD);
BWfinal = imerode(BWfinal,seD);
%figure, imshow(BWfinal)
BWoutline = bwperim(BWfinal);
Segout = imadd(I, immultiply(uint8(BWoutline), 255));
figure, imshow(Segout)
stats = regionprops(BWfinal, 'Area', 'Centroid');
area = stats(1).Area;
centre = stats(1).Centroid;
check = BWs(1:90,1:110);
x = centre(1);
y = centre(2);
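The pipeline above leans on morphological operations (dilate to join edges, fill, erode back). As a hedged illustration of one such step, here is a pure-Python binary dilation with a 3x3 cross structuring element, the same shape as the seD element used before erosion (the function name is mine):

```python
# Sketch of binary dilation with a 3x3 cross structuring element: every
# white pixel also turns on its four direct neighbours, growing and
# joining nearby edge fragments.
def dilate_cross(img):
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if img[y][x]:
                for dy, dx in ((0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        out[ny][nx] = 1
    return out

img = [[0, 0, 0],
       [0, 1, 0],
       [0, 0, 0]]
print(dilate_cross(img))  # a single pixel grows into a cross
```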

  

Bharadwaj. Powered by Blogger.