INTRODUCTION
1.1 INTRODUCTION ABOUT THE PROJECT
Blindness is the condition of lacking visual perception due to
neurological or physiological factors. For blind pedestrians, secure mobility
is one of the biggest challenges faced in daily life. According to the World
Health Organization (WHO), in 2012 there were over 285 million visually
impaired people out of a global population of 7 billion; 39 million of them
were totally blind, of whom 19 million were children (below 15 years), and
this number is growing at an alarming rate [1]. Some form of navigation
system is therefore required to assist or guide these people, and much
research is being conducted to build navigation systems for the blind. Most
of these technologies have limitations, since the challenges of accuracy,
interoperability, usability and coverage are not easy to overcome with
current technology for both indoor and outdoor navigation.
One GPS-based technique is "Drishti", which can switch the system from an
indoor to an outdoor environment and vice versa with a simple vocal
command. To provide a complete navigation system, its authors extended the
indoor version of Drishti to an outdoor version for blind pedestrians by
adding only two ultrasonic transceivers, each smaller than a credit card,
tagged to the user's shoulders. The system provides real-time communication
between the user and the mobile client via a headphone, through which the
user can ask for the path, obstacle prompts, and even his or her current
location in familiar or unfamiliar surroundings. Unfortunately, this system
has two limitations. As only two beacons are attached to the user's
shoulders, it is impossible to obtain the user's height: the algorithm
calculates the user's location in two dimensions assuming the average height
of a person, which gives a larger error if the user sits or lies down. The
other limitation is that, because signals are reflected or blocked by walls
and furniture, there are some "dead spots" caused by faulty data reads.
constriction of the visual field, or reduced peak contrast sensitivity with
either of the above conditions. In the United States, the terms "partially
sighted", "low vision", "legally blind" and "totally blind" are used by
schools, colleges, and other educational institutions to describe students
with visual impairments.
Legally blind indicates that a person has less than 20/200 vision in the
better eye after best correction (contact lenses or glasses), or a field of
vision of less than 20 degrees in the better eye; totally blind students
learn via Braille or other non-visual media.
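The 20/200 and 20-degree thresholds stated above can be written down as a small predicate. This is only an illustrative sketch of the definition as quoted, with hypothetical method and parameter names, and is in no way a clinical tool:

```java
class BlindnessCriteria {
    /**
     * Illustrative encoding of the definition above (hypothetical helper).
     * acuityDenominator is the X in "20/X" vision after best correction.
     */
    static boolean isLegallyBlind(int acuityDenominator, double fieldDegrees) {
        boolean lowAcuity = acuityDenominator > 200;  // worse than 20/200
        boolean narrowField = fieldDegrees < 20.0;    // field under 20 degrees
        return lowAcuity || narrowField;
    }
}
```

Either condition alone is sufficient: a person with 20/40 acuity but only a 10-degree field still meets the definition.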
Visual impairment is the consequence of a functional loss of vision, rather
than the eye disorder itself. Eye disorders that can lead to visual
impairment include retinal degeneration, albinism, cataracts, glaucoma,
muscular problems that result in visual disturbances, corneal disorders,
diabetic retinopathy, congenital disorders, and infection.
Visual impairment can also be caused by brain and nerve disorders, in
which case it is usually termed cortical visual impairment (CVI).
Blindness: it refers to a condition where a person suffers from any of
several conditions. In deciding blindness, both visual acuity and field of
vision have been considered [40].
Employment possibilities are stressed for the blind from an early age
because, for persons with and without disabilities, the acquisition and
retention of gainful employment has multiple benefits. Inter alia, it leads
to economic freedom which, in turn, raises one's standard of living,
increases socio-economic contribution and enhances a person's sense of
self-worth. These possibilities are especially crucial to persons with
disabilities, who face limited employment opportunities mainly because of
numerous societal setbacks and interpersonal complications. More than in
many other disability categories, these setbacks and complications are
exacerbated for the blind and visually impaired.
Figure 1a: Worldwide disability-wise statistics of population.
Figure 1a shows that visual impairment (VI) is the most prevalent type of
disability compared to the others, with close to a billion people having a
disability.
Figure 1b: Age-wise population distribution of VI in India.
49% of the disabled people in India are visually impaired. While we dare to
dream of India becoming a world superpower by 2020, we cannot afford to
have such a large segment of the population disadvantaged in any manner.
Imparting quality education to visually impaired citizens is therefore a
necessity, not an option [16].
2. Issues of Visually Impaired People in India
a. Quality of life
Many people with disabilities do not have equal access to health care,
education and employment opportunities. They do not receive the
disability-related services that they require, and they experience exclusion
from everyday life activities. As per the current directives of the United
Nations Convention on the Rights of Persons with Disabilities (CRPD),
disability is increasingly understood as a human rights issue.
b. Job and Employment
countries have developed policies and responses to address the needs of
people with disabilities [40].
3. Education as a solution
"Education must aim at giving the blind child knowledge of the realities
around him, the confidence to cope with these realities, and the feeling
that he is recognized and accepted as an individual in his own right." -
Berthold Lowenfeld.
impaired in India
the publisher. A number of issues still remain, especially in the areas of
copyright and copy protection. In addition to information published in
print, vast amounts of information are available directly on the Internet
and on CD-ROM and DVD. Finally, electronic books (or eBooks) are emerging
in the mainstream market. A recent survey estimated that by 2014 electronic
books would account for as much as 15 per cent of the total American market
for published books.
prices of these technologies will drop.
1.2 PROBLEM DEFINITION
According to the World Health Organization (WHO), 285 million people are
estimated to be visually impaired: 39 million are blind and 246 million have
low vision. Moreover, ninety per cent of this population live in low-income
settings. Visually impaired people depend on a cane, specially trained guide
dogs or, in certain circumstances, a good Samaritan to help them navigate.
There is a variety of obstacle avoidance mechanisms that can be used, such
as electromagnetic tracking devices to detect obstacles, RF (Radio
Frequency) localization, or ultrasonic SONAR (SOund Navigation And Ranging)
sensors. None of these techniques, if used independently, can offer a
complete solution to aid the visually impaired. Even though numerous
prototypes and product designs are available, none of them is economically
feasible for the majority of the blind population.
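The ultrasonic SONAR approach mentioned above works by timing an echo: distance equals the speed of sound times the round-trip time, divided by two (the pulse travels out and back). A minimal sketch, assuming the standard ~343 m/s speed of sound in air at room temperature:

```java
class UltrasonicRange {
    static final double SPEED_OF_SOUND_M_S = 343.0; // in air at ~20 degrees C

    /** Convert a round-trip echo time (seconds) to obstacle distance (metres). */
    static double echoToDistanceMetres(double roundTripSeconds) {
        // divide by 2 because the pulse travels to the obstacle and back
        return SPEED_OF_SOUND_M_S * roundTripSeconds / 2.0;
    }
}
```

For example, a 10 ms echo corresponds to an obstacle roughly 1.7 m away.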
CHAPTER 2
SYSTEM ANALYSIS
2.1 EXISTING SYSTEM
The visually impaired face many challenges in their day-to-day lives. Tasks
that seem trivial to able-bodied people, such as a walk to the park or
social networking with friends, may not be that simple. Most visually
impaired individuals do not have a very high income, so access to resources
is also limited. Research on the blind shows that visually impaired people
who are secure and capable of movement are better adjusted psychologically
and have an easier time gaining employment. Thus the need for an easily
accessible navigation tool comes into the picture. Other research includes
the 'Multimodal Electronic Travel Aid' device, which uses a laser pointer to
gauge the angular depth and position of an object. This mechanism is heavy
on power consumption due to the use of a laser, so the battery life of the
device suffers. Some research focuses on vibration as an output mechanism,
commonly known as vibrotactile feedback. There was also research conducted
to keep astronauts from feeling spatially disoriented in the absence of
gravity; this was used as a prototype to build similar technologies for the
visually impaired.
2.2 PROPOSED SYSTEM
The proposed system aims at a novel approach to designing and developing a
shoe and a portable audio playing device to assist a blind person in moving
on different surfaces and along different paths, by fusing visual sensing
technology, object-finding technology and voice guidance technology. This
system consists of:
• Designing and developing a shoe with multiple depth, obstacle detection
and RGB sensors.
• Designing a control board to detect multiple levels of obstacles and
objects on the ground.
• Developing a sound recording and playing module for voice assistance.
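The fusion of obstacle detection and voice guidance described above can be illustrated as a simple priority rule: safety-critical warnings pre-empt routine surface announcements. The sensor names and spoken messages below are hypothetical, chosen only to show the idea:

```java
class GuidanceLogic {
    /** Pick the voice prompt from current sensor readings (illustrative only). */
    static String selectPrompt(boolean obstacleAhead, boolean dropDetected,
                               String surfaceType) {
        if (dropDetected) return "Caution: step or drop ahead"; // highest priority
        if (obstacleAhead) return "Obstacle ahead";
        return "Walking on " + surfaceType;                     // routine update
    }
}
```

In the real device the inputs would come from the depth, obstacle and RGB sensors on the shoe, and the selected string would be played through the audio module.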
2.3 MODULES
MODULE DESCRIPTION
CURRENT POSITION IDENTIFICATION
VOICE MODULE
In this module we implement a voice-guided system. For blind and visually
impaired people it is almost impossible to be autonomous in the contemporary
world, in which we are completely surrounded by information, but only visual
information.
CHAPTER 3
DEVELOPMENT ENVIRONMENT
3.1 Software Requirements:
Operating System : Windows 7 or Higher
Language : Android, Java
Developing Tool : Eclipse
Backend : MySQL Server
3.2 Hardware Requirements:
Processor : Dual Core
Ram : 2 GB
Hard Disk : 160 GB Space
Overview of Android
Android is a software stack for mobile devices that includes an operating system,
middleware and key applications. The Android SDK provides the tools and APIs
necessary to begin developing applications on the Android platform using the Java
programming language.
- Android is open
- Android is free
- Community support
Brief history:
- July 2005 – Google acquires Android Inc.
- 5 Nov 2007 – Open Handset Alliance announced and the Android platform unveiled
- 12 Nov 2007 – First Android SDK released
- Oct 2008 – First commercial Android device (T-Mobile G1 / HTC Dream) goes on sale
Features
- Media support for common audio, video, and still image formats (MPEG4,
H.264, MP3, AAC, AMR, JPG, PNG, GIF)
- Rich development environment including a device emulator, tools for
debugging, memory and performance profiling, and a plugin for the Eclipse
IDE
Linux Kernel
Android relies on Linux version 2.6 for core system services such as
- security
- memory management
- process management
- network stack
- driver model
- The kernel also acts as an abstraction layer between the hardware and the rest
of the software stack.
Android Runtime
- Android includes a set of core libraries that provides most of the functionality
available in the core libraries of the Java programming language.
- Every Android application runs in its own process, with its own instance of
the Dalvik virtual machine. Dalvik has been written so that a device can run
multiple VMs efficiently.
- The Dalvik VM executes files in the Dalvik Executable (.dex) format which is
optimized for minimal memory footprint.
- The Dalvik VM relies on the Linux kernel for underlying functionality such as
threading and low-level memory management.
Libraries
Android includes a set of C/C++ libraries used by various components of the
Android system. These capabilities are exposed to developers through the Android
application framework.
- System C Library
- Media Library
- Surface Manager
- LibWebCore
- SGL
- 3D libraries
- Free Type
- SQLite
Application Framework
- Developers have full access to the same framework APIs used by the core
applications.
- Views – used to build applications (lists, grid, buttons, text boxes and even
embeddable web browser)
- Android ships with a set of core applications including an email client, SMS
program, calendar, maps, browser, contacts, and others.
Architecture:
- AndroidManifest.xml
the control file-tells the system what to do with the top-level components
- Activities
an object that has a life cycle-is a chunk of code that does some work
- Views
- Intents
- Notifications
- Services
Development Tools
The Android SDK includes a variety of custom tools that help you develop
mobile applications on the Android platform. Three of the most significant
tools are:
1. Android Emulator -A virtual mobile device that runs on our computer -use to
design, debug, and test our applications in an actual Android run-time
environment
2. Android Development Tools Plugin -for the Eclipse IDE - adds powerful
extensions to the Eclipse integrated environment
3. Dalvik Debug Monitor Service (DDMS) -Integrated with Dalvik -this tool
let us manage processes on an emulator and assists in debugging
Lifecycle of activity
OVERVIEW OF XML:
XML's design goals emphasize simplicity, generality, and usability over the
Internet. It is a textual data format, with strong support via Unicode for
the languages of the world. Although XML's design focuses on documents, it
is widely used for the representation of arbitrary data structures, for
example in web services.
Advantages:
The main advantage of XML is that data can be stored and retrieved easily
with its help.
XML Introduction:
MIDP devices have memory constraints when it comes to code, both in terms of the
amount of code you can store on the device, and memory available to applications at
runtime. So, keeping the size of applications and features in check is of paramount
importance to the J2ME developer. That’s where small-sized XML parsers come into
play.
XML parsers
This section describes the XML parsing process and introduces some small XML
parsers for MIDP.
1. XML input processing. In this stage, the application parses and validates
the source document; recognizes and searches for relevant information based
on its location or its tagging in the source document; extracts the relevant
information when it is located; and, optionally, maps and binds the
retrieved information to business objects.
2. Business logic handling. This is the stage in which the actual processing of the
input information takes place. It might result in the generation of output information.
3. XML output processing. In this stage, the application constructs a model of the
document to be generated with the Document Object Model (DOM). It then either
applies XSLT style sheets or directly serializes to XML.
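Stage 1 (input processing) can be sketched with the JDK's built-in DOM parser. The tag names in the sample document are made up for illustration; an actual MIDP device would use a lighter parser such as kXML, but the stages are the same:

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

class XmlStages {
    /** Stage 1: parse the source document and extract one tagged value. */
    static String extract(String xml, String tag) {
        try {
            DocumentBuilder builder =
                    DocumentBuilderFactory.newInstance().newDocumentBuilder();
            Document doc = builder.parse(
                    new ByteArrayInputStream(xml.getBytes("UTF-8")));
            // locate the relevant information by its tagging in the document
            return doc.getElementsByTagName(tag).item(0).getTextContent();
        } catch (Exception e) {
            return null; // parse failure
        }
    }
}
```

Stage 2 would then operate on the extracted value, and stage 3 would build and serialize an output document.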
OVERVIEW OF GPS:
The GPS consists of three parts: the space segment, the control segment, and the user
segment. The U.S. Air Force develops, maintains, and operates the space and control
segments. GPS satellites broadcast signals from space, which each GPS receiver uses
to calculate its three-dimensional location (latitude, longitude, and altitude) plus the
current time.
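The receiver's position is found by intersecting ranges to satellites at known positions. A simplified two-dimensional sketch of the idea follows; real GPS solves for three coordinates plus the receiver clock bias using at least four satellites, so this is only an illustration of the geometry:

```java
class Trilateration2D {
    /**
     * Recover (x, y) from distances d[i] to three anchors at (ax[i], ay[i]).
     * Subtracting the first range equation from the other two yields a
     * 2x2 linear system, solved here by Cramer's rule.
     */
    static double[] locate(double[] ax, double[] ay, double[] d) {
        double a11 = 2 * (ax[1] - ax[0]), a12 = 2 * (ay[1] - ay[0]);
        double a21 = 2 * (ax[2] - ax[0]), a22 = 2 * (ay[2] - ay[0]);
        double b1 = d[0] * d[0] - d[1] * d[1]
                  + ax[1] * ax[1] - ax[0] * ax[0] + ay[1] * ay[1] - ay[0] * ay[0];
        double b2 = d[0] * d[0] - d[2] * d[2]
                  + ax[2] * ax[2] - ax[0] * ax[0] + ay[2] * ay[2] - ay[0] * ay[0];
        double det = a11 * a22 - a12 * a21;
        return new double[] { (b1 * a22 - b2 * a12) / det,
                              (a11 * b2 - a21 * b1) / det };
    }
}
```

With anchors at (0,0), (10,0) and (0,10) and exact distances measured from the point (3,4), the solver recovers (3,4).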
The space segment is composed of 24 to 32 satellites in medium Earth orbit and also
includes the boosters required to launch them into orbit. The control segment is
composed of a master control station, an alternate master control station, and a host
of dedicated and shared ground antennas and monitor stations. The user segment is
composed of hundreds of thousands of U.S. and allied military users of the secure
GPS Precise Positioning Service, and tens of millions of civil, commercial, and
scientific users of the Standard Positioning Service (see GPS navigation devices).
Applications:
GPS has become a widely used aid to navigation worldwide, and a useful tool
for map-making, land surveying, commerce, scientific uses, tracking and
surveillance, and hobbies such as geocaching and waymarking. The precise
time reference provided by GPS is used in many applications, including the
scientific study of earthquakes, and as a time synchronization source for
cellular network protocols.
In addition, GPS has, in the words of the website gps.gov, become a mainstay of
transportation systems worldwide, providing navigation for aviation, ground, and
maritime operations. Disaster relief and emergency services depend upon GPS for
location and timing capabilities in their life-saving missions. The accurate timing
provided by GPS facilitates everyday activities such as banking, mobile phone
operations, and even the control of power grids. Farmers, surveyors, geologists and
countless others perform their work more efficiently, safely, economically, and
accurately using the free and open GPS signals.
Track movement
Depending on the device, there may be several technologies that Android can
use to determine the current location, with different capabilities (power
consumption, monetary cost, accuracy, and the ability to determine altitude,
speed, etc.):
LocationManager.GPS_PROVIDER
LocationManager.NETWORK_PROVIDER
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION"/>
Location location = locationManager.getLastKnownLocation(LocationManager.GPS_PROVIDER);
Location object contains all the position information which can be retrieved
using get methods.
Using Geocoder
Geocoding lets you translate between street addresses and map coordinates.
Reverse Geocoding
try {
    Geocoder geocoder = new Geocoder(this, Locale.getDefault());
    List<Address> addresses = geocoder.getFromLocation(latitude, longitude, 1);
    if (addresses.size() > 0) {
        Address address = addresses.get(0);
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i <= address.getMaxAddressLineIndex(); i++)
            sb.append(address.getAddressLine(i)).append("\n");
        sb.append(address.getLocality()).append("\n");
        sb.append(address.getPostalCode()).append("\n");
        sb.append(address.getCountryName());
        latLongString = sb.toString();
    }
} catch (Exception e) {
    Log.d("Geocoder", e.getMessage());
}
Forward Geocoding
try {
    // streetAddress is a hypothetical variable holding the address to look up
    List<Address> result = geocoder.getFromLocationName(streetAddress, 1);
} catch (IOException e) {
    Log.d("geocoder", e.getMessage());
}
Map views offer full programmatic control of map display (zoom, location
and display modes-satellite, street and traffic views)
MapActivity – base class to create a new Activity that can include Map
View.
CHAPTER 4
SYSTEM IMPLEMENTATION
package com.gstech.proximityexample;
import android.annotation.SuppressLint;
import android.app.Activity;
import android.app.ProgressDialog;
import android.content.Intent;
import android.os.AsyncTask;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.EditText;
import android.widget.ImageButton;
import android.widget.TextView;
import android.widget.Toast;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.Locale;
String recvname="";
String mobile="";
String sendername="";
Connection conn;
EditText edmessage;
String complaint, area, landmark, description, date1, status, diet1, diet2, diag;
Button sendmsg;
ImageButton template;
TextView t1,t2,t3,t4,t5,t6,t7;
String s1,s2;
EditText edt1,edt2;
Button b1,b2;
Text2Speech t2s;
HashMap<String,String> usersList1 = null;
ArrayList<HashMap<String,String>> usersList2 = new ArrayList<HashMap<String,String>>();
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.viewlist);
b1 = (Button)findViewById(R.id.button1);
b1.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
try {
s1 = edt1.getText().toString();
s2 = edt2.getText().toString();
new StatusUpdate().execute();
}
catch (Exception e) {
    // Toast.makeText(getApplicationContext(), e.toString(), Toast.LENGTH_LONG).show();
}
}
});
ProgressDialog pDialog;
Exception error;
String Text="";
ResultSet rs;
@Override
protected void onPreExecute() {
super.onPreExecute();
pDialog.setProgressStyle(ProgressDialog.STYLE_SPINNER);
pDialog.setIndeterminate(false);
pDialog.setCancelable(false);
pDialog.show();
}
@Override
protected Boolean doInBackground(String... args) {
try {
Class.forName("com.mysql.jdbc.Driver");
conn = DriverManager.getConnection(
        "jdbc:mysql://103.10.235.220:3306/indoorgps", "root", "password");
} catch (SQLException se) {
Log.e("ERRO1",se.getMessage());
} catch (ClassNotFoundException e) {
Log.e("ERRO2",e.getMessage());
} catch (Exception e) {
Log.e("ERRO3",e.getMessage());
}
try {
    String COMANDOSQL = "select * from routetable where from1='" + s1
            + "' && to1='" + s2 + "'";
    Statement statement = conn.createStatement();
    rs = statement.executeQuery(COMANDOSQL);
if(rs.next()){
diag = rs.getString(3);
return true;
}
return false;
// Toast.makeText(getBaseContext(), "Successfully Inserted.", Toast.LENGTH_LONG).show();
} catch (Exception e) {
error = e;
return false;
// Toast.makeText(getBaseContext(), "Successfully Registered...", Toast.LENGTH_LONG).show();
}
@SuppressLint("NewApi")
@Override
protected void onPostExecute(Boolean result1) {
pDialog.dismiss();
if(result1)
{
t2s.talk(diag);
t6.setText(diag);
// Toast.makeText(getApplicationContext(), Text, Toast.LENGTH_LONG).show();
// edmessage.clearFocus();
// edmessage.setText("");
}else
{
if(error!=null)
{
// Toast.makeText(getBaseContext(), error.getMessage().toString(), Toast.LENGTH_LONG).show();
}
else
{
Toast.makeText(getApplicationContext(), Text, Toast.LENGTH_SHORT).show();
}
}
super.onPostExecute(result1);
}
}
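A note on the route query above: building SQL by concatenating the user-entered start and destination strings is vulnerable to SQL injection. The standard fix is `java.sql.PreparedStatement` with `?` placeholders (`select * from routetable where from1=? && to1=?`, then `setString(1, s1)` and `setString(2, s2)`). As a self-contained illustration, the sketch below shows a complementary whitelist check; `isSafePlaceName` is a hypothetical helper, not part of the original listing:

```java
class RouteInput {
    /** Accept only plain place names before they reach any SQL string. */
    static boolean isSafePlaceName(String s) {
        // letters, digits, spaces, underscores and hyphens only; rejects quotes
        return s != null && !s.isEmpty() && s.matches("[A-Za-z0-9 _-]+");
    }
}
```

Input validation alone is weaker than parameterized queries, but it is cheap to add in front of the existing code.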
package com.gstech.proximityexample;
import java.io.IOException;
import java.util.Locale;
import android.app.Activity;
import android.app.Dialog;
import android.content.ContentValues;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.location.Location;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.EditText;
import android.widget.Toast;
import com.google.android.gms.maps.model.LatLng;
import com.google.android.gms.maps.model.Marker;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
t2s=new Text2Speech(getApplicationContext(), Locale.UK);
cmap = new customMap(this, getApplicationContext(), new MapClickListener() {
    @Override
    public void onMarkerClicked(double latitude, double longitude, Marker latlng) {
        Toast.makeText(getApplicationContext(), latlng.getTitle(),
                Toast.LENGTH_LONG).show();
    }
    @Override
    public void onMapLongClicked(final double latitude, final double longitude,
            LatLng latlng) {
    }

    @Override
    public void onMapClicked(double latitude, double longitude, LatLng latlng) {
    }
});
openDB(true);
locationTracker locTracker = new locationTracker();
locTracker.startLocationTrack(getApplicationContext(), true, this);
locTracker.startTracking(false);
}
@Override
protected void onDestroy() {
// TODO Auto-generated method stub
super.onDestroy();
openDB(false);
}
@Override
public void locationReceived(Double Latitude, Double Longitude, Location location) {
    myLocation = location;
    Toast.makeText(getApplicationContext(), "loc Received", Toast.LENGTH_LONG).show();
    readLocationDetails();
}
sqliteDB = dbHander.getReadableDatabase();
return true;
} else {
print("Closing DB");
sqliteDB.close();
dbHander.close();
return true;
}
long rc = sqliteDB.insert(tags.db_tb_location, null, args);
System.out.println("$$$$$$$$$$$ rows returnedDB1 :" + rc);
readLocationDetails();
return rc;
cursor.moveToFirst();
if (cursor.getPosition() != -1)
{
while (!cursor.isAfterLast()) {
test.setLatitude(Double.parseDouble(cursor.getString(0)));
test.setLongitude(Double.parseDouble(cursor.getString(1)));
Log.e("MAP", Double.parseDouble(cursor.getString(0)) + "//"
        + Double.parseDouble(cursor.getString(1)) + "//" + cursor.getString(2));
cmap.addMarker(Double.parseDouble(cursor.getString(0)),
        Double.parseDouble(cursor.getString(1)), cursor.getString(2), 2);
String takerMessage = cursor.getString(2);
float distanceInMeters = myLocation.distanceTo(test);
boolean isWithin100m = distanceInMeters < RADIUS;
if (isWithin100m) {
    t2s.talk(takerMessage);
    Toast.makeText(getApplicationContext(),
            "Location Near is " + takerMessage, Toast.LENGTH_LONG).show();
}
cursor.moveToNext();
}
}
return false;
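The proximity check above relies on Android's `Location.distanceTo`, which returns the great-circle distance in metres. Outside Android the same quantity can be approximated with the haversine formula; this sketch assumes a spherical Earth with mean radius 6371 km:

```java
class Haversine {
    static final double EARTH_RADIUS_M = 6_371_000.0; // mean Earth radius

    /** Great-circle distance in metres between two lat/lon points (degrees). */
    static double distance(double lat1, double lon1, double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return EARTH_RADIUS_M * 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
    }
}
```

One degree of latitude comes out at roughly 111.2 km, which is the familiar sanity check for this formula.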
import java.io.IOException;
import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.graphics.Color;
import android.os.Bundle;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.EditText;
import android.widget.TextView;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.login);
myContext = getApplicationContext();
openDB(true);
@Override
public void onClick(View arg0) {
// TODO Auto-generated method stub
switch (arg0.getId()) {
case R.id.login_btn_login:
status.setText("");
if (login_verify()) {
} else {
}
break;
case R.id.login_btn_register:
break;
case R.id.button1:
break;
}
}
@Override
protected void onResume() {
// TODO Auto-generated method stub
super.onResume();
username.setText("");
password.setText("");
}
@Override
protected void onDestroy() {
// TODO Auto-generated method stub
super.onDestroy();
openDB(false);
}
cursor.moveToFirst();
if (cursor.getPosition() != -1)
{
while (!cursor.isAfterLast()) {
// if (userName.equalsIgnoreCase(cursor.getString(0)) &&
//         passWord.equalsIgnoreCase(cursor.getString(1)))
if (userName.equalsIgnoreCase("admin") && passWord.equalsIgnoreCase("admin"))
{
print("User Exists " + userName);
return true;
}
cursor.moveToNext();
}
}
return false;
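The login check above compares hard-coded plaintext credentials ("admin"/"admin"). A more defensible pattern stores only a hash of the password; a minimal sketch using the JDK's `MessageDigest` follows. A production system would additionally use a per-user salt and a slow key-derivation function such as PBKDF2; the class and method names here are hypothetical:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

class PasswordCheck {
    /** SHA-256 of the input, as lowercase hex. */
    static String sha256Hex(String s) {
        try {
            byte[] h = MessageDigest.getInstance("SHA-256")
                    .digest(s.getBytes(StandardCharsets.UTF_8));
            StringBuilder sb = new StringBuilder();
            for (byte b : h) sb.append(String.format("%02x", b));
            return sb.toString();
        } catch (Exception e) {
            throw new IllegalStateException(e); // SHA-256 is always available
        }
    }

    /** Compare a candidate password against a stored hash. */
    static boolean verify(String candidate, String storedHash) {
        return sha256Hex(candidate).equals(storedHash);
    }
}
```

The database would then hold only the hex digest, so a leaked table does not directly reveal passwords.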
sqliteDB = dbHander.getReadableDatabase();
return true;
} else {
print("Closing DB");
sqliteDB.close();
dbHander.close();
return true;
}
package com.gstech.proximityexample;
import com.google.android.gms.maps.model.LatLng;
import com.google.android.gms.maps.model.Marker;
package com.gstech.proximityexample;
import android.app.Activity;
import android.content.Context;
import android.util.Log;
import android.view.View;
import android.widget.Toast;
import com.google.android.gms.maps.CameraUpdateFactory;
import com.google.android.gms.maps.GoogleMap;
import com.google.android.gms.maps.GoogleMap.OnMapClickListener;
import com.google.android.gms.maps.GoogleMap.OnMapLongClickListener;
import com.google.android.gms.maps.GoogleMap.OnMarkerClickListener;
import com.google.android.gms.maps.MapFragment;
import com.google.android.gms.maps.model.BitmapDescriptorFactory;
import com.google.android.gms.maps.model.CameraPosition;
import com.google.android.gms.maps.model.LatLng;
import com.google.android.gms.maps.model.Marker;
import com.google.android.gms.maps.model.MarkerOptions;
super(context);
this.activity=activity;
this.context=context;
this.mapClickListener=mapClickListener;
// TODO Auto-generated constructor stub
try {
// Loading map
initilizeMap();
Log.e("Gopi","10");
double latitude = 17.385044;
double longitude = 78.486671;
// Adding a marker
MarkerOptions marker = new MarkerOptions()
        .position(new LatLng(randomLocation[0], randomLocation[1]))
        .title("Hello Maps " + i);
} catch (Exception e) {
e.printStackTrace();
}
}
/**
* @param latitude
* @param longitude
* @param title
* @param colour 0-9
*/
public void addMarker(double latitude, double longitude, String title, int colour)
{
    MarkerOptions marker = new MarkerOptions()
            .position(new LatLng(latitude, longitude))
            .title(title);
    // map the colour index 0-9 to a marker hue
    float[] hues = {
            BitmapDescriptorFactory.HUE_AZURE,   BitmapDescriptorFactory.HUE_BLUE,
            BitmapDescriptorFactory.HUE_CYAN,    BitmapDescriptorFactory.HUE_GREEN,
            BitmapDescriptorFactory.HUE_MAGENTA, BitmapDescriptorFactory.HUE_ORANGE,
            BitmapDescriptorFactory.HUE_RED,     BitmapDescriptorFactory.HUE_ROSE,
            BitmapDescriptorFactory.HUE_VIOLET,  BitmapDescriptorFactory.HUE_YELLOW };
    if (colour >= 0 && colour <= 9)
        marker.icon(BitmapDescriptorFactory.defaultMarker(hues[colour]));
googleMap.animateCamera(CameraUpdateFactory
.newCameraPosition(cameraPosition));
googleMap.addMarker(marker);
/**
 * Loads the map. If the map is not created, it will be created.
 */
private void initilizeMap() {
    if (googleMap == null) {
        googleMap = ((MapFragment) activity.getFragmentManager()
                .findFragmentById(R.id.map)).getMap();
// check if map is created successfully or not
if (googleMap == null) {
Toast.makeText(context, "Sorry! unable to create maps", Toast.LENGTH_SHORT).show();
}
else
{
// Changing map type
googleMap.setMapType(GoogleMap.MAP_TYPE_NORMAL);
// googleMap.setMapType(GoogleMap.MAP_TYPE_HYBRID);
// googleMap.setMapType(GoogleMap.MAP_TYPE_SATELLITE);
// googleMap.setMapType(GoogleMap.MAP_TYPE_TERRAIN);
// googleMap.setMapType(GoogleMap.MAP_TYPE_NONE);
// Showing / hiding your current location
googleMap.setMyLocationEnabled(true);
// Enable / Disable zooming controls
googleMap.getUiSettings().setZoomControlsEnabled(false);
// Enable / Disable my location button
googleMap.getUiSettings().setMyLocationButtonEnabled(true);
// Enable / Disable Compass icon
googleMap.getUiSettings().setCompassEnabled(true);
// Enable / Disable Rotate gesture
googleMap.getUiSettings().setRotateGesturesEnabled(true);
// Enable / Disable zooming functionality
googleMap.getUiSettings().setZoomGesturesEnabled(true);
googleMap.setOnMapClickListener(new OnMapClickListener() {
    @Override
    public void onMapClick(LatLng arg0) {
        mapClickListener.onMapClicked(arg0.latitude, arg0.longitude, arg0);
    }
});
googleMap.setOnMapLongClickListener(new OnMapLongClickListener() {
    @Override
    public void onMapLongClick(LatLng arg0) {
        mapClickListener.onMapLongClicked(arg0.latitude, arg0.longitude, arg0);
    }
});
googleMap.setOnMarkerClickListener(new OnMarkerClickListener() {
    @Override
    public boolean onMarkerClick(Marker arg0) {
        mapClickListener.onMarkerClicked(arg0.getPosition().latitude,
                arg0.getPosition().longitude, arg0);
        return true;
    }
});
}
}
}
/*
 * Creates a random position around a location, for testing purposes only.
 */
private double[] createRandLocation(double latitude, double longitude) {
}
package com.gstech.proximityexample;
import java.util.Locale;
import android.content.Context;
import android.speech.tts.TextToSpeech;
import android.widget.Toast;
Toast.makeText(context, text, Toast.LENGTH_SHORT).show();
text2Speech.speak(text, TextToSpeech.QUEUE_FLUSH, null);
}
}
package com.gstech.proximityexample;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import android.content.Context;
import android.database.SQLException;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteException;
import android.database.sqlite.SQLiteOpenHelper;
/**
 * Constructor.
 * Keeps a reference to the passed context in order to access the
 * application's assets and resources.
 * @param context
 */
public DataBaseHandler(Context context) {
/**
 * Creates an empty database on the system and rewrites it with your own database.
 */
public void createDataBase() throws IOException {
    boolean dbExist = checkDataBase();
    if (dbExist) {
        // do nothing - database already exists
    } else {
        try {
            copyDataBase();
        } catch (IOException e) {
        }
    }
}
/**
 * Checks if the database already exists, to avoid re-copying the file
 * each time you open the application.
 * @return true if it exists, false if it doesn't
 */
private boolean checkDataBase() {
    SQLiteDatabase checkDB = null;
    try {
        String myPath = DB_PATH + DB_NAME;
        checkDB = SQLiteDatabase.openDatabase(myPath, null,
                SQLiteDatabase.OPEN_READONLY);
    } catch (SQLiteException e) {
        // database does not exist yet
    }
    if (checkDB != null) {
        checkDB.close();
    }
    return checkDB != null;
}
/**
 * Copies your database from your local assets folder to the just-created
 * empty database in the system folder, from where it can be accessed and
 * handled. This is done by transferring a byte stream.
 */
private void copyDataBase() throws IOException {
    // myContext: the context kept by the constructor above
    InputStream myInput = myContext.getAssets().open(DB_NAME);
    OutputStream myOutput = new FileOutputStream(DB_PATH + DB_NAME);
    byte[] buffer = new byte[1024];
    int length;
    while ((length = myInput.read(buffer)) > 0) {
        myOutput.write(buffer, 0, length);
    }
    myOutput.flush();
    myOutput.close();
    myInput.close();
}

@Override
public synchronized void close() {
    if (myDataBase != null)
        myDataBase.close();
    super.close();
}
@Override
55
public void onCreate(SQLiteDatabase db) {
@Override
public void onUpgrade(SQLiteDatabase db, int oldVersion, int
newVersion) {
package com.gstech.proximityexample;

import android.app.Activity;
import android.content.Context;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;

// Log the received fix, then forward it to the calling activity.
System.out.println("GopiL Latitude : " + location.getLatitude()
        + " Longitude : " + location.getLongitude()
        + " Accuracy : " + location.getAccuracy()
        + " Altitude : " + location.getAltitude());
((LocationReceiver) callingActivityGbl).locationReceived(
        location.getLatitude(), location.getLongitude(), location);

System.out.println("Gopi " + provider + status);
}

// Request location updates every INTERVAL minutes, with no minimum
// distance filter.
locationManager.requestLocationUpdates(locationProvider,
        INTERVAL * 60 * 1000, 0, locationListener);
}
CHAPTER 4
ARCHITECTURE DESIGN
4.1 ARCHITECTURE DIAGRAM
Different obstacle sensors are available in the market, such as sonar,
ultrasonic, and IR sensors. The proposed system uses IR sensors because they
are highly directional and cheaper than the alternatives, so the system can
easily determine the direction of an obstacle.
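To make the direction idea concrete, here is a minimal hypothetical sketch of mapping three directional IR readings to an obstacle direction. The sensor layout, the threshold value, and all names are illustrative assumptions, not the project's actual firmware:

```java
// Hypothetical sketch: deciding obstacle direction from three directional
// IR sensors. The 0-1023 readings and the threshold are assumed values.
public class IrDirection {
    static final int THRESHOLD = 500; // assumed ADC level for "obstacle present"

    // Returns a spoken-style direction given left/centre/right IR readings.
    public static String direction(int left, int centre, int right) {
        if (centre > THRESHOLD) return "obstacle ahead";
        if (left > THRESHOLD)   return "obstacle on the left";
        if (right > THRESHOLD)  return "obstacle on the right";
        return "path clear";
    }

    public static void main(String[] args) {
        // A strong centre reading dominates the left/right readings.
        System.out.println(direction(100, 900, 100));
    }
}
```

Because each IR sensor covers a narrow cone, a simple per-sensor threshold like this is enough to separate the three directions; a single wide-beam sensor could not make that distinction.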
RGB SENSOR:
The RGB sensor detects obstacles based on the red, green, and blue color
level intensities of the light reflected from the obstacle's surface. The
sensor is mounted on the shoe, at the front, facing the ground. Its output,
three separate color intensity values, is given to the microcontroller.
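As a rough illustration of how the microcontroller might interpret the three intensity values, the sketch below thresholds them into coarse surface labels. The ranges, labels, and class name are assumptions for illustration, not calibrated values from the actual sensor:

```java
// Hypothetical sketch: mapping 0-255 RGB intensities from the shoe-mounted
// sensor to a coarse surface label. All thresholds are assumed.
public class RgbSurface {
    public static String classify(int r, int g, int b) {
        if (r > 200 && g > 200 && b > 200) return "bright surface";
        if (r < 40 && g < 40 && b < 40)    return "dark surface";
        if (r >= g && r >= b)              return "reddish surface";
        if (g >= r && g >= b)              return "greenish surface";
        return "bluish surface";
    }
}
```

In the real system the label would be converted to an audio cue for the user rather than a string.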
CHAPTER 6
SYSTEM TESTING
The purpose of testing is to ensure that the software system meets its
requirements and user expectations and does not fail in an unacceptable
manner. There are various types of tests; each test type addresses a specific
testing requirement.
TYPES OF TESTS
Unit testing
Unit testing involves designing test cases that validate that the internal
program logic functions properly and that program inputs produce valid
outputs. All decision branches and internal code flow should be validated.
It is the testing of individual software units of the application and is done
after the completion of an individual unit, before integration. This is
structural testing that relies on knowledge of the unit's construction and is
invasive. Unit tests perform basic tests at component level and exercise a
specific business process, application, and/or system configuration. They
ensure that each unique path of a business process performs accurately to the
documented specifications and contains clearly defined inputs and expected
results.
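As a minimal sketch of such a unit test, the coordinate validator below is a hypothetical stand-in for one of the application's input checks, not code from the project; the test exercises each decision branch with defined inputs and expected results:

```java
// Hypothetical unit under test: a simple latitude/longitude entry check,
// with a plain-Java test covering each decision branch.
public class CoordinateValidatorTest {
    static boolean isValid(double lat, double lon) {
        return lat >= -90 && lat <= 90 && lon >= -180 && lon <= 180;
    }

    public static void main(String[] args) {
        // One case per branch: valid input, bad latitude, bad longitude.
        check(isValid(21.15, 79.09), "valid entry accepted");
        check(!isValid(95.0, 79.09), "out-of-range latitude rejected");
        check(!isValid(21.15, 200.0), "out-of-range longitude rejected");
        System.out.println("all unit tests passed");
    }

    static void check(boolean condition, String name) {
        if (!condition) throw new AssertionError(name);
    }
}
```

In practice such cases would be written with a test framework such as JUnit, but the structure is the same: fixed inputs, a call to the unit, and an asserted expected result.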
Integration testing
Functional test
System Test
System testing ensures that the entire integrated software system meets
requirements. It tests a configuration to ensure known and predictable
results. An example of system testing is the configuration-oriented system
integration test. System testing is based on process descriptions and flows,
emphasizing pre-driven process links and integration points.
under test is treated as a black box: you cannot "see" into it. The test
provides inputs and responds to outputs without considering how the software
works. Unit testing is usually conducted as part of a combined code and unit
test phase of the software lifecycle, although it is not uncommon for coding
and unit testing to be conducted as two distinct phases.
Test objectives
All field entries must work properly.
Pages must be activated from the identified link.
The entry screen, messages and responses must not be delayed.
Features to be tested
Verify that the entries are of the correct format
No duplicate entries should be allowed
All links should take the user to the correct page.
6.2 Integration Testing
Test Results: All the test cases mentioned above passed successfully. No
defects encountered.
In this proposed system, more attention is paid to sensor fusion, seamless
switching between indoor and outdoor navigation, route announcement, and
minimizing the amount of infrastructure required for localizing the user. The
overall aim is to design and construct a portable, simple, low-cost device
that helps visually impaired people move even in unfamiliar environments. The
proposed system is designed to be useful for all ages and user-friendly, and
it requires no prior training or knowledge of advanced technologies. The
primary objective is a system that is cost-effective and easy to handle, even
for a visually impaired illiterate person.
LIMITATIONS OF THE SYSTEM
The biggest limitation of this method is that it does not work under all
lighting conditions: the lighting must be good for objects to be detected.
Secondly, the accuracy of detecting an object is higher with an ultrasonic
sensor than with an infrared sensor.
FUTURE ENHANCEMENT
Our project can be a good platform for someone who would like to start
production of these navigation systems. We also have some ideas for future
research and enhancements to our system. First, a number of wearable
technologies are in their nascent stage; if proven successful, they could be
used to stitch all our circuits into the user's clothes. During our survey,
we came across a common request that users do not want to stand out in a
crowd; with wearable technology, the product could have a designer appeal,
something the users highly desire. Second, a study should be done to improve
the accuracy of indoor GPS. Third, the system can be further enhanced with
piezoelectric sensors that detect capacitance and warn the user of changing
terrain conditions such as black ice, water, or an oil spill. Fourth, the
overhead branch-detection sensors can be incorporated into sunglasses to
provide an alternative to the cap. Fifth, the SONAR sensors should be
improved to warn the user with speech feedback of an approaching target such
as a cyclist. In the current version of the system, we do not consider
targets approaching faster than walking speed; moreover, a cyclist may be
unaware that the user is blind and expect the user to yield way. The system
should be capable of warning the user as well as activating an emergency
notification to the cyclist.
APPENDIX
REFERENCES
[1] Chaitali K. Lakde, Dr. Prakash S. Prasad, "Review Paper on Navigation
System for Visually Impaired People", International Journal of Advanced
Research in Computer and Communication Engineering, Vol. 4, Issue 1,
January 2015.
[2] N. Mahmud, R. K. Saha, R. B. Zafar, M. B. H. Bhuian, and S. S. Sarwar,
"Vibration and Voice Operated Navigation System for Visually Impaired
Person", 3rd International Conference on Informatics, Electronics &
Vision, 2014.
[3] B. B. Blasch, W. R. Wiener, and R. L. Welsh, "Foundations of
Orientation and Mobility", 2nd ed., New York: AFB Press, 1997.
[4] D. Jain, M. Balakrishnan, and P. V. M. Rao, "Roshni: Indoor Navigation
System for Visually Impaired".
[5] R. Tapu, B. Mocanu, T. Zaharia, "Real time static/dynamic obstacle
detection for visually impaired persons", IEEE International Conference
on Consumer Electronics (ICCE), 978-1-4799-2191-9/14, pp. 394-395, 2014.
[6] V. Kulyukin, C. Gharpure, J. Nicholson, S. Pavithran, "RFID in Robot-
Assisted Indoor Navigation for the Visually Impaired", Proceedings of the
2004 IEEE/RSJ International Conference on Intelligent Robots and Systems,
September 28 - October 2, 2004, Sendai, Japan.
[7] Lisa Ran, Sumi Helal and Steve Moore, "Drishti: An Integrated
Indoor/Outdoor Blind Navigation System and Service", Proceedings of the
Second IEEE Annual Conference on Pervasive Computing and Communications,
IEEE, 2004.
[8] Arjun Sharma, Rahul Patidar, Shubham Mandovara, Ishwar Rathod, "Blind
Audio Guidance System", International Journal of Emerging Technology and
Advanced Engineering, Volume 3, January 2013, pp. 17-19.
[9] Shraga Shoval, Johann Borenstein, and Yoram Koren, "The NavBelt - A
Computerized Travel Aid for the Blind Based on Mobile Robotics Technology",
IEEE Transactions on Biomedical Engineering, Vol. 45, No. 11, pp. 1376-1386,
1998.
[10] A. Aladren, G. Lopez-Nicolas, Luis Puig, and Josechu J. Guerrero,
"Navigation Assistance for the Visually Impaired Using RGB-D Sensor With
Range Expansion", IEEE, 2014.
[11] Mounir Bousbia-Salah, Abdelghani Redjati, Mohamed Fezari, Maamar
Bettayeb, "An Ultrasonic Navigation System for Blind People", IEEE
International Conference on Signal Processing and Communications (ICSPC
2007), Dubai, 24-27 November 2007, pp. 1003-1006.