
CHAPTER 1

INTRODUCTION
1.1 INTRODUCTION ABOUT THE PROJECT
Blindness is the condition of lacking visual perception due to
neurological or physiological factors. For blind pedestrians, secure mobility is
one of the biggest challenges of daily life. According to the World Health
Organization (WHO), in 2012 there were over 285 million visually impaired
people in a global population of 7 billion, of whom 39 million were totally
blind, including 19 million children (below 15 years), and this number is
growing at an alarming rate [1]. Some form of navigation system is therefore
required to assist or guide these people. Much research is being conducted on
building navigation systems for blind people, but most of these technologies
face limitations in accuracy, interoperability, usability and coverage that are
not easy to overcome with current technology, for both indoor and outdoor
navigation.
One GPS-based technique is "Drishti", which can switch the system from an
indoor to an outdoor environment and vice versa with a simple vocal
command. To provide a complete navigation system, its authors extended the
indoor version of Drishti to an outdoor version for blind pedestrians by adding
only two ultrasonic transceivers, smaller than a credit card, tagged to the
user's shoulders. The system provides real-time communication between the user
and the mobile client via a headphone, through which the user can ask for the
path, obstacle prompts, and even his or her current location in familiar or
unfamiliar surroundings. Unfortunately, this system has two limitations. With
only two beacons attached to the user's shoulders, it is impossible to obtain
the height data of the user: the algorithm calculates the user's location in
two dimensions assuming the average height of a person, which gives a larger
error if the user sits or lies down. The other limitation is that, because of
signal reflection or blocking by walls and furniture, there are some "dead
spots" caused by bad or faulty data readings.

World Health Organization (WHO)

 Disability is an umbrella term, covering impairments, activity
limitations and participation restrictions. An impairment is a problem
in body function or structure; an activity limitation is a difficulty
encountered by an individual in executing a task or action; while a
participation restriction is a problem experienced by an individual
in involvement in life situations.

 Thus, disability is a complex phenomenon, reflecting an
interaction between features of a person's body and features of the
society in which he or she lives [168].

People with a disability may include:

 Blind or partially sighted
 Learning or intellectual disabilities
 Deaf or hearing impaired
 Physical disability
 Long-term illnesses
 Psychological difficulties
 Acquired brain injury

Statistics Related to Disability


(i) World-Wide

Disability affects hundreds of millions of families in developing
countries. Currently, around 10 per cent of the world's total population, or
roughly 650 million people, live with a disability, making them the world's
largest minority group. As the population ages, this figure is expected to
increase. According to the United Nations Development Programme (UNDP),
eighty per cent of persons with disabilities live in developing countries.

(ii) India

According to a survey conducted by the World Bank in 2007, there is
growing evidence that people with disabilities comprise between 4 and 8
per cent of the Indian population (around 40-90 million individuals). The
report, prepared at the request of the Government of India, explores the
social and economic situation of this sizeable group.
Definition of Visual Impairment
Visual impairment (or vision impairment) is vision loss of a person to
such a degree as to qualify as an additional support need, through a
significant limitation of visual capability resulting from disease,
trauma, or congenital or degenerative conditions that cannot be
corrected by conventional means such as refractive correction or
medication.

This functional loss of vision is typically defined to manifest as a best
corrected visual acuity of less than 20/60; a significant central field
defect; a significant peripheral field defect, including homonymous or
heteronymous bilateral visual field defect or generalized contraction or
constriction of the field; or reduced peak contrast sensitivity combined
with any of the above conditions. In the United States, the terms
"partially sighted", "low vision", "legally blind" and "totally blind" are
used by schools, colleges, and other educational institutions to describe
students with visual impairments.

Different types of visual impairment are defined as follows:

Partially sighted indicates some type of visual problem that, in some
cases, requires the person to receive special education;

Low vision generally refers to a severe visual impairment, not
necessarily limited to distance vision. Low vision applies to all
individuals with sight who are unable to read a newspaper at a normal
viewing distance, even with the aid of eyeglasses or contact lenses. They
use a combination of vision and other senses to learn, although they may
require adaptations in lighting or the size of print, and, sometimes,
Braille;

Myopic - unable to see distant objects clearly, commonly called near-
sighted or short-sighted;

Hyperopic - unable to see close objects clearly, commonly called far-
sighted or long-sighted;

Legally blind indicates that a person has less than 20/200 vision in the
better eye after best correction (contact lenses or glasses), or a field of
vision of less than 20 degrees in the better eye; totally blind students
learn via Braille or other non-visual media.

Visual impairment is the consequence of a functional loss of vision, rather
than of the eye disorder itself. Eye disorders which can lead to visual
impairment include retinal degeneration, albinism, cataracts, glaucoma,
muscular problems that result in visual disturbances, corneal disorders,
diabetic retinopathy, congenital disorders, and infection.
Visual impairment can also be caused by brain and nerve disorders, in
which case it is usually termed cortical visual impairment (CVI).

The American Medical Association's Guides to the Evaluation of
Permanent Impairment attempt to provide "a standardized, objective
approach to evaluating medical impairments." The Visual System
chapter "provides criteria for evaluating permanent impairment of the
visual system as it affects an individual's ability to perform activities
of daily living." The Guides estimate that the loss of one eye equals
25% impairment of the visual system and 24% impairment of the whole
person; total loss of vision in both eyes is considered 100% visual
impairment and 85% impairment of the whole person.

In India, the broad definition of visual impairment, as adopted in the
Persons with Disabilities (Equal Opportunities, Protection of Rights and
Full Participation) Act, 1995, as well as under the National Programme for
Control of Blindness (NPCB), is given below:

Blindness: It refers to a condition where a person suffers from any of the
following conditions, namely:

o Total absence of sight; or
o Visual acuity not exceeding 6/60 or 20/200 in the better eye even
with correcting lenses; or
o Limitation of the field of vision subtending an angle of 20 degrees or
worse.

For deciding blindness, both the visual acuity and the field of vision
have been considered [40].

d. Statistics of Visually Impaired People in India and Worldwide

According to worldwide data on visual impairment from the WHO, India is now
home to the world's largest number of blind people. Of the 37 million
people across the globe who are blind, over 15 million are from India.

The multi-faceted process of providing special education and
rehabilitation services for children and individuals who are blind or
visually impaired embodies myriad tangible and intangible benefits.
Ultimately, however, it is important, indeed sometimes imperative,
that, beyond the consumption benefits of education, this provision leads
to employment of persons who are blind or visually impaired.

Employment possibilities are stressed for the blind from an early age
because, for persons with and without disabilities, the acquisition
and retention of gainful employment has multiple benefits. Inter alia, it
leads to economic freedom which, in turn, raises one's standard of
living, increases socio-economic contribution and enhances a person's
sense of self-worth. These possibilities are especially crucial to persons
with disabilities, who face limited employment opportunities mainly
because of numerous societal setbacks and interpersonal complications.
More than for many other disability categories, these setbacks and
complications are exacerbated for the blind and visually impaired.

The following table shows statistics of the population with various
disabilities in the world:

Table 1: Worldwide Disability-Wise Statistics of Population

S. No.   Type of Disability   Number       Percentage (%)
1        Seeing               10,634,881   48.55
2        Speech               1,640,868    7.49
3        Hearing              1,261,722    5.76
4        Movement             6,105,477    27.87
5        Mental               2,263,821    10.33
6        Total                21,906,769   100.00

Figure 1a: Worldwide disability-wise statistics of population.

Figure 1a shows that Visual Impairment (VI) is the most prevalent type
of disability as compared to the others, accounting for nearly half of
the disabled population.

Figure 1b: Age-Wise Population Distribution of VI in India.

About 284 million people are visually impaired (VI) worldwide: 39
million are blind and 245 million have low vision. About 90% of the
world's visually impaired live in developing countries.
The statistics show that visual disability is the most predominant
disability of all.

49% of the disabled people in India are visually impaired. While we dare
to dream of India becoming a world super power by 2020, we cannot
afford to have such a large segment of the population disadvantaged in any
manner. Thus, imparting quality education to visually impaired citizens is
a necessity and not an option [16].
2. Issues of Visually Impaired People in India

a. Quality of life

Many people with disabilities do not have equal access to health care,
education and employment opportunities. They do not receive the
disability-related services that they require, and they experience
exclusion from everyday life activities. In line with the United Nations
Convention on the Rights of Persons with Disabilities (CRPD), disability
is increasingly understood as a human rights issue.
b. Job and Employment

Physical and mental impairments are compounded by poor education
outcomes. Children with disabilities (CWD) have very high out-of-school
rates compared to other children. As for any other group, education is
critical to expanding the life prospects of people with disabilities.
Disabled people have much lower educational attainment rates, with 52
percent illiteracy against a 35 percent average for the general population.
Illiteracy is high across all categories of disability, and extremely so for
children with visual, multiple and mental disabilities (and for
severely disabled children of all categories). Disabled people also
have significantly lower employment rates than average, and this gap
has been increasing over the past 15 years. Low educational attainment,
poor employment prospects and stigma mean that persons with disabilities
(PWD) and their households are notably worse off than average [40].

Evidence shows that persons with disabilities experience worse
socioeconomic outcomes and greater poverty than persons without
disabilities. Despite the magnitude of the issue, both awareness of and
scientific information on disability issues are lacking. There is no
agreement on definitions and little internationally comparable
information on the incidence, distribution and trends of disability. There
are few documents providing a compilation and analysis of the ways
countries have developed policies and responses to address the needs of
people with disabilities [40].

3. Education as a solution

a. Current Education Status

“Education must aim at giving the blind child knowledge of the realities
around him, the confidence to cope with these realities, and the feeling
that he is recognized and accepted as an individual in his own right.” -
Berthold Lowenfeld.

International Initiatives in Support of Inclusive Education

Inclusive education has evolved as a movement to challenge exclusionary
policies and practices, and has gained ground over the past decade to
become a favoured approach in addressing the learning needs
of all students in regular schools and classrooms. International
initiatives from the United Nations Educational, Scientific and Cultural
Organization (UNESCO), the World Bank and elsewhere jointly add up to a
growing consensus that all children have the right to be educated together,
regardless of their disability or learning difficulty, and that inclusion
makes good educational and social sense (UNESCO, 1998). As mentioned
earlier, evidence shows that persons with disabilities experience worse
socioeconomic outcomes and poverty as compared to persons without
disabilities.

b. Role of technology for education of the visually impaired in India

That the computer can be used as an intelligent interface between the
visually impaired and the sighted is not new. For years, the visually
impaired have been able to command the user interfaces of computers
using screen readers, speech synthesis, Braille displays and screen
magnification systems. Furthermore, the visually impaired have
access to the vast majority of business applications, personal
productivity tools, office applications, email systems and web
browsers. Using enabling technologies in combination with general-
purpose computer systems, the blind and partially sighted have been
able to transform information from formats aimed at the sighted into
formats more suitable to their needs. Enabling technologies for such
automatic transformation include Braille translation systems, screen
magnification systems and text-to-speech engines.

In the mid-1980s, the capture of information was one of the key
challenges in making information available in Braille and other suitable
formats. Word processing and desktop publishing were not used as
widely as is the case today. Consequently, most information was available
only in print. The solution to this problem turned out to be scanning,
Optical Character Recognition (OCR) processing, or simply manually
retyping and storing the printed text on a computer.

Today, the situation has changed dramatically. All information is
produced electronically and is, at least in theory, available directly from
the publisher. A number of issues still remain, especially in the areas of
copyright and copy protection. In addition to information published in
print, vast amounts of information are available directly on the Internet
and on CD-ROM and DVD. Finally, electronic books (or eBooks) are
emerging in the mainstream market. A recent survey estimated that by
2014, electronic books would account for as much as 15 per cent of the
total American market for published books.

c. Role of technology in education and employability for the visually
impaired

Enabling technologies are becoming mainstream. Although enabling
technologies developed specially for people with special needs
represent powerful tools, they are typically developed using
proprietary technology. Their limited markets result in relatively high
prices combined with rather low quality. The manufacturers cannot be
blamed for this: the high price/low quality relation is simply a
reflection of the market conditions.

However, in recent years many technologies that used to be utilized more
or less exclusively as enabling technologies have been adopted by the
mainstream market. The result could well be high-quality, low-price
products based on industry and/or de facto standards. Examples of
technologies being adopted by the mainstream market are speech
synthesis and voice recognition. As talking computers, voice-controlled
computer interfaces and voice-response systems become the norm, the
quality of speech synthesis and voice recognition will improve while the
prices of these technologies drop.
1.2 PROBLEM DEFINITION
According to the World Health Organization (WHO), 285 million
people are estimated to be visually impaired: 39 million are blind and 246
million have low vision. Moreover, ninety percent of this population live in
low-income settings. Visually impaired people are dependent on a cane,
specially trained guide dogs or, in certain circumstances, a good Samaritan to
help them navigate. There are a variety of obstacle-avoidance mechanisms that
can be used, such as electromagnetic tracking devices to detect obstacles, RF
(Radio Frequency) localization, or ultrasonic SONAR (SOund Navigation And
Ranging) sensors. None of these techniques, used independently, can offer a
complete solution to aid the visually impaired. Even though there are numerous
prototypes and product designs available, none of them are economically
feasible for the majority of the blind population.

CHAPTER 2
SYSTEM ANALYSIS
2.1 EXISTING SYSTEM
The visually impaired face many challenges in their day-to-day lives.
Tasks that seem trivial to able-bodied people, such as a walk to the park
or social networking with friends, may not be that simple. Most visually
impaired individuals do not have a very high income, so accessibility to
resources is again limited. Research on the blind shows that the visually
impaired who are secure and capable of movement are better adjusted
psychologically and have an easier time gaining employment. Thus, the need
for an easily accessible navigation tool comes into the picture. Other
research includes the 'Multimodal Electronic Travel Aid' device, which uses
a laser pointer to gauge the angular depth and position of an object. This
mechanism is heavy on power consumption due to the use of a laser, so the
battery life of the device suffers. Some research focuses on vibration as
an output mechanism, commonly known as vibrotactile feedback. There was
also research conducted to keep astronauts from feeling spatially
disoriented in the absence of gravity; this was used as a prototype to
build similar technologies for the visually impaired.
2.2 PROPOSED SYSTEM
The proposed system aims at a novel approach to designing and developing a
shoe and a portable audio-playing device that assist a blind person in moving
over different surfaces and along different paths, by fusing visual sensing
technology, object-finding technology and voice-guidance technology (a rough
sketch of this fusion follows the list). The system consists of:
 Designing and developing a shoe having multiple depth, obstacle-detection
and RGB sensors.
 Designing a control board to detect multiple levels of obstacles and
objects on the ground.
 Developing a sound recording and playing module for voice assistance.
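
As a rough illustration of the intended fusion, the following minimal Java
sketch shows how a depth reading and a ground-surface reading could be
combined into a single spoken prompt. The class name, thresholds and messages
are assumptions for illustration, not the actual shoe firmware:

// Illustrative only: sensor values, thresholds and messages are assumptions.
public class ObstacleAdvisor {

    // Pick a prompt from the fused depth (in cm) and ground-type readings.
    static String advise(double obstacleDistanceCm, String groundType) {
        if (obstacleDistanceCm < 50) {
            return "Obstacle ahead, stop";
        } else if (obstacleDistanceCm < 150) {
            return "Obstacle in " + Math.round(obstacleDistanceCm) + " centimeters";
        }
        return "Path clear on " + groundType;
    }

    public static void main(String[] args) {
        System.out.println(advise(40, "gravel"));   // Obstacle ahead, stop
        System.out.println(advise(120, "grass"));   // Obstacle in 120 centimeters
        System.out.println(advise(300, "asphalt")); // Path clear on asphalt
    }
}

In the full system, the prompt string would be handed to the voice-assistance
module instead of being printed.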

2.3 MODULES

• Cloud Data Administrative
• Current Position Identification
• Nearest Target Analysis
• Voice Module

MODULE DESCRIPTION

CLOUD DATA ADMINISTRATIVE

The administrator is responsible for managing the entire data set in the
cloud. The information module comprises all data related to points of interest
in the building. Every location, store or service is associated with one
specific category or subcategory, and most categories have subcategories
sectioning the breadth of the category. Additionally, tags or keywords that
identify the subject of the category are associated with each category and
subcategory. Complementarily, each store and service has a unique description
provided by the brand or store owner. This information must be clear and
concise, since it will be used for the selection of stores or services and
will be delivered to the user by synthesized speech. User registration and
login provide authentication for each user, and location-based information is
updated by the administrator.

CURRENT POSITION IDENTIFICATION

• Maps and GPS receivers show latitude and longitude angles.
• Latitude is used to express how far north or south you are,
relative to the equator. If you are on the equator your latitude is
zero. If you are near the north pole your latitude is nearly 90
degrees north.
• Longitude shows your location in an east-west direction, relative
to the Greenwich meridian.
• Based on the GPS value, the current location is identified, as
sketched below.
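
A minimal sketch (assuming the ACCESS_FINE_LOCATION permission shown in
Chapter 3 is granted, and running inside an Activity) of reading the current
latitude and longitude:

// Minimal sketch; assumes the location permission is granted.
LocationManager lm = (LocationManager) getSystemService(LOCATION_SERVICE);
Location loc = lm.getLastKnownLocation(LocationManager.GPS_PROVIDER);
if (loc != null) {
    double latitude = loc.getLatitude();   // degrees north (+) / south (-) of the equator
    double longitude = loc.getLongitude(); // degrees east (+) / west (-) of Greenwich
}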

NEAREST TARGET ANALYSIS

• Paths or routes are analyzed based on the current location.
• The system takes the source as the first input command from the
user, then takes the second input, which is the destination.
• An efficient method answers k-nearest-neighbour (KNN) distance
queries in spatial networks (see the sketch below).
• The shortest path is analyzed and navigated.
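
A minimal plain-Java sketch of the k-nearest-neighbour (KNN) step, assuming
each target is stored as a latitude/longitude pair; the haversine great-circle
distance is standard, while the class and method names are illustrative
assumptions:

import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Illustrative KNN over great-circle distance; names are assumptions.
public class NearestTarget {

    static final double EARTH_RADIUS_M = 6371000.0;

    // Haversine distance in meters between two latitude/longitude points.
    static double distance(double lat1, double lng1, double lat2, double lng2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLng = Math.toRadians(lng2 - lng1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                * Math.sin(dLng / 2) * Math.sin(dLng / 2);
        return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
    }

    // Return the k targets closest to the current position.
    static List<double[]> kNearest(double lat, double lng,
                                   List<double[]> targets, int k) {
        List<double[]> sorted = new ArrayList<>(targets);
        sorted.sort(Comparator.comparingDouble(t -> distance(lat, lng, t[0], t[1])));
        return sorted.subList(0, Math.min(k, sorted.size()));
    }
}

On a real spatial network, the straight-line distance above would be replaced
by the network (road) distance, but the selection logic stays the same.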

VOICE MODULE

In this module, we implement the voice-guided system. For blind and
visually impaired people it is nearly impossible to be autonomous in the
contemporary world, in which we are completely surrounded by information
that is almost entirely visual.
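
Using the project's Text2Speech wrapper (listed in full in Chapter 4),
delivering a spoken prompt reduces to two calls; the prompt text here is an
illustrative assumption:

// Speak a navigation prompt through the Text2Speech wrapper from Chapter 4.
Text2Speech t2s = new Text2Speech(getApplicationContext(), Locale.UK);
t2s.talk("You have reached the main entrance"); // illustrative prompt text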

CHAPTER 3
DEVELOPMENT ENVIRONMENT
3.1 Software Requirements:
 Operating System : Windows 7 or higher
 Language : Android, Java
 Developing Tool : Eclipse
 Backend : MySQL Server
3.2 Hardware Requirements:
 Processor : Dual Core
 RAM : 2 GB
 Hard Disk : 160 GB space

3.3 SOFTWARE DESCRIPTION

Overview of Android

Android is a free, open-source mobile platform: a Linux-based, multiprocess,
multithreaded OS. Android is not a device or a product; it is not even limited
to phones. You could build a DVR, a handheld GPS, an MP3 player, and so on.

Android is a software stack for mobile devices that includes an operating system,
middleware and key applications. The Android SDK provides the tools and APIs
necessary to begin developing applications on the Android platform using the Java
programming language.

- Makes mobile development easy.

- Full phone software stack including applications

- Designed as a platform for software development

- Android is open

- Android is free

- Community support

 July 2005

Google acquired Android Inc.

 5 Nov 2007

Open Handset Alliance (OHA) formed:

Google, HTC, Intel, Motorola, Qualcomm, T-Mobile

 Android is the OHA's first product

 12 Nov 2007

OHA released a preview of the Android SDK

 Oct 2008

First device – the T-Mobile G1 – was released.

Latest – Motorola Milestone released in India.

Features

- Application framework enabling reuse and replacement of components

- Dalvik virtual machine optimized for mobile devices

- Integrated browser based on the open source WebKit engine

- Optimized graphics powered by a custom 2D graphics library; 3D graphics


based on the OpenGL ES 1.0 specification (hardware acceleration optional)

- SQLite for structured data storage

- Media support for common audio, video, and still image formats (MPEG4,
H.264, MP3, AAC, AMR, JPG, PNG, GIF)

- GSM Telephony (hardware dependent)

- Bluetooth, EDGE, 3G, and WiFi (hardware dependent)

- Camera, GPS, compass, and accelerometer (hardware dependent)

- Rich development environment including a device emulator, tools for
debugging, memory and performance profiling, and a plugin for the Eclipse
IDE

Linux Kernel

Android relies on Linux version 2.6 for core system services such as

- security

- memory management

- process management

- network stack

- driver model

- The kernel also acts as an abstraction layer between the hardware and the rest
of the software stack.

Android Runtime

- Android includes a set of core libraries that provides most of the functionality
available in the core libraries of the Java programming language.

- Every Android application runs in its own process, with its own instance of
the Dalvik virtual machine. Dalvik has been written so that a device can run
multiple VMs efficiently.

- The Dalvik VM executes files in the Dalvik Executable (.dex) format which is
optimized for minimal memory footprint.

- The Dalvik VM relies on the Linux kernel for underlying functionality such as
threading and low-level memory management.

Libraries
Android includes a set of C/C++ libraries used by various components of the
Android system. These capabilities are exposed to developers through the Android
application framework.

- System C Library

- Media Library

- Surface Manager

- LibWebCore

- SGL

- 3D libraries

- Free Type

- SQLite

Application Framework

- Being an open development platform, Android offers developers the ability to
build extremely rich and innovative applications.

- Developers have full access to the same framework APIs used by the core
applications.

- Views – used to build applications (lists, grid, buttons, text boxes and even
embeddable web browser)

- Content providers – enable applications to access data from other


applications or share their own data.

- Resource manager – provides access to non-code resources such as localized


strings, graphic and layout files.

- Notification manager – enables applications to display custom alerts in the
status bar

- Activity manager – manages the lifecycle of applications and provides a
navigation backstack

- Android ships with a set of core applications including an email client, SMS
program, calendar, maps, browser, contacts, and others.

- All applications are written using the Java programming language.

Architecture:

Anatomy of an Android Application

There are four building blocks for an Android application:


- Activity - a single screen

- Broadcast Receiver - executes in reaction to an external event (e.g., a phone ring)

- Service - code that is long-lived and runs without a UI (e.g., a media player)

- Content Provider - exposes an application's data to be shared with other applications

Android Building Blocks

These are the most important parts of the Android APIs:

- AndroidManifest.xml

the control file - tells the system what to do with the top-level components

- Activities

an object that has a life cycle - a chunk of code that does some work

- Views

an object that knows how to draw itself to the screen

- Intents

a simple message object that represents an "intention" to do something

- Notifications

a small icon that appears in the status bar (e.g., for SMS messages),
for alerting the user

- Services

a body of code that runs in the background

Development Tools

The Android SDK includes a variety of custom tools that help you develop mobile
applications on the Android platform. Five of the most significant tools are:

1. Android Emulator - a virtual mobile device that runs on our computer - used to
design, debug, and test our applications in an actual Android run-time
environment

2. Android Development Tools Plugin - for the Eclipse IDE - adds powerful
extensions to the Eclipse integrated environment

3. Dalvik Debug Monitor Service (DDMS) - integrated with Dalvik - this tool
lets us manage processes on an emulator and assists in debugging

4. Android Asset Packaging Tool (AAPT) - constructs the distributable
Android package files (.apk)

5. Android Debug Bridge (ADB) - provides a link to a running emulator; can
copy files to the emulator, install .apk files and run commands

Lifecycle of an Activity

OVERVIEW OF XML:

XML (Extensible Markup Language) is a set of rules for encoding documents
electronically. It is defined in the XML 1.0 Specification produced by the W3C, and
several other related specifications, all gratis open standards.

XML's design goals emphasize simplicity, generality, and usability over the Internet.
It is a textual data format, with strong support via Unicode for the languages of the
world. Although XML's design focuses on documents, it is widely used for the
representation of arbitrary data structures, for example in web services.

Advantages:

The main advantage of XML is that data can be stored and retrieved easily.

XML Introduction:

MIDP devices have memory constraints when it comes to code, both in terms of the
amount of code you can store on the device and the memory available to applications
at runtime. So, keeping the size of applications and features in check is of paramount
importance to the J2ME developer. That is where small-sized XML parsers come into
play.

XML parsers

This section describes the XML parsing process and introduces some small XML
parsers for MIDP.

XML parsing process:

The XML parsing process operates in three phases:

1. XML input processing. In this stage, the application parses and validates the
source document; recognizes and searches for relevant information based on its
location or its tagging in the source document; extracts the relevant information when
it is located; and, optionally, maps and binds the retrieved information to business
objects.

2. Business logic handling. This is the stage in which the actual processing of the
input information takes place. It might result in the generation of output information.

3. XML output processing. In this stage, the application constructs a model of the
document to be generated with the Document Object Model (DOM). It then either
applies XSLT style sheets or directly serializes to XML.

An application that implements such a processing model is called an XML parser.
You can integrate an XML parser into your Java applications with the Java API for
XML Processing (JAXP). JAXP allows applications to parse and transform XML
documents using an API that is independent of any particular XML processor
implementation. Through a plug-in scheme, developers can change XML processor
implementations without altering their applications.
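
A minimal JAXP sketch using the standard javax.xml.parsers API; the file name
routes.xml and the element name route are illustrative assumptions:

import java.io.File;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class ParseExample {
    public static void main(String[] args) throws Exception {
        // Obtain a parser without depending on a specific implementation.
        DocumentBuilder builder =
                DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(new File("routes.xml")); // illustrative file
        NodeList routes = doc.getElementsByTagName("route");  // illustrative tag
        for (int i = 0; i < routes.getLength(); i++) {
            System.out.println(routes.item(i).getTextContent());
        }
    }
}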

OVERVIEW OF GPS:

The Global Positioning System (GPS) is a space-based global navigation satellite


system that provides reliable location and time information in all weather and at all
times and anywhere on or near the Earth where there is an unobstructed line of sight
to four or more GPS satellites. It is maintained by the United States government and
is freely accessible by anyone with a GPS receiver.

The GPS consists of three parts: the space segment, the control segment, and the user
segment. The U.S. Air Force develops, maintains, and operates the space and control
segments. GPS satellites broadcast signals from space, which each GPS receiver uses
to calculate its three-dimensional location (latitude, longitude, and altitude) plus the
current time.

The space segment is composed of 24 to 32 satellites in medium Earth orbit and also
includes the boosters required to launch them into orbit. The control segment is
composed of a master control station, an alternate master control station, and a host
of dedicated and shared ground antennas and monitor stations. The user segment is
composed of hundreds of thousands of U.S. and allied military users of the secure
GPS Precise Positioning Service, and tens of millions of civil, commercial, and
scientific users of the Standard Positioning Service (see GPS navigation devices).

Applications:

GPS has become a widely used aid to navigation worldwide, and a useful tool for
map-making, land surveying, commerce, scientific uses, tracking and surveillance,
and hobbies such as geocaching and waymarking. The precise time reference
provided by GPS is used in many applications, including the scientific study of
earthquakes, and as a time synchronization source for cellular network protocols.

In addition, GPS has, in the words of the website gps.gov, become a mainstay of
transportation systems worldwide, providing navigation for aviation, ground, and
maritime operations. Disaster relief and emergency services depend upon GPS for
location and timing capabilities in their life-saving missions. The accurate timing
provided by GPS facilitates everyday activities such as banking, mobile phone
operations, and even the control of power grids. Farmers, surveyors, geologists and
countless others perform their work more efficiently, safely, economically, and
accurately using the free and open GPS signals.

Using Location-Based Services (LBS)

 LBS – different technologies used to find a device's current location

 Two main LBS elements:

 Location Manager – provides hooks to the LBS

 Location Provider – represents a particular location-finding
technology used to determine the device's current location

Using the Location Manager, you can:

 Obtain the current location

 Track movement

 Set proximity alerts

 Find available Location Providers

Selecting a Location Provider

 Depending on the device, there may be several technologies that Android can
use to determine the current location, with different capabilities (power
consumption, monetary cost, accuracy, and the ability to determine altitude,
speed, etc.)

 To get an instance of a specific provider,

String providerName = LocationManager.GPS_PROVIDER;

LocationProvider gpsProvider = locationManager.getProvider(providerName);

This is useful to determine abilities of a particular provider.

 Two common Location providers

 LocationManager.GPS_PROVIDER

 LocationManager.NETWORK_PROVIDER

Finding your location

 Access to the location-based service is handled by Location Manager system


service.

LocationManager locationManager = (LocationManager)


getSystemService(LOCATION_SERVICE);
 Add one or more uses-permissions in manifest.

<uses-permission
android:name="android.permission.ACCESS_FINE_LOCATION"/>

<uses-permission
android:name="android.permission.ACCESS_COARSE_LOCATION"/>

 To find the last location fix determined by a particular provider,

Location location =
locationManager.getLastKnownLocation(LocationManager.GPS_PROVIDER);

 Location object contains all the position information which can be retrieved
using get methods.

 getLastKnownLocation() does not ask the provider to update the current


position.
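
To receive fresh fixes instead, register a LocationListener for ongoing
updates; a minimal sketch, where the 5000 ms / 10 m thresholds are
illustrative:

// Ask the GPS provider for ongoing updates (thresholds illustrative).
locationManager.requestLocationUpdates(
        LocationManager.GPS_PROVIDER,
        5000,  // minimum time between updates, in milliseconds
        10,    // minimum distance between updates, in meters
        new LocationListener() {
            @Override public void onLocationChanged(Location location) {
                // use location.getLatitude() / location.getLongitude() here
            }
            @Override public void onStatusChanged(String provider, int status, Bundle extras) {}
            @Override public void onProviderEnabled(String provider) {}
            @Override public void onProviderDisabled(String provider) {}
        });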

Using Geocoder

 Geocoding lets you translate between street addresses and map coordinates.

 Lookups are done on the server, so Internet permission is required.

 Geocoder class provides access to two geocoding functions

 Forward geocoding – finds coordinates of an address

 Reverse geocoding – finds street address for given points.

Geocoder gc = new Geocoder (this, Locale.getDefault());

Reverse Geocoding

 Map coordinates to address

Geocoder gc = new Geocoder (this, Locale.getDefault());


try {
    List<Address> addresses = gc.getFromLocation(lat, lng, 10);
    StringBuilder sb = new StringBuilder();
    if (addresses.size() > 0) {
        Address address = addresses.get(0);
        for (int i = 0; i < address.getMaxAddressLineIndex(); i++)
            sb.append(address.getAddressLine(i)).append("\n");
        sb.append(address.getLocality()).append("\n");
        sb.append(address.getPostalCode()).append("\n");
        sb.append(address.getCountryName());
    }
    latLongString = sb.toString();
} catch (Exception e) {
    Log.d("Geocoder", e.getMessage());
}

Forward Geocoding

 Address to map coordinates

Geocoder gc = new Geocoder(this, Locale.US);

String streetAddress = "160 Riverside Drive, New York, New York";

List<Address> locations = null;

try {
    locations = gc.getFromLocationName(streetAddress, 10);
} catch (IOException e) {
    Log.d("geocoder", e.getMessage());
}

Creating Map Based activities

 MapView provides an ideal user interface option for presenting geographical


data.

 Using MapView, we can create Activities that feature an interactive map.

 Map views offer full programmatic control of map display (zoom, location
and display modes-satellite, street and traffic views)

 Classes to support Android maps

 MapView – the Map View control.

 MapActivity – base class to create a new Activity that can include Map
View.

 OverLay – to annotate maps. Provides Canvas to draw on MapView.

 MapController – to control map (center location and zoom levels)

 MyLocationOverlay – special Overlay used to display the current


position and orientation of device.

 ItemizedOverlays and OverlayItems – together used to create a layer of map
markers, displayed using Drawables and text (a usage example follows).
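
For instance, with the customMap wrapper listed in Chapter 4, placing a titled,
coloured marker and centring the camera on it reduces to a single call; the
coordinates and title here are illustrative:

// Add a cyan marker (colour index 2) and centre the map on it.
cmap.addMarker(17.385044, 78.486671, "Main Entrance", 2);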

CHAPTER 4
SYSTEM IMPLEMENTATION
package com.gstech.proximityexample;

import android.annotation.SuppressLint;
import android.app.Activity;
import android.app.ProgressDialog;
import android.content.Intent;
import android.os.AsyncTask;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.EditText;
import android.widget.ImageButton;
import android.widget.TextView;
import android.widget.Toast;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.Locale;

public class RouteView extends Activity {

String recvname="";
String mobile="";
String sendername="";
Connection conn;
EditText edmessage;
String
complaint,area,landmark,description,date1,status,diet1,diet2,diag;
Button sendmsg;
ImageButton template;
TextView t1,t2,t3,t4,t5,t6,t7;
String s1,s2;
EditText edt1,edt2;
Button b1,b2;
Text2Speech t2s;
HashMap<String,String> usersList1 = null;
ArrayList<HashMap<String,String>> usersList2 = new
ArrayList<HashMap<String,String>>();
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.viewlist);

t2s=new Text2Speech(getApplicationContext(), Locale.UK);


edt1 = (EditText) findViewById(R.id.login_username);
edt2 = (EditText) findViewById(R.id.login_password);
t6 = (TextView) findViewById(R.id.textView1);

b1 = (Button)findViewById(R.id.button1);
b1.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
try {
s1 = edt1.getText().toString();
s2 = edt2.getText().toString();

new StatusUpdate().execute();

}
catch (Exception e) {
// Toast.makeText(getApplicationContext(), e.toString(), Toast.LENGTH_LONG).show();
}
}
});
} // end of onCreate()

public class StatusUpdate extends AsyncTask<String, Void,


Boolean> {

ProgressDialog pDialog ;
Exception error;
String Text="";
ResultSet rs;

@Override
protected void onPreExecute() {
super.onPreExecute();

pDialog = new ProgressDialog(RouteView.this);


pDialog.setTitle("View Route");
pDialog.setMessage("Retrieving Route details...");

pDialog.setProgressStyle(ProgressDialog.STYLE_SPINNER);
pDialog.setIndeterminate(false);
pDialog.setCancelable(false);
pDialog.show();
}

@Override
protected Boolean doInBackground(String... args) {

try {

Class.forName("com.mysql.jdbc.Driver");
conn = DriverManager.getConnection(
        "jdbc:mysql://103.10.235.220:3306/indoorgps", "root", "password");
} catch (SQLException se) {
Log.e("ERRO1",se.getMessage());
} catch (ClassNotFoundException e) {
Log.e("ERRO2",e.getMessage());
} catch (Exception e) {
Log.e("ERRO3",e.getMessage());
}

try {
// NOTE: string concatenation is vulnerable to SQL injection;
// a PreparedStatement would be safer here.
String COMANDOSQL = "select * from routetable where from1='" + s1
        + "' and to1='" + s2 + "'";
Statement statement = conn.createStatement();
rs = statement.executeQuery(COMANDOSQL);
if(rs.next()){

diag = rs.getString(3);
return true;
}

return false;
// Toast.makeText(getBaseContext(), "Successfully Inserted.", Toast.LENGTH_LONG).show();
} catch (Exception e) {
error = e;
return false;
// Toast.makeText(getBaseContext(), "Successfully Registered...", Toast.LENGTH_LONG).show();
}
}

@SuppressLint("NewApi")
@Override
protected void onPostExecute(Boolean result1) {
pDialog.dismiss ( ) ;
if(result1)
{

t2s.talk(diag);
t6.setText(diag);
// Toast.makeText(getApplicationContext(), Text, Toast.LENGTH_LONG).show();
// edmessage.clearFocus();
// edmessage.setText("");
}else
{
if(error!=null)
{
Toast.makeText(getBaseContext(), error.getMessage(), Toast.LENGTH_LONG).show();
}
else
{

Toast.makeText(getApplicationContext(), Text, Toast.LENGTH_SHORT).show();
}
}
super.onPostExecute(result1);
}
} // end of StatusUpdate
} // end of RouteView

package com.gstech.proximityexample;

public class tags {

public static String db_tb_userdetails="userdetails";


public static String db_username="username";
public static String db_password="password";
public static String db_email="email";
public static String db_phonenumber="phonenumber";

public static String db_tb_location="location";


public static String db_latitude="latitude";
public static String db_longitude="longitude";
public static String db_time="time";
public static String db_date="date";
}

package com.gstech.proximityexample;

import java.io.IOException;
import java.util.Locale;

import android.app.Activity;
import android.app.Dialog;
import android.content.ContentValues;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.location.Location;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.EditText;
import android.widget.Toast;

import com.google.android.gms.maps.model.LatLng;
import com.google.android.gms.maps.model.Marker;

public class LocTaker1 extends Activity implements


locationTracker.LocationReceiver{
int RADIUS=25;//in meters
customMap cmap;
DataBaseHandler dbHander;
SQLiteDatabase sqliteDB;
Location myLocation;
Text2Speech t2s;

@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
t2s=new Text2Speech(getApplicationContext(), Locale.UK);
cmap=new customMap(this, getApplicationContext(), new
MapClickListener() {

@Override
public void onMarkerClicked(double latitude, double
longitude, Marker latlng) {
// TODO Auto-generated method stub
Toast.makeText(getApplicationContext(),
latlng.getTitle(), 1).show();
}

@Override
public void onMapLongClicked(final double latitude, final double longitude,
        LatLng latlng) {
    // TODO Auto-generated method stub
}

@Override
public void onMapClicked(double latitude, double
longitude, LatLng latlng) {
// TODO Auto-generated method stub

}
});

openDB(true);
locationTracker locTracker=new locationTracker();
locTracker.startLocationTrack(getApplicationContext(), true,
this);
locTracker.startTracking(false);

}
@Override
protected void onDestroy() {
// TODO Auto-generated method stub
super.onDestroy();
openDB(false);
}

@Override
public void locationReceived(Double Latitude, Double
Longitude,
Location location) {
// TODO Auto-generated method stub
myLocation=location;
Toast.makeText(getApplicationContext(), "loc Received",
1).show();
readLocationDetails();
}

public boolean openDB(boolean open) {


if (open) {
print("Opening DB");
dbHander = new
DataBaseHandler(getApplicationContext());
try {
dbHander.createDataBase();
dbHander.openDataBase();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
return false;
}

sqliteDB = dbHander.getReadableDatabase();
return true;
} else {
print("Closing DB");
sqliteDB.close();
dbHander.close();
return true;
}
}

public long insertLocationDetails(String latitude,String


longitude,String message) {

ContentValues args = new ContentValues();


args.put(tags.db_latitude,latitude);
args.put(tags.db_longitude,longitude);
args.put(tags.db_time,message);

print("Inserting User Details in DB");

long rc = sqliteDB.insert(tags.db_tb_location,null,
args);
System.out.println("$$$$$$$$$$$ rows returnedDB1 :" +
rc);
readLocationDetails();
return rc;
}

public boolean readLocationDetails() {

final Cursor cursor = sqliteDB


.query(false, tags.db_tb_location, null,
null,
null, null, null, null, null);

cursor.moveToFirst();
if (cursor.getPosition() != -1)
{
while (!cursor.isAfterLast()) {

Location test=new Location(myLocation);

test.setLatitude(Double.parseDouble(cursor.getString(0)));

test.setLongitude(Double.parseDouble(cursor.getString(1)));
Log.e("MAP",
Double.parseDouble(cursor.getString(0))+"//"+
Double.parseDouble(cursor.getString(1))+"//"+cursor.getString(2));

cmap.addMarker(Double.parseDouble(cursor.getString(0)),
Double.parseDouble(cursor.getString(1)),cursor.getString(2), 2);
String takerMessage=cursor.getString(2);
float distanceInMeters = myLocation.distanceTo(test);
boolean isWithinRadius = distanceInMeters < RADIUS;
if (isWithinRadius)
{
t2s.talk(takerMessage);
Toast.makeText(getApplicationContext(),
"Location Near is "+takerMessage, 1).show();
}
cursor.moveToNext();
}

}
return false;
}

void print(String what) {


System.out.println("MAP TEST" + what);
}
}
package com.gstech.proximityexample;

import java.io.IOException;

import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.graphics.Color;
import android.os.Bundle;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.EditText;
import android.widget.TextView;

public class Login extends Activity implements OnClickListener {


Context myContext;
String TAG = "Login Activity : ";
DataBaseHandler dbHander;
SQLiteDatabase sqliteDB;

Button login, register,settingsbtn,b1;


EditText username, password,ipaddress;
TextView status;
// public static String HOST_IPADDRESS;

@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.login);
myContext = getApplicationContext();
openDB(true);

login = (Button) findViewById(R.id.login_btn_login);


login.setOnClickListener(this);
register = (Button)
findViewById(R.id.login_btn_register);
register.setOnClickListener(this);
settingsbtn = (Button)
findViewById(R.id.login_btn_settings);
settingsbtn.setOnClickListener(this);
b1 = (Button) findViewById(R.id.button1);
b1.setOnClickListener(this);
username = (EditText) findViewById(R.id.login_username);
password = (EditText) findViewById(R.id.login_password);
ipaddress= (EditText) findViewById(R.id.edtIP);
status=(TextView)findViewById(R.id.login_status);
}

@Override
public void onClick(View arg0) {
// TODO Auto-generated method stub
switch (arg0.getId()) {
case R.id.login_btn_login:
status.setText("");
if(login_verify())
{

Intent homeActivity=new Intent(myContext,


LocTaker.class);
homeActivity.putExtra("username",
username.getText().toString());
homeActivity.putExtra("password",
password.getText().toString());
startActivity(homeActivity);

}
else
{

status.setText("Invalid UserName (Or) Password");
status.setTextColor(Color.RED);
}
break;

case R.id.login_btn_register:

Intent homeActivity=new Intent(myContext,


LocTaker1.class);
startActivity(homeActivity);
//Intent register = new Intent(myContext,
Register.class);
//startActivity(register);

break;

case R.id.button1:

Intent homeActivity1=new Intent(myContext,


RouteView.class);
startActivity(homeActivity1);
//Intent register = new Intent(myContext,
Register.class);
//startActivity(register);

break;

}
}

@Override
protected void onResume() {
// TODO Auto-generated method stub
super.onResume();
username.setText("");
password.setText("");
}

@Override
protected void onDestroy() {
// TODO Auto-generated method stub
super.onDestroy();
openDB(false);
}

public boolean login_verify() {


String userName = username.getText().toString();
String passWord = password.getText().toString();
final Cursor cursor = sqliteDB
.query(false, "userdetails", new String[]
{tags.db_username,tags.db_password}, null,
null, null, null, null, null);

cursor.moveToFirst();
if (cursor.getPosition() != -1)
{
while (!cursor.isAfterLast()) {
// if (userName.equalsIgnoreCase(cursor.getString(0)) && passWord.equalsIgnoreCase(cursor.getString(1)))
if (userName.equalsIgnoreCase("admin") &&
        passWord.equalsIgnoreCase("admin"))

{
print("User Exists " + userName);
return true;
}
cursor.moveToNext();
}

}
return false;
}

public boolean openDB(boolean open) {


if (open) {
print("Opening DB");
dbHander = new DataBaseHandler(myContext);
try {
dbHander.createDataBase();
dbHander.openDataBase();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
return false;
}

sqliteDB = dbHander.getReadableDatabase();
return true;
} else {
print("Closing DB");
sqliteDB.close();
dbHander.close();
return true;
}
}

public void print(String what) {


System.out.println(TAG + what);
}
}

package com.gstech.proximityexample;

import com.google.android.gms.maps.model.LatLng;
import com.google.android.gms.maps.model.Marker;

public interface MapClickListener


{
public void onMarkerClicked(double latitude,double
longitude, Marker latlng);
public void onMapClicked(double latitude,double
longitude, LatLng latlng);
public void onMapLongClicked(double latitude,double
longitude, LatLng latlng);
}

package com.gstech.proximityexample;

import android.app.Activity;
import android.content.Context;
import android.util.Log;
import android.view.View;
import android.widget.Toast;

import com.google.android.gms.maps.CameraUpdateFactory;
import com.google.android.gms.maps.GoogleMap;
import com.google.android.gms.maps.GoogleMap.OnMapClickListener;
import
com.google.android.gms.maps.GoogleMap.OnMapLongClickListener;
import com.google.android.gms.maps.GoogleMap.OnMarkerClickListener;
import com.google.android.gms.maps.MapFragment;
import com.google.android.gms.maps.model.BitmapDescriptorFactory;
import com.google.android.gms.maps.model.CameraPosition;
import com.google.android.gms.maps.model.LatLng;
import com.google.android.gms.maps.model.Marker;
import com.google.android.gms.maps.model.MarkerOptions;

public class customMap extends View{


private GoogleMap googleMap;
Activity activity;Context context;
MapClickListener mapClickListener;

public customMap(Activity activity,Context context,MapClickListener


mapClickListener ) {

super(context);
this.activity=activity;
this.context=context;
this.mapClickListener=mapClickListener;
// TODO Auto-generated constructor stub
try {
// Loading map
initilizeMap();

Log.e("Gopi","10");
double latitude = 17.385044;
double longitude = 78.486671;

// lets place some 10 random markers


for (int i = 0; i < 10; i++) {
// random latitude and logitude
double[] randomLocation = createRandLocation(latitude,
longitude);

// Adding a marker
MarkerOptions marker = new MarkerOptions().position(
new LatLng(randomLocation[0],
randomLocation[1]))
.title("Hello Maps " + i);

Log.e("Random", "> " + randomLocation[0] + ", "


+ randomLocation[1]);

// changing marker colour is optional (via MarkerOptions.icon());
// add the marker to the map and close the loop
googleMap.addMarker(marker);
}
} catch (Exception e) {
e.printStackTrace();
}
}

/**
* @param latitude
* @param longitude
* @param title
* @param colour 0-9
*/
public void addMarker(double latitude,double longitude,String
title,int colour)
{
MarkerOptions marker = new MarkerOptions().position(
new LatLng(latitude, longitude))
.title(title);

// map the colour index (0-9) to a predefined marker hue
final float[] hues = {
        BitmapDescriptorFactory.HUE_AZURE,
        BitmapDescriptorFactory.HUE_BLUE,
        BitmapDescriptorFactory.HUE_CYAN,
        BitmapDescriptorFactory.HUE_GREEN,
        BitmapDescriptorFactory.HUE_MAGENTA,
        BitmapDescriptorFactory.HUE_ORANGE,
        BitmapDescriptorFactory.HUE_RED,
        BitmapDescriptorFactory.HUE_ROSE,
        BitmapDescriptorFactory.HUE_VIOLET,
        BitmapDescriptorFactory.HUE_YELLOW };
if (colour >= 0 && colour <= 9)
    marker.icon(BitmapDescriptorFactory.defaultMarker(hues[colour]));

CameraPosition cameraPosition = new


CameraPosition.Builder().target(new LatLng(latitude,
longitude)).zoom(17).build();

googleMap.animateCamera(CameraUpdateFactory
.newCameraPosition(cameraPosition));

googleMap.addMarker(marker);
}

/**
 * function to load the map; if the map is not created yet, it will be created
 */
private void initilizeMap() {
if (googleMap == null) {
googleMap = ((MapFragment)
activity.getFragmentManager().findFragmentById(R.id.map)).getMap();
// check if map is created successfully or not
if (googleMap == null) {
Toast.makeText(context,
"Sorry! unable to create maps", Toast.LENGTH_SHORT)
.show();
}
else
{
// Changing map type
googleMap.setMapType(GoogleMap.MAP_TYPE_NORMAL);
// googleMap.setMapType(GoogleMap.MAP_TYPE_HYBRID);
// googleMap.setMapType(GoogleMap.MAP_TYPE_SATELLITE);
// googleMap.setMapType(GoogleMap.MAP_TYPE_TERRAIN);
// googleMap.setMapType(GoogleMap.MAP_TYPE_NONE);
// Showing / hiding your current location
googleMap.setMyLocationEnabled(true);
// Enable / Disable zooming controls

googleMap.getUiSettings().setZoomControlsEnabled(false);
// Enable / Disable my location button

googleMap.getUiSettings().setMyLocationButtonEnabled(true);
// Enable / Disable Compass icon
googleMap.getUiSettings().setCompassEnabled(true);
// Enable / Disable Rotate gesture

googleMap.getUiSettings().setRotateGesturesEnabled(true);
// Enable / Disable zooming functionality

googleMap.getUiSettings().setZoomGesturesEnabled(true);

googleMap.setOnMapClickListener(new
OnMapClickListener() {

@Override
public void onMapClick(LatLng arg0) {
// TODO Auto-generated method stub

mapClickListener.onMapClicked(arg0.latitude, arg0.longitude,
arg0);
}
});

googleMap.setOnMapLongClickListener(new
OnMapLongClickListener() {

@Override
public void onMapLongClick(LatLng arg0) {
// TODO Auto-generated method stub

mapClickListener.onMapLongClicked(arg0.latitude,
arg0.longitude, arg0);
}
});
googleMap.setOnMarkerClickListener(new
OnMarkerClickListener() {

@Override
public boolean onMarkerClick(Marker arg0) {
// TODO Auto-generated method stub

mapClickListener.onMarkerClicked(arg0.getPosition().latitude,
arg0.getPosition().longitude, arg0);
return true;
}
});
}
}
}

/*
 * creating a random position around a location, for testing purposes only
 */
private double[] createRandLocation(double latitude, double
longitude) {

return new double[] { latitude + ((Math.random() - 0.5) / 500),


longitude + ((Math.random() - 0.5) / 500),
150 + ((Math.random() - 0.5) * 10) };
}

}
package com.gstech.proximityexample;

import java.util.Locale;

import android.content.Context;
import android.speech.tts.TextToSpeech;
import android.widget.Toast;

public class Text2Speech {


Context context;
TextToSpeech text2Speech;
/**
* @param context
* @param locale ie: Locale.UK
*/
public Text2Speech(Context context, final Locale locale) {
// TODO Auto-generated constructor stub
this.context=context;
text2Speech=new TextToSpeech(context,
new TextToSpeech.OnInitListener() {
@Override
public void onInit(int status) {
if(status != TextToSpeech.ERROR){
text2Speech.setLanguage(locale);
}
}
});
}
public void stop()
{
if(text2Speech !=null){
text2Speech.stop();
text2Speech.shutdown();
}
}

public void talk(String text)


{

Toast.makeText(context, text,
Toast.LENGTH_SHORT).show();
text2Speech.speak(text, TextToSpeech.QUEUE_FLUSH, null);

}
}

package com.gstech.proximityexample;

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import android.content.Context;
import android.database.SQLException;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteException;
import android.database.sqlite.SQLiteOpenHelper;

public class DataBaseHandler extends SQLiteOpenHelper{

// The Android default system path of your application database.
private static String DB_PATH = "";

private static String DB_NAME = "db";

private SQLiteDatabase myDataBase;

private final Context myContext;

/**
* Constructor
* Takes and keeps a reference of the passed context in order
to access to the application assets and resources.
* @param context
*/
public DataBaseHandler(Context context) {

super(context, DB_NAME, null, 1);


DB_PATH="/data/data/"+context.getPackageName()+"/databases/";
this.myContext = context;
}

/**
 * Creates an empty database on the system and rewrites it with your own database.
 */
public void createDataBase() throws IOException{

boolean dbExist = checkDataBase();

if(dbExist){
//do nothing - database already exist
}else{

// By calling this method, an empty database will be created in the default
// system path of your application, so we are going to be able to overwrite
// that database with our database.
this.getReadableDatabase();

try {

copyDataBase();

} catch (IOException e) {

throw new Error("Error copying database");

}
}
}

/**
 * Check if the database already exists, to avoid re-copying the file each time
 * you open the application.
 * @return true if it exists, false if it doesn't
 */
private boolean checkDataBase(){

SQLiteDatabase checkDB = null;

try{
String myPath = DB_PATH + DB_NAME;
checkDB = SQLiteDatabase.openDatabase(myPath, null,
SQLiteDatabase.OPEN_READONLY);

}catch(SQLiteException e){

//database does't exist yet.

if(checkDB != null){

checkDB.close();
54
}

return checkDB != null ? true : false;


}

/**
* Copies your database from your local assets folder to the just-created
* empty database in the system folder, from where it can be accessed and
* handled. This is done by transferring a byte stream.
* */
private void copyDataBase() throws IOException {

// Open your local db as the input stream.
InputStream myInput = myContext.getAssets().open(DB_NAME);

// Path to the just-created empty db.
String outFileName = DB_PATH + DB_NAME;

// Open the empty db as the output stream.
OutputStream myOutput = new FileOutputStream(outFileName);

// Transfer bytes from the input file to the output file.
byte[] buffer = new byte[1024];
int length;
while ((length = myInput.read(buffer)) > 0) {
myOutput.write(buffer, 0, length);
}

// Close the streams.
myOutput.flush();
myOutput.close();
myInput.close();
}

public void openDataBase() throws SQLException {

// Open the database.
String myPath = DB_PATH + DB_NAME;
myDataBase = SQLiteDatabase.openDatabase(myPath, null,
        SQLiteDatabase.OPEN_READONLY);
}

@Override
public synchronized void close() {

if (myDataBase != null)
    myDataBase.close();

super.close();
}
@Override
public void onCreate(SQLiteDatabase db) {
}

@Override
public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
}

// Add your public helper methods to access and get content from the
// database. You could return cursors by doing "return
// myDataBase.query(....)" so it would be easy for you to create adapters
// for your views.
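
// An illustrative (hypothetical) helper of that kind: the "locations"
// table and its column names are assumptions for this sketch, not part of
// the shipped database schema.
public android.database.Cursor getAllLocations() {
return myDataBase.query("locations",
        new String[] { "name", "latitude", "longitude" },
        null, null, null, null, null);
}
} // closes DataBaseHandler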

package com.gstech.proximityexample;

import android.app.Activity;
import android.content.Context;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;

public class locationTracker {

public interface LocationReceiver {
public void locationReceived(Double Latitude, Double Longitude,
        Location location);
};

LocationManager locationManager;
LocationListener locationListener;
int INTERVAL = 5, COUNT = 1, TMPCOUNT = 0;
double last_latitude = 0, last_longitude = 0;
Activity callingActivityGbl;

public void startLocationTrack(Context context, boolean init_destroy,
        Activity callingActivity) {
this.callingActivityGbl = callingActivity;
if (init_destroy) {
locationManager = (LocationManager)
        context.getSystemService(Context.LOCATION_SERVICE);

// Define a listener that responds to location updates.
locationListener = new LocationListener() {
    public void onLocationChanged(Location location) {
        // Called when a new location is found by the location provider.
        System.out.println("Latitude: " + location.getLatitude()
                + " Longitude: " + location.getLongitude()
                + " Accuracy: " + location.getAccuracy()
                + " Altitude: " + location.getAltitude());

        ((LocationReceiver) callingActivityGbl).locationReceived(
                location.getLatitude(), location.getLongitude(), location);
    }

    public void onStatusChanged(String provider, int status,
            Bundle extras) {
        System.out.println("Status changed: " + provider + status);
    }

    public void onProviderEnabled(String provider) {
        System.out.println("Provider enabled: " + provider);
    }

    public void onProviderDisabled(String provider) {
        System.out.println("Provider disabled: " + provider);
    }
};
} else {
if (locationManager != null)
    locationManager.removeUpdates(locationListener);
locationManager = null;
}
}

public void startTracking(boolean use_gps) {
String locationProvider = LocationManager.NETWORK_PROVIDER;
if (use_gps)
    locationProvider = LocationManager.GPS_PROVIDER;

// Request a fix at most every INTERVAL minutes, with no minimum distance.
locationManager.requestLocationUpdates(locationProvider,
        INTERVAL * 60 * 1000, 0, locationListener);
}
}
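
A minimal usage sketch (the activity name is hypothetical): the calling
Activity implements the tracker's LocationReceiver interface, initializes
the tracker once, and tears it down when it is destroyed.

// Hypothetical calling activity; assumes the usual Android imports
// (Activity, Bundle, Location) and the same package as locationTracker.
public class NavigationActivity extends Activity
        implements locationTracker.LocationReceiver {

    locationTracker tracker = new locationTracker();

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        tracker.startLocationTrack(this, true, this); // build the listener
        tracker.startTracking(true);                  // request GPS updates
    }

    @Override
    public void locationReceived(Double latitude, Double longitude,
            Location location) {
        // React to each fix here, e.g. announce the position via Text2Speech.
    }

    @Override
    protected void onDestroy() {
        tracker.startLocationTrack(this, false, this); // remove updates
        super.onDestroy();
    }
}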

CHAPTER 4
ARCHITECTURE DESIGN
4.1 ARCHITECTURE DIAGRAM

The basic concept of IR (infrared) obstacle detection is to transmit an IR
signal (radiation) in a given direction; a signal is received at the IR
receiver (photodiode) when the IR radiation bounces back from the surface of
an object. An IR-based sensing unit can thus detect an obstacle in a
particular direction and even estimate the distance to the object.

The proposed system uses a number of IR sensors to alert the visually
impaired person to obstacles in the path. The sensors are mounted on the
shoe at its front, left and right sides so that the position of an obstacle
can be determined accurately.

Different obstacle sensors are available in the market, such as sonar,
ultrasonic and IR sensors. The proposed system uses IR sensors because they
are highly directional and cheaper than the alternatives, so the system can
easily distinguish the direction of an obstacle.
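
The sketch below shows one plausible way to turn the three sensor readings
into a direction cue; the reading scale and the threshold value are
assumptions, not taken from the hardware described above.

// Illustrative decision logic for the three shoe-mounted IR sensors.
// Readings are assumed to be raw ADC values (0-1023); THRESHOLD is a
// hypothetical reflection-strength cut-off, not a measured value.
public class IrObstacleLogic {

    static final int THRESHOLD = 512;

    static String detectObstacleDirection(int front, int left, int right) {
        if (front > THRESHOLD) return "obstacle ahead";
        if (left > THRESHOLD)  return "obstacle on the left";
        if (right > THRESHOLD) return "obstacle on the right";
        return "path clear";
    }
}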
RGB SENSOR:
The RGB sensor detects obstacles based on the red, green and blue color
intensities of the light reflected at the boundary of the obstacle. The
sensor is mounted on the shoe at the front, facing the ground. Its output,
three separate color-intensity values, is fed to the microcontroller, which
can classify the surface as sketched below.
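
A minimal sketch of that classification step, assuming the three intensity
values have already been read from the sensor (the labels are illustrative):

// Maps the sensor's three intensity values to a dominant-color label.
public class RgbSurfaceLogic {

    static String dominantColor(int red, int green, int blue) {
        if (red >= green && red >= blue)   return "red surface";
        if (green >= red && green >= blue) return "green surface";
        return "blue surface";
    }
}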

CHAPTER 6

SYSTEM TESTING

The purpose of testing is to discover errors. Testing is the process of
trying to discover every conceivable fault or weakness in a work product. It
provides a way to check the functionality of components, sub-assemblies,
assemblies and/or a finished product. It is the process of exercising
software with the intent of ensuring that the software system meets its
requirements and user expectations and does not fail in an unacceptable
manner. There are various types of test, and each type addresses a specific
testing requirement.

TYPES OF TESTS

Unit testing
Unit testing involves the design of test cases that validate that the
internal program logic is functioning properly and that program inputs
produce valid outputs. All decision branches and internal code flow should
be validated. It is the testing of individual software units of the
application; it is done after the completion of an individual unit and
before integration. This is structural testing that relies on knowledge of
the unit's construction and is invasive. Unit tests perform basic tests at
component level and test a specific business process, application, and/or
system configuration. Unit tests ensure that each unique path of a business
process performs accurately to the documented specifications and contains
clearly defined inputs and expected results.

Integration testing

Integration tests are designed to test integrated software components to
determine whether they actually run as one program. Testing is event driven
and is more concerned with the basic outcome of screens or fields.
Integration tests demonstrate that although the components were
individually satisfactory, as shown by successful unit testing, the
combination of components is correct and consistent. Integration testing is
specifically aimed at exposing the problems that arise from the combination
of components.

Functional test

Functional tests provide systematic demonstrations that the functions
tested are available as specified by the business and technical
requirements, system documentation, and user manuals.

Functional testing is centered on the following items:

Valid Input : identified classes of valid input must be accepted.

Invalid Input : identified classes of invalid input must be rejected.

Functions : identified functions must be exercised.

Output : identified classes of application outputs must be exercised.

Systems/Procedures : interfacing systems or procedures must be invoked.


Organization and preparation of functional tests is focused on
requirements, key functions, or special test cases. In addition, systematic
coverage of business process flows, data fields, predefined processes, and
successive processes must be considered for testing. Before functional
testing is complete, additional tests are identified and the effective
value of the current tests is determined.

System Test
System testing ensures that the entire integrated software system meets
requirements. It tests a configuration to ensure known and predictable
results. An example of system testing is the configuration-oriented system
integration test. System testing is based on process descriptions and
flows, emphasizing pre-driven process links and integration points.

White Box Testing

White Box Testing is testing in which the software tester has knowledge
of the inner workings, structure and language of the software, or at least
its purpose. It is used to test areas that cannot be reached from a black
box level.

Black Box Testing

Black Box Testing is testing the software without any knowledge of the
inner workings, structure or language of the module being tested. Black box
tests, like most other kinds of tests, must be written from a definitive
source document, such as a specification or requirements document. It is
testing in which the software under test is treated as a black box: you
cannot "see" into it. The test provides inputs and responds to outputs
without considering how the software works.

6.1 Unit Testing:

Unit testing is usually conducted as part of a combined code and unit test
phase of the software lifecycle, although it is not uncommon for coding and
unit testing to be conducted as two distinct phases.

Test strategy and approach


Field testing will be performed manually and functional tests will be
written in detail.

Test objectives
 All field entries must work properly.
 Pages must be activated from the identified link.
 The entry screen, messages and responses must not be delayed.

Features to be tested
 Verify that the entries are of the correct format
 No duplicate entries should be allowed
 All links should take the user to the correct page.
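
As an illustration of such a test, here is a minimal JUnit sketch that
exercises the createRandLocation() helper from the map activity listed
earlier; the local copy of the method and the seed coordinates are
assumptions made only so the sketch is self-contained.

import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class RandLocationTest {

    // Local copy of createRandLocation() from the map activity, reproduced
    // here so the test can call it directly (the original is private).
    private double[] createRandLocation(double latitude, double longitude) {
        return new double[] { latitude + ((Math.random() - 0.5) / 500),
                longitude + ((Math.random() - 0.5) / 500),
                150 + ((Math.random() - 0.5) * 10) };
    }

    // The offsets are bounded by 0.5 / 500 = 0.001 degrees, so every
    // generated point must stay within that band around the seed.
    @Test
    public void randomLocationStaysNearSeed() {
        for (int i = 0; i < 1000; i++) {
            double[] p = createRandLocation(19.07, 72.87);
            assertTrue(Math.abs(p[0] - 19.07) <= 0.001);
            assertTrue(Math.abs(p[1] - 72.87) <= 0.001);
        }
    }
}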

6.2 Integration Testing

Software integration testing is the incremental integration testing of two
or more integrated software components on a single platform, aimed at
producing failures caused by interface defects.

The task of the integration test is to check that components or software
applications, e.g. components in a software system or, one step up,
software applications at the company level, interact without error.

Test Results: All the test cases mentioned above passed successfully. No
defects encountered.

6.3 Acceptance Testing

User Acceptance Testing is a critical phase of any project and requires
significant participation by the end user. It also ensures that the system
meets the functional requirements.

Test Results: All the test cases mentioned above passed successfully. No
defects encountered.

CONCLUSION

In this proposed system, particular attention is paid to sensor fusion,
seamless switching between indoor and outdoor navigation, route
announcement, and minimizing the amount of infrastructure augmentation
required for localizing the user. The overall aim is to design and
construct a portable, simple, low-cost device that helps visually impaired
people move in unfamiliar environments as well. The proposed system is
designed to be useful for all ages and user friendly, and it requires no
prior training or knowledge of advanced technologies. The primary objective
is a system that is cost effective and easy to handle, even for a visually
impaired illiterate person.

LIMITATIONS OF THE SYSTEM

The biggest limitation of this method is that it will not work in all
lighting conditions: the lighting has to be good for objects to be
detected. Secondly, the accuracy of detecting an object is higher with an
ultrasonic sensor than with an infrared sensor.

MERITS OF THE SYSTEM


The prototype incorporates technologies such as GPS, SONAR and Wi-Fi that
can communicate with each other to increase the accuracy of the navigation
system. Location detection with accurate GPS co-ordinates and navigation
for the user with real-time obstacle detection are the prime objectives of
the prototype. The project is also designed to measure the feasibility and
reliability of creating an augmented reality-based navigation system.
Finally, after lab testing, the real-world tests aimed to measure the
feasibility of the prototype as a replacement for the cane stick or the
guide dog.

FUTURE ENHANCEMENT
Our project can be a good platform for someone who would like to start
production of these navigation systems. We also have some ideas for future
research and enhancements of the system. First, there are a number of
technologies in their nascent stage which, if proven successful, would
allow wearable technology to stitch all our circuits into the user's
clothes. During our survey, we came across a common request that users did
not want to stand out in a crowd. With wearable technology, the product
could have a designer appeal, something the users highly desire. Second, a
study should be done to improve the accuracy of indoor GPS. Third, the
system can be further enhanced with piezoelectric sensors which can detect
capacitance and warn the user of changing terrain conditions such as black
ice, water or an oil spill. Fourth, the overhead branch-detection sensors
can be incorporated in sunglasses so as to provide an alternative to the
cap. Fifth, the SONAR sensors should be improved to warn the user with
speech feedback of an approaching target such as a cyclist. In the current
version of the system, we do not consider targets approaching faster than
walking speed. Also, there are cases where a cyclist could be unaware that
the user is blind and expect the user to yield way. The system should be
capable of warning the user as well as activating an emergency notification
to the cyclist.

APPENDIX

REFERENCES
[1] Chaitali K. Lakde and Dr. Prakash S. Prasad, "Review Paper on
Navigation System for Visually Impaired People", International Journal of
Advanced Research in Computer and Communication Engineering, Vol. 4,
Issue 1, January 2015.
[2] N. Mahmud, R. K. Saha, R. B. Zafar, M. B. H. Bhuian, and S. S. Sarwar,
"Vibration and Voice Operated Navigation System for Visually Impaired
Person", 3rd International Conference on Informatics, Electronics &
Vision, 2014.
[3] B. B. Blasch, W. R. Wiener, and R. L. Welsh, "Foundations of
Orientation and Mobility", 2nd ed. New York: AFB Press, 1997.
[4] D. Jain, M. Balakrishnan, and P. V. M. Rao, "Roshni: Indoor Navigation
System for Visually Impaired".
[5] R. Tapu, B. Mocanu, and T. Zaharia, "Real time static/dynamic obstacle
detection for visually impaired persons", IEEE International Conference on
Consumer Electronics (ICCE), 978-1-4799-2191-9/14, pp. 394-395, 2014.
[6] V. Kulyukin, C. Gharpure, J. Nicholson, and S. Pavithran, "RFID in
Robot-Assisted Indoor Navigation for the Visually Impaired", Proceedings
of the 2004 IEEE/RSJ International Conference on Intelligent Robots and
Systems, September 28 - October 2, 2004, Sendai, Japan.
[7] Lisa Ran, Sumi Helal, and Steve Moore, "Drishti: An Integrated
Indoor/Outdoor Blind Navigation System and Service", Proceedings of the
Second IEEE Annual Conference on Pervasive Computing and Communications,
IEEE, 2004.
[8] Arjun Sharma, Rahul Patidar, Shubham Mandovara, and Ishwar Rathod,
"Blind Audio Guidance System", International Journal of Emerging
Technology and Advanced Engineering, Volume 3, January 2013, pp. 17-19.
[9] Shraga Shoval, Johann Borenstein, and Yoram Koren, "The NavBelt - a
computerized travel aid for the blind based on mobile robotics
technology", IEEE Transactions on Biomedical Engineering, Vol. 45, No. 11,
pp. 1376-1386, 1998.
[10] A. Aladren, G. Lopez-Nicolas, Luis Puig, and Josechu J. Guerrero,
"Navigation Assistance for the Visually Impaired Using RGB-D Sensor With
Range Expansion", IEEE, 2014.
[11] Mounir Bousbia-Salah, Abdelghani Redjati, Mohamed Fezari, and Maamar
Bettayeb, "An Ultrasonic Navigation System for Blind People", IEEE
International Conference on Signal Processing and Communications (ICSPC
2007), Dubai, 24-27 November 2007, pp. 1003-1006.
