
COMMON ASSIGNMENT FOR Research Methodology

Name of the guide: Dr. M.G. MULLA


Name of the student: Singh Lavakush Shivbali
Research centre: Department of Commerce and Research Centre of Pune University
2016-17
Q.1 Explain the term Research and Development. What is the role of Research and
Development in the advancement of an economy?
Answer No. 1
Role of R&D in the Development of an Economy
Research and development (R&D) spend in India has grown steadily to approximately $40 billion,
representing around 0.9 per cent of gross domestic product (GDP). This is still short of the Indian
government's target of 2 per cent of GDP, set in 2010 and reaffirmed in 2012. R&D spend is
an indicator of the competitiveness of a country's economy, and many countries have set a target of
investing one per cent of their GDP in R&D; some developed countries have set their targets at three
per cent of GDP. The USA, the UK, Australia, China, Japan, New Zealand, the Republic of Korea, the
Russian Federation and Singapore have each spent more than one per cent (in the range of one to four per
cent) of their respective GDP on R&D. The pace of technological progress is directly proportional to
the effort put into R&D, so expenditure levels on R&D can act as a reliable barometer of
innovative capacity. Given the importance of R&D for any economy, it is important for the
Government of India to provide incentives that encourage companies and organisations to carry out or
expand R&D activity. Currently, a weighted deduction of 200 per cent is allowed on R&D expenditure
incurred by an approved in-house R&D facility of any company engaged in the manufacture or
production of articles or things. In this regard, in the upcoming Budget, the government can introduce
the necessary amendments or clarifications to boost R&D activity, particularly with respect to the
following:
Introduction:
Many definitions of R&D have been given by various organizations. UNESCO defines R&D as "any
creative systematic activity undertaken in order to increase the stock of knowledge, including
knowledge of man, culture and society, and the use of this knowledge to devise new applications."
The United Nations Statistics Division defines R&D as follows: "Research and development by a market
producer is an activity undertaken for the purpose of discovering or developing new products,
including improved versions or qualities of existing products, or discovering or developing new or
more efficient processes of production."
Definition: Research and development (R&D) consists of investigative activities that a business
chooses to conduct with the intention of making a discovery that can either lead to the development
of new products or procedures, or to the improvement of existing products or procedures. Research and
development is one of the means by which businesses can achieve future growth, developing new
products or processes to improve and expand their operations.
Meaning
Research and Development (R&D), also known in Europe as research and technical (or
technological) development (RTD), is a general term for activities in connection with corporate or
governmental innovation. R&D is a component of Innovation and is situated at the front end of the
Innovation lifecycle. Innovation builds on R&D and includes commercialization phases. The
activities that are classified as R&D differ from company to company, but there are two primary
models, with an R&D department being either staffed by engineers and tasked with directly
developing new products, or staffed with industrial scientists and tasked with applied research in
scientific or technological fields which may facilitate future product development. In either case,
R&D differs from the vast majority of corporate activities in that it is not often intended to yield
immediate profit, and generally carries greater risk and an uncertain return on investment.
New product design and development is more often than not a crucial factor in the survival of a
company. In an industry that is changing fast, firms must continually revise their design and range of
products. This is necessary due to continuous technology change and development as well as other
competitors and the changing preference of customers. Without an R&D program, a firm must rely on
strategic alliances, acquisitions, and networks to tap into the innovations of others.
A system driven by marketing is one that puts customer needs first and only produces goods that
are known to sell; market research is carried out to establish what is needed. If the
development is market driven, then R&D is directed toward developing products that market
research indicates will meet an unmet need.
In general, R&D activities are conducted by specialized units or centers belonging to a company, or
can be out-sourced to a contract research organization, universities, or state agencies. In the context of
commerce, "research and development" normally refers to future-oriented, longer-term activities in
science or technology, using similar techniques to scientific research but directed toward desired
outcomes and with broad forecasts of commercial yield.

Statistics on organizations devoted to R&D may express the state of an industry, the degree of
competition or the lure of progress. Common measures include budgets, numbers of patents, and
rates of peer-reviewed publications. Bank ratios are among the best measures, because they are
continuously maintained, public, and reflect risk.
In the U.S., a typical ratio of research and development for an industrial company is about 3.5% of
revenues; this measure is called "R&D intensity". A high technology company such as a computer
manufacturer might spend 7%. Although Allergan (a biotech company) tops the spending table with
43.4% investment, anything over 15% is remarkable and usually gains a reputation for being a high
technology company. Companies in this category include pharmaceutical companies such as Merck &
Co. (14.1%) or Novartis (15.1%), and engineering companies like Ericsson (24.9%).[1] Such
companies are often seen as credit risks because their spending ratios are so unusual.
Generally such firms prosper only in markets whose customers have extreme needs, such as
medicine, scientific instruments, safety-critical mechanisms (aircraft) or high technology military
armaments. The extreme needs justify the high risk of failure and consequently high gross margins
from 60% to 90% of revenues. That is, gross profits can be as much as 90% of the sales price, with
manufacturing costing only 10% of it, because so many individual projects yield no
exploitable product. Most industrial companies, by contrast, achieve gross margins of only about 40% of revenues.
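The two ratios discussed above can be made concrete with a small worked sketch. The figures below are illustrative round numbers chosen to match the percentages quoted in the text, not real company data:

```python
# Illustrative calculations of R&D intensity and gross margin.
# All input figures are hypothetical examples, not real company data.

def rd_intensity(rd_spend, revenue):
    """R&D intensity = R&D expenditure as a fraction of revenue."""
    return rd_spend / revenue

def gross_margin(revenue, cost_of_goods):
    """Gross margin = (revenue - cost of goods sold) / revenue."""
    return (revenue - cost_of_goods) / revenue

# A typical industrial company: $35M of R&D on $1,000M of revenue.
print(f"{rd_intensity(35, 1000):.1%}")   # 3.5%

# A high-tech firm whose manufacturing cost is only 10% of the price.
print(f"{gross_margin(100, 10):.0%}")    # 90%
```

On these assumed figures, the first company sits at the typical 3.5% R&D intensity, while the second shows the 90% gross margin described for high-risk, R&D-heavy businesses.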
On a technical level, high tech organizations explore ways to re-purpose and repackage advanced
technologies as a way of amortizing the high overhead. They often reuse advanced manufacturing
processes, expensive safety certifications, specialized embedded software, computer-aided design
software, electronic designs and mechanical subsystems.
Research has shown that firms with a persistent R&D strategy outperform those with an irregular or
no R&D investment program.

Economic theory emphasises the accumulation of R&D and human capital in explaining economic
growth. We can answer the question in the heading above by looking at how much output will
increase when the level of R&D input increases. We measure this by estimating the elasticity of
output with respect to the R&D capital stock, which is equal to the rate of return to R&D multiplied
by the share of the R&D stock in output. A large empirical literature has sought to estimate the rate
of return to R&D. In general, this literature finds social rates of return to R&D substantially above
private rates of return. These findings are summarised by Griliches (1992): "In spite of (many)
difficulties, there has been a significant number of reasonably well-done studies, all pointing in the
same direction: R&D spillovers are present, their magnitude may be quite large, and social rates of
return remain significantly above private rates." The private rate of return can be estimated by looking
at the impact of a firm's own R&D on the firm's output. Estimates of the private rate of return to
R&D are made using US firm-level data in Griliches (1992). The estimated elasticity of output with
respect to R&D is around 0.07. This says that for a 10% increase in R&D expenditure there will be a
bit less than a 1% increase in output (0.7%), which implies a rate of return of around 27% for R&D.
Hall (1996) reports that estimates of private rates of return to R&D cluster around 10-15%, though
they can be as high as 30% in some studies. The social rate of return is generally obtained by estimating
the impact on growth in one firm of R&D done in other firms; these estimates range from 17% to
34%. Estimates of the social return attributable to R&D conducted in one industry but
used in another (for example, R&D carried out in an upstream industry) are significantly higher.
Adding the two together implies a social rate of return of around 100%.
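The arithmetic linking the elasticity to the rate of return can be sketched as follows. The R&D-stock share used here is an assumed value, back-calculated so that the numbers reproduce the figures quoted from Griliches (1992); it is not taken from the study itself:

```python
# Elasticity of output w.r.t. R&D = rate of return * (R&D stock / output),
# so rate of return = elasticity / share. The share of 0.26 below is an
# assumption chosen purely to reproduce the ~27% figure quoted in the text.

elasticity = 0.07   # estimated elasticity of output with respect to R&D
share = 0.26        # assumed share of the R&D stock in output (illustrative)

# A 10% rise in R&D raises output by elasticity * 10%:
output_gain = elasticity * 0.10
print(f"{output_gain:.1%}")     # 0.7%

# Implied private rate of return to R&D:
rate_of_return = elasticity / share
print(f"{rate_of_return:.0%}")  # 27%
```

This is just the definitional identity rearranged; with a different R&D-stock share the same elasticity would imply a different rate of return.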

These estimates are largely based on data for the manufacturing sector. Empirical results on the social
rate of return to R&D are integrated into a macroeconomic model of endogenous innovation and
growth by Jones and Williams (1998). R&D plays a role not only in leading to new innovations but
also in enhancing firms' ability to imitate: R&D not only stimulates innovation but also plays an
important role in the adoption of existing technologies. Empirical evidence lends support to these
ideas. A country's distance from
the technological frontier is used as a direct measure of the potential for technology transfer, where
the frontier is defined for each industry as the country with the highest level of total factor
productivity (TFP). Eaton et al. (1998) calculate an estimate of the value of invention relative to the
reward for production work (an average wage) and find that the incentives to carry out R&D are much
lower in all other countries than in the US and Japan. In the UK, for example, the incentive to do
research is 0.372, compared with 1.0 in the US and 0.504 in Japan. The explanation provided by
Eaton et al. is that research incentives are lower due to the smaller market size in other OECD countries.
Market failures such as the underdevelopment of financial markets may also act as barriers to R&D
investment.
Answer 2.
Introduction
The research ecosystem in India presents a significant opportunity for multinational corporations
across the world due to the intellectual capital available in the country. The legions of Indian engineers
working across the globe highlight the highly trained manpower available at competitive costs.
Consequently, several MNCs have shifted or are shifting their research and development (R&D) base
to India. These R&D bases either develop products to serve the local market or help the parent
company overseas deliver innovative new generations of products faster to markets across the
world. India is presently ranked 76th among a total of 143 economies, as per the Global Innovation
Index (GII).

Market Size
The overall India-based R&D Globalisation and R&D Services market reached US$ 20 billion in 2015,
up by 9.9 per cent over 2014. The R&D Services market stood at US$ 7.76 billion and the R&D
Globalisation market (captives) stood at US$ 12.25 billion. India's R&D globalisation and services
market is set to almost double by 2020, to US$ 38 billion.
According to the study, India-based R&D services companies, which account for almost 22 per cent
of the global addressed market, grew much faster at 12.67 per cent. The market for Engineering R&D
(ER&D) companies in India is mainly structured across pure play PES companies such as Cyient,
QuEST, eInfochips and the larger IT companies with a PES play such as Wipro, TCS, HCL. India's
ER&D services market is expected to reach US$ 15-17 billion by 2020 and North America continues
to be the largest market contributing to 55 per cent of revenues.
Recent Investments and Developments
Indian Space Research Organisation (ISRO) is taking steps towards developing its own reusable
rocket using a Winged Reusable Launch Vehicle Technology Demonstrator (RLV-TD), whose tech
demo is expected to be conducted in February 2016.
German automotive firm Bosch Gmbh has signed a memorandum of understanding (MoU) with the
Indian Institute of Science (IISc), Bangalore with a view to strengthening Bosch's research and
development in areas including mobility and healthcare, thereby driving innovation for India-centric
requirements.
Chinese telecom gear maker Huawei has launched a Research and Development (R&D) campus in
Bengaluru with an investment of US$ 170 million. The campus, the first by any Chinese company,
has a capacity to accommodate 5,000 engineers and is the largest R&D centre of Huawei outside
China. At present 2,700 people are working here, with 98 per cent being local workers.

Kellogg plans to set up its R&D centre in India in 2015. Besides developing products specifically for
India, Kellogg aims to make its India R&D unit a lead centre for creating products for other emerging
markets where Kellogg is present.
Chinese smartphone maker Xiaomi has signed the lease for its R&D centre in Bengaluru, which is
expected to be launched later in 2015. Xiaomi has leased 20,000 square feet of building space for its
R&D centre in Cessna Business Park in Bengaluru.
American chipmaker Broadcom is betting big on developing solutions tailored to India and other
emerging nations from its Bengaluru R&D unit. The key opportunities focused on are in the areas of
Internet of Things and wearables market in India. Out of Broadcom's global headcount of about
11,000 people, the India R&D centre has about 1,500 people.
Dell is looking towards strengthening its engineering team in India and increasing the number of
patent applications filed from India. The country has already become the largest contributor to
software patents for Dell.
Specialty chemicals maker BASF SE is setting up an innovation campus in Navi Mumbai at a total
investment of Rs 360 crore (US$ 54 million). The innovation campus is in addition to the R&D
centers in Mumbai and Navi Mumbai.
Twitter Inc is planning to set up its first facility outside the US in the form of an R&D centre in
Bengaluru, to grow faster and accelerate user adoption in emerging markets. Twitter plans to use
the team of Bengaluru-based mobile marketing and analytics company ZipDial Mobile Solutions
Pvt. Ltd to build this new R&D facility.
French three-dimensional design software firm Dassault Systèmes has decided to extend its 'Living
Heart' research initiative to India, a move that will give surgeons in India's top cardiac institutes
access to the company's 3D heart simulation platform.

California based Cohesity, a start-up involved in secondary storage space, has recently launched
operations in India and is expected to invest US$ 10 million in the country over the next two years in
research and development.
Sequoia Capital will invest US$ 120 million in Bengaluru's MedGenome, a genomics-based
diagnostics and research firm that uses DNA sequencing and data analytics to conduct genetic testing
for cancer and other rare diseases.
Government Initiatives
The Government of India has taken several steps to promote the R&D sector in India. In the Union
Budget 2015-16, the government announced plans to establish the National Institution for Transforming
India (NITI) to increase the involvement of entrepreneurs and researchers and foster scientific innovation.
India's Steel Minister, Mr Narendra Singh Tomar, has announced the creation of a fund of Rs 100 crore
(US$ 15 million) to help set up R&D units, with participation from industry and the government, to
overcome technological gaps. Mr Tomar said, "It is under the active consideration of the
government to infuse more funds in this initiative to utilise locally available cheap raw material, to
remain competitive in the world market."
A team of scientists from India and Bangladesh will conduct, for the first time, joint marine research
within Bangladesh's Exclusive Economic Zone (EEZ), which is expected to help in understanding
climate change and monsoon patterns in India.
Road Ahead
With the government's support, the R&D sector in India is set to witness robust growth in
the coming years. According to a study by management consulting firm Zinnov, the engineering R&D
market in India is estimated to grow at a CAGR of 14 per cent to reach US$ 42 billion by 2020.
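The CAGR projection can be checked with the standard compound-growth formula. The base value (the roughly US$ 20 billion 2015 market quoted earlier) and the five-year horizon are inferred from the text; the study's own base year and figure may differ, so treat them as assumptions:

```python
# Compound annual growth: end = start * (1 + rate) ** years.
# Base value (US$ 20B, 2015) and 5-year horizon are assumptions
# inferred from figures quoted elsewhere in the text.

start, rate, years = 20.0, 0.14, 5
end = start * (1 + rate) ** years
print(f"US$ {end:.1f} billion")   # US$ 38.5 billion

# Conversely, the growth rate implied by going from US$20B to US$42B
# in five years:
implied_cagr = (42 / 20) ** (1 / 5) - 1
print(f"{implied_cagr:.1%}")      # 16.0%
```

The small gap between the two results simply reflects that a 14% CAGR on a US$ 20 billion base compounds to about US$ 38-39 billion over five years; the US$ 42 billion figure would follow from a slightly higher base or growth rate in the underlying study.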

India is also expected to witness strong growth in its agriculture and pharmaceutical sectors as the
government is investing large sums to set up dedicated research centres for R&D in these sectors. The
Indian IT industry is also expected to add to the development of the R&D sector.

[Chart omitted; source: World Bank]


Answer 5
Scientific and Critical Thinking
When one uses the scientific method to study or investigate nature or the universe, one is practicing
scientific thinking. All scientists practice scientific thinking, of course, since they are actively
studying nature and investigating the universe by using the scientific method. But scientific thinking
is not reserved solely for scientists. Anyone can "think like a scientist" who learns the scientific
method and, most importantly, applies its precepts, whether he or she is investigating nature or not.
When one uses the methods and principles of scientific thinking in everyday life--such as when
studying history or literature, investigating societies or governments, seeking solutions to problems of

economics or philosophy, or just trying to answer personal questions about oneself or the meaning of
existence--one is said to be practicing critical thinking. Critical thinking is thinking correctly for
oneself that successfully leads to the most reliable answers to questions and solutions to problems. In
other words, critical thinking gives you reliable knowledge about all aspects of your life and society,
and is not restricted to the formal study of nature. Scientific thinking is identical in theory and
practice, but the term would be used to describe the method that gives you reliable knowledge about
the natural world. Clearly, scientific and critical thinking are the same thing, but where one (scientific
thinking) is always practiced by scientists, the other (critical thinking) is sometimes used by humans
and sometimes not. Scientific and critical thinking was not discovered and developed by scientists
(that honor must go to ancient Hellenistic philosophers, such as Aristotle, who also are sometimes
considered the first scientists), but scientists were the ones to bring the practice of critical thinking to
the attention and use of modern society (in the 17th and 18th centuries), and they are the most
explicit, rigorous, and successful practitioners of critical thinking today. Some professionals in the
humanities, social sciences, jurisprudence, business, and journalism practice critical thinking as well
as any scientist, but many, alas, do not. Scientists must practice critical thinking to be successful, but
the qualifications for success in other professions do not necessarily require the use of critical
thinking, a fact that is the source of much confusion, discord, and unhappiness in our society.
The scientific method has proven to be the most reliable and successful method of thinking in human
history, and it is quite possible to use scientific thinking in other human endeavors. For this reason,
critical thinking--the application of scientific thinking to all areas of study and topics of
investigation--is being taught in schools throughout the United States, and its teaching is being
encouraged as a universal ideal. You may perhaps have been exposed to critical thinking skills and
exercises earlier in your education. The important point is this: critical thinking is perhaps the most
important skill a student can learn in school and college, since if you master its skills, you know how
to think successfully and reach reliable conclusions, and such ability will prove valuable in any

human endeavor, including the humanities, social sciences, commerce, law, journalism, and
government, as well as in scholarly and scientific pursuits. Since critical thinking and scientific
thinking are, as I claim, the same thing, only applied for different purposes, it is therefore reasonable
to believe that if one learns scientific thinking in a science class, one learns, at the same time, the
most important skill a student can possess--critical thinking. This, to my mind, is perhaps the
foremost reason for college students to study science, no matter what one's eventual major, interest, or
profession.

The Three Central Components of Scientific and Critical Thinking


What is scientific thinking? At this point, it is customary to discuss questions, observations, data,
hypotheses, testing, and theories, which are the formal parts of the scientific method, but these are
NOT the most important components of the scientific method. The scientific method is practiced
within a context of scientific thinking, and scientific (and critical) thinking is based on three things:
using empirical evidence (empiricism), practicing logical reasoning (rationalism), and possessing a
skeptical attitude (skepticism) about presumed knowledge that leads to self-questioning, holding
tentative conclusions, and being undogmatic (willingness to change one's beliefs). These three ideas
or principles are universal throughout science; without them, there would be no scientific or critical
thinking. Let's examine each in turn.
1. Empiricism: The Use of Empirical Evidence
Empirical evidence is evidence that one can see, hear, touch, taste, or smell; it is evidence that is
accessible to one's senses. Empirical evidence is important because it is evidence that others besides
yourself can experience, and it is repeatable, so empirical evidence can be checked by yourself and
others after knowledge claims are made by an individual. Empirical evidence is the only type of

evidence that possesses these attributes and is therefore the only type used by scientists and critical
thinkers to make vital decisions and reach sound conclusions.
The most common alternative to empirical evidence, authoritarian evidence, is what authorities
(people, books, billboards, television commercials, etc.) tell you to believe. Sometimes, if the
authority is reliable, authoritarian evidence is reliable evidence, but many authorities are not reliable,
so you must check the reliability of each authority before you accept its evidence. In the end, you
must be your own authority and rely on your own powers of critical thinking to know if what you
believe is reliably true. (Transmitting knowledge by authority is, however, the most common method
among humans, for three reasons: first, we are all conditioned from birth by our parents, through the
use of positive and negative reinforcement, to listen to, believe, and obey authorities; second, it is
believed that human societies that relied on a few experienced or trained authorities for decisions that
affected all had a higher survival value than those that didn't, and thus the behavioral trait of
susceptibility to authority was strengthened and passed along to future generations by natural
selection; third, authoritarian instruction is the quickest and most efficient method we know of for
transmitting information.) But remember: some authoritarian evidence and knowledge should be
validated by empirical evidence, logical reasoning, and critical thinking before you consider it
reliable, and, in most cases, only you can do this for yourself.
Another name for empirical evidence is natural evidence: the evidence found in nature. Naturalism is
the philosophy that says that "Reality and existence (i.e. the universe, cosmos, or nature) can be
described and explained solely in terms of natural evidence, natural processes, and natural laws." This
is exactly what science tries to do. Another popular definition of naturalism is that "The universe
exists as science says it does." This definition emphasizes the strong link between science and natural
evidence and law, and it reveals that our best understanding of material reality and existence is
ultimately based on philosophy. The supernatural, if it exists, cannot be examined or tested by science,
so it is irrelevant to science. It is impossible to possess reliable knowledge about the supernatural by

the use of scientific and critical thinking. Individuals who claim to have knowledge about the
supernatural do not possess this knowledge by the use of critical thinking, but by other methods of
knowing.
Science has unquestionably been the most successful human endeavor in the history of civilization,
because it is the only method that successfully discovers and formulates reliable knowledge. The
evidence for this statement is so overwhelming that many individuals overlook exactly how modern
civilization came to be: our modern civilization is based, from top to bottom, on the discoveries of
science and their application (known as technology) to human purposes.
2. Rationalism: The Practice of Logical Reasoning
Scientists and critical thinkers always use logical reasoning. Logic allows us to reason correctly, but it
is a complex topic and not easily learned; many books are devoted to explaining how to reason
correctly, and we cannot go into the details here. However, I must point out that most individuals do
not reason logically, because they have never learned how to do so. Logic is not an ability that
humans are born with or one that will gradually develop and improve on its own, but is a skill or
discipline that must be learned within a formal educational environment. Emotional thinking, hopeful
thinking, and wishful thinking are much more common than logical thinking, because they are far
easier and more congenial to human nature. Most individuals would rather believe something is true
because they feel it is true, hope it is true, or wish it were true than deny their emotions and
accept that their beliefs are false. Such emotional, hopeful, and wishful thinking often goes
unchallenged in modern society--often leading to results that are counterproductive to the good of
society, or even tragic--because so many people don't recognize it for what it is.
3. Skepticism: Possessing a Skeptical Attitude
The final key idea in science and critical thinking is skepticism, the constant questioning of your
beliefs and conclusions. Good scientists and critical thinkers constantly examine the evidence,

arguments, and reasons for their beliefs. Self-deception and deception of yourself by others are two of
the most common human failings. Self-deception often goes unrecognized because most people
deceive themselves. The only way to escape both deception by others and the far more common trait
of self-deception is to repeatedly and rigorously examine your basis for holding your beliefs. You
must question the truth and reliability of both the knowledge claims of others and the knowledge you
already possess. One way to do this is to test your beliefs against objective reality by predicting the
consequences or logical outcomes of your beliefs and the actions that follow from your beliefs. If the
logical consequences of your beliefs match objective reality--as measured by empirical evidence--you
can conclude that your beliefs are reliable knowledge (that is, your beliefs have a high probability of
being true). Science treats new ideas with the same skepticism: extraordinary claims require
extraordinary evidence to justify one's credulity. We are faced every day with fantastic, bizarre, and
outrageous claims about the natural world; if we don't wish to believe every pseudoscientific
allegation or claim of the paranormal, we must have some method of deciding what to believe or not,
and that method is the scientific method which uses critical thinking.
Scientific Thinking in Research
One must ask a meaningful question or identify a significant problem, and one should be able to state
the problem or question in a way that it is conceivably possible to answer it. Any attempt to gain
knowledge must start here. Here is where emotions and outside influences come in. For example, all
scientists are very curious about nature, and they have to possess this emotional characteristic to
sustain the motivation and energy necessary to perform the hard and often tedious work of science.
Other emotions that can enter are excitement, ambition, anger, a sense of unfairness, happiness, and
so forth. Note that scientists have emotions, some in high degree; however, they don't let their
emotions give false validity to their conclusions, and, in fact, the scientific method prevents them
from trying to do this even if they wished.

Many outside factors can come into play here.

Scientists must choose which problems to work on, they decide how much time to devote to
different problems, and they are often influenced by cultural, social, political, and economic
factors. Scientists live and work within a culture that often shapes their approach to problems;
they work within theories that often shape their current understanding of nature; they work
within a society that often decides what scientific topics will be financially supported and
which will not; and they work within a political system that often determines which topics are
permitted and financially rewarded and which are not.


Also, at this point, normally non-scientific emotional factors can lead to divergent pathways.
Scientists could be angry at polluters and choose to investigate the effects of pollutants; other
scientists could investigate the results of smoking cigarettes on humans because they can earn
a living doing this by working for tobacco companies; intuition can be used to suggest
different approaches to problems; even dreams can suggest creative solutions to problems.
I wish to emphasize, however, that the existence of these frankly widespread nonscientific
emotional and cultural influences does not compromise the ultimate reliability and objectivity
of scientific results, because subsequent steps in the scientific method serve to eliminate these
outside factors and allow science to reach reliable and objective conclusions (admittedly, it
may take some time for subjective and unreliable scientific results to be eliminated).
There exists a school of thought today in the humanities (philosophy, history, and sociology), called post-modernism or scientific constructivism, which claims that science is a social and cultural construct, that scientific knowledge inevitably changes as societies and cultures change, and that science has no inherently valid foundation on which to base its knowledge claims of objectivity and reliability.


In brief, post-modernists believe that the modern, scientific world of Enlightenment rationality and objectivity must now give way to a post-modern world of relativism, social constructivism, and equality of belief. Almost all scientists who are aware of this school of thought reject it, as do I; post-modernism is considered irrelevant by scientists and has had no impact on the practice of science at all. We will have to leave this interesting topic for a later time, unfortunately, but you may be exposed to these ideas in a humanities class. If you are, remember to think critically!


One must next gather relevant information to attempt to answer the question or solve the problem by making observations.


The first observations could be data obtained from the library or information from your own experience. Another source of observations could be trial experiments or past experiments. These observations, and all that follow, must be empirical in nature--that is, they must be sensible, measurable, and repeatable, so that others can make the same observations. Great ingenuity and hard work on the part of the scientist are often necessary to make scientific observations. Furthermore, a great deal of training is necessary in order to learn the methods and techniques of gathering scientific data.


Now one can propose a solution or answer to the problem or question. In science, this suggested solution or answer is called a scientific hypothesis, and this is one of the most important steps a scientist can perform, because the proposed hypothesis must be stated in such a way that it is testable.


A scientific hypothesis is an informed, testable, and predictive solution to a scientific problem that explains a natural phenomenon, process, or event. In critical thinking, as in science, your proposed answer or solution must be testable; otherwise it is essentially useless for further investigation. Most individuals--non-critical thinkers all--stop here, and are satisfied with their first answer or solution, but this lack of skepticism is a major roadblock to gaining reliable knowledge. While some of these early proposed answers may be true, most will be false, and further investigation will almost always be necessary to determine their validity.


Next, one must test the hypothesis before it is corroborated and given any real validity. There are two ways to do this.


First, one can conduct an experiment. This is often presented in science textbooks as the only way to test hypotheses in science, but a little reflection will show that many natural problems are not amenable to experimentation, such as questions about stars, galaxies, mountain formation, the formation of the solar system, ancient evolutionary events, and so forth.
The second way to test a hypothesis is to make further observations. Every hypothesis has consequences and makes certain predictions about the phenomenon or process under investigation. Using logic and empirical evidence, one can test the hypothesis by examining how successful the predictions are, that is, how well the predictions and consequences agree with new data, further insights, new patterns, and perhaps with models. The testability or predictiveness of a hypothesis is its most important characteristic. Only hypotheses involving natural processes, natural events, and natural laws can be tested; the supernatural cannot be tested, so it lies outside of science and its existence or nonexistence is irrelevant to science.
If the hypothesis fails the test, it must be rejected and either abandoned or modified. Most hypotheses are modified by scientists who don't like to simply throw out an idea they think is correct and in which they have already invested a great deal of time or effort. Nevertheless, a modified hypothesis must be tested again.


If the hypothesis passes the further tests, it is considered to be a corroborated hypothesis, and can now be published. A corroborated hypothesis is one that has passed its tests, i.e., one whose predictions have been verified. Now other scientists test the hypothesis. If further corroborated by subsequent tests, it becomes highly corroborated and is now considered to be reliable knowledge. By the way, the technical name for this part of the scientific method is the "hypothetico-deductive method," so named because one deduces the results of the predictions of the hypothesis and tests these deductions. Inductive reasoning, the alternative to deductive reasoning, was used earlier to help formulate the hypothesis. Both of these types of reasoning are therefore used in science, and both must be used logically.
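The hypothetico-deductive cycle described above can be caricatured in a few lines of code. The hypothesis, the numbers, and the tolerance below are all invented purely for illustration; the point is only the control flow of deducing predictions and checking them against observations.

```python
# Toy sketch of the hypothetico-deductive method (all data invented).
# Hypothetical hypothesis: "crop yield is twice the fertiliser applied".

def predict_yield(fertiliser_kg):
    """Prediction deduced from the hypothesis."""
    return 2.0 * fertiliser_kg

# (fertiliser applied, yield actually measured) -- made-up observations
observations = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]

def corroborated(data, tolerance=0.3):
    """The hypothesis survives only if every prediction agrees with the
    measurement to within the tolerance; one clear failure rejects it."""
    return all(abs(predict_yield(x) - y) <= tolerance for x, y in data)

print(corroborated(observations))                 # these data corroborate it
print(corroborated(observations + [(4.0, 9.5)]))  # one failed prediction rejects it
```

Real tests compare predictions to data statistically rather than against a fixed tolerance, but the logic is the same: deduce, compare, and keep or reject the hypothesis.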


Scientists never claim that a hypothesis is "proved" in a strict sense (but sometimes this is quite legitimately claimed when using popular language), because proof is something found only in mathematics and logic, disciplines in which all logical parameters or constraints can be defined, and something that is not true in the natural world. Scientists prefer to use the word "corroborated" rather than "proved," but the meaning is essentially the same. A highly corroborated hypothesis becomes something else in addition to reliable knowledge--it becomes a scientific fact.


A scientific fact is a highly corroborated hypothesis that has been so repeatedly tested and for
which so much reliable evidence exists, that it would be perverse or irrational to deny it. This
type of reliable knowledge is the closest that humans can come to the "truth" about the
universe (I put the word "truth" in quotation marks because there are many different kinds of
truth, such as logical truth, emotional truth, religious truth, legal truth, philosophical truth,
etc.; it should be clear that this essay deals with scientific truth, which, while certainly not the
sole truth, is nevertheless the best truth humans can possess about the natural world).

Answer No 6
Meaning: A pilot study, pilot project or pilot experiment is a small-scale preliminary study conducted in order to evaluate feasibility, time, cost, adverse events, and effect size (statistical variability), in an attempt to predict an appropriate sample size and to improve the study design prior to performance of a full-scale research project. Pilot studies, therefore, may not be appropriate for case studies.
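One concrete use of a pilot's effect-size estimate is a rough sample-size calculation for the main study. The sketch below uses the standard normal-approximation formula for comparing two group means; the effect size, significance level, and power chosen here are illustrative assumptions, not prescriptions.

```python
import math
from statistics import NormalDist  # Python standard library, 3.8+

def sample_size_per_group(effect_size_d, alpha=0.05, power=0.80):
    """Approximate n per group for a two-sample comparison of means,
    via the normal approximation: n = 2 * ((z_{1-a/2} + z_{power}) / d)^2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size_d) ** 2)

# Suppose the pilot suggested a medium effect (Cohen's d of 0.5, illustrative):
print(sample_size_per_group(0.5))  # about 63 participants per group
```

With the t-distribution correction the figure comes out slightly larger, so treat this as a lower bound and use a proper power-analysis tool when planning the main study.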
Definition:

Landserton says,

Pilot experiments are frequently carried out before large-scale quantitative research, in an attempt to
avoid time and money being wasted on an inadequately designed project. A pilot study is usually
carried out on members of the relevant population, but not on those who will form part of the final
sample. This is because it may influence the later behaviour of research subjects if they have already
been involved in the research.
A pilot experiment/study is often used to test the design of the full-scale experiment, which can then be adjusted. It is a potentially valuable source of insight: should anything be missing in the pilot study, it can be added to the full-scale (and more expensive) experiment to improve the chances of a clear outcome.
Here are some more reasons to consider a pilot study:
1. It permits preliminary testing of the hypotheses that leads to testing more precise hypotheses in the
main study. It may lead to changing some hypotheses, dropping some, or developing new hypotheses.
2. It often provides the researcher with ideas, approaches, and clues that may not have been foreseen before conducting the pilot study. Such ideas and clues increase the chances of getting clearer findings in the main study.
3. It permits a thorough check of the planned statistical and analytical procedures, giving you a
chance to evaluate their usefulness for the data. You may then be able to make needed alterations in
the data collecting methods, and therefore, analyze data in the main study more efficiently.
4. It can greatly reduce the number of unanticipated problems because you have an opportunity to
redesign parts of your study to overcome difficulties that the pilot study reveals.
5. It may save a lot of time and money. Unfortunately, many research ideas that seem to show great
promise are unproductive when actually carried out. The pilot study almost always provides enough
data for the researcher to decide whether to go ahead with the main study.
6. In the pilot study, the researcher may try out a number of alternative measures and then select
those that produce the clearest results for the main study.
7. It is needed to detect possible flaws in measurement procedures (including instructions, time limits, etc.) and in the operationalisation of independent variables.
8. A pilot study is also valuable to identify unclear or ambiguous items in a questionnaire.

9. The non-verbal behaviour of participants in the pilot study may give important information about
any embarrassment or discomfort experienced concerning the content or wording of items in a
questionnaire.
Steps involved in conducting an effective Pilot Study
1. Plan and design the pilot study
2. Train personnel to accomplish change
3. Support and monitor the pilot study
4. Evaluate pilot results
5. Make recommendations and improvements
