COMPUTERS AND RESEARCHERS

1. Performing calculations almost at the speed of light, the computer has become one of the most useful research tools of modern times. Computers are ideally suited to data analysis in large research projects. Researchers are essentially concerned with storing huge volumes of data, retrieving them quickly when required, and processing them with the aid of various techniques. In all these operations, computers are of great help.

2. Computers can perform many statistical calculations easily and quickly. Computation of means, standard deviations, correlation coefficients, t-tests, analysis of variance, analysis of covariance, multiple regression, factor analysis and various nonparametric analyses are just a few of the programs and subprograms available at almost all computer centres.
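
As an illustration, here is a minimal Python sketch of a few of these calculations, using NumPy and SciPy on invented sample values:

```python
import numpy as np
from scipy import stats

# Invented sample data for two groups (illustration only)
group_a = np.array([12.1, 13.4, 11.8, 14.2, 12.9, 13.7])
group_b = np.array([10.9, 12.2, 11.5, 11.0, 12.8, 11.3])

# Measures of central tendency and dispersion
print("Mean:", np.mean(group_a))
print("Sample standard deviation:", np.std(group_a, ddof=1))

# Correlation coefficient between paired observations
r, _ = stats.pearsonr(group_a, group_b)
print("Pearson r:", r)

# Independent-samples t-test
t_stat, p_value = stats.ttest_ind(group_a, group_b)
print("t =", t_stat, "p =", p_value)
```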

3. Techniques involving a trial-and-error process are quite frequently employed in research methodology. This involves a great deal of calculation and repetitive work. The computer is ideally suited to such techniques, reducing the drudgery of researchers on the one hand and producing the final result rapidly on the other.
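
For instance, the bisection method for root-finding is a classic trial-and-error technique: the same simple test is repeated many times until the answer is pinned down. A minimal Python sketch (the function solved here is just an example):

```python
def bisect(f, lo, hi, tol=1e-8):
    """Repeated trial and error: narrow the interval until the root is found."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:   # sign change: root lies in the lower half
            hi = mid
        else:                     # otherwise: root lies in the upper half
            lo = mid
    return (lo + hi) / 2

# Example: solve x**3 - x - 2 = 0 on the interval [1, 2]
root = bisect(lambda x: x**3 - x - 2, 1.0, 2.0)
print(root)  # approximately 1.5213797
```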

4. The storage facility which computers provide is of immense help to a researcher, who can make use of the stored data whenever required.

The above description clearly indicates the usefulness of computers to researchers in data analysis. Researchers using computers can carry out their tasks faster and with greater reliability.
In spite of all this sophistication, we should not forget that computers are basically machines that only compute; they do not think. The human brain remains supreme and will continue to be so for all time. As such, researchers should be fully aware of the following limitations of computer-based analysis:

1) Computerized analysis requires setting up an elaborate system for the monitoring, collection and feeding of data. All of these require time, effort and money, so computer-based analysis may not prove economical for small projects.
2) Various items of detail which are not specifically fed to the computer may be overlooked.
3) The computer does not think; it can only execute the instructions of a thinking person. If poor
data or faulty programs are introduced into the computer, the data analysis would not be
worthwhile. The expression “garbage in, garbage out” describes this limitation very well.

Some of the limitations of computers are as follows:

No Self-Intelligence

A computer does not have intelligence of its own to complete tasks. It gives wrong output if the input supplied by humans is wrong; it simply works according to the instructions given to it by the user.

No Thinking and Decision Making Power

A computer cannot think for itself. The concept of artificial intelligence suggests that a computer can think, but even this depends on a set of instructions. A computer cannot take decisions on its own; it can only perform the tasks that its users instruct it to perform.

No Feeling

Lack of feeling is another limitation of the computer. A computer cannot feel as we do; it has no emotions, feelings or innate knowledge. On the other hand, it does not get tired and keeps on doing its tasks, and it can do very risky work of which human beings are not capable.

No Learning Power

A computer has no learning power; it cannot perform tasks without instructions, and it does not learn from the problems it solves. Each new task requires its own set of instructions, and the computer works only according to the instructions it is given.

LIMITATIONS OF THE TESTS OF HYPOTHESES

We have described above some important tests often used for testing hypotheses, on which important decisions may be based. But these tests have several limitations which should always be borne in mind by a researcher. The important limitations are as follows:

1. The tests should not be used in a mechanical fashion. It should be kept in view that
testing is not decision-making itself; the tests are only useful aids for decision-
making. Hence “proper interpretation of statistical evidence is important to intelligent
decisions.”
2. Tests do not explain the reasons why a difference exists, say between the means of two samples. They simply indicate whether the difference is due to fluctuations of sampling or to other reasons, but they do not tell us which other reason(s) caused the difference.
3. Results of significance tests are based on probabilities and as such cannot be
expressed with full certainty. When a test shows that a difference is statistically
significant, then it simply suggests that the difference is probably not due to chance.
4. Statistical inferences based on significance tests cannot be said to be entirely correct evidence concerning the truth of a hypothesis. This is especially so in the case of small samples, where the probability of drawing erroneous inferences is generally higher. For greater reliability, the sample size should be sufficiently large.
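
To make point 3 concrete, here is a brief Python sketch using SciPy on invented samples; the p-value it prints is a probability statement, not a certainty:

```python
from scipy import stats

# Invented small samples -- illustrative only
sample1 = [23.1, 25.4, 22.8, 26.0, 24.3]
sample2 = [20.9, 22.5, 21.7, 23.2, 21.1]

t_stat, p_value = stats.ttest_ind(sample1, sample2)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")

# Conventional reading: if p < 0.05 the difference is "statistically
# significant", i.e. probably not due to sampling fluctuations alone --
# but the test says nothing about *why* the means differ (point 2),
# and with samples this small the risk of an erroneous inference is
# higher (point 4).
```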

MULTIVARIATE ANALYSIS

In univariate statistics there are one or more independent variables (X1, X2, ...) and only one dependent variable (Y). Multivariate analysis is concerned with two or more dependent variables (Y1, Y2, ...) considered simultaneously for multiple independent variables (X1, X2, etc.). The manual effort required to solve multivariate problems was an obstacle to their earlier use; recent advances in computer software and hardware have made it possible to solve more problems using multivariate analysis. Software programs available for solving multivariate problems include SPSS, S-Plus, SAS, and Minitab.
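
As a minimal sketch of the idea (not a substitute for the packages just named), NumPy's least-squares routine can fit two dependent variables against the same independent variables in one call; all values below are invented for illustration:

```python
import numpy as np

# Invented data: 6 observations of two independent variables X1, X2
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 6.0],
              [6.0, 5.0]])
# ...and two dependent variables Y1, Y2 observed simultaneously
Y = np.array([[3.1,  5.0],
              [3.9,  4.2],
              [7.2, 11.1],
              [8.1, 10.0],
              [11.0, 17.2],
              [12.2, 16.1]])

# Add an intercept column and solve both regressions in one call
X1 = np.hstack([np.ones((X.shape[0], 1)), X])
coef, *_ = np.linalg.lstsq(X1, Y, rcond=None)
print(coef)  # column 0: coefficients for Y1, column 1: for Y2
```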

Statistical Analysis Tool: SPSS

SPSS is one of the most popular tools for statisticians. SPSS stands for Statistical Package for the Social Sciences. A recent version is IBM SPSS Statistics 20 (the product became IBM SPSS Statistics after IBM acquired SPSS Inc.). It provides analysis facilities such as the following:

1. Measures of central tendency & dispersion
2. Statistical inference
3. Correlation & Regression analysis
4. Analysis of variance
5. Non parametric test
6. Hypothesis tests: T-test, chi-square, z-test, ANOVA, Bipartite variable….
7. Multivariate data analysis
8. Frequency distribution
9. Data exposition using various graphs such as line, scatter, bar, ogive, histogram and pie charts
10. Data view & variable view
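
For readers without access to SPSS, several of the facilities listed above (items 1, 8 and 9, for example) have rough open-source analogues; here is a sketch in Python with pandas, on made-up data:

```python
import pandas as pd

# Made-up survey data for illustration
df = pd.DataFrame({
    "score": [67, 72, 58, 91, 75, 66, 88, 72],
    "group": ["A", "B", "A", "B", "A", "B", "A", "B"],
})

# 1. Measures of central tendency & dispersion
print(df["score"].describe())     # count, mean, std, min, quartiles, max

# 8. Frequency distribution
print(df["group"].value_counts())

# 9. Data exposition with a graph (requires matplotlib)
df["score"].plot(kind="hist", title="Score distribution")
```
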
Data Analysis Tool: SPREADSHEET PACKAGES

A spreadsheet is a computer application that simulates a paper worksheet. It displays multiple cells that together make up a grid consisting of rows and columns, each cell containing either alphanumeric text or numeric values.

ANALYSIS OF VARIANCE (ANOVA)

Analysis of variance (abbreviated as ANOVA) is an extremely useful technique for research in the fields of economics, biology, education, psychology, sociology, business/industry and several other disciplines. The technique is used when more than two sample cases are involved. As stated earlier, the significance of the difference between the means of two samples can be judged through either the z-test or the t-test, but the difficulty arises when we need to examine the significance of the difference amongst more than two sample means at the same time. The ANOVA technique enables us to perform this simultaneous test and as such is an important tool of analysis in the hands of a researcher. Using this technique, one can draw inferences about whether the samples have been drawn from populations having the same mean.

The ANOVA technique is important in all those situations where we want to compare more than two populations, such as comparing the yield of a crop from several varieties of seed, the gasoline mileage of four automobiles, the smoking habits of five groups of university students, and so on. In such circumstances one generally does not want to consider all possible pairs of populations at a time, for that would require a great number of tests before we could arrive at a decision. It would also consume a lot of time and money, and even then certain relationships might be left unidentified (particularly the interaction effects). Therefore, one quite often uses the ANOVA technique to investigate the differences among the means of all the populations simultaneously.
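
A minimal Python sketch of such a simultaneous test, using SciPy's one-way ANOVA on invented yields from three seed varieties:

```python
from scipy import stats

# Invented crop yields from three seed varieties (illustrative only)
variety1 = [20.1, 21.3, 19.8, 22.0, 20.5]
variety2 = [23.4, 24.1, 22.8, 23.9, 24.6]
variety3 = [20.9, 21.7, 21.2, 20.4, 21.9]

# One call tests all three population means simultaneously,
# instead of several separate pairwise z- or t-tests.
f_stat, p_value = stats.f_oneway(variety1, variety2, variety3)
print(f"F = {f_stat:.3f}, p = {p_value:.4f}")
# A small p suggests at least one variety's mean yield differs.
```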

Information and Library Network (INFLIBNET) Program

The Information and Library Network (INFLIBNET) program was started by the University Grants Commission (UGC) in April 1991. It is a cooperative venture for the pooling, sharing and optimization of library resources in the country. It aims to provide a channel for academicians and researchers to exchange information from sources within the country and abroad. It is a major program for the modernization of libraries and information services in the country, using computer and communication technologies. INFLIBNET includes participants from colleges, universities, R&D institutes, institutes of higher learning, information centers, institutes of national importance, and document resource centers (DRCs). All disciplines, such as science, technology, medicine, agriculture, fine arts, humanities and the social sciences, are covered under this program. The INFLIBNET program has been set up with the following objectives:

• To promote and establish communication facilities to improve capability in information transfer and access, providing support to scholarship, learning, research and academic pursuit through cooperation;
• To collaborate with institutions, libraries, information centers and other organizations in India and abroad in fields relevant to the objectives of the Centre;
• To promote R&D and develop necessary facilities and create technical positions for realizing the objectives of the Centre.

MATLAB

MATLAB is a programming language developed by MathWorks. It started out as a matrix programming language in which linear algebra programming was simple. It can be run both in interactive sessions and as a batch job. This section gives a gentle introduction to the MATLAB programming language and is designed to give students fluency in MATLAB. Problem-based MATLAB examples are presented in a simple and easy way to make learning fast and effective. MATLAB (matrix laboratory) is a fourth-generation high-level programming language and interactive environment for numerical computation, visualization and programming.

MATLAB's Power of Computational Mathematics

MATLAB is used in every facet of computational mathematics. The following are some of the areas where it is most commonly used:
• Dealing with Matrices and Arrays
• 2-D and 3-D Plotting and graphics
• Linear Algebra
• Algebraic Equations
• Non-linear Functions
• Statistics
• Data Analysis
• Calculus and Differential Equations
• Numerical Calculations
• Integration
• Transforms
• Curve Fitting
• Various other special functions
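
Since the code sketches in this text use Python, here is a rough NumPy analogue of the kind of matrix and linear-algebra work MATLAB was built around (the values are illustrative):

```python
import numpy as np

# Matrices and arrays (MATLAB's core data type)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

# Linear algebra: solve the system A @ x = b
x = np.linalg.solve(A, b)
print(x)  # [0.8 1.4]

# A non-linear function evaluated over a whole array,
# in the vectorized style MATLAB popularized
t = np.linspace(0, 2 * np.pi, 5)
print(np.sin(t) ** 2 + np.cos(t))
```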

An introduction to using Microsoft Office (Excel) for quantitative data analysis

Why use Excel?

With so many specialist software packages available, why use Excel for statistical analysis?
Convenience and cost are two important reasons: many of us have access to Excel on our
own computers and do not need to source and invest in other software. Another benefit,
particularly for those new to data analysis, is that it removes the need to learn a new software program on top of getting to grips with the analysis techniques. Excel also integrates easily with other Microsoft Office products, which can be helpful when preparing reports or presentations.

What you can do with Excel

As a spreadsheet, Excel can be used for data entry, manipulation and presentation but it also
offers a suite of statistical analysis functions and other tools that can be used to run
descriptive statistics and to perform several different and useful inferential statistical tests that
are widely used in business and management research. In addition, it provides all of the
standard spreadsheet functionality, which makes it useful for other analysis and data
manipulation tasks, including generating graphical and other presentation formats. Finally,
even if using bespoke statistical software, Excel can be helpful when preparing data for
analysis in those packages.

Making the choice

Many basic analysis projects involving primarily data exploration, descriptive statistics and
simple inferential statistics can be successfully completed using standard Excel. More advanced projects, especially those involving multivariate analysis, are more challenging in Excel, and in such cases it is worth considering specialist analysis software such as IBM SPSS.

Quantitative data analysis tools in Excel

Excel includes a large number of tools that can be used for general data analysis. Here our primary concern is with those that are relevant to statistical and related analysis techniques.

Statistical functions
Excel offers a broad range of built-in statistical functions. These are used to carry out specific data manipulation tasks, including statistical tests. An example is the AVERAGE function, which calculates the arithmetic mean of the cells in a specified range; for instance, =AVERAGE(A1:A10) returns the mean of cells A1 to A10.
Analysis ToolPak
The Analysis ToolPak is an Excel add-in. It contains more extensive functions, including some useful inferential statistical tests. An example is the Descriptive Statistics routine, which generates a whole range of useful statistics in one go.
Charts
Excel's built-in charts (graphs) cover most of the common chart types and are invaluable in data exploration and presentation.
Pivot tables
Pivot tables provide a way of generating summaries of your data and organising data in ways
that are more useful for particular tasks. They are extremely useful for creating contingency
tables, cross-tabulations and tables of means or other summary statistics. A brief introduction
to creating pivot tables is given in the guide Data exploration in Excel: univariate analysis.
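
For comparison, the same kind of summary can be sketched outside Excel with pandas in Python (an analogue only, on invented data):

```python
import pandas as pd

# Invented survey responses
df = pd.DataFrame({
    "department": ["Sales", "Sales", "HR", "HR", "IT", "IT"],
    "gender":     ["F", "M", "F", "M", "F", "M"],
    "salary":     [42_000, 45_000, 39_000, 40_000, 51_000, 53_000],
})

# Contingency table (cross-tabulation) of counts
print(pd.crosstab(df["department"], df["gender"]))

# Pivot table of mean salary by department and gender
print(df.pivot_table(values="salary", index="department",
                     columns="gender", aggfunc="mean"))
```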

MICROSOFT POWERPOINT

Daily-life uses of PowerPoint: Microsoft PowerPoint is application software used to present data and information using text, images and diagrams, with animations and transition effects, in slides that help to explain a topic or idea to an audience easily and practically. It can be a powerful tool for creating clear, well-structured presentations with strong visual impact. However, over-use or misuse can detract from your presentation. Following the guidelines in this study guide will ensure that you use PowerPoint effectively to support your presentation and engage your audience.

Writing a thesis is stressful, but preparing an oral defense can be even more painful. It doesn't have to be: with proper preparation and a good presentation, you will be better equipped when the time comes to present your thesis defense. A proper presentation helps your thesis defense because it captures the panel's attention and gives you cues and reminders on what to say. It also keeps your data organized while looking visually good, and it provides a flow structure for the rest of your presentation. The right PowerPoint templates for your thesis defense, together with a powerful outline composed of best practices and layouts, are specifically designed to help you defend your thesis in both written and oral presentation.
During the presentation, PowerPoint offers a variety of advantages to the presenter and the listeners. To progress through a slide show, the presenter need only click a button, which allows them to maintain eye contact with the audience and use their hands for emphasis. A PowerPoint presentation often has a pleasing appearance and interesting graphics, which keeps the audience interested. Moreover, it can be projected on a big screen for a large auditorium or classroom.

IMPORTANCE AND USE OF DATABASES

Data are facts, numbers, letters, and symbols that describe an object, idea, condition,
situation, or other factors. A data element is the smallest unit of information to which
reference is made. Data in a database may be characterized as predominantly word
oriented (e.g., as in a text, bibliography, directory, dictionary), numeric (e.g., properties,
statistics, experimental values), image (e.g., fixed or moving video, such as a film of
microbes under magnification or time-lapse photography of a flower opening), or sound (e.g.,
a sound recording of a tornado or a fire).

A database is a collection of related data and information—generally numeric, word oriented, sound, and/or image—organized to permit search and retrieval or processing and reorganizing. A data set is a collection of similar and related data records or data points. Many databases are a resource from which specific data points, facts, or textual information are extracted for use in building a derivative database or data product.

Research databases are organized collections of computerized information or data, such as periodical articles, books, graphics and multimedia, that can be searched to retrieve information. Databases can be general or subject oriented, with bibliographic citations, abstracts, and/or full text. The sources indexed may be written by scholars, professionals or generalists.

Advances in computing and communications technologies and the development of digital networks have revolutionized the manner in which data are stored, communicated, and manipulated. Databases, and the uses to which they can be put, have become increasingly valuable commodities.

The now-common practice of downloading material from online databases has made it easy
for researchers and other users to acquire data, which frequently have been produced with
considerable investments of time, money, and other resources.

Factual data are both an essential resource for and a valuable output from scientific research.
It is through the formation, communication, and use of facts and ideas that scientists conduct
research. Throughout the history of science, new findings and ideas have been recorded and
used as the basis for further scientific advances and for educating students. The process of
scientific inquiry typically has begun with the formulation of a working hypothesis, based
usually on limited observation and data, followed by experimentation designed to test the
hypothesis. The experimentation results in the accumulation of new data used to confirm or
refute the original hypothesis.

Your topic statement determines the type of database, kind of information, and the date of the
sources that you will use. It is important to clarify whether your topic will require research
from journals, magazines, newspapers, and books or just journals.
