
# MBA SEMESTER III MB0050 Research Methodology- 4 Credits (Book ID: B1206) Assignment Set- 1 (60 Marks)

1. a. Differentiate between nominal, ordinal, interval and ratio scales, with an example of each.

Ans. Measurement may be classified into four levels, based on the characteristics of order, distance and origin.

1. Nominal measurement: This level of measurement consists of assigning numerals or symbols to different categories of a variable. The example of male and female applicants to an MBA program mentioned earlier is an example of nominal measurement. The numerals or symbols are just labels and have no quantitative value; the number of cases under each category is simply counted. Nominal measurement is therefore the simplest level of measurement: it has no order, distance or arithmetic origin.

2. Ordinal measurement: In this level of measurement, persons or objects are assigned numerals that indicate ranks with respect to one or more properties, in either ascending or descending order.

Example: Individuals may be ranked according to their socio-economic class, measured by a combination of income, education, occupation and wealth. The individual with the highest score might be assigned rank 1, the next highest rank 2, and so on, or vice versa.

The numbers in this level of measurement indicate only rank order, not equal distances or absolute quantities: the distance between ranks 1 and 2 is not necessarily equal to the distance between ranks 2 and 3. Ordinal scales may be constructed using rank order, rating and paired comparisons. Variables that lend themselves to ordinal measurement include preferences, ratings of organizations and economic status. Statistical techniques commonly used to analyze ordinal data are the median and rank order correlation coefficients.
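The ordinal-scale techniques named above, the median and a rank order correlation coefficient, can be sketched in Python. This is only an illustration: the ratings are hypothetical, and Spearman's coefficient is implemented directly (as the Pearson correlation of the ranks) rather than taken from a statistics library.

```python
import statistics

def ranks(values):
    """Assign ranks (1 = smallest), giving tied values their average rank."""
    ordered = sorted(values)
    return [ordered.index(v) + (ordered.count(v) + 1) / 2 for v in values]

def spearman_rho(x, y):
    """Spearman rank correlation: the Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical socio-economic ranks of six individuals from two raters:
rater_a = [1, 2, 3, 4, 5, 6]
rater_b = [2, 1, 4, 3, 5, 6]

print("median rating:", statistics.median([2, 3, 3, 4, 5]))  # a valid ordinal statistic
print("rank correlation:", spearman_rho(rater_a, rater_b))
```

Note that a mean of the ratings would not be meaningful here, since ordinal numerals carry no distance information; only order-based statistics like these are appropriate.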

3. Interval measurement: This level of measurement is more powerful than the nominal and ordinal levels, since it has one additional characteristic: equality of distance. However, it has no origin or true zero, which implies that it is not meaningful to multiply or divide the numbers on an interval scale.

Example: The Centigrade or Fahrenheit temperature scale. A temperature of 50 degrees is exactly 10 degrees hotter than 40 degrees and 10 degrees cooler than 60 degrees.

Since interval scales are more powerful than nominal or ordinal scales, they also lend themselves to more powerful statistical techniques, such as the standard deviation, product moment correlation, and t tests and F tests of significance.

4. Ratio measurement: This is the highest level of measurement and is appropriate for characteristics that have an absolute zero point. It has all three characteristics: order, distance and origin.

Examples: Height, weight, distance and area.

Since there is a natural zero, it is possible to multiply and divide the numbers on a ratio scale. Apart from all the statistical techniques usable with nominal, ordinal and interval scales, techniques like the geometric mean and coefficient of variation may also be used. The main limitation of ratio measurement is that it cannot be applied to characteristics such as leadership quality, happiness, satisfaction and other properties that have no natural zero point.

The different levels of measurement and their characteristics are summed up in the table below.
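Because ratio scales have a true zero, statistics that rely on multiplication and division, such as the geometric mean and coefficient of variation mentioned above, become meaningful. A minimal sketch with hypothetical height data (requires Python 3.8+ for `statistics.geometric_mean`):

```python
import statistics

# Hypothetical ratio-scale data: heights in centimetres.
# A true zero exists, so ratio statements like "180 cm is 1.2 times 150 cm"
# make sense, and so do the statistics below.
heights = [150.0, 160.0, 170.0, 180.0]

gmean = statistics.geometric_mean(heights)  # meaningful only on ratio scales
cv = statistics.stdev(heights) / statistics.mean(heights)  # coefficient of variation

print(f"geometric mean: {gmean:.2f} cm")
print(f"coefficient of variation: {cv:.3f}")
```

The same two statistics would be meaningless for Centigrade temperatures (an interval scale), since the placement of zero there is arbitrary.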

## Levels of measurement

| Level    | Characteristics                          |
| -------- | ---------------------------------------- |
| Nominal  | No order, distance or origin             |
| Ordinal  | Order, but no distance or origin         |
| Interval | Both order and distance, but no origin   |
| Ratio    | Order, distance and origin               |

b. What are the purposes of measurement in social science research?

Ans. Measurement has several purposes:

- The researcher constructs theories to explain social and psychological phenomena (e.g. labor unrest, employee satisfaction), which in turn are used to derive hypotheses or assumptions. These hypotheses can be verified statistically only by measuring the variables they contain.
- Measurement makes the empirical description of social and psychological phenomena easier. Example: when conducting a study of a tribal community, measuring devices help the researcher classify cultural patterns and behaviors.
- Measurement makes it possible to quantify variables and use statistical techniques to analyze the data gathered.
- Measurement enables the researcher to classify individuals or objects and to compare them in terms of specific properties or characteristics. Examples: comparing male and female students' performance in college exams, or the length of stay on the job of older and younger employees.

2. a. What are the sources from which one may be able to identify research problems?

Ans. The selection of one appropriate researchable problem out of the identified problems requires evaluating those alternatives against certain criteria, which may be grouped into:

A. Internal Source

Internal criteria consist of:

1) Researcher's interest: The problem should interest the researcher and be a challenge to him.

Without interest and curiosity, he may not develop sustained perseverance, and even a small difficulty may become an excuse for discontinuing the study. Interest in a problem depends upon the researcher's educational background, experience, outlook and sensitivity.

2) Researcher's competence: A mere interest in a problem will not do. The researcher must be competent to plan and carry out a study of the problem: he must have the ability to grasp and deal with it, and must possess adequate knowledge of the subject matter, relevant methodology and statistical procedures.

3) Researcher's own resources: When research is to be done by a researcher on his own, consideration of his own financial resources is pertinent. If the study is beyond his means, he will not be able to complete it unless he obtains external financial support. Time is an even more important resource than finance: research is a time-consuming process, and the time available should be properly utilized.

B. External Source

1) Researchability of the problem: The problem should be researchable, i.e., amenable to finding answers to the questions it involves through scientific method. To be researchable, a question must be one for which observation or other data collection in the real world can provide the answer.

2) Importance and urgency: Problems requiring investigation are unlimited, but available research effort is very limited. Therefore, in selecting problems for research, their relative importance and significance should be considered; an important and urgent problem should be given priority over an unimportant one.

3) Novelty of the problem: The problem must have novelty. There is no use wasting one's time and energy on a problem already studied thoroughly by others. This does not mean that replication is always needless: in the social sciences it is sometimes appropriate to replicate (repeat) a study in order to verify the validity of its findings in a different situation.
4) Feasibility: A problem may be new and important, but if research on it is not feasible, it cannot be selected. Hence feasibility is a very important consideration.

5) Facilities: Research requires certain facilities, such as a well-equipped library, suitable and competent guidance, and data analysis facilities. The availability of the facilities relevant to the problem must therefore be considered.

6) Usefulness and social relevance: Above all, the study of the problem should make a significant contribution to the concerned body of knowledge or to the solution of some significant practical problem. It should be socially relevant. This consideration is particularly important for higher-level academic research and sponsored research.

7) Research personnel: Research undertaken by professors and by research organizations requires the services of investigators and research officers. But in India and other developing countries, research has not yet become an attractive profession, so talented persons are not drawn to research projects.

Each identified problem must be evaluated in terms of the above internal and external criteria, and the most appropriate one may be selected by the research scholar.

b. Why is a literature survey important in research?

Ans. Frequently, an exploratory study is concerned with an area of subject matter in which explicit hypotheses have not yet been formulated. The researcher's task then is to review the available material with an eye on the possibilities of developing hypotheses from it. In some areas, hypotheses may already have been stated by previous research workers; the researcher has to take stock of these various hypotheses with a view to evaluating their usefulness for further research and considering whether they suggest any new hypotheses. Sociological journals, economic reviews, bulletins of abstracts of current social science research, directories of doctoral dissertations accepted by universities, etc., afford a rich store of valuable clues. In addition to these general sources, some governmental agencies and voluntary organizations publish listings and summaries of research in their special fields of service. Professional organizations, research groups and voluntary organizations are a constant source of information about unpublished works in their special fields.

3. a. What are the characteristics of a good research design?

Ans. Characteristics of a Good Research Design:

1. It is a series of guide posts that keep one going in the right direction.
2. It reduces wastage of time and cost.
3. It encourages coordination and effective organization.
4. It is a tentative plan which undergoes modifications as circumstances demand: as the study progresses, new aspects, new conditions and new relationships come to light, and insight into the study deepens.
5. It has to be geared to the availability of data and the cooperation of the informants.
6. It has to be kept within manageable limits.

b. What are the components of a research design?

Ans. Components of Research Design

1. Dependent and independent variables: A magnitude that varies is known as a variable. The concept may assume different quantitative values, like height, weight or income. Qualitative variables are not quantifiable in the strictest sense of objectivity; however, qualitative phenomena may also be quantified in terms of the presence or absence of the attribute in question. Phenomena that can assume any value, even fractional ones, are known as continuous variables; values that can be expressed only as integers are called non-continuous variables, also known in statistical terms as discrete variables. For example, age is a continuous variable, whereas the number of children is a non-continuous variable. When changes in one variable depend upon changes in one or more other variables, it is known as a dependent (or endogenous) variable, and the variables that cause the changes in the dependent variable are known as independent (explanatory, or exogenous) variables. For example, if demand depends upon price, then demand is a dependent variable and price the independent variable. If demand is also determined by income and the prices of substitute commodities, then demand is a dependent variable determined by several independent variables: own price, income and the price of substitutes.

2. Extraneous variables: The independent variables which are not directly related to the

purpose of the study but affect the dependent variable are known as extraneous variables. For instance, assume a researcher wants to test the hypothesis that there is a relationship between children's school performance and their self-concept, in which case the latter is the independent variable and the former the dependent variable. Intelligence may also influence school performance; however, since it is not directly related to the purpose of the study undertaken by the researcher, it is treated as an extraneous variable. The influence of an extraneous variable on the dependent variable is technically called experimental error. A research study should therefore always be framed in such a manner that the effect of extraneous variables is minimized, so that changes in the dependent variable can be attributed to the independent variable.
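The danger posed by an uncontrolled extraneous variable can be demonstrated with a small simulation of the example above. The numbers here are entirely hypothetical: intelligence influences both self-concept and school performance, so the two appear correlated even though neither causes the other.

```python
import random

def pearson(xs, ys):
    """Pearson product moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(42)
n = 1000

# Extraneous variable: intelligence (hypothetical IQ-like scores).
intelligence = [random.gauss(100, 15) for _ in range(n)]

# Both variables depend on intelligence plus independent noise;
# neither variable depends on the other.
self_concept = [0.5 * iq + random.gauss(0, 10) for iq in intelligence]
performance = [0.6 * iq + random.gauss(0, 10) for iq in intelligence]

# A clearly positive correlation appears, produced entirely by the
# uncontrolled extraneous variable (experimental error, in the text's terms).
print("observed correlation:", round(pearson(self_concept, performance), 2))
```

Holding intelligence constant (for example by matching subjects on it) would shrink this spurious association toward zero, which is exactly what the design feature called "control" aims to achieve.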

3. Control: One of the most important features of a good research design is to minimize the effect of extraneous variables. Technically, the term "control" is used when a researcher designs the study in such a manner that it minimizes the effects of extraneous independent variables. In experimental research, the term reflects the restraint imposed on experimental conditions.

4. Confounded relationship: The relationship between dependent and independent

variables is said to be confounded by an extraneous variable when the dependent variable is not free from its effects.

5. Research hypothesis: When a predicted or hypothesized relationship is tested by scientific methods, it is known as a research hypothesis. The research hypothesis is a predictive statement relating a dependent variable to an independent variable; generally, it must contain at least one of each. Relationships that are assumed but not tested, and predictive statements that are not objectively verified, are not classified as research hypotheses.

6. Experimental and control groups: When a group is exposed to the usual conditions in hypothesis-testing experimental research, it is known as a control group; when a group is exposed to certain new or special conditions, it is known as an experimental group. In the afore-mentioned example, Group A can be called a control group and Group B an experimental one. If both groups A and B are exposed to some special feature, then both may be called experimental groups. A research design may include only experimental groups, or both experimental and control groups together.

7. Treatments: Treatments are the different conditions to which the experimental and control groups are subjected. In the example considered, the two treatments are parents with regular earnings and those with no regular earnings. Likewise, if a research study examines through an experiment the comparative impacts of three different types of fertilizer on the yield of a rice crop, then the three types of fertilizer are the three treatments.

8. Experiment: An experiment refers to the process of verifying the truth of a statistical hypothesis relating to a given research problem. For instance, an experiment may be conducted to examine the yield of a certain new variety of rice crop. Experiments may be categorized into two types: absolute and comparative. If a researcher wishes to determine the impact of a chemical fertilizer on the yield of a particular variety of rice crop, that is an absolute experiment; if he wishes to compare the impact of the chemical fertilizer with that of a bio-fertilizer, the experiment is a comparative one.

9. Experimental units: Experimental units are the predetermined plots, characteristics or blocks to which the different treatments are applied. Such experimental units must be selected with great caution.

4. a. Distinguish between double sampling and multiphase sampling.

Ans. Double Sampling and Multiphase Sampling

Double sampling refers to the selection of the final sample from a pre-selected larger sample that provided information for improving the final selection. When the procedure is extended to more than two phases of selection, it is called multiphase sampling. This is also known as sequential sampling, since sub-sampling is done from a main sample in phases. Double or multiphase sampling is a compromise solution to a dilemma posed by undesirable extremes: the statistics based on the sample of n elements can be improved by using ancillary information from a wider base, but such information is too costly to obtain from the entire population of N elements. Instead, it is obtained from a larger preliminary sample which includes the final sample of n elements.

b. What is replicated or interpenetrating sampling?

Ans. Replicated or Interpenetrating Sampling

It involves selection of a certain number of sub-samples, rather than one full sample, from a

population. All the sub-samples should be drawn using the same sampling technique, and each is a self-contained, adequate sample of the population. Replicated sampling can be used with any basic sampling technique: simple or stratified, single-stage or multi-stage, single-phase or multiphase sampling. It provides a simple means of calculating the sampling error and is practical. The replicated samples can also throw light on variable non-sampling errors. A disadvantage is that it limits the amount of stratification that can be employed.

5. a. How is secondary data useful to a researcher?

Ans. Use of Secondary Data

Secondary data may be used in three ways by a researcher. First, specific information from secondary sources may be used for reference purposes. For example, general statistical information on the number of co-operative credit societies in the country, their coverage of villages, their capital structure, volume of business, etc., may be taken from published reports and quoted as background information in a study evaluating the performance of co-operative credit societies in a selected district or state. Second, secondary data may be used as benchmarks against which the findings of research may be tested: the findings of a local or regional survey may be compared with national averages, the performance indicators of a particular bank may be tested against the corresponding indicators of the banking industry as a whole, and so on. Finally, secondary data may be used as the sole source of information for a research project. Studies of securities market behaviour, financial analysis of companies, trends in credit allocation in commercial banks, sociological studies of crime, historical studies, and the like depend primarily on secondary data. Year books, statistical reports of government departments, reports of public organizations such as the Bureau of Public Enterprises, Census reports, etc., serve as major data sources for such research studies.

Advantages of Secondary Data

Secondary sources have several advantages:

1. Secondary data, if available, can be secured quickly and cheaply. Once the source documents and reports are located, collection of the data is just a matter of desk work; even the tedium of copying the data from the source can now be avoided, thanks to photocopying facilities.

2. A wider geographical area and a longer reference period may be covered without much cost. Thus,

the use of secondary data extends the researcher's reach in space and time.

3. The use of secondary data broadens the data base from which scientific generalizations can be made.

4. Secondary data can supply the environmental and cultural settings required for the study.

5. The use of secondary data enables a researcher to verify findings based on primary data. It readily meets the need for additional empirical support, and the researcher need not wait until additional primary data can be collected.

Disadvantages of Secondary Data

The use of secondary data has its own limitations:

1. The most important limitation is that the available data may not meet our specific needs. The definitions adopted by those who collected the data may be different, units of measure may not match, and time periods may also differ.

2. The available data may not be as accurate as desired. To assess their accuracy we need to know how the data were collected.

3. Secondary data may not be up to date and can become obsolete by the time they appear in print, because of the time lag in producing them. For example, population census data are published two or three years after compilation, and no new figures are available for another ten years.

4. Finally, information about the whereabouts of sources may not be available to all social scientists. Even if the location of a source is known, accessibility depends primarily on proximity; for example, most unpublished official records and compilations are located in the capital city, and they are not within easy reach of researchers based in far-off places.

b. What are the criteria used for evaluation of secondary data?

Ans. Evaluation of Secondary Data

When a researcher wants to use secondary data for his research, he should evaluate them before deciding to use them.

1. Data pertinence: The first consideration in evaluation is to examine the pertinence of the available secondary data to the research problem under study.
The following questions should be considered: What definitions and classifications are employed? Are they consistent? What measurements of variables are used? To what degree do they conform to the requirements of our research?

What is the coverage of the secondary data in terms of topic and time? Does this coverage fit the needs of our research? On the basis of these considerations, the pertinence of the secondary data to the research at hand should be determined; an imaginative and flexible researcher may be able to redefine his research problem so as to make use of otherwise unusable available data.

2. Data quality: If the researcher is satisfied with the pertinence of the available secondary data, the next step is to examine their quality. The quality of data refers to their accuracy, reliability and completeness. The accuracy and reliability of available secondary data depend on the organization which collected them and the purpose for which they were collected. What are the authority and prestige of the organization? Is it well recognized? Is it noted for reliability? Is it capable of collecting reliable data? Does it use trained and well-qualified investigators? The answers to these questions determine the degree of confidence we can have in the data and their accuracy. It is important to go to the original source of the secondary data rather than an intermediate source which has quoted from it; only then can the researcher review the cautionary and other comments made in the original source.

3. Data completeness: Completeness refers to the actual coverage of the published data. This depends on the methodology and sampling design adopted by the original organization. Is the methodology sound? Is the sample size small or large? Is the sampling method appropriate? Answers to these questions indicate the appropriateness and adequacy of the data for the problem under study. The question of possible bias should also be examined: did the purpose for which the original organization collected the data have a particular orientation? Was the study made to promote the organization's own interests? How was the study conducted?
These are important clues. The researcher must be on guard when the source does not report the methodology and sampling design, for then it is not possible to determine the adequacy of the secondary data for the researcher's study.

6. What are the differences between observation and interviewing as methods of data collection? Give two specific examples of situations where either observation or interviewing would be more appropriate.

Ans. Observation vs. Interviewing as Methods of Data Collection

Collection of data is the most crucial part of any research project, as the success or failure of the project depends upon the accuracy of the data. Use of wrong methods of data collection, or any inaccuracy in collecting data, can significantly affect the results of a study and may lead to results that are not valid. Data collection techniques lie along a continuum, with quantitative methods at one end and qualitative methods at the other, and observation and interviewing are two of the popular methods on this continuum. Though there are many similarities between these two methods, and they serve the same basic purpose, there are differences, which are highlighted below.

Observation: Observation, as the name implies, refers to situations where participants are observed from a safe distance and their activities are recorded minutely. It is a time-consuming method of data collection, as the conditions required for the research may not arise, and the researcher may have to wait until participants are in the situation of interest. A classic example is wildlife researchers waiting for animals or birds to be in their natural habitat and behave in the situations they want to focus upon. As a method of data collection, observation has limitations, but it produces accurate results, since participants are unaware of being closely inspected and behave naturally.

Interviewing: Interviewing is another important technique of data collection, and it involves asking questions to get direct answers. Interviews may be one-to-one, in the form of questionnaires, or, more recently, opinions sought over the internet. However, interviewing has limitations: participants may not give true or honest answers, depending on the privacy level of the questions.
Even when respondents try to be honest, an element of untruth in the answers can distort the results of the project. Though both observation and interviewing are useful techniques of data collection, each has its own strengths and weaknesses, and it is important to consider which of the two will produce the desired results before finalizing the design.

Observation vs. Interviewing

Data collection is an integral part of any research, and various techniques are employed for this purpose. Observation requires precise analysis by the researcher and often produces the most accurate results, although it is very time-consuming. Interviewing is easier but suffers from the fact that participants may not give honest replies.

Interview format: Interviews take many different forms, and it is a good idea to ask the organisation in advance what format the interview will take.

Competency/criteria-based interviews: These are structured to reflect the competencies or qualities that an employer is seeking for a particular job, which will usually have been detailed in the job specification or advert. The interviewer is looking for evidence of your skills and may ask such things as: "Give an example of a time you worked as part of a team to achieve a common goal."

Technical interviews: If you have applied for a job or course that requires technical knowledge, it is likely that you will be asked technical questions or have a separate technical interview. Questions may focus on your final-year project or on real or hypothetical technical problems. You should be prepared to prove yourself, but also to admit what you do not know and stress that you are keen to learn. Do not worry if you do not know the exact answer; interviewers are interested in your thought process and logic.

The screening interview: Companies use screening tools to ensure that candidates meet minimum qualification requirements. Computer programs are among the tools used to weed out unqualified candidates. (This is why you need a digital resume that is screening-friendly; see our resume centre for help.) Sometimes human professionals are the gatekeepers. Screening interviewers often have honed skills to determine whether there is anything that might disqualify you for the position. Remember, they do not need to know whether you are the best fit for the position, only whether you are not a match. For this reason, screeners tend to dig for dirt: they will home in on gaps in your employment history or pieces of information that look inconsistent. They will also want to know from the outset whether you will be too expensive for the company.

The informational interview: On the opposite end of the stress spectrum from screening interviews is the informational interview. A meeting that you initiate, the informational interview is underutilized by job-seekers who might otherwise consider themselves savvy about the merits of networking. Job-seekers secure informational meetings in order to seek the advice of someone in their current or desired field, as well as to gain further references to people who can lend insight. Employers that like to stay apprised of available talent, even when they do not have current job openings, are often open to informational interviews, especially if they like to share their knowledge, feel flattered by your interest, or esteem the mutual friend who connected you to them. During an informational interview, the job-seeker and employer exchange information and get to know one another better without reference to a specific job opening.

# MBA SEMESTER III MB0050 Research Methodology- 4 Credits (Book ID: B1206) Assignment Set- 2 (60 Marks)

1. a. Explain the general characteristics of observation.

Ans. General Characteristics of Observation Method

Observation as a method of data collection has certain characteristics:

1. It is both a physical and a mental activity: The observing eye catches many things that are present, but attention is focused on data that are pertinent to the given study.

2. Observation is selective: A researcher does not observe anything and everything, but selects the range of things to be observed on the basis of the nature, scope and objectives of his study. For example, suppose a researcher wishes to study the causes of city road accidents and has formulated a tentative hypothesis that accidents are caused by violation of traffic rules and over-speeding. When he observes the movement of vehicles on the road, many things are before his eyes: the type, make, size and colour of the vehicles, the persons sitting in them, their hair styles, etc. All things not relevant to his study are ignored, and only over-speeding and traffic violations are keenly observed by him.

3. Observation is purposive and not casual: It is made for the specific purpose of noting things relevant to the study. It captures the natural social context in which a person's behaviour occurs, and grasps the significant events and occurrences that affect the social relations of the participants.

4. Observation should be exact and based on standardized tools of research, such as an observation schedule, sociometric scale, etc., and precision instruments, if any.

b. What is the utility of observation in business research?

Ans. Utility of Observation in Business Research

Observation is suitable for a variety of research purposes. It may be used for studying:

(a) the behaviour of human beings in purchasing goods and services: life styles, customs and manners, interpersonal relations, group dynamics, crowd behaviour, leadership styles, managerial styles, and other behaviours and actions;
(b) the behaviour of other living creatures, such as birds and animals;
(c) the physical characteristics of inanimate things, such as stores, factories and residences;
(d) the flow of traffic and parking problems; and
(e) the movement of materials and products through a plant.

2. a. Briefly explain interviewing techniques in business research. Ans. Interviewing Techniques in Business Research The interview process consists of the following stages:

- Preparation
- Introduction
- Developing rapport
- Carrying the interview forward
- Recording the interview
- Closing the interview

1 Preparation Interviewing requires some pre-planning and preparation. The interviewer should keep copies of the interview schedule/guide (as the case may be) ready for use. He should have the list of names and addresses of the respondents, and he should regroup them into contiguous groups in terms of location in order to save time and cost in travelling. The interviewer should find out the general daily routine of the respondents in order to determine suitable timings for the interview. Above all, he should mentally prepare himself for the interview. He should think about how he should approach a respondent, what mode of introduction he could adopt, what situations he may have to face and how he could deal with them. The interviewer may come across such situations as respondents' avoidance, reluctance, suspicion, diffidence, inadequate responses and distortion, and he should plan strategies for dealing with them. If such pre-planning is not done, he will be caught unawares and will fail to deal appropriately with such situations when he actually faces them. It is possible to plan in advance while keeping the plan flexible and expectant of new developments.

2 Introduction The investigator is a stranger to the respondents. Therefore, he should be properly introduced to each of the respondents. What is the proper mode of introduction? There is no single appropriate universal mode: the mode varies according to the type of respondent. When making a study of an organization or institution, the head of the organization should be approached first and his cooperation secured before contacting the sample members/employees. When studying a community or a cultural group, it is essential to approach its leader first and enlist his cooperation. For a survey of urban households, the research organization's letter of introduction and the interviewer's identity card can be shown. In these days of fear of opening the door to a stranger, residents' cooperation can be secured more easily if the interviewer gets himself introduced through a person known to them, say a popular person in the area such as a social worker. For interviewing rural respondents, the interviewer should never approach them along with someone from the revenue department, for they would immediately hide themselves, presuming that they are being contacted for collection of land revenue or subscription to some government bond. Nor should he approach them through a local political leader, because persons who do not belong to that leader's party will not cooperate with the interviewer. It is preferable to approach rural respondents through the local teacher or a social worker.

of course, the areas to be investigated. 4. Know the objectives of each question so as to make sure that the answers adequately satisfy the question objectives.

crises in the life of the individual, emotional blockage may occur. Then drop the subject for the time being and pursue another line of conversation for a while, so that a less direct approach to the subject can be made later. 15. When there is a pause in the flow of information, do not hurry the interview. Take it as a matter of course, with an interested look or a sympathetic half-smile. If the silence is too prolonged, introduce a stimulus, saying, "You mentioned that… What happened then?"

5 Additional Sittings In the case of qualitative interviews of longer duration, one single sitting will not do, as it would cause interview weariness. Hence, it is desirable to have two or more sittings with the consent of the respondent.

6 Recording the Interview It is essential to record responses as they take place. If note-taking is done after the interview, a good deal of relevant information may be lost. Notes should be made in the schedule under the respective question; they should be complete and verbatim. The responses should not be summarized or paraphrased. How can complete recording be made without interrupting the free flow of conversation? Electronic transcription through devices such as a tape recorder can achieve this. It has obvious advantages over note-taking during the interview, but it also has certain disadvantages: some respondents may object to, or fear, going on record, and consequently the risk of a lower response rate will rise, especially for sensitive topics. If the interviewer knows shorthand, he can use it to advantage; otherwise, he can write rapidly by abbreviating words and using only key words and the like. However, even a fast writer may fail to record all that is said at conversational speed. At such times, it is useful to interrupt with some such comment as, "That seems to be a very important point; would you mind repeating it, so that I can get your words exactly?" The respondent is usually flattered by this attention and the rapport is not disturbed. The interviewer should also record all his probes and other comments on the schedule, in brackets, to set them off from responses. With pre-coded structured questions, the interviewer's task is easy: he has to simply ring the appropriate code or tick the appropriate box, as the case may be.

He should not make mistakes by carelessly ringing or ticking a wrong item.

7 Closing the Interview After the interview is over, take leave of the respondent, thanking him with a friendly smile. In the case of a qualitative interview of longer duration, select the occasion for departure more carefully. Assembling the papers and putting them in the folder at the time of asking the final question sets the stage for a final handshake, a thank-you and a good-bye. If the respondent desires to know the result of the survey, note down his name and address so that a summary of the results can be posted to him when ready.

8 Editing At the close of the interview, the interviewer must edit the schedule to check that he has asked all the questions and recorded all the answers, and that there is no inconsistency between answers. Abbreviations in recording must be replaced by full words. He must ensure that everything is legible. It is desirable to record a brief sketch of his impressions of the interview and observational notes on the respondent's living environment, his attitude to the survey, difficulties, if any, faced in securing his cooperation, and the interviewer's assessment of the validity of the respondent's answers.

b. What are the problems encountered in interviews? Ans. Interview Problems In personal interviewing, the researcher must deal with three major problems: inadequate response, non-response and interviewer's bias.

1 Inadequate Response Kahn and Cannell distinguish five principal symptoms of inadequate response:

- partial response, in which the respondent gives a relevant but incomplete answer;
- non-response, when the respondent remains silent or refuses to answer the question;
- irrelevant response, in which the respondent's answer is not relevant to the question asked;
- inaccurate response, when the reply is biased or distorted; and
- the verbalized response problem, which arises on account of the respondent's failure to understand a question or lack of the information necessary for answering it.

2 Interviewer's Bias The interviewer is an important cause of response bias. He may resort to cheating by cooking up data without actually interviewing. Interviewers can influence responses through inappropriate suggestions, word emphasis, tone of voice and question rephrasing. The interviewer's own attitudes and expectations about what a particular category of respondents may say or think may bias the data. The interviewer's characteristics (education, apparent social status, etc.) may also bias the answers. Another source of response bias arises from the interviewer's perception of the situation: if he regards the assignment as impossible or sees the results of the survey as possible threats to personal interests or beliefs, he is likely to introduce bias. As interviewers are human beings, such biasing factors can never be overcome completely, but their effects can be reduced by careful selection and training of interviewers, proper motivation and supervision, standardization of interview procedures (use of standard wording in survey questions, standard instructions on probing procedure and so on) and standardization of interviewer behaviour. There is need for more research on ways to minimize bias in the interview.

3 Non-response Non-response refers to failure to obtain responses from some sample respondents. There are many sources of non-response: non-availability, refusal, incapacity and inaccessibility.

4 Non-availability Some respondents may not be available at home at the time of the call. This depends upon the nature of the respondent and the time of the call. For example, employed persons may not be available during working hours, and farmers may not be available at home during the cultivation season. Selection of appropriate timing for calls could solve this problem: evenings and weekends may be favourable interviewing hours for such respondents.
If someone else in the household is available, the respondent's hours of availability can be ascertained and the next visit planned accordingly.

5 Refusal Some persons may refuse to furnish information because they are ill-disposed, approached at the wrong hour, and so on. Although a hard core of refusals remains, another try, or perhaps another approach, may find some of them cooperative. Incapacity or inability to respond may arise from illness which prevents a response during the entire survey period, or on account of a language barrier.

6 Inaccessibility

Some respondents may be inaccessible; some may not be found due to migration and other reasons. Non-responses reduce the effective sample size and its representativeness.

7 Methods and Aims of Control of Non-response Kish suggests the following methods to reduce either the percentage of non-response or its effects: 1. Improved procedures for collecting data are the most obvious remedy for non-response. Improvements advocated are (a) guarantees of anonymity, (b) motivation of the respondent to cooperate, (c) arousing the respondent's interest with clever opening remarks and questions, and (d) advance notice to the respondents. 2. Call-backs are the most effective way of reducing not-at-homes in personal interviews, as are repeated mailings to non-returns in mail surveys. 3. Substitution for the non-respondents is often suggested as a remedy. Usually this is a mistake, because the substitutes resemble the respondents rather than the non-respondents. Nevertheless, beneficial substitution methods can sometimes be designed with reference to important characteristics of the population. For example, in a farm management study the farm size is an important variable, and if the sampling is based on farm size, substitution of a respondent with a particular size of holding by another with a holding of the same size is possible. Attempts to reduce the percentage or effects of non-response aim at reducing the bias caused by differences of non-respondents from respondents. The non-response bias should not be confused with the reduction of sample size due to non-response: the latter effect can easily be overcome, either by anticipating the size of non-response in designing the sample size or by compensating for it with a supplement. These adjustments increase the size of the response and the sampling precision, but they do not reduce the non-response percentage or bias.

3. a. What are the various steps in processing of data? Ans. The various steps in processing of data may be stated as:

- Identifying the data structures
- Editing the data
- Coding and classifying the data
- Transcription of data
- Tabulation of data


1 Checking for Analysis In the data preparation step, the data are arranged in a format that allows the analyst to use modern analysis software such as SAS or SPSS. The major task here is to define the data structure. A data structure is a collection of related variables and can conveniently be represented as a graph whose nodes are labelled by variables. The data structure also defines the preliminary relationships between variables/groups that have been pre-planned by the researcher. Most data structures can be presented graphically to give clarity to the framed research hypothesis. A sample structure could be a linear structure, in which one variable leads to another and, finally, to the resultant end variable. The identification of the nodal points and the relationships among the nodes can sometimes be a more complex task than estimated. When the task is complex, involving several types of instruments collected for the same research question, drawing the data structure involves a series of steps: in several intermediate steps, the heterogeneous data structures of the individual data sets are harmonized to a common standard, and the separate data sets are then integrated into a single data set. A clear definition of such data structures helps in the further processing of the data.

2 Editing The next step in the processing of data is the editing of the data instruments. Editing is a process of checking to detect and correct errors and omissions. Data editing happens at two stages: at the time of recording of the data, and at the time of analysis of the data.

a. Data Editing at the Time of Recording of Data Document editing and testing of the data at the time of data recording is done keeping the following questions in mind:

- Do the filters agree, or are the data inconsistent?
- Have missing values been set to values which are the same for all research questions?
- Have variable descriptions been specified?
- Have labels for variable names and value labels been defined and written?

All editing and cleaning steps are documented, so that the redefinition of variables or later analytical modification requirements can easily be incorporated into the data sets.

b. Data Editing at the Time of Analysis of Data Data editing is also a requisite before the analysis of data is carried out. This ensures that the data are complete in all respects before being subjected to further analysis. Some of the usual checklist questions a researcher can use for editing data sets before analysis are:

1. Is the coding frame complete?
2. Is the documentary material sufficient for the methodological description of the study?
3. Is the storage medium readable and reliable?
4. Has the correct data set been framed?
5. Is the number of cases correct?
6. Are there differences between the questionnaire, the coding frame and the data?
7. Are there undefined, so-called wild codes?
8. Has the first counting of the data been compared with the original documents of the researcher?
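Several of these checks can be automated before analysis. A minimal pure-Python sketch, assuming the data sit in a list of per-case dictionaries and the coding frame is a dictionary of allowed codes (all names and values here are illustrative, not from the text):

```python
# Illustrative pre-analysis editing checks: case count, missing values,
# and undefined ("wild") codes. Data and coding frame are invented.
records = [
    {"id": 1, "organisation": "Pt", "performance": 5},
    {"id": 2, "organisation": "Pb", "performance": 9},   # 9 is a wild code
    {"id": 3, "organisation": None, "performance": 3},   # missing value
]

coding_frame = {
    "organisation": {"Pt", "Pb", "Go"},
    "performance": {1, 2, 3, 4, 5},
}

def edit_checks(records, coding_frame, expected_cases):
    """Return a list of problems found in the data set."""
    problems = []
    if len(records) != expected_cases:              # is the number of cases correct?
        problems.append(f"expected {expected_cases} cases, got {len(records)}")
    for rec in records:
        for var, allowed in coding_frame.items():
            value = rec.get(var)
            if value is None:                       # omission
                problems.append(f"case {rec['id']}: {var} is missing")
            elif value not in allowed:              # undefined / wild code
                problems.append(f"case {rec['id']}: wild code {value!r} for {var}")
    return problems

for problem in edit_checks(records, coding_frame, expected_cases=3):
    print(problem)
```

Each reported problem would then be resolved against the original documents, as the editing step described here requires.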

The editing step checks for the completeness, accuracy and uniformity of the data as created by the researcher.

Completeness: The first step of editing is to check whether there is an answer to every question/variable set out in the data set. If there is any omission, the researcher may sometimes be able to deduce the correct answer from other related data on the same instrument; if this is possible, the data set has to be rewritten on the basis of the new information. For example, the approximate family income can be inferred from answers to other probes, such as the occupation of family members, sources of income, and the approximate spending, saving and borrowing habits of family members. If the information is vital and has been found to be incomplete, the researcher can take the step of contacting the respondent personally again and soliciting the requisite data. If neither of these steps is possible, the data must be marked as missing.

Accuracy: Apart from checking for omissions, the accuracy of each recorded answer should be checked. A random check process can be applied to trace errors at this step. Consistency in responses can also be checked here: cross-verification of a few related responses helps in checking for consistency. The reliability of the data set depends heavily on this step of error correction. While clear inconsistencies should be rectified in the data sets, doubtful responses should be dropped from them.

Uniformity: In editing data sets, another keen lookout should be for any lack of uniformity in the interpretation of questions and instructions by the data recorders. For instance, responses towards a specific feeling could have been queried from a positive as well as a negative angle. While interpreting the answers, care should be taken to record each answer consistently as a positive-question or a negative-question response. Uniformity also requires consistency in coding throughout the questionnaire/interview schedule/data set. The final point in the editing of a data set is to maintain a log of all corrections that have been carried out at this stage. The documentation of these corrections helps the researcher to retain the original data set.

3 Coding The edited data are then subjected to codification and classification. The coding process assigns numerals or other symbols to the several responses of the data set. It is therefore a prerequisite to prepare a coding scheme for the data set; the recording of the data is done on the basis of this scheme. The responses collected in a data sheet vary: sometimes the response is a choice among multiple options, sometimes it is in terms of values, and sometimes it is alphanumeric. If some codification is done to the responses at the recording stage itself, it is useful in the data analysis. When codification is done, it is imperative to keep a log of the codes allotted to the observations. This code sheet helps in the identification of variables/observations and the basis for such codification. The first coding applied to primary data sets is to the individual observations themselves. This response-sheet coding gives the researcher the benefit that verification and editing of recordings, and further contact with respondents, can be achieved without difficulty. The codification can be made at the time of distribution of the primary data sheets itself. The codes can be alphanumeric, to keep track of where and to whom each sheet was sent. For instance, if the data are collected from the public at different localities, the sheets distributed in a specific locality may carry a unique part code which is alphabetic; to this alphabetic code, a numeric code can be attached to distinguish the person to whom the primary instrument was distributed.
This also helps the researcher to keep track of who the respondents are and from whom primary data sheets are yet to be collected. Even at a later stage, any specific queries on a specific response sheet can be clarified. The variables or observations in the primary instrument also need codification, especially when they are categorized. The categorization could be on a scale (e.g., most preferable to not preferable), or it could be very specific, such as gender classified as male and female. Certain classifications can lead to open-ended categories, such as an education classification: Illiterate, Graduate, Professional, Others (please specify). In such instances, the codification needs to be carefully done to include all possible responses under "Others, please specify". If the preparation of an exhaustive list is not feasible, it is better to create a separate variable for the "Others, please specify" category and record all such responses as given.

Numeric Coding: Coding need not necessarily be numeric; it can also be alphabetic. Coding has to be numeric when the variable is subject to further parametric analysis.

Alphabetic Coding: A mere tabulation, frequency count or graphical representation of the variable may be handled with an alphabetic coding.

Zero Coding: A code of zero has to be assigned carefully. In many instances, particularly when manual analysis is done, a code of 0 implies "no response" from the respondent. Hence, if a value of 0 is to be given to specific responses in the data sheet, it should not lead to the same interpretation as non-response. For instance, if there is a tendency to give a code of 0 to a "no" answer, then a code different from 0 should be used for it in the data sheet.

An illustration of the coding process for some demographic variables is given in the following table.

| Question number | Variable/observation | Response categories | Code |
|---|---|---|---|
| 1.1 | Organisation | Private / Public / Government | Pt / Pb / Go |
| 3.4 | Owner of vehicle | Yes / No | 2 / 1 |
| 4.2 | Vehicle performance | Excellent / Good / Adequate / Bad / Worst | 5 / 4 / 3 / 2 / 1 |
| 5.1 | Age | Up to 20 years / 21-40 years / 40-60 years | 1 / 2 / 3 |
| 5.2 | Occupation | Salaried / Professional / Technical / Business / Retired / Housewife / Others | S / P / T / B / R / H / (*) |

(*) Could be treated as a separate variable/observation and the actual response recorded; the new variable could be termed "other occupation".

The coding sheet needs to be prepared carefully if the data recording is not done by the researcher but is outsourced to a data entry firm or individual. In order to enter the data in the same perspective as the researcher would like to view it, the data coding sheet is prepared first and a copy given to the outsourcer to help in the data entry procedure. Sometimes the researcher might not be able to code the data from the primary instrument itself; he may need to classify the responses and then code them. For this purpose, classification of the data is also necessary at the data entry stage.

4 Classification When open-ended responses have been received, classification is necessary in order to code the responses. For instance, the income of the respondent could be an open-ended question; from all the responses, a suitable classification can be arrived at. A classification method should meet certain requirements or be guided by certain rules. First, the classification should be linked to the theory and the aim of the particular study. The objectives of the study will determine the dimensions chosen for coding, and the categorization should provide the information required to test the hypothesis or investigate the questions. Second, the scheme of classification should be exhaustive; that is, there must be a category for every response. For example, the classification of marital status into three categories, viz., married, single and divorced, is not exhaustive, because responses like "widower" or "separated" cannot be fitted into the scheme. Here, an open-ended question is the best mode of getting the responses; from the responses collected, the researcher can fit a meaningful and theoretically supportive classification.
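The coding scheme illustrated in the table above can also be kept in machine-readable form, which makes the code sheet itself reusable during data entry. A sketch in Python (the variable names are illustrative):

```python
# Code sheet for the demographic variables of the illustration above:
# alphabetic codes for nominal variables, numeric codes where the variable
# may be subjected to parametric analysis.
code_sheet = {
    "organisation": {"Private": "Pt", "Public": "Pb", "Government": "Go"},
    "owner_of_vehicle": {"Yes": 2, "No": 1},
    "vehicle_performance": {"Excellent": 5, "Good": 4, "Adequate": 3,
                            "Bad": 2, "Worst": 1},
    "age": {"Up to 20 years": 1, "21-40 years": 2, "40-60 years": 3},
}

def encode(variable, response):
    """Look up the code for a response; fail loudly on an uncoded response."""
    try:
        return code_sheet[variable][response]
    except KeyError:
        raise ValueError(f"uncoded response {response!r} for {variable!r}")

print(encode("vehicle_performance", "Good"))   # prints 4
```

Keeping a single code sheet like this also serves as the log of allotted codes that the text recommends.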
The inclusion of an "Others" category helps to accommodate the scattered few responses left over from the data sheets, but it has to be used carefully by the researcher: the "Others" categorization tends to defeat the very purpose of classification, which is designed to distinguish between observations in terms of the properties under study. The "Others" classification is most useful when only a minority of respondents in the data set give varying answers. For instance, in a survey of newspaper reading habits, 95 respondents out of 100 might be easily classified into 5 large reading groups, while 5 respondents give unique answers; these answers, rather than being considered separately, can be clubbed under the "Others" heading for a meaningful interpretation of respondents' reading habits. Third, the categories must be mutually exclusive, so that each case is classified only once.

This requirement is violated when some of the categories overlap or different dimensions are mixed up. The number of categories for a specific question/observation at the coding stage should be the maximum permissible, since reducing the number of categories at the analysis stage is easier than splitting an already classified group of responses. However, the number of categories is limited by the number of cases and by the anticipated statistical analyses to be used on the observations.

5 Transcription of Data When the observations collected by the researcher are not very large, the simple inferences which can be drawn from the observations can be transferred to a data sheet, which is a summary of all responses on all observations from a research instrument. The main aim of transcription is to minimize the shuffling between several responses and several observations. Suppose a research instrument contains 120 responses and observations have been collected from 200 respondents: a simple summary of one response across all 200 observations would require shuffling through 200 pages, and the process becomes quite tedious if several summary tables are to be prepared from the instrument. The transcription process helps in the presentation of all responses and observations on data sheets, which can help the researcher to arrive at preliminary conclusions as to the nature of the sample collected. Transcription is, hence, an intermediary process between data coding and data tabulation.

a. Methods of Transcription The researcher may adopt manual or computerized transcription. Long worksheets, sorting cards or sorting strips can be used to transcribe the responses manually. Computerized transcription can be done using a database package, spreadsheets, text files or other databases.
The main requisite for a transcription process is the preparation of the data sheets, in which observations are the rows of the database and the responses/variables are the columns. Each variable should be given a label, so that long questions can be covered under the label names; the label names are thus the links to specific questions in the research instrument. For instance, opinion on consumer satisfaction could be measured through a number of statements (say 10); the data sheet does not contain the details of each statement, but gives a link to the question in the research instrument through variable labels. In this instance, the variable names could be given as CS1, CS2, CS3, CS4, CS5, CS6, CS7, CS8, CS9 and CS10, the label CS indicating consumer satisfaction and the numbers 1 to 10 indicating the statements measuring it. Once the labelling process has been done for all the responses in the research instrument, the transcription of the responses is done.
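The labelled data sheet described above can be sketched, for example, as a CSV file whose header row carries the variable labels CS1 to CS10 and whose rows are observations (the sample values are invented for illustration):

```python
import csv
import io

# Header: observation code plus the ten consumer-satisfaction labels.
labels = ["obs_id"] + [f"CS{i}" for i in range(1, 11)]

# One row per respondent; the coded responses are invented.
rows = [
    [1, 5, 4, 4, 3, 5, 4, 2, 3, 4, 5],
    [2, 3, 3, 4, 4, 2, 3, 3, 4, 3, 3],
]

buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(labels)      # label row links columns back to the instrument
writer.writerows(rows)
print(buffer.getvalue())
```

The label row is what ties each column back to its question in the research instrument, exactly as the text describes.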

b. Manual Transcription When the sample size is manageable, the researcher need not use any computerized process to analyze the data and may prefer manual transcription and analysis of responses. Manual transcription is the natural choice when the number of responses in a research instrument is very small, say 10, and the number of observations collected is within 100: a transcription sheet of 100 x 50 (assuming each response has 5 options) rows and columns can easily be managed manually. If, on the other hand, the research instrument has more than 40 variables, each with 5 options, this leads to a worksheet of 100 x 200 size, which might not be easily managed manually; in that instance, the worksheet can still be attempted manually if the number of observations is less than 30. In all other instances, it is advisable to use a computerized transcription process.

c. Long Worksheets Long worksheets require quality paper, preferably chart sheets, thick enough to last several usages. These worksheets are normally ruled both horizontally and vertically, allowing responses to be written in the boxes; if one sheet is not sufficient, multiple ruled sheets may be used to accommodate all the observations. The headings of responses, which are the variable names, and their coding (options) are filled in the first two rows, and the first column contains the codes of the observations. For each variable, the responses from the research instrument are transferred to the worksheet by ticking the specific option that the observer has chosen. If a variable cannot be coded into categories, space of the requisite length for recording the actual response of the observer should be provided in the worksheet. The worksheet can then be used for preparing the summary tables or can be subjected to further analysis of the data. The original research instruments can now be kept aside as safe documents.
Copies of the data sheets can also be kept for future reference. As discussed under the editing section, the transcribed data have to be subjected to testing to ensure error-free transcription. A sample worksheet is given below for reference.
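The worksheet itself is not reproduced here, so the long-worksheet layout just described (variable names and option codes in the first rows, observation codes in the first column, ticks in the boxes) can be sketched with invented data (all names and values are hypothetical):

```python
# Hypothetical long worksheet: each column is one option of one variable,
# each row one observation; "x" marks the option ticked for that observation.
variables = {"vehicle_performance": [5, 4, 3, 2, 1], "owner_of_vehicle": [2, 1]}
observations = {
    "A01": {"vehicle_performance": 4, "owner_of_vehicle": 2},
    "A02": {"vehicle_performance": 3, "owner_of_vehicle": 1},
}

# Flatten (variable, option) pairs into worksheet columns.
columns = [(var, opt) for var, opts in variables.items() for opt in opts]

print("obs\t" + "\t".join(f"{var}:{opt}" for var, opt in columns))
for obs, answers in observations.items():
    ticks = ["x" if answers[var] == opt else "." for var, opt in columns]
    print(obs + "\t" + "\t".join(ticks))
```

From such a sheet, frequency tables can later be built simply by counting the ticks down each column.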

Transcription can be made as and when each edited instrument is ready for processing. Once all schedules/questionnaires have been transcribed, the frequency tables can be constructed straight from the worksheet. Other methods of manual transcription include the adoption of sorting strips or cards. In earlier days, data entry and processing were done through mechanical and semi-automatic devices such as key punches using punch cards; the arrival of computers has changed the data processing methodology altogether.

6 Tabulation The transcribed data can be used to summarize and arrange the data in compact form for further analysis. This process is called tabulation. Thus, tabulation is a process of summarizing raw data and displaying it on compact statistical tables for further analysis. It involves counting the number of cases falling into each of the categories identified by the researcher. Tabulation can be done manually or by computer; the choice depends upon the size and type of study, cost considerations, time pressures and the availability of software packages. Manual tabulation is suitable for small and simple studies.

b. How is data editing done at the time of recording of data? Ans. Data Editing at the Time of Recording of Data Document editing and testing of the data at the time of data recording is done keeping the following questions in mind:

- Do the filters agree, or are the data inconsistent?
- Have missing values been set to values which are the same for all research questions?
- Have variable descriptions been specified?
- Have labels for variable names and value labels been defined and written?

All editing and cleaning steps are documented, so that the redefinition of variables or later analytical modification requirements can easily be incorporated into the data sets.

4. a. What are the fundamentals of frequency distribution? Ans. Frequency Distribution Variables that are classified according to magnitude or size are often arranged in the form of a frequency table. In constructing this table, it is necessary to determine the number of class intervals to be used and the size of the class intervals. A distinction is usually made between continuous and discrete variables. A continuous variable has an unlimited number of possible values between the lowest and highest, with no gaps or breaks; examples are age, weight and temperature. A discrete variable can have only a series of specified values, with no possibility of values between these points; each value of a discrete variable is distinct and separate. Examples of discrete variables are gender (male/female), occupation (salaried, business, profession) and car size (800cc, 1000cc, 1200cc). In practice, all variables are treated as discrete units, the continuous variables being stated in some discrete unit size according to the needs of a particular situation. For example, length is described in discrete units of millimetres or tenths of an inch.

b. What are the types and general rules for graphical representation of data? Ans. The most commonly used graphic forms may be grouped into the following categories:

a) Line graphs or charts
b) Bar charts
c) Segmental presentations
d) Scatter plots
e) Bubble charts
f) Stock plots
g) Pictographs
h) Chernoff faces
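A frequency table for a continuous variable, grouped into class intervals as described above, can be built with the standard library alone (the sample ages are invented for illustration):

```python
from collections import Counter

# Invented sample: ages of 12 respondents, a continuous variable recorded
# in discrete units (whole years), as the text describes.
ages = [21, 25, 34, 37, 41, 45, 46, 52, 58, 61, 63, 67]

# Class intervals of width 10 starting at 20.
start, width = 20, 10
frequency = Counter((age - start) // width for age in ages)

for k in sorted(frequency):
    low = start + k * width
    print(f"{low}-{low + width - 1}: {frequency[k]}")
```

The choice of `width` fixes both the number and the size of the class intervals, the two decisions the text says must be made when constructing the table.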

The general rules to be followed in graphic representations are:

1. The chart should have a title placed directly above the chart.
2. The title should be clear, concise and simple, and should describe the nature of the data presented.
3. The numerical data upon which the chart is based should be presented in an accompanying table.
4. The horizontal line measures time or the independent variable, and the vertical line the measured variable.
5. Measurements proceed from left to right on the horizontal line and from bottom to top on the vertical.
6. Each curve or bar on the chart should be labelled.
7. If there is more than one curve or bar, they should be clearly differentiated from one another by distinct patterns or colours.

8. The zero point should always be represented, and the scale intervals should be equal.
9. Graphic forms should be used sparingly; too many forms detract from rather than illuminate the presentation.
10. Graphic forms should follow, and not precede, the related textual discussion.

## 5. Strictly speaking, would case studies be considered as scientific research? Why or why not?

Ans. Case studies are a tool for discussing scientific integrity. Although they are one of the most frequently used tools for encouraging discussion, cases are only one of many possible tools, and many of the principles discussed below for case studies can be generalized to other approaches to encouraging discussion about research ethics.

Cases are designed to confront readers with specific real-life problems that do not lend themselves to easy answers. Case discussion demands critical and analytical skills and, when implemented in small groups, also fosters collaboration (Pimple, 2002). By providing a focus for discussion, cases help trainees to define or refine their own standards, to appreciate alternative approaches to identifying and resolving ethical problems, and to develop skills for analyzing and dealing with hard problems on their own.

The effective use of case studies depends on many factors, including:

appropriate selection of case(s) (topic, relevance, length, complexity)
method of case presentation (verbal, printed, before or during discussion)
format for case discussion (email or Internet-based, small group, large group)
leadership of case discussion (choice of discussion leader, roles and responsibilities of the discussion leader)
outcomes for case discussion (answers to specific questions, answers to general questions, written or verbal summaries)

Research methods don't seem so intimidating when you're familiar with the terminology. This is important whether you're conducting an evaluation or merely reading articles about other studies to incorporate in your program. To help with understanding, here are some basic definitions.

Variable: a characteristic by which people or things can be described. A variable must have more than one level; in other words, it must be able to change over time for the same person/object, from person to person, or from object to object. Some variables, called attributes, cannot be manipulated by the researcher (e.g., socioeconomic status, IQ score, race, gender, etc.). Some variables can be manipulated but are not in a particular study; this occurs when subjects self-select the level of the independent variable, or the level is naturally occurring (as with ex post facto research).

Manipulation: random assignment of subjects to levels of the independent variable (treatment groups).

Independent variable: the treatment, factor, or presumed cause that will produce a change in the dependent variable. This is what the experimenter tries to manipulate. It is denoted as "X" on the horizontal axis of a graph.

Dependent variable: the presumed effect or consequence resulting from changes in the independent variable. This is the observation made and is denoted by "Y" on the vertical axis of a graph. The score of "Y" depends on the score of "X."

Population: the complete set of subjects that can be studied: people, objects, animals, plants, etc.

Sample: a subset of subjects that is studied to make the research project more manageable. There are a variety of ways samples can be taken. If a large enough random sample is taken, the results can be statistically similar to those of a census of the entire population, with reduced effort and cost.

Case study: a case study is conducted for a similar purpose as the above but is usually done with a smaller sample size for more in-depth study. A case study often involves direct observation of, or interviews with, single subjects or single small social units such as a family, club or school classroom. This is typically considered qualitative research.
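The relationship between a population and a sample in the definitions above can be sketched as follows; the population values are invented, and the point illustrated is that a large enough random sample gives results close to a census of the whole population:

```python
import random
from statistics import mean

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical population: ages of 10,000 people
population = [random.randint(18, 80) for _ in range(10_000)]

# A simple random sample makes the study more manageable
sample = random.sample(population, 500)

print(f"population mean: {mean(population):.1f}")
print(f"sample mean:     {mean(sample):.1f}")
```

With 500 randomly chosen subjects, the sample mean lands very close to the population mean at a fraction of the data-collection cost.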

Purpose: Explain or Predict
Type of Research to Use: Relational Study

In a relational study you start with a research hypothesis, that is, what you are trying to "prove." Examples of research hypotheses for a relational study:

The older the person, the more health problems he or she encounters.
4-H members attending 4-H summer camp stay enrolled in 4-H longer.
The greater the number of money management classes attended, the greater the amount of annual savings achieved.

Types of relational studies include correlational studies and ex post facto studies.

Correlational study: a correlational study compares two or more characteristics from the same group of people and explains how the characteristics vary together and how well one can be predicted from knowledge of the other. A concurrent correlational study draws a relationship between characteristics at the same point in time; for example, a student's grade point average is related to his or her class rank. A predictive correlational study predicts a later set of data from an earlier set; for example, a student's grade point average in an earlier year might predict the same student's grade point average during senior year. A predictive correlational study can also use one characteristic to predict what another characteristic will be at a later time; for example, a student's SAT score is designed to predict college freshman grade point average.

Ex post facto (after the fact) study: an ex post facto study is used when experimental research is not possible, such as when people have self-selected the levels of an independent variable or when a treatment is naturally occurring and the researcher could not "control" the degree of its use. The researcher starts by specifying a dependent variable and then tries to identify possible reasons for its occurrence as well as alternative (rival) explanations; confounding (intervening, contaminating, or extraneous) variables are "controlled" using statistics. This type of study is very common and useful when human subjects are studied in real-world situations and the investigator comes in "after the fact." For example, it might be observed that students from one town have higher grades than students from a different town attending the same high school. Would just "being from a certain town" explain the differences?
In an ex post facto study, specific reasons for the differences would be explored, such as differences in income, ethnicity, parent support, etc.

It is important to recognize that, in a relational study, "cause and effect" cannot be claimed. All that can be claimed is that there is a relationship between the variables. For that matter, variables that are completely unrelated could, in fact, vary together due to nothing more than coincidence. That is why the researcher needs to establish a plausible reason (research hypothesis) for why there might be a relationship between two variables before conducting a study. For instance, it might be found that all football teams with blue uniforms won last week. There is no likely reason why the uniform colour had any relationship to the games' outcomes, and it certainly was not the cause of victory. Similarly, you must be careful about claiming that your Extension program was the "cause" of possible results.

## 6. a. Analyse the case study and descriptive approach to research.

Ans. Descriptive research, also known as statistical research, describes data and characteristics of the population or phenomenon being studied. Descriptive research answers the questions who, what, where, when and how. Although the data description is factual, accurate and systematic, the research cannot describe what caused a situation. Thus, descriptive research cannot be used to establish a causal relationship, where one variable affects another; in other words, descriptive research can be said to have a low requirement for internal validity.

The description is used for frequencies, averages and other statistical calculations. Often the best approach, prior to writing descriptive research, is to conduct a survey investigation. Qualitative research often has the aim of description, and researchers may follow up with examinations of why the observations exist and what the implications of the findings are.

In short, descriptive research deals with everything that can be counted and studied, though there are always restrictions. Your research must have an impact on the lives of the people around you; for example, if it finds the most frequent disease that affects the children of a town, the reader of the research will know what to do to prevent that disease, and more people will live a healthy life.
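The frequencies and averages that descriptive research relies on can be sketched briefly; the disease records and ages below are invented to match the town example in the text:

```python
from statistics import mean

# Hypothetical records: diseases reported for children in a town
cases = ["flu", "asthma", "flu", "measles", "flu", "asthma", "flu"]

# Frequency: which disease is most commonly reported?
counts = {}
for disease in cases:
    counts[disease] = counts.get(disease, 0) + 1
most_frequent = max(counts, key=counts.get)

# An average: hypothetical ages of the affected children
ages = [4, 6, 7, 5, 9, 8, 6]

print(f"most frequent disease: {most_frequent}")
print(f"average age affected:  {mean(ages):.1f}")
```

Note that these numbers only describe what is; deciding why the most frequent disease occurs would require an inferential, not a descriptive, study.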
Descriptive research does not fit neatly into the definition of either quantitative or qualitative research methodologies; instead, it can utilize elements of both, often within the same study. The term descriptive research refers to the type of research question, design, and data analysis that will be applied to a given topic. Descriptive statistics tell what is, while inferential statistics try to determine cause and effect.

A case study is a research method common in social science. It is based on an in-depth investigation of a single individual, group, or event. Case studies may be descriptive or explanatory; the latter type is used to explore causation in order to find underlying principles. They may be prospective, in which criteria are established and cases fitting the criteria are included as they become available, or retrospective, in which criteria are established for selecting cases from historical records for inclusion in the study.

Rather than using samples and following a rigid protocol (strict set of rules) to examine a limited number of variables, case study methods involve an in-depth, longitudinal (over a long period of time) examination of a single instance or event: a case. They provide a systematic way of looking at events, collecting data, analyzing information, and reporting the results. As a result, the researcher may gain a sharpened understanding of why the instance happened as it did, and what might become important to look at more extensively in future research. Case studies lend themselves to both generating and testing hypotheses.

Another view is that the case study should be defined as a research strategy: an empirical inquiry that investigates a phenomenon within its real-life context. Case study research can mean single or multiple case studies, can include quantitative evidence, relies on multiple sources of evidence and benefits from the prior development of theoretical propositions. Case studies should not be confused with qualitative research; they can be based on any mix of quantitative and qualitative evidence. Single-subject research provides the statistical framework for making inferences from quantitative case-study data.

b. Distinguish between research methods and research methodology.

Ans.

Research Methods: Research methods are the various procedures, schemes, algorithms, etc. used in research. All the methods used by a researcher during the research study are termed research methods. They are essentially planned, scientific and value-neutral. They include theoretical procedures, experimental studies, numerical schemes, statistical approaches, etc. Research methods help us collect samples and data and find a solution to a problem. In particular, scientific research methods call for explanations based on collected facts, measurements and observations and not on reasoning alone; they accept only those explanations which can be verified by experiment.

Research Methodology: Research methodology is a systematic way to solve a problem. It is the science of studying how research is to be carried out. Essentially, the procedures by which researchers go about their work of describing, explaining and predicting phenomena are called research methodology. It is also defined as the study of the methods by which knowledge is gained. Its aim is to give the work plan of research.