

Writing the Research Report: For Experiments, Nonexperiments, and Scale/Test Development Involving Quantitative Analysis
Published by Academic Scholars Publishing House, Sydney, Australia
http://academic-publishing-house.com/
ISBN 978-981-024-855-3
Copyright 2012 by Carlo Magno
All rights reserved. No part of this book may be reproduced or performed in any form or by any means without the written permission of the copyright owner.

Writing the Research Report


For Experiments, Nonexperiments, and Scale/Test Development Involving Quantitative Analysis

Carlo Magno

The Problem and Literature Review

Abstract
The abstract is composed of 150 to 200 words that sufficiently summarize the study, including the purpose, theoretical background, method, results, and further implications/conclusions.
Start the abstract with the purpose of the investigation, followed by its theoretical or conceptual support.
Succinctly describe the method of the study so that the reader gets an idea of what was done.
o The number of participants, the questionnaire, and the analysis can be mentioned for nonexperiments.
o The number of participants, the conditions of the independent variable, the specific design, and the instruments can be mentioned for experiments.
o The number of participants and the number of pretesting stages can be mentioned if a scale or a test is developed.
Describe the main points of the results, giving way to the implications made in the study.

1. Introduction
Set the background of the study by explaining relevant information that leads directly to the proposed research questions.
Describe the status of past research in the area under investigation that eventually leads to the present research questions.
The variables under study can be defined, together with a description of how the variables relate to each other.
o If variables are to be correlated, explain the relevant information and basis that allow the variables to be correlated.
o If groups will be compared on certain variables, enough basis for the comparison should be immediately presented.
o If one variable is proposed to affect another, describe the basis for the direction of the effects.
o If a measure for a variable will be constructed, establish the need to construct the scale or test.
Justify why there is a need to conduct the present study.

o Present gaps from past research.
o Mention contradictory findings.
o Explain the rationale why the variables need further investigation.

Tips in structuring the Introduction:
Structure the introduction from broad to specific ideas.
Open with a general statement about the variables under study referring to people's behavior. This opening statement should be a practical event or process that describes the main idea of the study.
The opening statement is then made more concrete and stated in research terms in psychology as the readers are led toward the research questions.
The definitions of the main variables of the study can be presented, along with how they are related (correlational, comparative, or causal).
After establishing the conceptual relationship (correlational, causal, or comparative) of the variables under study, present the justifications for conducting the study. The author may focus on the gaps, contradictions, and rationale for the variables that will be investigated.
End the introduction by stating the specific purpose of the present study or the specific hypothesis.

2. Review of Related Literature

Content
Provide a discussion of relevant studies within the frame of the study's purpose and evaluate these studies.
Authors are encouraged to use research references that are published in refereed publications.
The reviews presented should coherently describe the current context of studies related to the research question.
Avoid plagiarism: rewrite or paraphrase statements taken from references and cite the original author.
The review should clearly state the theoretical premises of the study.
A literature review may compare studies in terms of assumptions about the research question, method, analysis, and conclusions drawn.

Organization
Do not enumerate studies; organize them according to the main point you want to make about each study.
Assert the ideas or themes that will directly provide a basis for the research questions, and support these with relevant past studies and explanations.
If there is a large body of work in the area, or several variables under study, it is best to organize the reviews under subheadings.
If variables are correlated, or variables are used to predict another, provide past reviews that support the intended relationships.
If groups are compared, provide reviews that show the difference between the groups compared.
If one variable is proposed to affect another, provide reviews that demonstrate the causal direction.
If several reviews are presented, synthesize the reviews by:
o Putting together the same ideas/results from different studies.

o Comparing and contrasting studies showing different results.
o Using the ideas presented in the synthesized reviews to draw implications for the purpose of the study.

Elements found in the Literature Review
Definition of important terms.
Justification for the selection of reviews.
Justification of omissions or information that will not be covered.
A forecast of the different sections that will be presented, or an advance organizer.
Signal the structure.
Link the present work to the literature.
Critique the literature.
Define the gap.

3. The Framework
The framework should point directly to the proposed relationship (correlational, comparative, or causal) among the variables under study.
Establishing the relationship of the variables under study requires a more specific framework, such as a theory, model, or set of principles established from references.
State the theory, model, or principle and explain its general premises.
After the theory has been clarified, explain how the theory will be tested using the specific connections of the variables of the present study.
Explain how the theory is used for the specific relationship of the variables in the study.
If there are several variables in the study with complex connections, a diagram helps to illustrate the direction of the variables. Show in the figure only the variables tested in the study.
The explanation of the specific connections of the variables in the study should lead directly to the hypothesis.

Method

Parts of the Methodology for Experiments and Nonexperiments

Research Design
Describe the type of research design used in the study and explain how it is applied.
For nonexperiments, use the classification of research designs provided by Johnston (2001), shown below.
Johnston (2001) classifies nonexperimental designs by research objective (descriptive, predictive, explanatory) and by time dimension (cross-sectional, longitudinal, retrospective):
o Cross-sectional: Descriptive, Cross-sectional (Type 1); Predictive, Cross-sectional (Type 4); Explanatory, Cross-sectional (Type 7)
o Longitudinal: Descriptive, Longitudinal (Type 2); Predictive, Longitudinal (Type 5); Explanatory, Longitudinal (Type 8)
o Retrospective: Descriptive, Retrospective (Type 3); Predictive, Retrospective (Type 6); Explanatory, Retrospective (Type 9)
Source: Johnston, B. (2001). Toward a new classification of nonexperimental quantitative research. Educational Researcher, 30(2), 3-13.

For experiments, indicate that the method used is an experiment and give the specific design of the experiment (refer to Christensen, 1995):
o Faulty experimental designs
  One-group after-only design
  One-group before-after (pretest and posttest) design
  Nonequivalent posttest-only design
o True experimental designs (between-subjects, within-subjects, and combined)
  Between-subjects after-only design
  Matched between-subjects after-only design
  Simple randomized-subjects design
  Factorial design
  Within-subjects after-only design
  Combined between- and within-subjects design
  Between-subjects pretest and posttest design
  Solomon four-group design
  Switching-replications design
  Randomized block design
o Quasi-experimental designs
  Nonequivalent control group design
    Increasing treatment effect I outcome
    Increasing treatment and control groups outcome
    Increasing treatment effect II outcome
    Crossover effect
o Time series designs
  Interrupted time series design
  Multiple time series design
o Single-subject designs
  A-B-A design
  Interaction design
  Multiple-baseline design
  Changing-criterion design

If judges or observers are used, describe how many were used and how they were trained.

Participants
Indicate who and how many participated in the study.
Explain how the participants were selected (sampling technique).
Report the procedures for selecting and assigning subjects.
Give major demographic characteristics such as general geographic location, institutional affiliation, gender, and age.
When animals are the participants in an experiment, report the genus, species, strain number, sex, age, weight, and physiological condition.

Instruments
For nonexperiments, describe the source of the instrument.
Indicate what the instrument measures and indicate its factors or subscales with their specific item numbers.
Report the numerical scaling used and the total number of items.
Report the validity and reliability of the instrument.
For experiments, describe the apparatus or materials used and their function in the experiment. Standard laboratory equipment includes items such as furniture and stopwatches.

Procedures
Summarize each step in the execution of the research, including: instructions to participants, formation of groups, experimental manipulation, standard testing procedures, specific mechanics to implement the research design, techniques for controlling extraneous variables, and any ethical issue that might be raised and how it was addressed.
Discuss subject dropouts and other difficulties encountered in executing the study.
Mention how the participants were debriefed.

Data Analysis
Indicate how the data were analyzed. Align the statistics used with each research question.
Describe what kind of data are derived, such as their levels of measurement.
Mention what standardization technique is used in treating raw scores.
Explain how the data were cleaned and encoded.
Indicate the descriptive statistics used.
Indicate the statistical tests used.
Indicate any post hoc or follow-up statistical analyses used.
Justify the use of each statistical test.
Indicate the measure of effect size used and whether a power analysis was conducted.
For testing structural models, indicate whether structural equation modeling, path analysis, or confirmatory factor analysis is used. Explain the techniques of model specification or modification used and cite authors if necessary. Explain the direction of the variables being tested in the model. Report the goodness-of-fit measures used and how they will be interpreted.
A minimal sketch of how statistics can be aligned with a research question appears below.
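As an illustration of aligning the statistics with a research question and reporting an effect size, here is a minimal, hypothetical sketch in Python. The data file, the column names (group, score), and the choice of an independent-samples t test with Cohen's d are assumptions made for this example; they are not prescribed by the text.

    # Hypothetical sketch: one research question ("Do the treatment and control
    # groups differ on the outcome?") aligned with descriptive statistics, a
    # significance test, and an effect size. File and column names are assumed.
    import numpy as np
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("scores.csv")                      # assumed data file
    treat = df.loc[df["group"] == "treatment", "score"]
    ctrl = df.loc[df["group"] == "control", "score"]

    # Descriptive statistics per group (n, mean, SD)
    print(df.groupby("group")["score"].agg(["count", "mean", "std"]))

    # Inferential test aligned with the research question
    t, p = stats.ttest_ind(treat, ctrl, equal_var=False)

    # Effect size: Cohen's d with a pooled standard deviation
    pooled_sd = np.sqrt((treat.var(ddof=1) + ctrl.var(ddof=1)) / 2)
    d = (treat.mean() - ctrl.mean()) / pooled_sd
    print(f"t = {t:.2f}, p = {p:.3f}, d = {d:.2f}")

    # An a priori power analysis could be added, for example with
    # statsmodels.stats.power.TTestIndPower().solve_power(...)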

Parts of the Methodology for Scale or Test Development Studies

Search for Content Domain
Indicate the source on which the instrument was based. The basis could be a theory, a conceptual model, the factors of an available scale, or a previous qualitative study.
Provide the main purpose and use of the scale or test developed.

Item Writing
Describe how the items were written.
Mention the sources of information on which the items were based.
Indicate how many items were written for each hypothesized subscale or area.
Show and describe the table of specifications.
Label positive and negative items for a scale.

Item Review
Describe the procedure by which the items were reviewed. In the case of achievement tests, describe the content validation procedure.
Indicate how many reviewers and stages of review the items underwent.
Indicate what tools were used in the review: checklists, focus groups, consultation, etc.

Describe what changes and modifications were made to the scale/test based on the comments from the reviewers.
Describe the final count of items in case there are changes, and present a new table of specifications.

Scaling Technique/Response Format
Describe the response format used in answering the scale or test.
For scales, identify the type of scaling used: multiple response, single response, Likert scale, verbal frequency scale, ordinal scale, forced ranking scale, paired comparison scale, comparative scale, linear numeric, semantic differential, semantic distance, or adjective checklist (see Alreck & Settle, 1995).
Describe the numerical scoring for the scale and the possible interpretation of high and low scores (a minimal scoring sketch follows below).
For tests, indicate the test format, whether the response format is supply, binary type, multiple choice, matching type, etc.
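To illustrate numerical scoring and the interpretation of high and low scores, here is a minimal, hypothetical Python sketch for a 5-point Likert scale containing one negatively worded item. The item names, the keying, and the responses are assumptions for illustration only.

    # Hypothetical sketch: reverse-score negatively keyed Likert items so that
    # a high total always reflects more of the construct being measured.
    import pandas as pd

    responses = pd.DataFrame({
        "item1": [5, 4, 3],   # positively keyed
        "item2": [1, 2, 2],   # negatively keyed
        "item3": [4, 5, 4],   # positively keyed
    })

    negative_items = ["item2"]
    max_point = 5
    responses[negative_items] = (max_point + 1) - responses[negative_items]

    # Total score per respondent; higher totals indicate higher standing
    # on the construct
    responses["total"] = responses.sum(axis=1)
    print(responses)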

Preliminary Pretesting
Describe the preliminary test form: number of items, response format, and subscales.
Describe the sample and the sampling technique.
Describe the details of administering the scale or test:
o Instructions provided
o Additional materials used
o How the participants answered
o Duration/time of testing
o Room condition
o Restrictions during testing
o Other observations

Explain how participants were debriefed.

Final Pretesting
Describe the modifications and changes made in the instrument after the analysis of the data from the preliminary pretesting.
Indicate the improvements made in the administration of the forms based on weaknesses identified during the preliminary pretesting.

Data Analysis
Report the descriptive statistics used: descriptive statistics for the preliminary and final pretesting may include means, standard deviations, standard errors, confidence intervals, kurtosis, and skewness.
Report the statistics used to establish test validity and reliability: indicate what type of validity and reliability is intended to be established, coupled with the statistics used. The most common are correlations and factor analysis.
Report item difficulties and discrimination and what approach will be used. The approach can be Classical Test Theory (CTT) or Item Response Theory (IRT). Explain the appropriateness of the approach used; if complex models are used, citations may be essential. (A minimal CTT sketch follows below.)
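As an illustration of the CTT option mentioned above, here is a minimal, hypothetical Python sketch that computes item difficulty, corrected item-total discrimination, and Cronbach's alpha from a small 0/1-scored item matrix. The data and item names are assumptions for illustration only.

    # Hypothetical sketch of classical test theory (CTT) item statistics and
    # internal consistency for dichotomously scored items (1 = correct).
    import pandas as pd

    items = pd.DataFrame({          # rows = examinees, columns = items
        "i1": [1, 1, 0, 1, 0],
        "i2": [1, 0, 0, 1, 1],
        "i3": [0, 1, 1, 1, 0],
    })
    total = items.sum(axis=1)

    # Item difficulty (p value): proportion answering each item correctly
    difficulty = items.mean()

    # Item discrimination: correlation of each item with the total of the
    # remaining items (corrected item-total correlation)
    discrimination = {c: items[c].corr(total - items[c]) for c in items.columns}

    # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total)
    k = items.shape[1]
    alpha = (k / (k - 1)) * (1 - items.var(ddof=1).sum() / total.var(ddof=1))

    print(difficulty, discrimination, alpha)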

Results
Start the results section by informing readers of the hypothesis of the study and what statistical analysis will be presented in the section.
Report the data collected and their statistical treatment through tables and figures.
The order of the presentation of the results should follow the statement of the problem.
When reporting the results, state the main result first and then report the data in sufficient detail.
Mention all results relevant to answering the research question.
Do not include individual scores or raw data, except for single-subject designs. If individual raw scores are required, they may be placed in the appendix.
Use tables and figures to organize the results:
o Report exact values and illustrate main effects (for experiments).

o Always tell the reader what to look for in the tables and figures.
o Lead the readers specifically to the point in the table they should look at.
o Provide sufficient explanation to make tables and figures readily intelligible.
Statistical presentations should include:
o Descriptive statistics
o The obtained magnitude or value of the test
o Degrees of freedom
o Probability level
o Direction of the effect
o The decision to reject or fail to reject the null hypothesis
o Effect size (a worked effect-size example follows this list)
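The list above names effect size only as an element to report. As an added illustration, one common effect-size estimate for a reported F ratio is partial eta squared; applied, for instance, to the F value quoted in the tips that follow, it works out to roughly .05:

    \eta^2_{partial} = \frac{F \cdot df_{effect}}{F \cdot df_{effect} + df_{error}}
                     = \frac{5.79 \times 1}{5.79 \times 1 + 112} \approx .05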

Tips in making interpretations of the results:
1. Begin with the central findings, and then move to more peripheral ones.
2. Remind the reader of the conceptual hypothesis or question being asked.
3. Tell the answer immediately and in plain English: "As Table 1 reveals, men do, in fact, cry more profusely than women."
4. Then speak in numbers: "Thus, the men in all four conditions produced an average of 1.4 cc more tears than the women, F(1, 112) = 5.79, p < .025."
5. Elaborate or qualify the overall conclusion if necessary: "Only in the father-watching condition did the men fail to produce more tears than the women, but a specific test of this effect failed to reach significance, t = 1.53, p < .12."
6. End each section of the results with a summary of where things stand: "Thus, except for the father-watching condition, which will be discussed below, the hypothesis that men cry more than women in response to visually depicted grief appears to receive strong support."

Discussion
Evaluate, interpret, examine the implications of, and draw inferences from the results.
Emphasize theoretical consequences.
Open the discussion with the support or non-support of your hypothesis.


Report the similarities and differences between your results and the work of others; these should clarify and confirm your conclusions.
Negative results should be accepted as such without an undue attempt to explain them away.
Identify the practical and theoretical implications of your study.

Things to be asked in the discussion:
What have I contributed here?
How has my study helped to resolve the original problem?
What conclusions and theoretical implications can I draw from my study?

References
Follow the most recent edition of the Publication Manual of the APA. Examples:
Book: Murray, R. (2002). How to write a thesis. Philadelphia: Open University Press.
Periodical/Journal: Furnham, A., & Chamorro-Premuzic, T. (2005). Individual differences in students' preferences for lecturers' personalities. Journal of Individual Differences, 26, 176-184.
A chapter in an edited book: Bandura, A. (1989). Social cognitive theory. In R. Vasta (Ed.), Annals of child development. Vol. 6. Six theories of child development (pp. 1-60). Greenwich, CT: JAI Press.

Appendices
Include the items used in the instruments, diagrams, figures, additional tables, raw data, illustrations, and other materials used in the study.

Author's bioprofile
Follow the DLSU style in formatting the resume.

References
American Psychological Association. (2001). Publication manual of the American Psychological Association (5th ed.). Washington, DC: Author.
Bem, D. J. (2002). Writing the empirical journal article. In J. M. Darley, M. P. Zanna, & H. L. Roediger III (Eds.), The complete academic: A career guide. Washington, DC: American Psychological Association.
Caputo, R. K. (2004). Advice for those wanting to publish quantitative research. Families in Society, 85, 401-404.
Galvan, J. L. (2004). Writing literature reviews: A guide for students of the social and behavioral sciences (2nd ed.). Los Angeles: Pyrczak Publishing.
Johnston, B. (2001). Toward a new classification of nonexperimental quantitative research. Educational Researcher, 30(2), 3-13.
Klingner, J. K., Scanlon, D., & Pressley, M. (2005). How to publish in scholarly journals. Educational Researcher, 38, 14-20.
Murray, R. (2002). How to write a thesis. Philadelphia: Open University Press.
