Police Foundation Reports August 1984
consult the hierarchy whenever there is a question. The greatest ambiguity exists in the definition of arrest (Sherman, 1980). Conflicting judicial opinions define an arrest either so broadly that they include any restraint on an individual's freedom to come and go as he or she pleases (Sweetnam v. F.W. Woolworth Co., 83 Ariz. 189 [1964]), or so narrowly that they exclude an eight-hour involuntary interrogation at police headquarters (U.S. v. Vita, 294 F.2d 524 [1962]). The growing use of citations and summonses for minor offenses adds to the confusion, since notices to appear in court at a later date usually are issued without transporting suspects to a police station for booking.

Strategies for Quality Control

Attempts to resolve the consistency problem were hampered from the very beginning by the voluntary nature of the system. The UCR published a manual which established counting rules. This resolved some, but not all, of the questions that agencies face in trying to maintain a consistent approach to compiling statistics. Since the primary purpose of the system was to count reported crimes, the manual devoted most of its attention to that issue. Very little attention was paid to the problem of defining and counting arrests. The first manual (International Association of Chiefs of Police, 1929:23) defined an arrest as "the taking of a person into custody in order that he may be held to answer for a public offense." This definition did nothing to resolve the question of whether an individual was, in fact, in custody for that purpose rather than for interrogation.

By 1966, the manual had all but given up on providing a definition of arrest. The closest it came was distinguishing between persons arrested ("All arrests are included even though the person is later released without being formally charged") and persons charged (turned over for prosecution) (Federal Bureau of Investigation, 1966:2). But the manual did provide some new counting rules (FBI, 1966:73), such as: "Count one (arrest) for each person (no matter how many offenses he is charged with) on each separate occasion (day) he is arrested or charged." These counting rules have been disseminated through a variety of training efforts. Federal funds have helped most states establish their own crime-counting agencies. These agencies have assumed responsibility for gathering police statistics for submission to the FBI. The FBI, in turn, has held biennial conferences for state-level officials, who have assumed increased responsibility for obtaining compliance with the counting rules from member police departments. A variety of strategies for obtaining compliance are theoretically possible. They include training opportunities for police clerks, regular audits, and the continuing review of statistics supplied by agencies. But since the FBI has established no uniform compliance procedure, it has been difficult to determine exactly what the states are doing to ensure compliance.
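A rough sketch of this counting rule (the records and field names below are hypothetical; the UCR manual specifies no data format) is:

```python
# UCR rule: count one arrest per person per occasion (day),
# no matter how many offenses are charged. Data are hypothetical.
from datetime import date

# Each record is one charge booked against a person.
charges = [
    {"person_id": 101, "date": date(1983, 5, 1), "offense": "burglary"},
    {"person_id": 101, "date": date(1983, 5, 1), "offense": "larceny"},   # same occasion
    {"person_id": 101, "date": date(1983, 5, 9), "offense": "robbery"},   # new occasion
    {"person_id": 202, "date": date(1983, 5, 1), "offense": "fraud"},
]

# Common error found by the survey: counting every charge as an arrest.
arrests_by_charge = len(charges)

# UCR rule: count distinct (person, occasion) pairs.
arrests_by_person_day = len({(c["person_id"], c["date"]) for c in charges})

print(arrests_by_charge)      # 4
print(arrests_by_person_day)  # 3
```

A department tallying by charge would report four arrests here; the UCR rule yields three.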
quality of arrest data and to measure the level of compliance with FBI counting rules. The research focused exclusively on adult arrests in order to avoid the complexity of counting juvenile arrests (Klein et al., 1975).

Site Visits. Barry Glick visited 18 police departments for one day each to interview record-keeping staff and observe arrest operations. All but two of the departments served communities of over 100,000 people, and two served communities of over one million. The sample was drawn from the Middle Atlantic, Rocky Mountain, and Pacific regions of the U.S. It included one state, four county, and 13 city police departments.

Mail Survey of Police Departments. Based on the issues that emerged in the site visits, a questionnaire was designed to be completed by the heads of police crime reporting sections. The survey was mailed to 213 city, county, and sheriff's departments identified as serving populations of 100,000 or more. A random sample of 26 departments serving populations of 10,000 to 100,000 was also included. Of the 239 departments contacted, 196 supplied usable responses, for a response rate of 82 percent. The actual sample was dominated by departments serving populations of 100,000 or more; these departments comprised 175 (89 percent) of the 196 cases.

Mail Survey of State Agencies. The one-day site visits also produced a set of questions for state agencies gathering local police statistics. In order to improve the response rate and meet informally with state officials, Glick attended the three-day National Uniform Crime Reporting Conference of state UCR officials at the FBI Academy. One official per state was asked to complete the questionnaire. Thirty questionnaires were returned at the conference, and 11 were returned at a later date. This produced an 82 percent response rate for the 50 state agencies.

Case Studies. Four police departments were selected for two-week site visits. The purpose of the visits was to observe booking procedures, to determine the kinds of reports generated, and to audit department counts of arrests by offense type for a period of one month or more. The four departments were selected because they represented some of the many different kinds of police departments and the regions these departments serve. The four departments selected were (1) a large Pacific urban police department; (2) a medium-sized northeastern police department; (3) a small mountain police department; and (4) a large mid-Atlantic suburban police department.
Findings

The study found that state UCR agencies allocate relatively little effort to regulating arrest statistics, and that the regulators themselves often fail to recall the UCR counting rules. This weak regulatory system allows a fairly high rate of error in the definitions used for counting arrests in local police departments. Most important, perhaps, is the tabulation error rate within departments for certain offenses, which the audits found to be quite high. But on the key issue, i.e., the point in the arrest process at which the arrest is counted, there appears to be a fairly high level of consensus, even though it violates UCR counting rules.

Regulation by the States

The study identified three state-level strategies for achieving compliance: training, report review, and audit. Thirty-three (80 percent) of the responding state agencies claimed that they regularly trained police personnel when they were assigned to UCR reporting duties. Twenty-five state agencies (63 percent) even claimed that police departments notified them when new personnel were assigned to UCR reporting duties. On the other hand, 68 percent of the responding police departments reported that state agencies do not provide training for UCR personnel. Fifteen state agencies (37 percent) reported that they do not have adequate resources for training local police department personnel. Half of the agencies reported that three or fewer staff members process data from up to 1,036 police departments in each state (half reported dealing with 240 departments or more). It is surprising, therefore, that more agencies did not report that their training resources were inadequate.

Since 30 state agencies (73 percent) reported that most police departments supply arrest data in gross totals on the standard FBI monthly "Age, Sex, Race and Ethnic Origin of Persons Arrested" form, these agencies find it difficult to determine anything of significance from a superficial review of these reports. The only way state officials could discover a problem in report preparation is by detecting inconsistencies in police presentation of summary statistics. Unlike other systems for reporting social statistics (e.g., causes of death through death certificates forwarded to state agencies), the UCR system does not require that raw data be forwarded to the states. The great volume of crime could make such a process far too costly.

Despite this limited information, the respondents claimed they had identified several areas of poor compliance by police record keepers. State respondents were almost all highly confident about the accuracy of police classification of multiple-offense arrests according to the UCR hierarchy of offenses. But 25 respondents (61 percent) claimed that police departments encounter problems in reporting correct racial classifications for non-black minority groups. Twenty-five respondents (61 percent) also indicated that some police departments take credit for arrests made by other agencies, leading to double counting. Thirteen state respondents (32 percent) said that police agencies were probably not reporting arson arrests made by fire officials.

Only 26 of the state agencies (63 percent) reported the existence of procedures for reviewing the accuracy of arrest statistics. Nineteen agencies (46 percent) indicated that they try to count adult arrests by using the standard FBI definition of adults as persons 18 years of age or older. The only certain way to verify the arrest totals reported by police departments in each category is to compare them to a review of each separate arrest report. But as far as the study could determine, state agencies (with one possible exception) do not conduct this kind of audit. A few state agencies may audit reports for one specific offense only if there appears to be a large month-to-month change. But it seems that state agencies do not routinely conduct comprehensive tally checks for arrests in all offense categories.

Even if state agencies could devote more resources to obtaining police compliance with counting rules, it is not clear they could communicate the proper instructions to police departments. A majority of the questions included to test state agency knowledge of UCR procedures were answered incorrectly, despite the fact that 93 percent of the respondents were agency directors (admittedly away from their offices), and that 73 percent of the respondents had held their UCR positions for two or more years. Most of the test questions covered key issues affecting the validity of reported arrest rates. Twenty respondents (49 percent) indicated incorrectly that if a police department does not approve an arrest, and the suspect is released, the arrest should not be counted. Twenty-seven respondents (66 percent) indicated incorrectly that arrests should not be counted unless an arrest report is generated, although this is a violation of UCR rules. Only 15 respondents (37 percent) stated correctly that citations other than those for traffic offenses should be counted as arrests, and only 20 (49 percent) stated that summonses should be counted. Six respondents (15 percent) did not know whether to count field interrogations, and one
respondent even indicated that a separate arrest should be counted for each charge preferred, rather than (as UCR requires) for each person arrested. Respondents were much more accurate (63 to 78 percent correct), however, on the three questions about rules for counting arrests involving more than one jurisdiction.
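The comprehensive tally check that state agencies rarely perform, recounting each raw arrest report by offense and comparing the result against the totals a department submitted, can be sketched as follows (the data and category names are hypothetical):

```python
# Audit tally check: recount individual arrest reports by offense category
# and compare against the totals the department reported. Data are hypothetical.
from collections import Counter

raw_reports = ["robbery", "robbery", "burglary", "larceny", "larceny", "larceny"]
reported_totals = {"robbery": 2, "burglary": 2, "larceny": 3}

recount = Counter(raw_reports)

# Positive value: the department reported more arrests than the audit found.
discrepancies = {
    offense: reported_totals.get(offense, 0) - recount.get(offense, 0)
    for offense in set(reported_totals) | set(recount)
    if reported_totals.get(offense, 0) != recount.get(offense, 0)
}
print(discrepancies)  # {'burglary': 1}
```

Here the department's burglary total exceeds the audit recount by one; categories that match are omitted from the output.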
well should probably be interpreted as an ideal belief about how police officers should behave rather than an empirical report on their actual behavior. None of Glick's field visits revealed any department which counted arrests for UCR purposes without a formal booking process in a police facility (although several did count such arrests for internal purposes). It seems clear that regardless of how the courts might define an arrest, police departments define and count an arrest as a booking in a police station. Unfortunately, this nearly universal practice violates the UCR manual (FBI, 1980:78), which asks departments to count arrests of those persons "arrested and released without a formal charge being placed against them." It is quite possible that some police departments routinely arrest people for serious offenses, take them to a police facility, interrogate them, and release them without creating formal
Figure 1: Percentage of Police Departments Always Filing Arrest Reports Under Various Conditions (N=169). Any Restraint: 16%; Drive to Station: 11%; Over 4 Hours Detention at Station: 29%; "You're Under Arrest": 58%; Charged and Booked: 100%.
Table 1: Police Department Responses to the Question "When is an arrest report filed in your jurisdiction?" (N does not total 196 due to missing responses.)

a. When a police officer imposes any restraint of freedom: Always 16% (27); Sometimes 38% (66); Never 25% (43); Don't Know 6% (10); Total (146)
b. When a police officer places a citizen in a car and drives to a police facility: Always 11% (19); Sometimes 46% (80); Never 21% (36); Don't Know 5% (9); Total (144)
c. When a police officer advises a citizen he is under arrest: Always 58%; Don't Know 3% (5); Total (149)
d. When a police officer detains a citizen at a police facility for more than four hours: Always 29%; Don't Know 6% (10); Total (145)
e. When a citizen is charged and booked: Always 100% (169); Total (169)
arrest records. Seventy-six percent of the departments surveyed said they would not report an arrest to the UCR system if a suspect was released after being brought to the station but not charged. Ninety-two percent said they would not report an arrest unless an arrest report had been completed.

Supervisory Review. Policies governing supervisory review of arrests may also contribute to the variation in arrest-counting procedures between police departments. Thirty-four percent of the departments reported arrests (as the UCR requires) even if a supervisor disapproved the arrest and the suspect was released. But 61 percent of the departments indicated that those arrests would not be counted for UCR purposes.

Multiple Charges. Eleven percent of the police departments indicated that they counted each charge placed against the suspect as one arrest. In a later question giving an example of one person charged with three offenses in one incident, 10 percent of the departments indicated that they would count three arrests, even though the UCR manual clearly states that only one arrest be counted for each person on each occasion that he or she is arrested. A department counting all charges could show at least twice as many arrests as a department counting the same number of people, depending on how many multiple-charge
incidents each department processes.

Summonses and Citations. The UCR manual requires that both summonses and citations be counted in arrest statistics. But 29 percent of the departments indicated that they do not include adult citations, and 57 percent do not include adult summonses, in their UCR arrest statistics.

Jurisdiction. About half (49 percent) of the responding departments share their jurisdiction with other local public agencies. Contrary to the UCR manual, 15 percent of the departments indicated that they do not report arrests made by their officers in another jurisdiction. Ten percent improperly take credit for arrests made by other agencies in their own jurisdiction. And 44 percent indicated that they improperly report arrests made by their officers on the basis of warrants from other jurisdictions.

Other Situations. Several other violations of counting rules were reported in the survey, although it is not clear that the situations they involve occur with any frequency. Forty-three percent of the departments indicated that they do not include citizens' arrests in their UCR statistics (the UCR manual does not provide guidelines for processing these arrests). Thirteen percent said that they would count two arrests if additional charges were placed against a suspect (in custody) stemming from the same incident for which he or she had been arrested initially. Thirty-one percent said that they do not include arson arrests made by fire officials (although not all jurisdictions empower fire officials to make arrests). One department even reported that it included police field interrogations in its arrest count (although this would greatly multiply its arrest figures relative to other departments).

The confusion is compounded by the fact that most police departments routinely maintain two or three sets of arrest statistics: one for UCR reporting, one for administrative purposes, and perhaps one for public relations purposes. The administrative reports are often compiled from officer activity logs, so that departments using two-officer patrol cars count many arrests twice. As a result, state agencies using police department annual
percentage differences does seem to decline for those offense categories with large numbers of arrests. But this pattern does not hold across departments. Both the Pacific and the mid-Atlantic departments have comparable N sizes. Yet the magnitude of error is almost three times greater for most Part I offenses in the suburban department. These data tend to refute a theory that police departments deliberately inflate arrest statistics to make their performance look better. In three of the four departments, the reported arrest statistics for Part I offenses understate rather than overstate their arrest activity. Three departments understate their arrest activity for as many offense categories as they overstate. Only the northeastern department's reports are consistent with intentional inflation of arrest totals in Part I. But the department's understatement of Part II arrests and of total arrests discounts the theory.
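The under/over-count figures discussed here are percent differences between a department's reported total and the audit's recount. A sketch of that computation follows; the sign convention (positive meaning the audit found more arrests than the department reported) is an assumption, since the table does not state it explicitly, and the inputs are illustrative rather than taken from the audit:

```python
# Percent difference between an audit recount and a department's reported
# total. Sign convention is assumed: positive = audit found more arrests
# than were reported (under-reporting); negative = over-reporting.
def audit_discrepancy_pct(reported: int, audited: int) -> float:
    return round(100 * (audited - reported) / reported, 1)

print(audit_discrepancy_pct(100, 114))  # 14.0  (under-reporting)
print(audit_discrepancy_pct(100, 95))   # -5.0  (over-reporting)
```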
Figure 2: Percentage of Under- or Over-Reporting of Arrest Totals in Four Police Departments. Large Mid-Atlantic: -5%; Small Mountain: +5%; Large Pacific: -2%; Medium Northeastern: +14%.
Table 2: Audit of Arrest Statistics in Four Police Departments
For each offense category, the table gives the number of arrests reported by the department and the percent by which the audit count differs from that figure. Departments (columns): Large Pacific*, Small Mountain**, Medium Northeastern**, Large Mid-Atlantic*.
Offense categories (rows): Part I Arrests: Homicide, Rape, Robbery, Aggravated Assault, Burglary, Larceny, Auto Theft, Arson, Total Part I Arrests. Part II Arrests: Other Assault, Forgery (counterfeit), Fraud, Embezzlement, Stolen Property, Vandalism, Weapons, Prostitution/Vice, Sex Offenses, Drug Abuse, Gambling, Offense Against Family/Child, DUI, Liquor Laws, Drunk, Disorderly, Vagrancy, All Others, Total Part II Arrests, Total Part I & Part II Arrests.
Total Part I arrests reported: Large Pacific 383; Small Mountain 71; Medium Northeastern 147; Large Mid-Atlantic 340. Total Part II and combined totals reported: Medium Northeastern 271 and 418; Large Mid-Atlantic 339 and 679.
Under/over count for Total Part I & Part II Arrests: Large Pacific -2%; Small Mountain +5%; Medium Northeastern +14%; Large Mid-Atlantic -5%.
* = one-month total
** = three-month total
Policy Implications
The clearest policy implication of these findings is that UCR arrest statistics cannot be used to evaluate police performance by comparing one department's arrest data to that of other departments. Even year-to-year evaluations of arrest trends may be suspect due to the error rate. Violations of the counting rules are so easy to implement and so difficult to detect that evaluations of this nature may simply increase intentional misreporting, as opposed to the current haphazard misreporting. A second policy implication is that the probability of an arrested person acquiring an arrest record may depend upon where the person is arrested. This report does not directly demonstrate this fact. (See, however, "How Accurate Are Individual Arrest Histories?" on page 9.) But the findings do suggest that it is possible, and even likely. If this is true, then many criminal justice, occupational licensing, and employment decisions made on the basis of an individual's past record are being made unfairly. A sentencing judge, for example, may punish one convict more severely than others based on differences in the length of arrest record. If these differences are due to different recording practices in different police jurisdictions, variations in arrest recording create a problem of fairness. A third policy implication seems almost futile in an era of federal spending reductions for the generation of social statistics. Nonetheless, it seems reasonable to conclude that many violations in counting rules could be corrected by providing more resources to the regulatory
system. If state UCR agencies had more personnel, they could conduct more training programs and more audits, and ensure greater compliance with counting practices.
References

Brown, D.W. 1978. "Arrest Rates and Crime Rates: When Does a Tipping Effect Occur?" Social Forces 57: 671-681.
Federal Bureau of Investigation. 1966. Uniform Crime Reporting Handbook. Washington, D.C.: U.S. Government Printing Office.
Hindelang, M.J. 1978. "Race and Involvement in Common-Law Personal Crimes." American Sociological Review 43: 93-109.
International Association of Chiefs of Police, Committee on Uniform Crime Records. 1929. Uniform Crime Reporting: A Complete Manual for Police (2d ed.). New York: J.J. Little and Ives.
Klein, M.W., S.L. Rosenweig, and R. Bates. 1975. "The Ambiguous Juvenile Arrest." Criminology 13: 78-89.
Phillips, L., and H.L. Votey, Jr. 1972. "An Economic Analysis of the Deterrent Effect of Law Enforcement on Criminal Activity." Journal of Criminal Law, Criminology and Police Science 63: 330-342.
Riccio, L.J., and J.F. Heaphy. 1977. "Apprehension Productivity of Police in Large U.S. Cities." Journal of Criminal Justice 5: 271-278.
Sherman, L.W. 1980. "Defining Arrests: The Practical Consequences of Agency Differences." Criminal Law Bulletin 16, 4 & 5.
Tittle, C.R., and A.R. Rowe. 1974. "Certainty of Arrest and Crime Rates: A Further Test of the Deterrence Hypothesis." Social Forces 52: 455-462.
Wilson, J.Q., and B. Boland. 1978. "The Effect of the Police on Crime." Law and Society Review 12(3): 367-390.
Authors
Lawrence W. Sherman is vice president for research of the Police Foundation and professor of criminology at the University of Maryland. Barry D. Glick, project director at the Police Foundation, directed this project and currently is director of the shoplifting arrest experiment.
This study was conducted under Grant #80-IJ-CX-0039 from the National Institute of Justice, Office of Research and Evaluation Methods. Points of view or opinions expressed in this document do not necessarily represent the official position of the U.S. Department of Justice or the Police Foundation.
Arrest statistics are a critically important measure of this nation's efforts to control crime. They tell us many things: how productive police officers are, the characteristics of persons committing crimes, and our success rates at catching criminals committing different kinds of crimes. For both operational planning and basic research, arrest data are an indispensable tool. We should all be concerned if this tool is not working properly. This report suggests that there are important grounds for concern. In collaboration with police agencies serving the highest crime areas in the country, the Police Foundation has found variation where there should be consistency, and tabulation errors where there should be accuracy.

Police record keeping has made tremendous strides in the years since police chiefs voluntarily formed the Uniform Crime Reporting (UCR) system in 1929. Compared to other major social statistics systems, the UCR system may even be one of the best in the country. But all systems of national scope involving thousands of organizations can benefit from periodic self-evaluations. The police departments cooperating with this study should be praised for their willingness to open their procedures so that the entire field can learn and make progress.

James K. Stewart
Director, National Institute of Justice
Few people realize the tremendous variety of police practices in this country. Police departments are generally more different than they are alike. Much of this variation is healthy. But it can make police departments very hard to compare and evaluate. This report clearly shows that it is inappropriate to evaluate police departments on the basis of their arrest rates. That is a troubling conclusion, given the need to increase productivity in an era of scarce municipal resources. The auto industry can count the cars it produces, and the education industry can count the number of students it graduates. But with our current statistical system, it is not possible to say with any confidence that one police agency makes more arrests per officer than another.

Mayors, city managers, and city councils should understand this problem when reviewing annual or monthly police reports. The news media should understand this when writing feature stories about how well or how poorly local police departments perform compared to other cities. Most of all, researchers who test the effects of arrest rates on crime rates should understand the limitations of those statistics.

Patrick V. Murphy
President, Police Foundation
1201 Connecticut Avenue, NW, Washington, DC 20036-2636
(202) 833-1460  Fax: (202) 659-9149
E-mail: pfinfo@policefoundation.org
www.policefoundation.org