The author(s) shown below used Federal funds provided by the U.S. Department of Justice and prepared the following final report:

Document Title: Measuring Police and Community Performance Using Web-Based Surveys: Findings from the Chicago Internet Project
Author(s): Dennis Rosenbaum; Amie Schuck; Lisa Graziano; Cody Stephens
Document No.: 221076
Date Received: January 2008
Award Number: 2004-IJ-CX-0021

This report has not been published by the U.S. Department of Justice. To provide better customer service, NCJRS has made this Federally funded grant final report available electronically in addition to traditional paper copies. Opinions or points of view expressed are those of the author(s) and do not necessarily reflect the official position or policies of the U.S. Department of Justice.

MEASURING POLICE AND COMMUNITY PERFORMANCE USING WEB-BASED SURVEYS: FINDINGS FROM THE CHICAGO INTERNET PROJECT

Final Report

Prepared by:
Dennis P. Rosenbaum
Amie M. Schuck
Lisa M. Graziano
Cody D. Stephens

Center for Research in Law and Justice
Department of Criminal Justice
University of Illinois at Chicago

November 25, 2007

This project was supported under award number 2004-IJ-CX-0021 to the University of Illinois at Chicago by the National Institute of Justice, Office of Justice Programs, U.S. Department of Justice. Findings and conclusions of the research reported here are those of the authors and do not necessarily reflect the official position or policies of the U.S. Department of Justice.
ACKNOWLEDGMENTS

A project of this magnitude leaves us indebted to many people. First, we would like to thank the Chicago Police Department for agreeing to work with us as partners and for making this project a priority at police headquarters. We are especially grateful for the leadership provided by Assistant Deputy Superintendent Frank Limon, Lt. Michael Kuemmeth, and Sgt. Brian Daly. They worked diligently from the CAPS Project Office to minimize implementation problems within the police bureaucracy. Our gratitude also extends to Beth Forde, Vance Henry, Ron Huberman, Barbara McDonald, and Ted O'Keefe, each of whom provided thoughtful feedback during our feasibility study and the early stages of this project. We also wish to acknowledge Willie Cade, president of Computers for Schools, for assisting with this project and many others. Willie donated six laptop computers in order to stimulate greater community involvement. Participation rates were enhanced because survey respondents were eligible for as many as six drawings to win a laptop from Computers for Schools, a Microsoft Authorized Refurbisher. At the technical end, we are indebted to Raphael Villas, president of 2 Big Division, and his colleagues, who designed the web interface, the background infrastructure for the website, and the final graphics. We also wish to thank our own Academic Communications and Computing Center, whose employees helped us create the project website on the University's server, post the surveys, access the results, and update the monthly content links. Finally, we are deeply grateful to Lois Mock, our grant monitor at the National Institute of Justice, whose guidance and support throughout this project were invaluable.
Lois was always available when we needed her assistance with either substantive or bureaucratic questions, and she always provided encouragement when we needed it the most. Her insights about policing and evaluation issues, drawn from dozens of projects over the years, were especially helpful. With her retirement at the end of this year, she will be sorely missed by the entire criminal justice community. We wish her all the best.

EXECUTIVE SUMMARY

This is the story of the Chicago Internet Project, a joint information technology project involving the University of Illinois at Chicago, the Chicago Police Department, and community residents in Chicago's neighborhoods. The dual goals of this project are: (1) to successfully implement a large-scale, comprehensive web-based community survey and identify the challenges encountered when transferring this infrastructure to other settings; and (2) to determine whether a web-based survey system can enhance the problem solving process, increase community engagement, and strengthen police-community relations.

A. Background

Over the past two decades, American policing has been in a continual state of change and innovation. The COPS Office and the National Institute of Justice have promoted substantive reforms and evaluation research, respectively, at the local and national levels. Community policing and problem solving emerged as substantial reform models (Goldstein, 1990; Greene & Mastrofski, 1988; Rosenbaum, 1994), but the obstacles to full-scale implementation have been numerous (see Fridell & Wycoff, 2004; Skogan & Frydl, 2004; Skogan, 2003a).
Other police innovations are now competing for dominance, including broken windows, hot spots, Compstat, and pulling-levers policing, all of which are aided by advances in information technology (see Weisburd & Braga, 2006). Critics, however, argue that these aggressive policing strategies are undermining trust and confidence in the police, especially in minority communities (Tyler, 2005; Walker & Katz, 2008), and could have other adverse effects down the road (see Rosenbaum, 2007).

The Chicago Internet Project is based on the premise that, while much has been done under the community policing and problem-oriented policing models, progress in reforming police organizations and communities has been restricted by a failure to explore new measures of success and new methods of accountability that are grounded in the community. While community residents demand safer streets and less violence, they also want a police force that is fair and sensitive to their needs (Rosenbaum et al., 2005; Skogan, 2005; Tyler, 2005; Weitzer & Tuch, 2005). How can all of this be achieved, and how can it be measured?

Police accountability. Traditionally, police accountability has been an internal and legal process, focusing on the control of officers through punitive enforcement of rules, regulations, and laws (Chan, 2001). Today, police organizations are under pressure to be responsive to the public both for crime control and for police conduct. Consequently, there has been widespread interest in computer-driven measurement of police performance using traditional crime indicators. Although useful for specific purposes, these indicators of performance do not attempt to gauge in a meaningful way customer satisfaction with the quality of police service or the quality of police-community partnerships. As noted previously, only when the performance evaluation systems change can we expect police-community interactions to change (Rosenbaum, 2004).
To achieve marked improvements in police performance, accountability systems will need to expand and incorporate new standards based on the goals of solving problems, engaging the community, building effective partnerships, and providing services that are satisfactory to all segments of the community. Similarly, local residents must be held more accountable for public safety and crime prevention. Too often, community residents expect the police to solve all of their public safety problems and concerns. The community's role in the prevention of crime is well established (Rosenbaum et al., 1998; Sampson, 1998). Unfortunately, we have yet to implement a standardized set of measures to capture the social ecology and crime prevention behavior of neighborhoods. In a limited way, the Chicago Internet Project also represents an attempt to systematically measure community performance indicators.

The information imperative. In this information-driven society, the POP/COP models foster a new information imperative (Dunworth et al., 2000; Rosenbaum, 2004) and call on police executives and universities to "measure what matters" in the 21st century (see Masterson & Stevens, 2002; Mastrofski, 1999; Mirzer, 1996; Skogan, 2003b). Particularly important (and often neglected) are data about the concerns and priorities of local residents and community organizations, as well as factors in the local environment that are either preventative or criminogenic. If community engagement is a priority, then police officers need reliable information about community capacity, current levels of community crime prevention behaviors, and local resources that can be leveraged to help prevent crime and disorder.
Measuring the police-community interface is critical for achieving strong police-community relations and stimulating community-based crime prevention. Both are needed to create effective partnerships, which are postulated as the heart of community policing and problem solving (Cordner, 1997; Rosenbaum, 2002; Schuck & Rosenbaum, 2006). If police-community relations are a priority, accountability systems should begin to examine the day-to-day interactions between police and citizens. We should ask: How are the police responding to residents as victims, witnesses, suspects, complainants, callers, and concerned citizens? And how do residents respond to the police? In short, researchers, community leaders, and police administrators should begin to ask: What are the important dimensions of community-police relationships and interactions, and how do we begin to collect timely, geo-based information on these constructs?

Information technology and the police. The "information technology (IT) revolution" is finally reaching law enforcement (see Brown, 2000; Chan, 2001; Dunworth, 2000; Reuland, 1997). Several technology-driven law enforcement initiatives have received national attention in recent years, especially New York's COMPSTAT initiative (McDonald, 2000, 2005) and Chicago's CLEAR program (see Skogan et al., 2002; Skogan et al., 2005). While law enforcement agencies are making significant progress toward harnessing the power of information technology, rarely do these initiatives give attention to the information imperative of problem solving and community policing. Rather, police tend to focus on new ways of processing traditional data elements (Chan et al., 2001; Rosenbaum, 2006; Weisburd et al., 2006). Measuring the fears, concerns, and behaviors of the community has not been a priority.

Using web-based community surveys.
Many police departments in the United States now offer online information about their services, programs, and crime statistics (Haley & Taylor, 1998; Rosenbaum, Graziano, & Stephens, in preparation). To date, however, few have moved beyond simply posting information to embracing the Internet as a proactive tool for obtaining new information about neighborhood conditions, solving problems, building partnerships, evaluating programs, and assessing unit performance. The focus of the Chicago Internet Project is a web-based community survey with repeated measurement. Although a few police departments post Internet surveys, these sporadic efforts, on the whole, have not been comprehensive, methodologically sound, institutionalized, or used for strategic planning and accountability.

B. Research Questions and Research Findings

The Chicago Internet Project addressed several key questions. In this summary, each research question is followed by an answer derived from our research findings.

(1) Can we successfully design and implement a comprehensive community Internet survey?

The answer is a resounding "Yes." As a team, we developed a comprehensive system of measurement.
Building this system required the following core activities: (1) identifying samples of potential survey respondents; (2) developing multiple web surveys; (3) purchasing and installing appropriate Internet survey software; (4) gaining approval to use the University's server; (5) recruiting respondents by email and at community meetings; (6) monitoring survey returns and answering questions posed by respondents; (7) analyzing community-specific survey data; (8) posting survey results; (9) developing and posting (or otherwise disseminating) educational/training information; (10) arranging incentives to increase participation rates; and (11) managing communication with the police department to ensure fidelity of implementation. In other words, a number of resources were needed to build and sustain the infrastructure for conducting online surveys and providing feedback to the target communities.

To say that we achieved successful implementation should not go without qualification. Numerous obstacles were encountered along the way, ranging from problems with using the University server to getting the police bureaucracy to distribute and discuss the survey findings at beat meetings. These problems are discussed in great detail in this report, but they should not obscure the bottom line: web-based community surveys can be employed with success if cities are willing to invest the time and resources necessary.

(2) As a measurement device, how well does the Internet survey perform with respect to measuring neighborhood problems, community and police performance, and local program outcomes?

One of the primary objectives of the Chicago Internet Project was to develop new external measures of police performance.
In 1996, the National Institute of Justice held a series of workshops entitled "Measuring What Matters," where leading police scholars and police chiefs reflected on the problems with traditional performance measures and agreed that there is a pressing need to conceptualize and measure the dimensions of police performance that matter most to the community. Although the theoretical dialogue has continued over the past decade, little progress has been made at the empirical level.

To fill this void, the Chicago Internet Project sought to develop, field test, and validate a number of survey measures. Our conceptual scheme for evaluation of the police posits three primary types of community assessment: general assessments of police officers; experience-based assessments of police officers; and assessments of the police organization as a whole. Reaching far beyond traditional crime statistics, particular emphasis was given to addressing the following performance questions:

• Are the police exhibiting good manners during encounters with residents?
• Are the police competent in the exercise of their duties?
• Are the police fair and impartial when enforcing the law?
• Are the police acting lawfully in the exercise of their duties?
• Are the police equitable in the distribution of services?

Drawing on theories of community policing and problem-oriented policing, the following process and outcome questions were also measured:

• Are the police responsive to the community's concerns and problems?
• Are the police effective in solving neighborhood problems?
• Are the police engaging the community in crime control and prevention actions?
• Are the police creating cooperative partnerships with the community?
• Does the public perceive less crime and disorder?
• Does the public report lower rates of victimization?
• Does the public report less fear of crime?
• Does the public perceive a higher quality of life in their neighborhood?
• Does the public attribute organizational legitimacy to the police?

In each of these domains, we found that Internet surveys are capable of yielding reliable and valid data. As the above questions clearly suggest, the measures we developed (or selected) covered a wide range of theoretical constructs regarding police performance. Second, these survey items were subjected to various tests of validity and reliability. When constructing composite indices, factor and reliability analyses were used to establish that the items formed a unidimensional factor with strong internal consistency. Measures were often taken at two or more points in time, and therefore test-retest reliability coefficients were computed. Finally, for key indices, additional validity tests were conducted to establish construct and criterion validity. For example, based on prior research using telephone survey data, we hypothesized that African Americans and Latinos would report (via the Internet) more negative views of the police on several performance dimensions, which is a test of "known groups" validity. The results strongly support this hypothesis, suggesting that web-based indices of police performance can successfully capture known-group differences. More importantly, we found that geo-based web surveys are sensitive to neighborhood differences that have been masked by large-scale surveys in the past.
National or citywide surveys, for example, often contribute to the impression that African Americans, Latinos, and whites are homogeneous groups with little within-group variability in their assessments of the police. Because the data were collected in smaller geographic areas over time, the Internet surveys were able to capture sizable differences in police performance assessments within racial/ethnic groups.

The community measurement component of the web survey covered several variable domains:

• Neighborhood conditions: Fear of crime, social and physical disorder, crime problems, and overall perceptions of neighborhood conditions
• Individual resident performance: Individual, household, and collective crime prevention knowledge and behaviors
• Community performance: Informal social control and collective efficacy

For the community scales, validity tests were performed using multi-method procedures, and the results were encouraging. For example, looking at neighborhood conditions in our 51 police beats, we compared data from our web survey against telephone survey data. The correlations between these two methods, using data collected three years apart with different random samples, are remarkably strong (ranging from .552 to .790). These findings suggest that the Internet can be used to capture valid impressions of neighborhood conditions in relatively small geographic areas. The table below shows the construct validity testing; we compared the findings from the Internet and telephone surveys with official police records for the 51 police beats. Again, the correlations between data collected from three very different methods are consistently positive and almost always statistically significant. Furthermore, actual neighborhood problems predict perceptions and fear.
Neighborhoods (police beats) with higher levels of violent crime, illegal drugs, weapons, and disorder (as defined by the Chicago police) are places where web-survey respondents report significantly higher levels of fear, victimization, and disorder.

                              Official Chicago Police Department Crime Data (logged)
                       Crime    Violent  Robbery  Homicide  Drug     Weapons  Disorder
UIC Internet Survey
  Fear                 .489**   .702**   .694**   .522**    .725**   .770**   .345*
  Victimization        .186     .342*    .331*    .353*     .394**   .477**   .354*
  Disorder             .276     .541**   .463**   .485**    .698**   .680**   .284*
  Disorder             .216     .476**   .422**   .427**    .652**   .620**   .265
Northwestern Telephone Survey
  Fear                 .292*    .482**   .490**   .351**    .519**   .586**   .371**
  Disorder             .188     .400**   .405**   .308*     .514**   .563**   .233

(3) What are the effects on the target audiences of collecting and feeding back this information? For the general public, how does participating in online surveys affect their perceptions of neighborhood problems, community capacity, the local police, and participants' own crime prevention knowledge and behavior? For CAPS members, does their participation have some of the same effects, and furthermore, does it enhance the problem solving process?

To explore these questions, the Chicago Internet Project included randomized trials with two separate groups: CAPS participants and a random sample of residents from the same police beats. The random sample provided stronger external validity because it is more representative of households in the study population. The CAPS group, however, provided the opportunity to test the effects of information sharing within a face-to-face police-community partnership. A third group, police officers, was also studied.
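Beat-level convergent-validity checks like those reported in the table above can be sketched as follows. This is an illustrative computation on synthetic data (the beat counts and fear scores below are invented, not the study's data): it correlates logged official crime counts with beat-mean survey scores across 51 beats and computes the t statistic conventionally used to judge significance:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Synthetic stand-ins for 51 police beats: official crime counts and a
# beat-mean "fear of crime" scale from a web survey (hypothetical data).
n_beats = 51
crime_counts = rng.poisson(lam=40, size=n_beats) + 1     # +1 avoids log(0)
log_crime = np.log(crime_counts)                         # "(logged)" official data
fear = 0.8 * log_crime + rng.normal(0.0, 0.1, n_beats)   # correlated by construction

r = float(np.corrcoef(log_crime, fear)[0, 1])            # Pearson r across beats
t = r * np.sqrt((n_beats - 2) / (1 - r ** 2))            # t statistic, df = n - 2
print(f"r = {r:.3f}, t = {t:.2f}")
```

With 49 degrees of freedom, an absolute t value above roughly 2.0 corresponds to two-tailed significance at about the .05 level, which is how the starred entries in a correlation table of this kind are typically flagged.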
CAPS is a joint problem solving environment, so how the police respond to beat problems or evaluate their partnership with the community is also important. The procedural elements of the experiment include the collection of new information through Internet surveys, the dissemination (feedback) of this information to police and residents in selected beats, and supplemental education/training in the use of survey findings and/or crime prevention advice.

Despite months of careful planning, training, and implementation monitoring, the results of these two experiments were not encouraging. Detailed analyses indicate that police beats assigned to the experimental conditions (survey feedback, or survey feedback plus additional training and/or crime prevention advice) did not differ from the control beats on a wide range of police and community outcome measures. The question is, "Why?" For the CAPS experiment, one possibility is that the survey response rate was extremely low. This was due in large part to inconsistent implementation of essential project tasks on the part of the police. Police in the test beats were assigned primary responsibility for making residents aware of the opportunity to participate in the Internet survey and for leading discussions of the survey results at beat meetings, a strategy intended to invest police more fully in the process. However, they frequently failed to carry out the project tasks, most notably failing to discuss survey results on more than half of the occasions where they were expected to do so.
Multiple obstacles to implementation were identified, including communication breakdowns within the organization, police resistance or lack of commitment to the project, rigid patterns of communication, immutable expectations of police and residents as to their roles at beat meetings, reassignment of personnel, and a general lack of organizational support for CAPS. CAPS participants were still given survey results (primarily results collected from the random sample in their beat), but this still did not change their opinions about the police or the community. The absence of serious problem solving at most beat meetings is considered the most formidable obstacle to implementation and the most likely explanation for the lack of impact. We learned that CAPS is a culture unto itself, with strong (and relatively traditional) norms about police and community roles (see Graziano, 2007). Rather than engage in joint problem solving, the police are expected to respond to residents' complaints, similar to 911 calls, but in person. In lower crime neighborhoods, CAPS meetings can sometimes become social events where problem solving would be viewed as an inconvenience. Hence, the introduction of new survey information and pressure from the police administration (and the University) to engage in problem solving was met with some passive resistance at various levels.

On a more positive note, we were able to identify a group of randomly selected residents in each of the 51 police beats who were willing to go online and remain engaged in the panel survey for several months and multiple surveys. These participants were generally younger than the CAPS sample and less inclined to attend community meetings. Hence, through random sampling and telephone outreach, we were able to "democratize" the process of engaging the community in a dialogue about public safety issues. (Skogan et al., 2002, note that CAPS participants represent, on average, only 0.5% of the beat population.)
These randomly selected individuals represent "the silent majority" in neighborhoods, and their public safety input is rarely sought, except via occasional large-scale surveys. Their knowledge, perceptions, beliefs, attitudes, and opinions became the primary data for testing a new measurement system. Although these randomly sampled residents agreed to participate and provided valuable data, the experimental interventions did not change their perceptions or behavior. We suspect that these null effects were due to a weak "dosage" of the treatment. Most reported that they saw the survey results, but only a small number of respondents clicked on the community resource links to receive additional information about community crime prevention.

(4) What lessons are learned that are transferable to other communities?

From management and research perspectives, the efficiency of the Internet allows for hundreds of performance comparisons within and across jurisdictions. This tool can be used, for example, to assess the impact of localized interventions (e.g. the impact of installing cameras in crime hot spots on residents' awareness, fear, and risk of detection compared to control locations), to compare performance across beats or districts (e.g. police visibility and response times across different Latino beats), or to compare performance across jurisdictions (e.g. perceived police demeanor during traffic stops in African American neighborhoods in Chicago, London, and Los Angeles). The possibilities are endless, but making comparisons is the key to good measurement (Maguire, 2004).
Our experience in Chicago suggests that motivated communities will find it quite feasible to institute a system of online performance measurement. However, this project raises many questions that must be addressed in the future. First, what is the primary purpose of the system? For example, are you interested in: (1) assessing community needs, defining and analyzing problems, and identifying community resources? (2) measuring police performance? (3) measuring community performance? (4) evaluating new public safety programs? and/or (5) querying the public about police policies and procedures and other justice initiatives? In the Chicago Internet Project, we covered a plethora of domains, but would encourage communities to find their primary interest. Second, whose opinions in the "community" are you seeking? Do you want the views of random, representative samples? How about persons who monitor places and are experts on specific locations? How about persons with recent police encounters? Although community samples are feasible (as we have shown), they are not the most cost-efficient way to implement a web-based system. To begin with, we would encourage technologically savvy communities to systematically evaluate how police perform during routine police encounters. This topic has been a major source of tension between the police and the community for the past two decades. As public interest in procedurally just policing reaches unprecedented heights, web-based surveys offer one possible solution. We believe that customer satisfaction with police services and police encounters is the next frontier for systematic measurement to address equity concerns. In the U.S., 43.5 million persons had face-to-face contact with the police in 2005 (Durose et al, 2007). Residents have contact with the police in various settings (e.g. 
calls for service, incident reports, community meetings, vehicular or pedestrian stops, arrests), and in each of these encounters, email addresses could be collected and entered into the system. Email invitations to complete a short customer satisfaction survey could be sent in the days that follow. This type of feedback can be used to monitor and adjust performance at the individual, beat, or district levels, or within special units or bureaus. The National Research Council report on the status of American policing, entitled Fairness and Effectiveness in Policing (2004), emphasizes that public confidence in the police depends not only on an organization's effectiveness in fighting crime, but also on the public's perception of how they are treated by the police and the perceived quality of police service during police-citizen encounters. We believe that police organizations that endorse this type of measurement system will receive very high marks on organizational transparency, legitimacy, professionalism, and responsiveness to the community.

As a final question, who should have access to, and control over, the information generated? Although police cooperation and partnership are essential, we reluctantly conclude that the police organization should not completely control the data collection system. History suggests that maximum reform will occur when outsiders are watching and involved. Without question, internal gains have been achieved by introducing new mission statements, rules and regulations, officer training, and supervision, but critics have argued that external oversight is necessary to achieve sizable and lasting change in police organizations. Reiss (1971) and Mastrofski (1999) have proposed independent "auditing bureaus" to collect data on how citizens are treated by the police and vice versa.
Additionally, in recent years, independent police auditors have been created as part of new accountability structures (Walker, 2005), although not necessarily with survey skills or interests. We propose that the important function of external performance measurement be assigned to universities or other independent organizations that have expertise in both policing and social science research and that can establish working partnerships with the police and other stakeholders.

TABLE OF CONTENTS

CHAPTER ONE. INTRODUCTION…… 1
A. Overview…… 1
B. Goals and Objectives…… 2
C. Key Research Questions…… 2

CHAPTER TWO. LITERATURE REVIEW…… 4
A. Overview…… 4
B. Accountability…… 4
C. The Information Imperative…… 5
D. Information Technology and the Police…… 6
E. Testing Web-Based Community Survey…… 7
F. Survey Feedback…… 8

CHAPTER THREE. PROCEDURES AND METHODS…… 9
A. Experimental Design…… 9
B. Setting…… 9
C. Selection and Assignment of Study Beats…… 10
D. Selection of Internet Users within Study Beats…… 12
E. Development of Internet Survey Instruments…… 16
F. Development and Implementation of Observation and Questionnaire Methods…… 16
G. Development of Website and Educational Linkages…… 18
H. Posting Surveys/Managing the Monthly Sample…… 24

CHAPTER FOUR. LEVELS OF PARTICIPATION IN THE CHICAGO INTERNET PROJECT…… 28
A. CAPS Resident Participation…… 28
B. Random Sample Participation…… 37

CHAPTER FIVE. ADVANCES IN MEASUREMENT: THE DIMENSIONS OF INTERNET SURVEY INFORMATION…… 46
A. Measurement Overview…… 46
1. Traditional Performance Measures…… 46
2. Establish a Mandate and Information Imperative…… 48
3. Level of Measurement…… 49
4. Community-Based Measurement…… 50
5. Community Performance…… 51
B. Measurement Theory and Scale Construction…… 51
C. Measures of Police Performance…… 52
1. Dimensions of Police Performance…… 53
2. General Assessments of the Police…… 54
3. Global Evaluations of the Police…… 54
D. Competency Indices…… 56
1. Police Knowledge Index…… 56
2. Police Reliability Index…… 56
E. Neighborhood Specific Evaluations of the Police…… 57
1. Responsiveness to Community Index…… 57
2. Satisfaction with Neighborhood Police…… 58
F. Experience-based Assessments of the Police…… 58
1. Assessments of Police Stops Index…… 59
2. Satisfaction with Police Contacts…… 60
G. Performance at Public Meeting Index…… 62
H. Affective Response to Police Encounters…… 63
1. Anxiety Reaction Index…… 63
2. Secure Reaction Index…… 63
I. Assessments of Organization Outcome…… 64
1. Police Visibility Index…… 65
2. Effectiveness in Preventing Crime Index…… 65
3. Effectiveness in Solving Problems Index…… 66
4. Willingness to Partner with the Police Index…… 67
5. Engagement of the Community Index…… 67
6. Police Misconduct Index…… 68
7. Racial Profiling Index…… 69
8. Organizational Legitimacy Index…… 70
J. Measuring Individual and Collective Performance Indicators…… 71
1. Neighborhood Conditions…… 71
2. Individual Resident Performance…… 76
3. Collective Performance…… 80
K. Further Validation of Scales…… 82
1. Multi-Method Validation of Scales…… 82
2. Known Groups Validation of Scales…… 83
L. Measurement Sensitivity…… 86
1. Within-Race Differences…… 86
2. Identifying Hot Spots…… 90

CHAPTER SIX. THE CAPS EXPERIMENT: FINDINGS AND LESSONS LEARNED…… 91
A. Implementation Results within the CAPS Framework…… 91
1. Feasibility Study…… 91
2. Implementation: Protocol and Integrity…… 91
3. Police Attitudes about Participation in Project…… 100
4. Obstacles to Implementation…… 104
B. Testing the Effects on Residents and Officers: Hypotheses and Methods…… 111
1. Hypotheses…… 111
2. Measurement and Scale Construction…… 112
3. Sample…… 118
4. Statistical Techniques and Analysis…… 121
C. Results…… 124

CHAPTER SEVEN. RESULTS FROM THE RANDOM SAMPLE: THE IMPACT OF INTERNET INFORMATION ON PARTICIPANTS’ PERCEPTIONS AND BEHAVIORS…… 133
A. Hypothesized Impact of Interventions on Random Sample…… 133
B. Implementation Problems with Website…… 135
C. Statistical Techniques and Analysis Strategy…… 136
D. Results…… 140
E. Summary…… 147

CHAPTER EIGHT. CONCLUSIONS…… 148
A. The Experimental Interventions…… 148
B. New Measurement System…… 150

REFERENCES…… 153

LIST OF APPENDICES

APPENDIX A. Pre-Test and Post-Test Police Questionnaires…… A1
APPENDIX B. Pre-Test and Post-Test Citizen Questionnaires…… B1
APPENDIX C. Beat Meeting Observation Form…… C1
APPENDIX D. Directives Memo for CPD Personnel…… D1
APPENDIX E. Instructions Flyer for Completing Web Survey…… E1
APPENDIX F. Example of Results Distributed at Beat Meetings…… F1
APPENDIX G. Problem Solving Exercise…… G1
LIST OF TABLES

CHAPTER THREE
Table 3.1 Descriptive Statistics of Chicago Internet Study Beats…… 11
Table 3.2 Telephone Pre-Experiment -- Final Calling Outcomes (April 2005)…… 14
Table 3.3 Post-Experiment Final Calling Outcomes (January 2006)…… 15
Table 3.4 Overview of the Monthly Link Content…… 23

CHAPTER FOUR
Table 4.1 CAPS Response Rate for Internet Surveys (%)…… 28
Table 4.2 Participants at CAPS Beat Meetings and CAPS Participants who Completed Internet Surveys…… 30
Table 4.3 Obstacles to Resident Participation as Identified by Civilian and Police Facilitators (N=70)…… 34
Table 4.4 Response Rates for Internet Surveys…… 38
Table 4.5 The Number of Internet Surveys Completed…… 40
Table 4.6 Respondent’s Profiles…… 40
Table 4.7 Summary of Demographic Characteristics of Respondents by Participation Level…… 42
Table 4.8 Bivariate Results for Predictors of Participation in Internet Surveys…… 43

CHAPTER FIVE
Table 5.1 A Comparison of Telephone and Internet Data…… 83
Table 5.2 A Comparison of Official and Internet Data…… 83
Table 5.3 HLM Linear Regression Estimates for the Impact of Resident’s Race on Policing Constructs…… 85
Table 5.4 Bivariate Correlations for Residents from African American Communities…… 87

CHAPTER SIX
Table 6.1 Implementation Protocol by Experimental Condition…… 93
Table 6.2 Project Flyer Distribution Rate by Experimental Condition (%)…… 95
Table 6.3 of Survey Results in Feedback and Training Groups (%)…… 97
Table 6.4 Discussion of Survey Results in Feedback and Training Groups (%)…… 97
Table 6.5 Obstacles to Implementation as Identified by Civilian and Police Facilitators (N=68)…… 105
Table 6.6 Demographic Characteristics of Citizen Beat Meeting Participants by Experimental Conditions (N=668)…… 119
Table 6.7 Demographic Characteristics of Citizen Beat Meeting Participants by Experimental Conditions (N=184)…… 120
Table 6.8 Summary of Community Hypotheses for the CAPS Experiment…… 122
Table 6.9 Summary of Police Hypotheses for the CAPS Experiment…… 123
Table 6.10 A Summary of Multilevel Regression Estimates for Community Hypotheses of CAPS Experiment (Resident Questionnaire)…… 125
Table 6.11 OLS Regression Estimates for Community Hypotheses of CAPS Experiment (Observation Form)…… 126
Table 6.12 Logistic Regression Estimates for Commitment to Future Action for Community Hypotheses/CAPS Experiment (N = 48)…… 127
Table 6.13 OLS Regression Estimates for Police Hypotheses of CAPS Experiment (Police Questionnaire)…… 128
Table 6.14 OLS Regression Estimates for the Level of Implementation for Community Hypotheses of CAPS Experiment (Resident Questionnaire)…… 129
Table 6.15 OLS Regression Estimates for the Level of Implementation for Community Hypotheses of CAPS Experiment (Observation Form)…… 130
Table 6.16 Logistic Regression Estimates for the Level of Implementation for Commitment to Future Action of CAPS Experiment (N = 48)…… 131
Table 6.17 OLS Regression Estimates for the Level of Implementation for Police Hypotheses of CAPS Experiment (Police Questionnaire)…… 131
Table 6.18 OLS Regression Estimates for Citizen/Beat Characteristics for Survey Completion Rates (N=50)…… 132

CHAPTER SEVEN
Table 7.1 Summary of Policing Hypotheses for Random Sample with Descriptive Statistics…… 138
Table 7.2 Summary of Individual and Community Hypotheses for Random Sample with Descriptive Statistics…… 139
Table 7.3 A Summary of Multilevel Regression Results for Policing Hypotheses with Random Sample of Respondents…… 141
Table 7.4 A Summary of Multilevel Regression Results for Individual/Community Hypotheses with Random Sample of Respondents…… 142
Table 7.5 A Summary of Multilevel Regression Results for the Number of Internet Surveys Completed for Policing Hypotheses…… 143
Table 7.6 A Summary of Multilevel Regression Results for the Number of Internet Surveys Completed for Individual/Community Hypotheses…… 144
Table 7.7 A Summary of Multilevel Regression Estimates for Viewed Survey Results on Policing Hypotheses…… 145
Table 7.8 A Summary of Multilevel Regression Estimates for Viewing Results on Individual/Community Hypotheses…… 146

LIST OF FIGURES

CHAPTER THREE
Figure 3.1 Chicago Internet Project Website Page…… 19
Figure 3.2 Login Page…… 20
Figure 3.3 Survey Results Example…… 21
Figure 3.4 Crime Prevention Concepts Example…… 22
Figure 3.5 Community Participation Example…… 22
Figure 3.6 Community Resource Links…… 24

CHAPTER FIVE
Figure 5.1 Box Plots for Police Manners and Fairness Scales…… 87
Figure 5.2 Box Plots for Police Problem Solving and Reliability Scales…… 88
Figure 5.3 Box Plots for Police Responsiveness Scales…… 89

CHAPTER ONE

INTRODUCTION

A. Overview

Despite the observable progress in the areas of partnership building, problem solving, information technology, data-driven deployment, and police accountability, the fact remains that law enforcement organizations (or other entities) have yet to develop data systems to measure “what matters” to the public and “what matters” according to community policing and problem-oriented policing models. One can argue that this reality has limited organizational change, stunted the growth of police-community relations, and restricted organizational and community efficacy (Rosenbaum, 2004). The current project sought to fill this gap by developing, implementing, and evaluating the Chicago Internet Project. The University of Illinois at Chicago, in cooperation with the Chicago Police Department and community residents, developed and field tested a comprehensive web-based community survey in 51 Chicago police beats. The elements of the intervention included the collection of new information online, the dissemination (feedback) of this information to police and residents, the use of these new data elements in a problem solving setting (CAPS), and training in the use and interpretation of survey findings.
CAPS (Chicago Alternative Policing Strategy), the Chicago Police Department’s community policing program, consists of multiple components seeking to form and strengthen police-community partnerships in Chicago. This project was implemented, in part, within the context of CAPS community beat meetings. Chicago has 281 police beats in 25 police districts. Each month community residents have a structured opportunity to meet with beat officers and their team sergeant to engage in beat-level problem solving (see Skogan & Hartnett, 1997; Skogan et al., 1999; Bennis et al., 2003). On average, 25 residents and 7 police officers attend a typical beat meeting. In one component of this study, monthly Internet surveys were completed by CAPS beat meeting participants. In another component, monthly Internet surveys were completed by a random sample of residents from each study beat who do not regularly attend CAPS meetings. CAPS participants comprise a voluntary, self-selected group that represents, on average, only 0.5% of the beat population. They are more likely than the typical resident to be homeowners, non-Latinos, and residents over 65 (Skogan, 2006). Hence, the random sample of online participants provided a separate test of external validity and allowed us to examine feedback effects without public deliberation. Whether online surveys and information feedback loops can contribute to public safety processes is an important question in this electronic age. The Internet opens the door to virtually unlimited possibilities for two-way information sharing between the police and the community. For years, police have resisted sharing crime-related information with residents because of concern about heightened fear, but this apprehension has not been supported by controlled tests of this hypothesis (see Rosenbaum et al., 1998).
But the question remains: what is the impact of sharing diverse types of geo-based information on public perceptions of crime, the police, police-community partnerships, and the community itself? Can partnerships, problem solving, and community crime prevention behaviors be enhanced via this process? To date, we know very little about how new crime-related and prevention-related information will affect perceptions and behaviors at the individual or collective level. At the same time that social scientists are beginning to explore the social consequences of Internet use (see Katz & Rice, 2002, for a review of the literature), our knowledge of its effects in the police-community context is virtually nonexistent.

B. Goals and Objectives

The dual goals of this project are (1) to successfully implement, on a large scale, a comprehensive web-based community survey and identify the challenges to transferring this infrastructure to other settings; and (2) to determine whether a web-based survey system can enhance the problem solving process, increase community engagement, and strengthen police-community relations. The elements of the intervention include the collection of new information through the Internet, the dissemination (feedback) of this information to police and residents, the use of these new data elements in a problem solving setting (CAPS), and training in the use and interpretation of survey findings.

C. Key Research Questions

Several key questions were addressed as part of the Chicago Internet Project:

(1) Can we successfully design and implement a comprehensive community Internet survey?
Specifically, what resources and design processes are necessary to build the infrastructure and implement online surveys and feedback mechanisms in the field? What obstacles were encountered and what lessons were learned?

(2) As a measurement device, how well does the Internet survey perform with respect to measuring neighborhood problems, community and police performance, and local program outcomes? Are the survey questions reliable over time? Do they have content validity, covering a wide range of relevant constructs and components of these constructs? Do they have construct validity, tapping into some of the key underlying constructs in the police-community arena?

(3) How well does the Internet survey work as a mechanism for giving feedback to community residents and police officers? Specifically, are the recipients open to receiving feedback and do they take the process seriously? Do they spend time discussing and reacting to the information? Are they able to use the information to identify and prioritize neighborhood problems?

(4) What are the effects on the target audiences of collecting and feeding back this information? For the general public, how does participating in this process affect their perceptions of neighborhood problems, community capacity, the local police, and their own crime prevention knowledge and behavior? For CAPS participants, does it enhance the problem solving process by improving problem identification or analysis? Does it stimulate more solutions and action plans?
Whether residents participate in CAPS or receive feedback online, at the core of this evaluation is a set of questions about whether survey information, when supported with training and technical assistance, will influence residents’ perceptions of crime, neighborhood safety, the local police, and their own capacity to prevent crime. Here we seek to understand the effects of four processes: (1) feedback of public attitudinal and perceptual survey data; (2) feedback of crime prevention tips; (3) public deliberation about survey findings; and (4) training in the use of survey research findings (see research design below for details). We have conducted randomized field experiments testing specific Internet interventions with two separate groups: CAPS participants and a random sample of residents from the beat. The latter provides greater external validity, as this group is more likely to be representative of households in the neighborhood. The CAPS group, however, provides the advantage of allowing us to examine the impact of information sharing within a police-community partnership. This study will also examine the impact of survey information and public deliberation on officers’ perceptions and behaviors. Because CAPS is a joint problem solving environment, the role of the police is important. How officers respond to beat problems or evaluate their partnership with the community may or may not be affected by the survey findings.

(5) Do the effects of information feedback vary as a function of individual or group characteristics? The beats and individuals sampled are quite diverse, and thus various factors may interact with the treatment to produce conditional effects. Will residents respond differently to this experiment than the police? Will residents from predominantly African American or Latino beats respond differently than residents from White beats? Will neighborhood context (e.g.
levels of crime and poverty), which affects community capacity, influence this particular intervention?

(6) What lessons are learned that are transferable to other communities? Despite extensive field testing, many feasibility questions remain: What obstacles are encountered in the data collection, analysis, and feedback process? How can the Internet survey be refined and adapted for use in other cities? Can the results be helpful in establishing expectations about levels of participation, identifying obstacles to online surveys, and evaluating specific survey items and scales? The Chicago Internet Project also raises some fundamental questions about how best to enhance the accountability of the police to the communities they serve, and about the potential role of universities or other independent entities in the data collection and reporting process.

[Footnote 1: In 2003, only 1 in 5 Chicago beat meetings resulted in an action plan.]

CHAPTER TWO

LITERATURE REVIEW

A. Overview

Over the past 20 years, we have witnessed a flurry of activity directed at improving policing in America. The COPS Office and NIJ have promoted substantive reforms and evaluation research, respectively, at the local and national levels. Community policing and problem solving have emerged as the primary models for policing in the future (see Greene, 2000; Rosenbaum, 1994), but the challenges ahead are numerous (see Fridell & Wycoff, 2004; National Research Council, 2004; Skogan, 2003a).
A few quick examples: (1) Advanced technology is now available, but using it for sophisticated problem solving, strategic planning or community engagement is a task for the future; (2) Accountability is the coin of the realm, but accountability to whom and for what? Rather than accountability to central administrators for crime rates, can police organizations become more transparent to the public and accountable at the neighborhood level for things that matter to local residents? (3) Zero-tolerance, broken windows, Compstat, and hot spots policing may have played some role in declining crime rates in the 1990s (Weisburd & Braga, 2006), but critics argue that aggressive policing will undermine trust and confidence in the police, especially in minority communities (Tyler, 2005; Walker & Katz, 2008) and may have other adverse effects as well (see Rosenbaum, 2006; 2007). Some have argued that police can be both stronger and gentler under the right conditions (Harris, 2003), but what are those conditions and why haven’t we created them more often? (4) Police organizations are learning the value of partnerships in crime fighting (McGarrell & Chermak, 2003; Roehl et al., 2006), but we know so little about the partnership dynamics that too often undermine these relationships and limit problem solving skills (see Rosenbaum, 2002). As we strengthen the capacity of police organizations to respond to public safety issues (with better training, intelligence, analysis capabilities, accountability, etc.), what have we done to strengthen communities? How institutionalized are the partnerships and what can be done to strengthen them? (5) Finally, police organizations today are engaging in a multitude of problem solving and community engagement projects, including street-level hot spots policing and disorder policing, but how do we know when these efforts have been effective? How is success defined and what data system can be used to measure success? 
The Chicago Internet Project is based on the premise that, while much has been done under the community policing and problem-oriented policing models, progress in reforming police organizations and communities has been restricted by our failure to explore new measures of success and new methods of accountability that are grounded in the community. While community residents demand safer streets and less violence, they also want a police force that is fair and sensitive to their needs (Rosenbaum et al., 2005; Skogan, 2005; Tyler, 2005; Weitzer & Tuch, 2005). How can all of this be achieved, and how can it be measured?

B. Accountability

Traditionally, police accountability has been an internal and legal process, focusing on the control of officers through punitive enforcement of rules, regulations, and laws (Chan, 2001). Today, police organizations are under pressure to be responsive to the public both for crime control and for police conduct. Following the lead of New York’s COMPSTAT model, we have seen widespread interest in computer-driven measurement of police performance using traditional crime indicators. These traditional measures are important, but they are grossly inadequate for satisfying the new information imperative of community policing and for taking urban police organizations to the next level of performance. Simply put, these indicators of performance do not attempt to gauge in a meaningful way customer satisfaction with the quality of police service or the quality of police-community partnerships. As Rosenbaum (2004) notes, only when performance evaluation systems change can we expect police-community interactions to change.
To achieve marked improvements in police performance, accountability systems will need to be expanded to incorporate new standards based on the goals of partnership building, problem solving, community engagement, and resident satisfaction with police services. Similarly, local residents must be held more accountable for public safety and crime prevention. Despite the overall success of community policing in Chicago, for example, most local residents do not attend CAPS meetings and do not participate in problem solving (Skogan & Hartnett, 1997; Skogan, 2006). The residents who do participate often expect the police to solve their problems, and sustaining their participation is a continuous challenge for the police. Thus, creating official measures of residents’ performance may increase their accountability for neighborhood conditions.

C. The Information Imperative

In this information-driven society, community- and problem-oriented models of policing create a new information imperative (Dunworth et al., 2000; Rosenbaum, 2004) and call on police executives to “measure what matters” in the 21st century (see Masterson & Stevens, 2002; Mirzer, 1996; Skogan, 2003b). If police organizations are serious about decentralization of authority, for example, then beat officers must be empowered with up-to-date information about neighborhood characteristics and should be accountable for their relationships with neighborhood residents. If data-driven problem solving is a priority, then police officers and supervisors need timely geo-based information relevant to all phases of the problem analysis process (see Boba, 2003; Goldstein, 1990). Especially important (and often neglected) are data about the concerns and priorities of local residents and community organizations, as well as factors in the local environment that are either preventative or criminogenic.
If community engagement is a priority, then police officers need reliable information about community capacity, current levels of community crime prevention behavior among neighborhood residents, and local resources that can be leveraged to help prevent crime and disorder. Measuring the police-community interface is critical for achieving strong police-community relations and for stimulating community-based crime prevention. Both are needed to create the effective partnerships that have been postulated as the heart of community policing and problem solving (Cordner, 1997; Rosenbaum, 2002; Schuck & Rosenbaum, 2006). If police-community relations are a priority, accountability systems should begin to examine the mundane day-to-day interactions between police and citizens. Here we should ask: How are the police responding to residents as victims, witnesses, suspects, complainants, callers, and concerned citizens? And how do residents respond to the police? In a nutshell, researchers, community leaders, and police administrators should begin to ask: what are the important dimensions of this relationship? Drawing on the private sector model, Mastrofski (1999) outlines six characteristics of good police service: attentiveness, reliability, responsiveness, competence, manners, and fairness. Skogan and Hartnett (1997) have validated several of these through citywide resident telephone surveys in Chicago. Yet a system for collecting timely, geo-based survey data has yet to be created. An important policy question is how to get police officers to engage in these behaviors more often.
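Mastrofski’s six dimensions lend themselves to simple beat-level index construction once survey responses are in hand. The sketch below is a hypothetical illustration of that scoring step, assuming each respondent rates each dimension on a 1-to-5 scale; the field names, the tolerance for item nonresponse, and the averaging rule are our own assumptions, not the project’s actual scoring procedure.

```python
from collections import defaultdict
from statistics import mean

# Mastrofski's (1999) six characteristics of good police service
DIMENSIONS = ["attentiveness", "reliability", "responsiveness",
              "competence", "manners", "fairness"]

def beat_profiles(responses):
    """Average the 1-5 ratings on each service dimension within a beat,
    so feedback can be reported at the level where officers work."""
    by_beat = defaultdict(lambda: defaultdict(list))
    for r in responses:
        for dim in DIMENSIONS:
            score = r.get(dim)
            if score is not None:      # tolerate item nonresponse
                by_beat[r["beat"]][dim].append(score)
    return {beat: {dim: round(mean(scores), 2)
                   for dim, scores in dims.items()}
            for beat, dims in by_beat.items()}
```

Averaging within beats is the simplest possible aggregation; a production system would likely weight for response bias and suppress cells with too few respondents to protect anonymity.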
Some gains have been achieved by introducing new mission statements, rules and regulations, and officer training, but critics have argued that external oversight is necessary to achieve sizable and lasting change in police organizations. Reiss (1971) and Mastrofski (1999) have proposed independent "auditing bureaus" to collect data on how citizens are treated by the police and vice versa. In any event, various methods have been employed for collecting new information from citizens, ranging from surveys to beat meetings (see Skogan & Hartnett, 1997). According to national surveys, one of the largest changes in police organizations between 1992 and 1997 was the use of citizen surveys to gauge public reactions (Fridell & Wycoff, 2004). By 1997, roughly 3 out of 4 departments claimed to have used citizen surveys to help them identify needs and priorities, and nearly as many used them to evaluate police services. The challenge, as laid out here, is to institutionalize this process using web-based technology.

D. Information Technology and the Police

The "information technology (IT) revolution" is a half-century old, yet it is just beginning to impact the criminal justice system (see Brown, 2000; Chan, 2001), which lags far behind the private sector (Dunworth, 2000; Reuland, 1997). Since the mid-1990s, the COPS Office has helped to stimulate a renewed interest in IT, especially by funding laptop computers for patrol officers (see Roth et al., 2000). A variety of new technology-driven law enforcement initiatives have received national attention in recent years, such as COMPSTAT (McDonald, 2000, 2005) and COMPASS (Dalton, 2002), and these models have given police organizations a taste of what is possible. While law enforcement agencies are making significant progress toward harnessing the power of information technology, rarely do these initiatives give attention to the information imperative of problem solving and community policing.
Rather, police tend to focus on new ways of processing traditional data elements to catch known criminals (Chan et al., 2001; Rosenbaum, 2006; Weisburd et al., 2006). One of the most sophisticated of these information systems is Chicago's CLEAR (Citizen and Law Enforcement Analysis and Reporting) program. As with other police data systems, it has multiple components intended to improve traditional law enforcement strategies. A key difference, however, is its community component, which has been conceptualized as a vehicle for increased information sharing with the community (see Skogan et al., 2002; Skogan et al., 2005). This community component of Chicago's CLEAR program remains undeveloped, but recently the Chicago Police Department, in partnership with community organizations, has moved ahead with a new initiative, CLEARPath, in the hope of beginning to fill this gap. Hence, when the Chicago Internet Project was initiated, the research team faced a unique opportunity to begin measuring, for the first time, neighborhood concerns and behaviors that are known to be important for maintaining public safety and strengthening police-community partnerships. Having spent considerable time developing and testing measures of the public's perceptions of crime, disorder, and police performance, as well as residents' reactions to crime (e.g., Rosenbaum, 1986; 1994; Rosenbaum & Baumer, 1981; Rosenbaum, Lurigio, & Davis, 1998; Rosenbaum et al., 2005; Schuck & Rosenbaum, 2005), we began working with the CPD and community leaders to develop an unprecedented web-based system of data collection with the potential for transferability.

E. Testing a Web-Based Community Survey

A number of police departments in the United States now offer online information about their services, programs, and crime statistics (Haley & Taylor, 1998; Rosenbaum, Graziano, & Stephens, in preparation). To date, however, few have moved beyond simply posting information to the point of embracing the Internet as a proactive tool for obtaining new information about neighborhood conditions, solving problems, building partnerships, evaluating programs, and assessing unit performance. Rosenbaum (2004) has proposed a comprehensive website with five major components, ranging from crime reporting to performance assessment. The focus of this project is on one major component, namely, web-based community surveys. Although a few police departments conduct Internet surveys, these efforts are not comprehensive, methodologically defensible, or institutionalized. Furthermore, the information is not used as a primary source for strategic or tactical planning by police or community residents. In Chicago, we developed and pilot tested a comprehensive web-based community survey designed to achieve several measurement objectives:

(1) Monitor neighborhood conditions and citizen performance. Citizen performance measures will capture levels of community involvement, collective efficacy, perceptions and fears about safety and crime, problem solving skills, crime prevention behaviors, and more. Knowing the level of community efficacy and involvement will someday allow police and community leaders to determine the scope of community building efforts that are needed before satisfactory community-police collaborations can occur. Abrupt reductions or increases in citizen perceptions of crime problems and fears will help monitor "perceptual hot spots," direct police resources, and evaluate police and/or community initiatives within particular communities.

(2) Monitor police performance.
An Internet survey can be used to gauge residents' perceptions of police performance, capturing aspects important to the community, such as general perceptions of police competency and fairness. In addition to capturing general sentiments, an Internet survey can be used to screen for persons who have had a recent encounter with the police (as victims, witnesses, callers, drivers, walkers, arrestees, meeting attendees, complainants, etc.) and then branch off into a series of questions about how they were treated. These "customer satisfaction" items build upon the citywide resident telephone surveys used to evaluate community policing in Chicago for 10 years (see Skogan & Steiner, 2004) and adapted by the Vera Institute of Justice to evaluate police performance in New York City and Seattle (Miller et al., 2005). Thus, key aspects of police-resident encounters can be captured. The major limitation of previous high-quality survey research, usually involving telephone survey methods, is that the findings cannot be disaggregated to small geographic areas and they are one-time "snapshots" of community responses. The added cost of collecting data more frequently or producing larger sample sizes (needed for smaller areas) has been prohibitive. Online surveys offer a cost-effective alternative.

(3) Evaluate anti-crime interventions. By conducting monthly or bi-monthly online surveys at the police beat level, both the police and residents can receive timely data indicating whether problems are increasing, decreasing, or staying the same.
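One way to use such beat-level trend data is a simple difference-in-differences comparison against beats that did not receive an intervention. The sketch below illustrates the arithmetic only; the beat-level means are hypothetical placeholders, not values from this study.

```python
def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Difference-in-differences: the change in intervention beats
    minus the change in comparable control beats."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical mean "perceived disorder" scores (1-4 scale)
effect = did_estimate(treat_pre=2.8, treat_post=2.3,
                      ctrl_pre=2.7, ctrl_post=2.6)
print(round(effect, 2))  # -0.4: problems fell more in intervention beats
```

A negative estimate here would suggest that perceived problems declined more in the intervention beats than would be expected from the citywide trend alone.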
Data from comparable beats that do not receive a particular police or community intervention can serve as control groups for estimating program impact.

(4) Offer policy recommendations. Web-based surveys present a great opportunity for police agencies to receive new ideas and suggestions from citizens. Multiple perspectives on problems, programs, and policies can be encouraged.

F. Survey Feedback

An Internet survey is valuable for communities if it can produce timely, geo-based information that is useful for planning or evaluating local police and community actions designed to improve neighborhood safety. Within a community policing/problem-oriented policing framework, the Chicago Internet Project will examine whether survey feedback is useful for changing the perceptions and behaviors of residents and police officers.

CHAPTER THREE
PROCEDURES AND METHODS

A. Experimental Design

This project employed a randomized trial to study and test our previously stated research questions (see Chapter 1) regarding the development of a comprehensive web-based community survey. The basic design included (1) the collection of new community-based information through Internet surveys; (2) dissemination (feedback) of this information to police and residents; and (3) exposure to additional training and educational materials. Both a random sample of residents and residents attending CAPS beat meetings in 51 Chicago police beats were asked to complete monthly Internet surveys. The survey results were then supplied to residents in the random sample to view through the Internet, and to police and residents at their beat meetings for discussion and use in problem solving.
To test the impact of using this information and the potential benefits of additional training/education for police and residents, the study beats were assigned to one of three experimental conditions: control, feedback, and training. While residents in beats within each condition were asked to complete Internet surveys, the critical component of feeding back survey results was reserved for beats in the feedback and training conditions. Residents in the control condition were simply asked to complete surveys each month and served as the baseline for testing the impact of receiving feedback. Given the difference in samples (a random sample of residents from the study beats vs. residents attending beat meetings), the nature of the intervention varied for these groups. For the random sample, feedback consisted of providing survey results to residents in the feedback and training beats via an Internet website. The training component, administered through the same website, consisted of crime prevention and other public safety information (discussed below). For CAPS participants, feedback consisted of providing police and residents with paper copies of the survey results at their monthly beat meetings to use in discussion and problem solving activities. Police in the training condition were provided with additional classroom instruction on problem solving and use of survey results. For a full discussion of the research design employed in the CAPS experiment, see Chapter 6. The project was introduced to participating beats in March 2005 and continued until September 2005, during which six waves of Internet surveys were administered and five sets of survey results were fed back to residents in the random sample and CAPS beat meeting participants.

B. Setting

The study took place in Chicago, Illinois, a large Midwestern metropolitan city.
In 2004, the year before the research began, Chicago was the third largest city in the United States, behind only New York and Los Angeles, with a total population of about 2.7 million residents. In 2004, about 46.8% of the residents were White, 36.2% African American, and 27.4% Latino/a (in Census figures, Latino/a residents may be of any race, so these percentages overlap). The median household income was $40,656, slightly below the national median of $44,684.

C. Selection and Assignment of Study Beats

Using data from the 2000 U.S. Census and survey data from evaluations of the CAPS programs, all 280 police beats in Chicago were analyzed to generate profiles of relevant demographic variables for each beat. Using the key variables of income and race, the lowest income beats were excluded, and the remaining beats were then stratified to ensure diversity within the final 60 beats selected. After initial telephone interviews within these 60 beats (described below), 51 beats were selected to participate in the study. The 51 beats were geographically distributed throughout the city and represent 18 of the 25 administrative Chicago Police Department districts. Table 3.1 presents a comparison between the study beats and all beats in the city.2 Residents of the study beats tended to be more affluent than the general population of the city. For example, the median income of residents in the study beats was $51,663, compared to $36,981 for all residents. The study beats also had a greater percentage of college graduates and homeowners, and fewer female-headed households. The overall crime rate, the violent crime rate, and the robbery rate were similar between the study beats and the city as a whole.
However, on average, the homicide rate was significantly lower in the study beats than in the city as a whole. There were two primary objectives in designing the beat sampling strategy. The first objective was to select beats with a large percentage of residents with Internet access. An earlier pilot project of the Chicago Internet Project (Skogan et al., 2004) highlighted the problem of the "digital divide": a larger percentage of economically disadvantaged residents do not have computers or access to the Internet. Because of cost and resource considerations, a decision was made to target more affluent beats in the city so that we could recruit a sample of residents who could participate in the project. Hence, this beat selection process accounts for the differences described above. The second objective was to ensure adequate representation of the diverse racial and ethnic communities in Chicago, especially African Americans, Latinos, and Whites. The first step of the beat sampling strategy was to stratify all of the Chicago police beats into four racial and ethnic groups: predominately White, African American, Latino/a, and mixed race neighborhoods (a homogeneity index was computed for this purpose). The second step was to sort each of the four strata by the percentage of residents with annual incomes of $40,000 or greater. This process yielded a sampling frame that could be used to select an adequate number of beats from racially and ethnically homogeneous communities (i.e., White, African American, and Latino/a) and racially and ethnically heterogeneous communities (i.e., no racial or ethnic group comprising a majority of residents), while maximizing the potential for selecting beats within each of the racial/ethnic strata with a large percentage of residents who had Internet access.
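The stratification step can be illustrated with a small sketch. The report does not specify the exact homogeneity index or cutoff used, so the Herfindahl-style index and the 0.5 dominance threshold below are illustrative assumptions, and the beat records are invented for the example.

```python
def homogeneity(shares):
    """Herfindahl-style index: sum of squared racial/ethnic shares.
    Values near 1.0 indicate a homogeneous beat; lower values, a mixed one."""
    return sum(s * s for s in shares)

def classify(beat, threshold=0.5):
    """Assign a beat to a stratum by its dominant group, or 'mixed'
    if no group is sufficiently dominant (threshold is illustrative)."""
    groups = {"White": beat["white"], "African American": beat["black"],
              "Latino/a": beat["latino"]}
    name, share = max(groups.items(), key=lambda kv: kv[1])
    return name if share >= threshold else "mixed"

# Hypothetical beats: population shares plus % of households earning $40,000+
beats = [
    {"id": 1, "white": 0.82, "black": 0.08, "latino": 0.10, "inc40k": 0.61},
    {"id": 2, "white": 0.30, "black": 0.35, "latino": 0.35, "inc40k": 0.48},
]
strata = {}
for b in beats:
    strata.setdefault(classify(b), []).append(b)
# Within each stratum, sort by the share of households earning $40,000+
for s in strata.values():
    s.sort(key=lambda b: b["inc40k"], reverse=True)
print(sorted(strata))  # ['White', 'mixed']
```

Sorting each stratum by income then lets the highest-access beats be selected first, as described above.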
The number of beats selected within each stratum was based on the racial/ethnic representation of the population of Chicago beats. Additionally, because the study design dictated three conditions (i.e., control, feedback only, and feedback/training), the number of beats selected within each stratum had to be divisible by three. As such, the first 21 beats from the White stratum, the first 21 beats from the African American stratum, and the first 9 beats from the Latino/a stratum were selected for inclusion in the study.

Table 3.1 Descriptive Statistics of Chicago Internet Study Beats

                                   Study Beats       All Chicago Beats
                                    (N = 51)             (N = 279)
                                   M        SD         M        SD      F-Test
2000 Census Data
  Population                    12,884    4,928     10,379    5,427     9.44**
  % pop. 65 and older            11.03     5.47      10.08     4.54     1.76
  % pop. 15-24 years old         13.72     3.62      14.83     3.02     5.11*
  % White                        38.62    35.54      25.89    28.83     7.79**
  % African American             39.18    42.92      48.74    42.03     2.22
  % Latino/a                     17.58    26.01      20.45    26.08      .52
  Median income                 51,663   15,086     36,981   15,634    38.43***
  % income > $40,000             59.08    10.04      43.77    16.38    41.62***
  % income < $15,000             13.96     4.78      24.73    13.77    30.45***
  % college graduates            57.19    20.98      45.86    19.54    14.17***
  % female headed households      7.92     6.18      13.56     9.77    15.82***
  % homeowners                   55.87    21.47      40.40    21.14    22.96***
2004 Crime Data (per 1,000 residents)
  Crime rate                    145.68    72.43     293.09   837.83     1.57
  Violent crime rate             14.94    11.49      28.06    72.18     1.67
  Robbery rate                    5.66     5.11       9.20    14.36     3.03
  Homicide rate                    .16      .19        .26      .32     5.01*

*p<.05 **p<.01 ***p<.001
2 Information on one beat was missing from Skogan's data.
One of the first 21 African American beats had to be excluded because residents of that beat had participated in an earlier pilot study of the Chicago Internet Project; the next beat on the sampling frame was selected as a replacement. One of the first 21 White beats was excluded because its crime rate was extremely high, more than 3 standard deviations above the mean of the distribution of crime rates for the other White beats selected for the study. The next beat on the sampling frame was selected as a replacement. Because the beats in the racially and ethnically heterogeneous stratum were very diverse, a decision was made to select groups of beats based on the largest racial or ethnic group represented in the beat: three beats with a significant African American population, three beats with a significant Latino/a population, and three beats with no clear majority of any racial or ethnic group. Within the White, African American, and Latino/a strata, the beats were matched in groups of three based on the economic advantage characteristics of the beat (i.e., income, education, and homeownership) and the robbery rate. A random selection process was then used to assign each of the matched beats to one of three conditions: control, feedback only, or feedback and training.
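The matched-triple random assignment can be sketched as follows. The beat identifiers are placeholders, and the seeded `random.Random` stands in for whatever randomization procedure the team actually used; the key property is that every matched group of three contributes exactly one beat to each condition.

```python
import random

CONDITIONS = ["control", "feedback only", "feedback and training"]

def assign_matched_triples(triples, seed=2005):
    """For each matched group of three beats, randomly assign one beat
    to each of the three experimental conditions."""
    rng = random.Random(seed)
    assignment = {}
    for triple in triples:
        conditions = CONDITIONS[:]
        rng.shuffle(conditions)
        for beat, condition in zip(triple, conditions):
            assignment[beat] = condition
    return assignment

# Placeholder beat IDs, matched on income, education,
# homeownership, and the robbery rate
triples = [("beat_A1", "beat_A2", "beat_A3"),
           ("beat_B1", "beat_B2", "beat_B3")]
assignment = assign_matched_triples(triples)
# Every matched triple receives each condition exactly once
for triple in triples:
    assert sorted(assignment[b] for b in triple) == sorted(CONDITIONS)
```

Blocking the randomization within matched triples keeps the three conditions balanced on the matching variables by construction.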
For the racially and ethnically heterogeneous stratum, a random selection process was used to assign each of the beats with a significant African American population to one of the three conditions, each of the beats with a significant Latino/a population to one of the three conditions, and each of the beats with no clear majority of any racial or ethnic group to one of the three conditions. Although the heterogeneous stratum beats were not matched in the same way as the beats in the racially and ethnically homogeneous strata, within each of the three groups (i.e., significantly African American, significantly Latino/a, and no clear majority) the heterogeneous stratum beats were relatively similar in terms of economic advantage and crime rates. During the telephone recruiting phase of the project, nine beats were dropped from the study because of cost overruns. The lowest performing White, African American, and Latino/a beats and their respective matched beats were dropped from the study, prior to the administration of the first Internet survey.

D. Selection of Internet Users within Study Beats

Telephone surveys were employed primarily to locate a representative sample of Internet users in the selected police beats. The telephone data were also used to validate some Internet findings and to test specific hypotheses about the impact of Internet participation (vs. non-participation). For the telephone surveys, we subcontracted with Northern Illinois University's Public Opinion Laboratory, whose job was to identify Internet-ready households and to collect limited survey data from respondents. Using a Chicago reverse directory, which lists phone numbers by block, our research team generated random samples of households in each of the 60 beats. This information was given to the survey lab as the telephone survey sample.

Pre-experiment telephone survey.
The first telephone survey (pre-experiment) was pilot tested to achieve an interview of approximately 10 minutes in length, with items that are meaningful to respondents. In the fielded survey, screening questions allowed interviewers to identify respondents who: (1) have regular access to email from home or work; (2) do not attend CAPS meetings on a regular basis (defined as 2 or more times in the past 6 months); (3) are at least 18 years old; and (4) continue to live in the neighborhood/police beat from which they were sampled. For respondents who met these criteria and agreed to participate in the Chicago Internet Project, email addresses were obtained and additional perceptual data were gathered. Specifically, survey questions asked about fear of crime, individual and collective capacity to respond to crime, general assessments of police in their neighborhood, personal Internet usage patterns, and standard demographic characteristics. Pre-experiment telephone data were collected between January 14 and April 17, 2005. UIC provided the survey laboratory with 45,992 phone numbers. The lab used 32,688 of these numbers and made 94,890 calls. This effort yielded 2,085 completed interviews. The survey lab then sent the UIC research team a list of all completed interviews, names, and email addresses for future participation in the Chicago Internet Project. During the data collection process, we realized that, despite our efforts to over-sample middle income neighborhoods, Internet access remained a significant problem in several neighborhoods.
Hence, data collection was discontinued in nine police beats because the survey lab was struggling to generate a sufficient sample of Internet users. Resources were transferred to other beats, with the goal of achieving 25 to 35 respondents per beat.3 (Across all 60 beats, 24% of all households were excluded from the study because they reported a lack of access to the Internet, and in many beats non-access exceeded one-third of the sample.) Our survey strategy thus sought to balance the total sample size against our concern for the sample size per beat. The final number of completed interviews dropped from 2,085 to 1,976 for the pre-experiment telephone survey as a result of this decision, but more stable estimates were achieved for participating beats. The disposition of all calls for the pre-experiment survey is shown in Table 3.2. Using standard formulas established by the American Association for Public Opinion Research (1998), the pre-experiment survey outcomes can be described in these terms: 11% response rate, 24% cooperation rate, 34% refusal rate, and 57% contact rate. These figures suggest the difficulty of identifying random households that both have Internet access and are willing to participate in a long-term Internet survey. Nevertheless, after making nearly 95,000 phone calls, we were able to identify approximately 2,000 willing participants. We estimate the cost of this entire screening process at roughly $30 to $35 per participant.

3 To prevent problems with random assignment, the 9 beats were discontinued/dropped in matched groups of three, representing clusters of three African American, three Latino, and three White beats.
Table 3.2 Telephone Pre-Experiment -- Final Calling Outcomes (April 2005)

# Calls   Code  Interim Call Disposition
     78   4R    PWA refused for R, incl. HUDI w/sel R / R won't come to phone
     39   8A    800 line CB, set appointment
      3   8C    800 line CB, completion
      1   8N    800 line CB, not a residence
     16   8R    800 line CB, HH or respondent refusal
  2,482   BY    Normal busy signal
    329   BZ    Business or pay phone
     10   CB    All circuits busy message
  2,085   CM    Completed interview
    112   FA    Firm appointment with a respondent
    178   FB    Fast busy signal
    887   FM    Fax/data/modem, no human contact
      6   GH    Group home (>9 men or >9 women), temporary residence
     59   HA    HH away for entire interview period
  9,478   HC    Neutral HH contact, no respondent selected
 10,315   HR    HH refuses, incl. HUDIs if known household
  2,875   HU    Hang up without contact, not known if eligible, no respondent
     75   IM    Physical/mental impairment at the household
    543   LB    Language barrier at HH (HH does not speak ENG or SPAN)
  2,437   MM    Answering machine, left message at HH
  7,785   MR    Answering machine at residence
  4,091   MS    Answering machine, left message, unknown if residence
 21,579   MU    Answering machine, unknown if business or HH
 14,686   NA    No answer by any device, normal ring
  7,201   NE    R does not meet eligibility requirements of project
  4,113   NW    Non-working/NIS/disconnected/changed/# can't be verified by PWA
    531   OF    Person who answers says take off list/don't call back
     66   OG    Outside geography
  2,090   OS    Temporarily out of service/checked for trouble
    104   PC    Partial, R broke off, either refused to go on or finish later
    208   PN    Possible non-working number
     59   TB    Tech barrier at residence, any automated call-blocking
      7   TM    Tech barrier, left message or stated why calling
    253   TU    Tech barrier, unverified residence, no message or ID possible
    109   UA    All residents of HH under 18 or phone line strictly a teen phone
 94,890         Total phone calls
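The screening effort summarized in Table 3.2 can be restated in a few lines. The figures come from the table and the report's own $30-$35 per-participant estimate; the simple ratios below deliberately omit the eligibility adjustments used in the full AAPOR (1998) rate formulas.

```python
# Dialing effort behind the pre-experiment screening (Table 3.2)
total_calls = 94_890
completes = 2_085        # CM: completed interviews

calls_per_complete = total_calls / completes
print(round(calls_per_complete, 1))  # ~45.5 dials per completed interview

# Implied total screening cost at the report's $30-$35 per participant
low, high = 30 * completes, 35 * completes
print(low, high)  # 62550 72975
```

The ratio makes the report's point concrete: locating one Internet-ready, willing household took about 45 dials.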
Post-experiment telephone survey. The second and final telephone survey (post-experiment) served as a posttest to measure a small set of outcomes and to query respondents about their experience as participants in the Chicago Internet Project. The post-experiment survey was fielded after 7 waves of Internet surveys, or 6 months after the pre-experiment survey. This was a panel sample, with telephone interviewers re-contacting persons who had completed a telephone interview at wave 1. To allow for comparisons between different levels of participation in the Internet project, we stratified our post-experiment panel sample to include eligible non-participators (i.e., completed the pre-experiment survey but did not complete any online surveys), moderate participators (i.e., completed 1-2 online surveys), and heavy participators (i.e., completed 3 or more online surveys). The survey lab was given a sample of 1,594 cases to call across these groups. Eligible non-participators were, as expected, much more difficult to reach, but the lab was able to achieve a response rate of 37% with this group. Better success was achieved with the moderate participators (56% response rate), and the most success was achieved with the heavy participators (70% response rate). The overall response rate for the post-experiment survey was 56.1%, fully consistent with results from other telephone surveys that involve a 6-month lag between waves. As expected, response rates varied by neighborhood/beat, with 13 beats between 30-49%, 23 beats between 50-59%, and 15 beats at 60% or higher. The final calling outcomes for the post-experiment survey are shown in Table 3.3. In the end, 894 respondents completed both the pre- and post-telephone surveys.
Table 3.3 Post-Experiment Final Calling Outcomes (January 2006)

Total Cases Fielded (N = 1,594)
Outcome                                     Num. Cases
110 - complete                                   894
210/220 - refusals                                77
280 - contact/no action                          271
335 - only pick up was answering machine         202
360 - no answer                                   26
417/410 - no new #                               119
430 - fax/modem                                    5
Total                                          1,594

Calls made: 8,330     Completes: 894     Calls per complete: 9.32
Average interview length: 9.75 mins
Overall response rate: 56.1%     Contact rate: 77.9%
Cooperation rate (among contacted): 72.0%     Refusal rate (among contacted): 6.2%

E. Development of Internet Survey Instruments

Prior to the beginning of the project, the research team developed a data matrix that identified the theoretically important constructs needed to test the impact of the experiment and to advance our knowledge of survey measurement in the domain of public safety and police performance. The research team also gathered relevant instruments and questions used in prior research on community policing and community crime prevention, and created drafts of new questions and scales designed specifically for use in this study. A preliminary schedule identified which constructs would be measured at which wave. The general plan was to collect at least two waves of data on items that required testing for reliability and validity. Also, each instrument was developed to reflect coherent sets of items ordered in a meaningful way for the benefit of respondents. At the beginning of each wave, a draft of the survey instrument for that wave was created.
After the draft was created, research team members would examine the survey and provide feedback on: (a) whether the questions were measuring what they were designed to measure; (b) whether any important constructs were missing; and (c) whether the project was generally on track. After the survey was finalized, it went through a quality control process developed to decrease the likelihood of typographical errors, to ensure that the survey functioned properly when completed online (e.g., working skip patterns), and to test that the survey could be accessed with the most common web browsers (e.g., Internet Explorer, Firefox).

F. Development and Implementation of Observation and Questionnaire Methods

A three-part methodology consisting of questionnaires, field observations, and field interviews was developed to examine the effects of the intervention for CAPS participants.

Questionnaires. To measure the effect of feeding back web survey results on the public safety perceptions and behaviors of CAPS participants, questionnaires were administered to both police (see Appendix A) and citizens (see Appendix B). The questionnaires were designed to measure three primary areas of hypothesized change. The first area was interaction within the beat; items previously developed by the Chicago Community Policing Evaluation Consortium (CCPEC) were used to measure changes in the extent to which police and citizens generally interacted and engaged in public safety activities with one another, such as attending other local meetings and problem solving (see Skogan & Hartnett, 1997; Skogan, Hartnett, DuBois, Comey, Kaiser, & Lovig, 1999). The second area was citizen capacity for engaging in the public safety behaviors theorized to be most likely enhanced by participation in the project.
Items measured citizen beliefs about their ability to engage in effective problem solving for their beat and about possessing the knowledge necessary to keep themselves and their beat safe. The final area examined perceptions of the police-citizen partnership. Partly based on previously developed and tested measures employed by the CCPEC, items examined citizen attitudes toward the partnership formed between the police and community, both in general within the beat and more specifically within the CAPS beat meeting framework.

Prior to the introduction of the project in March 2005 and upon completion of the project in September 2005, questionnaires were administered to all CAPS participants in attendance at their monthly meetings within the study beats. For beats that held meetings only every other month and were not meeting during March or September, questionnaires were administered in April and October 2005. Arrangements were made for 10-15 minutes on the meeting agenda to introduce the project to participants and allow them to complete a questionnaire. Participants were typically able to complete questionnaires in the time allotted; however, some had to finish their questionnaires while the meeting resumed.

Field observations. In order to document the dynamics of the police-citizen partnership and problem solving activities at beat meetings, including how the web survey findings were used in deliberations about problems, observations of meetings in the study beats were conducted between March and October 2005.
Observations in all beats were planned for April (wave 2), June (wave 4), and July (wave 5). During May (wave 3), only those beats that had failed to carry out the steps of the project in the prior month were observed, to determine whether they were now following the protocol. During August (wave 6), beats were selected for observation based on the number of previous observations within each beat, to ensure an equal number of observations for as many beats as possible. For each beat that held meetings on a monthly basis, an observer attended a minimum of four meetings, with the exception of a single beat where only three meetings were observed during the course of the project. For beats holding meetings every other month, every meeting held during the course of the project was attended, allowing a more reliable assessment of problem solving efforts. In all, 266 beat meetings were observed during the course of the project.

The observation protocol included two primary components: completing a structured observation form and preparing a narrative from notes taken during the meeting. The observation form was adapted from forms previously developed by the CCPEC; these existing forms provided a well-tested format for the systematic recording of relevant information about beat meetings, such as counting the number of police and residents in attendance, classifying the nature of problems brought up by citizens, and depicting the roles citizens and police played both in running the meeting and in the different steps of the problem solving process (e.g., identification of problems and solutions). Additional items were prepared to detail the presentation and use of the web survey results during meetings. (To view the observation form, see Appendix C.)
Observers also prepared a supplemental 1-2 page narrative that included a summary of events, their impressions of the police-citizen relationship, the quality of discussions about problems and web survey results, and notation of unusual or otherwise significant occurrences during each meeting. Through both the observation form and the narratives, it was possible not only to systematically examine the nature of problem solving and the use of survey results within and across the study beats, but also to identify characteristics of meetings where program effects were greatest.

During the project, 32 individuals, primarily undergraduates from UIC, conducted field observations. Of these, seven were Spanish-speaking and were responsible for attending meetings in those beats with sizeable Hispanic populations. All observers underwent a two-part training protocol that was developed and originally administered by Dr. Wesley Skogan of the CCPEC (individuals who joined the project after the original training dates were trained with the same protocol by project managers). The first part consisted of a one-day training session in which observers were instructed on project objectives, the CAPS framework, and observer roles and responsibilities, including guidelines for filling out observation forms and preparing narratives. The second part required observers to attend a beat meeting outside of the study beats with a project manager in small groups (3-4 individuals) so they would have the opportunity to practice completing the observation form and taking field notes. Immediately following the meeting, each group went over their completed forms and discussed any concerns.
Observers also received certification from UIC for participating in data collection by completing a three-hour training on the protection of research subjects. Throughout the study, project managers monitored the content of observation forms and narratives, and observers were provided with regular feedback on the quality of their work.

Field interviews. Interviews were conducted with meeting participants so they could share their opinions about the usefulness of the project and discuss their own experiences in completing surveys and discussing survey results at meetings. This not only allowed us to gain insight into how the project was received by participants, but also helped identify areas for future improvement. The interview protocol was designed to collect information on four main topics: (1) the content of the surveys, particularly whether the issues covered by the surveys were relevant for addressing beat concerns and what other important dimensions were not included; (2) the utility of the survey results in terms of their format, as a vehicle for understanding beat concerns, and for assisting the problem solving process; (3) obstacles to implementing the project, in terms of both police support for the initiative and citizen participation in completing surveys; and (4) overall support for the use of the Internet, both in participants' personal lives and as a means of facilitating communication between citizens and the police.

Interviews were sought with individuals key to facilitating and attending meetings in the study beats. For the police, interviews were conducted with a sergeant, patrol officer, or community policing officer who had attended at least three beat meetings during the course of the project; in all, interviews were conducted with police personnel in 48 of the 51 study beats (with a single interview conducted for the two beats that held joint meetings). Per the CAPS framework, each beat is also to have a citizen facilitator who is jointly responsible with police for running beat meetings.
To this end, we conducted interviews with a subset of 22 citizen facilitators that provided equal representation of study beats across both the three experimental conditions and the four dominant racial/ethnic populations (African American, Latino, White, and mixed). All interviews were conducted between June and November 2005 and typically lasted between thirty minutes and an hour. Interviews with police personnel were conducted at their stationhouse, while citizen facilitators were generally interviewed at the beat meeting location or over the telephone.

G. Development of Website and Processing Survey Results

Development of website and educational linkages. Web developers designed the CIP website to be as straightforward as possible in order to enhance ease of use and to limit any "Internet apprehension" that some study participants may have felt. The CIP introduction page is presented below in Figure 3.1. The website also allowed CIP researchers to upload graphs and tables and to change and update link content. After completing surveys, respondents in both experimental conditions were invited to visit the website to view selected survey results for their police beat. [4] In addition to viewing beat-specific survey results, respondents in the feedback/training condition were further encouraged to visit links on the CIP website providing information about various public safety topics.

Figure 3.1 Chicago Internet Project Website Page

CIP passwords or neighborhood codes. All participants in the study were assigned passwords, also called neighborhood codes, which were unique to the police beat in which they resided. [5]
Passwords were assigned for two purposes: first, participants used the passwords to access the monthly online surveys; second, people in the experimental conditions used beat-specific passwords to access the survey results posted each month. [6] It should be noted that the monthly surveys and the CIP website results pages were posted on separate websites in order to prevent control beat participants from viewing survey results. Over the course of the study, participants were reminded of their beat-specific password in all email and mail correspondence. When participants in the experimental conditions went to the CIP website, they were prompted to type in their beat and the accompanying neighborhood code in order to log on (see the login screen in Figure 3.2).

[4] Participants in the experimental conditions received the website link via email and postcards.
[5] Participants' beat pass code remained the same throughout the study unless they moved out of the study beat. Participants who moved during the course of the study were still encouraged to complete the monthly online surveys, but they were assigned a different code to differentiate them as "moved" and no longer residing in the study beat.
[6] Although technologically basic, these passwords allowed researchers to track the number of times respondents in each beat accessed the CIP website and any CIP public safety links.

Figure 3.2 Login Page

Processing survey results. While each online survey was made available to beat participants for an entire month, survey results were processed from the responses compiled within the first two weeks after the survey became available.
After this two-week window, researchers took a week to process, aggregate, and post beat-level survey results on the CIP website for beats in the experimental conditions. Respondents' survey responses were stored on the University of Illinois at Chicago's server as database files. Processing the survey results took several steps. First, researchers downloaded the survey database files, saved them to password-protected CIP office desktop computers, and converted them into SPSS files. Second, CIP researchers chose approximately three to five questions per survey wave to analyze further and post on the CIP website; questions were chosen primarily based on their perceived utility for problem solving and for introducing new topics for deliberation at CAPS beat meetings. Finally, survey results were tabulated with SPSS, graphics and tables were created in Microsoft Excel, and the graphics and tables were uploaded to the CIP website.

In order to protect experimental participants' confidentiality, at least ten survey completions per beat per survey wave were required in order to display beat-level data on the website. [7] To ensure that every experimental beat had survey results to view, when fewer than ten beat participants responded to a survey wave, researchers aggregated the responses from other study beats and analyzed and displayed police district-level data. [8] Additionally, if a study beat had a high number of Spanish speakers, the results were made available in both English and Spanish.

[7] In any given month of the study, anywhere from 10-20 percent of the study beats had fewer than 10 respondents.
[8] Chicago Police Department districts are comprised of 10-15 beats.
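The confidentiality rule described above can be sketched as a simple threshold check: show beat-level results only when a beat has at least ten completes, and fall back to district-level aggregates otherwise. The beat and district identifiers and the helper name below are illustrative, not taken from the project's actual processing scripts:

```python
THRESHOLD = 10  # minimum completed surveys per beat per wave

def displayable_results(completes_by_beat, district_of):
    """For each beat, decide the level (beat or district) at which
    results can be displayed, given completion counts for one wave."""
    # Pool completes by district across that district's study beats.
    district_totals = {}
    for beat, n in completes_by_beat.items():
        d = district_of[beat]
        district_totals[d] = district_totals.get(d, 0) + n

    display = {}
    for beat, n in completes_by_beat.items():
        if n >= THRESHOLD:
            display[beat] = ("beat", n)           # enough respondents
        else:
            d = district_of[beat]
            display[beat] = ("district", district_totals[d])  # aggregate up
    return display
```

For example, with hypothetical beats "0111" (14 completes) and "0112" (6 completes) in the same district, the first would display beat-level data and the second district-level data pooled across both.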
The CIP website design allowed researchers to upload specific survey results for each study beat. Each month of the study, participants could scroll through several screens of survey results specific to their neighborhood, and they could also view archived survey results when they logged on. Figure 3.3 shows an example of what participants might have seen upon logging on to the website.

Figure 3.3 Survey Results Example

Community resource links. When participants in the experimental condition with feedback and training viewed their survey results, they were directed to visit the "Community Resource Links" (herein referred to as "links"). The links were posted in Spanish and English. Figures 3.4 and 3.5 show examples of what a participant would see on the links page. There were two types of links on the web page: links internal to the CIP site and links to external websites. The content of the internal links was controlled and edited by the researchers every month. These links covered topical public safety information in four broad areas: (1) community participation; (2) citizen and police relationships; (3) problem solving; and (4) individual safety behavior. Researchers updated the content of each of these links and tied the content to topics contained in the survey. For example, if participants were asked about their participation in community activities, then the community participation link would provide information about how to join community initiatives.

Figure 3.4 Crime Prevention Concepts Example

Figure 3.5 Community Participation Example

Table 3.4 provides a brief overview of the monthly content covered in each of the public safety areas. [9] Figure 3.6 presents examples of some of the links that respondents could access during the study. The external links were the Chicago Alternative Policing Strategy (CAPS) and Citizen-ICAM (CPD's crime mapping resource) sites, which are part of the overall Chicago Police Department website. The two external links did not change over the course of the study. These links were included because they provided both citywide and beat-specific information. The CAPS site offered participants community policing information such as beat meeting times and locations, crime watch and hotline information, and specific Chicago Police Department contacts. The ICAM website allowed citizens to map crimes based on selected geographic and crime-type parameters.

Table 3.4 Overview of the Monthly Link Content

Community Participation
  Month 1: Importance of reporting neighborhood crimes
  Month 2: How to describe a suspect; links to numerous Chicago community organizations
  Month 3: Security checklist for keeping your home safe
  Month 4: What is Neighborhood Watch and how do you join or start it up in your community?

Citizen and Police Relationships
  Month 1: Importance of partnership in crime prevention
  Month 2: Accessing non-emergency police services and city services by calling 3-1-1
  Month 3: Your rights and responsibilities when interacting with the police during a traffic stop or an arrest
  Month 4: Summer safety tips: ways to contribute and keep your community safe during the summer months

Problem Solving
  Month 1: Steps of problem solving
  Month 2: Educational description of the Crime Triangle
  Month 3: Ways to keep your children safe when they are online
  Month 4: Problem solving exercise and fear of crime matrix

Individual Safety Behavior
  Month 1: Crime Prevention Through Environmental Design (CPTED)
  Month 2: Tips on how to protect yourself from becoming a crime victim
  Month 3: How to reduce the risk of car theft or carjacking
  Month 4: Ways to protect yourself and your family

[9] The "Community Resource Links" content was gleaned from various online and paper sources, namely the National Crime Prevention Council, the National Center for Missing and Exploited Children, and the American Civil Liberties Union.

Figure 3.6 Community Resource Links Example

H. Posting Surveys and Managing the Monthly Sample

Posting surveys. In all, six surveys were made available to both the random sample of residents and CAPS participants; a seventh survey was made available exclusively to the random sample. Because the availability of surveys and the preparation of survey results were necessarily based on the monthly CAPS beat meeting schedule, the protocol was staggered so that surveys were made available to participants in the random sample during the week that their beat held its CAPS meeting; likewise, survey results were made available to the random sample feedback groups the week before the next CAPS meeting was to be held. Surveys were converted into a web-based format using Perseus SurveySolutions software and then published to a website maintained on the UIC server to house multiple surveys.
The original research design called for surveys to be made available to participants for a two-week period; this was considered necessary to facilitate the timely processing of survey responses into distributable results. However, this scheme was discarded after the first month in order to maximize the response rate. During all subsequent months, once posted, surveys remained available online to participants for one month and were taken down only after the next survey had been posted and participants had been notified that it was available. [10]

[10] As previously noted, however, survey results were prepared using the compiled responses from the first two weeks after the survey first became available in order to ensure timely dissemination of results to police for use at the CAPS beat meetings.

In order to ensure the correct individuals were completing surveys, participants were required to enter both their name and a password that we provided them in order to access the survey. During the first month only, each beat was provided a unique website address to access the survey. Each survey had to be published to the Internet separately and, with the inclusion of surveys in Spanish and separate surveys for CAPS participants in each beat, this required over 100 different surveys. After the first month, it was decided to streamline the process by maintaining fewer website addresses. To this end, survey addresses were assigned in weekly blocks according to when CAPS meetings were held, and individual beat results were tracked within the larger response files by means of the passwords that were unique to each beat.
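The post-first-month scheme, in which several beats shared one survey address and responses landed in a pooled response file, amounts to attributing each response to a beat via the beat-unique password. A minimal sketch of that data-management step; the field names and helper are hypothetical, not from the project's actual files:

```python
def split_by_beat(pooled_rows, password_to_beat):
    """Group rows from a pooled survey response file by beat,
    using the password that was unique to each beat."""
    by_beat = {}
    for row in pooled_rows:
        beat = password_to_beat.get(row["password"])
        if beat is None:
            # Unknown password: in practice such rows would be
            # flagged for manual review rather than silently kept.
            continue
        by_beat.setdefault(beat, []).append(row)
    return by_beat
```

For example, three pooled rows carrying two distinct beat passwords would be split into two beat-level response sets, with any row carrying an unrecognized password set aside.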
There was little difficulty in adhering to this schedule for posting surveys; the only deviations that occurred were contingent upon problems with the availability of the survey results, which needed to be posted prior to the surveys (see the "Implementation: Protocol and Integrity" section of the CAPS Experiment).

Resident notification and contact. A protocol was established to notify participants in the random sample of the availability of each survey and, for those in feedback beats, of survey results; it incorporated multiple points of contact via both postal mail and the Internet. A notification letter and a refrigerator magnet thanking residents for their participation were sent to each individual after recruitment by telephone; these letters explained the basic premise of the project, how participants would receive information about surveys each month, the password participants would be required to use to access surveys, and information about the first month's survey. After this, a monthly notification system was followed that consisted of: (1) a postcard sent 3-4 days prior to the start of the survey, alerting residents that the survey would become available soon; (2) an email sent the day the survey began, providing residents with a direct link to the survey and the password necessary to access it; (3) a postcard sent 8 days after the initial postcard, reminding residents to complete the survey if they had not already done so; and (4) a final email sent 2-3 days before the point at which survey responses would be compiled to prepare results for distribution, including another link to the survey website and the password. For participants in feedback beats, an additional email was sent a week prior to the start of that month's survey, notifying them that survey results from the previous month had become available and including a link to the results website and the necessary password.
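The four-step monthly notification timeline above can be expressed as date offsets from the day a survey opened. The protocol specified ranges (e.g., "3-4 days prior" and "2-3 days before" compilation); the single-day offsets below are illustrative choices within those ranges, not the project's exact values:

```python
from datetime import date, timedelta

def notification_schedule(survey_start, results_compiled):
    """Sketch of the monthly contact schedule as (step, date) pairs."""
    advance_postcard = survey_start - timedelta(days=3)   # "3-4 days prior"
    return [
        ("advance postcard", advance_postcard),
        ("launch email with link and password", survey_start),
        # Reminder postcard went out 8 days after the initial postcard.
        ("reminder postcard", advance_postcard + timedelta(days=8)),
        # Final email 2-3 days before responses were compiled for results.
        ("final reminder email", results_compiled - timedelta(days=2)),
    ]
```

For a hypothetical survey opening April 4, 2005 with responses compiled two weeks later on April 18, this yields contacts on April 1, April 4, April 9, and April 16.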
The rationale for these multiple points of contact was threefold. First, our prior experience during a feasibility study indicated that residents often forgot to go online to complete surveys when provided with only a single reminder, and their feedback suggested they would have responded at a greater rate had additional reminders been provided. Second, the use of both postcards and email ensured that participants who did not regularly go online to check their email would be prompted to do so by the arrival of a postcard. Conversely, emails with a direct link to the survey website and a reminder of the password (which was not provided on the postcards) facilitated access to the surveys by eliminating the chance that participants would mistype the website address or forget their password. Third, multiple reminders were considered important because some participants would inevitably mislay their postcards, delay completing surveys and require assurance that the surveys were still available online, or inadvertently delete the email. Ultimately, most participants were satisfied with the notification system, although the majority indicated they found email more useful than postcards for learning about each month's survey. Some, however, said they preferred to receive both; their reasoning confirmed our rationale that some participants might require prompting to check their email account or might not have received the email.
While there is a segment for whom postcards were important to participation, the trends in survey completions across the waves indicate that participants were more responsive to email notifications: completion rates were always greatest in the first days immediately following the first email notification, with a subsequent but smaller jump in completions following the reminder email.

Maintaining contact with participants was an ongoing concern of the project, and a barrier to this was incorrect contact information. In the case of bad email addresses, participants were notified by postal mail that we were having difficulty contacting them and were invited to contact us with the correct information; many did so. In the case of bad postal addresses, participants were called by phone in order to verify their information. Despite these efforts, we were unable to contact approximately 10% of the participants by email and 2-3% by postal mail because of faulty addresses. In a limited number of cases, participants had to be withdrawn from the project because we had no means of contacting them with the information necessary for participation. Because emails had to be sent out in bulk, a small portion of participants also experienced difficulty receiving our emails when their Internet service provider blocked them as spam, although we were usually able to work with such individuals to overcome this particular problem.

Because responses to surveys were monitored on a weekly basis, we were aware within the first week of implementation that response rates were far lower than expected. Our original protocol had not anticipated this issue, so it was necessary to devise a new strategy for encouraging more participation. During waves 1 and 2, members of the UIC research staff made outreach phone calls to participants in those beats with response rates of 20% or below.
During waves 3, 4, and 5, outreach calls concentrated on individuals from all beats who had not been regularly completing surveys. These phone calls were intended both to determine why an individual had not been completing surveys and to encourage them to do so. Most individuals indicated they had simply forgotten or been too busy, while some told us they no longer had Internet access or had had problems with their computers. Individuals who expressed willingness to complete that month's survey were sent a follow-up email with a link to the survey and the necessary password.

Implementation problems. Given the scope of the project, we experienced relatively few problems in the implementation process; apart from those associated with processing and posting survey results, most problems that occurred were primarily due to human error. We prepared standard forms for all notification letters, postcards, and emails, which required only the addition of the survey addresses and passwords specific to particular beats before being sent out. While this eliminated the redundancy of preparing the same materials for each beat, it still left room for error when supplying the survey addresses and passwords, which occurred on approximately six occasions. For most of these incidents, we were almost immediately aware that they had occurred and were able to send out correction notifications. Because participants were required to provide their names whenever they completed a survey, such incidents were easily dealt with by reconciling survey responses with the proper beats; only one incident occurred in which this was not possible. Participants in nine beats were inadvertently provided with the information for their CAPS counterpart in a reminder email during wave 3.
We sought to correct this problem immediately by sending participants emails acknowledging the error and requesting their assistance in verifying the receipt of their results, but this was nevertheless problematic because the CAPS survey to which they had been directed did not request their names. Through the use of multiple informational elements collected during the wave 3 survey, we were ultimately able to identify many of those individuals who had taken the incorrect survey.

Participation problems and assistance. Participants also experienced few problems in completing surveys. During both waves 1 and 3, survey respondents were asked how easy they had found it to complete the survey; for each wave, over 97% indicated they had found it somewhat or very easy to do so. Participants were encouraged to contact UIC with any problems they encountered, by either telephone or the email account through which survey notifications were sent out. The problem most frequently brought to our attention related to accessing the survey, either because participants required a password reminder or because they had difficulty pulling up the website address for a survey. Although email notifications about surveys always contained the survey password, roughly half of all phone calls and emails that we received concerned requests for the password. Because the validity of survey responses depended on requiring participants to enter a password, this was a methodological requirement that we could not dispense with in order to make the process easier for participants.
The other problem associated with accessing surveys, specifically reaching the website, had sources that we were not always able to identify when a participant contacted us for assistance, but two main categories emerged. In some instances, there appeared to be a compatibility issue between the survey software we employed and the software participants were using to access the Internet; when this was suspected, participants were directed to use another Internet browser to access the survey, which generally took care of the problem. This software issue also appeared to be related to complaints we received from some participants about the survey not being properly formatted on their computer screens (e.g., response sets not lining up with the buttons). In other instances, the inability to access a survey was due to error or a lack of computer skills on the part of individual participants. Some participants were simply typing in the website address incorrectly, while others were unfamiliar with the Internet beyond using email. In these cases, problems were easily resolved by sending participants another email with a direct link to the survey, which required nothing more of them than clicking on the link. Ultimately, we considered maintaining these avenues of communication with participants, in order to answer their questions and assist them with problems, invaluable given the nature of this project, particularly the absence of face-to-face contact between UIC researchers and participants.
While it would be impossible to determine the number of study participants who experienced problems or had questions and did not contact us, we feel strongly that the ready availability of the UIC research team by phone and email helped sustain the participation of individuals who might easily have become frustrated by technological difficulties, or who had other questions about the project, and withdrawn from the study. Participants were generally pleased with the assistance we provided and the promptness with which we responded to their concerns.

CHAPTER FOUR
LEVELS OF PARTICIPATION IN THE CHICAGO INTERNET PROJECT

A. CAPS Resident Participation

Overall, 511 web surveys were completed by citizens who attended CAPS beat meetings within 49 of the 50 study beats during the course of the project.11 The overall web survey response rate for CAPS attendees was 10.6%. An average of 20 residents attended the CAPS meetings in each of the study beats, which means an average of 2-3 residents per beat completed surveys each month (see Table 4.1). There were no significant differences in completion rates across the experimental conditions. These findings suggest that receiving feedback and discussing web survey results did not serve as an incentive for residents to complete Internet surveys.
Table 4.1
CAPS Response Rate for Internet Surveys (%)

                                     Overall  Wave 1  Wave 2  Wave 3  Wave 4  Wave 5  Wave 6
All beats                              10.6    10.9     7.5    10.8    12.4    10.4    11.4
Experimental Condition (3 groups)
  Control                               9.9    12.4     8.5     9.5    12.3     9.2     7.5
  Feedback                              9.5     9.3     5.2     9.5    11.3     9.9    11.6
  Training                             12.4    10.8     8.8    13.5    13.8    12.2    15.4
                                              F=.14   F=.58   F=.99   F=.12   F=.59   F=1.34
Experimental Condition (2 groups)
  Control                               9.9    12.4     8.5     9.5    12.3     9.2     7.5
  Experiment                           10.9    10.1     6.9    11.4    12.5    11.0    13.3
                                              F=.22   F=.27   F=.43   F=.01   F=.56   F=.20
Race/Ethnicity of Beat
  African American                      7.1     8.2     3.7     7.9     8.1     8.2     6.8
  Latino                                7.7     7.5     7.2     7.9     7.4     6.6     9.8
  Mixed                                15.1    15.5    17.0    16.5    16.6    13.9    14.1
  White                                13.4    13.0     7.7    12.9    17.6    12.8    13.4
                                              F=2.05  F=1.79  F=1.88  F=.56   F=3.45* F=1.68
* p<.05

11 An additional 241 web surveys completed by residents in relation to a single beat were excluded from this count for reasons discussed below in the section entitled "The 50th Beat".

Although experimental condition was not significantly related to response rates, it should be noted that the police in the experimental beats tended to demonstrate more consistency in implementing project tasks, and their investment in the project appeared greater than that of officers from the other beats. To this end, it would seem logical to expect that resident participation in completing surveys would be influenced not just by the mere act of implementation, but by the manner in which police implemented the project.
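The per-wave F statistics in Table 4.1 come from one-way analyses of variance comparing beat-level response rates across groups at each wave. A self-contained sketch of the F statistic, using invented beat-level rates rather than the study's actual data:

```python
def f_oneway(*groups):
    """One-way ANOVA F statistic: between-group mean square divided by
    within-group mean square."""
    values = [x for g in groups for x in g]
    grand_mean = sum(values) / len(values)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between = len(groups) - 1
    df_within = len(values) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Invented per-beat response rates (%) for three hypothetical condition groups.
control = [9, 12, 8]
feedback = [10, 9, 11]
training = [13, 12, 14]
f_stat = f_oneway(control, feedback, training)
```

Comparing the resulting F against the critical value for the appropriate degrees of freedom yields the significance judgments (e.g., the starred F = 3.45) reported in the table.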
There were no significant differences in response rates across the racial/ethnic beat types except at wave 2, when the rate in primarily African American beats dropped to 3.7% while the rate in beats with a mixed racial/ethnic composition remained steady. The greatest overall response rates occurred within mixed racial/ethnic beats, followed closely by White beats; the rates in these beats were roughly double those seen in the African American and Latino/a beats. There are three possible explanations for the response rate differences across beat types. The first is the "digital divide": disparities in Internet access that follow income and race/ethnicity. Studies show that, while the gap is closing, African Americans and Latino/as tend to have lower access rates than Whites. Looking at the Internet access rates reported by CAPS participants in the study beats, over 70% of residents in the Latino/a, mixed, and White beats reported having access, while under 60% of residents in the African American beats did, lending partial support to the digital divide argument. Because residents in the Latino/a beats reported access rates similar to those of residents in the White and mixed beats, disparities in access cannot explain the low response rates in the Latino/a beats. Residents in African American beats, however, did report a significantly lower rate of access (χ2 = 14.549, p < .01) that may have contributed, in part, to the low response rates there.
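The access-rate comparison just cited (χ2 = 14.549) is a standard Pearson chi-square test on a contingency table of beat type by Internet access. A minimal sketch, with invented counts standing in for the raw data, since the report gives only percentages:

```python
def chi_square(table):
    """Pearson chi-square statistic for an r x c table of observed counts
    (a list of rows)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Invented counts: residents with and without Internet access in
# African American beats vs. all other study beats.
access = [[50, 50],   # African American beats: with access, without
          [70, 30]]   # Latino/a, mixed, and White beats
stat = chi_square(access)
```

Comparing the statistic to the chi-square distribution with (rows-1)×(cols-1) degrees of freedom gives the p-value reported in the text.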
The second possible explanation for the response rate differences among the racial/ethnic beat types may be the nature of participation in CAPS: it is feasible that, given their greater participation in the CAPS program, residents in African American beats had established stronger problem-solving partnerships with the police and felt less need to supplement those partnerships by completing surveys than residents in the White and mixed beats. The final possible explanation is that residents in the White and mixed racial/ethnic beats may have felt a greater need to have their opinions heard through an additional venue (i.e., the web survey). This might be particularly true of residents in the mixed beats, most of which are gentrifying neighborhoods. While it could be argued that the original residents (typically racial and ethnic minorities) would use this new form of communication with the police as their foothold in the area is threatened by newcomers, it is more likely that the reverse is true: newcomers, predominately White residents, wanted to increase their stake in the community by increasing communication with the police. Survey completion trends support the latter explanation, with 80% of respondents in the mixed beats being White, compared to just 17% African American. Resident survey participation by beat varied widely, ranging from 5 beats in which a single survey was completed for the duration of the project to 6 beats where citizens completed between 21 and 30 surveys during the same time. Interestingly, 40 beats account for 60% of the surveys completed, while the other 9 beats account for 40% of all completions. These nine beats are not in the experimental conditions, and the residents attending CAPS meetings in those beats were exposed to varying levels of implementation by police.
In testing for predictors of participation in these beats, the common denominator for higher participation was residents' level of education (overall survey completion rates were also predicted by beat-level crime and meeting attendance rates). Residents in these 9 non-intervention beats exhibited significantly higher levels of college attendance (F = 6.465, p < .05); all but one of the nine beats reported an average education level of at least some college, while 28 of the 40 beats accounting for 60% of the survey completions reported an average education level of either a high school degree or technical training (but no college).

Table 4.2
Participants at CAPS Beat Meetings and CAPS Participants who Completed Internet Surveys

                        CAPS (%)   Internet (%)
Gender
  Male                    42.9        45.9
  Female                  57.1        54.1
Race/Ethnicity
  African American        44.4        36.4
  White                   45.0        55.7
  Hispanic/Latino          8.5         4.9
  Other                    2.1         3.0
Age
  18-19                     .2          .2
  20-29                    5.5         3.0
  30-39                    8.4        10.6
  40-49                   18.2        22.9
  50-59                   23.5        29.0
  60-69                   24.1        25.6
  70 and older            20.0         8.7

Who completed the surveys each month? As Table 4.2 shows, the distribution of survey participation is comparable to the distribution of participation at CAPS beat meetings. Just as slightly more females than males attend CAPS meetings, more females than males completed surveys. We also see higher participation rates for individuals between thirty and fifty-nine, particularly those between the ages of fifty and fifty-nine. Not surprisingly, those seventy or older completed surveys at a far lower rate than their attendance at meetings; as discussed below, age represents one of the obstacles to participation and was a barrier encountered during a previous feasibility study.
Undoubtedly, many individuals in this age group did not have computers, or their computer skills were such that they did not feel comfortable completing surveys. Ultimately, the greatest difference between the distribution of CAPS participants and those who completed surveys is among racial/ethnic groups. White residents completed surveys at a greater rate and were responsible for over half of all completed surveys, while African Americans and Latino/as participated at lower rates. As previously noted, there could be several reasons for this disparity, but no single indicator explains it. Because two of the highest-participating beats were African American, however, it is also possible that a confluence of characteristics unique to each beat, such as individual Internet use, residents' concerns, and the quality of the police-citizen relationship, may be responsible for differences in resident participation among beats of different racial/ethnic composition. The 50th beat. We found it necessary to exclude the data from 241 survey completions associated with a single beat because of the circumstances under which such a high number of responses was achieved. In wave 1 alone, 124 survey responses from this beat were recorded, immediately alerting us that something unusual was going on. It soon came to our attention that a resident of this beat was posting the website address and password for each of our surveys, as well as survey results, on a blog that he maintained about his neighborhood.
As his blog showed, this resident was angry and frustrated about ongoing problems in his area, which was undergoing gentrification; he detailed these problems and posted photos of them on the site, and was extremely critical of city officials, the police, and developers. While he clearly cared about what was going on in his neighborhood, it was less clear how he regarded CIP; he probably sought to make the opportunity to complete surveys available to other residents who visited his blog, but there can be no question that he was also using the material to bolster his own point of view. In posting the survey information, he used the exact wording we had used to describe the project, but inserted sarcastic references to local officials into the sentences. Similarly, when he posted survey results, he added his own interpretation, using less than acceptable language. For instance, regarding one set of results about participation in CAPS, he commented, "What it tells us is our CAPS program is in the CRAPPER." While he was expressing what we considered rather valid concerns on his blog, we were more concerned with the volatile manner in which he did so, and we did not want the project represented this way. Discussions with police from the beat, however, confirmed our suspicion that confronting this resident would only agitate him further, and we thereafter monitored his blog to document his postings. This incident demonstrates the difficulty of controlling how, and to whom, research information is disseminated beyond its intended purpose. This is a concern anyone faces when making information public, and the best safeguards cannot completely remove the potential for misrepresentation. For our purposes, the greater concern was verifying that only residents within the study beats completed surveys, since this was in part a geo-based study.
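Verifying residency of this kind, matching the cross streets each respondent reported against beat boundaries, can be sketched as a simple lookup. The intersections and beat numbers below are invented for illustration; the actual study checked reported cross streets against CPD beat maps.

```python
# Invented lookup from cross-street pairs to beat numbers.
BEAT_BY_INTERSECTION = {
    ("State", "35th"): "0211",
    ("Halsted", "63rd"): "0712",
    ("Western", "Armitage"): "1434",
}

def classify_responses(responses, target_beat):
    """Split responses into those inside the target beat, those from another
    study beat, and those outside all study beats, based on the cross
    streets each respondent reported."""
    in_beat, other_beat, outside = [], [], []
    for resp in responses:
        beat = BEAT_BY_INTERSECTION.get(tuple(resp["cross_streets"]))
        if beat == target_beat:
            in_beat.append(resp)
        elif beat is not None:
            other_beat.append(resp)
        else:
            outside.append(resp)
    return in_beat, other_beat, outside

sample = [
    {"id": 1, "cross_streets": ["State", "35th"]},
    {"id": 2, "cross_streets": ["Western", "Armitage"]},
    {"id": 3, "cross_streets": ["Main", "1st"]},  # not in any study beat
]
in_beat, other_beat, outside = classify_responses(sample, "0211")
```

A screen like this is what allowed the 241 responses tied to the 50th beat's password to be partitioned into out-of-study responses, responses from other study beats, and genuine in-beat responses.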
No limitations were placed on those who completed the surveys beyond being a resident of a study beat, something monitored through the cross streets respondents provided on the survey. In this way, it was determined that 82 of the 241 responses submitted using the password for the 50th beat came from individuals who did not live in any study beat. Of the remaining 159 responses from residents of study beats, 60% were from individuals who did not live in the particular beat for which the password was intended, but in nearby beats. These survey completions were not included when determining response rates or examining the nature of participation because they did not reflect most beats' experiences of participation by residents involved in the CAPS program. While unexpected, the experience in this beat actually suggests the potential for achieving greater resident participation in such web-based initiatives through broader recruitment strategies utilizing multiple media, such as blogs, discussion groups, or listservs. Resident attitudes about participation in the project. Residents were given the opportunity to offer suggestions for improving the project during wave 5 of the web survey. Of the 90 respondents who participated in wave 5, 70% indicated they had no suggestions or that changes were not necessary. The remaining 30% supplied responses that can be divided among four categories: praise, methodological concerns, topics for inclusion on surveys, and use of survey results. Some residents took advantage of the opportunity simply to offer support for the project, wanting researchers to "keep up the good work" and noting that they thought the project was a "great idea." Surveys were praised as being "comprehensive", with questions "right on target", "covering all the right things for us", and "asking the difficult questions, i.e.
like the ones on racial profiling." Other residents expressed concerns with methodological issues that stemmed largely from information about sampling methods and the number of participants not being made available to the beats due to the experimental nature of the project, although a few requested "more opportunity to explain some of the answers which cannot always be answered as 'yes or no' but need more words." Respondents also offered suggestions for topics they wished to see included on surveys; these focused on resident understanding of police operations, knowledge of crime in the area, and participation in CAPS. Finally, residents commented on survey results. Among residents in beats where results had not been made available (either control beats or experimental beats in which police had not made results available as instructed), there was a natural desire to see them. Others expressed the desire for results to be used in a meaningful way, such as turning results into "comparative statistical reports", passing results on to beat officers, and improving "conditions in our area" and "police presence/service." Overall, these responses suggest that residents who completed surveys took their participation seriously and saw potential for new data elements to be collected and for results to be used to improve both the beat and police services. Citizen meeting facilitators were interviewed to explore resident attitudes about participation in the project, as well as their own responses to completing surveys and discussing survey results.
Facilitators almost unanimously expressed support for the general concept of residents using Internet surveys to communicate concerns about public safety to the police, although many had reservations about its feasibility due to problems associated with access to computers or the Internet, computer literacy, and the age of residents (discussed below in "Obstacles to Resident Participation"). Internet surveys were perceived as supplying a new "avenue" for communication between residents and police, acting as a "supplemental tool to the current system" for providing police with more information and expanding resident participation. Facilitators welcomed the ability of Internet surveys to act as an additional source of information for police about what was happening in the neighborhood, yet stressed their value as merely supplemental to existing structures; as one facilitator stated, "The Internet is a tool. It doesn't replace face-to-face interaction like at the beat meetings." The potential for broadening the scope of resident participation was also recognized, particularly for residents who might have attended beat meetings but were prevented from doing so for various reasons, including conflicting work schedules, health problems or physical handicaps, and reluctance to speak to police one-on-one or in front of other residents. Regarding their own participation in the project, 13 of the 22 civilian facilitators interviewed had completed at least one of the Internet surveys and offered feedback on survey content. Overall, they felt the surveys had covered important issues and were unbiased and fair. The broad range of issues included on the surveys was seen as both a strength and a weakness: while some facilitators were pleased with this aspect and felt that "most of the questions got to the heart of a lot of issues", others felt surveys should be tailored more closely to each beat's particular concerns.
Indicative of the varying issues and states of police-community relations within the participating beats, facilitators offered widely diverse opinions about the topics they considered most important for inclusion on surveys and about problems they saw with the surveys. For instance, a facilitator from a beat in which police-community relations were clearly strained complained that "there were too many softball questions". While one facilitator observed that surveys were "much too heavy on safety" and felt "the issue of crime in relation to the beat was overlooked", another felt that "the most important thing is being safe" and that surveys covering broader topics of public safety were therefore beneficial to residents. Suggested topics for inclusion bore the hallmark of the CAPS philosophy in which facilitators are well versed: emphasis on community-wide problems as opposed to problems specific to individuals, concerns of the residents, suggestions from residents on how police could work more closely with them, and the extent of residents' knowledge about what happens in the community. Facilitators also spoke about the presentation and discussion of survey results during beat meetings. According to the facilitators, full implementation did not occur in many beats: six stated they could not recall results ever having been distributed or discussed at their meetings, while several others stated results had been made available only once or twice during the course of the project.
Of the 15 facilitators representing beats in which survey results were supposed to have been made available, 8 recalled results being made available, and all of them found the graphs and tables provided easy to read and understand. Some noted that the importance residents assigned to neighborhood problems and public safety attitudes in the survey results did not always match how residents who attended meetings felt. However, most agreed that discussing the results had the primary value of allowing residents to understand how other community residents, particularly those who did not attend the beat meetings, felt about certain issues. While they tended to agree that discussing the survey results was also useful to police in helping them understand residents' concerns, doubt was expressed about the extent to which police would use the information. Differences between police and residents in the importance assigned to problems identified through the surveys were seen as a factor determining the use of the results: police would regard results as useful only if they contained information about issues that police thought of as "big", and "they might take it more seriously then." Otherwise, while it was believed that police "take into consideration what the residents say", they were "not going to change because of it." This last observation reflects the police-resident interaction observed at other points during beat meetings. The primary aim of activities such as reading crime reports or identifying new problems is ultimately information sharing between police and residents, with little attempt at problem solving. In some cases, it is possible police regarded information they were required to share through survey results as threatening, hence the limited effort to foster discussion and even the adoption of attitudes that precluded meaningful resident participation.
A facilitator from one beat where police fully implemented the project tasks and faithfully covered results with residents noted, "When you gave an answer on the survey that the police department wasn't happy with, they came into the meeting on the defense…they would try to act like nothing was their fault." Obstacles to resident participation. During the project, citizens and police were given opportunities to offer their insight into the obstacles to citizen participation in completing the web surveys. The post-test questionnaires administered to citizens at their beat meetings included questions about the extent of their participation in the project; those who indicated that they had not completed a web survey were asked to supply a reason for not participating. Excluding individuals who were attending their first meeting at the time of the post-test administration, 170 citizens provided reasons for lack of participation. The most common reason, cited by 50.6%, was lack of awareness about the project or failure to have received the information necessary to complete surveys, reflecting the poor levels of implementation by police in many beats, which stand as perhaps the most serious obstacle to achieving resident participation. Of the 170 residents providing reasons for not participating, 31.8% reported that they did not have a computer or Internet access, while 13.5% reported a lack of time, forgetting to participate, or simply not being interested in doing so. Only 4.1% indicated having encountered some sort of technological problem that prevented them from completing the surveys.
Ultimately, the reasons citizens gave for their own lack of participation mirror many of the reasons both citizen facilitators and police provided during interviews when asked why more residents had not completed surveys (see Table 4.3). Because citizens and police typically identified the same obstacles, their views are presented together below, with divergences of opinion noted where they occurred.

Table 4.3
Obstacles to Resident Participation as Identified by Civilian and Police Facilitators (N=70)

Obstacle                               %
Lack of Internet access/computer     41.4
Apathy                               30.0
Computer illiteracy                  24.3
Not beneficial                       21.4
Lack of time/too busy                18.6
Lack of awareness                    18.6
Age                                  15.7
Fear of sharing information           7.1
Personal agendas                      7.1
Lack of trust in police               4.3

Not surprisingly, the obstacle cited most often (41.4%) was lack of Internet access or a computer. This was often linked to "computer illiteracy", a lack of the skills necessary to use either computers or the Internet (24.3%). Citizen facilitators and police alike believed that many residents, whether living in their beats or at least attending the meetings, simply could not participate because of a lack of access or technical ability. Responses in this vein were typically either unqualified by further explanation or accompanied by comments indirectly indicating the belief that this lack was a function of income and that residents could not afford access (e.g., "you won't find Internet users in high crime areas"; "the Internet is a luxury"). Age was often cited as a factor in non-participation (15.7%), particularly in relation to lack of access or ability; it was noted in several beats that meeting participants tended to be older residents who were generally considered either uninterested (e.g., "they're set in their ways") or afraid of using computers and the Internet (e.g., "they're intimidated by computers").
However, since over 34% of survey respondents were 60 or older and their participation was in proportion to the rate at which individuals in this age group attended meetings, age is not necessarily a primary reason for non-participation; it appears instead to be a matter of individual preference. Two interrelated obstacles also identified by facilitators and police were resident apathy (30%) and an inability to see the completion of surveys as beneficial (21.4%). Common refrains were that residents "don't want to get involved" and "they don't care", an assessment that extended beyond completing surveys to involvement in public safety initiatives and their communities at large. Residents were perceived as getting involved only when they had been personally affected by something going on in their community or had a specific problem they wished to bring to the attention of the police; as one sergeant stated, "If the crime is not close to them, it's not their problem." This view was somewhat tempered by an accompanying belief that residents had not participated because they did not consider it beneficial to do so. A small portion of such responses equated "benefit" with direct or immediate rewards to induce participation, essentially the need to "dangle the carrot" with prizes. One sergeant opined, "They want immediacy. The reward of improved police service is not enough." Most responses, however, suggested that residents failed to participate because they did not feel information shared through the surveys would result in tangible changes in police services or community problems.
Some likened it to a voting process and the belief that "I'm just one person, what will it matter?", while others discussed it in terms of residents believing their views would not be valued by the police. There was also reference to resident frustration from having coped with long-term community problems, exemplified by one civilian facilitator's comment that many residents "just don't believe in it. Some people think that things are never going to change regardless of what you do". Such responses often questioned neither the web survey methodology nor the type of information being collected, but rather whether the information would actually be used in a meaningful way, and thus whether participation would benefit residents at all. This observation lends support to the idea that beat meetings have moved forward within a narrow framework of information sharing consisting of three main components featured prominently on the accepted CAPS agenda: (1) police report crime statistics; (2) residents report new problems; and (3) police report progress on problems identified by residents. Very little meaningful discussion or problem solving is usually attempted. Within such a narrow framework, broader communication about topics such as police-community relations, fear of crime, and the nature of police work would be difficult to achieve, as witnessed by observers who regularly reported survey results being treated as yet another set of statistics to be provided to residents. The communication between police and residents in many beats seems to rest on an assumption that certain topics, such as the quality of police performance and the police-community partnership, are not to be discussed, and the framework for meetings is structured to allow discussion predominately of crime and disorder problems. Clearly the value residents assigned to the project is also related to the manner in which it was implemented.
Several police personnel commented on this point, best stated by a beat sergeant who said:

It's not being talked up enough for whatever its importance. [Mimicking officer handing out information and speaking in a monotone] And here's this and that and, oh, yeah, there'll be a drawing for a laptop or something. There's a difference. When I talk about it, I'm putting it to them as "help me out here…and it's all about relationship." It's "please do me a favor" versus "do this crap." If there's a problem, it's how we're selling it.

Arguably, residents would be less likely to see the benefit of being the source of these "statistics" if prior experience with discussions at their meetings had set their expectations about how the results would be discussed. This suggests that a broader obstacle to participation lies within the CAPS framework itself: the nature of police-resident communication demonstrated at beat meetings is such that some residents may be unable to see any potential benefit, because the well-established patterns of interaction make it difficult for new information and new dimensions of discussion to be used in anything other than the now-accepted manner. Conversely, others pointed to the nature of attendance as an obstacle, alluding to what was commonly termed the problem of residents having "personal agendas" that precluded interest in the broader topics included on the surveys.
The idea of personal agendas was explained as residents "want to talk about what they want to talk about", with limited attendance by the same group of residents as a contributing factor: "the same people every time, same issues all the time" have essentially resulted in meetings with "such a narrow focus." Observation of meetings has indeed shown residents intent on bringing particular problems to police attention with seemingly little concern for other community-wide problems. Yet such behavior would seem to be a by-product of the police-community interaction as it exists and, to a degree, has come to be accepted within the CAPS framework. Certainly both police and citizens expressed dissatisfaction with the quality of interaction and the involvement of participants on both sides, but expectations for police and citizen roles have often relegated residents to nothing more than the "eyes and ears" of the police, lending justification to residents' expectation that they fulfill their role by bringing their personal agendas to the table. If residents do not regard beat meetings as venues for meaningful discussion, problem solving, or feedback about police performance, then interaction that consists primarily of bringing their problems to the attention of the police and receiving information about crime fulfills the perceived purpose of attending meetings. Lack of resident awareness or understanding of project objectives was cited as another obstacle, something acknowledged by citizens and police alike, although only a single individual (a civilian facilitator) specified that this had anything to do with poor implementation by the police. Most comments about lack of awareness were made in a general vein, e.g.
“people do not know about them” and “more people need to be made aware of it.” Resident confusion about project objectives was also discussed in general terms: “people could have been confused on where to go on the Internet to complete the survey” and they “don’t understand the purpose of the survey.” When offering suggestions to improve the project, a common refrain was to increase residents’ awareness of the survey and, again, there was the same disjuncture between acknowledging the problem of resident awareness and positioning this problem as the responsibility of the CPD. In a sense, this oversight is to be expected from both civilian facilitators and police, who would in effect be blaming themselves for not making residents aware of the project. Yet we consider lack of awareness on the part of residents to be one of the greatest obstacles to achieving participation. As discussed in the implementation section, police in many beats failed to consistently make survey information available, fully explain project objectives, and encourage awareness of the project. Deprived of the necessary information, residents were in turn deprived of the opportunity to decide whether or not to participate. If residents did not truly understand the purpose of the project because police did not take the time to explain it to them, many may have disregarded the opportunity to participate. Ultimately, there is no telling the true impact that poor implementation had on resident participation.
Finally, fear was cited less frequently as an obstacle, but it represents a valid issue for participation in such initiatives and would no doubt be encountered more often if implementation were widespread. Some felt that fear of retribution, by either criminal elements in the community or the police, for information and views shared through the surveys prevented some residents from participating, e.g., “People aren’t sure where the information is going and who is going to see it.” While the flyers with survey information provided to residents assured them that their answers were confidential and that no one from the CPD would see their personal responses, it is conceivable that some residents did not believe their identities would remain protected. This problem is well known to police in the context of 911 calls and anonymity: although assured of this protection, residents who have had officers come to their doors would be distrustful of confidentiality claims under a police-sponsored initiative no matter how they are presented. Indeed, some identified residents’ lack of trust or confidence in the police as a primary obstacle that would make participation difficult to achieve under any circumstances. Fear of sharing information is not completely insurmountable where it is based on lack of knowledge about how information is collected for web surveys; brief tutorials explaining the process of protecting participant identity could help allay such fears. Concerns about sharing information seen as going directly to the police, however, are not as easily addressed, because maintaining trust between the two parties can be an ongoing issue. This suggests that police departments may not be the best institutions for processing information collected from the community and that an independent agency would be better suited for the task.

B.
Random Sample Participation

Response rates for Internet surveys completed by the random sample. The response rates for the Internet surveys for the random sample are presented in Table 4.4. The average response rate across all seven waves was 34.5%. Not surprisingly, the largest percentage of respondents participated in the first Internet survey (40.5%) and the smallest percentage in the last survey (22.1%). There was a gradual decline in participation from wave 1 to wave 6 and then a sharp decline for wave 7. The large drop in participation for wave 7 may be attributed to changes in the administration of the survey. As stated earlier, there were no postcard reminders mailed out for wave 7. The response rate was calculated by dividing the number of respondents for each wave by the total number of respondents recruited into the study through the first telephone survey (N = 1976). Because of recruiting delays, not all the respondents were given an opportunity to participate in the first wave of the Internet surveys. As such, the response rate for the first wave is based on the number of respondents who had been recruited into the study at the time the survey was administered (N = 1872). The response rates for the other Internet surveys (waves 2 through 7) are based on the final number of participants recruited into the study (N = 1976).
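The response-rate arithmetic just described can be sketched in a few lines of Python. The completion counts shown here are approximations back-derived from the reported percentages (the report gives rates and denominators, not raw counts), so they are illustrative only:

```python
# Response rate = completed surveys / eligible recruited respondents, as a percent.
def response_rate(completed: int, recruited: int) -> float:
    """Return the response rate as a percentage, rounded to one decimal place."""
    return round(100.0 * completed / recruited, 1)

# Wave 1 used the smaller denominator (N = 1872) because recruiting was still
# under way; waves 2-7 used the full recruited sample (N = 1976).
# The counts 758 and 437 are hypothetical, implied by the reported 40.5% and 22.1%.
print(response_rate(758, 1872))  # → 40.5
print(response_rate(437, 1976))  # → 22.1
```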
Table 4.4 Response Rates for Internet Surveys (%)

                                    Overall  Wave 1  Wave 2  Wave 3  Wave 4  Wave 5  Wave 6  Wave 7
All beats                           34.5     40.5    39.4    37.2    36.7    33.6    31.7    22.1

Experimental Condition (3 Groups)
  Control                           33.6     37.2    34.4    36.4    36.7    34.0    33.8    22.6
  Feedback                          33.0     40.4    37.9    36.2    35.0    30.7    29.6    21.5
  Feedback/Training                 34.1     41.3    43.0    36.2    35.8    33.7    29.3    19.5
  χ²                                         .41     2.1     4.2     2.5     10.8**  .01     2.0

Experimental Condition (2 Groups)
  Control                           33.6     37.2    34.4    36.4    36.7    34.0    33.8    22.6
  Experimental                      33.6     40.8    40.5    36.2    35.4    32.2    29.4    20.5
  χ²                                         7.3**   .01     .32     .68     4.2*    2.4     1.2

Race/Ethnicity of Beat
  White                             41.4     49.0    47.9    42.6    41.9    42.1    39.1    27.3
  African American                  25.6     31.7    28.0    28.5    28.4    24.2    22.5    15.7
  Latino/a                          23.0     24.4    26.5    27.0    26.0    19.1    21.4    16.3
  Mixed                             39.5     42.5    46.6    44.4    44.1    39.3    37.0    22.9
  χ²                                         48.6*** 78.0*** 62.0*** 61.3*** 84.1*** 49.5*** 33.4***

*p<.05 **p<.01 ***p<.001

Table 4.4 also presents the response rates by experimental condition. The average response rate for participants in the feedback/training beats was 34.1%, compared to 33.0% for participants in feedback-only beats and 33.6% for participants in control beats. Statistical tests were conducted for each of the seven waves of the survey, both for comparisons between three groups (i.e., control vs. feedback only vs. feedback/training) and between two groups (i.e., control vs. experimental). It is important to note that there are slight differences between the overall response rates and the response rates for the experimental conditions because a few respondents chose to participate in the Internet surveys anonymously.
As a consequence, their Internet surveys could not be linked back to their initial telephone survey and are missing information on geographic location and socio-demographic characteristics. The number of anonymous respondents varies across the seven waves, from a low of less than 1% in wave 1 to a high of 2.3% in wave 7. Efforts were made to recover key geographic and demographic information for the anonymous respondents; however, it was not possible to recapture all of the missing information. Fortunately, it was possible to recover geographic information for all but 9 of the anonymous respondents. Demographic information was slightly more difficult to recapture; however, no wave is missing more than 2% of the respondents’ demographic information due to participants’ desire for anonymity. Regarding response rates across the three experimental conditions, there are some interesting findings. Significantly more respondents from feedback-only and feedback/training beats completed the wave 2 survey compared with respondents from control beats; there is almost a 10 percentage-point difference in the response rate between the feedback/training group and the control group. There were also significant differences in respondent participation across the different racial/ethnic beats. The average response rate for participants from predominantly White beats was 41.4%, compared to 25.6% for participants from predominantly African American beats and 23.0% for participants from predominantly Latino/a beats. Interestingly, there were relatively high levels of participation among individuals from racially/ethnically heterogeneous or mixed beats. The trend of declining participation across the seven waves of the Internet survey was relatively consistent across the four racial/ethnic groups.
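The race-specific wave 1 to wave 7 declines reported in this section can be recomputed directly from the Table 4.4 percentages; a quick arithmetic sketch:

```python
# Proportional decline in response rate from wave 1 to wave 7,
# computed from the published Table 4.4 percentages (no raw data needed).
def pct_decline(wave1: float, wave7: float) -> int:
    """Return the wave-1-to-wave-7 decline as a whole-number percentage."""
    return round(100 * (wave1 - wave7) / wave1)

rates = {                       # (wave 1 rate, wave 7 rate) by beat type
    "White": (49.0, 27.3),
    "African American": (31.7, 15.7),
    "Latino/a": (24.4, 16.3),
    "Mixed": (42.5, 22.9),
}
for beat, (w1, w7) in rates.items():
    print(beat, pct_decline(w1, w7))  # → 44, 50, 33, 46
```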
There was about a 44% reduction in participation from wave 1 to wave 7 for respondents from predominantly White beats, 50% for those from African American beats, 33% for Latino/a beats, and 46% for mixed beats. On average, each respondent completed 2.3 (SD = 2.6) Internet surveys. About 60% of the original sample recruited through the initial telephone survey completed at least one of the Internet surveys, and about 12% completed all seven surveys (see Table 4.5). There were no statistical differences in the number of Internet surveys completed by experimental condition (see Table 4.6). Respondents from feedback/training beats completed 2.4 (SD = 2.6) surveys, compared to 2.3 (SD = 2.6) for respondents from feedback-only beats and 2.3 (SD = 2.6) for respondents from control beats. Respondents from White and racially heterogeneous beats completed more Internet surveys than respondents from African American and Latino/a beats. On average, respondents from White beats completed 2.9 (SD = 2.7) surveys and respondents from racially heterogeneous beats completed 2.8 (SD = 2.7) surveys, compared to 1.8 (SD = 2.4) surveys for respondents from African American beats and 1.5 (SD = 2.3) surveys for respondents from Latino/a beats.

Table 4.5 The Number of Internet Surveys Completed

Number of Surveys Completed    Number of Respondents    Percent    Cumulative Percent
7                              229                      12.23      12.23
6                              188                       9.51      21.75
5                              124                       6.28      28.02
4                              123                       6.22      34.24
3                              105                       5.31      39.56
2                              147                       7.44      47.00
1                              272                      13.77      60.77

Demographic characteristics of respondents for the random sample.
Out of the original 1976 respondents recruited from the telephone survey, there were significantly more females than males (61.5% reported being female) and significantly more homeowners than renters (71.8% reported being homeowners). The average age of the respondents was 43, with a standard deviation of 14. The average education level of the participants was about 14 years, which is equivalent to an associate degree. There was a large percentage of college graduates (62.1%) and even a significant number of respondents with advanced degrees (25.8%). The sample was relatively affluent, with over 50% reporting an annual income of $60,000 or greater. Almost 25% of the respondents reported an annual income of $100,000 or greater, and only about 8% reported an income of less than $20,000.

Table 4.6 Respondents’ Profiles

                                       Number of Internet Surveys Completed
                                       None      1 to 2    3 or more
Beat Race/Ethnicity
  White                                28.0%     37.1%     45.6%
  African American                     42.7%     37.1%     26.4%
  Latino/a                             13.9%     10.7%      7.0%
  Other                                15.5%     15.1%     20.9%
                                       χ² = 91.12***
Experimental Condition (3 groups)
  Control                              35.2%     36.1%     34.7%
  Feedback Only                        32.7%     30.5%     31.9%
  Feedback/Training                    32.1%     33.4%     32.9%
                                       χ² = .83
Experimental Condition (2 groups)
  Control                              35.2%     36.1%     34.7%
  Experimental                         64.8%     63.9%     65.3%
                                       χ² = .22
*p<.05 **p<.01 ***p<.001

The demographic characteristics of the sample changed significantly over the course of the study (see Table 4.7). Those who participated in a greater number of Internet surveys were more likely to be White rather than African American, Latino/a, or other, and were more likely to be homeowners rather than renters.
Those who participated in more surveys were also generally older and had higher levels of educational achievement and greater annual incomes. There were no differences across participation levels in terms of gender or years at residence.

Predictors of non-response for the random sample. In the course of the study we tested nine quantitative predictors of participation in the Internet surveys (see Table 4.8). The information came from a range of sources, including crime rates from the Chicago Police Department, respondents’ answers to the first telephone survey, and respondents’ answers to the second telephone survey. Using different sources was important because it allowed us to test some predictors in the correct temporal order (i.e., respondents’ answers preceding opportunities for participation) and to test some predictors for individuals who did not complete any of the waves of the Internet surveys. Nine predictors of participation were created. From official data provided by the Chicago Police Department we computed the violent crime rate (per 1,000) for each of the study beats. From the pre-experiment telephone survey we created variables measuring the respondent’s perceptions of their self-efficacy about public safety, knowledge about crime prevention, perceptions of police misconduct, and time on the Internet at home and at work. Self-efficacy about public safety is a three-item scale in which the respondent was asked to rate, on a scale from 0 to 100, how much they agreed with statements about influencing their neighbors to take action, getting the police to be responsive, and working with the police to make the neighborhood safer (α = .63). Knowledge about crime prevention is a 2-item scale composed of questions about knowing the things needed to stay safe when out on the streets and knowing the things needed to keep your house and property safe (α = .76).
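The α values attached to these scales are Cronbach's alpha, the usual internal-consistency statistic. A minimal sketch of the computation, using made-up 0–100 agreement ratings rather than the study's actual data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical 0-100 agreement ratings for a 3-item scale, one row per respondent.
ratings = np.array([
    [80, 75, 70],
    [60, 55, 65],
    [90, 85, 95],
    [40, 50, 45],
    [70, 65, 75],
])
print(round(cronbach_alpha(ratings), 2))  # → 0.97 for this toy data
```

Because these toy items are highly correlated, alpha comes out near 1; the report's scales, with α between .49 and .76, show more modest item agreement.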
Perceptions of police misconduct is a 2-item scale in which respondents were asked how much of a problem (big problem, some problem, or no problem) is the police stopping too many people on the streets without good reason and the police not responding quickly to emergency calls in the neighborhood (α = .56). Higher values indicate greater perceptions of misconduct. Time on the Internet at home and at work were each single questions that asked the respondent how much time (every day, several times a week, several times a month, just a few times a year, or never) they spent on the Internet. Higher values indicate more time on the Internet. The scales measuring computing capabilities and feelings about the web-based surveys were created from the post-experiment telephone survey data.12 For all these measures the respondents were asked whether they strongly agreed, somewhat agreed, somewhat disagreed, or strongly disagreed with a series of statements. Computing capabilities consists of three items: I am confident I can learn computer skills, I am apprehensive about using computers (reverse coded), and I am able to keep up with advances happening in the computer field (α = .49). Less dystopian feelings about computers consists of two items: computers turn people into just another number

12 Factor analysis was conducted on the eight computing/technology items. The results suggested three factors, with the first factor, computing capabilities, accounting for 29% of the variance; the second factor, dystopian feelings about computers, accounting for 18% of the variance; and the third factor, perceived usefulness of Web surveys, accounting for 17% of the variance. The three factors combined accounted for 65% of the total variance.
and computers make people become isolated (α = .50). Last, the perceived usefulness of the Web surveys consists of two items: Web surveys are a good way to collect data about community crime problems, and I believe that the Chicago Police Department will make good use of information collected through online Web surveys to help the community (α = .66).

Table 4.7 Summary of Demographic Characteristics of Respondents by Participation Level

                                 Number of Internet Surveys Completed
                                 None                 1 to 2               3 or more
Respondent’s Race/Ethnicity
  White                          36.7%                48.5%                66.8%
  African American               43.1%                38.3%                25.1%
  Latino/a                       14.5%                 8.9%                 4.7%
  Other                           5.7%                 4.3%                 3.4%
                                 χ² = 153.57***
Respondent’s Gender
  Male                           39.5%                39.5%                37.0%
  Female                         60.5%                60.5%                63.0%
                                 χ² = 1.24
Respondent’s Housing
  Homeowner                      68.1%                71.5%                75.9%
  Renter                         31.9%                28.5%                24.1%
                                 χ² = 11.83**
Age (years)                      M = 40.8, SD = 14.7  M = 43.1, SD = 13.6  M = 46.1, SD = 13.4  F = 28.80***
Education (years)                M = 14.2, SD = 2.2   M = 14.8, SD = 2.0   M = 15.3, SD = 1.8   F = 61.21***
Income                           M = 3.2, SD = 1.3    M = 3.5, SD = 1.3    M = 3.7, SD = 1.2    F = 32.68***
Years at current residence       M = 11.1, SD = 11.5  M = 11.2, SD = 10.8  M = 11.4, SD = 11.2  F = .18
*p<.05 **p<.01 ***p<.001

In general, respondents who felt more capable had higher levels of participation. This included personal assessments of their knowledge and role in promoting public safety, and general assessments of access to and skill with computers and the Internet.
Individuals who reported spending more time on the Internet, either at work or at home, also participated in more Internet surveys. A few of the findings were somewhat surprising. First, respondents from beats with higher levels of crime were less likely to participate. Second, there was no relationship between respondents’ beliefs about the usefulness of Internet surveys and their participation in the project.

Table 4.8 Bivariate Results for Predictors of Participation in Internet Surveys

                                          None            1 to 2          3 or more
                                          M      SD       M      SD       M      SD
Violent crime rate in beat (per 1,000)    16.62  11.73    15.26  11.88    12.76   9.83    F = 24.70***
Self-efficacy about public safety         62.91  21.32    64.35  19.26    66.38  17.52    F = 6.32**
Knowledge about crime prevention          82.20  19.58    84.59  15.72    85.60  14.09    F = 8.40***
Perceptions of police misconduct           1.55    .62     1.45    .57     1.34    .48    F = 29.43***
Time on Internet at home                   4.22   1.14     4.52    .94     4.51    .96    F = 19.90***
Time on Internet at work                   3.47   1.80     3.71   1.74     3.96   1.62    F = 14.53***
Computing capabilities                     3.55    .52     3.68    .46     3.75    .39    F = 13.28***
Less dystopian feelings about computers    2.78    .92     2.88    .84     3.02    .79    F = 6.23**
Perceived usefulness of web surveys        3.17    .69     3.20    .57     3.23    .62    F = .60
**p<.01 ***p<.001

Post-experiment telephone findings. Prior to the implementation of the post-experiment telephone survey, we compiled information on which respondents had completed Internet surveys and how many Internet surveys each had completed. This information was used to screen respondents and ask those individuals who never filled out any of the surveys why they had decided not to participate. Additionally, respondents who had completed fewer than three
surveys were asked why they had decided to stop participating. The questions were open-ended. Gathering these data provided a unique opportunity to better understand the barriers to participating in Internet surveys. Seven hundred ten respondents provided answers as to why they had chosen not to participate in the project, and 701 provided answers as to why they stopped participating.13 By far the most common response was a lack of time. A large percentage of respondents said that they were simply too busy to participate. Many other respondents gave answers revolving around a lack of motivation, such as “I’m just too lazy,” “I just did not feel like it,” and “It’s not in my nature to do surveys.” One person went so far as to say, “I was just busy with other stuff and I didn’t feel like it, I didn’t feel like being a good citizen and doing the responses” (emphasis added). Many of the reasons for nonparticipation or discontinued participation highlight the diversity of normal life events that people experience, even over such a short period of time, including babies being born, serious illnesses, deaths in the family, changes in employment, and relocating. Overall, the results overwhelmingly suggest that time and motivation were the biggest barriers to getting people to complete online surveys. While it is important to know that a lack of time and motivation were the most prevalent barriers to participation, this is not particularly surprising or unique to online surveys. There were, however, other issues raised that are more specific to this type of project. For example, many people reported technical problems, ranging from their computers being infected with viruses to trouble with their Internet connections. Many respondents also reported technical problems specific to accessing the survey.
For example, Mac users had problems getting their default Internet browser, Safari, to work with our survey engine software. Even more problematic was the password system, as described earlier. A large percentage of people reported not filling out surveys because they lost their passwords. Although we had a mechanism in place for retrieving lost passwords, many of the respondents who forgot their password did not try to contact us and simply did not participate. All of these examples highlight the need to provide easy access to timely technical support for all study participants. Another important issue revolved around where and when respondents could access the Internet. We recruited respondents into the study if they had access to the Internet at home or at work. A significant number of respondents cited not being able to access the Internet at work as a reason for not participating. As employers become increasingly concerned about employees wasting time online, they may institute more restrictive policies about Internet usage; researchers should be aware of this issue. The last important reason cited for lack of participation was related to email. In general, we found that the email reminder with a link to the survey was very helpful. However, many respondents cited a change in email address as a reason for discontinued participation. Others stated that they stopped getting the emails, and still others seemed overwhelmed by the amount of email. For example, one person stated, “I get about 150 emails everyday. It was always at the bottom of the list. I’m sorry I didn’t participate. I volunteered yes, but I literally get about 150 emails

13 A few respondents indicated that our records were incorrect and that they had filled out the surveys. The number of respondents who provided this answer was consistent with the number of respondents who chose to fill out the Internet surveys anonymously.
everyday.” While the email delivery system was useful to a significant number of the respondents, overreliance on it can be problematic. Many people were overwhelmed by the amount of spam in their inboxes, and spam-filtering software may prevent the emails from reaching the respondents. There were a few respondents who stated that they did not participate or discontinued participation because of their attitudes about the police or the survey content. For example, one respondent stated, “I did not trust the police department with this information.” Another respondent commented, “I didn’t think that they were going to be responsive. I didn’t think they’d do anything.” Respondents also voiced concerns about the survey questions not addressing their needs. For example, one respondent said, “The questions did not address my concerns. The prevention of property crime is a major concern for people in my community. I’d like to see more things done to prevent that.”

CHAPTER FIVE
ADVANCES IN MEASUREMENT: THE DIMENSIONS OF INTERNET SURVEY INFORMATION

A.
Measurement Overview

The Chicago Internet Project provided a unique opportunity to develop and field test a new measurement system that could serve a number of objectives within the community policing and problem-oriented policing paradigms, including assessing neighborhood problems, identifying community capacities, evaluating police performance, evaluating community performance, and evaluating local anti-crime initiatives, among others. To provide some context, we begin this section with a brief assessment of the limits of traditional measurement schemes and the value added by this new online system. We then describe our methodology for scale construction and validation. At the core of this section is a description of the various constructs we have sought to capture through this web-based system and our findings with regard to the scaling effort. When creating these scales we paid special attention to representing a variety of theoretically important dimensions of policing as suggested in the policing literature (Maguire, 2004; Mastrofski, 1999; Skogan & Frydl, 2004) and the community literature (Sampson, 1989; Schuck & Rosenbaum, 2006). We have also created some new scales that we believe are important for understanding the police-community nexus but have received little attention in the literature.

1. Traditional Performance Measurement

We have witnessed significant changes in law enforcement operations in recent years as a result of new technology, new accountability systems, and a range of new policing strategies. But as noted in the introduction, despite this progress, police organizations have yet to develop data systems to measure “what matters” to the public and “what matters” in policing according to widely touted theories of community policing and problem-oriented policing.
We have argued that this measurement deficiency has placed an upper limit on an organization’s capacity to introduce needed changes, to build healthy police-community relations, and to maximize effectiveness in fighting crime and disorder (Rosenbaum, 2004). The growing pressure to increase police accountability raises the question of how to measure police performance and how to define “good policing.” Traditional measures of police performance, emerging from efforts to professionalize the police (beginning in the 1920s), have focused on crime-related counts, such as the number of crimes reported, arrests made, contraband seized, and cases cleared, as well as police response times. The limitations of these traditional measures of performance are well documented in the literature (Alpert & Moore, 2000; Goldstein, 1990; Grant & Terry, 2005; Skolnick & Fyfe, 1993; White, 2007). While these measures are consistent with the dominant mythology of police as crime fighters, they do not capture much of what police actually do (Alpert & Moore, 1993; Moore, 2002; Moore & Poethig, 1999; Reisig, 1999). Official statistics, such as crime rates and clearance rates, are not only inaccurate and subject to manipulation (i.e., they contain large measurement error), but, more importantly, they provide a very incomplete picture of police work and performance. They fail to capture some of the critical elements of police work, especially efforts to enhance the quality of life through improved interactions with the public and improved problem solving (Alpert & Moore, 1993; Blumstein, 1999; Greene, 2000; Maguire, 2004; Masterson & Stevens, 2002; Moore et al., 2002; Moore & Poethig, 1999; Reisig, 1999).
These traditional measures also fail to provide any indication of the quality of day-to-day encounters between the police and community residents, whether these contacts are police initiated or citizen initiated. Historically, police have taken calls and reports about incidents but have rarely sought external feedback on their non-crime-fighting performance. Systematically seeking citizen input is a recent invention in the history of policing. (The one exception is the case of citizen complaint mechanisms, which remain severely flawed to this day.) In the 1930s, Bellman introduced one of the first systematic police measurement tools, an extensive checklist of effective departmental practices (Bellman, 1935; Parratt, 1938). Grounded firmly in the era of professional policing, Bellman stated that police must do their job regardless of public opinion, and in his scale of several hundred items he stuck to insular police issues. Community input was treated as superfluous, summed up in a single item on his large inventory: “Are there any particular circumstances, geographical, political, social or otherwise, which affect the problem of the police?” Consistent with Bellman’s position, some critics today would argue that the public is fickle, that civilian evaluations of police performance will change dramatically with one well-publicized incident, such as the Rodney King beating, and that they therefore should not be given any credibility or weight. Indeed, there is evidence suggesting that a single incident can alter public opinion about the police (Weitzer, 2002). However, any valid measurement system should provide continuous data collection on a large scale and therefore have the capacity to identify stable patterns across different neighborhoods and demographic subgroups (which our data suggest are present), as well as to pinpoint the amount of variance due to local or national events. The stability of these changes can also be examined.
Returning to the current state of measurement, the crime-fighting model retains its dominance within police organizations. Even the large volume of citizen-initiated calls for service (up to 18,000 calls per day in Chicago) is quickly channeled and screened into a narrow set of crime measures. Rarely do police departments report on their handling of the roughly 80% of calls that do not involve crime incidents per se (Scott, 1981). In fact, the attention of the police is drawn to the roughly 2% of calls that involve violent crime. The question then becomes: how well are the police performing on activities or tasks that consume the vast majority of their time but are not captured in traditional statistics? We simply don’t know. Another problem with traditional measures of police performance is that, with the exception of crime rates (which, in the long run, are shaped by forces beyond the control of the police), the organizational focus is on counting activities as indicators of success rather than measuring whether the organization has achieved its desired goals. Goldstein (1979; 1990) argues that this obsession with “means” rather than “ends” has dramatically reduced the effectiveness of police organizations. Measuring processes is not inherently evil, but as noted earlier, only a few processes (such as arrests, seizures, and enforcement activities) are recorded, leaving the quality of daily policing activities largely unmeasured. A wide range of non-crime outcomes is ignored as well, such as fear of crime, perceptions of crime and disorder, and public assessments of police performance.
Progressive police departments have recently enhanced traditional measurement with an accountability push, both department-wide (e.g., COMPSTAT) and officer-specific (e.g., early intervention or warning systems). Some departments are even seeking to integrate their internal measurement and external monitoring systems involving citizen complaints (see Walker, 2005). This is an important step forward in police accountability, but the primary limitation of these new systems is that they are typically reactive and incident driven. The focus remains on the unrepresentative group of citizens who formally complain about police conduct rather than on aggregate data from the entire community. The quality of policing during routine encounters remains below the radar screen.

2. Establish a Mandate and Information Imperative

If police organizations attempt to move beyond traditional "bean counting" or "the numbers game," the question then becomes—what are the goals of the organization? What problems are they trying to solve? Unfortunately, the police do not have a clear mandate—they have been given a wide range of duties and responsibilities, ranging from preventing crime to controlling crowds to saving lives. However, the emergence of several new policing paradigms during the past 30 years has provided some guidance in this regard, bringing with them a new information imperative (Dunworth et al., 1998; Rosenbaum, 2004). Some of the most popular policing models—community policing, broken windows policing, and problem-oriented policing—suggest that the function of the police reaches far beyond crime fighting to outcomes such as improving the quality of neighborhood life as measured by social disorder, physical decay, and fear of crime. Community policing also mandates that the police play a role in creating self-regulating neighborhoods by strengthening informal social controls (Rosenbaum, Lurigio & Davis, 1998).
Achieving this goal presumably entails promoting community crime prevention behaviors, strengthening collective efficacy, and building partnerships with other agencies and organizations—functions that are not reflected in traditional police performance measures. The achievement of these goals cannot be assessed without the collection of new information, which requires new measurement systems. Thus, to solve neighborhood problems and engage the community in preventative behaviors, police organizations would benefit from knowledge of (1) public perceptions of the severity of various problems; (2) the community's capacity to engage in community crime prevention; and (3) the community's support for police initiatives and willingness to engage in joint ventures. If police had a better understanding of problems in their respective communities, they would be better equipped to deal with them (Moore et al., 2002). Furthermore, to achieve maximum effectiveness, the police must have the support and cooperation of the public. To do this, they must have the trust and confidence of the people they serve. This is an indispensable outcome in and of itself. Hence, new measurements of performance should capture not only the community's assessment of problems and its capacity to prevent crime, but also the community's evaluation of police services, police encounters, police-community relations, and overall organizational legitimacy. These will determine the health of the police-community partnership and its ability to "co-produce" public safety.
Finally, we emphasize that the importance of perceived fairness in police encounters reaches far beyond building problem solving teams to the very essence of law and order. Research on procedural justice theory suggests that citizens' willingness to obey the law hinges on their perceptions that the police are procedurally just and fair (Sunshine & Tyler, 2003; Tyler, 2004). If the police are free to be abusive to citizens or to violate the law, many residents will scoff at their own responsibility to be law abiding.

3. Level of Measurement

Measurement of police performance can occur at the individual, small group, or organizational level. Traditional internal systems to assess individual police officers are seriously flawed. In most agencies, the performance evaluation process has no credibility and is unrelated to real officer performance. Researchers and police executives have offered many suggestions for improving internal evaluation systems (e.g. Oettmeier & Wycoff, 1997; Skolnick & Fyfe, 1993; Walker, 2005), but the fact remains that an officer's work is largely unsupervised and difficult to measure. Evidence of successful problem solving by officers holds promise as an outcome measure, and more work is needed to develop good internal measurement systems in this domain. But community judgments of success in problem solving are equally important. If the quality of life in the neighborhood has not changed in noticeable ways (e.g. fear of crime, perceived severity of local crime and disorder problems, residents' use of local parks and facilities), this reflects on the true success of the problem solving project. Hence, external community assessments are essential, not only to capture perceived changes in local conditions, but also to provide independent judgments about police performance and to serve as a real-time barometer for police-community relations.
In our measurement framework, we have chosen to focus on measuring organizational and small group performance from a community perspective that brings attention to police performance at the neighborhood level. This decision is based on a number of factors. First, we believe that holding groups of officers responsible for police performance within specific geographic areas is a sensible accountability strategy, consistent with the logic behind the popular COMPSTAT and community policing models. Second, the performance of individual officers is difficult to measure with our community methodology because local residents (like police supervisors) are unable to make reliable observations at the individual level (e.g. most residents cannot recall the name of a police officer serving their neighborhood). Having said this, we are not opposed to external measurement at the individual level. Indeed, our measurement scheme provides for assessments of police service during individual police-resident encounters. The only question is whether these data should be incorporated into individual performance review systems or aggregated to hold small groups of officers accountable for the overall quality of policing at the neighborhood level. (More on this issue later.)

4. Community-Based Measurement

Readers may ask, "Why the need for a community-based measurement system? Can't police organizations evaluate their own performance?" Certainly, we are not suggesting that current systems be abandoned, but rather that they be supplemented with new information. There are two questions here: (1) why new information? and (2) why can't the police collect it?
Beginning with the second question, there are several key reasons for creating a community-based survey system rather than a police-based system. First, most police organizations do not have sufficient motivation, on their own, to systematically collect and utilize community feedback to improve agency performance.[14] If anything, the history of police reform suggests that departments have sought to insulate themselves from public scrutiny. The largest police reforms occur only after external oversight is exercised. Second, the information loses some degree of credibility if it is managed by the police, who spend considerable energy working on impression management. Third, the external management of information guarantees that the information will be publicly available in aggregate form and thus allows police stakeholders to exercise pressure on the organization to improve its performance. In part we have already addressed the question of "why new information?" To elaborate, the history of efforts to reform the police also suggests that police misconduct is the source of most political crises involving law enforcement agencies. Policing in the 21st century is characterized by a heightened awareness of the importance of equity, fairness, demeanor, and overall professionalism as requisites for maintaining public confidence and trust in the police. Communities continue to want effective cops who can reduce crime, but they also insist that officers treat community members with respect and dignity. The title of the National Research Council report on the status of American policing says it all—Fairness and Effectiveness in Policing (Skogan & Frydl, 2004). Today, the emphasis (and measurement!) must extend beyond efficiency and effectiveness to issues of equity and fairness of treatment across race, class, gender, sexual orientation, and religion.
Hence, we have proposed a system of measurement that provides regular feedback about the quality (and quantity) of policing at the neighborhood level through the eyes of the police service consumers. Finally, we believe that such a system is timely because of a growing schism between popular policing strategies to address violent crime on the one hand and community expectations of fair treatment on the other. With the increased application of aggressive zero-tolerance approaches in many cities, police organizations are running the risk of numerous adverse community consequences (see Rosenbaum, 2006; 2007), some of which may be preventable if community feedback loops are introduced. Progressive police leaders have acknowledged the importance of community input and feedback, as well as the need to create transparent learning organizations (Fridell & Wycoff, 2004). Interest in gauging local public opinion is apparently widespread, as reflected in the statistic that 75% of police departments in the U.S. participated in a community survey in 1997 (Fridell & Wycoff, 2004). Externally focused police measurement can include surveys of arrestees, victims, witnesses, callers, persons stopped, and the general public, as well as community observations and focus groups.

[14] Current efforts by police agencies to invite community input are widespread, but are best characterized as public relations events or crisis management meetings rather than reflecting a deep institutionalized commitment to strategic planning at the neighborhood level.
In recent history, a salient example of advancement in crime measurement is the introduction of the National Crime Victimization Survey (NCVS). Started in 1973, the NCVS is the nation's primary source of information on criminal victimization and provides essential data on unreported crime and victim responses. The NCVS, however, does not regularly capture public perceptions of the police or the community and does not allow for reliable estimates for smaller geographic areas or even cities. To fill this gap, hundreds of police departments have conducted or outsourced local community surveys to obtain local feedback. Unfortunately, most of these are unscientific mail or telephone surveys and provide only a snapshot (one-time) view of local conditions. Occasionally, we will see valid surveys that capture many of the dimensions of interest here (e.g. see Skogan and Hartnett, 1997, for a citywide survey conducted over several years), but even in these cases, the sampling usually does not allow for estimates at smaller geographic areas, such as neighborhoods, and the time lag between measurement periods is one year or more. As we have noted previously, we are unaware of "any efforts to establish (a) representative samples of community residents, (b) regular reporting periods, (c) comprehensive survey content to measure the important dimensions of policing and public safety, (d) data analysis or feedback mechanisms, and (e) plans for the systematic use of these data for strategic or tactical planning" (Rosenbaum, 2004). The Chicago Internet Project attempted to expand the policing measurement paradigm not only by broadening performance measures but also by focusing measurement on a small geographic unit, the police beat—something that is made possible with the Internet. Information technology has expanded possibilities by offering new public safety tools and measurement methods (e.g.
web surveys, websites with accessible up-to-date crime data, and crime mapping and forecasting software). New measurement systems will enhance analyses that are central to smart policing, such as community analysis, problem analysis, deployment decisions, and program evaluation (Dunworth, 2000).

5. Community Performance

With all the attention focused on the police, it is easy for police leaders, politicians, and the general public to forget that the community is indispensable in public safety. The community's role in the prevention of crime is well established (Rosenbaum et al., 1998; Schuck & Rosenbaum, 2006; Sampson, 1989). Hence, this measurement system presumes that community residents should also be held accountable for certain types of "performance." Society has not held the community accountable for neighborhood safety, and therefore has not developed any standardized measures of community performance. To some extent, we have attempted to do so here.

B. Measurement Theory and Scale Construction

This section provides a technical description of our approach to scale construction. Within the context of measurement theory, it is important that we provide evidence of the validity and reliability of any composite measures being constructed. First, social science methodologists consistently argue that a relatively small number of measures can represent a particular theoretical construct. This generalization from measures to constructs, however, requires some explanatory theorizing (cf. Cronbach & Meehl, 1955) and empirical evidence (cf.
Shadish, Cook, & Campbell, 2002) to articulate the nature of the construct and its components, how the construct is related to other similar and dissimilar constructs, and, if appropriate, what mediating or moderating processes might be operating. In essence, researchers need to establish the validity of their measures and scales and provide evidence that the selected items are suitable reflections of the underlying construct of interest. As part of this process, we begin with the well-established premise that multi-item indices are superior to single-item measures of constructs because of their relative strength in reducing measurement error, increasing measurement stability, and capturing more of the content or components of the construct. Hence, for each of the constructs of interest, we have followed a rigorous plan of scale construction and testing to determine whether relevant survey items can be combined into a single composite index. To begin with, a "good" scale should be unidimensional, internally consistent, and stable over time. Factor analysis was used to establish the presence of unidimensionality. Once a factor was identified, Cronbach's alpha coefficient was calculated to measure internal consistency or reliability (i.e. how well the items "hang together"). If an item did not contribute to the scale's reliability, it was dropped prior to constructing the index. Finally, to establish the test-retest reliability of the index, Pearson's correlation coefficient was calculated to determine the correlation of the index with itself as measured at two points in time, typically 3 months apart. For key indices, additional tests were performed to explore construct validity (as described below).

C. Measures of Police Performance

One of the primary objectives of the CIP initiative was to develop new external measures of police performance.
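The internal-consistency and test-retest checks described above can be sketched in a few lines of code. The following is an illustrative sketch using simulated 5-point item responses, not the project's actual data or analysis code; the sample size, item count, and noise levels are assumptions chosen only to mimic a multi-item attitude index.

```python
# Sketch of two scale-construction checks: Cronbach's alpha for internal
# consistency, and Pearson's r between two waves for test-retest reliability.
# Data are simulated (hypothetical), not from the CIP surveys.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def pearson_r(x, y) -> float:
    """Pearson correlation between two index scores."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.corrcoef(x, y)[0, 1])

rng = np.random.default_rng(0)
# Simulate 300 respondents answering 3 correlated items: a shared latent
# attitude plus item-level noise, rounded onto a 1-5 response scale.
trait = rng.normal(0, 1, 300)
wave1 = np.clip(np.round(3 + trait[:, None] + rng.normal(0, 0.6, (300, 3))), 1, 5)
alpha = cronbach_alpha(wave1)

# Composite index = mean of items; test-retest r against a noisy re-measure
# of the same respondents at a second wave.
wave2 = np.clip(np.round(3 + trait[:, None] + rng.normal(0, 0.6, (300, 3))), 1, 5)
r = pearson_r(wave1.mean(axis=1), wave2.mean(axis=1))

print(f"alpha={alpha:.2f}, test-retest r={r:.2f}")
```

In this simulation, both statistics come out high because all three items share the same latent trait; dropping an item that lowered alpha, as described above, would simply mean recomputing `cronbach_alpha` on the reduced item set.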
In 1996, the National Institute of Justice held a series of workshops entitled "Measuring What Matters," where leading police scholars and practitioners reflected on the problems with traditional performance measures and explored new possibilities more consistent with emerging community-oriented and problem-oriented policing strategies. The participants agreed that police organizations and other stakeholders would need to creatively define and measure the dimensions of police performance that matter most to the community. In the ensuing decade, unfortunately, little progress has been made at the empirical level, although the theoretical dialogue has continued with some refinements. Recent conceptual work builds on the public's expectations for the police and what we value as a democratic society. As Herman Goldstein notes, the police are expected to do many things, including preventing crime, resolving interpersonal conflicts, managing pedestrian and vehicular traffic, protecting constitutionally guaranteed rights, and creating a sense of security in the community (Scott, 2000, p. 17). Furthermore, our demands on the police do not end with these role expectations. The public increasingly insists that the police achieve these objectives in a manner consistent with our democratic principles. As Goldstein (1990, p. xiii) underscores, "we have an obligation to strive constantly—not periodically—for a form of policing that is not only effective, but humane and civil; that not only protects individual rights, equality, and other values basic to a democracy, but strengthens our commitment to them."
1. Dimensions of Police Performance

In this framework of policing in a democratic society, several dimensions of police performance appear repeatedly in the literature (Moore, 2002; Mastrofski, 1999; Skogan et al., 2000). The central focus has been on assessing the processes of policing. Reaching far beyond traditional crime statistics, particular emphasis has been given to the following performance questions, each of which we have sought to measure in this project:

• Are the police exhibiting good manners during encounters with residents?
• Are the police competent in the exercise of their duties?
• Are the police fair and impartial when enforcing the law?
• Are the police lawful in the exercise of their duties?
• Are the police equitable in the distribution of services?

In addition, theories of community policing and problem-oriented policing have uniquely underscored the importance of other process and outcome questions for the police. In particular:

• Are the police responsive to the community's concerns and problems?
• Are the police effective in solving neighborhood problems?
• Are the police engaging the community in crime control and prevention actions?
• Are the police creating cooperative partnerships with the community?
• Does the public perceive less crime and disorder?
• Does the public report lower rates of victimization?
• Does the public report less fear of crime?
• Does the public perceive a higher quality of life in their neighborhood?
• Does the public attribute organizational legitimacy to the police?

These questions define the scope of the measurement system developed as part of this project. This system assumes that the various behaviors of officers will be reflected in the perceptions and judgments of local residents, which, in turn, will shape residents' overall assessment of the police organization.
If public expectations of the police are met, then public confidence in the police and perceptions of the police as a legitimate authority should increase accordingly. Our conceptual scheme for external evaluation of the police offers three primary types of community assessment: general assessments of police officers; experience-based assessments of police officers; and assessments of the police organization. Each is described and distinguished below.

2. General Assessments of the Police

General assessments of the police provide civilians with the opportunity to broadly evaluate the behavior patterns or characteristics of police officers without reference to a particular observation, encounter, or incident. Anyone who is generally aware of the existence of municipal police is, arguably, qualified to express their opinions via general assessments. We have constructed two types of general assessments—global evaluations (e.g. survey items referring to "Chicago police officers") and neighborhood-specific evaluations (e.g. survey items referring to "Chicago police officers in your neighborhood"). Past community surveys have demonstrated that these are distinct constructs; although global and specific perceptions can influence one another, they are unique assessments (Brandl, Frank, Worden & Bynum, 1994; Schuck & Rosenbaum, 2005). Typically, surveys have focused on global perceptions, but a better understanding of perceptions of neighborhood policing practices will provide a strong foundation for a local geo-based evaluation system.
Many of these evaluation dimensions were designed to capture perceptions of efficacy and fairness, which are conceptually distinct from judgments about officers' crime-fighting abilities and thus require different measures to capture them (Eck & Rosenbaum, 1994; Skogan & Frydl, 2004; Sunshine & Tyler, 2003; Tyler, 2004). Our general assessment measures have been influenced by previous theoretical and empirical work. Mastrofski (1999) proposed six global dimensions for assessing police officers—attentiveness, reliability, responsiveness, competence, good manners and fairness, but to our knowledge, these dimensions have yet to be fully validated. Additionally, in repeated telephone surveys, Skogan and Hartnett (1997) measured three neighborhood-specific dimensions of policing in Chicago—demeanor when dealing with people in the neighborhood, responsiveness to community concerns, and effectiveness in preventing crime and disorder. Even broader conceptualizations of performance measurement have been proposed in the literature (Moore, 1999, 2002). For this project, multi-item scales were constructed to measure both global and neighborhood-specific indicators of police performance.

3. Global Evaluations of the Police

Police Manners Index. The Police Manners Index was designed to measure the public's general perception of officers' courtesy or manners when interacting with the public. This three-item index, measured at waves 3 and 6, includes courtesy/respectfulness toward residents in general, youth, and minorities. Factor analyses produced a single factor at both waves, explaining 82.7% of the variance in the items at wave 4 and 80.5% at wave 6. The factor structure was replicated across each of the four racial/ethnic groups for both waves. The Police Manners Index exhibited good internal consistency, as reflected in the Cronbach alpha coefficients at wave 4 (alpha = .89) and wave 6 (alpha = .88).
The Index also exhibited good test-retest reliability between waves 3 and 6, r = .68 (p < .01). In sum, the Police Manners Index is unidimensional, internally consistent, stable across racial/ethnic groups, and reliable over time. The final index properties are shown in the table below. Higher scores on the index denote more frequent displays of good manners by the police.

Police Manners Index: In your opinion, how often do Chicago police officers act in the following manner? (5 = Always, 1 = Never)

Items:
1. Courteous to residents.
2. Respectful of youth.
3. Respectful of minorities.

Scale Statistics    Wave 4    Wave 6
N                   676       580
M                   3.54      3.47
SD                  .74       .73
Min                 1         1.33
Max                 5         5

Police Fairness Index. The Police Fairness Index seeks to measure general perceptions of officers' fairness or evenhandedness in the treatment of citizens and their application of the law. A two-item index was constructed after analyses indicated that the items were strongly and consistently correlated in all neighborhoods. The Index revealed good internal consistency at both wave 3 (alpha = .87) and wave 6 (alpha = .90). The test-retest reliability for the Fairness Index was also high, r = .77, p < .01. The final index properties are shown in the table below. Higher scores on the index denote a stronger belief that Chicago police are fair when dealing with citizens.[15]

Police Fairness Index: Please indicate whether you agree or disagree with the following statements about Chicago police officers. (4 = Strongly agree; 1 = Strongly disagree)

Items:
1. Chicago police officers treat all people with dignity and respect.
2. Chicago police officers are fair and impartial when applying the law.
Scale Statistics    Wave 3    Wave 6
N                   670       560
M                   2.43      2.42
SD                  .80       .77
Min                 1         1
Max                 4         4

[15] In the future, we would consider including two additional items: "The police are fair to residents" and "The police make decisions based upon facts, not personal biases." In the present survey, however, these items used a different response format (always-never).

D. Competency Indices

Several indices were developed to measure the public's view of police competency in a wide range of areas. Given that police engage in a variety of behaviors, a single index was considered too insensitive to capture their performance. These are global assessments of officer competency, not specific to local beat officers.

1. Police Knowledge Index

This two-item index measures whether the public believes police officers are knowledgeable about police procedures and are well trained. The internal consistency is high (alpha = .79). The test-retest reliability between waves 3 and 6 was moderately high, r = .60, p < .01.[16] The final index properties are shown in the table below. Higher scores indicate a stronger belief in the knowledge and training of Chicago police officers.

Police Knowledge Index: Please indicate whether you agree or disagree with the following statements about Chicago police officers. (4 = Strongly agree; 1 = Strongly disagree)

Items:
1. Chicago police officers are well trained.
2. Chicago police officers are knowledgeable about police procedures.

Scale Statistics    Wave 3    Wave 6
n                   651       538
M                   3.16      3.10
SD                  .54       .64
Min                 1         1
Max                 4         4

2. Police Reliability Index

This four-item index taps residents' feelings about the reliability and consistency of Chicago police officers.
The scaling results show a single factor across all neighborhoods, explaining 66.3% of the variance at wave 3 and 75.8% at wave 6. The internal consistency of the index was good at both waves (wave 3 alpha = .82; wave 6 alpha = .84). The test-retest reliability was moderately strong, given that the two indices were not identical, r = .67, p < .01.[17] The final index properties are shown in the table below. Higher scores indicate a stronger belief in the reliability of Chicago police officers.

[16] Only a single item was measured at wave 6, so this correlation coefficient indicates the relation between that item and the Index score at wave 3.
[17] Only 3 of the 4 items were used at wave 6.

Police Reliability Index: Please indicate whether you agree or disagree with the following statements about Chicago police officers. (4 = Strongly agree; 1 = Strongly disagree)

Items:
1. Chicago police officers follow through on their commitments.
2. Chicago police officers are reliable when you need them.
3. Chicago police officers respond quickly to emergency calls.
4. Chicago police officers are visible on the streets.

Scale Statistics    Wave 3    Wave 6
n                   729       596
M                   2.96      2.95
SD                  .64       .64
Min                 1         1
Max                 4         4

E. Neighborhood-Specific Evaluations of the Police

1. Responsiveness to Community Index

Consistent with the community policing paradigm, this four-item index measures the extent to which residents view their neighborhood police as responsive to their concerns, including a willingness to share information and to work with residents on problems of high priority to the community. This index is modeled after Skogan and Hartnett's (1997) Responsiveness Index, with some new items added (#2 and #3) for content validity.
The scaling results show a single factor across all neighborhoods, explaining 81.7% of the variance at wave 3 and 83.2% at wave 6. The internal consistency of the index was strong at both waves (wave 3 alpha = .93; wave 6 alpha = .93). The index demonstrated good test-retest reliability, r = .74, p < .01. The final index properties are shown in the table below. Higher scores indicate that local Chicago police officers are viewed as more responsive to community concerns and engaged in a problem solving dialogue with the community.

Responsiveness to Community Index: Please rate how good a job you feel the Chicago police are doing in your neighborhood. (4 = Very good job, 1 = Poor job)

Items:
1. Dealing with problems that really concern residents.
2. Sharing information with residents.
3. Being open to input and suggestions from residents.
4. Working with residents to solve local problems.

Scale Statistics    Wave 3    Wave 6
n                   653       535
M                   2.36      2.40
SD                  .84       .83
Min                 1         1
Max                 4         4

2. Satisfaction with Neighborhood Police

A single item was used to measure overall satisfaction with policing at the neighborhood level, regardless of whether the respondent reported any contact with the police in the past year. The item properties are shown in the table below. Higher scores indicate greater overall satisfaction with the police officers who serve the neighborhood.

Satisfaction with Neighborhood Police: (4 = Very satisfied, 1 = Very dissatisfied)

Item:
1. In general, how satisfied are you with the police who serve your neighborhood? Are you …

Scale Statistics    Wave 6
n                   611
M                   3.06
SD                  .62
Min                 1
Max                 4
F. Experience-based Assessments of the Police

Since the bulk of police work involves some kind of community contact (responding to calls for service, traffic stops, order maintenance, community meetings), and since most of these interactions are not criminal in nature, police-citizen interactions are especially important to capture. Measuring constituents' perceptions of the policing process is central to capturing whether or not police are "doing justice" (Alpert & Moore, 1993), and arguably, citizens want the police to be fair and equitable when they are meting out justice. Additionally, Tyler's (1990) work indicates that procedural justice, the perception of fair treatment, is related to satisfaction regardless of whether or not citizens perceive that the police have solved the problem in question.

Interactions with community residents, whether voluntary and citizen-initiated (e.g., calls for service) or involuntary and police-initiated (e.g., traffic stops, arrests, or requests to change behavior), are at the heart of police work. Research suggests that positive police contact can reduce fear and improve public attitudes about the police (Pate et al., 1989), but a larger body of studies indicates that negative police encounters have a much greater impact on perceptions of the police than positive interactions (Skogan, 2006). For our purposes, the important point is that these encounters are only examined occasionally via research surveys and are not measured systematically by police organizations or outside entities. Unless a police contact results in an arrest, ticket, or citizen complaint, no data are collected on these encounters. Given that citizens' trust of police hinges on citizen-police interactions, both vicarious and direct, and given that most police-citizen interactions do not result in "formal police action" (i.e., arrest), it seems imperative that we find a way to evaluate the quality of these contacts.
The police process measures were conceived under and influenced by the "customer service" model. The idea is that these measures would allow the customers (individuals who call the police, organized petitioners, or those who experience "obligation encounters") to evaluate the police service they received (Moore, 1999, 2002). Much as in the private sector, and increasingly in the public sector, customer satisfaction surveys are integral to evaluating and adapting operating procedures and to giving consumers a voice in their service delivery.

Procedural justice theory (Lind & Tyler, 1988; Tyler, 1990) provided a framework for developing measures of police-civilian encounters, as people's judgments about the police are based heavily on their sense of whether the process is fair. Research suggests that a process is more likely to be judged fair when the following elements are present (Skogan & Frydl, 2004, p. 304):

(1) demeanor: people are treated with dignity and respect;
(2) participation: people have a voice and are allowed to explain their situation;
(3) neutrality: the authority is seen as evenhanded and objective;
(4) trust: people trust the motives of the authority as serving their needs, concerns, and well-being.

Our experience-based assessment questions capture all or part of these dimensions. From a crime victim's perspective, these dimensions are also important, as too often victims of violence encounter non-supportive professionals, which can inhibit their psychological recovery (Ullman, 1996). Using restorative justice theory (Bazemore, 1998), one can argue that police should be judged by their ability to "restore crime victims" (Alpert & Moore, 1993).
This implies the need for police to be sensitive to the needs and concerns of crime victims when the incident is reported (Rosenbaum, 1987).

The experience-based assessment measures described below cover a wide range of direct and indirect encounters with the police. Direct encounters include calls to 311 and 911, domestic home visits, traffic stops, and crime incidents experienced as a victim or witness. Some are police-initiated, others civilian-initiated; some are close personal encounters, others observations from a distance (e.g., witnessing encounters in the neighborhood). Regardless, survey respondents were asked to report their overall satisfaction with their most recent encounter (described in the table below). More importantly, they were queried about the procedural justice and restorative justice aspects of these encounters. Only the traffic stop responses are reported here to illustrate the potential for measurement. In addition, we have developed new measures of emotional responses to police encounters. Other than an occasional item about fear of being stopped by the police, surveys have yet to capture the affective component of potential police encounters.

1. Assessments of Police Stops Index

A national survey in 2005 indicated that roughly one in five U.S. residents ages 16 or older (43.5 million people) have face-to-face contact with the police each year, and more than half of these contacts (56%) are traffic related (Durose, Smith, & Langan, 2007). Over the past decade, police stops have become a lightning rod for tensions between the police and minority communities. Complaints about racial profiling, as well as verbal and physical abuse, have been widespread. Hence, there is a pressing need to institutionalize the measurement of police conduct during traffic stops. A Police Stop Index was constructed to capture some key procedural elements of police stops as perceived by the person being stopped.
The questions (and what they measure) are listed in the following table in sequential order. The screening question asked, "In the past year, have you been stopped by a Chicago police officer when you were in a car, on a motorcycle, on a bike, or out walking?" We also asked, "Did this police stop occur in your neighborhood or somewhere else?"

Assessments of Police Stops

Items. During the most recent time you were stopped by Chicago police … (1=Yes; 0=No)
1. Did the police clearly explain why they stopped you? (trust/concern)
2. Did you feel that you were stopped for a good reason? (neutrality)
3. When they talked with you, did the police pay careful attention to what you had to say? (participation/voice)
4. Did the police clearly explain what action they would take? (trust/concern)

During this stop… (1=Yes; 0=No)
5. Did the Chicago police say anything that you thought was insulting, disrespectful, or rude? (demeanor)
6. Did any Chicago officer use any form of physical force against you, including pushing, grabbing, kicking, or hitting? (demeanor)

During this stop… (4=Very polite; 1=Very impolite)
7. Did you find the Chicago police polite or impolite? (demeanor)

During this stop… (4=Very fair; 1=Very unfair)
8. How fair were the Chicago police? (neutrality)

During this stop… (4=Very satisfied; 1=Very dissatisfied)
9. Overall, how satisfied were you with the way the Chicago police responded? (satisfaction)
Why were you dissatisfied with the way the police responded? (open-ended question)

Scale Statistics:
            n      M     SD    Min   Max
  Wave 3   111   6.02   3.03    1     9
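The report gives the items and the observed range of the Police Stop Index (1 to 9) but not its exact computation. One plausible scheme, offered purely as an assumption (the field names are illustrative, not the survey's internal coding), counts procedurally favorable responses across the nine items, reverse-coding the rudeness and force items and dichotomizing the three 4-point ratings at the scale midpoint:

```python
# Hypothetical scoring of the 9-item Police Stop Index: counts
# "procedurally favorable" responses, one point per item.
def police_stop_index(resp: dict) -> int:
    favorable = 0
    # Items 1-4: Yes (coded 1) is the favorable response
    for q in ("explained_stop", "good_reason", "listened", "explained_action"):
        favorable += resp[q]                      # already 0/1
    # Items 5-6: No (coded 0) is favorable, so reverse-code
    favorable += 1 - resp["rude"]
    favorable += 1 - resp["force"]
    # Items 7-9: dichotomize the 4-point ratings (4 = most favorable)
    for q in ("polite", "fair", "satisfied"):
        favorable += 1 if resp[q] >= 3 else 0
    return favorable

example = {"explained_stop": 1, "good_reason": 1, "listened": 0,
           "explained_action": 1, "rude": 0, "force": 0,
           "polite": 4, "fair": 3, "satisfied": 2}
print(police_stop_index(example))  # → 7 favorable responses out of 9
```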
2. Satisfaction with Police Contacts

The following items measure Chicagoans' overall satisfaction with diverse police encounters during the past year, ranging from residents' calls for police assistance to police-initiated vehicle stops. Satisfaction varies by type of encounter. Given the different sample sizes for each encounter, a composite satisfaction index was not computed.

Satisfaction with Police Contacts: (4=Very satisfied; 3=Somewhat satisfied; 2=Somewhat dissatisfied; 1=Very dissatisfied)

Item: How satisfied were you with the person who answered your 311 call?
  Wave 3: n=445, M=3.25, SD=.89, Min=1, Max=4

Item: How satisfied were you with the person who answered your 911 call?
  Wave 3: n=250, M=3.43, SD=.78, Min=1, Max=4

Item: [During this stop…] Overall, how satisfied were you with the way the Chicago police responded?
  Wave 3: n=110, M=2.87, SD=1.06, Min=1, Max=4

Item: [Concerning the incident… In the past year, have you had any in-person contact with a Chicago police officer because someone in your family had a problem, either children and/or adults?] Overall, how satisfied were you with the way the Chicago police responded?
  Wave 3: n=88, M=3.10, SD=.94, Min=1, Max=4

Item: [Concerning the incident… In the past year, have you had any in-person contact with a Chicago police officer because you were a victim of or witness to a crime?] Overall, how satisfied were you with the way the Chicago police responded?
  Wave 3: n=115, M=3.12, SD=.96, Min=1, Max=4

Item: [Concerning the incident… In the past year, have you had any in-person contact with a Chicago police officer because you were involved in a traffic accident or witnessed a traffic accident?] Overall, how satisfied were you with the way the Chicago police responded?
  Wave 3: n=72, M=3.29, SD=.94, Min=1, Max=4

Item: How satisfied were you with the handling of the complaint?
  Wave 3: n=22, M=2.05, SD=1.05, Min=1, Max=4

G. Performance at Public Meetings Index

Police officers today are expected to attend public events, organize and facilitate community meetings, give educational presentations, and engage in problem-solving tasks with other agencies, community organizations, and local residents. The Chicago police hold monthly beat meetings for each of the city's 280 police beats, as well as attend other community meetings.
The Performance at Public Meetings Index is a new 8-item scale that seeks to gauge public assessments of police officers' performance in these group settings. A wide range of performance dimensions is explored. The final index properties are shown in the table below. The internal reliability of the scale is high (alpha = .93). Higher scores indicate more positive police performance in public meetings.

Performance at Public Meetings Index: In the past year, have you had any in-person contact with a Chicago police officer because you attended a CAPS meeting or another community meeting? (1=Yes, CAPS; 2=Yes, other meeting; 3=No) How would you rate the performance of the Chicago police officers at the community meetings you have attended this past year? (1=Very good; 4=Poor; 5=DK)

Items:
1. Leadership skills
2. Communication skills
3. Problem solving skills
4. Openness to input from residents
5. Fairness to all residents
6. At the meeting, did you find the police… (1=Very helpful; 4=Not at all helpful)
7. When residents talked to the police at the meeting, were the police… (1=Very polite; 4=Very impolite)
8. Overall, how satisfied were you with the way the police acted at the meeting? (1=Very satisfied; 4=Very dissatisfied)

Scale Statistics:
            n      M     SD    Min    Max
  Wave 4   131   3.20   .58   1.38     4

H. Affective Response to Police Encounters

Researchers have largely overlooked residents' emotional or affective responses to police encounters. Contact with the police can produce a wide range of emotions, from being upset or angry to feeling reassured or comforted. Six items were tested, and factor analyses yielded the two separate dimensions described below. The first factor accounted for 52.4% and 57.5% of the variance at wave 4 and wave 6, respectively, while the second factor was predictably less explanatory (19.6% and 16.8% at waves 4 and 6). Importantly, this factor structure remained stable when tested across all four types of neighborhoods.
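The kind of two-dimensional structure described above can be illustrated with a principal-components check on synthetic data. The item names and loadings below are assumptions for illustration only, not the CIP data; the point is that eigenvalues of the inter-item correlation matrix reveal how much variance each component explains:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 700
negative = rng.normal(size=n)     # latent negative-affect factor
positive = rng.normal(size=n)     # latent positive-affect factor

# Six synthetic items: two load on the negative factor, three on the
# positive factor, and one loads weakly (all loadings are made up).
items = np.column_stack([
    negative + rng.normal(0, .5, n),       # "afraid"
    negative + rng.normal(0, .5, n),       # "uneasy"
    positive + rng.normal(0, .6, n),       # "relieved"
    positive + rng.normal(0, .6, n),       # "proud"
    positive + rng.normal(0, .6, n),       # "secure"
    .5 * negative + rng.normal(0, .8, n),  # weakly loading item
])

corr = np.corrcoef(items, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]   # descending
explained = eigvals / eigvals.sum()
print(explained[:2])  # the first two components dominate
```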
1. Anxiety Reaction Index

This 2-item index, reflecting the primary factor, captures a negative emotional response upon seeing a police officer and includes feeling afraid and uneasy. (A third item about feeling "angry" suppressed the internal consistency of the scale and was therefore dropped.) The index was internally reliable at wave 4 (alpha = .84) and wave 6 (alpha = .88). The Anxiety Reaction Index also demonstrated strong test-retest reliability, r = .69, p < .01. The final index properties are shown in the table below. Higher scores indicate that residents feel less anxious when seeing a Chicago police officer.

Anxiety Reaction Index: When you see a Chicago police officer, how often do you feel … (1=Always; 5=Never)

Items:
1. Afraid
2. Uneasy

Scale Statistics:
            n      M     SD    Min   Max
  Wave 4   711   4.11   .92     1     5
  Wave 6   618   4.07   .90     1     5

2. Secure Reaction Index

This 3-item index measures a positive emotional response to seeing a police officer, including feeling relieved, proud, and secure. The internal consistency of the index was strong at wave 4 (alpha = .81) and wave 6 (alpha = .81). The Secure Reaction Index also exhibited strong test-retest reliability, r = .73, p < .01. The final index properties are shown in the table below. Higher scores indicate that residents feel more secure or relieved when seeing a Chicago police officer.

Secure Reaction Index: When you see a Chicago police officer, how often do you feel … (5=Always; 1=Never)

Items:
1. Relieved
2. Proud
3. Secure

Scale Statistics:
            n      M     SD    Min   Max
  Wave 4   720   3.33   .94     1     5
  Wave 6   624   3.40   .88     1     5
I. Assessment of Organizational Outcomes

Not unlike individual officers, police organizations can be judged using both process and outcome indicators. Police organizations are often judged by the three E's: efficiency, effectiveness, and equity (Eck & Rosenbaum, 1994). Efficiency is not the primary focus of the present measurement system, but it is addressed in previously discussed measures of police reliability, response time, follow-through, and accessibility (variables captured in the Police Reliability Index). Here we have added a police visibility index as an organizational measure. Police visibility remains a concern to many communities and a primary organizational objective, so we have developed a composite measure of visible police activity as seen through the eyes of local residents.

On the issue of effectiveness, official crime statistics will certainly continue to be important for measuring the achievement of crime-fighting objectives. Similarly, we have constructed measures of residents' perceptions of the severity of crime and disorder, as well as their level of fear of crime (see the "Neighborhood Conditions" section below). Tracking these can be useful for monitoring changes in the environment and the effectiveness of new police programs. As police departments tailor solutions and strategies to neighborhood-specific crime issues, be it via problem solving or hot spot policing, measuring outcomes beyond the crime rate is crucial for understanding the full impact of any one strategy. We also sought to measure the overall social ecology of the neighborhood, such as informal social control and collective efficacy.
To the extent that organizational objectives include engaging and strengthening the community, reducing fears and concerns, reducing disorder, and improving the overall quality of life, regularly measuring these variables is a necessity. If police departments embrace community policing and problem-oriented policing, then they should measure their effectiveness at engaging the community, solving neighborhood problems, and preventing crime by surveying community residents. Whether or not community residents believe that police organizations are effective in these domains is an important question addressed with these new performance indicators.

Finally, the third E, equity, has become a dominant organizational performance indicator in the past decade. Equity includes the distribution of services (distributive justice) and equity in the treatment of customers (procedural justice). This project has measured both, but most attention is given to the equitable treatment of service recipients, regardless of their race, gender, religion, or other defining characteristics. Earlier, we discussed the measures that captured perceptions of police manners during police-civilian encounters. Here the focus is on street-level processes that have been the subject of considerable legal action and over which police organizations are expected to have more control, including racial profiling and police misconduct. It is important to emphasize that these measures are designed to capture the perspective of the community and not the viewpoint of the police or investigative bodies.

Perhaps the most important indicator of organizational performance is the community's overall faith in the institution of policing and confidence in the police organization's staff and structure. Organizational legitimacy, as conceived by community stakeholders, is an indispensable indicator of overall police performance.
Is the department transparent and accountable to its constituents? Does it share information, and is it responsive to citizen inquiries and complaints? Do residents feel the department is committed to the principles of problem solving? Is the department committed to principles of community policing, such as communication, cooperation, and collaboration? These elements of organizational legitimacy are important measurement dimensions for police because, in order for constituents to partner and cooperate with the police maximally and effectively, residents have to believe that the police department is a legitimate, professional entity with competent staff.

1. Police Visibility Index

One of the most consistent public expectations for the police, across diverse communities, is the demand for greater police visibility. Despite research evidence demonstrating that the visibility of randomized patrols is insufficient to deter crime, the public outcry for more police officers on the streets remains consistent. We should note that the demonstrated effectiveness of hot spots policing and directed patrol missions may be due, in part, to the visibility of the police units and the enforcement actions that occur with additional manpower. In any event, measuring public perceptions of police presence is critically important for external accountability and may be important in people's overall assessment of police performance. The Police Visibility Index was computed by summing the scores on 8 different types of police activity. The index properties are shown in the table below. A higher score indicates greater police visibility in the neighborhood.
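The summation just described is straightforward; a minimal sketch with made-up ratings (the response values are illustrative, not CIP data):

```python
import numpy as np

# Each row: one respondent's frequency ratings of the 8 visible police
# activities, coded 1=Never .. 5=Daily (values are made up).
responses = np.array([
    [5, 2, 1, 3, 2, 1, 1, 1],
    [4, 1, 2, 2, 3, 2, 2, 2],
])

# The Police Visibility Index is the sum of the 8 ratings per respondent,
# giving a possible range of 8 to 40.
visibility = responses.sum(axis=1)
print(visibility)  # → [16 18]
```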
Conceivably, this index could be used as an indicator of policing at the neighborhood level, but since deployment decisions are dictated by management, we decided to include it as an organizational measure of performance.

Police Visibility Index: In your neighborhood, how often do you see Chicago police officers engage in the following activities? (1=Never; 5=Daily)

Items:
1. Drive through on patrol
2. Walk or stand on foot patrol
3. Patrol the alley, checking garages or the backs of buildings
4. Chat or have friendly conversation with people
5. Make a traffic stop
6. Search and frisk someone
7. Break up a group of people
8. Arrest someone

Scale Statistics:
            n       M      SD    Min   Max
  Wave 3   640   20.19   6.05     8    38

2. Effectiveness in Preventing Crime Index

This 3-item index measures residents' assessments of the police department's effectiveness in preventing crime and disorder in their neighborhood. A range of items was analyzed, but three items provided the most parsimonious results, with a focus on creating a safe neighborhood for children.[18] The final scale was unidimensional across all neighborhoods, with a factor that explained 82.0% of the variance at wave 3 and 81.1% at wave 6. The index was internally reliable at waves 3 and 6, alphas = .89 and .88, respectively. The index also has good test-retest reliability, r = .70, p < .01. The final index properties are shown in the table below. Higher scores indicate greater perceived police effectiveness in preventing crime and keeping order within the neighborhood.

Effectiveness in Preventing Crime Index: Please rate how good a job you feel the Chicago police are doing in your neighborhood. (4=Very good job; 1=Poor job)

Items:
1. Preventing crime.
2. Keeping order on the streets and sidewalks.
3. Keeping children safe.

Scale Statistics:
            n      M     SD    Min   Max
  Wave 3   685   2.58   .79     1     4
  Wave 6   578   2.66   .73     1     4

3. Effectiveness in Solving Problems Index

This 2-item index captures residents' judgments about the police department's effectiveness in solving neighborhood problems and fighting crime. While problem solving and crime fighting are conceptually distinct outcomes, they are similar as "bottom line" results and, as these findings suggest, empirically related. The internal consistency of this index was stable across neighborhoods within wave 3 (alpha = .79; range = .67 to .81) and within wave 6 (alpha = .80; range = .75 to .81). The test-retest stability of this index was relatively strong, r = .71, p < .01. The final index properties are shown in the table below. Higher scores indicate a stronger belief in the effectiveness of Chicago police officers in solving problems and fighting crime.

Effectiveness in Solving Problems Index: Please indicate whether you agree or disagree with the following statements about Chicago police officers. (4=Strongly agree; 1=Strongly disagree)

Items:
1. Chicago police officers are effective at solving neighborhood problems.
2. Chicago police officers are effective at fighting crime.

Scale Statistics:
            n      M     SD    Min   Max
  Wave 3   672   2.73   .68     1     4
  Wave 6   585   2.75   .65     1     4

[18] Although other items retained membership in a single Effectiveness factor, such as reducing homicide and helping crime victims, they did not contribute to the internal consistency of this dimension.
4. Willingness to Partner with Police Index

The community's willingness to work with the police has never been more critical. The criminal justice system can only achieve justice when victims and witnesses are willing to cooperate in the identification and prosecution of suspects. Today, police detectives are unable to solve most homicides because of community fear, exacerbated by websites that post the pictures, names, and addresses of "snitches." Also, effective problem solving is not possible without the creation and maintenance of cooperative partnerships between the police and community stakeholders.

Three survey items were used to measure residents' willingness to participate with the police in the co-production of public safety. The items were measured at waves three months apart; a single factor explained 70% of the variance at wave 3 and 66% of the variance at wave 6. Scale reliability was good at wave 3 (alpha = .77) and wave 6 (alpha = .71). The test-retest reliability was moderately high (r = .52, n = 509, p < .001). The variable is coded so that higher values indicate a greater willingness to work with the police.

Willingness to Partner with the Police Index: Please indicate how likely you would be to: (4=Very likely; 1=Never)

Items:
1. Call the police to report a crime occurring in your neighborhood.
2. Help the police to find someone suspected of committing a crime by providing them with information.
3. Report dangerous or suspicious activities in your neighborhood.

Scale Statistics:
            n      M     SD    Min   Max
  Wave 3   732   4.68   .54     1     5
  Wave 6   627   4.71   .48     2     5

5. Engagement of the Community Index

Police organizations are expected to make their officers accessible to the public and to increase public awareness and knowledge about crime prevention. This 2-item index measures residents' perceptions of community engagement and outreach activities by the police. This is a limited scale, but the two items hang together well.
The internal consistency of this index was stable across neighborhoods within wave 3 (alpha = .82; range = .78 to .86) and within wave 6 (alpha = .81; range = .74 to .90). The index also exhibited reasonable test-retest reliability, r = .60, p < .01. The final index properties are shown in the table below. Higher scores indicate more positive perceptions of Chicago police involvement in community engagement activities.

Engagement of Community Index: How often do Chicago police officers act in the following manner? (5=Always; 1=Never)

Items:
1. Provide crime prevention tips to residents.
2. Make themselves available to talk to residents.

Scale Statistics:
            n      M     SD    Min   Max
  Wave 3   665   3.26   .94     1     5
  Wave 6   534   3.20   .93     1     5

6. Police Misconduct Index

For better or worse, police organizations and their leaders are ultimately judged not by the performance of their best officers, but rather by the misconduct of officers and the official response to their behavior. The problem with even the most innovative early warning systems (see Walker, 2005) is that they are reactive by nature and focus on severely delinquent individuals rather than seeking to improve the aggregate performance of officers assigned to particular units or geographic areas. Geo-based surveys and customer satisfaction audits of targeted police-civilian encounters have the potential to generate near-real-time data that can be used for management intervention, especially problem solving and training about "hot spots of misconduct." The option of intervening with individual officers remains available as well.
For the CIP project, we sought to demonstrate that web-based surveys can be used to measure misconduct through the eyes of the public, short of filing an official complaint against an individual officer. Numerous factors discourage civilians from filing such complaints, and therefore alternative measures of police performance would be beneficial. Our web survey sequence on misconduct began with the following screening question: "In the past year, have you had any contact with the police, or witnessed an encounter with the police, where you felt the officer(s) acted inappropriately?" If the response was affirmative (in this study, 14.8%), respondents were asked to report on the nature of the most recent incident, using categories familiar to the Office of Professional Standards, the agency assigned to investigate civilian complaints in Chicago.

The Police Misconduct Index measures the severity of the incident as reflected in the number of misconduct behaviors listed by the respondent. Although not shown here, our web survey also captured whether the incident was reported, how quickly, to whom, and the complainant's level of satisfaction with the way the complaint was handled. The latter question taps into procedural justice considerations. The Police Misconduct Index is only preliminary and could be expanded to include other types of delinquent behaviors. We recommend that specific behaviors be separated into different survey questions. We would also recommend that personal experience with misconduct be separated from observed incidents. The final index properties are shown in the table below. Higher scores indicate greater perceived severity of the most recent incident. By far, the most frequent types of misconduct listed by those who reported an incident were verbal abuse (54.9%), stopping people without sufficient cause (30.1%), and discrimination by race or other characteristics (27.4%).
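The severity score described above is simply a count of the misconduct categories a respondent checks; a minimal sketch (the category keys are illustrative labels, not the survey's internal coding):

```python
# The Police Misconduct Index counts how many misconduct categories a
# respondent checked for the most recent incident (0-8 possible; the
# observed maximum in the report was 7). Keys are illustrative.
CATEGORIES = ["excessive_force", "verbal_abuse", "misuse_of_power",
              "failure_to_address_crime", "failure_to_identify",
              "discrimination", "stops_without_cause", "other"]

def misconduct_severity(checked: set) -> int:
    return sum(1 for c in CATEGORIES if c in checked)

print(misconduct_severity({"verbal_abuse", "stops_without_cause"}))  # → 2
```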
Police Misconduct Index: What was the nature of the incident that you experienced or witnessed? (check all that apply)

Items:
1. Use of excessive force (officers were physically abusive or used weapons unjustifiably)
2. Verbal abuse (officers used profanity, made verbal threats, or were generally discourteous)
3. Misuse of police power (officers accepted bribes or forced residents to perform an illegal activity)
4. Failure to address a known crime
5. Failure to give name when asked or failure to wear nametag
6. Discrimination on the basis of race, gender, sexual orientation, class, or religion
7. Too often stopping people in the neighborhood without sufficient cause
8. Other [please specify]

Scale Statistics:
            n      M     SD    Min   Max
  Wave 3   113   1.80   1.29    0     7

7. Racial Profiling Index

The extent to which racial profiling is a problem is believed to vary by organization, and even within larger organizations, suggesting that leadership, supervision, and norms of behavior play some role. Hence, we consider community-based measures of racial profiling as indicators of organizational (rather than individual) performance. The Racial Profiling Index captures residents' beliefs about the frequency of racial profiling behaviors by police officers. The content validity of this 4-item index is strengthened by including a range of circumstances under which profiling behaviors might occur, from police stops to arrests. Factor analyses revealed that the index was unidimensional for the total sample and for each of the four neighborhood types. The internal consistency of the index was very high (alpha = .94 at wave 3 and .95 at wave 6), as was the test-retest reliability, r = .71, p < .01. The final index properties are shown in the table below.
Higher scores represent a stronger belief among residents that Chicago police officers use race when making decisions to stop, search, and arrest.

Police Racial Profiling: Please indicate how often you think that Chicago police officers consider race when deciding: (4 = All the time; 3 = Often; 2 = Not very often; 1 = Never)

Items
1. Which cars to stop for possible traffic violations.
2. Which people to stop and question on the street.
3. Which people to search.
4. Which people to arrest and take to jail.
5. How quickly they will respond to calls for help.

Scale Statistics
         n     M       SD     Min   Max
Wave 3   559   11.16   2.82   4     16
Wave 6   471   11.11   2.69   4     16

8. Organizational Legitimacy Index

This index seeks to capture the public’s general trust and confidence in the Chicago Police Department as an organization, reflecting the extent to which residents believe that the organization is under good leadership, is doing a good job overall, and holds its officers accountable for their actions. Five relevant items were included on wave 3 and then repeated on wave 6. Factor analyses of these five items yielded a single factor at both waves, accounting for 66.1% of the variance in the items at wave 3 and 69.5% at wave 6. This unidimensional factor structure was replicated across each of the racial/ethnic groups for both waves. The Organizational Legitimacy Index exhibited strong internal consistency, as demonstrated by the Cronbach alpha coefficients at wave 3 (alpha = .87) and wave 6 (alpha = .89). Furthermore, the Index was shown to have strong test-retest reliability between waves 3 and 6, r = .73 (p < .01).
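The internal consistency coefficients reported for these indices follow the standard Cronbach formula, α = (k/(k−1))(1 − Σσ²ᵢ/σ²ₜ). As a hypothetical illustration (the function and the toy 4-point responses below are our own sketch, not CIP data), the computation can be expressed as:

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a respondents-by-items table of scores.

    item_scores: list of rows, one per respondent; each row holds that
    respondent's answers to the k items (e.g., 1-4 Likert codes).
    """
    k = len(item_scores[0])
    # Variance of each item, computed across respondents.
    item_vars = [variance(col) for col in zip(*item_scores)]
    # Variance of the summed index score.
    total_var = variance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical 4-point responses from five respondents to five items:
rows = [
    [4, 4, 3, 4, 4],
    [3, 3, 3, 2, 3],
    [2, 2, 1, 2, 2],
    [4, 3, 4, 4, 3],
    [1, 2, 1, 1, 2],
]
alpha = cronbach_alpha(rows)
```

Because the toy respondents answer consistently across items, the resulting alpha is high, mirroring the pattern reported for the Organizational Legitimacy Index.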
In sum, the Organizational Legitimacy Index is unidimensional, has strong internal consistency, is stable across racial/ethnic groups, and is reliable over time. The final index properties are shown in the table below. Higher scores indicate higher perceived organizational legitimacy of the Chicago Police.

Organizational Legitimacy Index: Please indicate whether you agree or disagree with the following statements about the Chicago Police Department. (4 = Strongly agree; 1 = Strongly disagree)

Items
1. I have confidence the Chicago Police Department can do its job well.
2. I trust the leaders of the Chicago Police Department to make decisions that are good for everyone in the city.
3. People's basic rights are well protected by the Chicago Police Department.
4. Chicago police officers are held accountable and disciplined when they do something wrong.
5. When Chicagoans are upset with the police, there is usually someone they can talk to at the Chicago Police Department.

Scale Statistics
         n     M      SD    Min   Max
Wave 3   728   2.83   .61   1     4
Wave 6   621   2.77   .63   1     4

J. Measuring Individual and Collective Performance Indicators

Community crime prevention theory is built on the premise that community members play an integral role in maintaining social order and preventing criminal activity (Rosenbaum, 1988). Criminologists have made it clear that crime rates are influenced by a wide range of social factors outside the police function (Reiss, 1986; Reiss & Roth, 1993), and that public order is heavily influenced by informal social control processes within the community (Greenberg et al., 1985; Sampson & Raudenbush, 1997).
Hence, communities should be enlisted to help enforce informal social mores, take individual and collective action to prevent crime, provide information and resources to police, and work with the police to solve public safety problems. Building partnerships with the police has been identified as particularly important for the co-production of public safety (Cordner, 1997; Rosenbaum, 2002; Schuck & Rosenbaum, 2006). Building on this knowledge, community policing and problem-oriented policing theories emphasize the importance of community engagement and police-community partnerships as vehicles for solving neighborhood problems and maintaining a safe environment (Goldstein, 1990; Greene, 2000; Rosenbaum, 1994).

To test these ideas and hold both the community and police accountable for community change, we need to construct a new measurement system. This new system should regularly monitor the social ecology of urban neighborhoods and evaluate the "performance" of the community, individually and collectively. Knowing the levels of community social capital, crime prevention behaviors, and collective efficacy within small geographic areas can assist police and community leaders in determining the scope of resources and planning needed to achieve a measurable reduction in crime and disorder. Also, abrupt reductions or increases in citizen perceptions of crime problems and fears will help monitor “perceptual hot spots,” direct police resources, and evaluate police and/or community initiatives within particular communities.

The community component of the CIP web survey taps into several overarching variable domains:
• Neighborhood conditions: Fear of crime, social and physical disorder, and overall perceptions of neighborhood conditions
• Individual resident performance: Individual, household, and collective crime prevention knowledge and behaviors
• Community performance: Informal social control and collective efficacy

1.
Neighborhood Conditions

The social and physical conditions of a neighborhood are what define the quality of urban life. The presence of liquor stores, vacant lots, abandoned cars, garbage on the street, graffiti on the walls, and broken windows are physical conditions that, collectively, send a strong message about the level of safety in the neighborhood. Similarly, loud music, groups of youth hanging out, prostitution, panhandling, and public drinking are social conditions that define the interpersonal landscape and quality of life in a neighborhood. These conditions, whether signs of physical or social disorder, are dangerous because, as Skogan (1990) cogently argues, they undermine the neighborhood's capacity to exercise informal social control, enhance residents' fear of crime, contribute to more serious crime, and destabilize the housing market. Although researchers continue to debate whether disorder contributes directly to serious violent crime (Sampson & Raudenbush, 1999; Taylor, 2006), overall, there is consensus that it is an indicator of neighborhood decline that should have the attention of police, community leaders, and policy makers. Hence, reliable measurement of this construct is essential for managing the quality of neighborhood life.

Similarly, fear of crime and actual crime rates are widely used as indicators of community stability. When residents are afraid and when crime rates are high, the community's capacity to defend itself is undermined. For planners and policy makers, having baseline information on the perceptual and behavioral conditions that define each target neighborhood is critical.
Problem-oriented policing stresses the importance of identifying, defining, and solving these neighborhood problems and conditions (Goldstein, 1979; Goldstein, 1990). Community policing stresses the importance of addressing residents' perceptions of and reactions to crime and disorder (Rosenbaum, 1994; Skogan & Hartnett, 1997). The public's fears, concerns, and behavioral responses to their environment can make the neighborhood either more hospitable or more repellent to potential offenders and criminogenic conditions (Skogan, 1990). Hence, the measurement framework we have developed via the Chicago Internet Project assumes that planning and problem solving demand reliable estimates of community perceptions of disorder, perceptions of crime problems, fear of crime, and actual rates of victimization. Although our sample sizes were insufficient to generate reliable estimates of victimization, the survey items were constructed with this goal in mind. These CIP measures tap into the concerns and priorities of communities at a local level. Repeatedly measuring these concerns can alert police and communities to emerging “disorder hot spots” or “fear hot spots.” Essentially, crime forecasters in the future may be able to identify neighborhoods that are near the tipping point or about to enter a “cycle of decline.” Additionally, police can use these measures to evaluate police-community problem-solving efforts or targeted police missions. Overall, these survey measures capture the perceived physical and social conditions related to crime and quality of life. Presently, communities have no sensible way of assessing the impact of police or community interventions on neighborhood conditions.

Crime and disorder index. The importance of measuring the public's perceptions of disorder can be found in the “broken windows” theory of crime.
The central notion is that when a neighborhood is physically and socially disorganized, it is a breeding ground for crime because these conditions heighten fear, reduce natural crime prevention (e.g., guardianship), and therefore contribute to more serious crime (Felson, 2006; Skogan, 1990; Taylor, 2006; Sousa & Kelling, 2006; Wilson & Kelling, 1982). A crime and disorder index can be used not only to assess neighborhood conditions for planning purposes, but also as an outcome indicator to monitor the effectiveness of order maintenance strategies. Perceptions of disorder should change if police take action to ameliorate actual disorder problems (e.g., youth congregating or prostitution).

Community disorder has been studied carefully with both resident surveys and observations (Sampson & Raudenbush, 1999; Taylor, 1999). Disorder has been found to be important to community residents (Skogan, 1990; Skogan & Hartnett, 1997) and to be strongly associated with fear and other public safety constructs (Scheider, Rowell, & Bezdikian, 2003; Skogan, 1990; Warr, 2000). The scale items in this study are adapted from Skogan and Hartnett's (1997) physical and social disorder scale used in the annual evaluations of Chicago’s CAPS program. Most social and physical disorder problems are area-specific, and given this web-based survey methodology, we are able to track small geographic “units of disorder.” We have constructed a single index for crime and disorder, although the range of items covers social disorder, physical disorder, and crime, which could be treated as subscales.
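The scoring described here is a simple sum of 3-point ratings, optionally grouped into subscales. A minimal sketch (the item keys and groupings below are hypothetical, chosen only to illustrate the arithmetic, not the full 17-item instrument):

```python
# Hypothetical item groupings; the report treats all items as one index
# but notes they could be split into subscales.
PHYSICAL = ["garbage", "street_repairs", "lighting", "graffiti"]
SOCIAL = ["public_drinking", "loud_music", "youth_hanging_out"]
CRIME = ["drug_dealing", "break_ins", "gang_violence"]

def score_index(responses, items):
    """Sum 3-point ratings (3 = big, 2 = some, 1 = no problem) over items."""
    return sum(responses[item] for item in items)

# One hypothetical respondent's ratings:
respondent = {
    "garbage": 2, "street_repairs": 1, "lighting": 2, "graffiti": 3,
    "public_drinking": 1, "loud_music": 2, "youth_hanging_out": 2,
    "drug_dealing": 1, "break_ins": 2, "gang_violence": 1,
}

overall = score_index(respondent, PHYSICAL + SOCIAL + CRIME)  # full index
physical = score_index(respondent, PHYSICAL)                  # one subscale
```

With this design, the overall index and any subscale use the same scoring function, so subscale analyses require no changes to the data layout.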
The overall Crime and Disorder Index showed strong internal consistency at wave 1 (α = .88) and wave 5 (α = .90) and demonstrated very high test-retest reliability (r = .86, n = 323, p < .001).

Crime and Disorder Index: The following is a list of things that you may think are problems in your neighborhood. Please indicate whether you think each is a big problem, some problem, or no problem in your neighborhood. (3 = Big problem; 2 = Some problem; 1 = No problem)

Items
1. Garbage in the streets.
2. Poor street repairs.
3. Poor street lighting.
4. Graffiti — writing or painting on walls or buildings.
5. Public drinking.
6. Loud music and/or noise.
7. Illegally parked vehicles.
8. Abandoned houses and other empty buildings.
9. Dogs off leash or owners not picking up after them.
10. Groups of youth hanging out.
11. Speeding or drag racing.
12. Homeless people asking for money.
13. Cars being vandalized — things like windows or aerials being broken.
14. Drug dealing on the streets.
15. Prostitution.
16. People breaking into homes/garages to steal things.
17. Shootings and violence by gangs.

Scale Statistics
         n     M       SD     Min   Max
Wave 1   625   24.68   6.08   16    49
Wave 5   524   24.72   6.40   16    50

Fear of crime index. Fear is an important social factor with real consequences for individuals, communities, and cities. It is a defining feature of urban neighborhoods. As noted above, fear of crime can curtail residents’ activities by increasing apprehension about venturing into public spaces and reducing their interaction with neighbors, thus jeopardizing social control (Garofalo, 1981; Hartnagel, 1979; Moore & Poethig, 1999; Perkins & Taylor, 1996; Skogan, 1986).
Early research focused on the factors associated with fear, especially behavioral avoidance and other crime prevention measures, such as carrying a weapon or locking doors (DuBois, 1979; Lavrakas, 19xx; Rosenbaum & Heath, 1991?; Skogan & Maxfield, 1981). Considerable research has focused on fear of crime as a consequence of victimization (Skogan, 1987), particularly sexual assault recovery (Ferraro, 1996). Also, community crime prevention and community policing initiatives have been evaluated using fear of crime as a central outcome measure (Brown & Wycoff, 1987; Ditton, Khan, & Chadee, 2005; Eck & Spelman, 1987; Rosenbaum, 1987). Other research studies have focused on fear of crime as a social condition in its own right (Denkers & Winkel, 1998; Perkins & Taylor, 1996).

The Chicago Internet Project generated two kinds of fear measures. First, we sought to replicate the widely used item employed in national surveys to capture a general sense of fear when "alone outside in your neighborhood at night." Second, we sought to measure localized fear in particular neighborhood settings. We explored fear levels in various settings, ranging from public transportation to local parks, both during the daytime and at night. These types of questions about anticipatory fear of victimization under particular circumstances have been utilized in prior research (Denkers & Winkel, 1998). All of our measures focus on settings within the neighborhood, and therefore allow for the construction of fear hot spots within small geographic areas. A 10-item Fear of Crime Index was computed at two waves. The internal consistency of the Index was high at both waves (wave 2 alpha = .914; wave 6 alpha = .920). The Fear Index also showed strong test-retest reliability, r = .77, p < .001, n = 464.

Fear of Crime Index: How safe do you feel or would you feel being alone in the following locations at night? (1 = Very safe; 2 = Somewhat safe; 3 = Somewhat unsafe; 4 = Very unsafe)

Items
1. Walking around my neighborhood.
2. In your lobby or stairway.
3. In local parks.
4. Walking to/from transportation.
5. On public buses or trains.

How safe do you feel or would you feel being alone in the following locations during the daytime? (1 = Very safe; 2 = Somewhat safe; 3 = Somewhat unsafe; 4 = Very unsafe)

Items
6. Walking around my neighborhood.
7. In your lobby or stairway.
8. In local parks.
9. Walking to/from transportation.
10. On public buses or trains.

Scale Statistics
         n     M       SD     Min   Max
Wave 2   739   17.10   5.32   7     40
Wave 6   604   17.05   5.34   8     40

Victimization index. As we know from the National Crime Victimization Survey, survey methods provide an excellent opportunity to generate knowledge about the nature of crime and victimization that cannot be captured through official police reports. Geo-based surveys have the additional benefit of being able to produce information about local crime and victimization patterns, data which can be used both for community planning and for evaluating localized public safety initiatives. In the CIP initiative, victimization questions were asked at only one point in time, but still gave us an opportunity to explore the feasibility of web-based measurement in this domain. The victimization items used a six-month reference period to minimize problems of memory decay and telescoping (Skogan & Lehnen, 1985) and to provide more opportunity to evaluate short-term programs. The content validity was reasonably good, as the instrument captured victimization experiences with residential burglary, theft and criminal damage to property, completed and attempted robbery, and completed and attempted assault.
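Because the victimization items are "check all that apply" screeners, the index score is simply a count of the victimization types a household reports. A minimal sketch (the field names below are hypothetical shorthand for the eight screening items, not the survey's actual variable names):

```python
# Hypothetical keys standing in for the eight victimization screeners.
ITEMS = [
    "break_in", "attempted_break_in", "theft_from_vehicle",
    "theft_outside", "robbery", "attempted_robbery",
    "assault", "threatened_assault",
]

def victimization_score(response):
    """Count of victimization types checked (0-8); higher = more victimization."""
    return sum(1 for item in ITEMS if response.get(item, False))

# A household reporting two types of victimization scores 2:
score = victimization_score({"theft_from_vehicle": True, "assault": True})
```

Using `dict.get` with a default of `False` treats unchecked (and unanswered) items as not experienced, which matches the check-all-that-apply format.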
When victimization was indicated, the victims were queried about two important conditions: Did the incident happen in the victim's current neighborhood (in order to establish local crime rates)? And was it reported to the police? Crime reporting behavior is an important measure of public trust in the police and of the perceived importance of the incident, and will likely vary by neighborhood. The sample size was not sufficient to compute separate victimization indices. An overall Victimization Index was computed with 8 items. (For future applications, we recommend that the index exclude victimization incidents that occurred outside the neighborhood. A Crime Reporting Index can also be developed.) The final index properties are shown in the table below. Higher scores indicate more victimization experience.

Victimization: In the past 6 months, have you or members of your household experienced the following… (check all that apply)

Items
1. Has someone broken into your home or garage to steal something?
2. Have you found any sign that someone tried to break into your home or garage?
3. Has anyone stolen, damaged, or taken something from your car or truck?
4. Have you had anything stolen that you left outside, including motorcycles or bicycles?
5. Has anyone stolen something directly from you by force, or after threatening you with harm?
6. Has anyone tried to steal something from you by force or threat, even though they did not get it?
7. Has anyone physically attacked you?
8. Has anyone threatened to physically attack you?

Scale Statistics
         n     M     SD    Min   Max
Wave 2   778   .49   .85   0     5

2. Individual Resident Performance

Knowledge about crime prevention and staying safe indices.
Since the introduction of the national crime prevention media campaign in the late 1970s (better known as the McGruff campaign), there have been many efforts to educate the public about possible crime and drug prevention behaviors (O'Keefe et al., 1996) and many academic statements about the need for local residents to become more actively involved in community crime prevention (Lab, 1988; Rosenbaum, 1988; Surette, 1992). The assumption is that residents' awareness and knowledge of crime prevention are the first steps on the road to preventive behaviors, such as self-protection, household protection, and neighborhood problem solving, as well as to enhanced perceptions of individual and collective efficacy (Rosenbaum, 1986).

For the CIP project, knowledge about crime prevention emerged as a multidimensional construct consisting of one dimension tapping individuals’ knowledge about keeping themselves and their property safe and one dimension tapping individuals’ general knowledge about crime prevention and crime in their neighborhood. The items were coded on a four-point scale, with higher values indicating greater knowledge. The two factors accounted for 64% of the variance at wave 2 and 64% of the variance at wave 6. Reliability was high for both the general measure (wave 2 α = .80; wave 6 α = .84) and the staying safe measure (wave 2 α = .79; wave 6 α = .78). The items were measured four months apart, and the re-test reliability was high for both the general measure (r = .53, n = 495, p < .001) and the staying safe measure (r = .60, n = 487, p < .001). All items should be generalizable to other communities and cities, with the exception of item #2 in the Knowledge about Staying Safe index, which may be relevant only to Chicago.

Knowledge about Crime Prevention: Please indicate whether you agree or disagree with the following statements about safety. (4 = Strongly agree; 1 = Strongly disagree)

Items
1. I know the things I need to do to stay safe when I’m out on the streets.
2. I know the things I need to do to keep my home and property safe from crime.

Scale Statistics
         n     M      SD    Min   Max
Wave 2   776   3.36   .51   1     4
Wave 6   627   3.39   .52   1     4

Knowledge about Staying Safe: Please indicate whether you agree or disagree with the following statements about safety. (4 = Strongly agree; 1 = Strongly disagree)

Items
1. I know how to work with the police to solve crime problems in my neighborhood.
2. I know when beat community meetings take place in my neighborhood.
3. I know how to contact the police for non-emergency problems.
4. I know where to find information about crime prevention.
5. I know where to find information about crime in my neighborhood.

Scale Statistics
         n     M      SD    Min   Max
Wave 2   773   2.79   .69   1     4
Wave 6   623   2.84   .67   1     4

Knowledge about specific prevention concepts index. Communities that are serious about measuring their own performance in the public safety arena will need some baseline information on local residents' knowledge of specific crime prevention theories, concepts, and local programs. For community leaders and organizers, as well as neighborhood police officers, this information will help to identify police beats or other neighborhoods where remediation is most needed. The Knowledge about Specific Prevention Concepts Index was designed to measure residents’ knowledge not only of important concepts and theories in crime prevention (e.g., CPTED, the SARA model, routine activities) but also of local crime prevention initiatives.
In terms of the latter, Chicago residents should be familiar with the Chicago Police Department's community policing program (CAPS) and the crime mapping program that is available to the public (ICAM). These local items should not be used in other cities. The six items were scored on a four-point response scale and summed to create the final measure. The reliability was acceptable at both wave 1 (α = .61) and wave 5 (α = .71). The re-test reliability was high (r = .64, n = 466, p < .001).

Knowledge about Specific Crime Prevention Concepts Index: You may or may not be familiar with the following terms or concepts in public safety. Please indicate whether or not these terms are familiar to you. (4 = Very familiar; 1 = Not at all familiar)

Items
1. CAPS
2. The Crime Triangle
3. CPTED
4. SARA Model
5. ICAM
6. Broken Windows Theory

Scale Statistics
         n     M       SD     Min   Max
Wave 1   757   9.61    2.46   5     20
Wave 5   663   10.08   2.96   4     24

Protection behaviors index. Criminologists have established, as routine activities theory suggests (Cohen & Felson, 1979), that an individual’s daily activities are predictive of criminal victimization (Maxfield, 1987). Patterns of travel, work, affiliation, and recreation can affect one’s chances of falling victim to crime. Similarly, crime prevention theories suggest that victimization will be reduced when actions are taken to reduce the opportunities to commit the crime – either by reducing access to vulnerable persons, places, or things, or by changing the environment to increase the likelihood that potential offenders will be detected or apprehended (Clarke, 1992; Greenburg & Williams, 1987; Rosenbaum, 1988). Hence, law enforcement agencies and community leaders have sought to educate the public about specific behavioral responses they can take to protect themselves, their property, and public spaces from crime.
The protective behaviors scale measures the frequency with which individuals take actions to protect themselves, their loved ones, or their property. The five-item scale accounted for 45% of the variance at wave 1 and 52% at wave 5. The index had high internal consistency (wave 1 α = .68; wave 5 α = .76). The scale was measured four months apart, and the test-retest reliability was very high (r = .81, n = 466, p < .001). The variable is coded so that higher values indicate a greater frequency of engagement in safety measures. Future research should expand this set of items to include more indicators of protective behaviors in public places and crime prevention measures applied to property outside the household (see Lavrakas et al., 1980).

Protective Behaviors Index: How often do you take any of the following actions in your neighborhood to protect your home, yourself, or loved ones? (4 = Always; 3 = Frequently; 2 = Sometimes; 1 = Never)

Items
1. Keep a look out for suspicious activities.
2. Ask a neighbor to watch your home when you’re away.
3. Leave the radio or TV on when you go out at night.
4. Carry mace or pepper spray.
5. Limit the amount of jewelry you wear or amount of money you carry on the street.

Scale Statistics
         n     M       SD     Min   Max
Wave 1   755   12.92   3.59   5     20
Wave 5   664   13.40   3.88   5     20

Formal collective action. Community crime prevention is often conceived as a combination of individual, household, and collective actions. Crime prevention experts have warned that individual crime prevention measures involving risk avoidance (e.g., not using streets or parks) or household measures that create a fortress with locks, fences, and cameras may work for the individual, but can increase the risk of public street crimes.
Hence, since the 1970s, police have played a critical role in initiating, orchestrating, and encouraging public-minded collective strategies designed to prevent crime where residents share an interest in public safety. Neighborhood Watch is the prototype for collective action (Rosenbaum, 1987), but some cities hold regular meetings with the police to engage in local problem solving (Skogan & Hartnett, 1997). Police and community leaders can foster public safety by organizing residents, working with community partners to collectively define and address crime problems, and encouraging more positive social interactions. These collective processes are expected to strengthen informal social controls, reduce crime, and reduce fear of crime (Rosenbaum, 1988).

The collective action construct was multidimensional, with one dimension tapping into informal collective action and another dimension tapping into formal collective action. The two dimensions accounted for 77% of the variance at wave 1 and 79% of the variance at wave 5. There was high internal consistency for the informal collective action scale (wave 1 α = .73; wave 5 α = .81) and moderate internal consistency for the formal collective action scale (wave 1 α = .63; wave 5 α = .62). The test-retest reliability for informal collective action was r = .64 (n = 465, p < .001), and the test-retest reliability for formal collective action was r = .64 (n = 464, p < .001). The scale items were assessed approximately four months apart. Higher scores indicate more informal collective action (i.e., more talking with others about crime) and more formal collective action (i.e., more involvement in local neighborhood meetings).
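The test-retest coefficients reported throughout this section are Pearson correlations computed over the respondents who completed both waves, matched by respondent. A minimal sketch (the respondent IDs and index scores below are hypothetical):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical index scores keyed by respondent ID at two waves.
wave1 = {"r01": 4, "r02": 7, "r03": 2, "r04": 9, "r05": 5}
wave5 = {"r01": 5, "r02": 6, "r03": 3, "r04": 9, "r05": 4, "r06": 8}

# Correlate only respondents present at both waves (r06 drops out),
# which is why the reported n varies from index to index.
common = sorted(wave1.keys() & wave5.keys())
r = pearson_r([wave1[i] for i in common], [wave5[i] for i in common])
```

Matching on respondent ID before correlating mirrors the panel design: respondents who skip a wave simply fall out of that index's test-retest n.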
Informal Collective Action Index: In the past 6 months, how often have you done the following things: (1 = Never; 2 = Once or twice; 3 = About once a month; 4 = About once a week; 5 = More than once a week)

Items
1. Talked with your neighbors about crime issues.
2. Talked with your family or friends about crime.

Scale Statistics
         n     M      SD    Min   Max
Wave 1   756   2.35   .92   1     5
Wave 5   663   2.37   .98   1     5

Formal Collective Action Index: In the past 6 months, how often have you done the following things: (1 = Never; 2 = Once or twice; 3 = About once a month; 4 = About once a week; 5 = More than once a week)

Items
1. Attended a CAPS beat meeting.
2. Attended a community meeting in your neighborhood.

Scale Statistics
         n     M      SD    Min   Max
Wave 1   754   1.34   .53   1     5
Wave 5   663   1.30   .51   1     4

Self-efficacy about crime prevention index. Self-efficacy about Crime Prevention is a new measure developed specifically for the Chicago Internet Project. Self-efficacy is rooted in social cognition theory (Bandura, 1997) and has been used to help explain a wide range of behaviors, including academic and work-related performance (Bandura, 1993; Stajkovic & Luthans, 1998), the use of technology (Compeau & Higgins, 1995), and health and well-being (Lorig et al., 1989). Self-efficacy is the belief that people hold about their causal capabilities (Bandura, 1997). The research suggests that perceptions of self-efficacy shape several dimensions of behavior, including: (a) decisions about what behaviors to engage in, (b) the amount and persistence of effort in attempting a specific behavior, (c) the individual’s emotional response when carrying out the behavior, and (d) the actual achievement of the individual with respect to the behavior (Bandura, Adams, & Beyer, 1977; Wood & Bandura, 1989). According to Bandura (1997), self-efficacy is not a generalized concept, but rather is specific to the behavior being studied.
Self-efficacy about crime prevention refers to an individual's beliefs regarding his/her capabilities to secure and organize resources and execute a course of action that improves neighborhood safety. There are several aspects of self-efficacy when applied to crime prevention. First, the individual must perceive having the means necessary to achieve success, such as knowledge about crime and skills related to working with the police (e.g., problem-solving aptitude). Second, key aspects of self-efficacy are perceptions of the importance and seriousness of the problem, as well as the motivation or incentives to take action. Finally, and most importantly, self-efficacy about crime prevention includes the belief that one can carry out the desired actions and that participation in these behaviors will lead to positive results. These components of the self-efficacy construct are consistent with the health belief model, which has been used to explain public health and crime prevention behaviors (see O'Keefe et al., 1996).

Self-efficacy about crime prevention is a five-item index measuring an individual's perceived capacity to carry out effective crime prevention actions at the neighborhood level. A single factor accounted for 57% of the variance at wave 2 and 56% of the variance at wave 5. The internal consistency of the items was high at wave 2 (α = .80) and wave 5 (α = .79). The test-retest reliability was also high (r = .63, n = 662, p < .001). The variable is coded so that higher values indicate higher levels of self-efficacy regarding crime prevention.

Self-Efficacy Index: Please indicate whether you agree or disagree with the following statements about yourself.
(4 = Strongly agree; 1 = Strongly disagree)

Items
1. I can influence my neighbors to take action on important crime issues.
2. I can influence the police to take action on important crime issues.
3. I know I can make a difference in my neighborhood.
4. If I work with the police, my neighborhood will be a safer place to live.
5. If I work with other community members, my neighborhood will be a safer place to live.

Scale Statistics    Wave 2    Wave 5
n                   775       662
M                   3.67      3.64
SD                  .75       .74
Min                 1         1
Max                 5         5

3. Collective Performance

Informal social control index. Social control refers to community residents' efforts to regulate their own behavior and the behavior of visitors to the neighborhood in order to live in an area that is relatively free from the threat of crime (Bursik & Grasmick, 1988). Albert Hunter (1985) developed a three-level approach to understanding how social control operates in a community. The first level of social control, called private social control, describes the informal efforts of intimate primary groups in the community. For example, private social control includes the use of relationships among friends to shape an individual's behavior through positive reinforcement, such as social support or mutual esteem, and through negative reinforcement, such as criticism, banishment from the group, or even violence (Hunter, 1985; Black, 1989). The second level, parochial social control, refers to the efforts of broader local interpersonal networks such as churches, schools, local businesses, or voluntary organizations. These broader social networks have a vested interest in the well-being of the community and will exercise control through formal and informal interactions that establish norms about acceptable behavior in the group and by intervening to stop deviant behavior, among other ways.
The third and final level, public social control, describes residents' ability to acquire goods and services that are allocated by organizations and agencies outside the neighborhood. This includes the ability of local residents to leverage resources from both private and public organizations in an effort to maintain public order and keep residents safe. This would include the relationship between community residents and the police (Bursik & Grasmick, 1988). Our informal social control scale captures one form of parochial social control using specific survey items drawn from an established literature (Sampson, Raudenbush, & Earls, 1997). The test-retest reliability was .65 (n = 494, p < .001). Higher scores indicate greater informal social control.

Informal Social Control Index: For each of the following questions, please indicate how likely it is that your neighbors would do something if ... (5 = Very likely; 1 = Very unlikely; 3 = Don't know)

Items
1. Children were spray-painting graffiti on a local building.
2. Children were skipping school and hanging out on a street corner.
3. A fight broke out in front of your house and someone was being beaten.
4. The fire station closest to your home was threatened with budget cuts.

Scale Statistics    Wave 2    Wave 6
n                   777       626
M                   3.84      3.92
SD                  .94       .91
Min                 1         1
Max                 5         5

Collective efficacy index. Collective efficacy refers to the combination of social cohesion among neighbors and their willingness to intervene for the common good (Sampson, Raudenbush, & Earls, 1997). Collective efficacy is based on characteristics such as mutual trust, solidarity, and shared expectations among neighbors.
It also includes the element of active informal social control, where there is a perception that neighbors will intervene for the common good of the neighborhood. Research suggests that collective efficacy is strongly related to victimization and crime rates (Sampson et al., 1997; Morenoff, Sampson, & Raudenbush, 2001). The Collective Efficacy scale replicates the work of Sampson, Raudenbush, and Earls (1997). The scale items for collective efficacy were assessed at only one time point. Higher scores indicate stronger collective efficacy.

Collective Efficacy Index: For each of the following questions, please indicate how likely it is that your neighbors would do something if ... (5 = Very likely; 1 = Very unlikely; 3 = Don't know)

Items
1. Children were spray-painting graffiti on a local building.
2. Children were skipping school and hanging out on a street corner.
3. A fight broke out in front of your house and someone was being beaten.
4. The fire station closest to your home was threatened with budget cuts.

The next few questions are also about police in your neighborhood. For each statement, please indicate whether you agree or disagree. (4 = Strongly agree; 1 = Strongly disagree; 3 = Don't know)

5. People around here are willing to help their neighbors.
6. People in this neighborhood can be trusted.
7. People in this neighborhood do not share the same values (reverse coded).

Scale Statistics    Wave 2
n                   777
M                   3.82
SD                  .76
Min                 1
Max                 5

K. Further Validation of Scales

The scale validation process began with factor analysis and reliability analysis to confirm the unidimensionality and internal consistency of the scales.
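The internal-consistency figures reported throughout this chapter (Cronbach's alpha) can be computed directly from item-level data. The sketch below implements the standard formula, alpha = (k/(k-1)) * (1 - sum of item variances / variance of totals); the responses used in the example are invented, not the project's data.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    items: list of equal-length lists, one per scale item
    (each inner list holds one item's responses across respondents).
    """
    k = len(items)
    n = len(items[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(variance(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))
```

Two perfectly correlated items yield alpha = 1.0; as the items diverge, alpha falls, which is the property the wave 2 (α = .80) and wave 5 (α = .79) figures summarize.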
The indices were also examined for stability over time using test-retest reliability scores. Additional validity analyses were performed on selected scales to test their robustness. In particular, we used a multi-method approach to examine whether the web-based findings would correspond to results derived from other methods (telephone surveys and police statistics). We also used "known groups" validation techniques to assess whether the scales would behave in predictable ways, as dictated by prior research and theory.

1. Multi-Method Validation of Scales

For the community scales, we were able to compare three sets of data for the same 51 police beats: our web-based survey data from 2005, official police records from 2005, and telephone survey data collected in 2002. As shown in the table below, the correlations between the telephone and Internet findings, using data collected three years apart with different random samples, are remarkably strong (ranging from .552 to .790). These findings suggest that the Internet can be used to capture valid impressions of neighborhood conditions in relatively small geographic areas. The data in this table are also useful for assessing construct validity. The research literature indicates that neighborhood disorder is linked to fear of crime and informal social control, and indeed, the web survey findings confirm these relationships.
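The multi-method check described above reduces to aggregating each method's responses to the beat level and correlating the beat means. A minimal sketch follows; the function names and the beat data in the test are hypothetical, not the study's.

```python
def beat_means(responses):
    """responses: list of (beat_id, value) pairs -> dict of beat -> mean."""
    totals = {}
    for beat, value in responses:
        s, n = totals.get(beat, (0.0, 0))
        totals[beat] = (s + value, n + 1)
    return {beat: s / n for beat, (s, n) in totals.items()}

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def multi_method_r(web, phone):
    """Correlate beat-level means from two methods over shared beats."""
    beats = sorted(set(web) & set(phone))
    return pearson_r([web[b] for b in beats], [phone[b] for b in beats])
```

The same routine applies whether the second method is a telephone survey or (logged) official crime counts per beat.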
Stated differently, our disorder measure behaves in a predictable manner at the neighborhood level, with higher levels of disorder associated with higher levels of fear and less informal social control (see Table 5.1).

Table 5.1 A Comparison of Telephone and Internet Data

                                     Northwestern University Telephone Survey Data
University of Illinois at            Fear       Disorder    Informal Social Control
Chicago Internet Survey
  Fear                               .667**     .642**      -.600**
  Disorder                           .696**     .790**      -.521**
  Informal Social Control            -.566**    -.634**     .552**

The table below (Table 5.2) compares the findings from the Internet and telephone surveys with official police records for the 51 police beats. Again, the correlations between data collected from three very different methods are consistently positive and almost always statistically significant. Neighborhoods (police beats) with higher levels of violent crime, illegal drugs, weapons, and disorder (as defined by the Chicago police) are places where web-survey respondents report significantly higher levels of fear, victimization, and disorder. The telephone survey findings are also consistent with the police data, but the Internet survey findings (especially for fear of crime) are more highly correlated with the police findings. This difference may be the result of a time lag, as the Internet data were collected during the same time period as the police data, while the telephone data were collected three years earlier.

Table 5.2 A Comparison of Official and Internet Data

                               Official Chicago Police Department Crime Data (logged)
                               Crime     Violent   Robbery   Homicide  Drug      Weapons   Disorder
UIC Internet Survey
  Fear                         .489**    .702**    .342*     .541**    .476**    .694**    .331*
  Victimization                .186      .463**    .422**    .522**    .353*     .485**    .427**
  Disorder                     .276      .725**    .394**    .698**    .652**    .770**    .477**
  Disorder                     .216      .680**    .620**    .345*     .354*     .284*     .265
Northwestern Telephone Survey
  Fear                         .292*     .482**    .400**    .490**    .405**    .351**    .308*
  Disorder                     .188      .519**    .514**    .586**    .563**    .371**    .233

2.
Known Groups Validation of Scales

Additional analyses were performed on some of the new policing scales to further validate the constructs. Methodologists often employ "known groups" validation procedures to demonstrate that a particular measure is able to discriminate between groups that are known, on the basis of prior research and/or theory, to have different scores on the test variable. Race is one variable that has previously been shown to predict citizen perceptions and judgments about the police. In particular, minorities consistently report, on telephone surveys, more negative attitudes toward, and lower satisfaction with, the police (Skogan, 2006; Rosenbaum & Schuck, 2005; Weitzer, xxxx). Hence, we performed a series of regression analyses to determine whether scores on web-based police performance scales could be predicted from the race/ethnicity of the respondents. The findings are consistent with prior research using telephone survey methods. As predicted, African Americans and Latinos reported more negative views of the police than whites on several performance dimensions (see Table 5.3). These findings suggest that our web-based indices of police performance are successful at capturing known group differences.

Table 5.3 HLM Linear Regression Estimates for the Impact of Residents' Race on Policing Constructs

                                              African American   Latino            Other             Residual
                                              (vs. White)        (vs. White)       (vs. White)       Variance
Dependent Variables                    n      Est.      SE       Est.      SE      Est.      SE
General Assessments of Police
  Manners                              665    -.48***   .07      -.25      .13     -.21      .16      .47***
  Fairness                             654    -.55***   .08      -.37*     .14     -.22      .16      .56***
Competency Indices
  Knowledge                            636    -.27***   .06      -.15      .10     -.22      .12      .26***
  Reliability                          713    -.45***   .06      -.33***   .11     -.50***   .13      .35***
  Responsiveness to the Community      637    -.49***   .08      -.62***   .15     -.52**    .18      .64***
Assessments of Neighborhood Police
  Satisfaction with Neighborhood Police 607   -.32***   .07      -.35**    .13     -.36**    .14      .31***
Organizational Outcomes
  Organizational Legitimacy            712    -.29***   .06      -.17      .11     -.07      .12      .34***
  Effectiveness in Problem Solving     656    -.40***   .07      -.27*     .12     -.25      .14      .41***
  Effectiveness in Preventing Crime    669    -.56***   .08      -.43**    .14     -.32      .16      .49***
  Engagement of the Community          654    -.47***   .09      -.58***   .16     -.44*     .20      .82***
Affective Responses to Police Encounters
  Security                             709    -.42***   .09      -.34*     .17     -.32      .19      .81***

*p≤.05 **p≤.01 ***p≤.001

L. Measurement Sensitivity

1. Within-Race Differences

In this report we have argued that one of the benefits of the web-based survey methodology is the ability to cost-effectively detect differences between small geographic areas. The 51 police beats in this study are examples of relatively small areas (arguably neighborhoods) where stable estimates of community perceptions and behaviors are possible. Although the sample sizes at the beat level are limited in the current study, we are nevertheless able to illustrate the potential benefits of this approach. Earlier we described differences by racial/ethnic groups in perceptions of the police.
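The known-groups logic can be sketched numerically: with dummy-coded race as the only predictor, regression estimates reduce to group-mean differences from the white reference group. The report's actual models are multilevel HLM estimates with additional structure; the simplified function and data below are illustrative only.

```python
def known_groups_estimates(scores_by_group, reference="White"):
    """Mean difference of each group from the reference group.

    Equivalent to the dummy-variable coefficients of a one-predictor OLS
    regression of scale scores on group membership.
    """
    ref = scores_by_group[reference]
    ref_mean = sum(ref) / len(ref)
    return {
        group: sum(vals) / len(vals) - ref_mean
        for group, vals in scores_by_group.items()
        if group != reference
    }
```

A negative estimate for a group indicates, as in Table 5.3, that its members rate the police lower on that scale than white respondents do.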
These types of findings, whether citywide or national, have contributed to the impression that race is the primary variable for explaining community evaluations of the police. African Americans, Latinos, and whites are thus viewed as homogeneous groups with very little within-group variability in their assessments of the police. The analyses that follow illustrate that differences exist within these groups when data are collected for smaller geographic areas. Although social class differences have been artificially restricted in these data (lower income police beats were excluded from the study), not all minority communities hold the same impressions of the police. The bivariate correlations in Table 5.4 show predictable differences across African American communities in their assessments of the police. African American neighborhoods with higher levels of disorder and fear of crime and lower levels of collective efficacy are significantly less satisfied with police performance on virtually all dimensions than African American neighborhoods where disorder and fear are under control and residents feel efficacious. High rates of violent crime showed less predictive power in African American neighborhoods. High violent crime rates predicted lower assessments of police effectiveness in fighting crime, but did not predict assessments of police manners and fairness. Only when police are working in African American neighborhoods with high levels of disorder are they subject to more negative evaluations on the demeanor and equity dimensions. The box plots below confirm some predictable differences in judgments of the police when comparing African American, Latino, white, and mixed neighborhoods. But these charts also illustrate that there is substantial variability within each racial/ethnic cluster.
The 18 predominantly African American neighborhoods, for example, are fairly divergent in their views of police manners, fairness, and effectiveness in problem solving, with some beats expressing more positive views of the police than those of white neighborhoods. For some indices, however, such as police effectiveness in preventing crime or police reliability, the medians are further apart and the standard deviations are smaller, producing little or no overlap in the distributions for African American and white neighborhoods.

Table 5.4 Bivariate Correlations for Residents from African American Communities

                                        Violent Crime Rate   Disorder        Fear            Collective Efficacy
Dependent Variables                     n      r             n      r        n      r        n      r
General Assessments of Police
  Manners                               189    -.13          109    -.37**   123    -.30**   135    .34**
  Fairness                              193    -.04          100    -.32**   129    -.16     130    .16
Competency Indices
  Knowledge                             186    -.07          96     -.28**   124    -.07     126    .20*
  Reliability                           203    -.05          104    -.41**   136    -.29**   139    .23**
  Responsiveness to the Community       183    -.13          95     -.35**   121    -.29**   123    .37**
Assessments of Neighborhood Police
  Satisfaction with Neighborhood Police 154    -.23**        104    -.38**   103    -.28**   113    .33**
Organizational Outcomes
  Organizational Legitimacy             202    -.09          104    -.33**   135    -.22*    139    .16
  Effectiveness in Problem Solving      189    -.18*         98     -.46**   128    -.29**   129    .31**
  Effectiveness in Preventing Crime     190    -.20**        99     -.49**   125    -.35**   128    .34**
  Engagement of the Community           187    -.14          108    -.29**   122    -.28**   131    .27**
Affective Responses to Police Encounters
  Security                              201    -.08          112    .00      132    .01      141    .18*
  Anxiety                               198    -.05          109    -.33**   130    -.24**   139    .18*

*p≤.05 **p≤.01

Figure 5.1 Box Plots for Police Manners and Fairness Scales

Figure 5.2 Box Plots for Police Problem Solving and Reliability Scales

Figure 5.3 Box Plots for Police Responsiveness Scales

2. Identifying Hot Spots

One implication of these findings is that cities can geographically identify not only hot spots of violent crime (as is conventionally done), but also hot spots of police-community tensions, fears, and other concerns. For example, web-based survey findings can be used to locate police beats, regardless of race/ethnicity, where police manners are rated as poor, where anxiety about police stops is high, and where residents are most dissatisfied with police services. These would be ideal locations for police officers to engage in problem-solving exercises with community leaders around these concerns.
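As an illustration of the hot-spot idea (not the project's actual procedure), one could flag the worst-scoring quartile of beats on any survey index. The function and beat identifiers below are hypothetical.

```python
def flag_hot_spots(beat_means, higher_is_worse=False):
    """Return the worst-scoring quartile of beats on a survey index.

    beat_means: dict of beat_id -> mean index score for that beat.
    higher_is_worse: True for indices like anxiety about police stops;
    False for indices like police manners, where low ratings are worse.
    """
    ranked = sorted(beat_means.items(), key=lambda kv: kv[1],
                    reverse=higher_is_worse)
    cutoff = max(1, len(ranked) // 4)
    return [beat for beat, _ in ranked[:cutoff]]
```

Running this over beat-level means of the manners, anxiety, and satisfaction indices would yield candidate beats for the problem-solving exercises described above.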
CHAPTER SIX
THE CAPS EXPERIMENT: FINDINGS AND LESSONS LEARNED

A. Implementation Results within the CAPS Framework

1. Feasibility Study

A preliminary study was conducted to explore the feasibility of using a web-based system to collect data from residents about public safety concerns and to provide monthly feedback sessions at CAPS beat meetings (see Skogan et al., 2005). Conducted in three beats from February to September 2004, the study consisted of components and objectives similar to those of the Chicago Internet Project (CIP): (1) residents attending CAPS meetings in the study beats were asked to go online each month and complete a survey on various public safety issues; (2) survey results were presented to residents and police at their meetings; and (3) training was provided to police and civilian facilitators on problem solving. Given the very limited sample size and the exploratory nature of the study, few program effects were found, but the study was invaluable for allowing us to identify what worked and what problems we could expect to encounter if we were to implement such a project on a broader scale. The study demonstrated that residents would be willing to participate repeatedly in Internet surveys and experienced little difficulty doing so. The feasibility study also demonstrated that it would be possible to incorporate the presentation of survey results into the existing beat meeting framework, although problems with the meeting agenda would need to be addressed. Most obstacles that we identified were taken into consideration during the planning phases of the current project, as discussed below with regard to implementation of CIP. The major difference between the feasibility study of 2004 and the CIP was the University's role in project implementation. During the feasibility study, university researchers assumed full responsibility for all facets of implementation, from preparation of study materials (e.g.
handouts and survey results) to distribution of handouts and facilitation of the presentation and discussion of survey results. The same researchers attended the beat meetings each month and, while only three beats participated in the study, this nevertheless required a major commitment on the researchers' part that would have been difficult to sustain on a regular basis. While the researchers' presence appeared to be accepted by most participants, by the end of the study they retained the distinction of being "from the University." The presentation and discussion of survey results by expert facilitators could have prevented police and residents from feeling fully invested in the study. Given the demands of implementation in 50 to 60 beats and the need for participants to take ownership of the project, we decided that CIP would need to be adopted and internalized by the Chicago Police Department.

2. Implementation: Protocol and Integrity

Protocol. A protocol was developed jointly by UIC and the CPD that assigned primary responsibility for the administration of the project at CAPS meetings to beat team leaders (selected community residents) and to the community policing officers facilitating the meetings each month. To combat the perception of the project as a "university experiment," it was agreed that all directives and memos would be issued formally through the CAPS Project Office. At the onset of the project, District Commanders, CAPS managers, and beat team leaders were issued directives detailing the necessary tasks to be completed as part of the project (see Appendix D).
Because implementation problems with beat personnel were observed early on and turnover was a significant problem during the course of the project, these directives were re-issued twice to ensure that all relevant police personnel were informed about project objectives and tasks. UIC staff prepared and supplied the necessary handouts via email each month to both the appropriate beat personnel and the CAPS Project Office. The CAPS Project Office also provided beat personnel with faxed and hard copies of all handouts. UIC assumed initial responsibility for introducing the project to participants at their beat meetings. In subsequent months, administration of project tasks was assumed solely by meeting facilitators. Implementation integrity remained a concern throughout the duration of the project, and numerous attempts were made to secure cooperation through various avenues. In order to hold personnel in participating beats accountable for carrying out project objectives, the CPD monitored implementation levels and prepared formal audit reports detailing each beat's compliance with project tasks. These audit reports were distributed as memos from the Assistant Deputy Superintendent of the CAPS Project Office to District Commanders and CAPS managers after Waves 2 and 4. The CAPS Project Office also flagged low-compliance beats and sought cooperation from beat personnel through multiple informal contacts. During Wave 2, District Commanders and CAPS managers were sent a memo regarding survey participation by residents and the importance of full implementation, including the need to increase levels of participation in the online surveys. Similarly, UIC researchers attended a monthly CAPS lieutenants meeting to discuss the objectives of the project and the importance of making sure the necessary materials were distributed at meetings, as well as to introduce the possibility of the CPD using raffles to encourage participation by residents.
Full implementation of project tasks by all participating beats, however, was never achieved despite consistent efforts by the CAPS Project Office and UIC. Continuous efforts were made to simplify the process of receiving materials and to clarify the project objectives. On a positive note, implementation levels steadily increased over time.

Experimental design. As Table 6.1 shows, the 51 participating beats were randomly assigned to one of three experimental conditions receiving varying levels of treatment: control, feedback, and training. Originally, each condition had an equal number of beats, but we discovered early on that one of the beats assigned to the control group held a joint meeting with a beat in the training group. A decision was made that these two beats would be treated as a single beat within the training group. As noted in the methodology section, we were asking two primary questions: (1) Does receiving feedback on public safety issues affect police-resident discussion and problem solving at CAPS beat meetings? and (2) Would additional training and guidance have supplemental effects on discussion and problem solving at meetings? To this end, project tasks increased progressively across conditions as follows:

All beats: All participating beats followed the basic steps of implementation, which consisted of including the project on the printed agenda, distributing flyers about the Internet surveys, and encouraging residents to complete the monthly surveys. The sole purpose of these steps was to make residents aware of the project, provide them with the necessary information, and solicit their participation in completing surveys each month.
Feedback: In some beats, participants were also given feedback in the form of printed survey results and were then encouraged to discuss the findings during their CAPS meetings. Results were selected each month based on their perceived utility to police and residents, both for assisting in identifying and prioritizing local problems and for introducing new discussion topics.

Training: In some beats, in addition to the encouragement to participate in the surveys and the survey feedback, participants received two training components. The first was an all-day training for beat sergeants consisting of a problem-solving refresher, instruction on using survey results in problem solving, and an overview of the CPD's planned expansion of information technology use. The second was a monthly problem-solving exercise to guide discussion about selected survey results in order to gain a fuller understanding of certain problems, gather resident input for solutions, and otherwise educate residents.

Table 6.1 Implementation Protocol by Experimental Condition

Condition         Basic Steps               Feedback            Training                 Examples
Control (N=16)    Include on agenda;        --                  --                       Survey Flyer
                  Distribute flyers;                                                     (Appendix E)
                  Encourage participation
Feedback (N=17)   Include on agenda;        Distribute/discuss  --                       Survey Results
                  Distribute flyers;        survey results                               (Appendix F)
                  Encourage participation
Training (N=17)   Include on agenda;        Distribute/discuss  Training of sergeants;   Problem Solving
                  Distribute flyers;        survey results      Discuss problem-         Exercise
                  Encourage participation                       solving exercise         (Appendix G)

Basic steps. Beats in all three experimental conditions were to follow several fundamental steps meant to incorporate the project into their meetings' regular proceedings and encourage resident participation. Activity in control group beats was limited to these primary steps; survey results were collected from participants but were not made available to police or residents during the course of the project.
Residents in these control beats were aware only that they had been chosen to participate in a joint UIC-CPD project testing a new web-based survey system for gathering resident input.

Inclusion on meeting agenda. A CAPS-required component, agendas provide a basic framework for the identification and discussion of new problems at meetings. Because meetings typically last no more than an hour, placing the project on the agenda would ensure that time was given to introducing the project to residents who were not familiar with it and to encouraging their participation; additionally, it would provide time for beats receiving survey results to discuss them. Although required, printed agendas were made available at only 71% of the 266 meetings observed during the course of the project, with another 12% providing the agenda verbally to residents. At the 216 meetings where police were asked to include the project on their meeting agenda, 76% actually provided a printed agenda and another 8% offered the agenda verbally. Rates for the inclusion of CIP on the meeting agenda were exceedingly low, with CIP appearing on only 22% of printed agendas. There were no significant differences among the rates at which beats in the different experimental conditions provided printed agendas and included CIP on the agendas (χ² = 6.705, p > .05), although beats in the training group included CIP at a slightly higher rate (29%) than beats in either the control (24%) or feedback (14%) conditions.
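The chi-square comparisons reported here (and in Table 6.2) follow the standard Pearson statistic for a contingency table of condition by outcome. The sketch below computes the statistic from observed counts; the counts in the example are invented, not the project's observation data.

```python
def chi_square_stat(table):
    """Pearson chi-square statistic for a 2-D contingency table.

    table: list of rows of observed counts, e.g. one row per
    experimental condition and one column per outcome (yes/no).
    """
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat
```

The statistic is then compared against the chi-square distribution with (rows - 1) x (columns - 1) degrees of freedom to obtain the p-values reported in the text.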
Distribution of survey flyers: CPD personnel were provided each month with flyers containing instructions for accessing and completing the web survey, which they were told to distribute to meeting participants (see Appendix E for an example). This information included the basic objectives of the project, the website address, a password to access the survey, and contact information for the UIC research team. Officers were encouraged to pass out flyers directly to meeting attendees rather than simply place them on the table with other handouts, in order to draw residents' attention to the opportunity to complete the surveys. As Table 6.2 indicates, police did not fully comply with instructions to distribute project flyers to residents at meetings, although distribution occurred regularly in most beats and the distribution rate remained constant or increased across experimental conditions. Overall, police provided flyers at 80% of the 169 meetings observed during the project when requested to do so. Distribution of flyers was most problematic during Wave 2, the first point at which responsibility for carrying out this task was assumed solely by police personnel, with significantly lower rates for beats in the control and feedback conditions. Overall distribution rates were significantly higher for training beats. Distribution occurred more sporadically within control and feedback beats, gradually increasing in frequency for both conditions by Wave 5. Of the 27 beats in which flyers were provided consistently at all points of observation, 14 were training beats, 7 were feedback beats, and 6 were control beats. The most fundamental task for police personnel in fostering resident survey participation was flyer distribution; even so, residents were not offered the opportunity to complete the survey at 1 in 5 beat meetings. While anecdotal evidence demonstrates that police in a few beats did indeed pass out the materials (vs.
placing them on the table with other brochures), examination of handouts collected by observers indicated that CIP materials (the flyer and survey results) were frequently stapled to the general meeting “packet”, which usually included the meeting agenda and ICAM crime reports. While there were on average only seven separate handouts provided at any given meeting over the course of the project, some beats frequently had double or even triple that number of handouts. The same type of informational “packet” was repeatedly offered month after month. Beats that prepared meeting packets also brought other handouts to which they wished to draw special attention, such as crime alerts or announcements about community events, and would typically distinguish them from the usual handouts provided. Likewise, regular attendees at meetings appeared familiar with the practice of facilitators using the packet in relation to covering certain agenda points; facilitators directed residents to certain handouts (e.g. ICAM reports) while discussing them at the meetings. Participation may have increased had the flyer and survey results been treated as separate handouts and not stapled to the standard meeting packet.
Table 6.2 Project Flyer Distribution Rate by Experimental Condition

            Wave 2 (%)   Wave 4 (%)   Wave 5 (%)   Wave 2-6 (%)
Control     N = 14       N = 16       N = 25       N = 52
  Yes       50.0         68.8         93.3         73.1
  No        50.0         31.2          6.7         26.9
Feedback    N = 15       N = 15       N = 14       N = 58
  Yes       73.3         66.7         85.7         75.4
  No        26.7         33.3         14.3         24.6
Training    N = 14       N = 14       N = 15       N = 59
  Yes       92.9         92.9         93.3         93.1
  No         7.1          7.1          6.7          6.9
X2          6.4*         3.3          3.3          8.07*
* p < .05

Encourage completion of surveys: In order to facilitate survey completion, interested residents were asked to supply their email addresses for a UIC-maintained list which provided monthly email notifications with links and passwords for the surveys. Residents had the opportunity to supply their addresses on questionnaires completed at the time the project was first introduced, as well as at the end of each web survey. Officers were specifically instructed to encourage resident participation in the web surveys and to discuss any resident concerns about accessing or using the Internet. During Wave 3, beat personnel were also given pens and magnets bearing the name of the project to hand out to residents as encouragement from the CPD for participation. When police made survey flyers available to residents, they also tended to offer some form of encouragement for resident participation. The extent of encouragement varied among beats, ranging from simple reminders to go online and complete a survey to explanations of the benefits of resident participation. The latter was most commonly described in terms of its utility to police for understanding resident concerns, increasing resident participation in CAPS, and generally improving CAPS as a program.
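The per-wave chi-square statistics in Table 6.2 can be recovered from the reported Ns and percentages. Below is a minimal sketch in Python for the Wave 2 comparison, assuming the yes/no cell counts are back-calculated from the table (Control 7/7, Feedback 11/4, Training 13/1); the helper name pearson_chi_square is illustrative, not from the report.

```python
def pearson_chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table
    given as a list of rows of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of row and column factors
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (observed - expected) ** 2 / expected
    return chi2

# Wave 2 flyer distribution, rows = Control, Feedback, Training; cols = [yes, no]
wave2 = [[7, 7], [11, 4], [13, 1]]
chi2 = pearson_chi_square(wave2)
print(round(chi2, 1))  # 6.4, matching the reported X2 = 6.4
# With df = (3-1)*(2-1) = 2, the .05 critical value is 5.991,
# so the Wave 2 difference among conditions is significant.
```

The same routine applied to the other waves reproduces the remaining statistics in the table, up to rounding of the back-calculated counts.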
As an added inducement, the CPD reached an agreement with a non-profit agency, Computers for Schools, to supply refurbished laptop computers to be raffled off in the last four months of the project among residents who completed a survey, printed out the survey submission page, and brought the page to the next meeting. Officers were asked to announce the raffle and collect survey submission pages. Information about the raffle was also added to the survey flyers for residents. As with distributing flyers, police did not fully comply with the request to announce the raffle, doing so at 63.7% of the 102 observed meetings; yet announcement rates increased from Wave 4 to Wave 5 by at least 30% in each experimental condition. There were no significant differences among the rates at which beats in the different conditions made the raffle announcements (X2=4.771, p>.05); however, the training beats had the highest rate (76.9%) compared with feedback beats (56.3%) and control beats (54.8%). Providing survey results. The residents in the 34 beats in the feedback and training conditions also received selected results from the web surveys completed by residents (both CAPS participants and the randomly selected panel) from their respective beats. Survey results were emailed with project flyers to beat personnel and the CAPS Project Office; beat personnel were directed to provide paper copies of the results at the meeting and facilitate discussions with residents about the findings.
Survey results were also posted on the project website; as with email notifications about the availability of surveys, residents on the UIC-maintained listserv were notified when survey results became available online. Results were typically available to each beat during the week prior to the scheduled beat meeting to ensure residents had time to receive the email and view results online if they wished. For every survey wave, a subset of items was selected for public dissemination among CAPS participants. The selection was based on two factors: (1) perceived utility to police and residents for assisting in the identification and prioritization of local problems; and (2) the introduction of new areas for deliberation, including resident fear of crime and perceptions of the realities of police work. Likewise, the number of results made available was contingent not only upon the need to process results from multiple beats in a timely manner, but also upon the limited time that would be available during beat meetings for discussion. For this reason, monthly results usually consisted of 10-12 items, with related items grouped together to form 4-5 tables (see Appendix F for an example). The content of results selected from each survey is as follows:
Wave One: Items included feelings of resident safety, the most serious local problems as identified by residents, and high-priority activities deserving public resources (e.g. after-school programs for youth, neighborhood watch programs).
Wave Two: Items included frequency of individual safety behaviors (e.g. locking doors, asking neighbors to watch the home), residents’ feelings of efficacy regarding their ability to solve neighborhood problems, and level of resident engagement in community safety activities (e.g. attending beat meetings, speaking with neighbors about crime issues).
Wave Three: Items included willingness to engage in crime reporting, attitudes towards the CPD and CPD officers, and satisfaction with 911 services.
Wave Four: Items included residents’ beliefs/stereotypes about the nature of police work, potential areas for improvement of CPD services, and visibility of CPD officers.
Wave Five: Items included resident engagement in community-level safety activities, knowledge about public safety strategies (e.g. CPTED, the Crime Triangle), and residents’ feelings of efficacy regarding their ability to leverage resources to address crime issues.
As with the previous steps of the protocol, difficulties were encountered in having police both provide and discuss survey results at the meetings. The pattern of provision and discussion of survey results is shown in Tables 6.3 and 6.4. While full compliance for both groups was never achieved, implementation levels increased between Waves 2 and 6, with greater and more consistent performance by police in the training group. Of the 17 feedback beats, only 3 consistently made results available, with a single beat failing to provide results at any point of observation, while 7 of the feedback/training beats consistently made results available to residents and all other training beats provided results on at least one occasion. Ultimately, making the survey results available was the key indicator as to whether survey results would then be discussed with residents. When printed results were provided at meetings, survey results were discussed at 86% of those meetings; there were only 8 occasions when printed results were not provided yet police still discussed results with residents.
Table 6.3 Availability of Survey Results in Feedback and Training Groups

            Wave 2 (%)   Wave 4 (%)   Wave 5 (%)   Wave 2-6 (%)
Feedback    N = 15       N = 15       N = 14       N = 58
  Yes       40.0         73.3         78.6         60.3
  No        60.0         26.7         21.4         39.7
Training    N = 14       N = 14       N = 15       N = 59
  Yes       71.4         71.4         80.0         74.6
  No        28.6         28.6         20.0         25.4
X2          2.9          .01          .00          2.7

Table 6.4 Discussion of Survey Results in Feedback and Training Groups

                                  Wave 2 (%)   Wave 4 (%)   Wave 5 (%)   Wave 2-6 (%)
Feedback                          N = 15       N = 15       N = 14       N = 58
  Results discussed               13.3         26.7         57.1         31.0
  Residents told to read results  13.3          6.6          6.6          8.6
  Results not discussed           73.4         66.7         36.3         60.4
Training                          N = 14       N = 14       N = 15       N = 59
  Results discussed               85.7         50.0         46.7         64.4
  Residents told to read results   0.0         14.3         13.3          6.8
  Results not discussed           14.3         35.7         40.0         28.8
X2                                15.5***      2.8          .43          13.3*
* p < .05, ** p < .01, *** p < .001

Discussing survey results. The discussion of survey results was the most critical component of the study, and it increased over the course of the project, especially in the feedback beats. As Table 6.4 shows, police in all but 2 of these beats failed to discuss survey results in the first month of implementation, but the rates for discussion increased to over half of the beats by Wave 5. Police in feedback/training beats, however, demonstrated the opposite pattern for holding discussions, starting off at a high rate of participation in Wave 2 that declined by Wave 5. Yet it should be noted that of the 12 beats observed in the feedback/training group during Wave 6, 75% discussed the survey results during their meetings.
Despite these conflicting patterns, police from beats in the feedback/training group discussed survey results with significantly greater consistency than police in the feedback group. Of the 8 beats in which police discussed results at all points of observation, 7 were from the feedback/training group; conversely, of the 7 beats in which police failed to discuss survey results at any point of observation, 6 were from the feedback group. Ultimately, survey results were discussed at 48% (56) of the 117 observed meetings; of those meetings where discussions occurred, 67.3% were in feedback/training beats as opposed to just 32.7% in the feedback beats. What was the nature of the discussions that took place regarding survey results? Given that results were intended to foster police-resident communication and increase problem solving, what was the quality of the ensuing discussions between police and residents? Given that the police facilitator role was central, the manner in which police solicited resident input about the survey results stands as the key behavior for encouraging resident participation in the overall discussion. For meetings at which discussion of survey results occurred, only 8 instances (14.5% of meetings) were recorded in which police did not actively seek resident participation in the discussion. Police most commonly encouraged residents to join in the discussion by first providing their own feedback regarding the survey results (56.4%) or asking residents whether they had any questions about the survey results (50.9%); less frequently (34.5%), they requested that residents supply their own feedback about results. Police in feedback/training beats encouraged residents to participate in discussions more than police in feedback beats, using multiple forms of encouragement during 37.8% of these meetings versus 27.8% for feedback-only beats (X2=.542, p>.05).
UIC observers were instructed to determine whether police or residents seemed to dominate discussions, defined as controlling or otherwise talking the most, in relation to both the overall discussions taking place during meetings and specific discussions about survey results. Despite encouraging resident input, police tended to dominate discussions about survey results (85.5%) more frequently than they dominated general discussion during meetings (45.6%). Residents, in contrast, dominated or contributed equally in discussions about survey results at only 14.5% of the meetings, well below the rate at which they actively participated in discussions during meetings (55.4%). The nature of discussions about survey results was categorized according to the inclusion of specific problem-solving components on the part of police and residents: causes of problems, proposal of solutions, and agreed courses of action. Discussions most frequently included the first component, with the exploration of the causes and nature of problems as identified through the surveys occurring during 46.3% of all discussions about survey results. Solutions to address problems were proposed with slightly less frequency (40.7% of discussions) and occurred during almost 60% of discussions that also covered the nature of problems. Police tended to propose more solutions, doing so during 35.2% of discussions, while residents did so in only 18.5%. Discussions rarely, however, led to arranging a definite course of action for either police or residents to address the identified problems, a component which occurred in only 7.4% of discussions. While no significant differences were found between the rates at which problem-solving components were included in discussions held in feedback/training beats and discussions in feedback beats, feedback/training beats still exhibited greater rates of inclusion of these components.
Exploring the nature of problems occurred in 52.8% of discussions in feedback/training beats versus 33.3% of discussions in feedback beats (X2=1.83, p>.05), proposing solutions occurred in 44.4% of feedback/training beats vs. 33.3% of feedback beats (X2=.614, p>.05), and reaching an agreed course of action occurred in 11.1% of feedback/training beats vs. 0% in feedback beat discussions (X2=2.16, p>.05). The quality of discussions varied, as did many other aspects of project implementation. Often discussions were little more than police reading the survey results to residents directly from the printed sheet, asking if there were questions, and then moving on to the next point of the agenda. In these instances, survey results were treated in the same manner as crime reports, where statistics about arrest and crime rates are read and residents are then provided an opportunity to ask questions. Arguably, some police perceived survey results as simply another set of information to be shared with residents and felt the objectives of the project had been met by making the information available, particularly if they considered information sharing (in lieu of genuine dialogue or problem solving) the primary outcome to be achieved at meetings. At the other end of the spectrum, some discussions about survey results were treated in a fashion quite similar to that exhibited when discussing problems identified by residents during meetings, exploring both the causes and nature of the survey findings and possible solutions to address the issue at hand.
This was seen more often when survey findings focused on resident identification of serious problems and priority activities, possibly because of the similarity to problems residents routinely bring to the meetings. In one beat, for example, burglary had been identified as the most serious problem in the survey findings and police prepared a short presentation about preventative measures that residents could take to guard against burglary. In another, residents identified fixing potholes as a high-priority activity; the police responded by passing out forms about the location of potholes for residents to fill out during the meeting, which would be delivered to the appropriate city service agency. Ultimately, discussion of survey results lasted on average 5-10 minutes. On the surface, this would appear an insufficient amount of time for genuine discussion about results or engaging in problem-solving behaviors. Yet it is important to understand the amount of time devoted to discussing survey results within the context of how the overall time is apportioned during the beat meeting itself. Meetings typically last no more than one hour, and their structure is dictated by the standard CAPS agenda, which inevitably restricts the amount of time that can be devoted to any single topic. The fact that discussions about survey results lasted 5-10 minutes possibly reflects this reality; indeed, more than once an observer noted that the time spent on survey results was as much as or more than that afforded to any other single topic covered during the meeting. Training components. In addition to receiving selected survey results, the 17 feedback/training beats also received two separate training components. As noted earlier, the first component was an all-day training session for beat sergeants run jointly by UIC and the CAPS Project Office prior to the start of the project.
This training consisted of a problem-solving refresher and instruction on the use and interpretation of survey results in problem solving. The session not only allowed sergeants an opportunity to ask questions and express concerns about project objectives and their own expected roles, but also to become familiar with additional plans by the CPD to expand police-community communication through technology. Given that police in these beats typically outperformed those in other beats in terms of both implementation levels and quality, this would suggest that the training session served to make sergeants feel more invested in the project and better understand the larger goals of the CPD to further incorporate information technology into the day-to-day operations of the organization, thus positioning the project as more legitimate. The second training component was the Wave 4 introduction of exercises to guide discussion pertaining to selected survey results. These exercises were intended to enhance discussions by providing officers with questions that sought to (1) provide a fuller understanding of citizen concerns and the nature of the problem as related to the particular survey findings; (2) engage citizens more fully in the problem-solving process by seeking their input as to potential solutions; and (3) allow officers the opportunity to educate citizens regarding the police function and crime prevention activities.
Topics selected as the focus of the three problem-solving exercises included (1) citizen perceptions of how fair and impartial Chicago police officers were; (2) common misperceptions citizens have regarding the police function; and (3) resident fear of crime within their neighborhoods (see Appendix G for an example). To ensure compliance, officers were required to use a form to summarize the points discussed during the exercise and return the completed form to the CAPS Project Office. As with other tasks, full police compliance was not achieved, but implementation rates steadily increased with each subsequent wave. Based on 41 observations, the rate of participation in the exercises increased from 35.7% in Wave 4 to 58.3% by Wave 6. When police did use the exercise to guide their discussions, however, observers tended to record more robust discussions regarding the survey results that stimulated somewhat more problem solving on the part of participants. Indeed, more problem solving was demonstrated during discussions where the exercises were used than when they were not. Examination of the nature of problems occurred in 52.9% of discussions where exercises were used compared to 40% of discussions where exercises were not used (X2=.259, p>.05). Solutions to problems were proposed during 52.9% of discussions using the exercises, but in only 20% of discussions without exercises (X2=1.69, p>.05). Observers at times noted somewhat more participation by residents during discussions using exercises, no doubt attributable to the fact that questions specifically intended to elicit resident input were built into the exercises. The forms on which police summarized their discussions during the exercises are a less reliable, but nonetheless valuable, source of information.
While some are terse or state that residents did not have questions or did not respond to encouragement by police to contribute to the discussion, others suggest the depth of discussions that occurred. This is particularly evident with the final exercise, in which the discussion centered on not only identifying locations around the beat where residents felt unsafe, but exploring the causes of this fear and possible solutions to reduce fear of crime. Responses from several beats demonstrated a more in-depth analysis of causes and solutions than is typical at beat meetings. This suggests a certain value in providing police with further guidance for discussion and examination of beat problems, particularly those problems that police feel they can do little about, such as resident fear of crime.
3. Police Attitudes about Participation in Project
Internet communication. Police personnel, including beat sergeants, patrol officers, and community policing officers, who regularly attended and/or facilitated beat meetings were interviewed at length regarding their attitudes about the project and the nature of their participation. Of the 48 individuals interviewed, roughly 80% expressed at least partial support for the general idea of citizens using the Internet as a means to communicate public safety concerns to the police, while 12% did not support the idea in any way and another 8% took a neutral stance. Support was largely based on the premise of the Internet as a viable alternative for information sharing by residents whether or not they attended beat meetings.
Police perceived the anonymity of Internet communication as the primary benefit to residents; as one patrol officer stated, it would provide residents with a “good opportunity to express feelings without being in a public forum.” Police acknowledged that some residents were leery of speaking out when they attended beat meetings just as other residents failed to attend meetings due to “a fear of reprisal”. The implied source of this fear was typically the criminal element within a community, but it was recognized that police were also a source of “intimidation” for some residents. As one community policing officer said, “they want to talk to us, but they don’t want to talk to us.” To this end, it was felt residents would be “more prone to speak out on the Internet” and “more open and honest than at a meeting.” Those who supported the use of the Internet presented two sets of beliefs regarding its role within the CAPS context. One belief was that the Internet could be an alternative for attending meetings and some even felt residents would be more likely to participate in CAPS if given this opportunity, e.g. “You’ll get a greater response using the Internet. Most people in this area have the Internet and they would use it rather than come to the meetings.” This belief extended to reaching individuals who had never attended meetings either because they were not interested or were unable to do so because of other conflicts or problems. The other belief was that the Internet was acceptable as an “additional tool along with meetings”, usually qualifying support with comments that indicated the beat meeting should remain the central source for police-resident interaction. This view was also shared by some individuals who did not support Internet use and formed the basis for their reasoning. 
The general thrust of this belief was summed up by one officer: “The interaction within the confines of beat meetings are much more useful than Internet surveys.” Police tended to like the idea of providing a new avenue for residents to share information, but considered it “a little one-sided” and not conducive to problem solving. One sergeant voiced the main concerns of this viewpoint: “You lose the personal touch between people and the officer’s ability to delve into the problem. Otherwise it’s just like 911: good for providing information. But that could be a problem for problem solving issues…without actual meetings, citizens would lose contact with other citizens and their views; they may have similar problems and it’s good for them to get other citizens’ perspectives.” These beliefs reflect the broader police views regarding the general function of beat meetings, with traditional values competing with more community-oriented values. For those officers who saw Internet communication as potentially interchangeable with meeting attendance, an implicit message was that meetings were primarily about residents sharing information with the police without any problem-solving pretense. An apt description of this view was provided by one individual who called the beat meeting “in-person 911”. While Skogan (2006) has documented the phenomenon of residents attending meetings to essentially present their problems with the expectation that police will fix them, the flip side has been somewhat less elucidated: police often do not consider problem solving a valid activity at beat meetings any more than residents do.
With this rationale, any venue allowing residents to air their concerns would then be deemed acceptable if the objective is simply to collect resident complaints and act on them in an isolated fashion. The Internet is certainly valid in this sense as “another way to give information.” On the other hand, those who positioned Internet communication as a supplement to beat meetings assigned a broader value to meetings, encompassing multiple objectives of a community policing paradigm: problem solving, face-to-face interaction between police and residents, and more involvement of residents in the process. Expressions of support for Internet communication were typically tempered by other concerns that were also shared by those who did not support the concept. Lack of access to computers or the Internet, limited computer literacy, age, and apathy were all cited as problematic for conducting such an initiative. Survey topics. Only 9 of the 48 police personnel interviewed had gone online to see the content of the Internet surveys; while some said they had looked at multiple surveys, others had only viewed the first survey. Of those who did look at the surveys, their feedback was largely positive. They appreciated the fact that a wide array of topics had been covered by the surveys, feeling it “gives us a broader view of the problems,” although some expressed the wish that surveys could have been more specific to the concerns of their beats. The officers were also asked to offer suggestions for the types of topics they would like included on surveys; their answers indicated a broader use of the Internet than just the use of surveys and fell into one of two general categories. The first and most frequently mentioned was some version of an online reporting system in which residents could report their concerns and keep police apprised of conditions and problems in the beat.
While this concept was linked with the basic idea of providing a forum for residents to share information with police, it was also sometimes connected with the problem-solving process and included following up to see what progress, if any, had been made by the police on previously reported problems. The second category concerned information both about and for residents. Along with collecting relevant demographic information about beat residents, police expressed interest in having data about their specific public safety behaviors (e.g. “Are you keeping your door locked?”) and their participation in CAPS (e.g. “How many attend functions sponsored by the CPD”). Some were interested in getting residents more invested in maintaining public safety and gathering information about what they “have done, can do”; as one officer said, “Place more responsibility of assisting law enforcement and problem solving on citizens. Ask the citizens what would you do? How would you solve the problem? What and how is a problem affecting you? Police officers cannot solve specific problems without targeted information.” There were also suggestions that sought to examine resident understanding of certain issues with the ultimate objective of supplying the “missing piece to make it work: education.” For example, one community policing officer referred to a high-profile incident that occurred during the project which spurred public debate in Chicago regarding the citizen role during traffic stops, stating residents “don’t know the proper protocol” and expressing the belief that residents should be tested regarding their knowledge of their rights. Others referred to what they perceived as a need for “better understanding of what a police officer’s job entails” and noted that residents grew upset over negative portrayals of police in the media or misperceptions of what actions could legally be taken to address local problems.
It was believed there was a need for police to “give direction to get answers” by supplying residents with information that would not only help them to understand the nature of police work and improve police-community relations (e.g. “giving an explanation shows respect”), but also provide residents with more resources to address problems within their communities. Content and discussion of results. Police were also asked about their reaction to the survey results, both their content and the utility of discussing them during their beat meetings. Of the police in the 34 beats that had received and were expected to discuss survey results, 79% reported doing so. Their general response to the content of the results was mixed and not easily categorized. A quarter of police responded positively to the results, usually because the findings were relatively positive, reflected the opinions expressed at the meetings, or the police themselves agreed with them. Roughly another quarter of the individuals discussed their reaction to a specific set of results from their beats that had made an impression on them, typically a negative one. Not surprisingly, in all but one instance the results pertained to resident dissatisfaction with various aspects of police services or performance, and it was clear that introducing these topics had touched a nerve with some of the participants.
Police questioned what residents were basing these assessments on, and some dismissed the findings altogether as illogical, lacking the specificity needed to devise solutions, or simply beyond their control because, as one sergeant put it, “I can’t affect that…either you love us or you hate us.” Such responses are indicative of the mindset among many officers, one that basically denies that it is the responsibility of the police to find out the causes of negative performance assessments and address them as problems to be solved. Yet another quarter of the responses expressed reservations about the accuracy of the findings they had received, largely because they considered it “hard to gauge” the value of the results when they had not been provided with the number of residents who had completed the surveys.

Police expressed similarly mixed feelings regarding whether discussing the survey results at their meetings had been useful to them. About a third reported finding the discussion of results useful, while a third did not find it useful and another third took a neutral position or were not sure. For those individuals who considered discussing the results useful, most felt that the primary value of discussion was bringing a broader perspective about the beat to their meetings. They saw the results as allowing police to “know things that they otherwise wouldn’t”, particularly from the “Internet people”, whom officers acknowledged as having different concerns than residents who attended meetings. One community policing officer felt discussing results achieved the central objective of beat meetings, that of focusing on beat-wide problems as opposed to individual-level concerns, something often lost in the current incarnation of CAPS: Discussing results “reinforces what is most important in meetings. It helps people to remain focused. Bringing in personal problems are not the correct issues.
Those issues that are general to the community are the appropriate ones.” Police who did not find discussing the results useful expressed multiple reasons for feeling this way. Some felt the “personal contact” they already maintained with residents was sufficient to apprise them of community concerns, although it was noted that the results might have value for officers “not involved in the process.” Others cited lack of resident participation in discussions; unable to discuss results in a meaningful fashion with residents, police did not consider discussions as having an “honest use”. For some, resident contributions to discussions were described as virtually non-existent: “there were seldom comments, mainly just blank stares,” sometimes because residents were seen as being too concerned with their personal problems. For others, residents failed to respond with feedback that would have provided police with additional information they felt was necessary for addressing the problems identified through survey results. The nature of the results was also considered by some as not being conducive to engaging in problem solving because either “we don’t know how to answer the complaints” or the information provided in the results was not deemed to reflect “actual problems.” The remaining responses can be classified as neutral or unsure. Based on the point at which police were interviewed during the project, some chose to withhold judgment about the utility of discussions until they had further exposure.
Others provided responses indicating they considered the project as simply another activity to be done, but did not necessarily think of it in negative terms: “it was just there.”

Police considered discussing results as being more clearly beneficial for residents, with over half expressing the view that discussions provided a forum for residents not only to receive insight into how other residents felt about certain issues but also to express their own opinions. Being exposed to the views of other residents not present at meetings was considered valuable, allowing residents to “revisit [their] judgment on an issue”, particularly if their perception diverged from the general consensus. While providing additional information of any type was considered constructive (e.g. “anything you give in terms of knowledge is useful”), some also felt providing this particular type of information was important for boosting resident participation because it sent messages to meeting attendees that other residents were interested in what was happening in the beat and that their opinions were valued. To a lesser degree, police felt residents also benefited from the opportunity to share their views about police-citizen relations and police services. For those individuals who did not believe citizens benefited from discussing survey results, the nature of participation at meetings was cited as the primary reason. Again, residents were described as failing to contribute to discussions. This was attributed at times to a lack of interest; one community policing officer bluntly stated, “We don’t get much feedback from the people…we’re not going to pull teeth!” At other times, it was linked to the preoccupation with other problems on the part of meeting regulars that was identified by many as an obstacle to resident participation and illustrated by one beat sergeant: “The core group was less than enthusiastic. They want to talk about their own problems, not other people’s problems.”

4. Obstacles to Implementation

In considering the low quality of implementation in many of the beats, there were clearly obstacles that prevented achieving full implementation of project objectives. Given the well-documented problems associated with introducing change to police organizations, some obstacles were identified in advance and attempts were made to counteract them within the project design, while other obstacles could not have been foreseen and still others were tied to trends prevalent in the CPD during the project. Police and civilian facilitators in the study beats also provided their opinions about the obstacles encountered; undoubtedly based on their individual experiences with the CPD and CAPS, their observations nevertheless dovetailed with the researchers’ general observations and offered valuable insight into the difficulties associated with initiating new projects. It should be noted that some police and residents declined to comment, typically because they had not experienced any problems with implementation in their beats. Some indicated they did not wish to speculate about what had occurred in other beats. Others made a point of stating that materials had been received, passed out, and discussed as required and did not understand why it would be otherwise; as one officer stated, if it’s “your responsibility to do the job, there’s no excuse.” The obstacles to implementation as identified by police and civilian facilitators are shown in Table 6.5.
Table 6.5
Obstacles to Implementation as Identified by Civilian and Police Facilitators (N=68)

Obstacle                            %
Organizational communication        32.4
Difficulty receiving materials      19.1
Individual commitment/resistance    17.7
Nature of beat meetings             13.2
Heavy workload                      10.3
Reassignment of personnel            7.4
Lack of awareness                    7.4

Organizational communication. As Table 6.5 indicates, the obstacle most frequently cited by participants was communication problems within the CPD. Police referred to this variously as “bureaucracy”, a “breakdown in the organization”, “red tape”, and “a communication glitch” that interfered with personnel receiving necessary information about the project and therefore implementing its tasks. For some, this was discussed as a problem that occurred specifically in relation to the project and manifested itself as difficulties in receiving project materials. Police and civilian facilitators alike related instances where they never received materials, although this occurred chiefly during the first wave of implementation, with participants noting it as a one-time occurrence and observing that material distribution had since been “running smoother.” Indeed, UIC researchers experienced problems during the first wave when results needed to be made available to beats, often leaving the CAPS Project Office with little time to ensure relevant personnel received materials. Because this problem was encountered only during the first weeks, UIC immediately reevaluated and streamlined operations to correct it. However, others described ongoing problems of receiving materials in an untimely manner, such as not seeing results until an hour prior to meetings. More problematic, some police complained of confusion about project objectives and expectations despite repeated efforts to guarantee all parties involved were informed about the project, either through contact with the CAPS Project Office or exposure to directives detailing project tasks.
There were also approximately nine documented occasions when beats received materials, either survey results or problem solving exercises, when they were not in an experimental condition that was to receive them. This only happened in districts where multiple beats were taking part in the project. Such occurrences suggest more systemic issues with communication typical of large bureaucratic organizations; “Dissemination of information by CPD is a problem…we get information haphazardly,” one beat sergeant confirmed.

Given that CAPS beat meetings were the vehicle for the project, the CAPS Project Office represented the logical choice for coordination and dissemination of project information. UIC worked closely with the CAPS Project Office, holding several joint planning sessions aimed at preventing such communication problems. Information was provided to beat personnel through formal memos sent by the Assistant Deputy Superintendent who oversaw the CAPS Project Office, following the established protocol of sending memos to the Chief of Patrol and directing attention to relevant personnel, typically District Commanders, CAPS Lieutenants, Community Policing Sergeants, and Beat Sergeants. In all, nine memos regarding the project were sent out. Not only were project materials (flyers, results, and problem solving exercises) sent to the CAPS Lieutenants, Community Policing Sergeants, and Beat Sergeants by email each month, but the CAPS Project Office faxed and mailed hard copies as well. More informal, ongoing communication with personnel related to the study beats was also maintained by members of the CAPS Project Office.
This inclusive approach was considered sufficient to ensure that information reached the correct parties given the variations across both districts and beats in day-to-day operations and, more importantly, in the facilitation of beat meetings. Despite all efforts, problems existed not only with the channels of communication as they exist within the CPD, but also with the researchers’ approach to disseminating information. When considering levels of implementation and the observations provided by police, communication lines clearly operated more effectively in some police districts than in others. Beyond the communication problems common in large bureaucracies, the CPD is distinctive for having a well-structured community policing program built into its existing framework. However, the chains of command for CAPS and patrol units run parallel to one another in a manner that was not always conducive to passing along project information. The CAPS Project Office made diligent attempts to find “go to” individuals for each beat with moderate success, but relationships forged with CAPS Lieutenants or Community Policing Sergeants did not necessarily translate into a clear flow of information to Beat Sergeants, who report to patrol supervisors. Emails were sent to individuals considered relevant for facilitating beat meetings and therefore most in need of receiving materials in a timely fashion, but not all of these individuals maintained email through the department. For example, about 20% of the beat sergeants did not have email accounts and could not be reached in this manner. While some beat sergeants spoke highly of their district’s CAPS office and its efficiency at distributing project materials, we suspect other sergeants had less than productive working relationships with their district’s CAPS office, which disrupted communication lines, particularly for beats where the CAPS Project Office had not established a relationship with beat sergeants.
Communication breakdowns might also have been accompanied by “information overload”, with individuals responsible for bringing materials to beat meetings either being required to bring so many other handouts, or already handling so much paperwork in the course of their day-to-day work, that project materials essentially got lost in the shuffle. While study beats provided an average of only seven handouts per meeting, an average of 22% of beats provided between 10 and 20 handouts and another 4% provided over 20 handouts each month. For beats with routinely high numbers of handouts, the addition of even the 3-4 pages of project materials may have been easily overlooked when materials were collected for meetings. Misplacement of materials received by the appropriate personnel seems to have occurred in some instances, although it should be noted that neither the CAPS Project Office nor UIC received requests for materials to be resent during the project.

In the end, the problems of organizational communication cannot be considered in isolation; they are certainly related to other obstacles encountered, such as officer resistance to the project, the reassignment of personnel during the project and lack of continuity at meetings, and the precarious position of CAPS within the CPD, and ultimately may have been a manifestation of all of these issues.

Individual commitment and resistance. After problems with communication, the obstacle most identified by police and civilian facilitators was commitment and resistance to the project.
Some expressed a general belief that there were individuals within the CPD who “just don’t care”, “lay down on the job”, or were simply “lazy” and had not given sufficient attention to the project. Within these responses, lack of commitment to project objectives was mostly alluded to rather than directly mentioned, although a few stated this concept quite plainly. One beat sergeant suggested the failure of some individuals to carry out project tasks was because “they think it’s just more crap and ‘I gotta hand out this damn thing.’” Another beat sergeant summed it up by saying “You can’t rely on people who aren’t as committed.” Others, however, believed some individuals were actively resistant to the project, primarily due to a reluctance to embrace new programs and fear of criticism. Civilians more frequently felt that fear of negative feedback had kept police from both promoting resident participation in the project and discussing survey results; one civilian facilitator stated, “Police didn’t pass out the flyers because they don’t want people to fill out the survey. They don’t want to hear the bad things that people have to say about them.” This view was supported by a beat sergeant who said, “Maybe we are like anyone else, we can be afraid of criticism, of being the subject of criticism…we might know we’re wrong.” However, the issue is not as straightforward as merely having failed to achieve officer buy-in. Apart from the other obstacles enumerated that contributed to poor implementation, there is also the impact that the UIC observers’ presence had on the attitude police took towards the project. By design, observers were kept blind to the experimental conditions of beats in order to prevent them from making judgments regarding the presence or absence of certain project tasks (e.g. discussion of results).
To this end, they were instructed not to interfere with the running of meetings, especially if police had not made survey results available or it appeared they were not going to discuss them. Beyond introducing the project and administering questionnaires, they were not expected to speak at meetings, and it was understood that the police were to assume sole responsibility for administration of and explanations about the project. The only exception pertained to the survey flyers; as we did not want residents to be denied the opportunity to participate in completing Internet surveys, observers were allowed to provide the flyer information at meetings when police failed to do so. Even when observers did not have to provide this information, their presence at meetings was known to police. Police in some beats even relied on observers to explain the project to residents and answer questions, even 3 to 4 months into the project, and clearly considered it a University project. During the interviews, some police expressed frustration with observers because they were reluctant or unable to supply information. Several police referred to observers as “UIC representatives” and one even complained that sergeants were forced to answer residents’ questions and expand on the project because observers did not know as much as they should. Arguably, the presence of the observers at meetings was misconstrued by some police, and they did not assume the responsibility they should have for implementation because of this.

Gauging the true level of commitment or resistance that police had for the project is difficult based solely on implementation.
Certainly there were beats where police were stronger advocates of the project, and observations support this. However, it must also be considered that demands for accountability were clearly stronger in some districts than in others. The idea of formal accountability for implementation was not built into the original project design; rather, it arose after poor levels of compliance were witnessed in the first month of police administration of project tasks. District commanders were not approached regarding possible strategies for combating non-compliance and, therefore, it was left to the discretion of each commander as to the appropriate response, a decision no doubt commensurate with the value each assigned to the project. It is impossible to determine whether accountability demands within the district or individual commitment to the project ultimately determined implementation; it can only be stated that both probably played roles. Given that implementation levels steadily increased throughout the project, there is reason to expect that some officer resistance could have been overcome. This was also an opinion shared by some police: it was simply a matter of giving adequate time for the project to become integrated into the beat meeting process and accepted by both police and residents.

Nature of beat meetings. The chosen vehicle for the project, the CAPS beat meeting, was also considered by some to present an obstacle to implementation, particularly the time available to cover issues. Many noted meetings are typically just an hour long with numerous concerns to be addressed in that time; as one beat officer stated, “There are so many pressing issues at beat meetings that it is hard to get everything out”. Observations during this project and past observations tend both to support and to refute this contention. There is no question this is true for some beats, especially those with serious crime issues and high attendance rates.
Just as one sergeant noted, police in such beats tend to proceed through the agenda at a brisk pace out of necessity, ensuring all points are covered and ample time is left for residents to present concerns. It was for this reason researchers requested that the project be placed on the written agenda, to guarantee that it would be given time on an otherwise very full agenda. Yet it would hardly be accurate to state that this was true for all beats. For the beats in our study, the average length of meetings was less than 48 minutes, with the shortest recorded length being a mere fifteen minutes. In all, roughly 20% of meetings lasted between 15 and 30 minutes and 75% lasted under a full hour. This suggests that, for a good number of beats, there was more than enough time to carry out project tasks, particularly discussion of survey results, and time management was not the reason for failing to do so. There were other occurrences, however, that stand as obstacles to fully realizing project objectives within the particular context of the beat meeting. For the most part, beat meetings are held on a monthly basis and tend to follow the same agenda, yet there were exceptions to this that further contributed to weakening treatment dosage. In beats that have low attendance and/or crime rates, it is not unusual for meetings to be held only every other month. Such was the case with four of the 51 beats in the two feedback conditions, reducing the number of opportunities for residents to participate in completion of surveys and for results to be discussed at meetings; two of these beats had only two opportunities, while the other two had three. Meetings in other beats were also cancelled numerous times for various reasons. Sometimes beats did not hold typical meetings; on a handful of occasions, various beats held picnics or parties, walked as a group to a problem corner, or showed a video about burglary prevention.
At other times, one pressing issue dominated the proceedings so that the normal agenda was not actually followed. For one beat, it was a wave of robbery/batteries that had residents concerned. In another, a well-publicized clash between a patrol officer and residents walking their dogs off leash needed to be addressed at the meeting and drew media attention. Such occurrences are inevitable within the framework of beat meetings. While there are certainly identifiable patterns across meetings as to interaction and information shared, each beat is in many ways unique in that such often spontaneous occurrences cannot be controlled for.

Disappointment with the quality of interaction at meetings was a common theme that ran through interviews with police and civilian facilitators. While civilians often commented on the willingness of the police to share information with residents, police often commented on the lack of resident participation, both in terms of attendance and in the tendency of residents to offer only highly personalized concerns during meetings. The lines were not clearly drawn; each side often recognized and was critical of their own members as well. Noted also as an obstacle to resident participation, the expectations about the roles of police and residents at meetings possibly stand as the greatest obstacle to the heart of implementation: discussion and problem solving about survey results. The problem of interaction at meetings unfolds in multiple parts, beginning with the state of problem solving at beat meetings.
Observations during the course of the project indicated very little joint problem solving occurred at the meetings, something frequently acknowledged by police and civilians alike during interviews. This absence of genuine problem solving has arguably led to narrowed expectations among police and citizens as to the roles they play at meetings. A prevailing view of these expectations was summed up by one beat sergeant who stated, “There’s not much going on…The citizens’ attitude is we gripe, you at CPD take care of it.” Introducing survey results about broader topics into this context was a difficult prospect because findings did not necessarily fit into the narrow framework for interaction at meetings. While police and residents alike seemed generally pleased to receive new information, generating discussion and problem solving about this information was certainly hampered by the usual pattern of exchanging information at meetings. Police tend to read crime statistics, report on their progress regarding previously identified problems, and then essentially take resident reports about problems in the beat. The survey results were often incorporated into this framework as just another set of statistics to be read to residents before going on to the next point of the agenda; as one beat officer put it, “whatever, okay, move on.” Discussions about survey results and exploration of the issues raised by the results most commonly occurred within beats that had received the training and were following a problem solving exercise we provided. This suggests that, if anything, the modes of interaction between police and residents are not intractable; rather, additional training about facilitation of meetings and guidance in some form can help to redefine interaction and the possibility of joint problem solving at beat meetings.

The organizational focus and CAPS.
The current focus of the CPD on the redeployment of police to hot spots of violent crime should also be mentioned as a factor that has impeded project implementation via CAPS. As with most large police organizations, the CPD has long been plagued by limited manpower; as the organization attempts to meet shifting demands for service, some police feel the burden of increased workloads, and the continuity among beat personnel that was once a signature feature of its community policing program has been eroded. Support for the CAPS program within the organization has been on the decline for some time, in part a victim of the scramble for limited resources. Two related problems were cited by those interviewed as obstacles: the heavy workload and the reassignment of personnel. Some police felt that most personnel, whether sergeant, community policing officer, or patrol officer, already had “too many duties” and that the addition of project tasks was “creating more work” for them. This perception was related to another complaint, the continual reassignment of personnel, both permanently and temporarily. As with any large police organization, promotions and transfers occurred frequently during the course of the project and resulted in turnover of personnel both within the CAPS Project Office and participating beats, something in part responsible for problems with communication and awareness about the project.
This turnover was most evident among beat sergeants, which caused the CAPS Project Office to rely heavily on CAPS Lieutenants and Community Policing Officers for the administration of project materials until sergeants could be contacted and brought up to speed about the project. As noted above, this strategy worked better in areas where the CAPS office had good working relationships with sergeants, but unquestionably left a gap in those where it did not. Such disruptions within the CAPS Project Office itself were also somewhat problematic. The original coordinators were promoted and transferred before the end of the project. While the new liaisons at the CAPS Project Office did a laudable job of coordination for the remainder of the project, the working relationships forged by the original liaisons were largely lost and new relationships needed to be established in much less time. More problematic was the temporary reassignment or “detailing out” of personnel from their normal duties. The lack of continuity in personnel meant there was very little “beat integrity” as to who was involved in beat meetings, and civilians and police alike noted this was disruptive both to police-community partnerships and to the implementation of new initiatives. One officer summed up the current situation in the CPD this way: “The beat officers are sent all over God’s green earth. The regular beat officer is seldom on the beat…Flyers didn’t get passed out due to reassignments, officers are pulled twenty different ways.” Undoubtedly the CPD experiences the same problems of insufficient manpower as other large agencies do; vacations and furloughs will occur and temporary reassignments will be needed to ensure adequate coverage. The most detrimental consequence has been the weakening of the CPD’s community policing agenda.
Disruption of officers’ ties to their beats means a disruption of their relationships with residents in their beats, a central premise of the community policing mission and important for the ability to engage in effective problem solving. The constant reassignment of personnel and reallocation of limited resources has also taken a more direct toll on the CAPS program. CAPS lieutenants are routinely assigned to fill roles that fall outside the domain of CAPS. Because the CAPS lieutenant is not a budgeted position, some districts do not even have one, which was the case in three of the 18 districts involved in the project, and CAPS sergeants are required to fill that role. Community policing officers are regularly detailed out, particularly those on the third watch, for patrol, special events, and directing traffic. As one community policing officer told us, “There are many days of the week when the officer is not in the CAPS office.” In one district, community policing officers spoke of being charged with all sorts of tasks that had little to do with their own jobs, such as pulling statistics and inputting data for other units, with one describing himself as a “jack of all trades, master of none”. Beat sergeants and officers also acknowledged this. One sergeant noted that the monthly meetings between beat team leaders and CAPS had ceased due to an “overtime conflict” and that this had caused information not to be received. Others cited their heavy workload and a lack of resources as preventing them from attending to issues in the beat, a problem they also attributed to a lack of support within the organization for CAPS: “CAPS is put on the backburner.
Pretty much the police department frowns on CAPS." Resistance to CAPS within the department has always existed, and it surfaced repeatedly during interviews with police. Some felt CAPS was simply a "public relations tool" and that "our job is to deal with crime. We're not just PR servants." Whatever the sentiment, the fact remained that shifting organizational resources away from community policing was a serious obstacle to implementing the Chicago Internet Project. The lines of communication between those in CAPS and those in patrol were not always effective, a problem compounded by the continual reassignment of personnel. The lack of continuity among personnel at beat meetings made it difficult to identify individuals who could be assigned responsibility for ensuring project tasks were carried out, and probably accounted for implementation failure on numerous occasions. Resistance to the project, or simply a lack of commitment, on the part of many of the police in participating beats must be understood in the context of existing feelings about CAPS in general and beat meetings in particular. In the final analysis, the chosen vehicle for implementing this project was far more flawed than we had expected. Nevertheless, by redoubling efforts within the CPD and the University, we were able to implement the program in most beats and advance knowledge of this process.

B. Testing the Effects on Residents and Officers: Hypotheses and Methods

1. Hypotheses

The Chicago Internet Project tested the theoretical expectations of collecting data about community residents' concerns and using those data to inform problem solving at CAPS beat meetings. Theoretically, the collection of information about resident concerns through Internet surveys, and the discussion of survey results, provided residents with a more direct avenue for participation in community policing initiatives.
Because such activities represent processes that more actively engage residents and provide them with a stronger voice in problem solving at meetings, it was expected that they would not only enhance resident perceptions of their capacity for contributing to public safety, but also increase participation in public safety activities and in their communities. The act of collecting and discussing information about community concerns also signals a mutual willingness for more open communication between police and residents and assigns new value to resident input in problem solving activities, and thus should lead to improvements in police and resident attitudes about their partnership. The greater value assigned to community input also represents a greater community orientation, which in turn should increase the extent to which police work is based on community concerns and priorities. Based on these theoretical premises, we hypothesize that, when compared to residents from the control beats, residents from the experimental beats will report or exhibit:

• Greater confidence in their abilities for solving local problems
• More knowledge about crime prevention
• More favorable views of the police-citizen partnership
• More interaction within their beat
• More engagement in problem solving activities

We also hypothesize that, when compared to the police from the control beats, police from the experimental beats will report:

• More favorable views of the police-citizen partnership
• Greater engagement in community-oriented police work

2.
Measurement and Scale Construction

In all, 21 dependent variables were used to test the hypotheses, drawing on data from questionnaires administered to residents and police as well as observations of beat meetings. The dependent variables covered the following constructs:

Residents' perceptions of their capacity. Through both a stronger avenue for participation and the new value assigned to resident input in problem solving activities at beat meetings, it was hypothesized that resident perceptions of their capacity to contribute to public safety would be enhanced. This includes resident assessments of their ability both to solve local problems and to engage in crime prevention activities.

Collective efficacy. This construct measures informal social control in terms of resident feelings about their ability to achieve desired outcomes and solve problems within their neighborhoods. A two-item scale was designed to capture perceptions of resident capacity to work toward solving local problems. These items explain 80% of the variance and exhibit strong internal consistency (α=.742).

Scale Items: (4=Strongly agree, 1=Strongly disagree)
1. Residents believe in themselves and what they can accomplish
2. Residents have the knowledge to solve area problems

Final Scale Properties: (N=640) M=2.92, SD=.723, Range=1-4. Higher scores indicate greater belief in resident capacity to work toward solving local problems.

Problem solving. A single, negatively worded item was additionally used to measure perceptions of resident effectiveness at solving neighborhood problems.

Item: (4=Strongly disagree, 1=Strongly agree)
1.
Residents are not effective at solving neighborhood problems

Item Properties: (N=601) M=2.67, SD=.842. Higher scores indicate greater belief in resident effectiveness at solving neighborhood problems.

Knowledge about crime prevention. This construct further measures capacity as it pertains to individual knowledge about crime prevention, using a three-item scale that covers basic behaviors such as keeping oneself and one's property safe, as well as working with police. These items account for 68% of the variance and demonstrate internal reliability (α=.755).

Scale Items: (4=Strongly agree, 1=Strongly disagree)
1. I know the things I need to do to stay safe when I'm out on the streets
2. I know the things I need to do to keep my home and property safe from crime
3. If I work with the police my neighborhood will be a safer place

Final Scale Properties: (N=652) M=3.36, SD=.492, Range=1-4. Higher scores indicate stronger belief in one's knowledge about crime prevention behaviors.

Resident assessments of the police-citizen partnership. Collecting and disseminating community-based information indicates a greater willingness for open communication between police and citizens about community concerns and emphasizes the importance of resident input for problem solving activities. It was hypothesized that this openness would improve resident attitudes about their partnership with police, including satisfaction both with their partnership with police in the neighborhood generally and with their partnership with police in the context of CAPS beat meetings.

Partnership in a general context. This five-item scale captures residents' attitudes about their partnership with police in their beat and includes key components of such a partnership: working together, police openness to sharing information with citizens, and satisfaction with the partnership. These items explain 51% of the variance and their internal consistency is strong (α=.746).
Scale Items: (4=Strongly agree, 1=Strongly disagree)
1. The police and residents work well together when trying to solve beat problems
2. The police are good at sharing crime information with residents
3. I am satisfied with the partnership our neighborhood has created with the police
4. The police are open to input and suggestions from residents
5. The police are good at keeping residents informed about what actions they are taking

Final Scale Properties: (N=643) M=2.96, SD=.571, Range=1-4. Higher scores indicate greater satisfaction with the police-citizen partnership.

Partnership in the beat meeting context. Individual items measure resident perceptions of the quality of their partnership with police at their beat meeting on four dimensions relevant for effective partnerships: communication, definition of roles, problem solving skills, and group cohesion.

Item 1: (4=Strongly agree, 1=Strongly disagree) In general, there is open communication among beat meeting participants
Item Properties: (N=610) M=3.25, SD=.611, Range=1-4, with higher scores indicating a stronger belief that there is open communication among beat meeting participants.

Item 2: (4=Strongly agree, 1=Strongly disagree) Our group has been successful at defining specific roles and responsibilities
Item Properties: (N=574) M=3.03, SD=.664, Range=1-4, with higher scores indicating a stronger belief that roles and responsibilities within the group have been successfully defined.
Item 3: (4=Strongly disagree, 1=Strongly agree) As a group, our problem solving skills have not improved since we began this effort
Item Properties: (N=559) M=2.81, SD=.770, Range=1-4, with higher scores indicating a stronger belief that group problem solving skills have improved.

Item 4: (4=Strongly disagree, 1=Strongly agree) Beat meeting participants are not a close-knit group
Item Properties: (N=561) M=2.63, SD=.773, Range=1-4, with higher scores indicating a stronger belief that beat meeting participants are a close-knit group.

Expressions of support for police. Resident attitudes about the police-citizen partnership were further measured through observations of resident statements about police made during beat meetings. A two-item scale captures the attitudes about police that residents demonstrated; this scale accounts for 90% of the variance and has strong internal consistency (α=.886).

Scale Items: Of the residents that spoke…
1. How many seemed negative towards the police? (1=All, 2=More than half, 3=Half, 4=Less than half, 5=None)
2. How many seemed supportive of the police? (1=None, 2=Less than half, 3=Half, 4=More than half, 5=All)

Final Scale Properties: (N=49) M=4.10, SD=.700, Range=1-5. Higher scores indicate greater expressions of support for police.

Resident interaction within the beat. The more direct avenue for participation in public safety initiatives is hypothesized to encourage greater resident interaction with other residents in their beat.

General interaction.
This construct measures resident interaction using a general scale that includes joint activities beyond the beat meeting setting, such as attending other meetings with beat participants and working on beat problems together. This scale was originally developed and used in the annual evaluations of CAPS (e.g., Skogan & Hartnett, 1997). The four-item scale measuring resident interaction accounts for 58% of the variance and has high internal reliability (α=.758).

Scale Items: Thinking about the people that you see at beat meetings, have you… (0=No, 1=Yes)
1. Seen them around the beat?
2. Attended any other kinds of meetings with them?
3. Talked with them on the phone?
4. Worked on any beat problems with them?

Final Scale Properties: (N=640) M=.529, SD=.380, Range=0-1. Higher scores indicate greater resident interaction with other beat meeting participants outside of the beat meeting.

Resident problem solving behaviors. Given a more direct avenue for participation through the Internet surveys and discussion of survey results, it was hypothesized that residents would become more engaged in problem solving activities. This includes participation in meeting discussions, identifying and proposing solutions to problems, responding to problems, and leaving meetings committed to responding to problems.

General discussion. Two items measure the extent to which residents participated in the discussions at their beat meetings. One item simply counts the number of residents who spoke during the meeting, while the other identifies who dominated (defined here as controlled or talked the most in) the discussion.

Item 1: (Recorded number) Approximately how many residents spoke during the meeting?
Item Properties: (N=49) M=7.35, SD=3.34, Range=2-15.

Item 2: (1=Police, 2=Both equally, 3=Residents) Overall, who dominated the discussion that took place during the meeting?
Item Properties: (N=49) M=1.94, SD=.690, Range=1-3, with higher scores indicating greater participation by residents in beat meeting discussions.

Participation in problem solving at meeting. To measure the extent to which residents are engaged in the problem solving process, a three-item scale captures resident involvement in both identifying problems and proposing solutions for problems discussed at beat meetings. This scale accounts for 71% of the variance and has strong reliability (α=.789).

Scale Items: During the meeting, who… (1=Police, 2=Both equally, 3=Residents)
1. Identified most of the neighborhood problems?
2. Brainstormed or proposed solutions for problems?
3. Proposed most of the solutions discussed?

Final Scale Properties: (N=48) M=1.90, SD=1.08, Range=1-3, with higher scores indicating greater participation by residents in the problem solving process.

Response to problems. Two items measure the extent to which residents are involved in responding to problems, both through resident reports at beat meetings on efforts they have made to solve problems and through leaving the meeting committed to engaging in some activity designed to solve problems.

Item 1: During the meeting, who… (1=Police, 2=Both equally, 3=Residents) Reported back on previous problem solving efforts?
Item Properties: (N=48) M=1.15, SD=.619, Range=1-3, with higher scores indicating greater participation by residents in problem solving efforts.

Item 2: (0=No, 1=Yes) Did residents leave the meeting with a commitment to future action?
Item Properties: (N=48) M=.79, SD=.410, Range=0-1, with higher scores indicating more residents leaving meetings committed to engaging in some activity related to problem solving.
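The multi-item measures above are all built the same way: Likert items are scored (reverse-coded where negatively worded), averaged into a per-respondent scale score, and checked for internal consistency with Cronbach's alpha. A minimal sketch of that scoring, using hypothetical responses rather than project data:

```python
# Scale scoring as described above: Likert items (1-4) are reverse-coded
# where negatively worded, averaged into a per-respondent scale score,
# and checked for internal consistency with Cronbach's alpha.
# All response data here are hypothetical.

def reverse_code(response, low=1, high=4):
    """Flip a negatively worded Likert item (4 -> 1, 3 -> 2, ...)."""
    return high + low - response

def scale_score(responses):
    """Average one respondent's item responses into a single scale score."""
    return sum(responses) / len(responses)

def cronbach_alpha(items):
    """items: one list of responses per item, respondents in the same order."""
    k = len(items)

    def pvar(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(t) for t in zip(*items)]  # per-respondent totals
    return (k / (k - 1)) * (1 - sum(pvar(i) for i in items) / pvar(totals))

# Hypothetical responses to the two collective efficacy items
item1 = [4, 3, 2, 4, 3, 1]
item2 = [4, 3, 3, 4, 2, 2]
alpha = cronbach_alpha([item1, item2])
scores = [scale_score(pair) for pair in zip(item1, item2)]
```

With only two items, alpha is driven entirely by the covariance between them, which is why a short scale such as collective efficacy can still show strong consistency when its items track each other closely.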
Police assessments of the police-citizen partnership. As with residents, the greater willingness of police and residents to openly communicate about community concerns is hypothesized to improve police attitudes about their partnership with residents. This includes general assessments of their relationship with residents at beat meetings, as well as satisfaction with the nature of participation in the beat meeting partnership.

Relationship in the beat meeting context. A single item measures police attitudes about their general relationship with citizens at beat meetings.

Item 1: (1=Very strained, 4=Very congenial) What is the relationship between police and residents at meetings in this beat?
Item Properties: (N=176) M=3.63, SD=.518, Range=1-4, with higher scores indicating perceptions of a friendlier relationship between police and citizens.

Needs of partnership in beat meeting context. This five-item scale was originally developed for the initial evaluation of CAPS beat meetings (Skogan & Hartnett, 1997) and has been well tested in subsequent evaluations. Measuring police perceptions of the need for increased participation in the beat meeting partnership, particularly in relation to problem solving activities on the part of citizens, this scale accounts for 60% of the variance and has strong internal consistency (α=.828).

Scale Items: "In this beat we need more…" (1=Strongly disagree, 4=Strongly agree)
1. More reports from residents about what they have been doing to solve problems
2. More reports from police on what they have been doing to solve problems
3. More civilian leadership of the meetings
4.
More discussion of what residents should be doing before the next meeting
5. More training on what residents can do to solve neighborhood problems

Final Scale Properties: (N=179) M=2.96, SD=.580, Range=1-4, with higher scores indicating that police see a greater need for improvement within the partnership.

Community orientation of police work. Because the collection and discussion of community-based data assigns a greater value to the use of resident input for informing police work, we hypothesized that police would engage in more community-oriented police work. This includes both greater interaction with residents in the beat and greater use of resident input as a basis for police activities.

General interaction. This construct measures police interaction using a general scale that includes joint activities beyond the beat meeting setting, such as attending other meetings with beat participants and working on beat problems together. This three-item scale was originally developed for the annual evaluations of CAPS (e.g., Skogan & Hartnett, 1997); it accounts for 56% of the variance and demonstrates modest internal consistency (α=.607).

Scale Items: Thinking about the people that you see at beat meetings, have you… (0=No, 1=Yes)
1. Seen them around the beat?
2. Attended any other kinds of meetings with them?
3. Worked on any beat problems with them?

Final Scale Properties: (N=183) M=.589, SD=.337, Range=0-1. Higher scores indicate greater police interaction with other beat meeting participants outside of the beat meeting.

Community-oriented work.
This construct measures the extent to which police work is based on community-oriented principles along three dimensions: use of citizen input to prioritize problems, assignments stemming from community-identified problems, and on-duty interaction with citizens in the beat.

Item 1: (0=Not very often, 1=Somewhat often, 2=Very often) Does your beat team consider resident input to help identify priority problems?
Item Properties: (N=165) M=1.66, SD=.535, Range=0-2, with higher scores indicating greater frequency of using resident input to identify priority problems.

Item 2: (0=Not very often, 1=Somewhat often, 2=Very often) How often are you sent on an assignment because of a problem identified at a beat meeting?
Item Properties: (N=172) M=1.16, SD=.723, Range=0-2, with higher scores indicating greater frequency of being sent on assignments based on problems identified at meetings.

Item 3: (0=Not very often, 1=Somewhat often, 2=Very often) When you're not involved in answering a call, how often do you make personal contact with people who live or work in this beat?
Item Properties: (N=171) M=1.12, SD=.710, Range=0-2, with higher scores indicating greater frequency of interacting with people in the beat.

3. Sample

Table 6.6 shows the characteristics of citizens who completed paper questionnaires at the time of the second administration in the study beats (N=668). The sample characteristics are comparable to those that Skogan (2006a) reported as typical of CAPS participants in terms of age, education, homeownership, and race. Over half of the citizens in each experimental condition are over the age of 50 and have an education level of at least some college, and at least 80% are homeowners. African Americans and Whites are almost equally represented, while Latinos make up only between 8 and 11% of the sample. There are also more females than males, with females comprising over half of the participants in each experimental condition.
Over 80% of the sample had attended at least one meeting in the twelve months prior to the time the questionnaire was administered, with over 50% having attended at least four meetings. Citizens also reported high rates of Internet access across all experimental conditions: 174 (69.6%) citizens in the control group, 141 (66.2%) citizens in the feedback group, and 135 (70.7%) citizens in the training group indicated having Internet access either at home or at work. Of all the individuals with Internet access, 350 (82%) indicated they had access at their homes. Bivariate analyses showed no significant differences in gender (χ²=2.482, p>.05), race/ethnicity

Table 6.6 Demographic Characteristics of Citizen Beat Meeting Participants by Experimental Conditions (N=668)

                                          Control        Feedback       Training
                                          N      %       N      %       N      %
Gender
  Male                                    91   37.9      86   42.2      81   45.5
  Female                                 149   62.1     118   57.8      97   54.5
Race/Ethnicity
  African American                        83   34.7      84   41.6      72   40.4
  Hispanic/Latino                         26   10.9      17    8.4      17    9.6
  White                                  124   51.9      95   47.0      80   44.9
Age
  18-30                                    9    4.1       3    1.6      11    6.6
  31-41                                   32   14.6      23   12.5      29   17.4
  41-50                                   46   21.0      33   17.9      35   21.0
  51-60                                   44   20.1      49   26.6      38   22.8
  61-70                                   55   25.1      42   22.8      33   19.8
  71 and older                            33   15.1      34   28.5      21   12.6
Home Owner
  Own/pay mortgage                       205   85.8     182   90.1     147   81.2
  Rent                                    33   13.8      20    9.9      34   18.8
Education Level
  Did not complete high school            26   11.0      18    8.9      10    5.6
  High school diploma                     38   16.0      40   19.8      26   14.6
  Further technical/vocational training   15    6.3      15    7.4      10    5.6
  Some college                            66   27.8      59   29.2      36   20.2
  College graduate                        92   38.8      70   34.7      96   53.9
Attendance in past 12 months
  0 meetings                              48   19.8      43   22.4      41   22.4
  1-3                                     61   25.2      42   20.7      41   22.4
  4-6                                     48   19.8      39   19.2      39   21.3
  7-9                                     32   13.2      31   15.3      21   11.5
  10-12                                   53   21.9      48   23.6      41   22.4
(χ²=3.274, p>.05), age (F=1.271, p>.05), education (F=1.969, p>.05), or attendance (F=.499, p>.05). Homeownership, however, was found to be significantly different across the experimental conditions (χ²=6.260, p<.05).

The second sample consisted of police who completed paper questionnaires at the time of the second administration in the study beats (N=184). The demographic characteristics are displayed in Table 6.7. Approximately 60% of the officers in all three experimental conditions were White, and at least 75% in each condition were male. The average age at which officers joined the CPD was in the mid-to-late twenties for each of the experimental conditions: Control (M=28.8, SD=6.14), Feedback (M=26.5, SD=5.41), and Training (M=27.6, SD=5.05). The length of time officers had been working in the beat varied across the conditions: Control (M=5.1, SD=4.56), Feedback (M=4.6, SD=4.46), and Training (M=3.7, SD=5.05).

The third sample consisted of observations conducted in each of the 51 study beats at the end of the project. Characteristics of the participants are presented above. Sixty-six percent of the beats provided a printed agenda (Control, N=8; Feedback, N=11; Training, N=14), while only half of the beats provided minutes from the previous meetings (Control, N=7; Feedback, N=7; Training, N=11). Forty-four percent of these meetings were conducted solely by a police facilitator (Control, N=8; Feedback, N=10; Training, N=4), while only 22% were conducted solely by a civilian facilitator (Control, N=3; Feedback, N=4; Training, N=4). Crime statistics were provided to residents in over 81% of the meetings (Control, N=11; Feedback, N=13; Training, N=16).
Table 6.7 Demographic Characteristics of Police Beat Meeting Participants by Experimental Conditions (N=184)

                          Control        Feedback       Training
                          N      %       N      %       N      %
Gender
  Male                    38   77.6      35   75.5      51   76.1
  Female                  11   22.4      12   24.5      16   23.9
Race/Ethnicity
  African American         8   20.5      12   27.9      17   30.4
  Hispanic/Latino          8   20.5       5   11.6       5    8.9
  White                   23   59.0      26   60.5      34   60.7
Age
  Less than 25             1    2.0       1    2.0       1    1.5
  25-29                    4    8.2      10   20.4      11   16.9
  30-39                   19   38.8      18   36.7      23   35.4
  40-49                   18   36.7       9   18.4      21   32.3
  50 and older             7   14.3      11   22.4       9   13.8

4. Statistical Techniques and Analysis

Prior to any analyses, a visual examination of the data was conducted using graphically oriented statistical systems and the exploratory analysis methods developed by Tukey (Hoaglin et al., 1991; Mosteller and Tukey, 1977; Tukey, 1977). The data were screened for problems such as missing data (Little and Rubin, 2002; Schafer, 1997) and violations of statistical assumptions (Tabachnick and Fidell, 2001). Skewness values, kurtosis values, and normal and detrended normal probability plots (P-P plots) were examined to detect violations of normality. Multicollinearity was assessed using both bivariate correlations and variance inflation factors (VIF). Finally, scatterplots of the regression residuals were examined to determine whether the data met the assumption that the standard deviations of the errors of prediction were equal for all predicted dependent variable scores (i.e., the assumption of homoscedasticity). The screening process did not indicate problems with any of the variables. Three stages of multivariate analyses were conducted.
The first stage tested for main effects according to the hypotheses, using post-test data from citizen and police questionnaires and the observation forms. The main hypotheses for the CAPS sample are summarized in Tables 6.8 and 6.9. Given the nested structure of the questionnaire data (citizens clustered within beats), traditional analysis might have produced biased estimates of the experimental effects. To this end, dependent variables from the citizen questionnaire data were tested for significant variation at the beat level. Based on these results, hierarchical linear modeling (HLM) was determined to be the appropriate statistical technique and was used for the hypothesis tests with the citizen questionnaire data. Ordinary least squares (OLS) regression was used to test whether program effects emerged across experimental conditions for those hypotheses with continuous dependent variables from the police questionnaire and observation form data. Additional analyses were conducted to ensure the data did not violate the assumptions of least squares regression. Logistic regression was used to test for program effects on the single dichotomous dependent variable (HCAPS5, Commitment to action). The regressions tested the impact of receiving feedback by comparing scores for the feedback group with those of the training and control groups, and the impact of training by comparing scores for the training group with those of the feedback and control groups. Because prior research has shown certain citizen demographics to be related to outcomes similar to those examined in this study, particularly in the CAPS setting (Skogan and Hartnett, 1997), the regression models included control variables.
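The beat-level variation check that motivated the choice of HLM can be illustrated with a one-way intraclass correlation: when respondents within the same beat answer more alike than respondents across beats, observations are not independent and OLS standard errors are biased. A minimal sketch with hypothetical, equal-sized beats (not the project's actual test):

```python
# One-way intraclass correlation (ICC) sketch, illustrating the kind of
# beat-level variation check that motivates HLM over OLS. The beats and
# scores below are hypothetical.

def icc1(groups):
    """ICC(1) from a one-way ANOVA, for equal-sized groups of scores."""
    n = len(groups)            # number of beats
    k = len(groups[0])         # respondents per beat
    grand = sum(sum(g) for g in groups) / (n * k)
    means = [sum(g) / k for g in groups]
    ss_between = k * sum((m - grand) ** 2 for m in means)
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    ms_between = ss_between / (n - 1)
    ms_within = ss_within / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Three hypothetical beats, three respondents each: scores cluster by beat
beats = [[3.0, 3.2, 3.1], [2.0, 2.2, 2.1], [2.6, 2.4, 2.5]]
rho = icc1(beats)   # close to 1: strong clustering, so HLM is warranted
```

HLM then goes further than this diagnostic by modeling the beat-level variation directly as random effects rather than merely detecting it.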
All models involving community hypotheses controlled for the following variables: gender (coded 0=female, 1=male), age (coded in years), education (coded 1=did not finish high school, 2=high school graduate/GED, 3=further technical/vocational training, 4=some college, did not graduate, 5=college graduate), homeownership (coded 0=renter, 1=homeowner), and attendance at beat meetings in the past year (coded as a number). All models involving police hypotheses controlled for the following variables: gender (coded 0=female, 1=male), age (coded 1=less than 25, 2=25-29, 3=30-39, 4=40-49, 5=50 or more), age when joining the CPD (coded in years), and time working in the beat (coded in years).

The second stage of multivariate analysis stepped outside the random experimental design to a quasi-experimental one to assess whether implementation and participation influenced the hypothesized outcomes. This involved retesting the main hypotheses using levels of implementation to see whether they affected the outcome variables. Implementation levels measured the average level of implementation by police of project tasks, based on data collected by observers.

Table 6.8 Summary of Community Hypotheses for the CAPS Experiment

When compared to residents from the control beats, residents from the experimental beats will:

Hypothesis                                              Dependent Variable(s)                        M      SD
HCAPS1  Report greater confidence in their abilities    Resident ability                            2.92    .72
        for solving local problems.                     Problem solving                             2.67    .84
HCAPS2  Report more knowledge about crime prevention.   Crime prevention knowledge                  3.36    .49
HCAPS3  Report or exhibit more favorable views of the   General partnership                         2.96    .57
        police-citizen partnership.                     Beat meeting: Communication                 3.25    .61
                                                        Beat meeting: Role definition               3.03    .66
                                                        Beat meeting: Problem solving               2.81    .77
                                                        Beat meeting: Group cohesion                2.63    .77
                                                        Expressions of support                      4.10    .70
HCAPS4  Engage in more interaction within their beats.  General interaction                         .529    .38
HCAPS5  Be more engaged in problem solving activities.  Participation in meeting discussions        7.35   3.34
                                                        Domination of meeting discussions           1.94    .69
                                                        Problem solving at meetings                 1.90   1.08
                                                        Reporting on problem solving efforts        1.15    .62
                                                        Commitment to future action                  .79    .41

Table 6.9 Summary of Police Hypotheses for the CAPS Experiment

When compared to police from the control beats, police from the experimental beats will:

Hypothesis                                              Dependent Variable(s)                        M      SD
HCAPS6  Report more favorable views of the              Beat meeting relationship                   3.63    .52
        police-citizen partnership.                     Needs of beat meeting partnership           2.96    .58
HCAPS7  Be more engaged in community-oriented           General interaction                         .589    .34
        police work.                                    Use of resident input to identify problems  1.66    .54
                                                        Assignments based on meeting-identified
                                                          problems                                  1.16    .72
                                                        Personal contact with individuals in beat   1.12    .71
For each experimental condition, each task police were expected to perform was assigned a value of one, with the steps actually taken summed and divided by the maximum number of steps to be taken; the average implementation rate was then calculated by summing the rates across all observations and dividing by the number of observations. The following steps were used for these calculations: (1) Control beats: Distribution of project flyers; (2) Feedback beats: Distribution of flyers, distribution of printed results, and discussion of results; (3) Training beats: Distribution of flyers, distribution of printed results, discussion of results, and use of problem solving exercise. Each hypothesis was tested using OLS regression for the continuous variables and logistic regression for the dichotomous dependent variable. The final stage of multivariate analysis concerned testing factors with the potential to predict levels of citizen participation in Internet survey completions, including citizen demographics and attitudes, as well as beat characteristics. Citizen demographics include aggregated scores from the pre-test citizen questionnaire: Gender (coded 1=Male, 0=Female); Age (coded in years); Education (coded 1=Did not finish high school, 2=High school graduate/GED, 3=Further technical/vocational training, 4=Some college, 5=College graduate); Homeownership (coded 1=Own home, 0=Rent); and Attendance at meetings (coded number of meetings attended in the past 12 months). 
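The implementation-rate calculation described above (tasks completed divided by tasks expected, averaged across observations) can be sketched as follows. The per-condition task lists come from the report; the observation records and function names are illustrative, not the project's actual data structures.

```python
# Expected project tasks per experimental condition, per the report:
# control = 1 task, feedback = 3 tasks, feedback/training = 4 tasks.
TASKS = {
    "control": ["flyers"],
    "feedback": ["flyers", "printed_results", "discussion"],
    "training": ["flyers", "printed_results", "discussion", "problem_solving"],
}

def observation_rate(condition, tasks_done):
    """Share of expected tasks (each worth one point) actually performed
    at a single observed beat meeting."""
    expected = TASKS[condition]
    return sum(1 for t in expected if t in tasks_done) / len(expected)

def implementation_level(condition, observations):
    """Average of per-observation rates across all meetings observed."""
    rates = [observation_rate(condition, obs) for obs in observations]
    return sum(rates) / len(rates)

# Illustrative records (not from the study): two observed meetings in a
# feedback beat, one with two of three tasks done, one with all three.
obs = [{"flyers", "printed_results"}, {"flyers", "printed_results", "discussion"}]
print(round(implementation_level("feedback", obs), 3))  # 0.833
```

The same scoring applies to each condition; only the denominator (the expected task list) changes.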
Also using scores from the pre-test citizen questionnaires, three measures of citizen attitudes are used: Satisfaction with partnership (Item: I am satisfied with the partnership our neighborhood has created with the police, coded 1=Strongly disagree, 2=Disagree, 3=Agree, 4=Strongly agree); Feelings of safety (Item: How safe do you feel or would you feel being alone outside in your neighborhood at night?, coded 1=Very unsafe, 2=Somewhat unsafe, 3=Somewhat safe, 4=Very safe); and Beat meeting communication (Item: In general, there is open communication among beat meeting participants, coded 1=Strongly disagree, 2=Disagree, 3=Agree, 4=Strongly agree). Three characteristics of the beat were also used: crime rate in the beat, using census data from 2000; citizen meeting attendance rate, calculated as described above for city official attendance by using beat logs to average attendance rates for meetings held between March and August of 2005; and citizen participation rate, calculated by summing the number of questionnaires completed by citizens in each beat and dividing by the number of citizens in attendance. OLS regressions were employed as the statistical test for all of these analyses.

C. Results

The results for the community hypotheses are presented in Tables 6.10-6.12. Some support was found for the hypothesis regarding residents' confidence in their abilities to solve local problems (HCAPS1), although the effect was limited to residents in the feedback only condition. No support was found for the experimental effects detailed in the remaining community hypotheses (HCAPS2-HCAPS5) regarding knowledge of crime prevention, satisfaction with the police-citizen partnership, and engagement in problem solving activities. Results for the police hypotheses (see Table 6.13) indicate some support for the hypothesis regarding satisfaction with the police-citizen partnership (HCAPS6), again an effect that was limited to police in the feedback only condition.
Police in these beats reported more favorable views of their relationship with residents at beat meetings than police in either the control or feedback/training beats, although there was no evidence that views regarding needed improvements to the beat meeting problem solving partnership were affected. We found no support for the hypotheses regarding experimental effects on increasing the community-orientation of police work (HCAPS7).

Table 6.10
A Summary of Multilevel Regression Estimates for Community Hypotheses of CAPS Experiment (Resident Questionnaire)

                                            Feedback          Feedback/Training   Residual
                                            (vs. Controls)    (vs. Controls)      Variance
Dependent Variables                  n      b       SE        b       SE          Est.
Confidence in Abilities
  Resident ability                   513    .16*    .09       .12     .10         .46***
  Problem solving                    491    .03     .10      -.02     .10         .66***
Knowledge of Crime Prevention
  Crime prevention knowledge         524    .02     .06      -.01     .06         .23***
Attitudes Towards Police-Citizen Partnership
  General partnership                518   -.02     .10      -.03     .10         .24***
  Beat meeting: Communication        491   -.03     .08      -.01     .08         .34***
  Beat meeting: Role definition      464   -.06     .09      -.07     .09         .41***
  Beat meeting: Problem solving      450    .02     .09      -.01     .10         .51***
  Beat meeting: Group cohesion       454    .04     .11      -.08     .11         .51***
Interactions in the Beat
  General interaction                511   -.01     .06      -.02     .06         .11***
*p≤.05  ***p≤.001
Note: All models control for gender, age, education, homeownership, and attendance.
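The experimental contrasts reported in these tables rest on dummy-coding the two treatment conditions against the controls. As a minimal sketch (assuming no covariates, unlike the study's actual models, which include demographic controls), the coefficients of such a regression reduce to group-mean differences; the function and toy data below are illustrative only.

```python
def condition_contrasts(data):
    """OLS with an intercept plus feedback and feedback/training dummies
    (and nothing else) reduces to group means: the intercept is the control
    mean and each slope is that group's mean minus the control mean."""
    means = {cond: sum(ys) / len(ys) for cond, ys in data.items()}
    b0 = means["control"]
    return b0, means["feedback"] - b0, means["feedback_training"] - b0

# Illustrative beat-level scores (not data from the study).
toy = {"control": [2.0, 3.0], "feedback": [4.0, 6.0], "feedback_training": [3.0, 5.0]}
print(condition_contrasts(toy))  # (2.5, 2.5, 1.5)
```

With covariates added, the coefficients become adjusted rather than raw mean differences, but the interpretation against the omitted control category is the same.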
Table 6.11
OLS Regression Estimates for Community Hypotheses of CAPS Experiment (Observation Form)

                                              Feedback              Feedback/Training
                                              (vs. Controls)        (vs. Controls)
Dependent Variables                    n      b      SE      β      b      SE      β       R²
Attitudes Towards Police-Citizen Partnership
  Expressions of support               49    -.332   .233   -.229  -.028   .234   -.019    .33
Problem Solving Activities
  Participation in meeting
  discussions                          49     .951  1.192    .137  1.321  1.196    .190    .24
  Domination of meeting discussions    49     .200   .245    .140   .421   .246    .293    .24
  Problem solving at meetings          48     .250   .230    .184  -.037   .230   -.027    .26
  Reporting on problem solving
  efforts                              48     .147   .205    .152  -.028   .205   -.029    .08
Note: All models control for gender, age, education, homeownership, and attendance.

Table 6.12
Logistic Regression Estimates for Commitment to Future Action for Community Hypotheses/CAPS Experiment (N = 48)

                                      b        SE       Odds Ratio
Feedback (vs. Controls)              -.192    1.054      .825
Feedback/Training (vs. Controls)      .401    1.104     1.494
Gender                               -.895    3.336      .409
Age                                  -.108     .067      .898
Education Level                      -.415     .766      .660
Homeowner                            1.752    2.763     5.767
Attendance                            .798*    .379     2.221
Pseudo R² Statistics: Cox & Snell R² = .204; Nagelkerke R² = .318
*p≤.05

Table 6.13
OLS Regression Estimates for Police Hypotheses of CAPS Experiment (Police Questionnaire)

                                              Feedback              Feedback/Training
                                              (vs. Controls)        (vs. Controls)
Dependent Variables                    n      b       SE      β     b      SE      β       R²
Attitudes Towards Police-Citizen Partnership
  Beat meeting relationship            176    .370**  .132    .328   .059   .119    .057    .17
  Needs of beat meeting partnership    179    .047    .149    .040  -.015   .133   -.014    .03
Community-Orientation
  General interaction                  183   -.070    .070   -.112  -.039   .063   -.068    .21
  Use of resident input                165    .167    .160    .133   .022   .143    .019    .04
  Assignments based on meeting
  identified problems                  172    .263    .196    .166  -.052   .177   -.036    .06
  Personal contact within beat         171    .340    .198    .212   .111   .179    .075    .05
**p≤.01
Note: All models control for gender, age, age when joined CPD, and time working in beat.

These findings suggest that the collection and discussion of community-based data achieved modest perceptual effects, improving both resident perceptions of their own abilities and police perceptions of their relationship with residents. It is important to note, however, that such findings were limited to residents and police in the feedback only beats. Similar effects were not found for residents and police in the feedback/training beats, despite the fact that both experimental groups engaged in collecting and discussing community-based survey results. In fact, while the differences were not statistically significant, residents in the feedback/training beats assessed problem solving by residents, the beat meeting partnership, crime prevention knowledge, and group cohesion negatively, and exhibited less engagement in problem solving efforts, while residents in feedback only beats exhibited positive assessments and behaviors in these areas. The training component, the use of problem solving exercises, was intended to enhance both police and resident participation in the problem solving process.
With the lack of problem solving activities observed in the study beats, training efforts may have had the unintended consequence of heightening awareness of both residents' and police's lack of problem solving skills, thus contributing to the lack of findings within the feedback/training beats. Given the varying levels of program implementation, there was reason to believe effects would be found for those beats in which implementation had been greatest; to this end, the community and police hypotheses were retested using the overall implementation score as the predictor in place of experimental condition (Tables 6.14-6.18). Interestingly, better program implementation levels were associated with lower levels of police engagement in some community-oriented tasks (see Table 6.17). More consistent program implementation was related both to police reporting being sent out on assignments based on problems identified at beat meetings less frequently and to having less personal contact with individuals within the beat when not answering calls for service. These may be treatment effects; greater adherence to project tasks may have heightened police awareness of their limitations with community engagement. But these findings may also be indicative of an existing disjuncture between what occurs at beat meetings and how police carry out day-to-day operations. As interviews with police personnel indicated, some did not necessarily consider the beat meeting valuable for informing their work; in this sense, consistent implementation of project tasks may represent faithful adherence to conducting beat meetings in whatever the prescribed manner may be, but does not mean officers embrace community-oriented values beyond the beat meeting setting. Additionally, there is a trend that echoes the pattern of findings for the main hypotheses in the feedback/training condition. While implementation levels were not found to be associated with any of the other outcome measures, over half of the scores indicated more negative attitudes and fewer problem solving behaviors at greater implementation levels. More consistent implementation may have acted in a manner comparable to training in that it increased participant awareness of their lack of skills and/or experience with the behaviors the project was attempting to foster.

Table 6.14
OLS Regression Estimates for the Level of Implementation for Community Hypotheses of CAPS Experiment (Resident Questionnaire)

                                            Level of Implementation
Dependent Variables                  n      b       SE      β        R²
Confidence in Abilities
  Resident ability                   50     .071    .165    .065     .134
  Problem solving                    50     .185    .200    .131     .240
Knowledge of Crime Prevention
  Crime prevention knowledge         50    -.122    .121   -.155     .106
Attitudes Towards Police-Citizen Partnership
  Beat meeting: Communication        50     .153    .136    .165     .195
  Beat meeting: Role definition      50     .012    .141    .013     .093
  Beat meeting: Problem solving      50    -.017    .174   -.014     .207
  Beat meeting: Group cohesion       50    -.177    .233   -.120     .063
Note: All models control for gender, age, education, homeownership, and attendance.
Table 6.15
OLS Regression Estimates for the Level of Implementation for Community Hypotheses of CAPS Experiment (Observation Form)

                                            Level of Implementation
Dependent Variables                  n      b       SE      β        R²
Attitudes Towards Police-Citizen Partnership
  Expressions of support             49     .191    .444    .060     .30
Problem Solving Activities
  Participation in meeting
  discussions                        49     .872   2.235    .057     .22
  Domination of meeting discussions  49    -.035    .469   -.011     .19
  Problem solving at meetings        48     .429    .430    .144     .24
  Reporting on problem solving
  efforts                            48    -.391    .372   -.178     .08
Note: All models control for gender, age, education, homeownership, and attendance.

Table 6.16
Logistic Regression Estimates for the Level of Implementation for Commitment to Future Action of CAPS Experiment (N = 48)

                            b        SE       Odds Ratio
Implementation Level        .531    1.945     1.701
Gender                     -.610    3.255      .543
Age                        -.114     .067      .892
Education Level            -.390     .752      .677
Homeowner                  1.677    2.675     5.352
Attendance                  .821*    .392     2.272
Pseudo R² Statistics: Cox & Snell R² = .200; Nagelkerke R² = .313
*p≤.05

Table 6.17
OLS Regression Estimates for the Level of Implementation for Police Hypotheses of CAPS Experiment (Police Questionnaire)

                                            Level of Implementation
Dependent Variables                  n      b        SE      β        R²
Attitudes Towards Police-Citizen Partnership
  Beat meeting relationship          45    -.170     .242   -.114     .251
  Needs of beat meeting partnership  45    -.214     .297   -.132     .046
Community-Orientation
  General interaction                45    -.012     .144   -.014     .117
  Use of resident input              45    -.097     .259   -.068     .053
  Assignments based on meeting
  identified problems                45    -.642*    .280   -.365     .277
  Personal contact within beat       45    -.826*    .309   -.421     .293
*p≤.05
Note: All models control for gender, age, age when joined CPD, and time working in beat.

With low rates of participation in completing the Internet surveys, it was important to further examine the nature of citizen participation. As Table 6.18 shows, higher average resident education levels at the beat level were strongly associated with greater participation by residents in survey completions. This is not unexpected, given that both Internet users and participants in community crime prevention programs tend to be better educated. As the beat's crime rate increased, resident participation in survey completions decreased. Although participation in CAPS is rather strong for residents in areas with high crime rates, it may be that low income (rather than a high crime rate) is the causal factor limiting access to the Internet. Participation also appears to have been predicated on a certain amount of dissatisfaction with the police-citizen partnership. Participation was greater for beats where residents reported less satisfaction with the partnership established between their neighborhood and the police, suggesting these residents wanted to express themselves beyond the avenue of beat meetings and felt the Internet surveys were a viable method for doing so.
Table 6.18
OLS Regression Estimates for Citizen/Beat Characteristics for Survey Completion Rates (N = 50)

                                       b        SE      β
Model 1: Resident Demographics (R² = .254)
  Gender                               .047     .055    .115
  Age                                 -.001     .055    .115
  Education                            .061**   .017    .521
  Homeownership                        .012     .077    .028
  Attendance                           .003     .006    .060
Model 2: Resident Attitudes (R² = .349)ᵃ
  Satisfaction with partnership       -.092*    .045   -.357
  Feelings of safety                   .051     .035   -.239
  Beat meeting communication          -.006     .052   -.019
Model 3: Beat Characteristics (R² = .355)ᵃ
  Overall crime rate in beat          -.133*    .057   -.328
  Citizen meeting attendance rate     -.004     .022   -.025
  Citizen participation rate           .057     .040    .234
*p<.05, **p<.01
ᵃ Models 2 and 3 control for gender, age, education, homeownership, and attendance.

Chapter Seven
Results from the Random Sample: The Impact of Internet Information on Participants' Perceptions and Behaviors

A. Hypothesized Impact of Interventions on Random Sample

One component of the Chicago Internet Project was designed to test the impact of the web-based survey system on the random sample of residents within the 51 target police beats. The research questions are similar to those posed for the CAPS sample -- will participation in the surveys, the dissemination of survey results, and the provision of educational links work to enhance problem solving knowledge and skill, increase community engagement, and strengthen police-community relations? Both police and community performance measures were used as outcome measures. Police performance measures.
Although the random sample did not interact with Chicago police officers as part of the experiment (unlike the CAPS sample), there is some basis to expect that Internet communication might influence their perceptions of problem solving and their general assessments of the police. Because CIP was a joint project with the Chicago Police Department and the University of Illinois at Chicago, there is reason to believe that invited participants might view the CPD's actions in a favorable light. In particular, the random sample was invited to have a voice in prioritizing neighborhood problems and evaluating police performance. In addition, the Police Department showed enough interest in their opinions to provide them with the survey results and crime prevention advice in some experimental conditions. The reciprocity model articulated earlier helps to explain these relationships and predict positive assessments of police performance. Given this experimental context, we expected that the feedback and feedback/training interventions would register change on several partnership scales. Specifically, we hypothesized that residents in the experimental conditions would view the Chicago police as: (1) more involved in community engagement activities (e.g. providing crime prevention tips) and (2) more responsive to community concerns (e.g. sharing information, being open to input and suggestions, working with residents to solve problems). In turn, we hypothesized that residents in the experimental conditions would be more willing to work with the police as partners (e.g. calling the police to report incidents, serving as an informant) and perhaps be more inclined to attend a community meeting (see formal collective action below).
The collection, analysis, dissemination and interpretation of survey data over the Internet may leave the impression that the Chicago Police Department is a highly professional operation with technical expertise, problem solving skills, transparency and trustworthiness as an organization. Hence, we also hypothesized that residents in the experimental conditions would be more inclined to view the Chicago police as (1) effective at solving problems and fighting crime and (2) knowledgeable about their job. More generally, we hypothesized that these positive images would translate into higher ratings on organizational legitimacy (i.e. confidence and trust in the Department and its leaders) and higher overall satisfaction with police services. This theorizing suggests that the effects of Internet-only interventions, if they occur at all, will be limited to general impressions of police officers and their organization and will not extend to more specific performance indicators (e.g. specific neighborhood policing behaviors that are not relevant to the CIP interventions). Thus, for example, we hypothesized that the experimental interventions would have no effect on respondents' assessments of police manners. This type of test (referred to as a non-equivalent dependent variables design), if supported, would strengthen our causal inferences. Community performance measures. Beyond assessments of the police, what were the expected effects of these interventions on the random sample's views of themselves and their communities? Hypotheses about changes in community "performance" measures were straightforward and based on the notion of knowledge-based competency.
The central assumption is that the Internet can be used as a vehicle to inform and educate adults about public safety and community crime prevention. Recall that, upon completion of the online surveys, participants in both experimental conditions were invited to visit the website to view selected survey results for their police beat. Respondents in the feedback-plus-training experimental condition were further encouraged to visit links on the CIP website providing information about various public safety topics. As noted previously, two types of links on the web page were provided – links internal to the CIP site and links to other sites. Content of the internal links, developed by the research team, covered public safety topics in four broad areas: community participation, citizen and police relationships, problem solving, and individual safety behaviors. The external links were the Chicago Alternative Policing Strategy (CAPS) and Citizen-ICAM (CPD's crime mapping resource) sites. The CAPS site offered participants community policing information such as beat meeting times and locations, crime watch and hotline information, and specific Chicago Police Department contacts. The ICAM website allowed citizens to map crimes based on selected geographic and crime-type parameters. For particular survey topics, additional external links were provided (e.g. National Crime Prevention Council; National Center for Missing and Exploited Children). Based on exposure to these educational materials and/or survey results, we hypothesized that participants in the experimental conditions would "outperform" participants in the control condition. Because of the survey feedback, experimental groups were expected to report more general knowledge about crime. Because of the information contained in the internal links, experimental groups were expected to report more knowledge about staying safe and a better understanding of crime prevention concepts.
They were also hypothesized to report more participation in protective actions and in informal and formal collective actions against crime. All of these topics were covered specifically in the educational materials. Also, the links to CAPS, for example, might encourage some respondents to attend their local beat meeting, given the now-salient information about time and location. Participation in this online experience, combined with newly acquired knowledge about crime and crime prevention, was hypothesized to increase self-efficacy in the experimental groups. Links to CAPS, ICAM, and NCPC, as well as specific instructions about how to stay safe, may empower residents to feel that they have more control over their environment and can make a difference in neighborhood safety. Finally, the effects of these interventions on perceptions of their community were uncertain. Information about how your neighbors feel about local crime and disorder may increase or decrease your own fear and disorder perceptions, depending on your prior expectations. Also, to the extent that respondents feel similar to others in the neighborhood as a result of survey feedback, they may feel their neighborhood has grown stronger in terms of informal social control and collective efficacy. Of course, all of these hypotheses are based, in part, on the assumption that the random sample will take advantage of the information links that were provided to them. As noted previously, usage of the links was in fact rare, thus providing a weak test of the hypotheses. However, exposure to the survey feedback was more common and hence provides a basis for conducting the tests.
Survey results alone may be sufficient to produce some effects on perceptions of self and community. But the absence of strong use of the links contributed to the decision to test both experimental groups together. B. Implementation Problems with Website Delays with the CIP website, especially at the study's onset, threatened implementation. Researchers had to accommodate University computing staff schedules, troubleshoot technical glitches, and assist some participants with completing online surveys and accessing online survey results in order to mitigate the implementation threats specifically associated with the CIP website. The surveys and the CIP website were housed on the University of Illinois at Chicago's (UIC) Academic Communications and Computing Center (ACCC) servers. The ACCC is UIC's central computing administration and is responsible for all on-campus computing. All online aspects of the CIP study, the CIP surveys, the survey results, and the CIP website were hosted on the UIC servers maintained by ACCC. Two ACCC employees served as the main CIP liaisons for the duration of the study. The first ACCC liaison helped study researchers post the surveys and access the results. The second liaison helped the CIP researchers get the CIP website up and running, post the survey results, and update the monthly link content. The first ACCC liaison was extremely helpful: the surveys were posted in a timely fashion, and researchers were readily given access to any survey data. Survey data were stored on UIC servers as database files, and these files were accessed, downloaded, and analyzed by CIP researchers on a weekly basis. At the beginning of the study, there were problems with converting the raw survey results from the Perseus database files into SPSS files for analysis. Some text fields were mistakenly converted to number fields and some number fields were converted to text fields.
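A field-type check of the kind that would catch such conversion errors can be sketched as follows. The rows, codebook, and function are hypothetical illustrations, not the Perseus or SPSS tooling actually used in the project.

```python
def field_type_report(rows, expected):
    """Compare each column's actual Python type against a codebook.
    `rows` is a list of dicts as exported; `expected` maps field -> type.
    Returns the fields whose observed types disagree with the codebook."""
    problems = {}
    for field, want in expected.items():
        seen = {type(r[field]).__name__ for r in rows}
        if seen != {want.__name__}:
            problems[field] = seen
    return problems

# Hypothetical export where a numeric field came through as text and vice versa.
rows = [{"age": "42", "zip": 60607}]   # age exported as text, zip as number
expected = {"age": int, "zip": str}    # codebook says the reverse
print(field_type_report(rows, expected))  # flags both fields
```

Running a check like this on each weekly download would surface type drift before it reached analysis.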
This problem was discovered early in the study and remedied with more specific and careful Perseus survey design and programming. The study required researchers to update online postings and website content quickly; however, the second ACCC liaison's busy schedule often could not accommodate the need for quick online turnaround, which resulted in some delays at the beginning of the project. The website required ongoing updates (i.e. adding survey results) and additions (i.e. changing or updating the feedback link content), but all of these changes had to be cleared by the ACCC liaison to prevent the campus computing system from being compromised. UIC's computing security policy stipulates that only ACCC employees can directly access the server, and consequently all the CGI scripts and the content developed for the CIP website had to be reviewed and approved by this liaison. In the first month of the study, online survey result postings were delayed and the feedback content was not updated on time. After working closely with the ACCC liaison to accommodate his schedule and the researchers' need to access and modify the site quickly, the survey result postings and feedback content updates stayed on schedule. This relationship required constant attention and weekly communication between the researchers and the ACCC staff. There were also some technical problems with the website that may have affected some respondents, especially at the beginning of the study. Early in the study, some of the posted survey response graphics were degraded (i.e. blurry and difficult to read).
Additionally, some people reported problems accessing the CIP site or certain links within the site. The CIP web master quickly remedied these problems, and by the second survey wave the graphic problems were resolved. The website access problems were minimal throughout the study and seemed to be specific to certain respondents' computers or computer knowledge rather than a systematic problem with the website. CIP researchers assisted all respondents with their online access issues by troubleshooting possible problems over the phone. Overall, solid communication and collaboration between the researchers and the UIC ACCC liaisons helped to curtail most problems, and the delays and implementation problems were kept to a minimum. C. Statistical Techniques and Analysis Strategy Data screening. Before analyses were conducted, the data were screened for problems, including violations of statistical assumptions, following procedures outlined by Tabachnick and Fidell (2001). The screening process included examining the Mahalanobis distance statistic and several variations of Cook's distance statistics (i.e., modified Cook's distance, DFFITS, and DFBETAS) to check for univariate and multivariate outliers. Skewness values, kurtosis values, and normal and detrended normal probability plots (P-P plots) were examined to detect possible violations of normality. Scatterplots were generated to look for possible non-linear relationships. Additionally, we searched for non-linear relationships by creating high, medium, and low dummy variables for measures with a range greater than five. Bivariate correlations were used to look for multicollinearity. Last, we examined scatterplots of the regression residuals to determine whether the data met the assumption that the standard deviations of the errors of prediction were equal for all predicted dependent variable scores (i.e., the assumption of homoscedasticity).
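The skewness test used in this screening can be sketched as follows: the z statistic is the skewness estimate divided by its standard error, which is what the report's figures reflect (e.g., 1.28/.095 ≈ 13.47). The SE is approximated here as sqrt(6/n), a common large-sample approximation; the function names are illustrative.

```python
import math

def skewness(xs):
    """Sample skewness: third central moment over the cubed standard deviation."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    return m3 / var ** 1.5

def skew_z(xs):
    """z test for skewness; SE approximated by sqrt(6/n) for large samples."""
    return skewness(xs) / math.sqrt(6 / len(xs))

# A symmetric sample has zero skewness; a lopsided one does not.
print(skewness([1, 2, 3, 4, 5]))   # 0.0
print(skew_z([0, 0, 0, 10]) > 0)   # True
```

A |z| well beyond conventional cutoffs (e.g. 3.29) flags a variable for transformation, as was done here.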
The data screening process did not reveal problems with most of the measures, including the majority of the scales discussed in the earlier chapter. There were, however, a few notable exceptions. The scale created to measure knowledge about specific crime prevention concepts (i.e., CAPS, ICAM, broken windows, etc.) was moderately positively skewed (skewness statistic = 1.28; SE = .095; z = 13.47). To correct for the problem, the variable was transformed using the natural log. The scale created to measure residents' participation in CAPS or community meetings, also referred to as formal collective action, was substantially positively skewed (skewness statistic = 2.166; SE = .095; z = 22.80), and the measure of residents' willingness to work with the police was substantially negatively skewed (skewness statistic = -2.593; SE = .090; z = 28.51). Several transformations were tried to correct the skewness of these variables; however, none were effective. As a secondary strategy, the variables were dichotomized. Formal collective action was coded one if the resident attended a meeting in the last six months and zero if the resident did not. About 33.5% of the respondents reported attending a CAPS or community meeting in the past six months. Residents' willingness to work with the police was also dichotomized and coded one if, on average, the resident strongly agreed with all the statements and zero for all other values. Sixty percent of all respondents strongly agreed with the statements regarding working with the police.
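The two corrections applied above can be sketched as follows; the scores and cutoff are illustrative placeholders, not the actual scale values or cutpoints used in the study.

```python
import math

def log_transform(xs):
    """Natural-log transform used to pull in a positive skew (values must be > 0;
    in practice a constant is added first when zeros are present)."""
    return [math.log(x) for x in xs]

def dichotomize(xs, cutoff):
    """Fallback when no transform normalizes a scale: 1 at or above the cutoff,
    else 0."""
    return [1 if x >= cutoff else 0 for x in xs]

scores = [1.0, 1.0, 2.0, 8.0]       # illustrative positively skewed scale
print(log_transform(scores)[:2])    # [0.0, 0.0]
print(dichotomize(scores, 2.0))     # [0, 0, 1, 1]
```

Dichotomizing trades away variance for robustness, which is why it was used only after transformations failed.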
When conducting the analyses, the final correlation matrix for the scale measuring residents' general knowledge about crime prevention was not positive definite, raising questions about the validity of the results. As a consequence, the scale was dichotomized and coded one if the resident strongly agreed with both statements and zero if they did not. About 32.5% of the residents strongly agreed with both statements. Excluding these four variables, all other dependent variables were used in the analyses as presented in the earlier scaling section (see Tables 7.1 and 7.2 for summaries).

Statistical techniques. Due to the nested structure of the data, traditional methods of data analysis may produce biased estimates of the experimental effects. To assess whether multilevel or hierarchical modeling would be an appropriate statistical technique for assessing the study effects, we tested the hypothesis of significant beat-level variance by estimating unconditional multilevel models for each of the dependent variables. In other words, we estimated a baseline multilevel model with no predictors to test whether there was significant variation in each dependent variable at the beat level. A significant beat-level covariance parameter and a large intraclass correlation would suggest that ordinary least squares (OLS) analysis of the data would likely produce misleading results (Singer, 1998). All of the covariance parameters produced from the unconditional models for measures used in testing the hypotheses with the random sample of respondents were statistically significant. Based on these results, we determined that multilevel or hierarchical linear models (HLM) accounting for the nested structure of the data would be most appropriate.
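The screening logic described above, estimating an unconditional (no-predictor) model and inspecting the beat-level share of variance, can be approximated with the classic one-way ANOVA estimator of the intraclass correlation; the simulated beats and variance components below are hypothetical, not the study's data.

```python
import numpy as np

def icc_oneway(values, groups):
    """ANOVA estimate of the intraclass correlation for a balanced design:
    ICC = (MSB - MSW) / (MSB + (k - 1) * MSW), where k is the group size."""
    groups = np.asarray(groups)
    labels = np.unique(groups)
    k = len(values) // len(labels)          # assumes balanced groups
    grand = values.mean()
    group_means = np.array([values[groups == g].mean() for g in labels])
    msb = k * ((group_means - grand) ** 2).sum() / (len(labels) - 1)
    msw = sum(((values[groups == g] - values[groups == g].mean()) ** 2).sum()
              for g in labels) / (len(values) - len(labels))
    return (msb - msw) / (msb + (k - 1) * msw)

# Simulate 51 "beats" with a genuine beat-level variance component
# (between-beat SD = 0.5, within-beat SD = 1.0, so the true ICC is 0.20).
rng = np.random.default_rng(2)
n_beats, per_beat = 51, 15
beat_effect = rng.normal(0, 0.5, n_beats)
beats = np.repeat(np.arange(n_beats), per_beat)
y = 3.0 + beat_effect[beats] + rng.normal(0, 1.0, n_beats * per_beat)

icc = icc_oneway(y, beats)   # nontrivial ICC -> OLS standard errors misleading
```

A likelihood-based unconditional model (as SPSS or the HLM software fits) would additionally provide a significance test for the beat-level covariance parameter; this ANOVA shortcut only illustrates the quantity being tested.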
For continuous dependent variables, linear mixed models or hierarchical linear models (HLM) were estimated using SPSS version 15.0, and for dichotomous dependent variables, hierarchical generalized linear models (HGLM) with a Bernoulli distribution were estimated using HLM version 5.02 (Raudenbush and Bryk, 2002). All of the multilevel models were assessed for adequacy of specification by examining statistical tests and residual plots (Raudenbush and Bryk, 2002). Data diagnosis revealed a problem with the scale for residents' general knowledge about crime prevention; even after dichotomizing the scale, problems persisted, and the variable was dropped from subsequent analyses. Data diagnosis did not reveal problems with any other variables.

The hypotheses for the random sample using the Internet surveys are summarized in Tables 7.1 and 7.2. Residents from the two experimental conditions (i.e., feedback only, and feedback and training) were compared to residents from the control beats. All of the models included controls for male (coded 0 = female; 1 = male), age (coded in years), education level (coded 1 = less than high school; 2 = high school degree; 3 = some college; 4 = Associate's degree; 5 = Bachelor's degree; 6 = some graduate school; 7 = Master's degree; 8 = Doctorate/advanced degree) and homeownership (coded 0 = renter/other; 1 = homeowner).

Table 7.1 Summary of Policing Hypotheses for Random Sample with Descriptive Statistics

When compared to residents from the control beats, residents from the experimental beats will:

  HRandom1  View the police as more involved in community engagement activities.
              Engagement of the community                   Wave 3   M = 3.26   SD = .94
  HRandom2  View the police as more responsive to community concerns.
              Responsiveness to the community               Wave 3   M = 2.36   SD = .84
  HRandom3  View the police as more effective at solving problems and fighting crime.
              Effectiveness in problem solving              Wave 3   M = 2.73   SD = .68
              Effectiveness in preventing crime             Wave 3   M = 2.58   SD = .79
  HRandom4  View the police as more knowledgeable about their job.
              Knowledgeable                                 Wave 3   M = 3.16   SD = .54
  HRandom5  View the organization as having more organizational legitimacy and
            report higher overall satisfaction with police services.
              Organizational legitimacy                     Wave 3   M = 2.83   SD = .61
              Satisfaction with neighborhood police         Wave 6   M = 3.06   SD = .62
  HRandom6  Be more willing to work with the police as partners.
              Willingness to partner with the police
              (dichotomized)                                Wave 3   M = .60    SD = .49
  HRandom7  Have similar assessments of police manners.
              Manners                                       Wave 4   M = 3.54   SD = .74

Table 7.2 Summary of Individual and Community Hypotheses for Random Sample with Descriptive Statistics

When compared to residents from the control beats, residents from the experimental beats will:

  HRandom8   Report more knowledge about crime and crime prevention.
               General knowledge                            Wave 2   M = 3.36   SD = .51
               Knowledge staying safe                       Wave 2   M = 2.79   SD = .69
               Prevention concepts (logged)                 Wave 5   M = 2.27   SD = .27
  HRandom9   Engage in a greater use of protective behaviors.
               Protective behaviors                         Wave 5   M = 13.40  SD = 3.88
  HRandom10  Engage in more informal and formal collective action.
               Informal collective action                   Wave 5   M = 2.37   SD = .98
               Formal collective action (dichotomized)      Wave 5   M = .33    SD = .47
  HRandom11  Feel more efficacious.
               Self-efficacy                                Wave 2   M = 3.67   SD = .75
  HRandom12  Have a more positive perception of their community.
               Fear of crime                                Wave 6   M = 2.06   SD = .82
               Disorder                                     Wave 5   M = 24.72  SD = 6.40
               Informal social control                      Wave 2   M = 3.84   SD = .94
               Collective efficacy                          Wave 2   M = 3.82   SD = .76

D. Results

Experimental effects. The results for the policing hypotheses are presented in Table 7.3. We did not find any empirical support for the hypotheses regarding the experimental effects on perceptions of police, policing behaviors or organizational outcomes. Residents from feedback only and feedback and training beats reported perceptions of police engagement, responsiveness, problem solving, knowledge and legitimacy similar to those of residents from control beats. Further, residents from the two experimental conditions were no more likely than residents from the control beats to be willing to work with the police as partners. The results for the individual and community hypotheses are presented in Table 7.4. None of the hypotheses were supported except for self-efficacy: residents from feedback and training beats reported significantly higher levels of self-efficacy than residents from control beats.

Survey participation effects. Because aspects of the experiment required respondent participation to achieve a reasonable "dosage of the treatment," and respondents participated at different levels, we re-tested all of the hypotheses using the number of surveys completed (coded as the actual number of surveys taken; M = 2.32, SD = 2.61) as the predictor instead of experimental group membership (see Tables 7.5 and 7.6).
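The fixed-effects specification used throughout these models, two treatment dummies with the control beats as the reference category plus the four demographic controls, can be sketched as a design matrix; the respondent records below are invented for illustration only.

```python
import numpy as np

# Hypothetical respondent records:
# (condition, male 0/1, age in years, education 1-8, homeowner 0/1)
rows = [
    ("control",           1, 34, 5, 0),
    ("feedback",          0, 51, 3, 1),
    ("feedback_training", 1, 28, 7, 0),
]

def design_row(condition, male, age, education, homeowner):
    """Intercept + two treatment dummies (control = reference) + controls."""
    fb  = 1 if condition == "feedback" else 0
    fbt = 1 if condition == "feedback_training" else 0
    return [1, fb, fbt, male, age, education, homeowner]

X = np.array([design_row(*r) for r in rows], dtype=float)
```

In the dosage analyses described above, the two treatment dummies would simply be replaced by a single column counting the surveys each respondent completed.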
We also retested all of the hypotheses using the self-reported measure of viewing results (coded 1 = yes; 0 = no; 45.5% of the sample reported viewing at least one of the results documents generated from the Internet surveys) (see Tables 7.7 and 7.8).

The number of surveys taken was related to increased perceptions of police engagement and responsiveness to the community. Respondents who completed more surveys viewed the police as doing a better job at dealing with problems that really concern residents, sharing information, being open to input and suggestions, and working with residents, compared to respondents who completed fewer surveys. Respondents who completed more surveys also viewed the police as better at providing crime prevention tips and making themselves available to talk with residents. The number of surveys completed was also associated with greater perceived organizational legitimacy: the more surveys a resident completed, the more likely they were to agree with statements about the organization's leadership, accountability and protection of basic rights.

In terms of the individual and community hypotheses, the number of surveys completed was associated with greater knowledge about crime prevention concepts. Respondents who completed more surveys were more likely to report knowledge about CAPS, ICAM, the crime triangle, the SARA model, broken windows, and crime prevention through environmental design compared to respondents who completed fewer surveys. Further, the number of surveys completed was associated with greater perceived informal social control: the more surveys a respondent completed, the more likely they were to report that their neighbors would intervene in an effort to regulate the behavior of residents and visitors in the neighborhood. The number of surveys completed was not related to collective efficacy. The only difference between the informal social control and collective efficacy scales is social cohesion; completing Internet surveys did not appear to increase cohesion and trust among neighbors.

Table 7.3 A Summary of Multilevel Regression Results for Policing Hypotheses with Random Sample of Respondents

HLM Linear Models
                                                 Feedback Only     Feedback/Training   Residual
                                                 (vs. Controls)    (vs. Controls)      Variance
  Dependent Variables                      n     Est.     SE       Est.     SE         Est.
  General Assessments
    Manners                               664    -.01     .11      -.02     .11          .48***
  Competency Indices
    Knowledge                             637     .06     .08       .06     .08          .25***
  Neighborhood-Specific Evaluations
    Responsiveness to the Community       638     .16     .12       .10     .12          .61***
    Satisfaction with Neighborhood Police 603     .04     .12      -.02     .12          .30***
  Organizational Outcomes
    Organizational Legitimacy             712     .01     .08      -.01     .08          .33***
    Effectiveness in Problem Solving      656     .08     .11       .09     .11          .40***
    Effectiveness in Preventing Crime     669     .04     .15      -.01     .15          .47***
    Engagement of the Community           653     .12     .12       .01     .12          .80***

HGLM Model with Dichotomous Dependent Variable
    Willingness to Partner with Police    716     .19     .23       .06     .22          .10

*p≤.05 **p≤.01 ***p≤.001
Note: All models control for gender, age, education and homeownership.

Table 7.4 A Summary of Multilevel Regression Results for Individual/Community Hypotheses with Random Sample of Respondents

HLM Linear Models
                                                 Feedback Only     Feedback/Training   Residual
                                                 (vs. Controls)    (vs. Controls)      Variance
  Dependent Variables                      n     Est.     SE       Est.     SE         Est.
  Neighborhood Conditions
    Fear of Crime                         618    -.23     .14      -.01     .14          .50***
    Disorder                              514   -1.49    1.40      -.67    1.39        29.40***
  Individual Resident Performance
    Knowledge Staying Safe                753     .13     .08       .05     .08          .41***
    Prevention Concepts (logged)          652     .01     .03       .02     .03          .07***
    Protective Behaviors                  653    -.43     .60       .02     .60        12.47***
    Self-Efficacy                         755     .11     .07       .19*    .07          .52***
  Collective Performance
    Informal Collective Action            652    -.04     .15       .01     .15          .84***
    Informal Social Control               757     .22     .12       .19     .12          .73***
    Collective Efficacy                   757     .20     .11       .19     .11          .45***

HGLM Model with Dichotomous Dependent Variable
    Formal Collective Action              652    -.05     .26      -.24     .27          .18*

*p≤.05 ***p≤.001
Note: All models control for gender, age, education and homeownership.

Table 7.5 A Summary of Multilevel Regression Results for the Number of Internet Surveys Completed for Policing Hypotheses

HLM Linear Models
                                                 Number of Surveys Completed   Residual Variance
  Dependent Variables                      n     Est.     SE                   Est.
  General Assessments
    Manners                               664     .03     .02                    .48***
  Competency Indices
    Knowledgeable                         637     .02     .01                    .25***
  Neighborhood-Specific Evaluations
    Responsiveness to the Community       638     .04*    .02                    .61***
    Satisfaction with Neighborhood Police 603    -.01     .02                    .30***
  Organizational Outcomes
    Organizational Legitimacy             712     .03*    .01                    .33***
    Effectiveness in Problem Solving      656     .01     .02                    .41***
    Effectiveness in Preventing Crime     669     .03     .02                    .47***
    Engagement of the Community           653     .04*    .02                    .80***

HGLM Model with Dichotomous Dependent Variable
    Willingness to Partner with Police    716     .01     .05                    .14*

*p≤.05 **p≤.01 ***p≤.001
Note: All models control for gender, age, education and homeownership.

Table 7.6 A Summary of Multilevel Regression Results for the Number of Internet Surveys Completed for Individual/Community Hypotheses

HLM Linear Models
                                                 Number of Surveys Completed   Residual Variance
  Dependent Variables                      n     Est.     SE                   Est.
  Neighborhood Conditions
    Fear of Crime                         618     .01     .02                    .50***
    Disorder                              514     .25     .16                  29.31***
  Individual Resident Performance
    Knowledge Staying Safe                753    -.02     .01                    .41***
    Prevention Concepts (logged)          652     .02**   .01                    .07***
    Protective Behaviors                  653     .08     .09                  12.49***
    Self-Efficacy                         755     .01     .01                    .52***
  Collective Performance
    Informal Collective Action            652     .02     .02                    .85***
    Informal Social Control               757     .03*    .02                    .72***
    Collective Efficacy                   757     .02     .01                    .45***

HGLM Model with Dichotomous Dependent Variable
    Formal Collective Action              652     .02     .06                    .27**

*p≤.05 **p≤.01 ***p≤.001
Note: All models control for gender, age, education and homeownership.

Table 7.7 A Summary of Multilevel Regression Estimates for Viewed Survey Results on Policing Hypotheses

HLM Linear Models
                                                 Viewed Results                Residual Variance
  Dependent Variables                      n     Est.     SE                   Est.
  General Assessments
    Manners                               496    -.04     .07                    .44***
  Competency Indices
    Knowledgeable Index                   451     .03     .06                    .23***
  Neighborhood-Specific Evaluations
    Responsiveness to the Community Index 448     .08     .09                    .58***
    Satisfaction with Neighborhood Police 602    -.06     .06                    .30***
  Organizational Outcomes
    Organizational Legitimacy             504    -.04     .06                    .33***
    Effectiveness in Problem Solving Index 461    .08     .07                    .40***
    Effectiveness in Preventing Crime     470    -.05     .09                    .47***
    Engagement of the Community Index     483    -.01     .10                    .79***

HGLM Model with Dichotomous Dependent Variable
    Willingness to Partner with Police    507     .04     .30                    .32**

*p≤.05 **p≤.01 ***p≤.001
Note: All models control for gender, age, education and homeownership.

Table 7.8 A Summary of Multilevel Regression Estimates for Viewing Results on Individual/Community Hypotheses

HLM Linear Models
                                                 Viewed Results                Residual Variance
  Dependent Variables                      n     Est.     SE                   Est.
  Neighborhood Conditions
    Fear of Crime                         617     .07     .08                    .50***
    Disorder                              414     .95     .76                  29.20***
  Individual Resident Performance
    Knowledge Staying Safe                490     .06     .07                    .38***
    Prevention Concepts (logged)          524     .04     .03                    .07***
    Protective Behaviors                  525     .04     .40                  12.57***
    Self-Efficacy                         494     .05     .07                    .48***
  Collective Performance
    Informal Collective Action            524     .05     .10                    .86***
    Informal Social Control               494     .06     .09                    .66***
    Collective Efficacy                   494     .01     .07                    .41***

HGLM Model with Dichotomous Dependent Variable
    Formal Collective Action              525     .12     .32                    .23*

*p≤.05 **p≤.01 ***p≤.001
Note: All models control for gender, age, education and homeownership.

E. Summary

A randomized experimental design was employed to test the effects of the Chicago Internet Project on random samples of residents in 51 experimental beats. The only contact with these individuals, aside from the original telephone call to recruit them, was Internet communication. Hence, this component of the Chicago Internet Project was a test of whether exposure to survey results and public safety information would influence their perceptions of the police or their own neighborhoods. In general, the analyses by experimental condition revealed no support for the hypothesized effects; only self-efficacy improved. Analyses by level of participation (number of online surveys completed), however, found a wide range of positive effects. Persons who were exposed to more survey feedback and more educational materials reported a number of improvements in perceptions of both the police and the community. The null findings from the randomized experiment could easily be attributed to the low levels of exposure to "training" materials available through the website links. The absence of exposure to these educational materials leads to the classic evaluation conclusion: "We didn't try it and it didn't work."
This dosage problem also placed the burden on survey feedback alone to produce the hypothesized effects, which apparently was an excessive demand. The dosage problem motivated the research team to explore the possible effects of increased dosage levels, and indeed, numerous positive findings emerged. These positive findings, however, must be interpreted with caution. Given that the analyses were not conducted within the confines of a randomized design, the effects could be due to self-selection or treatment-by-selection interactions. For example, persons who are favorably disposed to the police may be more inclined to complete additional surveys (rather than the reverse). A less dismissive interpretation is that persons so inclined benefit more from additional exposure to educational materials. In the future, a stronger test of these hypotheses will be possible if researchers can make the websites so attractive that respondents cannot resist reading the materials.

CHAPTER EIGHT

CONCLUSIONS

The dual goals of the Chicago Internet Project were (1) to successfully implement, on a large scale, a comprehensive web-based community survey and identify the challenges of transferring this infrastructure to other settings; and (2) to determine whether a web-based survey system can enhance the problem solving process, increase community engagement, and strengthen police-community relations. We experienced enormous success with the first goal of building a new online measurement system, but had limited success in producing immediate effects on police-community outcomes.
Our conclusion begins with a brief analysis of the experimental efforts to test the effects of survey feedback on the police and community, and speculates about why more success was not achieved. We then turn our attention to the project's primary goal of building a new measurement system.

A. The Experimental Interventions

The elements of the intervention included the collection of new information through Internet surveys, the dissemination (feedback) of this information to police and residents, the use of these new data in a problem solving setting (the Chicago Alternative Policing Strategy, or CAPS), and training in the use and interpretation of survey findings. Within 51 Chicago police beats, two separate randomized experiments were conducted -- one using CAPS participants and another using a random sample of residents (the majority of whom do not attend CAPS meetings). For the latter study, the participants did not interact with police officers in a problem solving setting, so our outcome measures focused on their perceptions of the police and the community. The results of these two experiments were not encouraging. Detailed analyses indicate that police beats assigned to the experimental conditions (survey feedback, or survey feedback plus additional training) were not different from the control beats on a wide range of police and community outcome measures. The question is why?

The CAPS experiment. The first question is what happened with the CAPS experiment, where community beat meetings were expected to be an ideal setting for enhancing problem solving via survey feedback. While debriefing interviews suggest that the police and residents generally supported the concept of using the Internet as a medium for communicating residents' concerns, participation levels were extremely low. Inconsistent implementation of essential project tasks on the part of the police was the main culprit with regard to low response rates.
Police in the test beats were assigned primary responsibility for making residents aware of the opportunity to participate in the Internet survey and leading discussions of the survey results at beat meetings, a strategy intended to invest police more fully in the process. However, the police frequently failed to carry out the project tasks, most notably failing to discuss survey results on more than half of the occasions when they were expected to do so. Multiple obstacles to implementation were identified, including communication breakdowns within the organization, police resistance or lack of commitment to the project, rigid patterns of communication, immutable expectations of police and residents as to their roles at beat meetings, reassignment of personnel, and a general lack of organizational support for CAPS.

Despite these problems, implementation levels were rising over the course of the project as police personnel grew accustomed to handling the material and conducting discussions. With time, there is every reason to believe that survey results would come to be regarded in the same manner as other standard materials at beat meetings, such as a printed agenda, crime maps, and crime statistics. Less certain is the future of using survey results to stimulate problem solving at beat meetings. Although CAPS has achieved worldwide recognition as an exemplary community policing initiative, the reality is that problem solving is the Achilles heel of this model. The absence of serious problem solving at most beat meetings was the most serious obstacle to implementation, and may explain why participants in the experimental conditions did not exhibit positive effects.
We have learned that CAPS is a culture unto itself, with strong (and relatively traditional) norms about police and community roles (see Graziano, 2007). Rather than engage in joint problem solving, the police are expected to respond effectively to residents' complaints, similar to 911 calls, but in person. In lower-crime neighborhoods, CAPS is a social event, and problem solving would be an inconvenience. Hence, the introduction of new survey information and pressure from the police administration (and University) to engage in problem solving was met with some passive resistance at many levels. If CAPS is to be resuscitated, the Chicago Police Department must assign priority to supplying police and residents with the proper training and resources to engage in effective problem solving. Since the implementation of citywide training for police and residents on problem solving in the mid-1990s, problem solving skills have undoubtedly declined among participants.

The random sample experiment. On a positive note, we were able to identify a group of residents in each of the 51 police beats who were willing to go online and remain engaged in the panel survey for several months and multiple surveys. These participants generally were younger than the CAPS sample and less inclined to attend community meetings. Hence, through random sampling and telephone outreach, we were able to "democratize" the process of engaging the community in a dialogue about public safety issues. (Skogan et al. (2002) note that CAPS participants represent, on average, only 0.5% of the beat population.) These randomly selected individuals represent "the silent majority" within the neighborhood, whose input on public safety issues is rarely sought, except via occasional large-scale surveys. Their knowledge, perceptions, beliefs, attitudes and opinions became the primary data for testing a new measurement system.
On a less positive note, the findings from the random sample experiment did not support the hypothesized effects. Online feedback of survey results and educational supplements was not sufficient to change participants' views of the neighborhood, the police or police-community partnerships. Our understanding of these null findings is more tenuous than our interpretation of the CAPS experiment, as the latter was based on numerous in-person debriefing interviews and extensive field observations. For the random sample, our only knowledge comes from their survey responses. But some speculation is in order. First, implementation fidelity is always a concern: did the online random sample receive the emails with links to our website containing results and educational materials? Virtually all of the respondents in the experimental conditions acknowledged receiving information on how to access the survey results, but the number who actually saw at least one of the survey result reports was considerably lower (68% for feedback and 79% for feedback/training), and these numbers dropped steadily when asked about seeing 2, 3 or 4 reports. Larger implementation problems occurred at the level of "training," whereby respondents were electronically encouraged to utilize community resource links relevant to particular survey findings. Only 29% of respondents from the feedback/training beats reported noticing the community resource links, and of these, fewer than half (47%) reported clicking on at least one of the links. Part of this problem may have been the salience of the community resource links on the website.
The majority of respondents from the training beats (61%) reported not knowing whether there were community resource links, which suggests that our community resource section did not stand out from the results sections on our website. Certainly, low rates of exposure to the survey results and training resources suggest that the "dosage of the treatment" may have been insufficient to make a difference. In addition, the question remains whether higher dosages would be sufficient in this modality. Perhaps online survey information, with little interpretation and no opportunity to discuss the findings (except perhaps with family members), is insufficient to alter judgments about the community or the police. There are hints in the data, however, that challenge this conclusion. First, when the data were analyzed at the individual level, without controlling for beat-level clustering, a number of significant differences were observed in the direction of more positive evaluations of the police and the community in the experimental conditions. Second, this same pattern of findings was observed for respondents exposed to higher dosages of the treatment (more survey findings and links). The latter may be the result of self-selection, but the findings are suggestive. If more "bells and whistles" had been present on the website, and rates of exposure to feedback and training had been higher, perhaps more significant effects would have been observed.

B. New Measurement System

The Chicago Internet Project has, without question, demonstrated the utility of web-based surveys for generating knowledge about public safety at the neighborhood level and beyond. This project was initiated, in large part, to build the foundation for a new measurement system, one that would "measure what matters" to the communities served by municipal police worldwide. The measures field tested here are reflective of the period in which we live, with an emphasis on community-oriented government.
We are strongly encouraged by the results in a number of areas:

1. Participation: Residents will, indeed, participate in web surveys regarding local neighborhood safety. If the topics are sufficiently interesting (e.g., neighborhood problems, police performance), participants will complete online surveys repeatedly, at least monthly for seven months. From a scientific perspective, drawing the sample is an important component of the process, and indispensable for generalizability to the larger population. Random telephone sampling of individuals within specific police beats was successful for us, but is perhaps too costly for many communities. Asking for volunteers (as is done with most online surveys) is the easiest and cheapest method, but not very scientific. Something in between might be functional. For example, we propose the weather service model, where residents volunteer to serve as weather monitors and report weather data on a regular basis for particular locations. In urban neighborhoods, inviting a diverse sample (e.g., local school, business, and community leaders, combined with some random sampling by location) might yield a reasonably good picture of the neighborhood from "knowledgeable informants" or "place monitors." Empirical validation against random samples would be needed.

2. Validity: This project has demonstrated that web surveys can yield reliable and valid data in the public safety arena. First, the measures we developed (or selected) cover a wide range of theoretical constructs. They represent many dimensions of police and community performance as well as diverse aspects of neighborhood social conditions.
Second, these survey items were subjected to various tests of validity and reliability. Composite indices were constructed only after factor and reliability analyses, where appropriate, established that the items formed a unidimensional factor with strong internal consistency. Measures were often taken at two or more points in time, so test-retest reliability coefficients could be computed. Finally, for key indices, additional validity tests were conducted to establish construct and criterion validity.

3. Utility: As we have discussed throughout this report, the potential applications of this methodology are numerous. The ongoing collection of community survey data within small geographic areas, made possible by the efficiency of electronic data collection, can be used: (1) for community planning and problem solving, so that residents and police can assess community needs, define and analyze problems, and identify community resources; (2) to measure police performance; (3) to measure community performance; (4) to evaluate new public safety programs; and (5) to query the public about police policies and procedures and other justice programs. Police organizations interested in evidence-based decision making could use data on community perceptions and behaviors for strategic planning of hot spots policing and, if you will, "cold spots policing." For example, we envision police resources being deployed and/or problem-solving projects being initiated in locations with hot spots of fear, hot spots of police-community tension, cold spots of community crime prevention, or cold spots of collective efficacy. The potential for community empowerment with information technology is also noteworthy. As residents become more familiar with using the Internet for public safety purposes, the door may be opened wide for citizens to become actively engaged with law enforcement agencies in new and creative ways.
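The hot spot / cold spot screening described above can be implemented as a simple threshold rule over beat-level survey index scores: flag beats that sit well above or below the citywide mean. A minimal sketch (the beat labels and scores are hypothetical, invented for illustration):

```python
def flag_spots(beat_scores, z=1.0):
    """Flag beats whose index score lies more than z standard
    deviations above (hot) or below (cold) the mean across beats."""
    vals = list(beat_scores.values())
    mean = sum(vals) / len(vals)
    sd = (sum((v - mean) ** 2 for v in vals) / (len(vals) - 1)) ** 0.5
    hot = [b for b, v in beat_scores.items() if v > mean + z * sd]
    cold = [b for b, v in beat_scores.items() if v < mean - z * sd]
    return hot, cold


# Hypothetical fear-of-crime index scores by beat.
fear_index = {"Beat 211": 2.1, "Beat 212": 2.3, "Beat 213": 2.0,
              "Beat 214": 3.8, "Beat 215": 2.2}
hot, cold = flag_spots(fear_index, z=1.0)   # hot -> ['Beat 214']
```

The same rule, applied to a collective-efficacy or crime-prevention index with the lower tail flagged, would identify the "cold spots" mentioned above.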
The benefits could be substantial if the Internet can be employed to query, educate, notify, and challenge community residents and police officers as co-producers of public safety. The methodology tested here has tremendous potential as an institutionalized system of external police accountability. As public interest in procedurally just policing reaches unprecedented heights, web-based surveys offer one possible solution. Customer satisfaction with police services and police encounters, whether in reference to calls for service or police-initiated stops, is the next frontier for systematic measurement to address equity concerns.

From a management or research perspective, the efficiency of the Internet allows for hundreds of new comparisons within and across jurisdictions. This tool can be used, for example, to assess the impact of localized interventions (e.g., the effects of installing cameras in crime hot spots on residents' awareness, fear, and perceived risk of detection compared to control locations), to compare performance across beats or districts (e.g., police visibility and response times across different Latino beats), or to compare performance across jurisdictions (e.g., perceived police demeanor during traffic stops in African American neighborhoods in Chicago, London, and Los Angeles).

4. Transferability: There is no reason to believe that the measurement system we have developed cannot be applied in cities and communities around the world, assuming some modifications to accommodate language and contextual differences. Our research team at the University of Illinois at Chicago looks forward to working with other cities as possible test sites in the future.
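The internal-consistency checks described earlier under Validity can be computed in a few lines. Below is a minimal, self-contained sketch of Cronbach's alpha, the usual internal-consistency coefficient for composite indices; the example responses are invented for illustration, not data from this project:

```python
def variance(values):
    """Sample variance (n - 1 denominator)."""
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / (len(values) - 1)


def cronbach_alpha(items):
    """Cronbach's alpha for a composite index.

    items: one list per survey item, each holding respondents' scores
    in the same respondent order."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(variance(item) for item in items)
    return (k / (k - 1)) * (1 - item_var / variance(totals))


# Hypothetical: five respondents answering three perfectly
# consistent items yields the maximum alpha of 1.0.
alpha = cronbach_alpha([[1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5]])
```

In practice, items would first be screened with factor analysis (as described above) so that only items loading on a single factor enter the index; alpha then summarizes how consistently those items measure it.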
The most fundamental restriction on web-based surveys is the "digital divide." The U.S. Census Bureau (2005; www.census.gov) reports that while more than half of all U.S. households have access to the Internet, rates of access are lower for households of color and for households with lower incomes, less education, and a single parent. This obstacle, however, is not insurmountable. Many inner-city youth have access to cell phones, which are increasingly connected to the Internet and can be used for very brief surveys. Dozens of U.S. city governments are installing wireless networks and planning for greater digital equity. Also, a good measurement system for estimating public safety concerns can be constructed by providing Internet access to samples of residents rather than to entire populations of the target areas.

In sum, the feasibility of establishing an ongoing system for the collection and dissemination of community-based information has been demonstrated through the Chicago Internet Project. Despite the multiple obstacles to participation and implementation that were identified, the basic mechanics for generating and administering Internet surveys and survey results were established successfully. Police and residents alike responded in a largely favorable manner to the concept of Internet communication between police and residents, considering the Internet to be a viable avenue for expanding resident participation and collecting new information. With the rise of communication through new media such as the Internet and text messaging via cell phones, the question is less whether police-citizen communication should be extended to these media than how they can be used to achieve greater citizen participation, stronger police-community partnerships, and increased public safety. In this sense, the priority is not how communication is achieved, but the outcomes of such communication.
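One standard correction for the access gap described above — a common survey-adjustment technique, not a method reported in this project — is post-stratification weighting: each respondent group is weighted by the ratio of its population share to its sample share, so over-represented online households count less and under-represented ones count more. A minimal sketch with hypothetical shares:

```python
def poststratification_weights(sample_counts, population_shares):
    """Weight per group = population share / sample share, so that
    over- and under-represented groups balance out in estimates."""
    n = sum(sample_counts.values())
    return {group: population_shares[group] / (count / n)
            for group, count in sample_counts.items()}


# Hypothetical: households with Internet access make up 80% of the
# sample but only 50% of the target population.
weights = poststratification_weights(
    {"internet_access": 80, "no_access": 20},
    {"internet_access": 0.5, "no_access": 0.5},
)
# weights["internet_access"] == 0.625, weights["no_access"] == 2.5
```

Weighting cannot conjure opinions from residents who were never reached, so it complements, rather than replaces, the access-provision strategies discussed above.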
Information may be collected successfully through the Internet, but the more difficult challenge is to have police and citizens use it successfully to achieve tangible goals.

REFERENCES

Alpert, G.P., & Moore, M.H. (1993). Measuring police performance in the new paradigm of community policing. In G.P. Alpert & A.R. Piquero (Eds.), Community policing: Contemporary readings (pp. 215-232). Prospect Heights, IL: Waveland Press.
American Association for Public Opinion Research (1998). Standard definitions: Final dispositions of case codes and outcome rates for RDD telephone surveys and in-person household surveys. Ann Arbor, MI: AAPOR.
Bandura, A. (1997). Self-efficacy: The exercise of control. New York: W.H. Freeman.
Bandura, A. (1993). Perceived self-efficacy in cognitive development and functioning. Educational Psychologist, 28, 117-149.
Bandura, A., Adams, N.E., & Beyer, J. (1977). Cognitive processes mediating behavioral change. Journal of Personality and Social Psychology, 35, 125-139.
Bazemore, G. (1998). Restorative justice and earned redemption: Communities, victims and offender reintegration. American Behavioral Scientist, 41(6), 768-813.
Bellman, A. (1935). A police service rating scale. Journal of Criminal Law and Criminology, 26(1), 74-114.
Bennis, J., Steiner, L., & Skogan, W.G. (2003). The 2002 problem solving study. Working paper. Evanston, IL: Northwestern University, Institute for Policy Research.
Black, V. (1989). Enforcement of judgments and judicial jurisdiction in Canada. Oxford Journal of Legal Studies, 9(4), 547-556.
Blumstein, A. (1999). Measuring what matters in policing. In R.
Langworthy (Ed.), Measuring what matters: Proceedings from the policing research institute meetings (pp. 5-10). Washington, DC: National Institute of Justice and Office of Community Oriented Policing Services.
Boba, R. (2003). Problem analysis in policing. Washington, DC: Police Foundation (with funding from the COPS Office).
Brandl, S.G., Frank, J., Worden, R.E., & Bynum, T.S. (1994). Global and specific attitudes toward the police: Disentangling the relationship. Justice Quarterly, 11, 119-134.
Brown, L.P., & Wycoff, M.A. (1987). Policing Houston: Reducing fear and improving service. Crime & Delinquency, 33(1), 71-89.
Brown, M. (2000). Criminal justice discovers information technology. In G. LaFree, J.F. Short, R.J. Bursik, Sr., & R.B. Taylor (Eds.), Criminal justice 2000, volume 1: The nature of crime, continuity and change (pp. 219-259). Washington, DC: U.S. Department of Justice, National Institute of Justice.
Bursik, R.J., & Grasmick, H.G. (1993). Neighborhoods and crime: The dimensions of effective community control. New York: Lexington.
Chan, J., Brereton, D., Legosz, M., & Doran, S. (2001). E-policing: The impact of information technology on police practices. Queensland: Criminal Justice Commission.
Chan, J.B.L. (2001). The technological game: How information technology is transforming police practice. Criminal Justice, 1(2), 139-159.
Clarke, R.V. (Ed.). (1992). Situational crime prevention: Successful case studies. Albany, NY: Harrow and Heston.
Cohen, L.E., & Felson, M. (1979). Social change and crime rate trends: A routine activities approach. American Sociological Review, 44, 588-608.
Compeau, D.R., & Higgins, C.A. (1995).
Computer self-efficacy: Development of a measure and initial test. MIS Quarterly, 19, 189-211.
Cordner, G.W. (1997). Community policing: Elements and effects. In G.P. Alpert & A.R. Piquero (Eds.), Community policing: Contemporary readings (pp. 45-62). Prospect Heights, IL: Waveland Press.
Cronbach, L.J., & Meehl, P.E. (1955). Construct validity in psychological tests. Psychological Bulletin, 52, 281-302.
Dalton, E. (2002). COMPASS overview: Community mapping, planning, and analysis for safety strategies. Crime Mapping News, 4(4), 1-2.
Denkers, A., & Winkel, F. (1998). Crime victims' well-being and fear in a prospective and longitudinal study. International Review of Victimology, 5, 141-162.
Ditton, J., Khan, F., & Chadee, D. (2005). Fear of crime quantitative measurement instability revisited and qualitative consistency added: Results from a three wave Trinidadian longitudinal study. International Review of Victimology, 12, 247-271.
Dubow, F., McCabe, E., & Kaplan, G. (1979). Reactions to crime: A critical review of the literature. Washington, DC: U.S. Department of Justice, National Institute of Justice.
Dunworth, T. (2000). Criminal justice and the IT revolution. In J. Horney, D. Mackenzie, J. Martin, R. Peterson, & D.P. Rosenbaum (Eds.), Criminal justice 2000, volume 3: Policies, processes and decisions of the criminal justice system (pp. 371-426). Washington, DC: U.S. Department of Justice, National Institute of Justice.
Dunworth, T., Cordner, G., Greene, J., Bynum, T., Decker, S., Rich, T., Ward, S., & Webb, V. (2000). Police department information systems technology enhancement project (ISTEP). Washington, DC: U.S.
Department of Justice, Office of Community Oriented Policing Services.
Durose, M.R., Smith, E.L., & Langan, P.A. (2007). Contacts between police and the public, 2005. Bureau of Justice Statistics Special Report. Washington, DC: Office of Justice Programs, U.S. Department of Justice.
Eck, J.E., & Rosenbaum, D.P. (1994). The new police order: Effectiveness, equity and efficiency in community policing. In D.P. Rosenbaum (Ed.), The challenge of community policing (pp. 3-21). Thousand Oaks, CA: Sage Publications.
Eck, J.E., & Spelman, W. (1987). Who ya gonna call? The police as problem-busters. Crime & Delinquency, 33(1), 31-52.
Felson, M.F. (2006). Crime and nature. Thousand Oaks, CA: Sage Publications.
Ferraro, K. (1996). Women's fear of victimisation: Shadow of sexual assault? Social Forces, 75, 667-690.
Fridell, L.A., & Wycoff, M.A. (Eds.). (2004). The future of community policing. Washington, DC: Police Executive Research Forum.
Garofalo, J. (1981). The fear of crime: Causes and consequences. Journal of Criminal Law and Criminology, 72, 839-857.
Goldstein, H. (1979). Improving policing: A problem-oriented approach. Crime & Delinquency, 25(2), 236-243.
Goldstein, H. (1987). Toward community-oriented policing: Potential, basic requirements, and threshold questions. Crime & Delinquency, 33(1), 6-30.
Goldstein, H. (1990). Problem-oriented policing. New York: McGraw-Hill.
Grant, H.B., & Terry, K.J. (2005). Law enforcement in the 21st century. Boston: Allyn and Bacon.
Greenberg, S.W., Rohe, W.M., & Williams, J.R. (1985). Informal citizen action and crime prevention at the neighborhood level: Synthesis and assessment of the research. Washington, DC: National Institute of Justice.
Greene, J.R. (2000). Changing policing in America: Changing the nature, structure and function of the police. In J. Horney, D. MacKenzie, J. Martin, R. Peterson, & D.P. Rosenbaum (Eds.), Criminal justice 2000, volume 3: Policies, processes and decisions of the criminal justice system (pp. 299-368). Washington, DC: U.S. Government Printing Office.
Haley, K.N., & Taylor, R.W. (1998). Police stations in cyberspace: A content analysis of law enforcement agency home pages. In L.J. Moriarty & D.L. Carter (Eds.), Criminal justice technology in the 21st century (pp. 125-147). Springfield, IL: Charles C. Thomas.
Harris, D.A. (2003). Accountability: The basis for policing of the future. Luncheon address, The Second Annual National Community Policing Conference, Washington, DC, June 17, 2003.
Hartnagel, T.F. (1979). The perception and fear of crime: Implications for neighborhood cohesion, social activity, and community affect. Social Forces, 58(1), 176-193.
Hunter, A. (1985). Private, parochial, and public social orders: The problem of crime and incivility in urban communities. In G.D. Suttles & M.N. Zald (Eds.), The challenge of social control: Citizenship and institution building in modern society. Norwood, NJ: Ablex.
Katz, J.E., & Rice, R.E. (2002). Social consequences of Internet use: Access, involvement, and interaction. Cambridge, MA: MIT Press.
Lab, S.P. (1992). Crime prevention: Approaches, practices, and evaluations. Cincinnati, OH: Anderson.
Lavrakas, P.J. (1985). Citizen self-help and neighborhood crime prevention policy. In L.A. Curtis (Ed.), American violence and public policy (pp. 94-113). New Haven, CT: Yale University Press.
Lavrakas, P.J., Normoyle, J., Skogan, W.G., Herz, E.J., Salem, G., & Lewis, D.A. (1981). Factors related to citizen involvement in personal, household, and neighborhood anticrime measures: An executive summary. Washington, DC: U.S. Department of Justice, National Institute of Justice.
Lind, E.
A., & Tyler, T.R. (1988). The social psychology of procedural justice. New York: Plenum Press.
Lorig, K., Chastain, R.L., Ung, E., Shoor, S., & Holman, H.R. (1989). Development and evaluation of a scale to measure perceived self-efficacy in people with arthritis. Arthritis and Rheumatism, 32, 37-44.
Maguire, E.R. (2004). Measuring the performance of law enforcement agencies: Part two. Commission on Accreditation for Law Enforcement Agencies, 84. Retrieved May 14, 2006, from http://www.calea.org/newweb/newsletter/No84/maguirepart2.htm
Masterson, M.F., & Stevens, D.J. (2002). The value of measuring police performance in Madison, Wisconsin. In D.J. Stevens (Ed.), Policing and community partnerships (pp. 202-217). New Jersey: Prentice Hall.
Mastrofski, S.D. (1999). Policing for people. Ideas in American Policing, March issue. Washington, DC: Police Foundation.
McDonald, P. (2000). COP, COMPSTAT, and the new professionalism: Mutual support or counterproductivity? In G.P. Alpert & A.R. Piquero (Eds.), Community policing: Contemporary readings (pp. 233-255). Prospect Heights, IL: Waveland Press.
McDonald, P.P. (2005). Information technology and crime analysis. In A. Pattavina (Ed.), Information technology and the criminal justice system (pp. 125-146). Thousand Oaks, CA: Sage Publications.
McGarrell, E.F., & Chermak, S. (2003). Strategic approaches to reducing firearms violence: Final report on the Indianapolis violence reduction partnership. Washington, DC: U.S. Department of Justice, National Institute of Justice.
McIntyre, J. (1967). Public attitudes toward crime and law enforcement. Annals of the American Academy of Political and Social Science, 374, 34-36.
Miller, J., Davis, R.C., Henderson, N.J., Markovic, J., & Ortiz, C. (2005). Measuring influences on public opinion of the police using time-series data: Results of a pilot study. Police Quarterly, 8(3), 394-401.
Mirzer, M.L. (1996). Policing supervision in the 21st century. FBI Law Enforcement Bulletin, 65, 6-10.
Moore, M.H., & Poethig, M. (1999). The police as an agency of municipal government: Implications for measuring police effectiveness. In R.H. Langworthy (Ed.), Measuring what matters: Proceedings from the policing research institute meetings (pp. 151-168). Washington, DC: National Institute of Justice and the Office of Community Oriented Policing Services.
Moore, M.H., Thacher, D., Dodge, A., & Moore, T. (2002). Recognizing value in policing: The challenge of measuring police performance. Washington, DC: Police Executive Research Forum.
Morenoff, J.D., Sampson, R.J., & Raudenbush, S.W. (2001). Neighborhood inequality, collective efficacy, and the spatial dynamics of urban violence. Criminology, 39(3), 517-558.
Oettmeier, T.N., & Wycoff, M.A. (1997). Personnel performance evaluations in the community policing context. Washington, DC: Community Policing Consortium.
O'Keefe, G.J., Rosenbaum, D.P., Lavrakas, P.J., Reid, K., & Botta, R.A. (1996). Taking a bite out of crime: The impact of the National Citizens' Crime Prevention Media Campaign. Thousand Oaks, CA: Sage.
Parratt, S.D. (1938). A scale to measure effectiveness of police functioning. Journal of the American Institute of Criminal Law and Criminology, 28(5), 739-756.
Pate, T., Wycoff, M.A., Skogan, W.G., & Sherman, L.W. (1986). Reducing fear of crime in Houston and Newark: A summary report.
Washington, DC: Police Foundation.
Perkins, D., & Taylor, R. (1996). Ecological assessments of community disorder: Their relationship to fear of crime and theoretical implications. American Journal of Community Psychology, 24(1), 63-107.
Reisig, M.D. (1999). Measuring performance in the era of community policing. Executive Summary Series. East Lansing, MI: Regional Community Policing Institute.
Reiss, A.J., Jr. (1971). The police and the public. New Haven, CT: Yale University Press.
Reiss, A.J., Jr. (1986). Why are communities important in understanding crime? In A.J. Reiss, Jr., & M. Tonry (Eds.), Communities and crime, volume 8 (pp. 1-33). Chicago: University of Chicago Press.
Reiss, A.J., Jr., & Roth, J.A. (Eds.). (1993). Understanding and preventing violence. Washington, DC: National Academy of Sciences.
Reuland, M.M. (1997). Information management and crime analysis: Practitioner's recipes for success. Washington, DC: Police Executive Research Forum.
Roehl, J., Rosenbaum, D.P., Costello, S.K., Coldren, J.R., Schuck, A.M., Kunard, L., & Forde, D. (2006). Strategic approaches to community safety initiative (SACSI) in 10 U.S. cities: The building blocks of project safe neighborhoods. Washington, DC: U.S. Department of Justice, National Institute of Justice.
Rosenbaum, D.P. (Ed.). (1986). Community crime prevention: Does it work? Beverly Hills, CA: Sage.
Rosenbaum, D.P. (Ed.). (1994). The challenge of community policing: Testing the promises. Newbury Park, CA: Sage.
Rosenbaum, D.P. (1987). The theory and research behind neighborhood watch: Is it a sound fear and crime reduction strategy? Crime & Delinquency, 33(1), 103-134.
Rosenbaum, D.P. (1988).
Community crime prevention: A review and synthesis of the literature. Justice Quarterly, 5, 323-395.
Rosenbaum, D.P. (2002). Evaluating multi-agency anti-crime partnerships: Theory, design and measurement issues. Crime Prevention Studies, 14, 171-225.
Rosenbaum, D.P. (2004). Community policing and web-based communication: Addressing the new information imperative. In L.A. Fridell & M. Wycoff (Eds.), The future of community policing (pp. 93-113). Washington, DC: Police Executive Research Forum.
Rosenbaum, D.P. (2006). The limits of hot spots policing. In D. Weisburd & A.A. Braga (Eds.), Police innovation: Contrasting perspectives (pp. 245-263). New York: Cambridge University Press.
Rosenbaum, D.P. (2007). Police innovations post 1980: Assessing effectiveness and equity concerns in the information technology era. Paper prepared for the Knowledge Review Project, Institute for the Prevention of Crime, University of Ottawa.
Rosenbaum, D.P., & Baumer, T.L. (1981). Measuring fear of crime: A set of recommended scales. Evanston, IL: Westinghouse Evaluation Institute. (Prepared for the National Institute of Justice.)
Rosenbaum, D.P., Graziano, L., & Stephens, C. (2004). Municipal police websites as a portal for measuring organizational investment in community policing: Results of a national probability sample. Manuscript in preparation, University of Illinois at Chicago.
Rosenbaum, D.P., & Heath, L. (1990). The "psycho-logic" of fear reduction and crime prevention programs. In J. Edwards, E. Posavac, S. Tindel, F. Bryant, & L. Heath (Eds.), Applied social psychology annual (Vol. 9, pp. 221-247). New York: Plenum.
Rosenbaum, D.P., Lurigio, A.J., & Davis, R.C. (1998). The prevention of crime: Social and situational strategies. Belmont, CA: Wadsworth.
Rosenbaum, D.P., Schuck, A.M., Costello, S.K., Hawkins, D.F., & Ring, M.K. (2005). Attitudes toward the police: The effects of direct and vicarious experience. Police Quarterly, 8(3), 343-365.
Roth, J.A., Ryan, J.F., Gaffigan, S.J., Koper, C.S., Moore, M.H., Roehl, J.A., Johnson, C.C., Moore, G.E., White, R.M., Buerger, M.E., Langston, E.A., & Thacher, D. (2000). National evaluation of the COPS program: Title I of the 1994 Crime Act. Washington, DC: U.S. Department of Justice, National Institute of Justice.
Sampson, R.J. (1998). What community supplies. In R.F. Ferguson & W.T. Dickens (Eds.), Urban problems and community development (pp. 241-292). Washington, DC: Brookings Institution Press.
Sampson, R.J., & Groves, W.B. (1989). Community structure and crime: Testing social-disorganization theory. American Journal of Sociology, 94(4), 774-802.
Sampson, R.J., & Raudenbush, S.W. (1999). Systematic social observation of public spaces: A new look at disorder in urban neighborhoods. American Journal of Sociology, 105(3), 603-651.
Sampson, R.J., Raudenbush, S.W., & Earls, F. (1997). Neighborhoods and violent crime: A multilevel study of collective efficacy. Science, 277, 918-924.
Scheider, M.C., Rowell, T., & Bezdikian, V. (2003). The impact of citizen perceptions of community policing on fear of crime: Findings from twelve cities. Police Quarterly, 6(4), 363-386.
Schuck, A.M., & Rosenbaum, D.P. (2005). Global and neighborhood attitudes toward the police: Differentiation by race, ethnicity and type of contact. Journal of Quantitative Criminology, 21, 391-418.
Schuck, A.M., & Rosenbaum, D.P. (2006). Promoting safe and healthy neighborhoods: What research tells us about intervention. In K. Fulbright-Anderson (Ed.), Community change: Theories, practices and evidence (pp. 61-140). Washington, DC: The Aspen Institute.
Scott, E.J. (1981).
Calls for service: Citizen demand and initial police response. Washington, DC: U.S. Government Printing Office.
Scott, M. (2000). Problem-oriented policing: Reflections on the first 20 years. Washington, DC: U.S. Department of Justice, Office of Community Oriented Policing Services.
Shadish, W.R., Cook, T.D., & Campbell, D.T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston: Houghton Mifflin.
Skogan, W.G. (1986). The fear of crime and its behavioural implications. In E. Fattah (Ed.), From crime policy to victim policy (pp. 167-188). London: Macmillan.
Skogan, W.G. (1987). The impact of victimization on fear. Crime & Delinquency, 33, 135-154.
Skogan, W.G. (1990). Disorder and decline: Crime and the spiral of decay in American neighborhoods. New York: Free Press.
Skogan, W.G. (Ed.). (2003a). Community policing: Can it work? Belmont, CA: Wadsworth.
Skogan, W.G. (2003b). Accountability in management in the Chicago police department. Program Evaluation Summary, 1(2), 1-4.
Skogan, W.G. (2005). Citizen satisfaction with police encounters. Police Quarterly, 8(3), 298-321.
Skogan, W.G. (2006). Police and community in Chicago: A tale of three cities. Oxford: Oxford University Press.
Skogan, W., & Frydl, K. (Eds.). (2004). Fairness and effectiveness in policing: The evidence. Washington, DC: The National Academies Press.
Skogan, W.G., & Hartnett, S.M. (1997). Community policing, Chicago style. New York: Oxford University Press.
Skogan, W.G., Hartnett, S.M., DuBois, J., Comey, J.T., Kaiser, M., & Lovig, J.H. (1999). On the beat: Police and community problem solving. Boulder, CO: Westview Press.
Skogan, W.G., & Lehnen, R. (Eds.). (1985).
The national crime survey working papers, volume II: Methodological studies. Washington, DC: U.S. Government Printing Office.
Skogan, W.G., & Maxfield, M.G. (1981). Coping with crime: Individual and neighborhood reactions. Beverly Hills, CA: Sage.
Skogan, W.G., Rosenbaum, D.P., Hartnett, S.M., DuBois, J., Graziano, L., & Stephens, C. (2005). CLEAR and I-CLEAR: A status report on new information technology and its impact on management, the organization and crime fighting strategies. Chicago: The Illinois Criminal Justice Information Authority.
Skogan, W.G., & Steiner, L. (2004). Community policing in Chicago: Year ten. Chicago: The Illinois Criminal Justice Information Authority.
Skogan, W.G., & Steiner, L. (2004). CAPS at ten: Community policing in Chicago. Chicago: The Illinois Criminal Justice Information Authority.
Skogan, W.G., Steiner, L., Hartnett, S.M., DuBois, J., Bennis, J., Rottinghaus, B., Young Kim, S., Van, K., & Rosenbaum, D.P. (2002). Community policing in Chicago, years eight and nine: An evaluation of Chicago's alternative policing strategy and information technology initiative. Final report submitted to the Illinois Criminal Justice Information Authority. Evanston, IL: Northwestern University, Institute for Policy Research.
Skolnick, J.H., & Fyfe, J.J. (1993). Above the law: Police and the excessive use of force. New York: Free Press.
Sousa, W.H., & Kelling, G.L. (2006). Of 'broken windows,' criminology, and criminal justice. In D. Weisburd & A.A. Braga (Eds.), Police innovation: Contrasting perspectives (pp. 77-97). Cambridge: Cambridge University Press.
Stajkovic, A.D., & Luthans, F. (1998). Self-efficacy and work-related performance: A meta-analysis.
Psychological Bulletin, 124, 240-261.
Sunshine, J., & Tyler, T.R. (2003). The role of procedural justice and legitimacy in shaping public support for policing. Law & Society Review, 37(3), 513-548.
Surette, R. (1992). Media, crime, and criminal justice: Images and realities. Pacific Grove, CA: Brooks/Cole.
Tabachnick, B.G., & Fidell, L.S. (2001). Using multivariate statistics (4th ed.). Boston: Allyn and Bacon.
Taylor, R.B. (1999). The incivilities thesis: Theory, measurement, and policy. In R.H. Langworthy (Ed.), Measuring what matters: Proceedings from the policing research institute meetings (pp. 65-88). Washington, DC: National Institute of Justice and the Office of Community Oriented Policing Services.
Taylor, R.B. (2006). Incivilities reduction policing, zero tolerance, and the retreat from coproduction: Weak foundations and strong pressures. In D. Weisburd & A.A. Braga (Eds.), Police innovation: Contrasting perspectives (pp. 98-114). New York: Cambridge University Press.
Trojanowicz, R.C., & Harden, H.A. (1985). The status of contemporary community policing. East Lansing, MI: Michigan State University.
Tyler, T.R. (1990). Why people obey the law. New Haven, CT: Yale University Press.
Tyler, T.R. (2004). Enhancing police legitimacy. The Annals of the American Academy of Political and Social Science, 593(1), 84-99.
Tyler, T.R. (2005). Policing in black and white: Ethnic group differences in trust and confidence in the police. Police Quarterly, 8(3), 322-342.
Ullman, S.E. (1996). Do social reactions to sexual assault victims vary by support provider? Violence and Victims, 11, 143-156.
Walker, S. (2005). The new world of police accountability. Thousand Oaks, CA: Sage Publications.
Walker, S., & Katz, C.M. (2008). The police in America. Boston: McGraw-Hill.
Warr, M. (2000). Fear of crime in the United States: Avenues for research and policy. In D. Duffee (Ed.), Criminal justice 2000, volume 4: Measurement and analysis of crime and justice (pp. 451-490). Washington, DC: U.S.
Department of Justice.
Weisburd, D., & Braga, A.A. (Eds.). (2006). Police innovation: Contrasting perspectives. Cambridge: Cambridge University Press.
Weisburd, D., Mastrofski, S.D., Willis, J.L., & Greenspan, R. (2006). Changing everything so that everything can remain the same: Compstat and American policing. In D. Weisburd & A.A. Braga (Eds.), Police innovation: Contrasting perspectives (pp. 284-301). Cambridge: Cambridge University Press.
Weitzer, R. (2002). Incidents of police misconduct and public opinion. Journal of Criminal Justice, 30(5), 397-408.
Weitzer, R., & Tuch, S.A. (2005). Determinants of public satisfaction with the police. Police Quarterly, 8(3), 279-297.
White, M.D. (2007). Current issues and controversies in policing. Boston: Pearson Education.
Wilson, J.Q., & Kelling, G.L. (1982). Broken windows. Atlantic Monthly, 249, 29-38.
Wood, R., & Bandura, A. (1989). Impact of conceptions of ability on self-regulatory mechanisms and complex decision making. Journal of Personality and Social Psychology, 56, 407-415.

APPENDIX A
PRE-TEST AND POST-TEST POLICE QUESTIONNAIRES

2005 University of Illinois at Chicago Beat Meeting Questionnaire

Your cooperation will help us understand the opinions of CPD officers. Your responses to this survey will be kept strictly confidential! Thanks for your help.

1.
Besides this meeting, how many other beat community meetings have you attended in this beat during the past 12 months? (please circle the number)
0 1 2 3 4 5 6 7 8 9 10 11 12

Thinking about the people that you see at beat meetings, have you: (please circle the numbers)
(YES = 1; NO = 0)
2. Seen them around the beat?  1 0
3. Attended any other kinds of meetings with them?  1 0
4. Worked on any beat problems with them?  1 0

5. How satisfied are you with community attendance at meetings in this beat? (please circle the number)
very satisfied  1
somewhat satisfied  2
somewhat dissatisfied  3
very dissatisfied  4

6. How well do the residents who come represent this beat? (please circle the number)
very representative  1
somewhat representative  2
somewhat unrepresentative  3
not representative at all  4

7. What is the relationship between police and residents at meetings in this beat? (please circle the number)
very congenial  1
somewhat congenial  2
somewhat strained  3
very strained  4

8. These are questions about your daily work. (please circle the numbers)
(Very often = 4; Somewhat often = 3; Not very often = 2; Never = 1; Not part of my job = 8)
a. Is there discussion of your Beat Plans at beat team meetings?  4 3 2 1 8
b. How often are your Beat Plans updated?  4 3 2 1 8
c. Does your beat team consider resident input to help identify priority problems?  4 3 2 1 8
d. How often are you sent on an assignment because of a problem identified at a beat meeting?  4 3 2 1 8
e. When you're not involved in answering a call, how often do you make personal contact with people who live or work in this beat?  4 3 2 1 8

9.
We hear suggestions for improving beat meetings. Are these a priority in this beat? (please circle the numbers)
(Strongly Agree = 1; Agree = 2; Disagree = 3; Strongly Disagree = 4)
In this beat we need...
a. More reports from residents about what they have been doing to solve problems  1 2 3 4
b. More reports from police on what they have been doing to solve problems  1 2 3 4
c. The same police officers to attend more regularly  1 2 3 4
d. More civilian leadership of the meetings  1 2 3 4
e. Less talk about personal problems  1 2 3 4
f. More discussion of what residents should be doing before the next meeting  1 2 3 4
g. More training on what residents can do to solve neighborhood problems  1 2 3 4

10. These are questions about this beat. (please circle the numbers)
(Strongly Agree = 1; Agree = 2; Disagree = 3; Strongly Disagree = 4)
a. ICAM is useful for problem solving  1 2 3 4
b. This beat is too busy for problem solving; all the time is spent answering calls  1 2 3 4
c. The department's new computing technology has improved police service to the public  1 2 3 4

11. In your opinion, what is the most important problem affecting this beat right now?
_________________________________________________________________

12. Are you:  Male  1   Female  2

13. What is your racial background? (please circle)
African-American/Black  1
Latino/Hispanic  2
White (Non-Hispanic)  3
Asian  4
Other  5

14. What is your age? (please check one)
__ less than 25  __ 25-29  __ 30-39  __ 40-49  __ 50 or more

15. How old were you when you joined the CPD? __________ (age)

16. Is your usual assignment:
Beat team  1
Rapid response  2
Swing shift/swing job  3
Other: community policing office, beat sergeant, etc.  4

17. How long have you been working in this beat? ________ (years)  _________ (months)  _______ don't work here

Your responses will be kept strictly confidential. Thank you for your assistance.

2005 University of Illinois at Chicago Beat Meeting Questionnaire II

Your cooperation will help us understand the opinions of CPD officers. Your responses to this survey will be kept strictly confidential! Thanks for your help.

1. Besides this meeting, how many other beat community meetings have you attended in this beat during the past 12 months? (please circle the number)
0 1 2 3 4 5 6 7 8 9 10 11 12

Thinking about the people that you see at beat meetings, have you: (please circle the numbers)
(YES = 1; NO = 0)
2. Seen them around the beat?  1 0
3. Attended any other kinds of meetings with them?  1 0
4. Worked on any beat problems with them?  1 0

5. How well do the residents who come represent this beat? (please circle the number)
very representative  1
somewhat representative  2
somewhat unrepresentative  3
not representative at all  4

6. What is the relationship between police and residents at meetings in this beat? (please circle the number)
very congenial  1
somewhat congenial  2
somewhat strained  3
very strained  4

7. These are questions about your daily work. (please circle one number for each question)
(Very often = 4; Somewhat often = 3; Not very often = 2; Never = 1; Not part of my job = 8)
a. Is there discussion of your Beat Plans at beat team meetings?  4 3 2 1 8
b.
How often are your Beat Plans updated?  4 3 2 1 8
c. Does your beat team consider resident input to help identify priority problems?  4 3 2 1 8
d. How often are you sent on an assignment because of a problem identified at a beat meeting?  4 3 2 1 8
e. When you're not involved in answering a call, how often do you make personal contact with people who live or work in this beat?  4 3 2 1 8
f. How often does this beat community meeting follow the 5-step problem solving model?  4 3 2 1 8

8. These questions are about job satisfaction.
(Strongly Agree = 5; Agree = 4; Neutral = 3; Disagree = 2; Strongly Disagree = 1)
a. I enjoy nearly all the things I do on my job  5 4 3 2 1
b. I am satisfied with the amount of challenge in my job  5 4 3 2 1
c. I am dissatisfied with the amount of work I am expected to do  5 4 3 2 1
d. I like the kind of work I do very much  5 4 3 2 1

9. We hear suggestions for improving beat meetings. Are these a priority in this beat?
(Strongly Agree = 1; Agree = 2; Disagree = 3; Strongly Disagree = 4)
In this beat we need...
a. More reports from residents about what they have been doing to solve problems  1 2 3 4
b. More reports from police on what they have been doing to solve problems  1 2 3 4
c. More civilian leadership of the meetings  1 2 3 4
d. More discussion of what residents should be doing before the next meeting  1 2 3 4
e. More training on what residents can do to solve neighborhood problems  1 2 3 4

10. These are questions about this beat.
(Strongly Agree = 1; Agree = 2; Disagree = 3; Strongly Disagree = 4)
a. ICAM is useful for problem solving  1 2 3 4
b. This beat is too busy for problem solving; all the time is spent answering calls  1 2 3 4
c. The department's new computing technology has improved police service to the public  1 2 3 4
d. The Chicago Internet Project has improved police service to the public  1 2 3 4

11. Think about the problems in your beat. How often do the following sources of information contribute to your recognition of a problem?
(Almost Never = 1; Sometimes = 2; Often = 3; Always = 4)
a. Citizen complaint  1 2 3 4
b. Community Internet survey  1 2 3 4
c. Discussion at CAPS meeting  1 2 3 4
d. Departmental data  1 2 3 4
e. Personal observation  1 2 3 4
f. Supervisor  1 2 3 4
g. Council person (or public official)  1 2 3 4
h. Other city department/agency  1 2 3 4

12. In your opinion, what is the most important problem affecting this beat right now?
_________________________________________________________________

13. Please indicate whether you agree or disagree with the following statements about police work.
(Strongly Agree = 1; Somewhat Agree = 2; Somewhat Disagree = 3; Strongly Disagree = 4; Don't Know = 8)
a. The primary job of the police is to fight violent crime  1 2 3 4 8
b. When the police arrest a criminal, it is usually because of good detective work where they piece together all the clues  1 2 3 4 8
c.
Every time a police officer sees someone break the law, the officer should take action against the person  1 2 3 4 8
d. It is OK for citizens to use 911 for non-emergency calls  1 2 3 4 8
e. Sending police officers to community meetings reduces their ability to fight crime  1 2 3 4 8
f. Adding more police officers on the street will reduce the crime rate  1 2 3 4 8
g. Adding more citizen patrols and neighborhood watches will reduce the crime rate  1 2 3 4 8
h. Police officers should try to solve non-crime problems in their beat  1 2 3 4 8
i. The use of foot patrols is a waste of personnel  1 2 3 4 8
j. In certain areas of this city, an aggressive manner is more useful to an officer on the beat than is a courteous manner  1 2 3 4 8
k. Lowering citizens' fear of crime should be just as high a priority for this department as cutting the crime rate  1 2 3 4 8

14. How much of the Department's resources should be committed to the activities listed below?
(None = 1; Small Amount = 2; Moderate Amount = 3; Large Amount = 4)
a. Patrolling in squad cars  1 2 3 4
b. Patrolling on foot in neighborhoods  1 2 3 4
c. Investigating gangs and drug dealing  1 2 3 4
d. Getting to know juveniles  1 2 3 4
e. Helping settle family disputes  1 2 3 4
f. Explaining crime prevention techniques to citizens  1 2 3 4
g. Special aggressive enforcement units  1 2 3 4
h. Understanding problems of minority groups  1 2 3 4
i. Coordinating with other agencies to improve the quality of life in the city  1 2 3 4
j. Working with citizen groups to be responsive to local problems  1 2 3 4
k. Using community feedback on surveys to improve problem solving  1 2 3 4

15. Are you:  Male  1   Female  2

16. What is your racial background? (please circle)
African-American/Black  1
Latino/Hispanic  2
White (Non-Hispanic)  3
Asian  4
Other  5

17. What is your age?
__ Under 25  __ 25-29  __ 30-39  __ 40-49  __ 50 or more

18. How old were you when you joined the CPD? ________ (age)

19. Is your usual assignment:
Beat team  1
Rapid response  2
Swing shift/swing job  3
Other: community policing office, beat sergeant, etc.  4

20. How long have you been working in this beat? ________ (years)  _________ (months)  _______ don't work here

Your responses will be kept strictly confidential. Thank you for your assistance.

APPENDIX B
PRE-TEST AND POST-TEST CITIZEN QUESTIONNAIRES

2005 University of Illinois at Chicago CAPS Beat Meeting Questionnaire

Your cooperation will help us understand the opinions of community members. Your responses to this survey will be kept strictly confidential. Thanks for your help.

1. Besides this meeting, how many other beat meetings have you been able to attend during the past 12 months?
(please circle the number)
0 1 2 3 4 5 6 7 8 9 10 11 12

Thinking about the people that you see at beat meetings, have you: (please circle the numbers)
(YES = 1; NO = 0)
2. Seen them around the beat?  1 0
3. Attended any other kinds of meetings with them?  1 0
4. Talked with them on the phone?  1 0
5. Worked on any beat problems with them?  1 0

6. How long have you lived in this beat? _________ Years  _________ Months

7. Have you ever used the Chicago Police website or online ICAM crime mapping?  Yes  1   No  0

8. Do you have Internet access at home or at work?  Yes  1   No  0

9. Please think about the residents and police in your beat. (Please indicate if you agree or disagree; please circle the number)
(Strongly Agree = 1; Agree = 2; Disagree = 3; Strongly Disagree = 4)
In this beat...
a. Residents believe in themselves and what they can accomplish  1 2 3 4
b. Residents have the knowledge to solve area problems  1 2 3 4
c. Residents are not effective at solving neighborhood problems  1 2 3 4
d. The police and residents work well together when trying to solve beat problems  1 2 3 4
e. The police are not open to input and suggestions from residents  1 2 3 4
f. The police are good at sharing crime information with residents  1 2 3 4
g. The police are not good at keeping residents informed about what actions they are taking  1 2 3 4
h. I am satisfied with the partnership our neighborhood has created with the police  1 2 3 4

10. How safe do you feel or would you feel being alone outside in your neighborhood at night? Would you say: (please circle the number)
Very safe  1
Somewhat safe  2
Somewhat unsafe  3
Very unsafe  4

11. Please think about the relationships among your beat meeting's participants. (please circle the number)
(Strongly Agree = 1; Agree = 2; Disagree = 3; Strongly Disagree = 4)
a. In general, there is open communication among beat meeting participants  1 2 3 4
b. Our group has been successful at defining specific roles and responsibilities  1 2 3 4
c. As a group, our problem solving skills have not improved since we began this effort  1 2 3 4
d. Beat meeting participants are not a close-knit group  1 2 3 4

12. Please think about how you view your role in the community. (Please indicate if you agree or disagree)
(Strongly Agree = 1; Agree = 2; Disagree = 3; Strongly Disagree = 4)
In my neighborhood:
a. I know the things I need to do to stay safe when I'm out on the streets  1 2 3 4
b. I know the things I need to do to keep my home and property safe from crime  1 2 3 4
c. If I work with the police my neighborhood will be a safer place  1 2 3 4

We are asking these questions to learn about people's different experiences. Please tell us about yourself.

13. Are you:  Male  1   Female  2

14. In what year were you born? ________ Year

15. What is your background? Please circle the appropriate number.
Black/African American  1
Latino/Hispanic American  2
White/Caucasian  3
Middle Eastern  4
South Asian/Indian Subcontinent  5
Filipino  6
Korean  7
Vietnamese or Cambodian  8
Chinese  9
Other East Asian  10
Other (please write in): ______________________________________________  11

16. Do you own or rent your home?
Own or paying mortgage  1
Rent  2

17. Please circle the highest level of education you completed.
Did not finish high school  1
High School graduate/GED  2
Further technical/vocational training  3
Some college, but did not graduate  4
College graduate  5

18. To receive emails with a link for each month's survey, please let us know your email address:
_________________________________________________________________________________

THANK YOU

2005 University of Illinois at Chicago CAPS Beat Meeting Questionnaire II

Your cooperation will help us understand the opinions of community members. Your responses to this survey will be kept strictly confidential. Thanks for your help.

1. Besides this meeting, how many other beat meetings have you been able to attend during the past 12 months? (please circle the number)
0 1 2 3 4 5 6 7 8 9 10 11 12

Thinking about the people that you see at beat meetings, have you: (please circle the numbers)
(YES = 1; NO = 0)
2. Seen them around the beat?  1 0
3. Attended any other kinds of meetings with them?  1 0
4. Talked with them on the phone?  1 0
5. Worked on any beat problems with them?  1 0

6. How long have you lived in this beat? _________ Years  _________ Months

7. Have you ever used the Chicago Police website or online ICAM crime mapping?  Yes  1   No  0

8. Do you have Internet access at home or at work?  Yes  1   No  0

9. Please think about the residents and police in your beat. (Please indicate if you agree or disagree)
(Strongly Agree = 1; Agree = 2; Disagree = 3; Strongly Disagree = 4)
In this beat...
a. Residents believe in themselves and what they can accomplish  1 2 3 4
b. Residents have the knowledge to solve area problems  1 2 3 4
c. Residents are not effective at solving neighborhood problems  1 2 3 4
d. The police and residents work well together when trying to solve beat problems  1 2 3 4
e. The police are not open to input and suggestions from residents  1 2 3 4
f. The police are good at sharing crime information with residents  1 2 3 4
g. The police are not good at keeping residents informed about what actions they are taking  1 2 3 4
h. I am satisfied with the partnership our neighborhood has created with the police  1 2 3 4

10. How safe do you feel or would you feel being alone outside in your neighborhood at night? Would you say:
Very safe  1
Somewhat safe  2
Somewhat unsafe  3
Very unsafe  4

11. Please think about the relationships among your beat meeting's participants. (please circle the number)
(Strongly Agree = 1; Agree = 2; Disagree = 3; Strongly Disagree = 4)
a. In general, there is open communication among beat meeting participants  1 2 3 4
b. Our group has been successful at defining specific roles and responsibilities  1 2 3 4
c. As a group, our problem solving skills have not improved since we began this effort  1 2 3 4
d. Beat meeting participants are not a close-knit group  1 2 3 4

12. Please think about how you view your role in the community.
(Please indicate if you agree or disagree)
(Strongly Agree = 1; Agree = 2; Disagree = 3; Strongly Disagree = 4)
In my neighborhood...
a. I know the things I need to do to stay safe when I'm out on the streets  1 2 3 4
b. I know the things I need to do to keep my home and property safe from crime  1 2 3 4
c. If I work with the police my neighborhood will be a safer place  1 2 3 4

13. There have been six (6) Internet surveys as part of this project. To the best of your knowledge, how many of these surveys did you complete?
I completed ________ Internet surveys (If none, indicate 0)

13a. If you completed no surveys, why not?
______________________________________________________________________________

14. Have you seen any of the Internet survey results? (Charts may have been distributed at your beat meetings)
Yes  1   No  0

15. Do you recall any discussion of the Internet survey results at your beat meeting?
Yes  1   No  0

16. Would you be interested in seeing the Internet survey results for your beat?
Yes  1   No  0

17. Overall, how would you rate the Chicago Internet Project? Would you say this experiment was...
A big success  1
A moderate success  2
Neither success nor failure  3
A moderate failure  4
A big failure  5
Don't know  8

18. How often do you go online to access the Internet or World Wide Web from home?
I don't have Internet access at home  1
I go online every day  2
Several times a week  3
Several times a month  4
Just a few times a year  5
Never  6

19. How often do you go online to access the Internet or World Wide Web from work?
I don't have Internet access at work  1
I go online every day  2
Several times a week  3
Several times a month  4
Just a few times a year  5
Never  6
I am not working at this time  7

Please tell us about yourself.

20. Are you:  Male  1   Female  2

21. In what year were you born? ________ Year

22. What is your background? Please circle the appropriate number.
Black/African American  1
Latino/Hispanic American  2
Filipino  6
Korean  7
White/Caucasian  3
Middle Eastern  4
South Asian/Indian Subcontinent  5
Vietnamese or Cambodian  8
Chinese  9
Other East Asian  10
Other (please write in): ______________________________________________  11

23. Do you own or rent your home?
Own or paying mortgage  1
Rent  2

24. Please circle the highest level of education you completed.
Did not finish high school  1
High School graduate/GED  2
Further technical/vocational training  3
Some college, but did not graduate  4
College graduate  5

25. To receive the results from the Internet surveys, please let us know your email address:
______________________________________________________________________________

THANK YOU

APPENDIX C
BEAT MEETING OBSERVATION FORM

UIC BEAT MEETING OBSERVATION FORM 2005

1. ________ Interviewer ID #
2. ________ District and Beat (4 digits)
3. ________ Month (2 digits)
4. ________ Day (2 digits)

5. Status of this meeting (circle one)
1 meeting held
2 meeting not held: explanation ________________________________________ (end observation)

6. ______ ______ Time meeting began (military time; 7 pm is 19:00; 6:30 is 18:30)  hh mm

Count the house 30 minutes after the meeting begins. Exclude police in street clothes, city agency representatives, guest speakers, and others that you can identify as non-residents.

7. _______ Total number of residents attending
Breakdown of Residents (count the numbers)
7a. ______ number black/African-American
7b. ______ number Latino/Hispanic
7c. ______ number East Asian (Filipino, Korean, Chinese, Japanese)
7d. ______ number South Asian (Indian subcontinent)
7e. ______ number Middle Eastern (Palestinian, Iranian, Assyrian)
7f. ______ number Caucasian/white
7g. ______ number Don't Know - Can't Tell

Note Problems Discussed by Residents (for each category, mark big ____ or small ____)

1. Drugs (includes "possible")  big ____  small ____
Street sales or use
Drug house, building used for drugs
Drug-related violence, shooting
Drug-gang involved is coded here

2. Gangs  big ____  small ____
Gang violence
Gang intimidation
Gang graffiti
Gang recruiting
Gang loitering

3. Personal Crime  big ____  small ____
Robbery, purse snatching, mugging
Domestic violence
Assault or battery – general
Shooting, murder, homicide
Rape/sex assault

4. Property Crime  big ____  small ____
Burglary – home or business; theft
Burglary – garage
Car theft
Car break-ins/car damage; car vandalism; theft from cars
General theft; steal from patrons
Other property crime; bombs
Shoplifting
Con game; con elderly; pigeon drop
Deceptive practices

5. General Crime Conditions  big ____  small ____
Unspecific crime; generic activity
Fear of crime
Impact on area

6. Parking and Traffic  big ____  small ____
Drunk driving
Traffic congestion
Parking; double parking
Speeding; reckless; run stop signs

7. Public Involvement  big ____  small ____
Organizing/turnout problems
Need block clubs, local organizations
Not working together
Need to follow through on crimes/problems
Fear of retaliation for involvement

8. Police Negatives  big ____  small ____
Criticize 911/emergency response
Not enough police/need more visibility
Criticize CAPS, implementation
Criticize beat meetings: police role, commitment to CAPS, location
Criticize performance or actions

9. Physical Decay  big ____  small ____
Abandoned buildings
Run-down, dilapidated buildings
Abandoned cars
Graffiti & vandalism
Trash, junk, littering
Illegal dumping, fly dumping
Walks/street repair
Garbage: loose, in alleys, dumpsters
Lights out, too dark
Vacant/abandoned lots
Empty stores; commercial abandonment

10. City Services  big ____  small ____
Discuss/criticize services as a problem

11. Public Officials  big ____  small ____
Discuss/criticize officials as a problem

12.  big ____  small ____
Social Disorder
Playground/school ground problems
Public drinking
Bad stores, businesses, street vending
Liquor outlet problems
Fights/brawls (not gang)
Bad residents/building out of control
Noise, loud parties, loud street noise (not cars)
Illegal conversion
Carrying, owning guns; hear gunfire
Suspicious people, vehicles
Loud cars: muffler, loud music, etc.
Loitering, congregating, mob action
Dog/animal problems; poop
Panhandling, begging, street musicians
Prostitution; homeless, squatters
Curfew or truancy; loitering
Bad landlords; don't keep up, lose control
Teenager disturbance, noise, disorder
Runaway youths; squatting by runaways
Disruption, trouble in schools
Garbage pickers; dumpster divers
Gambling; trespassing
Car repair on the street
Public urination; public indecency, indecent exposure

13. Societal Problems  big ____  small ____
General social problems: pregnancy, unemployment, politics, etc.

Count Identified or Introduced Persons (afterward write in zeros if there were none)
8a. _____ number of city agency representatives (describe) ____________________________________

Count police even if they come and go (not including police trainee group); include police in civilian dress.
9. ________ Total Number of CPD Police Attending (do not count police not from the CPD)
                          Male          Female
number white officers     9a. ________  9b. ________
number black officers     9c. ________  9d. ________
number Latino officers    9e. ________  9f. ________
number Asian officers     9g. ________  9h. ________
number other race officers 9i. ________  9j. ________
number can't tell race    9k. ________  9l. ________

10. ______ ______ Time meeting ENDED (military time; 7 pm is 19:00; 6:30 is 18:30)  hh mm

To be completed at the end of the meeting, perhaps before you leave.

11. This meeting was conducted: (circle one)
1 English only
2 Spanish only
3 English with some Spanish translation
4 Spanish with some English translation
5 essentially bi-lingual
6 Other (describe below) ___________________________________________

12. Printed materials, brochures or handouts were available: (circle one)
1 English only
2 Some/all in Spanish
3 No materials or handouts
4 Other (describe) ____________________________________________

13. Presence of an agenda for the meeting: (circle one)
1 Printed
2 Announced
3 No clear agenda/not mentioned
4 Other (describe) ___________________________________________

14. Did someone read or summarize the minutes from the previous meeting? (circle one)
1 Yes
2 No

15. Were crime maps passed out or made available? (circle one)
1 Yes
2 No

16. Were crime/arrest reports (not maps) made available? (Ex: "Top Ten" list of crimes) (circle one)
1 Printed list(s) distributed
2 Verbal report only
3 Printed and verbal reports
4 None made available

17. Who principally chaired/conducted the meeting: (circle one)
1 Police
2 Civilian
3 Joint or shared leadership between police and a civilian

18. Was the Chicago Internet Project or presentation of web-survey results included... (circle all that apply)
1 On printed agenda
2 Announced agenda
3 No clear agenda/not mentioned
4 Other (describe) ________________________________________

19.
Were the Chicago Internet Project web survey results made available? (circle one)
    1 Printed copies distributed
    2 Verbal report only
    3 Printed and verbal reports
    4 None made available

20. Were the Chicago Internet Project web survey results mentioned or discussed in any way? (circle one)
    1 Yes
    2 No

IF YOU ANSWERED NO TO Q20, SKIP THE CIP BOX AND PROCEED TO Q26

CIP RESULTS

21. To what extent were CIP web survey results covered by the person running the meeting? (circle one)
    1 All results read
    2 Selected results read
    3 Residents told to read results
    4 Results not covered

22. How were residents encouraged to respond to CIP survey results? (circle all that apply)
    1 Asked for feedback
    2 Asked if any questions
    3 Presenter gave his/her own feedback
    4 Results not covered/No encouragement

23. Who dominated the discussion that took place about the CIP survey results? (circle one)
    1 Police
    2 Residents (civilians if not sure)
    3 Roughly equal

24. Did discussion of the CIP survey results include: (circle all that apply)
    1 Causes and nature of identified problems
    2 Police proposed solutions
    3 Residents proposed solutions
    4 Agreed course of action for police
    5 Agreed course of action for residents

25. Roughly how long did they talk about the CIP results/project? (circle one)
    1 Less than 5 minutes
    2 5-10 minutes
    3 11-15 minutes
    4 More than 15 minutes

26.
Overall, who dominated the discussion that took place during the meeting: (circle one)
    1 Police
    2 Residents (civilians if not sure)
    3 Roughly equal

27. ________ Approximate number of residents who spoke during the meeting

Of residents who spoke during the meeting, how many were… (circle one for each)

28a. Negative towards the police?
    1 None   2 Less than half   3 Half   4 More than half   5 All

28b. Supportive of the police?
    1 None   2 Less than half   3 Half   4 More than half   5 All

29. Sources of information for problems discussed: (circle all that apply)
    1 Officer (non-report)
    2 Citizen
    3 Chicago Internet Project survey results
    4 ICAM or crime reports
    5 Crime maps
    6 Personal experience
    7 Newspaper stories
    8 Other (describe) ______________________________

During the meeting, WHO referred to: (circle all that apply)
                                        Police   Residents   Both   Doesn't Apply
30a. CIP Project survey results            1         2         3          4
30b. ICAM or crime reports                 1         2         3          4
30c. Crime maps                            1         2         3          4
30d. Personal experience                   1         2         3          4
30e. Newspaper stories                     1         2         3          4
30f. Other (as described above)            1         2         3          4

During the meeting, who… (circle all that apply)
                                                                  Police   Residents   Both Equally   Not discussed
31. Identified most of the neighborhood problems brought up?         1         2            3               4
32. Brainstormed or proposed solutions for identified problems?      1         2            3               4
33. Proposed most of the solutions discussed?                        1         2            3               4
34. Reported back on previous problem-solving efforts?               1         2            3               4

35. Were volunteers identified or sign-up sheets passed around for a particular activity? (circle one)
    (This would commit individuals to ACTION.)
    1 Yes
    0 No

36. Were announcements made about specific meetings, rallies, marches, smokeouts, etc.
and participants urged to attend? (This would be INFORMATIONAL.) (circle one)
    1 Yes
    0 No

37. Were participants encouraged to call 311 or city service agencies to get problems resolved? (circle one)
    1 Yes
    0 No

38. Were participants encouraged to contact public officials, call/write letters to government? (circle one)
    (not including the 311 calls, etc. just above)
    1 Yes
    0 No

39. JUDGMENT: Did residents leave the meeting with a commitment to future action? (circle one)
    1 Yes
    0 No

CHECK BACK AND MAKE SURE YOU FILLED IN EVERYTHING!

APPENDIX D
DIRECTIVES MEMO FOR CPD PERSONNEL

BUREAU OF CRIME STRATEGY AND ACCOUNTABILITY
CAPS Project Office

10 April 2005

TO: Beat Team Leader & Alternate Beat Team Leader, Beat 2012 & 2013
FROM: Frank Limon, Assistant Deputy Superintendent, CAPS Project Office
SUBJECT: Chicago Internet Project

Refer to the following directive regarding your Beat's participation in the Chicago Internet Survey Project.

Objective: To maximize the number of CAPS attendees who participate in the online surveys as part of the Chicago Internet Project. Participation rates should increase each month in each beat. (You will be notified of your monthly progress.)
To achieve this objective, the following tasks should be completed each month:

Task 1: List the Chicago Internet Project on the meeting agenda.
Every month, from April to September, make sure the Chicago Internet Project is listed as an agenda item at beat community meetings. Also, leave at least 10 minutes for discussion.

Task 2: Download information for the next beat community meeting.
Several days prior to the beat meeting, the CAPS office will send you two things electronically: (1) the survey results from last month and (2) a flyer showing residents how to access the Internet survey for the following month. Download both of these documents, print them, and make enough copies for CAPS attendees.

Task 3a: Introduce the Chicago Internet Project.
"The next agenda item is the Chicago Internet Project. With this new program, the Chicago Police Department is using the Internet to measure how you feel about neighborhood problems, police services, and community activities in your police beat. Each month CAPS attendees and other residents in your beat complete a short online survey."

Task 3b: Make sure CAPS attendees provide their e-mail address on the sign-in sheet.
"When you sign in, make sure you give us your e-mail address so we can keep in touch with you about this survey and other events. We will pass around the sign-in sheet now if you forgot to include this information when you arrived. This would be the time to fill in your e-mail address." (Make sure the sign-in sheet has adequate space for e-mail addresses.)

Task 4: Distribute and discuss survey results from the last Internet survey.
"We are now going to pass out the results from the last survey and discuss them."
(Describe key findings. Encourage discussion.)

Task 5: Pass out the flyer and read the following.
"Now I am going to pass out a flyer that will explain how you can go online to complete next month's Internet survey. Please take the time to fill out the survey. Your participation is very important to us and will take only 5 to 10 minutes of your time. You should take the survey this month, even if you did not fill one out last month. Remember, your individual answers are completely confidential and not available to us. If you have any questions, you can call the phone number on the handout for more information. That number is 312-996-0764. Please go home and do this right away -- tonight or tomorrow, so you don't forget."

Task 6: Help those who claim they can't do it or seem reluctant.
"If you don't have a computer or don't have access to the Internet, here are some things you can do:
1. Go to the public library nearest your home. The library staff will help you out and give you free access to the Internet. But you must bring the flyer with you because it has all the necessary information on it! Does anyone need a copy of the flyer?"
2. (For modern police stations only): "Go to your local police station and ask for assistance. Because you are a CAPS participant, they will gladly let you use the computer for 10 minutes."

"If you have a computer but don't know how to work it or how to get to the Web: Ask your kids or grandkids or best friend to help you out. I'm sure they will gladly lend you a hand. Don't be embarrassed or afraid to ask for assistance. This is an important service to your neighborhood and to your beat."

"Are there any other reasons you haven't completed the Internet survey yet?" (Problem solve with them.)

Task 7: Give them an incentive to participate.
(For beats with raffles): "When you finish the survey online, print out the last page and bring it to the next beat meeting.
That will be your ticket to enter the raffle that we're going to hold at the next meeting. If you don't complete the survey and don't bring the last page with you, your name can't be entered in the raffle. We will be giving away some nice things to say 'thank you' to those of you who took the time to express your opinions about your beat."

APPENDIX E
INSTRUCTIONS FLYER FOR COMPLETING WEB SURVEY

MAKE THE CONNECTION ABOUT PUBLIC SAFETY
THE CHICAGO INTERNET PROJECT

Your beat is part of the Chicago Internet Project (CIP), a new program to increase knowledge about public safety issues in your neighborhood. Your opinions and experiences will be captured through a monthly online survey that is easy to complete and takes about 10 minutes. This is a joint project of the Chicago Police Department and the University of Illinois at Chicago, but your answers are completely confidential. The Chicago Police Department will not see your individual responses. The survey results will be presented and discussed at your next community beat meeting.

In the next 10 days, please complete this month's survey for your beat: http://thesurveywebsite
Your password is: XXX

If you have questions or problems related to the survey, please contact us:
Email: chicagointernetproject@yahoo.com
Telephone: (312) 996-0764

The Chicago Internet Project is managed through the University of Illinois at Chicago.
Any questions about this project may be directed to:
Dr. Dennis Rosenbaum
Center for Research in Law and Justice, UIC MC 141
1007 W. Harrison Street, Chicago, IL 60607
Phone: (312) 996-0764
[Protocol # 2004-0801]

APPENDIX F
EXAMPLE OF RESULTS DISTRIBUTED AT BEAT MEETINGS

CHICAGO INTERNET PROJECT
Beat 611 April 2005 Survey Results

The following information was collected from people in your neighborhood with an Internet survey.

Graph 1. Individual Safety Behavior
This graph shows how neighborhood residents feel about the following questions (percent of people who answered "Always" or "Frequently"):
    How often do you lock your doors when you are home? -- 100%
    How often do you leave the radio and TV on when you go out at night? -- 25%
    How often do you ask a neighbor to watch your home while you are away? -- 38%

Graph 2. Community Participation
This graph shows how neighborhood residents respond to, "People are willing to help their neighbors."
    Strongly agree -- 56%
    Somewhat agree -- 22%
    Somewhat disagree -- 22%
    Strongly disagree -- 0%

Graph 3.
Police – Community Partnership
This graph shows how neighborhood residents feel about the following statements (percent of people who answered "Strongly" or "Somewhat agree"):
    I do not have any ideas about how to reduce crime in my neighborhood -- 25%
    I know where to find information about crime prevention -- 86%
    I know how to work with the police to solve crime problems in my neighborhood -- 50%

Graph 4. Public Safety Activities of Residents
This graph shows the percent of neighborhood residents responding "Yes" to the following activities:
    Talked with your neighbors about crime issues -- 56%
    Attended CAPS beat meeting -- 22%
    Called 311 city services to request help or information -- 56%

APPENDIX G
PROBLEM SOLVING EXERCISE

FOR CPD USE ONLY: DO NOT DISTRIBUTE AT MEETING

COMMUNITY BEAT MEETING PROBLEM SOLVING EXERCISE (AUGUST, 2005)

In order to make the Internet survey useful and improve the problem solving process, it is important that the police engage residents in a discussion about the survey results. Use the space below to summarize the points of discussion at your beat meeting. A police officer is required to fill out this form and fax it to Sgt. Daly of the CAPS Project Office (Fax: 312-745-6854) no later than 48 hours after the beat meeting. The focus of this exercise is fear of crime.
Points to Cover in Discussion

The survey results from all beats in this study show that 30% of residents feel very or somewhat unsafe when walking alone at night in their neighborhood, about 9% feel unsafe when in a lobby or stairway, 50% feel unsafe in the local park, and 35% feel unsafe when walking to or from transportation and when riding public buses or trains (see graph below):

[Bar chart: The Percent of Residents Reporting Feeling Very or Somewhat Unsafe at Night -- categories: Walking alone; Lobby or stairway; Local park; Walking to/from transportation; On public buses or trains]

Question 1. Are there any places in this neighborhood where you feel unsafe? If so, what are they?

Question 2. What is causing the fear? (Certain people? Certain aspects of the environment? Lighting?)

Question 3. What kinds of activities do you think you could do, perhaps with neighbors, to reduce fear of crime in the community? (Suggestion: Mention again the results from Graph 5, Beliefs about Community and Public Safety -- most survey respondents in your beat believe that "If I work with community members, my neighborhood will be a safer place to live." Do CAPS participants have any ideas about how they can work with community members to reduce fear?)

TIPS FOR DISCUSSION:
    Let residents vent their feelings
    Consider community activities like block parties, Take the Night Back, walks through the parks
    Consider inviting a friend to walk, run or bike with you