Research Summary: This study investigates the impact of scientific research findings on public views of policing topics. Specifically, we conducted an original survey experiment to determine whether research information treatments influence respondents’ views on the effectiveness of the police in reducing crime, defunding and refunding police budgets, and the use of body-worn cameras. Our results indicate that presenting confirmatory research information has a significant positive impact on perceptions of police effectiveness in reducing crime and use of body-worn cameras. Conversely, presenting “negative” research information has a significant negative effect on these perceptions. Interestingly, neither positive nor negative research findings related to defunding versus refunding the police had a statistically significant impact on respondents, suggesting that research has limited effects on more ideologically complex policing topics. Policy Implications: Scientific research can effectively shape public perceptions of police effectiveness in reducing crime and the use of body-worn cameras, but it has limited effects on politically charged issues, such as defunding and refunding the police. To enhance the impact of evidence-based policing, we suggest that police administrators collaborate with researchers to evaluate new policies and disseminate these findings widely to the public. Additionally, researchers should strive to make their research more accessible to the general public, beyond academic journals, scientific conferences, and paywalls. We recommend using open-access platforms, social media, and other media outlets to disseminate unbiased, evidence-based research on policing that is digestible to the public.
Keywords: police budgets, body-worn cameras, police effectiveness, public perceptions of police, survey experiment
As one measure of scholarly success, scientific communities evaluate the ability of individual scholars, journals, or disciplines to “impact” the broader communities in which they operate. For example, Worrall and Gordon (2022) recently considered the impact of a flagship American Society of Criminology journal (Criminology & Public Policy); they asked, “Has its scholarship reached the masses? Has it been influential from a policy standpoint?” (p. 840). These are fundamental questions to answer for criminology broadly, as the topical concerns of scholars are often located in critically important questions involving state policing power, violence, trauma, and death (Clear, 2010). While the question of “impact” can be construed in many ways, one obvious potential target of influence is the general public, which has at least normative power to shape public policy related to policing in democratic locales (Nix et al., 2021). Crime, and by extension policing, are areas of democratic contention, and many scholars espouse a desire to inform these areas. Still, scholars harbor doubts about the accomplishments of “public criminology” (Loader & Sparks, 2013), leading Austin (2003) to comment that “…in terms of having any effect on criminal justice policy, there is little evidence that any criminologist’s career has made much of a difference” (p. 557).
Criminologists did not stand idle in the face of these critiques, as scholars made urgent calls for greater collective effort to translate criminology research into public action. Sherman (2005) urged criminology scholars to consider public views of the field as one measure of how the field was advancing. He suggested that the “future success of the field may depend on a growing public image based on experimental results, just as advances in treatment attract funding for basic science in medicine” (Sherman, 2005, p. 131). One example of this link between experimental emphasis and the general public’s view of criminology is the concerted effort to develop a rigorous evidence base for the effectiveness of policing operations. The evidence-based policing (EBP) movement is grounded in the belief that using scientifically-supported methods and strategies can improve the effectiveness and efficiency of policing (Lum & Koper, 2017). A potential impact of EBP is that emerging scientific evidence can improve, or degrade, public opinion on police-related subjects, with Sherman (2015) noting that EBP can “increase public support for what police do” (p. 18). However, scholarship is limited as to whether providing the public with knowledge of policing research actually impacts their perception of the police.
In 2009, Clear (2010) spoke before the American Society of Criminology, declaring that a new age of evidence-based criminology had arrived. He urged the Society to take a leading role in shaping the use and nature of evidence in policymaking, emphasizing the importance of making research available to policymakers and the general public. This call to action implies that empirical evidence can inform and guide decisions made for the greater good. When it comes to distribution, we know that social media can be an effective channel for disseminating research to the public and potentially to policymakers (Worrall & Gordon, 2022). However, it remains uncertain whether the public is even swayed by such evidence when it comes to issues surrounding law enforcement and the criminal justice system. Recent findings reveal that certain topics in policing, such as police shootings, confront deeply ingrained beliefs seemingly impervious to factual information. For instance, Schiff et al. (2022) show that even when presented with precise statistical data on police shootings, individuals’ beliefs about the rates of these incidents remained unchanged and were primarily driven by political affiliations and race. Similarly, Mourtgos and Adams (2020) observed that widening discrepancies between legal and community standards for police use of force are correlated with non-policy factors such as political ideology, race, and gender.
Using a multi-arm information provision survey experiment, we investigate whether making respondents knowledgeable of police-related research can, in fact, impact their perceptions of three salient policing topics: police effectiveness, police budgets, and body-worn cameras (BWCs). We deploy our experiment across heads of households in the state of South Carolina. Respondents were randomized and presented with either confirmatory, negative, or mixed scientific evidence regarding these policing topics. Findings from our sample of roughly 1,800 South Carolinians suggest that in certain conditions, knowledge of research can impact public perceptions of the police. Our findings have implications for researchers, including calls for greater dissemination of research findings, and the importance of police-researcher collaborations. Criminology research can bring value by informing the public of evidence-based practices, thereby shaping public opinion. We begin by first reviewing the importance of public opinion in shaping policy and current knowledge surrounding the impact of scientific evidence on public opinion.
Public perceptions of police matter a great deal, as public trust in police motivates pro-social behavior and serves as a critical foundation for an effective criminal justice system (Tyler, 1990, 2004). While perceptions of the police may vary by civilians’ race (Pickett et al., 2022), most Americans tend to be supportive and generally hold positive perceptions of the police (Callanan & Rosenberger, 2011; Wentz & Schlimgen, 2012). For example, a nationally representative poll recently found that about 69% of Americans hold a “great deal” or a “fair amount” of confidence in police officers, which is in line with the 77% who possess positive views toward scientists, and well above levels for journalists (40%) and elected officials (24%) (Kennedy et al., 2022).
However, public views on law enforcement have declined in the aftermath of highly publicized police misconduct in recent years, resulting in what some have deemed a police legitimacy crisis. Following the killing of George Floyd in 2020, some scholars have suggested that anti-police sentiment has reached unprecedented levels (Brenan, 2020; Cassella et al., 2022; Reny & Newman, 2021; Washburn, 2023; White et al., 2021). In support of this claim, scholars have linked increases in public hostility to increased numbers of officers leaving agencies (Mourtgos et al., 2022), and even a short-lived increase in the number of officers shot on duty (Sierra-Arévalo et al., 2023). Thus, police may be experiencing greater hostility, animosity, resistance, and violent assaults from civilians daily (Boehme & Kaminski, 2023; Sierra-Arévalo et al., 2023), while simultaneously struggling to maintain a staffed police force in order to effectively serve the public (Archbold, 2022).
The growing polarization of American politics, as well as the emergence of the COVID-19 pandemic, has arguably made matters worse for law enforcement. Political leanings seem to be a key driver in public perceptions and support of police (Brown, 2017; Fine et al., 2019; Liu & Cheng, n.d.; Mourtgos & Adams, 2020). Americans are generally skeptical of governmental power (Cook & Gronke, 2005) and distrust in public and private institutions has increased over time (Citrin & Stoker, 2018; Kennedy et al., 2022). The pandemic further perpetuated mistrust and scrutiny of governmental action (Airoldi & Vecchi, 2021). Police—the most visible agents of governmental coercive power (Bittner, 1990; Lipsky, 1973)—may bear the brunt of scrutiny that is inherently directed at other governmental entities (Reiner, 2010; Sharp & Johnson, 2009).
Given this context, it is imperative to understand what factors may influence the public’s perceptions of police, particularly during a police legitimacy crisis (Nix & Wolfe, 2017; Todak, 2017) and in an era of misinformation (Brashier & Schacter, 2020). It is possible that providing the public with evidence-based findings about police may shape public opinion of police (Bennell et al., 2021), which would be noteworthy in today’s political climate. In this circumstance, Clear’s (2010) message of using evidence to inform policymaking becomes actionable, given that the public can be swayed by evidence in updating their opinions, including opinions about police.
It is unclear whether research evidence can alter public opinion within the larger scientific context. The COVID-19 pandemic, for instance, challenged the scientific community in its attempts to influence the public to make sound health decisions regarding the COVID-19 virus (McLaughlin et al., 2021). Many civilians continued their everyday lives, rejecting peer-reviewed scientific knowledge about the virus, benefits of mask-wearing, and vaccines/boosters (Eberl et al., 2021). Thus, while scientists investigated the negative effects of unhealthy lifestyles and the COVID-19 virus, empirical research did not sway all Americans' perceptions, beliefs, and behaviors (Kroke & Ruthig, 2022). Similarly, despite scientific evidence that points to the negative effects of smoking, unhealthy diets, and a lack of exercise, Americans continue to ignore this scientific knowledge (Kabat, 2017). Cigarette smoking and obesity are two leading causes of preventable deaths in the United States (Wang et al., 2020).
Scholars have previously examined the impact of information provisions on public perceptions of police effectiveness in executing their duties, BWCs, and police reform (Demir, 2019; Pickett & Ryon, 2017; Singer & Cooper, 2009; Vaughn et al., 2022). For example, Donovan and Klahm IV (2018) found that presenting respondents with information about wrongful convictions increased their willingness to acknowledge that police misconduct (e.g., obtaining false confessions through threat or use of force) may contribute to the problem of wrongful convictions. With respect to BWCs, Demir (2019) distributed surveys to drivers who had been pulled over by officers randomly assigned to wear BWCs, and found that drivers stopped by BWC-wearing officers reported better perceptions of officer behavior and treatment and were less likely to believe that police are corrupt and/or lawless.
Turning attention to police reforms, Vaughn and colleagues (2022) experimentally surveyed Americans, presenting either the slogan (e.g., “defund the police”) or the substance (a description of each movement’s goals) of the police reform, defund the police, and abolish the police movements. Aside from general police reform, the authors found that respondents largely did not support the defund and abolish the police movements in either slogan or substance (see other work on criminal justice reforms in Pickett & Ryon, 2017). Schiff and colleagues (2022) experimentally informed respondents about the reported number of police shootings in their city and asked about their support for five proposed police reforms. The findings revealed that providing statistics on police shootings did not impact respondent support for police reform policies. Instead, political partisanship and race were the key driving factors behind support for police reform, and the presentation of scientific evidence did not influence deeply rooted beliefs about police reforms. It may be that deeply rooted and ideologically based beliefs are harder to change with the presentation of scientific evidence (see other relevant experiments on various policing topics in Mullinix et al., 2021; Mummolo, 2018a, 2018b; Nix et al., 2021; Wozniak et al., 2021). Given these mixed findings, more research is needed on how the presentation of information and scientific evidence affects public perceptions of police.
It is relevant to note that other studies have used information presentation to assess impacts on public perceptions of other criminal justice topics, such as sexual assault myths, punitive attitudes toward offenders, the death penalty, criminal justice system effectiveness, and crime prevention strategies (Bohm et al., 1991; Bohm & Vogel, 2004; Cochran & Chamlin, 2005; Indermaur et al., 2012; McMahon, 2010; O’Donohue et al., 2003; Pickett et al., 2020; Wozniak et al., 2022). For instance, focusing on the death penalty, Norris and Mullinix (2020) experimentally presented wrongful conviction exoneration statistics and found that respondents’ support for the death penalty decreased, while trust in the criminal justice system was also reduced. Providing wrongful conviction narratives (e.g., stories about wrongfully convicted individuals) influenced attitudes toward the death penalty and support for police reform but had little effect on trust in the criminal justice system. In relation to residence restrictions, Rydberg and colleagues (2018) randomly presented either the experiences of criminal justice actors or a summary of research findings on the difficulties created by residence restrictions for sex offenders to assess the effect on public perceptions of the effectiveness of such laws. Regardless of experimental condition, participants were unmoved by the stimuli and retained high levels of support for residence restriction laws (also see Novick et al., 2022 for a study on a similar topic).
In the interest of further understanding how scientific evidence shapes public opinion, the present study uses a survey experiment designed to test whether providing various police research information influences public opinion on three timely topics: 1) police effectiveness in reducing crime, 2) perceptions of police budgets, and 3) perceptions of body-worn cameras. America has experienced a recent surge in violent crime rates nationally (Brantingham et al., 2021), with some scholars promoting public health approaches as a solution compared to police initiatives (Cerdá et al., 2018). Calls to defund and abolish the police reached new levels in the summer of 2020, and though much of that momentum has waned, the debate remains nationally relevant and politically charged (Baranauskas, 2022; Vaughn et al., 2022). Further, body-worn cameras were originally aimed at improving police accountability and reducing use of force (Lum et al., 2016), though the impact of this technology has been uneven (Gaub & White, 2020; Lum et al., 2019). Substantiating the link between scholarly research and public opinion on these topics may help police administrators, policymakers, and researchers in efforts to shape public opinion of the police.
We test two related hypotheses on the causal relationship between scientific evidence and public opinion on policing topics.
H1: Respondents exposed to confirmatory research findings (versus mixed research findings) about police effectiveness, budgetary questions, and body-worn cameras will express more positive views of those subjects.
H2: Respondents exposed to negative research findings (versus mixed research findings) about police effectiveness, budgetary questions, and body-worn cameras will express more negative views of those subjects.
To test the above hypotheses, we conducted an original survey using statewide head of household contact data from Mailers Haven (2022), a third-party mailing list provider. Mailers Haven makes multiple attempts per year to verify the accuracy of household addresses and identifies the head of household using various databases (e.g., United States Postal Service). Addresses that cannot be verified, unoccupied households, and households that place themselves on “no contact” lists are excluded. Additionally, we are able to see the age, race, email address (if available), and zip code of each head of household. The initial dataset provided a total of 1,723,480 head of household mailing addresses in South Carolina.1 Within these data, 680,745 of the mailing addresses had an email address for the head of household, which constitutes the sampling frame for this study. Although this sampling frame may look different from the full list of all households (including those without email addresses), there is evidence to suggest that a probability sample excluding those without email addresses still elicits externally valid results, even if there are slight demographic differences between the two groups (Keeter & McGeeney, 2015; McMaster et al., 2017; Patten & Perrin, 2015).
Invitations to participate were emailed via Qualtrics starting on October 12, 2022 and ending on November 12, 2022, with four periodic reminder emails sent throughout the timeframe. Of the original emails, 650,154 were delivered (though an unknown number were seen, and many can be assumed to have landed in spam folders). A total of 2,094 respondents started the survey (0.32%), while 1,814 completed it (an 87% completion rate).2 Among those who started the survey but did not complete it, the mean age was 52, with 79%, 16%, and 4% identifying as White, Black, and Hispanic, respectively. In contrast, those who completed the survey reported an average age of 58, with 88%, 8%, and 3% identifying as White, Black, and Hispanic, respectively. In short, those who completed the survey were older and more often identified as White rather than Black (the percentage of Hispanic individuals was equivalent in both groups) than those who started but did not finish.
We relied upon the American Association of Public Opinion Research calculator (AAPOR, 2023) to estimate a response rate using the RR2 calculation, which came to 0.3%. Although this is a low response rate, studies have shown that low response rates are not necessarily indicative of nonresponse bias (Pickett et al., 2018; Pickett, 2017). Further, based on an internal beta-testing experiment, we found that only 20% of study team affiliates who were sent emails using the same distribution method received the invitation email in their primary inbox, highlighting that our effective response rate is likely higher among those who actually received an invitation (i.e., whose invitation did not go to spam or an alternative folder). Also, we used an experimental survey design so that a low response rate does not interfere with our ability to derive accurate estimates of the causal link between our treatments and respondents’ perceptions. However, that identification rests on the assumption that the observed characteristics of respondents are balanced across treatments. We find that respondents' characteristics were well balanced and report those results in Appendix Table A1.
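As a transparency check, the headline numbers above can be reproduced with a few lines of arithmetic. This is an illustrative sketch, not the authors' code, and it simplifies AAPOR's RR2 by treating every delivered email as an eligible case (the full RR2 formula partitions non-respondents into refusals, non-contacts, and cases of unknown eligibility):

```python
# Counts reported in the text
delivered = 650_154      # invitation emails that were delivered
started = 2_094          # respondents who started the survey (completes + partials)
completed = 1_814        # respondents who completed the survey

# Simplified RR2: (complete + partial interviews) / eligible cases,
# here assuming all delivered emails were eligible cases.
rr2 = started / delivered
completion_rate = completed / started

print(f"RR2 ≈ {rr2:.2%}")                # ≈ 0.32%, reported in the text as 0.3%
print(f"Completion ≈ {completion_rate:.0%}")  # ≈ 87%, matching the reported rate
```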
Respondent characteristics are reported in Table 1. Our average respondent was a 59-year-old White male with a four-year college degree who self-identified as having a conservative ideology. The average respondent was married, without children in the home, and had not been the victim of a crime in the last twelve months. These respondent characteristics contrast with the latest South Carolina US Census demographics, in which 51.4% of the population identifies as female, 68.6% identifies as White, and the median age is 39.7. As such, we are careful about generalizing the findings to the larger South Carolina population.
TABLE 1 Descriptive characteristics of respondents
[Table 1 row labels recovered from the source (values not preserved): Police Procedural Justice; Trust in Police; Liberal (reference category); No diploma or GED; Finished High School; Crime Victim (last 12 months)]
Notes: SD = Standard deviation; N = Number of respondents; final sample = 1,519
The study's outcome measures focused on three themes: 1) police effectiveness in reducing crime, 2) police budgets, and 3) BWCs. Prior to administering questions about each topic, all respondents were randomly presented with either "confirmatory," "negative," or "mixed" research findings. Respondents were randomly assigned research findings within each topic, such that respondents received an information treatment for each topic and answered questions about each topic after being presented with the randomized treatments. Respondents were presented with the research information once on its own page and then a second time at the top of the page above the relevant themed questions. Note in Table 2 that we highlighted “buzzwords” in all caps to draw respondents’ attention to the key research findings. The "mixed" treatment was employed as the control condition, representing exposure to the mixed scientific evidence on policing that the general public is likely to encounter. We recognize that a control condition with no information would have implied complete unawareness of policing operations, which is unlikely for the general public. Table 2 reports each experimental condition. The information treatments were kept brief to reflect how many Americans consume information through platforms like Twitter (Beck et al., 2017).
Following the randomized treatment, respondents were asked questions to evaluate their perceptions of each topic. Treatment varied for each respondent, based on the outcome of interest. For example, one respondent might receive confirmatory research findings about police effectiveness in reducing crime, followed by questions about perceptions of police effectiveness. Subsequently, the same respondent might be provided with a confirmatory, negative, or mixed research finding about police budgets, followed by questions about police budgets. This process was repeated throughout the survey. Question ordering was also randomized for each outcome of interest, so that respondents were asked questions about police effectiveness, budgets, and BWCs in a varied order.
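The assignment scheme described above can be sketched as follows. This is an illustrative reconstruction, not the study's instrument, assuming independent treatment draws per topic and a shuffled topic-block order:

```python
import random

# Hypothetical labels for the three topic blocks and three information treatments
TOPICS = ["effectiveness", "budgets", "bwc"]
TREATMENTS = ["confirmatory", "negative", "mixed"]

def assign_conditions(rng: random.Random) -> list[tuple[str, str]]:
    """Return one respondent's shuffled list of (topic, treatment) pairs.

    Each topic block independently draws one of the three treatments,
    and the order of the blocks themselves is randomized.
    """
    order = TOPICS[:]
    rng.shuffle(order)                                  # randomize block order
    return [(topic, rng.choice(TREATMENTS)) for topic in order]

rng = random.Random(42)    # fixed seed only so the sketch is reproducible
plan = assign_conditions(rng)
```

Because draws are independent across topics, a given respondent can receive, say, confirmatory information for effectiveness but negative information for budgets, which matches the design described above.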
TABLE 2 Treatment Conditions
Police Effectiveness Treatments
Confirming Info: “Scientific research finds that INCREASING POLICE PRESENCE in neighborhoods leads to LOWER CRIME rates and MORE ARRESTS of violent criminals.”
Negative Info: “Scientific research finds that INCREASING POLICE PRESENCE in neighborhoods DOES NOT LOWER CRIME rates and DOES NOT LEAD TO MORE ARRESTS of violent criminals.”
Mixed Info: “Some scientific research finds that increasing police presence in neighborhoods leads to lower crime rates and more arrests of violent criminals, BUT other scientific research does not find this to be the case.”
Police Budget Treatments
Confirming Info: “Scientific research finds that INCREASING POLICE BUDGETS for hiring, retaining, and training officers LOWERS CRIME rates within neighborhoods.”
Negative Info: “Scientific research finds that DECREASING POLICE BUDGETS and shifting money to social services, alcohol/drug rehabilitation, and mental health resources LOWERS CRIME rates in neighborhoods.”
Mixed Info: “Some scientific research finds that increasing police budgets for hiring, retaining, and training officers lowers crime rates within neighborhoods, BUT other scientific research finds that decreasing police budgets and shifting money to social services, alcohol/drug rehabilitation, and mental health resources lowers crime rates in neighborhoods.”
Body-Worn Camera Treatments
Confirming Info: “Scientific research finds police body-worn cameras REDUCE POLICE USE-OF-FORCE used on citizens and REDUCE COMPLAINTS, as well as IMPROVE TRANSPARENCY and SAFETY for citizens.”
Negative Info: “Scientific research finds police body-worn cameras DO NOT REDUCE USE-OF-FORCE used on citizens and DO NOT REDUCE COMPLAINTS, as well as DO NOT IMPROVE TRANSPARENCY and SAFETY for citizens.”
Mixed Info: “Some scientific research finds that police body-worn cameras reduce use-of-force used on citizens and reduce false complaints, as well as improve transparency and safety for citizens, BUT other scientific research does not find this to be the case.”
To develop the survey items on police legitimacy, trust in police, police effectiveness, police budgets, body-worn cameras, and demographics, we synthesized previous empirical literature on these topics. We conducted a pilot test of the survey at the University of South Carolina’s Patient Engagement Studio (PES), which assembles a diverse and representative group of “patients” (i.e., South Carolinian residents) to provide feedback on the survey, discuss survey dissemination logistics, and offer any other suggestions for the project. Following the feedback from the PES, we made the necessary modifications to the survey and conducted a second pilot test with various faculty, graduate students, and undergraduate students from different departments within the university. The survey was then updated once more in consideration of the pilot results and feedback received. The study, including the treatment structure, was approved by the University’s Institutional Review Board.
We created four dependent variables by combining items into indices measuring respondent perceptions of police effectiveness, police funding (defund/more funding), and BWCs. Scale reliability was assessed using Cronbach’s alpha (Santos, 1999). Police effectiveness was measured by combining five survey items: 1) police play a key role in preventing crime, 2) police play a key role in maintaining law and order, 3) police are effective at fighting crime, 4) the more visible police are in my neighborhood, the safer it is, and 5) more police in my neighborhood makes my neighborhood safer (α = 0.924). Refund police was measured with three survey items: 1) more money from my local government should be allocated to hiring, retaining, and training police officers, 2) police officers in my neighborhood should be paid more, and 3) more money from my local government should be spent on providing officers in my neighborhood with more equipment (e.g., vehicles, technology, safety equipment) (α = 0.856). Defund police was measured with three survey items: 1) money should be taken from my local police department’s budget and given over to social services, 2) in general, I agree with the aims of the “defund the police” movement, and 3) in general, I agree with the aims of the “abolish the police” movement (α = 0.857). Perceptions of BWCs were measured using five survey items: 1) BWCs should be worn by all police officers, 2) I would feel safer in my neighborhood if I knew police officers were wearing BWCs, 3) I trust police actions more when I know they wear a BWC, 4) using BWCs will make officers act more professionally, and 5) the use of BWCs will reduce complaints against the police (α = 0.833).
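For readers unfamiliar with the reliability statistic used here, a minimal sketch of Cronbach's alpha on fabricated Likert-style data (not the study's data) might look like:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of scores."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of the summed scale
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Fabricated 5-point Likert responses: a shared latent component plus item noise,
# so the five items are positively correlated (as index items should be).
rng = np.random.default_rng(0)
latent = rng.integers(1, 6, size=(200, 1))           # shared component, 1..5
noise = rng.integers(-1, 2, size=(200, 5))           # per-item noise, -1..1
fake_items = np.clip(latent + noise, 1, 5).astype(float)
alpha = cronbach_alpha(fake_items)                   # high alpha for correlated items
```

Values above roughly 0.8, like those reported for the study's indices, indicate that the items move together closely enough to justify averaging them into a single scale.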
Survey items were measured on a five-point Likert scale (1 = strongly disagree, 5 = strongly agree), such that higher values represent greater perceptions of police effectiveness, more support for funding the police, more support for defunding the police, and more support for BWCs, respectively.
The main independent variable of interest was whether the respondent received a confirmatory, negative, or mixed research finding before each block of questions, denoted as confirmatory information, negative information, and mixed information. Each condition was coded as a dummy variable, with the mixed condition used as the reference category. We also added several observational control variables to sharpen parameter estimates in the model, account for theoretically important effects, and test for heterogeneity in the treatment effects (Kern et al., 2016).
We controlled for respondent pre-existing perceptions of police procedural justice, obeying the police, and trust in police, which were a series of questions asked at the beginning of the survey before the experiment began. Each of the measures was drawn from previous research operationalizing these concepts (Pickett et al., 2018; Pryce et al., 2017). Police procedural justice was measured by combining six survey items, where respondents were asked if police 1) treat everyone equally, 2) clearly explain the reasons for their actions, 3) treat people with dignity and respect, 4) treat people fairly, 5) respect people’s rights, and 6) listen to suspects before making any decisions about how to handle a case (α = 0.961). Police legitimacy was captured in the form of both obedience and trust. Respondent legal orientation to obey the police and law was measured using four survey items: 1) people should obey the law even if it goes against what they think is right, 2) I always try to follow the law even when I think it is wrong, 3) you should do what the police tell you even if you disagree, and 4) you should accept police decisions even if you think they are wrong (α = 0.802). Trust in police was measured using four survey items: 1) the police protect people’s basic rights, 2) the police are generally honest, 3) most police officers do their jobs well, and 4) the police can be trusted to do what’s right for my neighborhood (α = 0.940). Responses were on a 5-point Likert scale (strongly disagree to strongly agree), such that higher values on each of these indices suggest greater procedural justice perceptions, obedience toward the police and law, and perceptions of trust in the police.
We also asked about respondents’ political leanings, gender, race, education, age and if the respondent had been a victim of a crime in the past year. A long line of previous research expects these factors might contribute to baseline differences in an individual’s views on criminal justice matters, and are commonly included in scholarship interested in respondent’s views on policing matters (e.g., Metcalfe & Pickett, 2022). For example, Mourtgos and Adams (2020) find that race, political ideology, and education were all significantly related to respondents’ views on the reasonableness of police use of force. Similarly, we asked respondents about their personal experience with crime in the last 12 months because of the known relationship between crime victimization and views on crime (Unnever et al., 2007).
We used the following general model specification to identify treatment effects of interest:

Y_i = β₀ + β₁(Confirming_i) + β₂(Negative_i) + γX_i + ε_i

where Confirming and Negative correspond to the two treatments providing scientific evidence framed around the outcomes, compared to the mixed treatment condition, and Y_i is each of the four outcome variables of interest described above. X refers to the vector of covariates that we included, namely the demographic characteristics of our respondents, their self-identified partisan leanings, whether they had been the victim of a crime in the last twelve months, and their perceptions of police procedural justice, obeying the law, and trust in police.
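A hedged sketch of estimating such a specification on simulated data, using plain least squares with treatment dummies against the mixed-information reference category. Variable names, the single stand-in covariate, and the "true" coefficient values are illustrative only, not the study's data or results:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_519                                   # final analytic sample size from the text
treat = rng.integers(0, 3, size=n)          # 0 = mixed (reference), 1 = confirming, 2 = negative
confirming = (treat == 1).astype(float)
negative = (treat == 2).astype(float)
trust = rng.normal(3.5, 1.0, size=n)        # hypothetical stand-in for one control covariate

# Simulated outcome with illustrative treatment effects and noise
y = 3.0 + 0.13 * confirming - 0.18 * negative + 0.4 * trust + rng.normal(0.0, 1.0, size=n)

# OLS via least squares: columns are [intercept, confirming, negative, trust]
X = np.column_stack([np.ones(n), confirming, negative, trust])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With the mixed condition omitted as the reference, the coefficients on the two dummies estimate each treatment's shift in the outcome relative to receiving mixed information, which is exactly how the treatment effects in Table 3 are read.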
We report the controlled results of the experiment in Table 3 below. After listwise deletion of observations with missing information on the outcomes and control variables, the final sample size was 1,519 respondents. Recognizing that the inclusion of control variables can introduce bias (Berk et al., 2013; Freedman, 2008; Lin, 2013), the uncontrolled experimental results are also reported in Appendix Table A6. Results were consistent between the two models, though we elected to concentrate on our controlled model given the interesting findings regarding respondent perceptions, demographics, and experience. Variance inflation factors (VIF) suggest low potential for multicollinearity, with VIF results for each model reported in Appendix Table A7. In general, we find that when presented with either confirming or negative scientific information related to police effectiveness and BWCs, respondents’ perceptions of those topics were altered. However, we failed to support the hypothesis that respondents’ perceptions related to “defunding” or “refunding” the police were altered when presented with scientific information that conflicted or supported those policy options. These results are reviewed in detail below.
TABLE 3 Full OLS Regressions Predicting Attitudes about Police Effectiveness, Defund, Refund, and Body-Worn Cameras
[Table body omitted; predictors shown include the experimental treatments, police procedural justice, obeying the law, trust in police, and respondent characteristics.]
Notes: Unstandardized coefficients presented with standard errors in parentheses. With respect to the experimental treatments, the presentation of mixed information was used as the reference category.
+ p < 0.1, * p < 0.05, ** p < 0.01, *** p < 0.001 (two-tailed).
We found experimental evidence that the public’s views on police effectiveness are causally altered through exposure to information about scientific findings on the subject. In short, public opinion on police effectiveness was more positive when exposed to confirmatory scientific information, and more negative when exposed to information that undercuts police effectiveness. Our analysis supported the hypothesis that respondents’ beliefs about police effectiveness were significantly affected by the provision of relevant scientific information. Both the confirming information (b = 0.13, 95% CI [0.06, 0.20], p < .001; β = 0.07, 95% CI [0.03, 0.11]) and negative information (b = -0.18, 95% CI [-0.25, -0.11], p < .001; β = -0.10, 95% CI [-0.13, -0.06]) experimental treatments were significant in the hypothesized directions. Effect sizes for the experimental treatments were interpreted as very small (confirming information) and small (negative information) using guidelines suggested by Funder and Ozer (2019).
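For readers applying the same benchmarks, the Funder and Ozer (2019) guidelines can be operationalized as a simple lookup over the absolute standardized coefficient; the thresholds below (.05, .10, .20, .30) are an approximate reading of their recommendations, not a definitive coding:

```python
def funder_ozer_label(beta):
    """Map a standardized coefficient to approximate effect-size labels
    following Funder and Ozer (2019): roughly .05 = very small,
    .10 = small, .20 = medium, .30 = large."""
    r = abs(beta)
    if r < 0.05:
        return "negligible"
    if r < 0.10:
        return "very small"
    if r < 0.20:
        return "small"
    if r < 0.30:
        return "medium"
    return "large"

# The standardized treatment effects reported for police effectiveness:
print(funder_ozer_label(0.07))   # confirming information
print(funder_ozer_label(-0.10))  # negative information
```

Applied to the effectiveness results, the confirming-information coefficient falls in the "very small" band and the negative-information coefficient in the "small" band, matching the interpretations given in the text.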
We found a similar pattern in the effect of scientific information regarding BWCs on how the public views these cameras. Public opinion on BWCs was more positive when presented with confirmatory scientific information, just as it was more negative when confronted with negative scientific information. Our analysis supported the hypothesis that respondents’ beliefs about police body-worn cameras were significantly affected by the provision of scientific information on BWCs, although the r-squared value suggests that other factors not accounted for also influence these perceptions. Both the confirming information (b = 0.13, 95% CI [0.04, 0.22], p = 0.004; β = 0.08, 95% CI [0.03, 0.14]) and negative information (b = -0.23, 95% CI [-0.32, -0.14], p < .001; β = -0.15, 95% CI [-0.20, -0.09]) experimental conditions were significant in the hypothesized directions. Effect sizes for the experimental treatments were interpreted as very small (confirming information) and small (negative information).
Based on our hypotheses, we expected a uniform impact of scientific information on public opinion across each of the policing topics. We rejected those hypotheses in part, as our experimental evidence showed a null effect on the highly politicized topic of police budgets (i.e., “defund” versus “refund” the police) (Baranauskas, 2022; Vaughn et al., 2022). Specifically, we did not find evidence that respondents’ beliefs about defunding the police were significantly affected by the provision of relevant scientific information. The effect of confirming information was statistically non-significant and negative (b = -0.03, 95% CI [-0.11, 0.05], p = 0.422; β = -0.02, 95% CI [-0.06, 0.02]), as was the effect of negative information (b = -0.00, 95% CI [-0.08, 0.08], p = 0.995; β = -0.00, 95% CI [-0.04, 0.04]).
Though effects were in the hypothesized directions, we did not find evidence that respondents’ beliefs about refunding the police were significantly affected by the provision of relevant scientific information. The effect of confirming information was statistically non-significant and positive (b = 0.05, 95% CI [-0.03, 0.13], p = 0.243; β = 0.03, 95% CI [-0.02, 0.07]), and the effect of negative information was statistically non-significant and negative (b = -0.02, 95% CI [-0.10, 0.07], p = 0.714; β = -0.00, 95% CI [-0.05, 0.03]).
As expected from previous literature, respondents’ pre-existing perceptions of police procedural justice and legitimacy, including their belief in obeying the law and trust in the police, had significant associations with our outcomes of interest. We caution that because these were non-experimental variables, they cannot be interpreted as causally affecting the outcomes of interest and instead are associational. Respondents who emphasized the importance of obeying the law endorsed the effectiveness of police, rejected defunding, supported refunding the police, and considered BWCs to be an important police tool. Higher levels of pre-existing beliefs in police procedural justice were associated with endorsing the effectiveness of police and a greater willingness to refund police. Finally, respondents with higher levels of trust in the police endorsed police effectiveness, rejected defunding, supported refunding police, and believed that BWCs are an important police tool.
At baseline levels, respondents in the control group who identified as moderate (b = 0.249, p < 0.001) or conservative (b = 0.310, p < 0.001) had significantly higher perceptions of police effectiveness compared to liberals. Conservatives (b = -0.917, p < 0.001) and moderates (b = -0.571, p < 0.001) were less supportive of the defund movement, and more supportive of “refunding” through higher budgetary support for police. Compared to liberals, these two groups also rated BWCs as a less important police technology. Compared to males, females were more supportive of both the defund (b = 0.187, p < 0.001) and refund (b = 0.121, p < 0.001) policy options. Higher levels of education were associated with small but significant decreases in the willingness to refund police (b = -0.035, p < 0.01) and in the perceived importance of BWCs (b = -0.034, p < 0.05). Older respondents were more willing to endorse police effectiveness (b = 0.005, p < 0.001), “refunding” the police (b = 0.005, p < 0.001), and BWCs (b = 0.004, p < 0.001), while less willing to support defunding police (b = -0.006, p < 0.001), though these statistically significant effects were all very small. Contrary to expectations, respondents who had been criminally victimized in the last twelve months did not significantly differ on any outcome compared to those who had not been victimized.
One possible story about our main findings is that observational variables were interacting with the treatments to affect the results. For example, it is theoretically possible that a null experimental result is masking small, statistically significant results that are roughly equal and opposed, drawn along categorical differences in our sample such as sex, race, and partisan identification (Kam & Trussler, 2017). Using interactive model specifications, we conducted exploratory tests of heterogeneity to consider whether different characteristics of respondents were associated with differential responses to information about scientific findings related to policing. These tests were not designed to establish causal moderation, but rather are meant to consider differences in treatment across different groups (e.g., heterogeneity tests), thus a parallel estimation approach was not adopted (Bansak, 2021).
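The interactive specifications used for these heterogeneity tests have a convenient reading: in a saturated model with dummy coding, the treatment-by-group interaction coefficient equals a difference-in-differences of cell means. The sketch below is hypothetical (data and variable names invented) and shows that equivalence:

```python
from statistics import mean

def interaction_contrast(records, treatment, group):
    """Difference-in-differences reading of a treatment-by-group
    interaction: (treatment effect within the group) minus (treatment
    effect outside the group). In a saturated dummy-coded OLS model,
    this equals the interaction coefficient."""
    def cell(treated, in_group):
        return mean(r["y"] for r in records
                    if (r["cond"] == treatment) == treated
                    and (r[group] == 1) == in_group)
    return ((cell(True, True) - cell(False, True))
            - (cell(True, False) - cell(False, False)))

# Invented toy data: negative-information treatment vs. mixed reference,
# with a female indicator as the non-randomized moderator.
rows = [
    {"y": 2, "cond": "negative", "female": 1},
    {"y": 2, "cond": "negative", "female": 1},
    {"y": 3, "cond": "mixed",    "female": 1},
    {"y": 3, "cond": "mixed",    "female": 1},
    {"y": 3, "cond": "negative", "female": 0},
    {"y": 3, "cond": "negative", "female": 0},
    {"y": 3, "cond": "mixed",    "female": 0},
    {"y": 3, "cond": "mixed",    "female": 0},
]
print(interaction_contrast(rows, "negative", "female"))
```

Because the moderator is not randomized, a nonzero contrast describes differential association, not causal moderation, which is why the text treats these tests as exploratory.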
The gender gap in college completion is growing, as women ages 25 to 34 are more likely than men to have a college degree (Parker, 2021). Therefore, we hypothesized that men and women might update their priors about police differently when presented with our treatments. We found little support for this hypothesis. As reported in Appendix Table A2, women and men largely responded similarly to the informational treatments, although there does appear to be heterogeneity in how female respondents, compared to male respondents, responded to negative scientific information on the impact of BWCs (b = -.192, p < .05).
Next, we explored whether respondents with different political orientations responded differently to the informational treatments. Political orientation is a strong predictor of generalized perceptions of police (Gamson & McEvoy, 2017; Pickett, 2016), but it is also strongly related to perceptions of scientific evidence. For example, researchers have found that liberals are more trusting of scientific institutions and scientific findings than their conservative peers (Agley, 2020), and this gap has been growing (Nadeem, 2020). As reported in Appendix Table A3, we found little support for this hypothesis in our effectiveness, defund, or refund outcomes. However, there does appear to be heterogeneity in how conservative-identifying respondents, compared to liberal-identifying respondents, responded to negative scientific information on the impact of BWCs (b = .308, p < .05).
Finally, we investigated whether there were differences in how white and non-white respondents responded to our informational treatments. Research has convincingly demonstrated there are racial divides in Americans’ trust of policing (Pickett et al., 2022) and scientific institutions (Gramlich & Funk, 2020). One possible story about our main results, therefore, is that there are differential responses to informational treatment between individuals identifying as white versus non-white. As reported in Appendix Table A4, we found no evidence of treatment heterogeneity by race. White and non-white individuals responded similarly to both confirming and negative scientific information across all four outcomes.
In sum, while there was a baseline relationship between race, sex, political ideology, and the outcomes in the main results, heterogeneity tests did not indicate that our experimental treatments differentially impacted the views of people of color, women, or conservatives/moderates. However, we caution against strongly interpreting these exploratory results. First, the relationship between partisanship and opinion is endogenously related to media exposure, and shifts in the supply of and demand for national news have demonstrable impacts on how that news is covered (Martin & McCrain, 2019). Similar effects may operate in police-related coverage, as well as in which types of scientific research are sought out or supplied. Analytically, our statistical power is too low to support the interactions explored, and therefore it is possible that a true, statistically significant but small effect is masked. Taken together, these heterogeneity analyses should be considered exploratory only. Future research interested in this specific result should be careful to construct very large samples capable of detecting very small effects. Future studies might also consider causal moderation (Bansak, 2021).
The present study employed an original experimental survey to assess the effect of providing respondents with scientific research related to policing on their perceptions of law enforcement. We found partial support for our two hypotheses, indicating that presenting scientific information can impact how the public thinks about policing topics, but that effect varies by the topic at hand. On specific issues, scientific research influenced respondents’ pre-existing beliefs about the police. For instance, we found that presenting confirmatory scientific information about police effectiveness and body-worn cameras led respondents to express support for the police as effective crime reducers and body-worn cameras as a useful tool for law enforcement. In contrast, more highly politicized policing issues, such as those related to defunding or refunding the police, appeared to be resistant to change in the face of scientific information. Criminologists should strive to develop an evidence base that is informative to public discourse. However, it remains an open question whether criminological findings have the power to affect public opinion. Our contribution to this debate is the provision of causal evidence that the public can be responsive to generalized descriptions of competing scientific evidence, at least in some relevant areas of policing research and in the short term.
The findings of this study are important as they provide evidence that scientific research can influence public opinion on policing topics. This is particularly relevant today, with issues related to policing and law enforcement becoming highly politicized. The fact that scientific information can impact beliefs about the police has important implications for policymakers and law enforcement agencies. The use of scientific research could potentially lead to more informed decision-making and policy development in the field of policing, as politicians are aware of public preferences on certain criminal justice policies (Vaughn et al., 2022). For example, politicians are generally likely to follow the sentiment of the public on certain topics such as crime rates (Pickett, 2019), even if public perception is skewed by the media (and misinformation). While there is debate as to whether all politicians make evidence-informed legislative decisions (Parkhurst, 2017), informing the public of scientific research findings may affect public sentiment towards policing policies, motivating lawmakers to pass laws that align with this public sentiment. Nonetheless, there is a consistent feedback loop between public sentiment and political decision-making (Gastil & Richards, 2017), whereby a more informed public can influence policy.
Findings from this study should motivate researchers to promote their scholarship beyond the paywalls that often accompany peer-reviewed journals. The field of criminology and its academic institutions have struggled to effectively disseminate research to practitioners and the public, and to reward researchers for doing so, in ways that perpetuate evidence-based policies and practices (Austin, 2003; Currie, 2007). Still, policymakers and criminal justice administrators should be consistently apprised of the latest scholarly evidence and base policies and practices on evidence-based approaches (Lum & Koper, 2015).
Our findings lend support to the notion that dissemination of research findings can have an influence. One option may involve, time permitting, creating personal websites to host accepted publications, technical reports, and other research that informs evidence-based policing, helping the public and invested stakeholders access up-to-date research. There are open-science collective action efforts underway in this area, such as the CrimRxiv website (https://www.crimrxiv.com/), a centralized repository where criminology researchers can easily translate their published work (including pre- and post-prints) into publicly accessible webpages that can be shared directly with policy makers, practitioners, and the public without paywall barriers. As Worrall and Gordon (2022) demonstrate, social media may serve as a valuable platform to deliver evidence-based policing research to the public—in fact, our information treatments were similar to what could be expected from a Twitter post. Other outlets include ResearchGate and socarxiv.org, publicly accessible forums for uploading research. Nonetheless, we align with the open science literature in encouraging transparency with the public and policy makers (Ashby, 2020; Chin et al., 2021), and in calling for reduced open-access costs to incentivize researchers to pursue this route.
It is relevant to note that our research treatments were only one to two brief sentences, which were enough to affect respondent perceptions. This finding suggests that short social media posts (e.g., on Twitter) about research findings can potentially sway public opinion. While outside the scope of our study, scholars have made efforts to promote lower word counts/page lengths for manuscripts, so that manuscript development and reviewer turnaround are more efficient, more research is published, and manuscripts are more digestible for the public and interested stakeholders (Maddan, 2018). Our findings lend support to the idea that even brief one- to two-sentence synopses of research can be informative to outside readers. Academia largely does not incentivize op-eds, technical reports, and other translational pieces that are more digestible to the general public and may not result in peer-reviewed publications (Lum et al., 2012). While the public may be susceptible to short headlines (some of which may include faulty information), academics can promote credible research that helps the public parse out this faulty information. Essentially, promoting rigorous research findings through media outlets can be an avenue for promoting evidence-based scholarship and the diffusion of research to the public.
Our findings also have implications for the practitioner-researcher relationship. Local law enforcement agencies may benefit from collaborating with researchers to inform evidence-based policies and practices, and this evidence, to a certain extent, can bring the public in line with policy initiatives made by the police. Researchers can help design, implement, and evaluate new and existing policies/practices to assess effectiveness. Based on the evaluation, police agencies can distribute findings to inform the public of (in)effective policies/practices. These findings can be disseminated regularly on law enforcement and research institutions’ websites, as well as social media accounts, to keep the public apprised of effective police strategies implemented locally. Importantly, police agencies collaborating with researchers may bring forth evidence-based practices, potentially increasing police legitimacy in the eyes of the public. That is, if the public is aware that police departments are engaging in evidence-based efforts with researchers, the public may see such policies as trustworthy, potentially enhancing police legitimacy (Sherman, 2013; Telep, 2016). Additionally, providing practitioners with relevant scientific information may help inform local politicians of the relevant scientific evidence, influencing evidence-informed policies.
While our study presents some positive implications for public dissemination of research findings, it is not without its limitations. Our study design causally linked the provision of scientific information to public opinion but did not explain the exact mechanism as to why this effect varied by topic. We have some speculations that may prove useful to future research in this area. Topics like police effectiveness and body-worn cameras may have a more personal or immediate (micro-level) effect on respondents compared to police budgets. These topics are likely to impact civilians more regularly, either directly or vicariously, as opposed to the potential long-term consequences of police budgeting. For example, respondents may personally or vicariously (e.g., read about in the media) experience or witness crime. Therefore, when presented with scientific research on police effectiveness in reducing crime, respondents may be personally invested in that knowledge. Additionally, since police are the most visible agents of the criminal justice system, interacting with police wearing body-worn cameras (e.g., during a traffic stop or on the street) may be a relatively common experience for respondents, and thus more “front of mind,” compared to questions of police budgets.
Additionally, discussions around defunding or refunding the police have become highly politicized in recent years (Jackson et al., 2022), with opinions about these topics more ideologically ingrained. In this way, scientific research findings related to police budgets may not be able to shift these opinions. Police budgeting is a national debate that is split across political lines. While defunding/abolishing the police has become an emerging political discussion, Americans largely do not support the goals of these movements. For example, Vaughn and colleagues (2022) experimentally surveyed U.S. adults and found that respondents opposed defunding and abolishing the police in both slogan and substance, in part because these proposals suggest removing police from their regular roles. In other words, respondents appear to feel strongly about the topic of police budgets, and accordingly, we did not find that people given confirmatory or negative research findings about police budgets were affected. Unfortunately, we did not ask about participants’ existing knowledge of these topics, which could have given us some insight into this issue. Moreover, the concept of defunding the police is not clearly conceptualized or defined, leading to difficulty in understanding what, and how, to measure such efforts (Koziarski & Huey, 2021; Lum et al., 2022).
In addition to this limitation, relevant questions about each topic were asked immediately following each information treatment. In this respect, the study was not designed to test whether these treatments have lasting impacts on respondent perceptions of police. An important follow-up to our study would be an assessment, perhaps longitudinal, of whether the effect of this information is fleeting, as well as what factors make it more likely for information to remain salient. Based on our current findings, we can only suggest that researchers consistently promote their research findings so that the evidence remains salient in people’s minds. One possible model of this type of effort is the Criminal Justice Expert Panel (2022) series. This model brings together identified experts in criminology, economics, political science, and affiliated fields to answer questions of public import. For example, on the topic of policing and public safety, the series asks experts to weigh in on the question: “Do police actually make communities safer? And are there other ways to achieve that goal?” Within that topic, experts are asked to give their opinions on the following statements: “Increasing police budgets will improve public safety” and “Increasing social service budgets (e.g. housing, health, education) will improve public safety.” By weighing their confidence in these statements, experts can signal how strongly they feel about these policies in the context of available evidence. This model allows any member of the public to quickly ascertain, for example, that most serious scholars agree that increasing funding of policing would likely benefit public safety, but that it is critical to direct those funds carefully toward police activities known to lower serious crime. Not only could this approach to presenting scientific evidence influence public opinion over time, it could also continuously dispel misinformation from media and other outlets that skews public opinion (Altheide, 2018).
We focused on heads of household in South Carolina who had email addresses available. Survey experiments are well known for their internal validity, as well as for the challenges in establishing the external validity of their findings. It is worth noting that the findings may not generalize to the entire United States or even to the general population of South Carolina, as the study was not conducted on a representative sample of either. Further replication is needed to confirm the external validity of the findings, as well as their reliability across various populations. Further, given that political polarization may be at play in some of the findings, we expect the exact relationship between scientific information and public opinion to vary across time – politicization shapes the tenor and focus of public debates depending on which topics are most salient, and that salience varies over time.
One further limitation related to public impact is the question of whether, and how, public opinion translates into policy maker action, whether through appointed positions such as police chiefs and city managers, or elected officials such as sheriffs and senators. This question represents longstanding debates within political science scholarship. Recent scholarship suggests that even amongst policymakers with longstanding resistance to democratic reforms, such as police executives’ pushback against Civilian Review Boards (CRBs), strong indicators of local opinion in favor of CRBs are moderately effective at softening police executives’ opinions (Adams et al., 2022; McCrain et al., 2020). It is outside the scope of our study to comprehensively map the pathways from public sentiment to actionable policy. However, our findings speak to the effectiveness of the science-to-public-sentiment pathway, which would necessarily be involved in evidence-based policy.
Finally, this study raises questions about the broader implications of how criminology research is presented and communicated to the public. The use of short, Twitter-like statements to inform respondents was intentional, as it emulates how many Americans ingest information. Further, we capitalized buzzwords to draw attention to the important aspects of the statements and did not include a pure control condition. However, it is unclear whether this method is the most effective way to communicate scientific research on policing topics and whether including a pure control condition would have made a difference. Further research is needed to determine the most effective methods for presenting scientific research to the public, and whether different strategies are needed for different types of topics or audiences. Overall, these findings contribute to the growing body of research on the intersection of science and public opinion and highlight the importance of considering how scientific research is communicated to the public.
This study highlights the importance of researchers engaging with the public and sharing their work, because it can have an impact on public opinion related to policing issues. This finding is particularly relevant in today’s society, where debates about policing and criminal justice reform are ongoing and often highly contentious. By sharing their findings with the public, researchers can help inform these discussions and contribute to a better understanding of the complexities and nuances of policing issues.
Moreover, the findings underscore the importance of conducting high-quality research on policing, even on the most politically charged issues. Research can have significant impacts on policy and other important outcomes, and it is important for researchers to provide evidence-based insights that can inform these decisions. While it may be more challenging to influence public opinion on highly politicized issues, there are still other avenues, such as policymaking bodies, where research can have a meaningful impact. We have presented compelling evidence that police research can play a valuable role in shaping public opinion, and it is advisable for researchers to consider how they present their research to the public. By doing so, researchers can help to inform public discourse and contribute to a better understanding of policing issues among the general public.
AAPOR. (2023). Response rates: An overview. American Association for Public Opinion Research. https://www-archive.aapor.org/Education-Resources/For-Researchers/Poll-Survey-FAQ/Response-Rates-An-Overview.aspx
Adams, I. T., McCrain, J., Schiff, D. S., Schiff, K. J., & Mourtgos, S. M. (2022). Public Pressure or Peer Influence: What Shapes Police Executives’ Views on Civilian Oversight? SocArXiv. https://doi.org/10.31235/osf.io/mdu96
Agley, J. (2020). Assessing changes in US public trust in science amid the COVID-19 pandemic. Public Health, 183, 122–125. https://doi.org/10.1016/j.puhe.2020.05.004
Airoldi, G., & Vecchi, D. (2021). The road from evidence to policies and the erosion of the standards of democratic scrutiny in the COVID-19 pandemic. History and Philosophy of the Life Sciences, 43(2), 1–5.
Altheide, D. L. (2018). Creating fear: News and the construction of crisis. Routledge.
Archbold, C. A. (2022). A Look at Police Accountability Through the Lens of the George Floyd Case. In Rethinking and Reforming American Policing (pp. 259–288). Springer.
Ashby, M. P. (2020). The open-access availability of criminological research to practitioners and policy makers. Journal of Criminal Justice Education, 32(1), 1–21.
Austin, J. (2003). Why criminology is irrelevant. Criminology & Public Policy, 2(3), 557–564.
Bansak, K. (2021). Estimating causal moderation effects with randomized treatments and non-randomized moderators. Journal of the Royal Statistical Society Series A: Statistics in Society, 184(1), 65–86.
Baranauskas, A. J. (2022). Racial resentment, crime concerns, and public attitudes toward defunding the police. Journal of Ethnicity in Criminal Justice, 20(1), 48–72.
Beck, A. L., Lakkaraju, K., & Rai, V. (2017). Small is big: Interactive trumps passive information in breaking information barriers and impacting behavioral antecedents. PloS One, 12(1), e0169326.
Bennell, C., Alpert, G. P., Andersen, J. P., Arpaia, J., Huhta, J.-M., Kahn, K. B., Khanizadeh, A.-J., McCarthy, M., McLean, K., Mitchell, R. J., Nieuwenhuys, A., Palmer, A., & White, M. D. (2021). Advancing police use of force research and practice: Urgent issues and prospects. Legal and Criminological Psychology. Advance online publication. https://doi.org/10.1111/lcrp.12191
Berk, R., Pitkin, E., Brown, L., Buja, A., George, E., & Zhao, L. (2013). Covariance adjustments for the analysis of randomized field experiments. Evaluation Review, 37(3–4), 170–196.
Bittner, E. (1990). Florence Nightingale in Pursuit of Willie Sutton. In Aspects of Police Work. Northeastern University Press.
Boehme, H. M., & Kaminski, R. J. (2023). Suspect resistance, police use of force, and officer injuries in a post-Floyd era: An analysis of two large police departments. Police Practice and Research, 1–9.
Bohm, R. M., Clark, L. J., & Aveni, A. F. (1991). Knowledge and death penalty opinion: A test of the Marshall hypotheses. Journal of Research in Crime and Delinquency, 28(3), 360–387.
Bohm, R. M., & Vogel, B. L. (2004). More than ten years after: The long-term stability of informed death penalty opinions. Journal of Criminal Justice, 32(4), 307–327. https://doi.org/10.1016/j.jcrimjus.2004.04.003
Brantingham, P. J., Carter, J., MacDonald, J., Melde, C., & Mohler, G. (2021). Is the recent surge in violence in American cities due to contagion? Journal of Criminal Justice, 76, 101848.
Brashier, N. M., & Schacter, D. L. (2020). Aging in an era of fake news. Current Directions in Psychological Science, 29(3), 316–323.
Brenan, M. (2020). Amid Pandemic, Confidence in Key U.S. Institutions Surges. Gallup. https://news.gallup.com/poll/317135/amid-pandemic-confidence-key-institutions-surges.aspx
Brown, A. (2017). Republicans more likely than Democrats to have confidence in police. Pew Research Center. https://www.pewresearch.org/fact-tank/2017/01/13/republicans-more-likely-than-democrats-to-have-confidence-in-police/
Callanan, V. J., & Rosenberger, J. S. (2011). Media and public perceptions of the police: Examining the impact of race and personal experience. Policing & Society, 21(2), 167–189.
Cassella, C., Epp, D., Fredriksson, K., Roman, M., & Walker, H. (2022). The George Floyd Effect: How Protests and Public Scrutiny Change Police Behavior in Seattle.
Cerdá, M., Tracy, M., & Keyes, K. M. (2018). Reducing urban violence. Epidemiology, 29(1), 142–150.
Chin, J. M., Pickett, J. T., Vazire, S., & Holcombe, A. O. (2021). Questionable research practices and open science in quantitative criminology. Journal of Quantitative Criminology, 1–31.
Citrin, J., & Stoker, L. (2018). Political trust in a cynical age. Annual Review of Political Science, 21, 49–70.
Cochran, J. K., & Chamlin, M. B. (2005). Can information change public opinion? Another test of the Marshall hypotheses. Journal of Criminal Justice, 33(6), 573–584. https://doi.org/10.1016/j.jcrimjus.2005.08.006
Cook, T. E., & Gronke, P. (2005). The Skeptical American: Revisiting the Meanings of Trust in Government and Confidence in Institutions. The Journal of Politics, 67(3), 784–803. https://doi.org/10.1111/j.1468-2508.2005.00339.x
Criminal Justice Expert Panel. (2022). Criminal Justice Expert Panel. https://cjexpertpanel.org/surveys/
Currie, E. (2007). Against marginality: Arguments for a public criminology. Theoretical Criminology, 11(2), 175–190.
Demir, M. (2019). Citizens’ perceptions of body-worn cameras (BWCs): Findings from a quasi-randomized controlled trial. Journal of Criminal Justice, 60, 130–139.
Donovan, K. M., & Klahm IV, C. F. (2018). How priming innocence influences public opinion on police misconduct and false convictions: A research note. Criminal Justice Review, 43(2), 174–185. https://doi.org/10.1177/0734016817707809
Fine, A. D., Rowan, Z., & Simmons, C. (2019). Do politics Trump race in determining America’s youths’ perceptions of law enforcement? Journal of Criminal Justice, 61, 48–57.
Freedman, D. A. (2008). On regression adjustments to experimental data. Advances in Applied Mathematics, 40(2), 180–193.
Funder, D. C., & Ozer, D. J. (2019). Evaluating effect size in psychological research: Sense and nonsense. Advances in Methods and Practices in Psychological Science, 2(2), 156–168.
Gamson, W. A., & McEvoy, J. (2017). Police violence and its public support. In Collective violence (pp. 329–342). Routledge.
Gastil, J., & Richards, R. C. (2017). Embracing digital democracy: A call for building an online civic commons. PS: Political Science & Politics, 50(3), 758–763.
Gaub, J. E., & White, M. D. (2020). Open to Interpretation: Confronting the Challenges of Understanding the Current State of Body-Worn Camera Research. American Journal of Criminal Justice, 45(5), 899–913. https://doi.org/10.1007/s12103-020-09518-4
Gramlich, J., & Funk, C. (2020). Black Americans face higher COVID-19 risks, are more hesitant to trust medical scientists, get vaccinated. Pew Research Center. https://www.pewresearch.org/fact-tank/2020/06/04/black-americans-face-higher-covid-19-risks-are-more-hesitant-to-trust-medical-scientists-get-vaccinated/
Indermaur, D., Roberts, L., Spiranovic, C., Mackenzie, G., & Gelb, K. (2012). A matter of judgement: The effect of information and deliberation on public attitudes to punishment. Punishment & Society, 14(2), 147–165. https://doi.org/10.1177/1462474511434
Jackson, J., Fine, A., Bradford, B., & Trinkner, R. (2022). Social Identity and Support for Defunding the Police in the Aftermath of George Floyd’s Murder.
Kam, C. D., & Trussler, M. J. (2017). At the Nexus of Observational and Experimental Research: Theory, Specification, and Analysis of Experiments with Heterogeneous Treatment Effects. Political Behavior, 39(4), 789–815. https://doi.org/10.1007/s11109-016-9379-z
Keeter, S., & McGeeney, K. (2015). Coverage error in internet surveys: Who web-only surveys miss and how that affects results. Pew Research Center. https://www.pewresearch.org/methods/2015/09/22/coverage-error-in-internet-surveys/
Kennedy, B., Tyson, A., & Funk, C. (2022). Americans’ trust in scientists, other groups declines. Pew Research Center. https://www.pewresearch.org/science/2022/02/15/americans-trust-in-scientists-other-groups-declines/
Kern, H. L., Stuart, E. A., Hill, J., & Green, D. P. (2016). Assessing Methods for Generalizing Experimental Impact Estimates to Target Populations. Journal of Research on Educational Effectiveness, 9(1), 103–127. https://doi.org/10.1080/19345747.2015.1060282
Koziarski, J., & Huey, L. (2021). #Defund or #Re-Fund? Re-examining Bayley’s blueprint for police reform. International Journal of Comparative and Applied Criminal Justice. https://doi.org/10.1080/01924036.2021.1907604
Kroke, A. M., & Ruthig, J. C. (2022). Conspiracy beliefs and the impact on health behaviors. Applied Psychology: Health and Well‐Being, 14(1), 311–328.
Lin, W. (2013). Agnostic notes on regression adjustments to experimental data: Reexamining Freedman’s critique. The Annals of Applied Statistics, 7(1), 295–318.
Lipsky, M. (1973). Law and Order: Police Encounters. Transaction Publishers.
Liu, S., & Cheng, T. (n.d.). How Online Media Shapes Polarization Towards Policing.
Loader, I., & Sparks, R. (2013). Public Criminology? Routledge. https://doi.org/10.4324/9780203846049
Lum, C., & Koper, C. S. (2015). Evidence-based policing. Waveland Press.
Lum, C., & Koper, C. S. (2017). Evidence-based policing: Translating research into practice. Oxford University Press.
Lum, C., Koper, C. S., & Wu, X. (2022). Can we really defund the police? A nine-agency study of police response to calls for service. Police Quarterly, 25(3), 255–280.
Lum, C., Koper, C. S., Gill, C., Hibdon, J., Telep, C., & Robinson, L. (2016). An evidence-assessment of the recommendations of the President’s Task Force on 21st Century Policing: Implementation and research priorities. International Association of Chiefs of Police.
Lum, C., Stoltz, M., Koper, C. S., & Scherer, J. A. (2019). Research on body-worn cameras: What we know, what we need to know. Criminology & Public Policy, 18(1), 93–118.
Lum, C., Telep, C. W., Koper, C. S., & Grieco, J. (2012). Receptivity to research in policing. Justice Research and Policy, 14(1), 61–95.
Maddan, S. (2018). Presidential address: The declining significance of the literature review in criminal justice scholarship: Towards a new paradigm. American Journal of Criminal Justice, 43(4), 745–753.
Mailers Haven. (2022). The highest quality: Most responsive mailing lists. https://www.mailershaven.com/
Martin, G. J., & McCrain, J. (2019). Local News and National Politics. American Political Science Review, 113(2), 372–384. https://doi.org/10.1017/S0003055418000965
McCrain, J., Adams, I. T., Mourtgos, S. M., Schiff, D. S., & Schiff, K. J. (2022, December 8). Voters support civilian oversight of policing. Data For Progress. https://www.dataforprogress.org/blog/2022/12/voters-support-civilian-oversight-of-policing
McMahon, S. (2010). Rape myth beliefs and bystander attitudes among incoming college students. Journal of American College Health, 59(1), 3–11.
McMaster, H. S., LeardMann, C. A., Speigle, S., & Dillman, D. A. (2017). An experimental comparison of web-push vs. paper-only survey procedures for conducting an in-depth health survey of military spouses. BMC Medical Research Methodology, 17(1), 1–9. https://doi.org/10.1186/s12874-017-0337-1
Metcalfe, C., & Pickett, J. T. (2022). Public fear of protesters and support for protest policing: An experimental test of two theoretical models. Criminology, 60(1), 60–89. https://doi.org/10.1111/1745-9125.12291
Mourtgos, S. M., & Adams, I. T. (2020). Assessing Public Perceptions of Police Use-of-Force: Legal Reasonableness and Community Standards. Justice Quarterly, 37(5), 869–899. https://doi.org/10.1080/07418825.2019.1679864
Mourtgos, S. M., Adams, I. T., & Nix, J. (2022). Elevated police turnover following the summer of George Floyd protests: A synthetic control study. Criminology & Public Policy, 21(1), 9–33.
Mullinix, K. J., Bolsen, T., & Norris, R. J. (2021). The feedback effects of controversial police use of force. Political Behavior, 43(2), 881–898. https://doi.org/10.1007/s11109-020-09646-x
Mummolo, J. (2018a). Militarization fails to enhance police safety or reduce crime but may harm police reputation. Proceedings of the National Academy of Sciences, 115(37), 9181–9186. https://doi.org/10.1073/pnas.1805161115
Mummolo, J. (2018b). Modern police tactics, police-citizen interactions, and the prospects for reform. The Journal of Politics, 80(1), 1–15. https://doi.org/10.1086/694393
Nadeem, R. (2020). Trust in Medical Scientists Has Grown in U.S., but Mainly Among Democrats. Pew Research Center. https://www.pewresearch.org/science/2020/05/21/trust-in-medical-scientists-has-grown-in-u-s-but-mainly-among-democrats/
Nix, J., Ivanov, S., & Pickett, J. T. (2021). What does the public want police to do during pandemics? A national experiment. Criminology & Public Policy, 20(3), 545–571. https://doi.org/10.1111/1745-9133.12535
Nix, J., & Wolfe, S. E. (2017). The Impact of Negative Publicity on Police Self-legitimacy. Justice Quarterly, 34(1), 84–108. https://doi.org/10.1080/07418825.2015.1102954
Novick, R., Socia, K. M., & Pickett, J. T. (2022). Asymmetric Compassion Collapse, Collateral Consequences, and Reintegration: An Experiment. Justice Quarterly, 39(7), 1475–1498. https://doi.org/10.1080/07418825.2022.2119157
O’Donohue, W., Yeater, E. A., & Fanetti, M. (2003). Rape prevention with college males: The roles of rape myth acceptance, victim empathy, and outcome expectancies. Journal of Interpersonal Violence, 18(5), 513–531.
Parker, K. (2021). What’s behind the growing gap between men and women in college completion? Pew Research Center. https://www.pewresearch.org/fact-tank/2021/11/08/whats-behind-the-growing-gap-between-men-and-women-in-college-completion/
Parkhurst, J. (2017). The politics of evidence: From evidence-based policy to the good governance of evidence. Taylor & Francis.
Patten, E., & Perrin, A. (2015). Who’s left out in a web-only survey and how it affects results. Pew Research Center. https://www.pewresearch.org/short-reads/2015/09/22/who-s-left-out-in-a-web-only-survey-and-how-it-affects-results/
Pew Research Center. (2023). Party affiliation among adults in South Carolina by political ideology. https://www.pewresearch.org/religion/religious-landscape-study/compare/party-affiliation/by/political-ideology/among/state/south-carolina/
Pickett, J., Cullen, F., Bushway, S. D., Chiricos, T., & Alpert, G. (2018). The response rate test: Nonresponse bias and the future of survey research in criminology and criminal justice. Available at SSRN 3103018.
Pickett, J. T. (2016). On the social foundations for crimmigration: Latino threat and support for expanded police powers. Journal of Quantitative Criminology, 32(1), 103–132.
Pickett, J. T. (2017). Methodological myths and the role of appeals in criminal justice journals: The case of response rates. ACJS Today, 41(3), 61–69.
Pickett, J. T. (2019). Public opinion and criminal justice policy: Theory and research. Annual Review of Criminology, 2, 405–428.
Pickett, J. T., Graham, A., & Cullen, F. T. (2022). The American racial divide in fear of the police. Criminology, 60(2), 291–320.
Pickett, J. T., Ivanov, S., & Wozniak, K. H. (2020). Selling effective violence prevention policies to the public: A nationally representative framing experiment. Journal of Experimental Criminology, 1–23.
Pickett, J. T., Nix, J., & Roche, S. P. (2018). Testing a social schematic model of police procedural justice. Social Psychology Quarterly, 81(2), 97–125. https://doi.org/10.1177/0190272518765134
Pickett, J. T., & Ryon, S. B. (2017). Procedurally just cooperation: Explaining support for due process reforms in policing. Journal of Criminal Justice, 48, 9–20.
Pryce, D. K., Johnson, D., & Maguire, E. R. (2017). Procedural justice, obligation to obey, and cooperation with police in a sample of Ghanaian immigrants. Criminal Justice and Behavior, 44(5), 733–755. https://doi.org/10.1177/0093854816680225
Reiner, R. (2010). The Politics of the Police. OUP Oxford.
Reny, T. T., & Newman, B. J. (2021). The opinion-mobilizing effect of social protest against police violence: Evidence from the 2020 George Floyd protests. American Political Science Review, 115(4), 1499–1507.
Rydberg, J., Dum, C. P., & Socia, K. M. (2018). Nobody gives a #%&!: A factorial survey examining the effect of criminological evidence on opposition to sex offender residence restrictions. Journal of Experimental Criminology, 14, 541–550. https://doi.org/10.1007/s11292-018-9335-5
Santos, J. R. A. (1999). Cronbach’s alpha: A tool for assessing the reliability of scales. Journal of Extension, 37(2), 1–5.
Schiff, K. J., Clark, T. S., Glynn, A. N., Owens, M. L., Gunderson, A., & Dobbie, E. (2022). Knowing is Half the Battle? The Effect of Information about Police Shootings of Civilians on Public Support for Police Reforms. https://static1.squarespace.com/static/58d3d264893fc0bdd12db507/t/63e409b790c62c06c962e9b6/1675889081751/OIS_Info_Survey_Experiment.pdf
Sharp, E. B., & Johnson, P. E. (2009). Accounting for Variation in Distrust of Local Police. Justice Quarterly, 26(1), 157–182. https://doi.org/10.1080/07418820802290496
Sherman, L. W. (2005). The Use and Usefulness of Criminology, 1751-2005: Enlightened Justice and Its Failures. The ANNALS of the American Academy of Political and Social Science, 600(1), 115–135. https://doi.org/10.1177/0002716205278103
Sherman, L. W. (2013). The rise of evidence-based policing: Targeting, testing, and tracking. Crime and Justice, 42(1), 377–451.
Sherman, L. W. (2015). A tipping point for “totally evidenced policing” ten ideas for building an evidence-based police agency. International Criminal Justice Review, 25(1), 11–29.
Sierra-Arévalo, M., Nix, J., & Mourtgos, S. M. (2023). The “War on Cops,” Retaliatory Violence, and the Murder of George Floyd. https://jnix.netlify.app/publication/51-crim-war-on-cops-george-floyd/
Singer, L., & Cooper, S. (2009). Improving public confidence in the criminal justice system: An evaluation of a communication activity. The Howard Journal of Criminal Justice, 48(5), 485–500. https://doi.org/10.1111/j.1468-2311.2009.00590.x
Telep, C. W. (2016). Expanding the scope of evidence-based policing. Criminology & Public Policy, 15, 243.
Todak, N. (2017). The decision to become a police officer in a legitimacy crisis. Women & Criminal Justice, 27(4), 250–270.
Tyler, T. R. (1990). Why people obey the law. Yale University Press.
Tyler, T. R. (2004). Enhancing police legitimacy. Annals of the American Academy of Political and Social Science, 593, 84–99.
United States Census Bureau. (2022). Quick facts: South Carolina. https://www.census.gov/quickfacts/SC
Unnever, J. D., Cullen, F. T., & Fisher, B. S. (2007). “A Liberal Is Someone Who Has Not Been Mugged”: Criminal Victimization and Political Beliefs. Justice Quarterly, 24(2), 309–334. https://doi.org/10.1080/07418820701294862
Vaughn, P. E., Peyton, K., & Huber, G. A. (2022). Mass support for proposals to reshape policing depends on the implications for crime and safety. Criminology & Public Policy, 21(1), 125–146.
Washburn, E. (2023). America Less Confident In Police Than Ever Before: A Look At The Numbers. Forbes. https://www.forbes.com/sites/emilywashburn/2023/02/03/america-less-confident-in-police-than-ever-before-a-look-at-the-numbers/?sh=3db8d22b6afb
Wentz, E. A., & Schlimgen, K. A. (2012). Citizens’ perceptions of police service and police response to community concerns. Journal of Crime and Justice, 35(1), 114–133.
White, M. D., Orosco, C., & Terpstra, B. (2021). Investigating the Impacts of a Global Pandemic and George Floyd’s Death on Crime and Other Features of Police Work. Justice Quarterly, 1–28.
Worrall, J. L., & Gordon, Q. (2022). Is Criminology & Public Policy “influential?” Answers from altmetrics. Criminology & Public Policy.
Wozniak, K. H., Drakulich, K. M., & Calfano, B. R. (2021). Do photos of police-civilian interactions influence public opinion about the police? A multimethod test of media effects. Journal of Experimental Criminology, 17(2), 1–27. https://doi.org/10.1007/s11292-020-09415-0
Wozniak, K. H., Pickett, J. T., & Brown, E. K. (2022). Judging Hardworking Robbers and Lazy Thieves: An Experimental Test of Act-vs. Person-Centered Punitiveness and Perceived Redeemability. Justice Quarterly, 39(7), 1565–1591.
Table A1. Balance Table
[Table body not recoverable from extraction. The table reported covariate balance across the three treatment groups (N = 479, 524, and 528); recoverable row labels include education categories (no diploma or GED, high school degree, and higher) and trust in police.]
Note: n / N (%); Mean (SD); Pearson’s Chi-squared test. Balance across covariates is shown only for the treatment on one outcome (police effectiveness), as respondents received new treatments for each outcome presented. Balance across other treatment/outcome dyads is similar (i.e., balanced).
Table A2. Means and conditional means of outcomes
[Table body not recoverable from extraction; recoverable column header: Overall Mean (SD).]
Table A3. Treatment heterogeneity by sex
[Coefficient estimates not recoverable from extraction; reported interaction terms: Confirm × Female, Negative × Female.]
Note: control variables not shown; + p < 0.1, * p < 0.05, ** p < 0.01, *** p < 0.001
Table A4. Treatment heterogeneity by partisan identification
[Coefficient estimates not recoverable from extraction; reported interaction terms: Confirm × Moderate, Confirm × Conservative, Negative × Moderate, Negative × Conservative.]
Note: control variables not shown; + p < 0.1, * p < 0.05, ** p < 0.01, *** p < 0.001
Table A5. Treatment heterogeneity by race
[Coefficient estimates not recoverable from extraction; reported interaction terms: Confirm × Nonwhite, Negative × Nonwhite.]
Note: control variables not shown; + p < 0.1, * p < 0.05, ** p < 0.01, *** p < 0.001
Table A6. Uncontrolled experimental results
[Estimates not recoverable from extraction.]
Note: + p < 0.1, * p < 0.05, ** p < 0.01, *** p < 0.001
Table A7. Variance Inflation Factor (VIF) Check
[Values not recoverable from extraction.]
Note: GVIF^(1/(2Df)) reported. The GVIF^(1/(2Df)) is an adjusted VIF value that accounts for the degrees of freedom, and it is more appropriate for comparing multicollinearity across predictors with different numbers of categories. Unadjusted VIF values all indicate non-problematic multicollinearity.
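The GVIF^(1/(2Df)) diagnostic in Table A7 follows Fox and Monette's (1992) generalized VIF, which handles multi-column (e.g., categorical) model terms; in practice it is usually obtained from R's car::vif. As an illustration only — not the authors' code — the determinant formula can be sketched in NumPy as follows; the function name and matrix layout are assumptions:

```python
import numpy as np

def gvif(X, term_cols):
    """Generalized VIF (Fox & Monette, 1992) for one model term.

    X         : (n, p) design matrix WITHOUT the intercept column.
    term_cols : column indices belonging to the term (e.g., all dummy
                columns of one categorical predictor).
    Returns (GVIF, GVIF^(1/(2*Df))), where Df = len(term_cols).
    """
    X = np.asarray(X, dtype=float)
    R = np.corrcoef(X, rowvar=False)          # predictor correlation matrix
    idx1 = np.asarray(term_cols)
    idx2 = np.setdiff1d(np.arange(X.shape[1]), idx1)
    R11 = R[np.ix_(idx1, idx1)]               # the term's own columns
    R22 = R[np.ix_(idx2, idx2)]               # all remaining predictors
    g = np.linalg.det(R11) * np.linalg.det(R22) / np.linalg.det(R)
    df = len(idx1)
    return g, g ** (1.0 / (2 * df))           # adjusted value, as in Table A7
```

For a single-column term this reduces to the ordinary VIF = 1/(1 − R²_j), so values near 1 indicate no multicollinearity, matching the interpretation in the note above.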