
The Irrationalities of Rationality in Police Data Processes

Published on Nov 30, 2021

Abstract

This paper explores how police bureaucracies, in their pursuit of greater accountability and management efficiencies, create what are intended to be rational data collection and use processes. However, these processes often produce unintended consequences: namely, behaviours, practices, and policies that confound an organization’s goals. Drawing on Ritzer’s McDonaldization thesis and qualitative data from two Canadian police organizations, we argue that although police bureaucracies focus on maintaining efficiency, calculability, predictability, and control when it comes to their data processes, not only do inaccuracies occur, but they happen because an over-emphasis on rational processes can produce forms of irrationality.

Keywords: McDonaldization; Police; Bureaucracy; Data Collection; Data Use

Corresponding Author: Jacek Koziarski – [email protected]

This is a pre-copyedited, author-produced version of an article accepted for publication in Policing & Society, following peer review. The version of record, Huey, L., Ferguson, L., & Koziarski, J. (2021). The Irrationalities of Rationality in Police Data Processes. Policing & Society, is available online at: http://dx.doi.org/10.1080/10439463.2021.2007245. When citing, please cite the version of record.

Introduction

The origins of the present study began from a simple observation: police paperwork is often an irrational process. This observation occurred while one of the authors was conducting fieldwork in the offices of a rural Canadian police station. During this time, she watched while a police constable spent a week struggling to reduce his backlogged queue of motor vehicle collision reports. Curious as to why this process appeared so time-consuming, she asked for and was provided copies of the relevant reports. What this revealed is that the officer was being asked to provide not only duplicate information for different audiences but also highly detailed information on such factors as vehicle tire conditions for external audiences, such as government agencies and insurance companies. The same held true when forms were collected for two other types of events: impaired driving and domestic violence. Impaired driving, it was found, can require an officer to fill out 16 different forms, many of which are related to legal and court purposes, others for use by private insurers or government agencies. Domestic violence calls can require officers to complete up to 13 forms that similarly reveal a mix of audiences and uses.

What preliminary analysis of the audiences and uses of these documents revealed is that there is no singular source demanding the production of these knowledge gathering efforts; instead, the paperwork burden – at least with respect to these two offense types in this particular province – is the result of the confluence of different institutional actors, each (we presume) rationally pursuing the collection of useful information in order to meet their respective mandates. Armed with this insight, we entered the present study to explore the issue of police data more deeply, focusing on how errors are generated and sustained through what are intended to be rational institutional processes. To explore this issue, fieldwork was conducted with the support of two municipal Canadian police services. This paper draws on the interview portion of the study, which included 46 in-depth interviews with police personnel. The analysis of data from this source reveals a classic example of Ritzer’s (1993) concept of the ‘irrationality of rationality,’ wherein multiple actors, all rationally pursuing information to inform policy-making or other choices, create a combined volume of demands that produces unintended negative consequences.

Framing the Issue

Ritzer’s (1993) influential book, The McDonaldization of Society, applies Max Weber’s (1905, 1921, 1925) theories of modern bureaucracy and rationality to aspects of contemporary living. Notably, Ritzer adopts and expands Weber’s (1905) famous warning that, within capitalist societies, we are increasingly finding ourselves living within ‘iron cages’ – rational, impersonal spaces that value efficiency, rationality, and bureaucratic controls over autonomy and individuality. Using the fast-food giant, McDonald’s, as both the site of his analysis and as a metaphor for how we increasingly live our lives, Ritzer sketches out four significant dimensions of what he terms ‘McDonaldization’: ‘efficiency’ (proficient and productive completion of tasks); ‘calculability’ (ability to produce quantifiable results for analysis); ‘predictability’ (ensuring that processes and outcomes remain the same over time); and ‘control’ (replacing autonomous and fallible humans with technology).

As noted, it is not only fast-food restaurants that are concerned with these issues. A perusal of the policing literature on what has become known as the ‘New Public Managerialism’ (NPM) movement – in essence, an approach premised on the belief that government should run like a private enterprise – reveals Ritzer’s dimensions to be of similar concern to police leaders and mid-level managers (Reiner, 1993; Waters, 2000; McLaughlin, 2007). Police managers are increasingly expected to implement policies and procedures that emphasize fiscal responsibility, accountability, standardization of processes, performance measures, and competitiveness, explaining the increasing need for controls over employee performance (McLaughlin and Murji, 2001). Under NPM, such policies and procedures are employed with a specific focus on controlling and producing economy, efficiency, and effectiveness (Butterfield, Edwards, and Woodall, 2004). Frontline policing is typically performed within a chaotic environment in which officers have historically exercised a fair amount of discretion (Goldstein, 1964). Increasing demands for public accountability, combined with pressures from within and outside of governments for public agencies to function akin to corporations – including an emphasis on customer service – mean a ratcheting up of internal controls (e.g., performance measures and indicators, auditing and compliance models, accountability structures, increased supervision and oversight), not only within the police environment but also within and across other public sector agencies (Butterfield et al., 2004; McLaughlin, 2007; Sanders and Langan, 2019). This said, criticisms of NPM within policing have surfaced: namely, that NPM has more to do with preserving a myth of control, presents little evidence of any real improvements in operational efficiency, and focuses acutely on managerial rather than practical concerns (see e.g., Power, 1997; Butterfield et al., 2004; Sanders and Langan, 2019). Additionally, and ironically, the more control exerted over an environment, the less efficient it can become – an example of what Ritzer (1993) terms the ‘irrationality of rationality.’

According to Ritzer’s thesis, whereas the drive to rationality is a conscious desire manifested in intention, irrationality is simply the opposite of rationality: it occurs not because the individual’s or group’s intentions are objectively irrational, but rather as a largely unintended or ill-understood outcome. Simply put, negative consequences occur because “rational systems inevitably spawn irrationalities that limit, eventually compromise, and perhaps even undermine their rationality” (2014: 132). Weber’s ‘iron cage’ and the real and potential negative consequences to society and economy of stripping human autonomy from decision-making represent one such form of irrationality or negative consequence. Aside from disenchantment and dehumanization, rational systems can also produce inefficiency and excessively high costs (Ritzer, 2014). It is with these insights in mind that we approach the subject of police paperwork.

It is one of the truisms of public policing that police officers spend a significant portion of their on-duty hours engaged in paperwork (Ericson, 1982; Ericson and Haggerty, 1997; Chan, 2005; Brodeur and Dupont, 2006), often at the expense of patrol and other proactive duties associated with crime prevention and public safety (Malm et al., 2005). It is also well accepted that the volume of police paperwork has increased significantly over the years (Ericson and Haggerty, 1997; Malm et al., 2005; Brodeur and Dupont, 2006), resulting in enormous organizational pressures (Hedgley, 2007; Lasiewicki, 2007). Today, paperwork is no longer a minor nuisance for officers – consuming a small portion of their shift in the form of occurrence reports (Ericson, 1982) – it imposes a heavy burden on personnel, who are often frustrated and demoralized by filling out what they perceive to be ‘meaningless reports’ (Lasiewicki, 2007), and may feel pressured to work hours of unpaid overtime simply to catch up on incomplete paperwork (Malm et al., 2005).

Although the burden of police paperwork has been reported on within the research literature (Ericson and Haggerty, 1997; Skogan, 2010; Gundhus, 2012) and has been cited within policing circles as necessitating policy and practice reforms (Flanagan, 2007), this key aspect of police work has generated little sustained attention from Canadian researchers or policymakers. Concerning the former, we have a collection of reports in which the paperwork burden is referenced, often appearing within the context of larger police time-use studies (Brehm, Gates and Gomez, 2003; Malm et al., 2005). Those few studies that have specifically examined the topic of police paperwork within the Canadian context (Ericson and Haggerty, 1997; Malm et al., 2005) contain conflicting accounts as to its causes and consequences, and thus offer few solutions as to how to address the burden. For example, Malm et al. (2005) attributed increasing paperwork to judicial decisions in criminal case law. Conversely, Ericson and Haggerty (1997) argue that one of the single largest drivers of information collection by police organizations, beyond the desire to establish internal bureaucratic controls, is the need to meet extra-institutional demands for knowledge from insurance companies and government. All in all, the existing literature leaves a gap in our understanding of the paperwork burden’s organizational effects in Canada.

Method of Inquiry

This study draws upon data gathered as a part of a more extensive study involving in-depth interviews, field observation, and content analysis of reports provided by the police services studied. Specifically, for our analysis, we utilize the interview data only in order to examine police officer experiences and opinions with respect to our topic. To this end, the interview data were openly coded and analyzed to address the following research questions: (1) Do the dimensions of Ritzer’s (1993) McDonaldization thesis surface in police data processes and, if so, how and where?; (2) How do frontline patrol officers and police auditors experience their work with data collection and process checking?; and (3) What are the implications of data collection and process checking for the work of court officers and crime analysis units?

Data Collection

Data for this study were collected from June 2017 to December 2018. Two medium-sized municipal police services in two Canadian provinces, selected using purposive sampling, agreed to participate. For confidentiality, this study uses a pseudonym when referring to each police service: the Kerbyton Police Service (KPS) in Ontario and the Belleview Police Service (BPS) in British Columbia. KPS has approximately 600 sworn members and 200 civilian staff members serving a population of some 400,000 residents; BPS has approximately 200 sworn members and 100 civilian staff members for a population of about 110,000 residents.

To garner participants while avoiding any burden on operations, each police service actively assisted with ensuring access to a potential pool of candidates for the interviews and field observation (e.g., providing potential participants with our contact information and rooms for conducting interviews). Such an approach can introduce risks that threaten to undermine the confidentiality of our study participants. To mitigate these risks, each interview was conducted in an isolated, closed room with only one participant and one member of the research team present. As an added measure to protect the confidentiality of our participants, the data are anonymized in the reporting of the findings below. We, for instance, adjusted identifying characteristics like sex/gender to avoid unintentionally revealing identities. We also obtained respondents by visiting these police services and talking with officers about our study, ensuring that our sample does not solely include agency-selected participants and attenuating sampling bias concerns.

As discussed, previous research indicates that paperwork takes up a significant portion of frontline officers’ time during their shifts and is one of the least liked aspects of policing. Once we had connected with officers, and BPS and KPS had assisted us in obtaining respondents, we explained to potential participants that this project could assist in informing how police services operate with respect to paperwork. Armed with this information, individuals could then decide whether they wanted to participate; in other words, they self-selected to be interviewed. These recruitment processes resulted in a final sample of forty-six (n=46) in-depth, semi-structured, face-to-face interviews conducted with frontline officers, data auditors, court officers, crime analysts, police association representatives, and other personnel whose functions are relevant to the study’s aims (see Table 1).

[TABLE 1 ABOUT HERE]

Interviews were approximately 45 minutes in length and were recorded either digitally or through handwritten notes (as per the choice of the participant). All interviews were conducted independently by a member of the authors’ research team in accordance with Canadian Tri-Council guidelines and our University’s Research Ethics Board. The interview guide consisted of questions about the individual’s role within the police service; their relationship with collecting, checking, and using crime data; their perceptions about collecting, checking, and using crime data; and their opinions on factors that may impact the quality of police crime data. Although the interviews were semi-structured, the style of questions was open-ended; thus, participants could provide lengthy explanations, and their answers could be expanded and explored further throughout the interview.

Data Coding and Analysis

Data were coded and analyzed using Braun and Clarke’s (2006) six-step approach to thematic analysis; initial coding took an inductive approach. Transcripts were read line-by-line, and notes were taken on potentially interesting themes. This first stage of coding revealed several themes that aligned with concepts explored in Ritzer’s McDonaldization thesis. To further explore this potential relationship, we conducted a second round of focused coding using a deductive approach that drew specifically on key concepts found in Ritzer’s work. Table 2 below illustrates some of the codes developed through this thematic approach. Another member of the research team independently verified all codes.

[TABLE 2 ABOUT HERE]
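
To make the mechanics of the second, deductive coding pass concrete, the sketch below shows how a keyword-assisted tagging step might work, using the Table 2 keywords. This is a minimal illustration under stated assumptions: the excerpt string is invented, and in the actual study coding was performed and verified by human analysts, not by keyword matching.

```python
# Minimal sketch of a keyword-assisted deductive coding pass.
# The theme-keyword map mirrors Table 2; the example excerpt is an
# invented placeholder, not actual interview data.

THEME_KEYWORDS = {
    "Efficiency": ["best method", "save time", "work faster", "easier", "streamline"],
    "Calculability": ["stats", "statistics", "numbers"],
    "Predictability": ["verify", "safeguard", "correct", "audit", "standard", "standardized"],
    "Control": ["process", "system", "software", "error", "human error"],
    "Irrationality": ["duplicate reports", "triplicate reports",
                      "reports with no known purpose", "problem"],
}

def code_excerpt(excerpt: str) -> list[str]:
    """Return the Table 2 themes whose keywords appear in an excerpt."""
    text = excerpt.lower()
    return [theme for theme, keywords in THEME_KEYWORDS.items()
            if any(kw in text for kw in keywords)]

# Hypothetical usage:
print(code_excerpt("The templates really do save time, but the audit process is a problem."))
# -> ['Efficiency', 'Predictability', 'Control', 'Irrationality']
```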

Further, to determine the extent to which the identified themes and the codes abstracted from the data were represented across interviews, we provide a table (see Table 3) indicating the number of participants who expressed beliefs, thoughts, experiences, and/or opinions related to each of the central themes.

[TABLE 3 ABOUT HERE]

Findings

Trying to Create Rational Processes of Data Collection, Sharing, and Use

In this section, we present our analysis, which reveals both the ways in which the police services studied try to impose some sense of rationality upon their data collection processes and the various means by which the goals of efficiency, calculability, predictability, and control are undermined. As we highlight and expand upon below, these goals are often undermined by inherent flaws within the processes put in place and/or as a result of competing internal and external views of necessity and rationality.

Efficiency

As may be recalled, efficiency refers to the most competent or optimal method for achieving a task. Modern police organizations, especially those adhering to the tenets of NPM, are constantly seeking increased efficiencies. This has particularly been the case in Canada since 2012, when governments began citing escalating police budgets as justification for demanding increased police effectiveness and cost-centered efficiencies (Bass, Kinney, and Brantingham, 2014). In relation to data, several police services have gone the route of making substantial investments in data collection and analysis tools to aid with deployment and other efficiencies.

In the case of the two services studied here, neither had yet gone this route. Instead, the emphasis was on using existing software and human-based processes aimed at increasing individual officer efficiency at the front-end of the data process. However, as we show in this section, call volumes and investigation-related issues also vie for police officer time and attention and can create conflicting demands on officers with respect to what gets prioritized and is therefore dealt with ‘efficiently.’ Further, some of the ‘efficiency fixes’ implemented in both the software and the data flow process itself can lead to data and other errors that serve to decrease the efficiency of actors in other parts of the system.

In interviews, officers and other participants in both services observed that their agencies sought to create efficiencies in data entry through increased mobile access and online document templates and auto-fill functions within the software. For example, a BPS crime analyst described her relatively recent ability to access files “anywhere” as “brilliant” because “if somebody says a major case happened, we can login and we can do our work.” Whereas this was seen as increasing work efficiencies for the crime analysts, patrol and other frontline officers were more inclined to focus on software templates that had been internally created for them to aid in completing reports on special case types more accurately and more quickly. As one patrol Sergeant from BPS explained, “there is some specific templates for the missing person investigation, as well as for domestic violence, and the reason for those is to make sure that all of the required information is actually captured.” He then added, “I know some members, especially initially, have a hard time with filling out templates but it really is, to be honest, an efficient way.” His colleague agreed with this assessment. “Bit by bit, we’re adding a few more templates,” she explained, “there’s kind of a mindset that it's asking people to do more paperwork, when in fact it's the opposite. I'm actually asking them to do less paperwork, and I'm showing them this is the information that I'm looking for you to obtain.”

KPS employs a different software package for data collection and report writing, so officers here spoke more frequently of the relative ease of having software functions that could auto-fill information or ‘drop boxes’ to remind them of which fields they needed to ‘tick off.’ Of particular use to patrol officers in KPS is that the computer-aided dispatch (CAD) reports sync to the software used for Occurrence reports, thus allowing information to be automatically transferred from the E-Comm centre. “If I go to a call and I’m going to write a report, basically [I’ve got] the occurrence number … what the type of call is, what the address is, and then there is some brief information of what the call is. And then there is name and etc.,” a Patrol Officer explained, “so, you can actually prefill that.” Individual officers in KPS also try to create their own efficiency methods to save on call time. An older, highly experienced officer advised that many younger officers employ ‘cheater sheets’ that provide step-by-step reminders of what needs to be done for certain call types and related reports, helping them reduce call and reporting times by focusing their attention on the tasks at hand.
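
As a minimal sketch of what such a CAD-to-occurrence sync might look like, the snippet below copies dispatch fields into a new report so that the officer supplies only the narrative. All field names here are hypothetical; the services’ actual software schemas were not part of our data.

```python
# Sketch of CAD-to-occurrence prefill (hypothetical field names).
# Fields captured at dispatch are copied into a new occurrence report,
# leaving only the narrative for the reporting officer.

CAD_TO_OCCURRENCE = {           # CAD field -> occurrence report field
    "event_number": "occurrence_number",
    "call_type": "offence_type",
    "address": "location",
    "complainant_name": "reporting_party",
    "call_remarks": "synopsis_seed",
}

def prefill_occurrence(cad_record: dict) -> dict:
    """Build a partially completed occurrence report from a CAD record."""
    report = {dest: cad_record.get(src) for src, dest in CAD_TO_OCCURRENCE.items()}
    report["narrative"] = None   # still requires the officer's own account
    return report

# Hypothetical usage:
cad = {"event_number": "KP21-004512", "call_type": "MVC",
       "address": "100 Main St", "complainant_name": "J. Doe",
       "call_remarks": "Two-vehicle collision, no injuries reported"}
print(prefill_occurrence(cad))
```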

In many Canadian jurisdictions, the reality of policing is that call volumes are frequently high, with officers reporting wait queues of forty or more calls – and these are mid-sized agencies in smaller cities. Thus, police officers often have to juggle two conflicting demands on their time: efficiently and accurately completing reports, and responding to outstanding calls. Both sets of interviews revealed the stresses of these twin demands and how they create trade-offs in efficiency and effectiveness. In relation to the stress, a KPS officer explained, “many times I will be sitting writing a report, or even two or three reports at the same time … I may have to go to hide so I can ensure to get the work done, cause anxiety is building up. I’m on call number three and the others have nothing written up.” Such pressures cause officers to take shortcuts to clear their report queue. “My partners are looking at me,” a BPS patrol officer explained, “What the hell are you doing? We’re wasting time. We got this call stacked and you’re writing your reports, taking you an extra thirty minutes to do all this!” Their solution? “Shut up and send [the report in]. Just send it." In some instances, police supervisors on the same watch may prioritize demands differently, creating additional stress. “We had one [supervisor] that was ‘clear the [paper] workflow, the reds are out of control, this is silly’ but on the flip side we had another supervisor who said, ‘you need to go take more calls, go take more calls, worry on your paperwork later!’”

The primary consequence of leaving reports unfinished, or hurriedly trying to complete paperwork, is data error, which can cause inefficiencies across several stages of the data process. Participants across each of the interview categories routinely acknowledged data errors, from incomplete and missing records to factually incorrect details. When such errors are caught by auditing staff, the report is returned to the officer’s workflow to be corrected and resubmitted. In some services – but not all – frontline supervisors may be required to verify the completed work, adding to their own workloads. An officer in KPS acknowledged that “a lot of times we get follow-ups, like [the work] is incomplete.” In some instances, it’s because they “haven’t had a chance to finish it.” In other circumstances, it’s because the paperwork was hurriedly finished to answer incoming calls. A crime analyst in KPS described spending hours reading occurrence reports in order to pull together data on crime trends, because the information required from the officers’ reports could be found in the synopsis but had not been entered correctly in the rest of the report’s data fields. “Most times, the information is there. It’s just not being put in correctly.” Such activities, which occur frequently for crime analysts, were characterized as “super time consuming.”

Calculability

Calculability refers to the ability to quantify objects or break down tasks into measurable elements for the purposes of increasing organizational effectiveness. Policing has long been obsessed with statistics – particularly in the form of ‘crime stats’ – as police performance measures or as justifications for increased resources (Guilfoyle, 2013). Not surprisingly, then, we see repeated references to the uses of data in statistical form for both internal and external audiences. Internally, data is ‘crunched’ by research and crime analysts to create crime and other analyses that can direct police deployment, investigational, and other organizational activities. For example, a KPS senior officer opined that police data was required to “predict what is going to happen in the following year” and how many officers would be required to fill various spots. A frontline supervisor in the same agency said “we rely on data … for officer safety,” indicating that the ability to ‘run an address’ to check for previous weapon-related calls was “very important.” Externally, police data is fed to other government agencies, most notably Crown prosecution services, but also provincial Ministries of Transportation, Ministries of Community Safety, and Offices of Attorneys-General, among others.

More commonly, however, police officers referenced police data and the statistics produced through clearing their paperwork workloads as serving as direct or indirect performance measures. In KPS, an officer believed that her organization might not categorize such stats “as a performance measure, but it one hundred percent is.” A frontline supervisor in the same organization agreed: “At the end of every month I get their statistics reports and I'll have to go meet with my individual people to tell them whether or not they're meeting the objectives. Or whether or not their call times are too high, whether their call times are too low.” We received similar responses from BPS officers. Statistics, one officer opined, allow police supervisors “to see how people are doing.”

In theory, the collection and use of data to enhance an organization’s understanding of its operations is a perfectly rational activity, one that is the cornerstone of management philosophies everywhere. That said, too much of a good thing can produce disastrous effects. What personnel in both of the police services described was this very thing: increased levels of reporting ostensibly for oversight and efficiency purposes. As one officer in BPS observed, “we have this mental health page that we’ve now added to every call that you type a file, and you have to put ‘is it mental health related or?’. It’s just one extra level text page but it’s just one more thing to do, right?” Then there is the new form required for administering naloxone. And forms required specifically for missing persons cases as a result of provincial legislative changes. And Ministry of Transport forms for vehicle collisions. Some of the reports require specific coding for the Centre for Criminal Justice Statistics (CCJS) to produce annual crime statistics, which again adds another layer of work, as some of the coding is counter-intuitive for many officers. “In terms of ethnicity, while ‘W’ is Caucasian, ‘C’ is Black. Why wouldn’t ‘C’ be Caucasian, and ‘B’ be Black?” In terms of frontline efficiencies, all of this extra data collection becomes increasingly unmanageable given the volume of calls. In BPS, it was explained this way: “One of the issues we're dealing with is the amount of files our members have to deal with and getting overloaded on the amount of paperwork. So, adding just another page, and if you multiply the little bit of time by how many files we get, so 50,000 files in a year, it adds up.” And, as we discuss shortly, it creates potential backlogs in other parts of the system, particularly when data errors are factored in.
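
The arithmetic behind that concern is easy to reproduce. Assuming, purely for illustration, that one added page costs two minutes per file:

```python
# Back-of-envelope cost of 'one more page'. The 50,000 files/year figure
# comes from the BPS interviewee; the two minutes per file is an assumption
# made solely for illustration.
files_per_year = 50_000
extra_minutes_per_file = 2

extra_hours = files_per_year * extra_minutes_per_file / 60
print(f"{extra_hours:,.0f} extra officer-hours per year")  # -> 1,667
```

At roughly 1,667 officer-hours, a single extra page approaches the annual working hours of a full-time position.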

Predictability

To illustrate the concept of predictability as an organizational concern, Ritzer (1993) uses the example of McDonald’s fast-food restaurants, which are consciously developed to look very similar, rely on the same ordering process, and typically feature many of the same staple items on their menus wherever you are in the world. In other words, customers can expect the ‘McDonald’s experience’ wherever they go as a result of the organization’s commitment to standardized services that create that uniform experience. It could reasonably be argued that much of policing is anything but routine. However, significant effort is expended in trying to create standardized forms of data collection to ensure operational needs are met.

As one Quality Control Auditor observed, it is imperative that police services ensure “information is correct and up-to-date” so that quality remains consistent within and across organizations. As he further explained, “When we pull out a report from [another agency], we don’t want to pull up a report that we can barely read or it's wrong or there’s something that’s not right with it. We want to make sure we’re doing consistently the same good job.” In the preceding sections, we presented some of the ways in which conformity in reporting is encouraged, namely through the use of templates and checklists. For example, in BPS, officers use a domestic violence checklist to ensure they follow a number of steps in documenting their handling of a case.

Police personnel interviewed also spoke of training manuals for ensuring quality and consistency of reporting, as well as creating training content and/or providing ‘on parade refreshers’ – that is, talks to patrol platoons prior to beginning a shift – on the importance of good data quality. One area of data collection and sharing in which this was deemed particularly important was in relation to the creation of data for CCJS statistics. Not only is there a “big manual” on how to code data appropriately, but the software used in KPS allows officers to run electronic checks for errors, even “from your car.” That said, some officers acknowledged they do not run the self-checks, relying instead on auditors to catch errors.

Despite requirements in both police services to collect data in uniform and thus consistent formats to satisfy the information needs of both internal and external audiences, significant scope for error and other problems remains. Some of these problems are caused by limitations of the software otherwise intended to facilitate consistent and accurate information. One KPS officer laughed because “the spell check [in the software] is ridiculously bad.” Other issues arise due to incompatible demands from different systems. In KPS, officers use military time for witness statements and in general occurrence reports. However, these times have to be translated to the 12-hour clock for prosecution summaries. “Which is confusing … and obviously a source of error,” an officer advised. Another source of both inconsistency and possible inaccuracies is the fact that general occurrence reports are not template based, leaving room for “interpretation” by officers. Indeed, both officers and analysts acknowledged that “for your everyday occurrence files, it’s a gamut [of information and styles].” As to how to make such files consistent? “No one knows.” Even something as simple as an address can be “a huge problem”, especially for crime analysts, as one advised, because of the variety of possible locations outside of a regular street address. “This is totally going to be contingent on that officer’s perception.” Inconsistencies were also observed in relation to different standards of quality control placed on individual platoons by frontline supervisors. “There is no standard. That’s the problem.”
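
The military-to-12-hour translation described above is a small task, but precisely the kind where hurried manual conversion produces errors, particularly around midnight and noon. A minimal sketch of an automated conversion, assuming times are stored as 'HH:MM' strings:

```python
from datetime import datetime

def to_12_hour(military: str) -> str:
    """Convert an 'HH:MM' military time to a 12-hour clock string.

    The edge cases officers must remember by hand -- '00:15' is
    12:15 AM and '12:05' is 12:05 PM -- are where manual errors creep in.
    """
    return datetime.strptime(military, "%H:%M").strftime("%I:%M %p")

print(to_12_hour("00:15"))  # -> 12:15 AM
print(to_12_hour("13:40"))  # -> 01:40 PM
```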

Control

Establishing processes to control operations is intended to ensure the effective delivery of efficient services or products through limiting errors. Thus far, much of the focus of this paper has been on the use of technology to limit error and increase output and some of the problems this can create; however, in this section, we will focus more exclusively on the human processes put into place to control the flow of data within police organizations and to other agencies.

Data collection and sharing processes vary across police services within Canada and elsewhere. For ease of understanding, we provide two examples that map out how data can flow through police organizations in the Canadian context, starting with a call for service or officer-generated event. Figure 1 displays an example of the data process involving a dedicated auditing unit, whereas Figure 2 shows the same process with auditing carried out by a supervisor.

[FIGURE 1 ABOUT HERE]

[FIGURE 2 ABOUT HERE]

As can be seen in the charts above, in theory, there are auditing processes in place to catch data errors, incomplete reporting and so on. This is typically done by a dedicated auditing unit or by the supervisor of the reporting officer. In interviews, officers and auditors emphasized the ways in which auditing is necessary to check data quality, particularly in relation to data sharing with external agencies:

We read all the reports that are submitted by the uniformed division, the uniformed patrol. We review all their charges that they lay, both criminal and provincial. We check for accuracy. Of course, there’s government protocols of deadlines that have to be met. So, we check for that. We review all the motor vehicle accidents that are submitted. Checking for accuracy, errors, correct charges that are being laid, because they get submitted to the Ministry.

Errors or missing information can cause a report to be sent back to the reporting officer, whose supervisors must verify the work was corrected before it is released from their workflow. Depending on the nature of the information, it may also go through various database-specific audits to ensure reporting standards for other, external systems have also been met.

In theory, both self-auditing and systematic auditing processes should provide sufficient controls to ensure data quality. However, personnel from both police services acknowledged this was not the case. In KPS, crime analysts were frustrated with the amount of data they had to clean to do their work. To illustrate, one crime analyst cited an internal review on paperwork associated with property-related cases in which it was discovered “there was like a 75% error rate.” What causes such errors to get missed through auditing? Stated factors include significant volumes of paperwork, high error rates, processing backlogs, and often a strict focus on ensuring certain information is present to keep the flow of paperwork moving to other agencies to meet strict timelines.
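
To make the audit loop concrete, the sketch below models the report workflow of Figures 1 and 2 as a simple state machine. The states and transitions are our abstraction of the processes interviewees described, not any service's actual records software.

```python
from enum import Enum, auto

class ReportState(Enum):
    """States in the audit loop sketched in Figures 1 and 2 (our abstraction)."""
    SUBMITTED = auto()
    RETURNED = auto()           # auditor found errors; back to officer's workflow
    SUPERVISOR_REVIEW = auto()  # supervisor verifies the correction
    APPROVED = auto()           # released internally and/or to external agencies

def audit_step(state: ReportState, has_errors: bool = False) -> ReportState:
    """Advance a report one step through the audit loop."""
    if state is ReportState.SUBMITTED:
        return ReportState.RETURNED if has_errors else ReportState.APPROVED
    if state is ReportState.RETURNED:
        return ReportState.SUPERVISOR_REVIEW   # officer corrects and resubmits
    if state is ReportState.SUPERVISOR_REVIEW:
        return ReportState.SUBMITTED           # re-enters the audit queue
    return state

# One missed field costs three extra hops across officer, supervisor, and auditor:
state, hops = audit_step(ReportState.SUBMITTED, has_errors=True), 0
while state is not ReportState.APPROVED:
    state, hops = audit_step(state), hops + 1
print(hops)  # -> 3
```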

Irrationality as the Unintended Consequence

Throughout the preceding pages we have highlighted several of the ways in which efforts by police services to be efficient, calculable, predictable, and in control of the flow of data and data quality are stymied. In this section, we expand on this by showing how efforts to engage in what might appear to be rational and reasonable data collection can create duplication of effort and waste time and resources, thereby creating operational bottlenecks and increasing resource demands. Compounding such issues is the fact that, despite efforts to ensure quality, data error and missing information remain significant issues, causing crime analysts to spend significant chunks of time trying to clean and piece together data for their various tasks.

Perhaps the single best example of the ‘irrationality of rationality’ in relation to police data collection is the significant amount of duplication and triplication that occurs as a result of demands that fall under the twin umbrellas of ‘calculability’ and ‘predictability.’ Under calculability we have statistics that are ostensibly produced for internal operational needs, and under predictability we find information reporting that must be consistent with both internal systems and the presumed operational and other needs of external agencies. Police officers interviewed were of the view that they were “drowning in paperwork” for purposes of which they were not aware and in which they therefore saw little use. For example, one expressed the view that “nobody looks at these. They go into the box of ‘never to be seen again.’ So why are we doing them? In the city, nobody is pulling these forms from the MTO [Ministry of Transportation of Ontario] and saying we need to refigure our roads.” To fill out this form, another officer advised, one doesn’t just spend 30-45 minutes completing one piece of work: “I write in my book and then I go to the car and put it into a report. And then I have to fill out the form. So, why are we doing it three times for something that in the end, doesn’t matter?”

In relation to increased workload as a result of external demands for information to be consistent with how other agencies use information, undoubtedly most comments were in reference to packages for Crown prosecutors and information shared with provincial ministries. “There’s copying if there’s multiple warrants,” one KPS officer explained, “So, if we arrested a guy on a warrant, like three different warrants, we would have to put my statement under all of those.” Another officer spoke of copying video tapes for different cases “in case [the] Crown loses” a video tape. Officers also referenced the duplication of effort involved in the practice of taking notes in their duty book, which then have to be transcribed into statements and other reports. “I’m pretty much writing a statement in my book and then we have to type it out. So that’s probably the most repetitive part.” Another referred to evidence logs, “there was some duplication there because a lot of times you’re writing in your duty book and you’re writing it in the exhibit log.” When asked about duplication, one laughed, “we love duplication, replication, triplication.” In discussing the redundancies that occur in relation to extra forms for collecting statistical data, one officer explained, “sometimes we do things because that’s the way we’ve always done it … you get stuck in a loop that may have made sense ten years ago, but nobody necessarily looks at it today.”

Frustrations were also raised over what many interviewees saw as a system of control over data flow and quality that created unnecessary work, in some instances as a result of an excess of efficiency. Most commonly cited were instances where officers had yet to complete work that was showing in their queues as ‘pending’ and the work was already being returned by auditors for errors or being incomplete. Some observed that inefficiencies were being caused by reports rejected for minor coding errors they believed could have easily been fixed by auditors. Instead, the reports would be returned to the officer’s work queue for correction, necessitating a supervisor to review the corrections. To illustrate, one officer cited CCJS coding errors. “You took the time to [review the work], you found the error, you send the follow up through the software, we have to correct it, we have to update the audit that you sent me through the software and then my supervisor has to sign off on it,” one officer marveled. “Why wouldn’t you just change the code?” An officer in BPS queried the cost effectiveness of having officers doing a lot of this work, “Do you pay a police officer making 100,000 and some odd thousand dollars a year to do that minor data entry for correction based on how busy we are, or do you pay someone [else]?” Another similarly observed, “Now the job of the frontline policing officer is capturing so much data. Is it useful? Do we need it? Is it the best use the frontline police officer to capture that data? Or can we do it somewhere else from the backend that makes much more sense?”

The capture of ‘so much data’, and the requirements for its processing, also create problems as it filters through the control process. In particular, interviewees noted both real and potential bottlenecks as operational and other reports grow in both size and number. In BPS, it was acknowledged by one interviewee that additional complexities in the creation and handling of certain file types were creating “time constraints” and producing a “huge backlog”. In KPS, auditors believe that the efficiency of data flow is “at an acceptable level,” but acknowledge “there’s always room for improvement.” That said, a frontline officer worried about the cumulative effect of continuing to add data demands: “You’re pulling a wheelbarrow full of bricks and what’s one more brick? One brick at a time it’s not that bad, until you realize that somebody had added 30 bricks to the wheel barrel.”

One of the overarching goals of police data processes is the creation of useful information to achieve organizational mandates. However, crime analysts – who are tasked with using data to help guide police investigations, patrol and other internal uses – note that much of the data they rely upon is ‘dirty’, meaning inaccurate or incomplete. To illustrate, one referenced auto theft data as “problematic” because it’s “particularly dirty.” He had tried to “do a long-term check report on auto theft and it was impossible.” Common errors cited include incorrect addresses, incorrect business names, incorrect location identifiers, lack of detailed information on car make, model, colour and year, among others. “One of the worst”, “dirtiest” fields, according to another crime analyst, is “time.” “If we get called for a break and enter, and we call you at 8am and report that my house had been broken into overnight, sometimes the officers will put the occurrence time as 8am. Well, that’s not the time it happened, that’s the reporting time.”
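
Some ‘dirty’ fields can at least be flagged mechanically. As a hypothetical illustration of the occurrence-time problem described above, a cleaning script might flag records whose occurrence time exactly matches the reporting time for offences typically discovered after the fact. The rule and field names are our own invention, not a tool used by either service.

```python
from datetime import datetime

# Offence types usually discovered after the fact (illustrative list):
DISCOVERED_LATER = {"break_and_enter", "auto_theft"}

def flag_suspect_times(records: list[dict]) -> list[dict]:
    """Flag records where the occurrence time equals the reporting time.

    For offences discovered after the fact, identical timestamps usually
    mean the officer entered the call time as the occurrence time.
    """
    return [r for r in records
            if r["offence"] in DISCOVERED_LATER
            and r["occurrence_time"] == r["reported_time"]]

records = [
    {"offence": "break_and_enter",                  # suspect: times identical
     "occurrence_time": datetime(2018, 3, 2, 8, 0),
     "reported_time": datetime(2018, 3, 2, 8, 0)},
    {"offence": "break_and_enter",                  # plausible overnight gap
     "occurrence_time": datetime(2018, 3, 1, 23, 30),
     "reported_time": datetime(2018, 3, 2, 8, 0)},
]
print(len(flag_suspect_times(records)))  # -> 1
```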

Aside from the inefficiencies such errors create for crime analysts, who expend significant amounts of time ‘cleaning data’ (i.e., identifying, adjusting, replacing, and/or removing incorrect, duplicated, and incomplete data within police record management systems), they also cause problems in terms of the validity and reliability of crime data. “I’m doing many things that I’m iffy about. ‘I don’t love this because I’m not convinced it is accurate’ and you know the person you’re giving it to, the person of authority, is going to be making decisions based on it.” This view is shared by a senior officer, “data is my friend and my enemy. The problem in our organization is our data is not clean.” This presents significant issues for him because “how do I hold people accountable below me for those numbers that I can’t one hundred percent believe in?”

Discussion

Under the guise of the NPM movement, shifts have occurred within the policing profession, with an increased focus on fiscal responsibility, accountability, standardization, performance, and control. Drawing on Ritzer’s (1993) McDonaldization thesis, we show how efficiency, calculability, predictability, and control surface within the data processes of two Canadian police organizations. For instance, participants identified that their organizations implemented tools such as drop-down options to enhance efficiency in data collection, or auditing not only to ensure control over the data flow and process but also to ensure predictability – that is, consistency – in the data. Such measures are important for data that are disseminated outside of the organization or drawn upon internally by crime analysts as part of their workflow to assess ongoing and anticipate future issues.

Despite these efforts, overwhelming but seemingly rational informational pursuits from a multitude of actors, both internal and external to the police organization, can generate unintended consequences within these data processes that undermine the intent of creating rational data systems. Such ‘backfire effects’ are what Ritzer (1993) refers to as the ‘irrationality of rationality.’ More specifically, our participants revealed that, owing to the volume of information demanded, there is significant duplication or even triplication of data entry requirements. This not only meant that some participants were ‘drowning in paperwork,’ but also that they felt pressure from certain co-workers and supervisors to rush through data entry so that they could address ongoing calls for service piling up in their call queue. Consequently, data errors are generated that are either caught by auditors and subsequently sent back to the reporting officer to be addressed or are missed altogether. In the latter case, inefficiencies are generated later in the data process, as crime analysts are required to spend large portions of their work time addressing ‘dirty’ data issues.

Beyond irrationalities that impact data quality, and thus the efficiency of the overall data process, it is important to acknowledge the negative ‘trickle effect’ that data-based irrationalities can have further down the line. For instance, as one of our crime analyst participants identified, despite lacking confidence in their data, their work is ultimately passed to a person of authority in the organization who may use it as part of their decision-making process. If the data are unreliable, and thus do not reflect the true nature of an issue, decision-makers may make problematic, ill-informed decisions. For example, scholars largely agree that in order to adequately reflect the underlying spatial patterns in crime data, a certain success threshold needs to be attained when geocoding location data (Andresen et al., 2020; Briz-Redón et al., 2019; Ratcliffe, 2004). As such, even though crime analysts diligently attempt to fix location data issues, the potential exists for crime maps to unintentionally misrepresent spatial crime patterns. As a result, police decision-makers may deploy resources in ways that under-police neighbourhoods in which problems exist and/or over-concentrate on spaces which have less need of proactive policing.
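
The geocoding concern can be expressed as a simple pre-mapping check. Ratcliffe (2004) estimated a minimum acceptable geocoding hit rate of roughly 85%; the sketch below, with invented counts, shows how an analyst might test whether a cleaned dataset clears that bar before mapping it.

```python
# Geocoding hit-rate check. The 85% threshold is Ratcliffe's (2004)
# estimated minimum acceptable hit rate; the record counts are invented.
MIN_HIT_RATE = 0.85

def safe_to_map(geocoded: int, total: int) -> bool:
    """Return True if enough records geocoded successfully for spatial analysis."""
    return geocoded / total >= MIN_HIT_RATE

print(safe_to_map(geocoded=4_100, total=5_000))  # 82% -> False: a map of these
                                                 # data risks misrepresenting
                                                 # spatial crime patterns
```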

The effects and consequences of these findings are many. First, as Ritzer’s (1993) thesis suggests, the McDonaldization dimensions emerging in data systems have adversely affected the quality of police work, produced an array of faults, and forced trade-offs between efficiency and effectiveness. Second, officers reported a range of emotional and mental consequences, including frustration, pressure from conflicting demands and constraints, and a sense of separation from their work brought on by the shortcuts and trade-offs needed to fulfill the requirements. In other words, movements towards rational data processes may dehumanize police officers and foster alienation (Braverman, 1974; Ritzer, 1996). Lastly, an ‘offloading’ occurred within the system too, whereby officers displaced responsibility onto auditors and other members to catch data errors in order to maintain some level of efficiency and effectiveness in their own work. Such behaviours and actions could represent a rebellion against rationalization in policing systems, such that police officers are attempting to work against and disrupt such processes (whether through choice and/or because of the adverse consequences). Conclusions on this could not be offered in the current study, but it may represent a promising avenue of further inquiry into how officers experience and handle irrationality in rational data processes.

Ultimately, these negative effects range from individual-level officer dissatisfaction to structural and systemic data concerns with wide-ranging implications. For example, research has noted that disorganized and partial records have undermined criminal cases, and misclassified reports have compromised the utilization of crime data for prevention and recidivism programming (e.g., Canter & Alison, 2003; Nolan, Haas, & Napier, 2011; Loftin et al., 2015; Carmody, 2017). In this way, issues with police data processes represent both a barrier to and a limit on the dimensions of McDonaldization in policing, leading to various counter-reactions that serve to limit the spread of efficiency, calculability, predictability, and control. In sum, the application of Ritzer’s thesis shows that the various irrationalities of rationality that inevitably accompany McDonaldization are present in police work and data systems, pointing to a tension and/or disconnect between the policing goals of increasing efficiency and effectiveness for fiscal and other reasons, such as the movement towards evidence-based policing, and a long line of inefficiencies impeding these aims and adversely affecting police service delivery.

While the present study offers a valuable contribution to the literature, no study is without limitations. In this instance, we recognize the exploratory nature of the work conducted, which focuses solely on two mid-sized Canadian police services. While it is our experience that the irrationalities documented in the preceding pages are hardly unique to these two services, it is likely that police services in other jurisdictions will have very different, and perhaps more effective, data collection, analysis, and use processes. It is our intention to pursue such research in the future, and we certainly encourage scholars in other countries to engage in similar analyses. Indeed, as the context shifts, there are likely not only different compositions of actors who require police information for their rationally conceived purposes, but also different and unique attempts at efficiency, calculability, predictability, and control in data processes that in turn cause their own unique irrationalities. Comparative approaches especially could identify data processes with few irrationalities or consequences that could help inform best data practices.

We also note that our sampling efforts in these two services were hardly exhaustive and that our overall sample size could have been larger. That said, when it came to certain roles and functions within a service, such as auditors and analysts, we interviewed the maximum number possible given the limited number of individuals occupying these roles. In relation to frontline officers, we stopped collecting data when we achieved data saturation (Creswell, 1998). Thus, our study includes a range of participants recruited from across both services, providing a diverse set of perspectives from various levels and units on their respective data issues.

Conclusion

Despite the preliminary nature of the present findings, the irrationalities discussed herein suggest a need for police services that may be struggling with inefficient data systems to consider projects of ‘re-rationalization’ aimed at more deeply examining the types of information collected, the means by which they are collected, avenues for improving data quality, and the extent to which data collection serves non-operational and/or non-useful purposes. For example, given the duplication and triplication present within current processes due to demands from multiple actors, a plausible solution may be to work with all actors to condense their respective requests so that they can be addressed on a single form. In this way, officers can spend fewer hours on paperwork while still generating the data required by government ministries, insurance agencies, and others.

Alternatively, some of our participants revealed that their occurrence report system automatically extracted relevant data from CAD to aid in completing the associated report. Attempts at expanding these technologies should certainly be explored. For example, allowing certain call-type-specific forms to also pull from CAD would likely lead to many being auto-filled to the point where minimal time would be required from the reporting officer to complete the form. Furthermore, duplication could also be minimized by increasing the adoption of mobile technologies that could replace an officer’s personal notebook. In this way, frontline officers would no longer be required to re-type content into an occurrence report, thus freeing them up for other efforts.

Modifying and streamlining existing data collection practices on the frontline could have numerous beneficial impacts on the police organization. For instance, less time spent on paperwork could result in timelier responses to calls for service, thus clearing call queues. Consequently, no longer having paperwork and call queues vying for officer time could allow the frontline to spend their on-shift time on more productive efforts that benefit the community. Similarly, streamlined paperwork could lead to fewer data entry mistakes. This, in turn, could result in fewer follow-up requests from auditors, preventing time from being wasted not only by the reporting officer in fixing the error but also by their supervisor in confirming that the error was addressed. Further, crime analysts would likely spend less time cleaning data, instead focusing on other responsibilities that could help the organization address ongoing issues. Additionally, analysts’ confidence in the data and their own work would likely increase, which is paramount given that analysts’ work is crucial to some decision-making processes. While these recommendations could be of use, future research would benefit from studying how to improve data processes and alternative strategies of resistance to the ills brought on by rationalization.

References

Andresen, M.A., Malleson, N., Steenbeek, W., Townsley, M., and Vandeviver, C. (2020), ‘Minimum Geocoding Match Rates: An International Study of the Impact of Data and Areal Unit Sizes’, International Journal of Geographical Information Science, 1–17.

Bass, G., Kinney, B. and Brantingham, P. (2014), ‘Economics of Policing: Complexity and Costs in Canada, 2014.’ ICURS report. Available at: https://www.cacp.ca/index.html?asst_id=576
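
Braun, V. and Clarke, V. (2006), ‘Using Thematic Analysis in Psychology’, Qualitative Research in Psychology, 3(2): 77-101.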

Brehm, J., Gates, S. and Gomez, B. (2003), 'Donut Shops, Speed Traps, and Paperwork: Task Assignment in Policing,' in K. Meier and G. Krause (eds.) Politics, Policy and Organizations: Essays on the Scientific Study of Bureaucracy. Ann Arbor, MI: University of Michigan Press. Pp. 133-159.

Briz-Redón, Á., Martinez-Ruiz, F., and Montes, F. (2019), ‘Reestimating a Minimum Acceptable Geocoding Hit Rate for Conducting a Spatial Analysis’, International Journal of Geographical Information Science, 1–23.

Brodeur, J-P and Dupont, B. (2006), ‘Knowledge Workers or “Knowledge” Workers?’, Policing & Society, 16(1): 7-26.

Butterfield, R., Edwards, C., and Woodall, J. (2004), ‘The New Public Management and the UK Police Service: The Role of the Police Sergeant in the Implementation of Performance Management’, Public Management Review, 6(3): 395-415.

Chan, J. (2005), ‘Police and New Technologies,’ in T. Newburn (ed.) Handbook of Policing. Portland, Ore.: Willan Publishing. Pp. 655-679.

Creswell, J. (1998). Qualitative inquiry and research design: Choosing among five traditions. Thousand Oaks, CA: Sage Publications.

Ericson, R. (1982), Making Crime: A Study of Detective Work. Toronto: University of Toronto Press.

Ericson, R. and Haggerty, K. (1997), Policing the Risk Society. Toronto: University of Toronto Press.

Flanagan, R. (2007), ‘The Review of Policing: Final Report.’ U.K. Home Office. Available at: http://webarchive.nationalarchives.gov.uk/20080910134927/police.homeoffice.gov.uk/publications/police-reform/review_of_policing_final_report/

Goldstein, H. (1964), ‘Police Discretion: The Ideal vs. the Real.’ Public Administration Review, 23(1): 140-148.

Guilfoyle, S. (2013), Intelligent Policing: How Systems Thinking Methods Eclipse Conventional Management Practice. London: Triarchy Press.

Gundhus, H. (2012), ‘Experience or Knowledge? Perspectives on New Knowledge Regimes and Control of Police Professionalism’, Policing, 7(2): 176–192.

Lasiewicki, P. (2007), ‘Achieving Congruence Between Individual Commitment to Policing and Organizational Objectives in Police Departments’, PhD Dissertation. University of Phoenix.

Malm, A., Pollard, N., Brantingham, P., Tinsley, P., Plecas, D., Brantingham, P., Cohen, I. and Kinney, B. (2005), ‘A 30 Year Analysis of Police Service Delivery and Costing: ‘E’ Division.’ Available at: https://www.ufv.ca/media/assets/ccjr/ccjr-resources/ccjr-publications/30_Year_Analysis__%28English%29.pdf

McLaughlin, E. (2007), The New Policing. Thousand Oaks, CA: Sage.

McLaughlin, E. and Murji, K. (2001), ‘Lost Connections and New Directions: Neo-liberalism, New Public Managerialism and the Modernization of the British Police,’ in Stenson, K. and Sullivan, R. (eds.) Crime, Risk and Justice: The Politics of Crime Control in Liberal Democracies. Cullompton, UK: Willan. Pp. 104-121.

Power, M. (1997), The Audit Society: Rituals of Verification. Oxford: Oxford University Press.

Ratcliffe, J. (2004), ‘Geocoding Crime and a First Estimate of a Minimum Acceptable Hit Rate’, International Journal of Geographical Information Science, 18(1):61–72.

Reiner, R. (1993), ‘Police Accountability: Principles, Patterns and Practices,’ in Accountable Policing: Effectiveness, Empowerment and Equity. Pp. 1-24.

Ritzer, G. (2014 [1993]), The McDonaldization of Society, 8th ed. Thousand Oaks, CA: Sage.

Sanders, C.B. and Langan, D. (2019), ‘New Public Management and the Extension of Police Control: Community Safety and Security Networks in Canada’, Policing and Society, 29(5): 566-578.

Skogan, W. (2010), Police and Community in Chicago: A Tale of Three Cities. Oxford: Oxford University Press.

Waters, I. (2000), ‘Quality and Performance Monitoring,’ in Leishman, F., Loveday, B., and Savage, S. (eds.) Core Issues in Policing. London: Longman.

Weber, M. (2003 [1905]), The Protestant Ethic and the Spirit of Capitalism. New York: Dover.

Weber, M. (1968 [1921]), Economy and Society (3 vols.). Totowa, NJ: Bedminster Press.

Weber, M. (1947 [1925]), The Theory of Social and Economic Organization. London: Collier Macmillan.

Diagrams and Tables

Table 1. Police Interviewees

Participant Category              n
Kerbyton (KPS)
  Police officers                17
  Crime Analysts                  4
  Auditors/Quality Control        4
Belleview (BPS)
  Police officers                15
  Crime/Research Analysts         3
  Auditors/Quality Control        2
  Info Management                 1
Total                            46

Table 2. Codes

Major Theme      Codes
Efficiency       ‘best method’, ‘save time’, ‘work faster’, ‘easier’, ‘streamline’
Calculability    ‘stats’, ‘statistics’, ‘numbers’
Predictability   ‘verify’, ‘safeguard’, ‘correct’, ‘audit’, ‘standard’, ‘standardized’
Control          ‘process’, ‘system’, ‘software’, ‘error’, ‘human error’
Irrationality    ‘duplicate reports’, ‘triplicate reports’, ‘reports with no known purpose’, ‘problem’

Table 3. Number of Participants per Theme

Major Theme      n
Efficiency       29
Calculability    19
Predictability   20
Control          27
Irrationality    32

