
A Retrospective Look at My Professional Life

Published on Sep 30, 2020

Now that I’m just about done with the working part of my life, I thought that I’d try to trace the steps that I took in getting to this point, from initially embarking on a purely technical career to one that ended up in a social science. I describe how I drifted into studying crime. It may read a bit choppy, but the path I took was not straight; I followed my nose and what interested me, rather than any plan. To some extent, it shows the benefits of what Donald Campbell termed the “fish-scale” model of interdisciplinary studies: move away from the specialization you focused on in school, and apply the skills you learned to another area, since you have a different viewpoint (and skills) that can be valuable.

[And now, with the COVID-19 pandemic upon us, I realize how fortunate I was to grow up during a time when going from school to college to graduate school to work, and then to retirement, was an expected and natural progression. I just hope that the current situation does settle down soon, so others can find a stable path through their lives.]

  1. Prologue

Before getting into the details of my life’s trajectory, here are two of my favorite quotes. They indicate, respectively, the promise and perils of quantitative analysis, which I have been doing for the past 50-odd years. The first is a 1948 poem by a mathematician, Clarence Wylie:


Not truth, nor certainty. These I forswore
In my novitiate, as young men called
To holy orders must abjure the world.
"If ..., then," this only I assert;
And my successes are but pretty chains
Linking twin doubts, for it is vain to ask
If what I postulate be justified
Or what I prove possess the stamp of fact.

Yet bridges stand, and men no longer crawl
In two dimensions. And such triumphs stem
In no small measure from the power this game,
Played with the thrice attenuated shades
Of things, has over their originals.
How frail the wand, but how profound the spell!

And “…this game” is the game I have enjoyed playing, extracting meaning from the numbers that describe various phenomena.

The second quote is even older, from a 1929 book, Some Economic Factors in Modern Life, by Sir Josiah Stamp:

“The [Indian] Government are very keen on amassing statistics – they collect them, add them, raise them to the nth power, take the cube root and prepare wonderful diagrams. But what you must never forget is that every one of those figures comes in the first instance from the chowty dar (village watchman), who just puts down what he damn pleases.”

I suppose that I’ve spent most of my professional life shuttling between these two extremes. I am intrigued with the way that statistics can tell us so much about activities and operations that are hidden within the numbers. But I am equally intrigued with the way that they can tell us so much about the inner workings of the agencies that produce the statistics, sometimes trumpeting them, sometimes hiding them. With these poles as the yin and yang of my career, here’s my story.

  2. From Electrical Engineering to Police Communications

Let’s face it, I was a nerd – maybe not completely, since I’ve always enjoyed playing sports, especially 4-wall handball. But I also enjoyed learning and did well in school. So I skipped half of first grade and all of third grade, which was fine academically, but held me back socially: I was almost always the youngest kid in class, and by skipping I didn’t have a cohort of friends to go to school with.

When I was in junior high school in Brooklyn I took an all-city test of academic proficiency. Those who got sufficiently high scores could go to one of the three academically advanced high schools: Bronx High School of Science, Brooklyn Technical High School, and Stuyvesant High School (which was in Manhattan). I did well enough on the test to attend one of the three schools. I didn’t consider Brooklyn Tech because I wasn’t sure that I wanted to get a primarily technical education, and Bronx Science was too far from Brooklyn to consider. So I opted for Stuyvesant, even though it meant that I would have a longish (45-minute) commute into Manhattan every day.

It was worth it for me. My time at Stuyvesant was well-rounded; for example, I wrote an essay about life with three sisters for the school’s literary magazine (“My Own Three Angels”) and was an editor of the magazine as well. And, unusual at that time, I was able to take calculus (they didn’t have AP courses then); for the most part I didn’t understand the concepts, but was able to memorize the procedures to do well in it. This helped me later in college, when I retook it; in going over the material the second time I was able to understand it more deeply – and it taught me a lesson about going over a difficult concept more than once, from the beginning, instead of just giving up on it.

To accommodate all its students, Stuyvesant was on split session, with no lunch period: freshmen and sophomores went from 12:30-5 and juniors and seniors went from 8-12:30. So during my last two years there, I was able to work an after-school job at United Lawyers Service, delivering papers to, and collecting papers from, lawyers throughout the city. Although my father was a lawyer, I wasn’t interested in becoming one. And it turned out that I was interested in going into engineering after all: there were interesting jobs to be had with an engineering degree, but I wasn’t sure what I could do if I majored in other areas.

I applied to three schools, Cooper Union in lower Manhattan, New York University in upper Manhattan, and Rensselaer Polytechnic Institute (RPI) in Troy NY, and got into all three. I chose the last one because the nephew of a friend of my mother’s (Sam Markowitz, who subsequently taught chemistry at Berkeley for years) had gone there and thought very highly of it.

We were far from well-off. My father wanted me to go to Cooper Union, since its tuition was free and I could live at home, but my mother supported my going to an out-of-town college if we could afford it – we lived in a two-bedroom apartment, and I shared a bedroom with my three sisters, so this would get me out of the house during my older teenage years. With a $350/year New York State scholarship (which I received, based on my score on the Regents exam, a statewide test) and a $700 RPI scholarship, I could just afford it: RPI’s tuition, room, and board at that time was $1050/year. And I had a summer job after graduating high school, again at United Lawyers Service but full-time (at the relatively high salary of $1/hour!), so that helped as well.

Going to college. I had no information about RPI except for the brochures they sent; it was too far away for me to visit beforehand. So one September day, with two suitcases in hand, my father took me to Grand Central Station to board a train to Albany, and from there I took a cab to RPI. My roommate in the freshman dorm was a friend from Stuyvesant, so that made the transition easier.

I joined a fraternity (Alpha Epsilon Pi) and started smoking cigarettes, both to try to remake my image of myself as more mature, to leave the nerd behind. I think that’s what many people do when they go to college, especially when living away from home. I suppose that I tried to emulate those who seemed to be more popular, even to the extent of getting drunk at fraternity parties (I'm not much of a drinker), but I still kept up my studies. In fact, part of my popularity was due to my helping others understand some of the concepts in the courses we took.

I applied for the Navy ROTC program, because that program paid all expenses (and you went into the Navy as an officer). I didn’t get accepted because a physical exam turned up albumin in my urine.1 So I joined the Air Force ROTC; I think I was trying to make sure that I avoided the Army, since the Korean War had just ended and things did not look good in the Far East.

I quit ROTC after two years; what I hated most was the Tuesday afternoon drill on the Troy High School parking lot, across the street from my dorm, learning how to march in lockstep with everyone else – I guess that’s the purpose of military drill. So after a few months of marching in the fall and the winter cold of Troy, I joined the Air Force ROTC band, which practiced (indoors!) during the drill time: this despite the fact that I couldn’t (and still can’t) read music or play an instrument. So I “played” the cymbals, with the bass drummer giving me cues as to when I should strike them together. Since we had a limited repertoire of marches, Washington Post March and others, I was able to memorize when to bring the two cymbals together.

My first experience on an airplane came via AFROTC during those first two years: the head of the program took a bunch of us up in a T-20 trainer, and we all took a turn at piloting. I’m sure I was pretty bad at it, never having steered a car, or even a bicycle – our street life in Brooklyn did not include bike-riding. In fact, I didn’t learn how to ride a bicycle until graduate school at Stanford, when I bought a bike.

I took five courses in my first semester at RPI, as did everyone else. This included Calculus I which, as I noted earlier, I had taken during my senior year of high school. It was a senior elective there but, as I said, I was able to do well in that course by applying formulas that I had memorized, even though I didn’t quite understand them.

So I suppose I could have skipped Calculus 1 at RPI, but because I felt I had limited understanding of the concepts I decided that it would be best if I started at the beginning, all over again. And it was a good thing, for both me and my classmates, that I did, because the instructor was the wife of the head of the math department and was really not qualified. She brought in notes that her husband had prepared the night before, transcribed them onto the blackboard, and read from a script he had prepared. But there was an occasional slip-up. I raised my hand quite a bit in that class, asking her if there was a mistake, and she said that she would look into it. On the next day I would find a note on my desk saying that I was right. Her “teaching” may have confused the others in the class, but I was often able to straighten them out.

One of the lessons I took away from this experience is that you don’t always learn the subject very well the first time through. So thereafter (e.g., learning a new computer language or other subject) I would read over the material until I had a general idea about its content, then apply it and make mistakes, and then go back to the books and really understand it.2

Of course, the RPI scholarship then became part scholarship, part loan after the first year. But I was able to get into RPI’s cooperative engineering program, which gave me paid semester-long work assignments with General Electric. So I was not that great a burden on my parents, who still had three daughters to support and get through school. And during the summer after I graduated, with a Bachelor of Electrical Engineering degree, I got a job at Grumman Aircraft in Bethpage, Long Island. My job was to calibrate aircraft bombsights, which was pretty boring, as were my coop assignments; it reinforced my desire to get an advanced degree, because with only a bachelor’s degree the work would probably be similar to my GE or Grumman experiences.

Applying to graduate school. I applied to MIT, the University of Michigan, and Stanford University graduate schools. My backup was to go to work for Bell Labs in New Jersey, which also paid its employees’ graduate school tuition. I did not get into Michigan, got the runaround from MIT (see Figure 1), and was accepted by Stanford with a research assistantship.

Figure 1. My acceptance and rejection letters from MIT

To improve my chances of getting into Stanford I tried to game the system. My essay focused on what was one of Stanford’s specialties, microwave and UHF electronics, since I had done well in a course in that field and got the instructor to give me a reference. But to tell the truth, I wasn’t that interested in it.

Despite that, when I got to Stanford I was given a research assistantship with Marvin Chodorow, one of the top researchers in microwaves. However, I was soon turned off by the work I was to do: my first assignment was to soak pieces of blackboard chalk in a hot, thick sugar solution and then bake them, so that the hydrate part of the carbohydrate disappeared, leaving only the carbon to permeate the chalk. The carbon rods would then be placed in holes in a waveguide, to attenuate the signal. Yes, I know that you have to start at the bottom, but that was a bit too low for me.

My next research assistantship. During my second quarter there I took a course in control systems. That area was more to my liking than worrying about how electrons traveled through tubes or wires or the air, or about how they made light or sound. So I started looking around for an assistantship in that area. I was interviewed by Professor Irmgard Flügge-Lotz, who was in the Engineering Mechanics Department but was also affiliated with the Electrical Engineering Department in the control systems area. A German national, she and her husband, G. William Flügge, also an engineer, had worked during World War II at Peenemünde, the German rocket research facility; after the war they were brought to France (ONERA) and then moved to Stanford.

When I was first interviewed by her for the research assistantship position, she asked me in a very kindly voice, “Where is your family from?” I said, “We’re Jewish,” and she replied, “Yes, I know; but where (in Europe) were they from?” I told her that my family was originally from Poland. I’m sure that by those questions she was trying to put me at my ease.

Flügge-Lotz was doing NASA-sponsored research; I became her research assistant and worked with her on what is known as contactor (or relay) control systems, wherein a control jet is turned full on or full off to turn or move a device. Here’s the way it works: to get, say, fifty percent of the oomph out of the control jet, it is set to “chatter”, to turn on and off rapidly, half the time on and half off. And to reduce the jet’s activity, as the device got closer and closer to the desired position, it would still be turned on and off rapidly, but would be off for a greater and greater fraction of each chatter cycle.
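The duty-cycle idea behind chatter can be sketched in a few lines of code. This is an illustrative toy of my own, not Flügge-Lotz’s actual design: the jet is always either fully on or fully off, and the average output is set by the fraction of each chatter cycle spent on.

```python
def average_thrust(duty_fraction, full_thrust, steps=10000):
    """Simulate on-off 'chatter' control: the jet is either fully on or
    fully off, and the average output is set by the fraction of each
    chatter cycle spent in the 'on' state."""
    total = 0.0
    for i in range(steps):
        phase = (i % 100) / 100.0       # position within one chatter cycle
        jet_on = phase < duty_fraction  # on for the first part of the cycle
        total += full_thrust if jet_on else 0.0
    return total / steps

# Half-on, half-off chatter delivers half the jet's full thrust on average.
print(average_thrust(0.5, 100.0))  # -> 50.0
```

Reducing the duty fraction as the device nears its target position tapers the average thrust smoothly, even though the jet itself never operates at partial power.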

Her control system directed the “buzz bombs,” the V-1 rockets, that terrorized England during World War II; the buzzing noise was the chatter of relays that were used to steer the rockets. The Britons on the ground knew that the dangerous time was when the buzzing stopped, since then the bomb was no longer flying but was headed downwards.

Her original research was described in her book Discontinuous Automatic Control, which explained how it was used to control simple devices. Her earlier students had followed up on that work: she set them to developing control systems for more and more complicated device configurations. I was given the task of designing a discontinuous control system for the next most complicated level of device. But I farted around with that assignment, trying to understand what happened when the jets went into chatter. Sometimes the chatter was completely random, flipping on and off for no apparent reason, but at other times it behaved differently – and I was able to describe when the chatter was controlled and when it was essentially uncontrolled. I did this by plotting the different chatter regimes, and by doing so I was able to show that one of her earlier students had made some mistakes in his dissertation. That was a lucky break for me, because Flügge-Lotz was about to fire me for not adhering to the assignment she gave me. Over the next year I continued to work on that topic.

Without realizing it, I found what turned out to be my métier in doing research: plotting up data to see what it looks like and what it could tell us about the data-generating process. In fact, 42 pages of my 111-page dissertation contained data plots. Of course, visualizing data is standard in engineering, but when I began to study crime and law enforcement I applied it there.

I defended my dissertation in the early spring of 1963, and I started looking for jobs. There were a lot of faculty positions available in colleges and universities, since engineering was on the upswing at that time. But just moving from one side of the lectern to the other didn’t seem to me to be that attractive: I wanted to get some real experience under my belt before returning (as I had always expected to do) to academia.

My first jobs. One of the better (and better-paying) offers I received was from Aerospace Corporation in El Segundo, California. Aerospace had an interesting portfolio, since they were working in areas at the forefront of aerospace engineering, as well as in areas that were right up my alley: one of the prime examples in my dissertation was to develop a control system to keep a satellite oriented to face the earth, so it seemed to be a natural fit.

When I got there, however, I was assigned to work on designing control systems for intercontinental ballistic missiles. It seemed like too much of a rut for me, to basically stick to the same area I knew rather than to get into more interesting things further afield from my dissertation research. Moreover, I wondered if helping to build weapons of mass destruction should be the culmination of my twenty years of school – especially considering the origins of my research area.

Denmark. Preben Prehn, a visiting scholar at Stanford from Denmark, suggested that I write to his school, the Technical University of Denmark, to see if I could spend a year there. I did, and was accepted, and was able to turn my Aerospace position into a summer job. It turned out that the Danish position was available because the engineers’ union had been striking the university for improved benefits and wages, so in essence I was a scab! My year in Denmark was, on the whole, enjoyable: I taught a lab course in control systems; I worked on a paper Flügge-Lotz and I published in Automatica; I learned Danish; I was able to ski in Norway and Austria; and I was also able to think about what I wanted as a career.

Some significant memories: I remember being at the house of a friend, Steve Bigelow, then a Fulbright scholar, playing bridge with a girlfriend and with Steve and his wife, when Steve’s landlord came in and told us the news that President Kennedy had been killed. My answer to “Where were you when…?”

And before the Civil Rights Act was passed (under LBJ), I was put on the defensive by my Danish colleagues at DTH concerning the way the US treated Blacks. I was apologetic, noting that we were working on it. That was the way things stood, me feeling somewhat defensive all the time, until I noticed something: I remarked to them that it seemed to me that almost all of the menial jobs in Denmark, street sweepers and the like, were handled by ethnic Greenlanders (readily identifiable due to their Inuit ancestry); they were supposedly full citizens of Denmark but were hardly given equal opportunities. After that, there was no mention of the mote in the United States’ eye.

Becoming a consultant. Looking around at what was available in the US, I was impressed by one engineering consulting company, Arthur D. Little Inc. (ADL), in Cambridge, Massachusetts. As a consulting firm, they had a great variety of assignments. So I wrote them, and they were encouraging; upon my return in June 1964 I went for an interview, and was offered a position in its Systems Engineering Section. An aside: my supervisor, Gordon (Tobey) Raisbeck, had been a student of Norbert Wiener’s and had married his daughter, Barbara. So I was two degrees of separation from the founder of cybernetics. And from Flügge-Lotz I’m also four degrees of separation from Ludwig Prandtl!

My first two assignments at ADL were on US Navy projects. So much for my peacenik stance, although my rationalization was that the assignments were not weapons-related, but rather focused on defending against enemy (i.e., Soviet) submarines. My first project was to determine the best pattern to deploy sound detectors (sonobuoys) in the ocean to enable them to determine the location, direction, and speed of submarines. The sonobuoys were deployed by a drone helicopter that was launched from a destroyer; the system was known by the acronym DASH (drone anti-submarine helicopter).

This was not straight control systems engineering, but more in the nature of operations research (OR), and my colleagues and mentors in this activity were among the top researchers in the field, George Kimball and Bernard Koopman, both of whom were consultants to ADL. Kimball was coauthor of Methods of Operations Research, and Koopman had written Search and Screening, both path-breaking books in OR. My coworker on this and subsequent projects was Steve Pollock, who was just finishing his PhD in OR at MIT.

Two additional projects that I worked on involved the development of computational techniques for locating submarines based on the noise they generated. The first was known as SOSUS (Sound Surveillance System) and was then classified as Top Secret. Strings of acoustic detectors were positioned underwater, off Norfolk, Virginia, and other places; because sound travels through water at a known speed, the noise reached each detector at a slightly different time (unless the source was equidistant from all of the detectors), and from those time differences one could determine the direction from which the noise emanated. I wrote a target motion analysis program for that purpose, permitting the estimation of the target’s location, speed, and direction of motion.
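The underlying geometry can be illustrated with a short sketch. This is my reconstruction of the principle for a single pair of detectors, not the classified program’s actual code: for a plane wavefront crossing two detectors a known distance apart, the arrival-time difference fixes the bearing.

```python
import math

SOUND_SPEED_WATER = 1500.0  # m/s; a typical value for seawater

def bearing_from_time_difference(dt, detector_spacing, c=SOUND_SPEED_WATER):
    """Estimate the angle (in degrees) between the detector baseline and an
    incoming plane wavefront, from the arrival-time difference dt (seconds).
    Plane-wave geometry gives cos(theta) = c * dt / spacing."""
    ratio = c * dt / detector_spacing
    if abs(ratio) > 1:
        raise ValueError("time difference too large for this spacing")
    return math.degrees(math.acos(ratio))

# A source equidistant from both detectors (dt = 0) lies broadside: 90 degrees.
print(bearing_from_time_difference(0.0, 1000.0))  # -> 90.0
# A source along the baseline arrives earlier at the near detector,
# giving a bearing near 0 degrees.
print(bearing_from_time_difference(0.5, 1500.0))
```

With several such pairs in a string, the intersecting bearings (and their change over time) are what a target motion analysis program turns into an estimated location, speed, and heading.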

The second project provided me with a two-week sojourn in St. Croix, Virgin Islands, courtesy of ADL and the Navy. There was an underwater acoustic tracking range off the west coast of St. Croix. ADL had a contract to experimentally determine whether a destroyer’s sonar could hear an incoming torpedo. It worked like this:

  • a destroyer would travel south into the tracking range at a given velocity;

  • a torpedo launch boat would fire an unarmed torpedo that was timed to go under (of course!) the destroyer at its time of arrival;

  • at the other end of its trajectory, the torpedo would be recovered by another boat, refueled, and returned to the launch boat;

  • the destroyer would go back to its starting point;

  • and all this would be repeated about ten times, at different destroyer speeds, torpedo depths, and bearings between the destroyer and torpedo, from 90 degrees down to about 20 degrees.

The experiment took weeks to accomplish. Just imagine how long it would take a destroyer to return to its original position north of the range and then get up to the required speed – probably well over half an hour. And then do it over and over! I was involved in running the electronic equipment on the destroyer, the USS Glennon (DD840), which entailed calibrating the equipment and recording the sonar output. The ship was initially berthed at nearby St. Thomas, so I flew over there to start the work. Then I spent an overnight in a bunk on the destroyer as it headed to St. Croix – which I considered my “military deployment.” Every few days I spent 12 to 14 hours below decks with the equipment, making sure we got our data. And of course my colleagues and I enjoyed the banana daiquiris we made with fresh-picked bananas and Cruzan rum. I had just gotten married that March, so my then wife joined me for one of the weeks – essentially, an additional honeymoon.

Other consulting activities in which I was involved were not all confined to Navy projects or to acoustic detection. They included:

  • The number, geographic distribution, and scheduling of cargo ships (for the Navy’s Fast Deployment Logistics program). On this project I worked with George Kimball, who showed me how to deal with measuring distances on a globe, essentially spherical trigonometry. I put this to use later on, in studying cigarette smuggling.

  • The design and mathematical modeling of a resonating hammer (or reciprocating drill), for Black and Decker. This was a fun project, but I have no idea whether it was ever built.

  • The development of training material to instruct control system engineers in the use of various computer packages, for IBM. Our goal here was to show how to deal with time- and state-varying inputs and outputs to control systems.
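The spherical trigonometry Kimball showed me for the cargo-ship work boils down to computing great-circle distances. The haversine formula is one standard way to do it; this sketch is illustrative, not necessarily the method we used.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius; treating the Earth as a sphere

def great_circle_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points given in degrees,
    via the haversine formula (standard spherical trigonometry)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# A quarter of the way around the equator is about 10,000 km.
print(round(great_circle_km(0.0, 0.0, 0.0, 90.0)))  # -> 10008
```

For scheduling a fleet, these point-to-point distances feed directly into transit-time estimates at a given ship speed.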

Police communications. But my longest and most interesting project, and the one that led to my changing fields, was the analysis of, and recommendations for improving, the command-and-control system of the Boston Police Department (BPD), including its communication system. That project came about as a result of the 1964 presidential campaign. Lyndon Johnson, the incumbent president, was accused by Barry Goldwater of being “soft on crime.” Although Johnson won handily, he felt that he needed to respond to Goldwater’s accusation, and he did so in a politically conventional way: he formed a commission, the President’s Commission on Law Enforcement and Administration of Justice. The commission began to fund local criminal justice projects through the newly created Office of Law Enforcement Assistance. The BPD project was one of these.

Since I was an electrical engineer, I was put in charge of planning the upgrade of the BPD’s communication system. But to tell the truth, I knew very little about radio communications hardware, systems or their components: as I noted earlier, that was not something I had ever dealt with, or been interested in. Luckily, the head of BPD’s communication center, Sergeant Leroy Hunter, knew what needed to be done. And I learned an important lesson about consulting for large organizations: the people at the top rarely ask for advice or ideas from those at the working level about how to improve things, so a good consultant talks to those people when making recommendations.

  3. From Police Communications to Policing in General

One part of the police communications system that I found interesting was how complaints (the term used by police departments for citizens’ calls for police service) were handled. Phone calls came in from citizens to a central complaint room, and a time-stamped card was filled out with the information about the complainant, complaint, and location. The card was then put on a conveyor belt that ended at the radio dispatcher’s desk; the dispatcher then assigned a patrol car to the incident. Some of the issues that had to be addressed involved electronics, but for the most part they centered on operations research topics like queueing theory.
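The complaint room and dispatcher form a textbook queueing system. The Erlang C formula, a standard result for M/M/c queues, gives the probability that an incoming call must wait for a free dispatcher; the numbers below are purely illustrative, not figures from the BPD study.

```python
from math import factorial

def erlang_c(servers, offered_load):
    """Erlang C: probability that an arriving call must wait in an M/M/c
    queue. offered_load = arrival rate / per-server service rate, and it
    must be less than the number of servers for the queue to be stable."""
    assert offered_load < servers, "queue is unstable"
    a, c = offered_load, servers
    top = (a ** c / factorial(c)) * (c / (c - a))
    bottom = sum(a ** k / factorial(k) for k in range(c)) + top
    return top / bottom

# Hypothetical example: 2 dispatchers facing 1.5 dispatcher-hours of
# incoming work per hour -> roughly a 64% chance a call has to wait.
print(round(erlang_c(2, 1.5), 3))  # -> 0.643
```

With one server the formula reduces to the familiar M/M/1 result, where the probability of waiting equals the utilization.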

And in fact, this led to my writing some articles on those aspects of police communications, one for a book on law enforcement science and technology, another for a NATO book on communications systems. [I presented the latter article at a NATO conference held in the summer of 1968 on Ile de Bendor, a French Mediterranean island – another nice perk!]

A policing “experiment.” But I also got interested in (non-technical) communications at the street level, seeing how patrol officers dealt with the calls for police service that were assigned to them. So I went on a number of ride-alongs to get a feel for what the police do when on patrol. With respect to police patrol, I was interested in whether we could determine the value of what was called “preventive patrol”: driving through high-crime areas to provide a police presence and possibly deter crime. So I asked the police chief, Ed McNamara, if I could do an experiment, randomly selecting two police districts, doubling the patrol cars in one district for one month and leaving the other district with its normal patrol during that period. There I was, just under thirty years old, wanting to play around with a city’s police department!

McNamara reluctantly gave his approval, but with strong conditions: he selected the two districts himself (Jamaica Plain and Roxbury), with Jamaica Plain as the experimental district. And the experiment was confined to one week, and only during the afternoon shift. So much for a true experiment!3

What was surprising, however, was the result of the experiment. While it didn’t seem to affect the crime rate (both districts’ crime and complaint rates didn’t vary much from the previous year’s statistics), we found a major change in arrests: during that week in 1967 there had been no arrests in either district, and there were none in Roxbury during the experimental week in 1968, but six arrests occurred, all in Jamaica Plain. Wow! What statistical significance (p = (1/4)^6 < .001)!
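The quoted figure is easy to check. One reading of the calculation: if each of the six arrests were independently equally likely to fall in any of the four district-weeks (two districts, two years), the chance that all six land in the experimental district-week is (1/4) raised to the sixth power.

```python
# Each of 6 arrests assumed independently equally likely to fall in any
# of 4 district-weeks (two districts x two years); probability that all
# six land in the one experimental district-week:
p = (1 / 4) ** 6
print(p)          # -> 0.000244140625
print(p < 0.001)  # -> True
```

Of course, as discussed below, the independence assumption baked into this arithmetic turned out to be the weak link.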

With such clear-cut results, I wanted to find out a bit more about how the experiment had worked. A few days later I spoke to two of the patrolmen who had been assigned to the experimental district. Three factors became immediately apparent. First, you can’t just gin up police officers out of thin air; they have to come from another assignment.

Second, police officers are not fungible; they have different characteristics based on their propensities and on their assignments. The additional officers that manned the experiment were all from the Tactical Patrol Force, the “marines” of the department, the ones who don’t handle the usual run of minor complaints or petty thefts but are there to clamp down on major crime. One officer I spoke to said that “it was like shooting fish in a barrel;” they knew that there was a gang of car thieves in the district and used the opportunity to take them all out.

Third, my implicit assumption in the equation (and p-value) in the above paragraph, something that is all too often overlooked, was that the arrests were all independent – which they certainly were not. This made me a lot more hesitant about assumptions that are often made about crime control programs: one needs to look beyond the numbers to see what they really mean (or as I later taught my students, to “smell” the data).

Rethinking the police role. In addition, I reached the conclusion, jokingly, that the police were just report writers with the power of arrest. An exaggeration, of course, but it appeared to me that most of their activity seemed to be documenting the nature of their interaction with citizens.

Rather than look upon that as a joke, we can turn that around to some extent and realize that they are the only 24-hour-a-day representatives of the city on the street. And as such, they should note the deficiencies, not just in the people they meet, but in the city services that should be (but are often not) provided to the areas of most concern.

  • Kids congregating on street corners? Where are the athletic facilities and parks?

  • Trash on, and potholes in, the streets? Call the appropriate city agencies.

  • Truancy? How much is invested in the neighborhood schools?

I don’t mean to imply that police officers should all be renaissance men/women, but that they should be aware that the problems they encounter may not be for them to solve using their law enforcement power. In other words, the police should be tasked not only with dealing with problem persons but also with dealing with problems that are exacerbated by municipal deficiencies. This would make the “thin blue line” a little bit stronger and more substantial.

Hospital communications. My last project at ADL was the one project I directed. ADL was approached by the Maine Hospital Association to help design an emergency medical communication system for its hospitals throughout the state, funded (I believe) by the Federal Highway Administration. This would entail knowing something about radio transmission at different frequencies in hilly terrain and under differing weather conditions – about which I knew next to nothing. I wrote a proposal, which they accepted. Luckily for me, I was able to hire Sgt. Hunter as a consultant, and had two additional ADL staff, both of whom were older and more experienced than I, to help me. In the end, we did a creditable job, but I felt that it was too much, too soon, for a thirty-year-old to be head honcho. So becoming a bureaucrat with a guaranteed paycheck and a straightforward career path, while perhaps boring, seemed attractive.

That is, even though I was fairly successful at consulting and enjoyed the variety of assignments, I was concerned about how I would fare in the future. To really succeed there, I would have to become a “rainmaker,” a person who brings in new clients and projects. I was afraid that I wouldn’t make the grade in that area, or that the constant selling of myself and the company would make me anxious. I was good as a worker, but I had only that one experience selling myself and ADL as a project director, and I felt awkward and nervous about it.

  1. Becoming a Federal Bureaucrat

Subsequent legislation (the Omnibus Crime Control and Safe Streets Act of 1968) upgraded the Office of Law Enforcement Assistance to become the Law Enforcement Assistance Administration (LEAA). Within it a new entity was created, the National Institute of Law Enforcement and Criminal Justice, now known as the National Institute of Justice (NIJ). In the summer of 1969 I was asked by Bob Emrich, the DC monitor of our Boston project, to join NIJ to start planning technical research in criminal justice, since I was one of the few techies who knew something about criminal justice. I asked my boss at ADL for a two-year leave of absence (I felt that one year wouldn’t be enough to do a good job), but was turned down. Moreover, with (by then) three rug rats under the age of three, it would have been the height of folly to uproot the family for just one year. So instead I quit to join the new organization, becoming a GS-15 Operations Analyst at NIJ.

And that was the way that I changed my career trajectory. My metaphor for that change was as if I was a comet heading toward the sun – Washington, DC – from which all things emanated. I headed into DC as an engineer, and when I left DC I headed out along a different path. I didn’t leave engineering (or more accurately, operations research) entirely, but I found a new home in studying crime, criminal justice, law enforcement, and corrections.

In one instance my commute to work in DC merged the personal and professional. A little background: in 1964, in Queens, New York, a woman named Kitty Genovese was murdered on the street at 3 AM. She cried out, and some 38 of her neighbors came to their windows, but none of them called the police.4 To explain this inactivity on the part of her neighbors, in 1969 Bibb Latané and John Darley published an article in American Scientist about their research on the correlates of bystander “apathy,” as they termed it. They strongly suggested that it was not indifference, but each bystander’s presumption that someone else would step up to the plate and call the police – which no one did.

Well, it was an overly warm spring morning when I got on the L-7 bus (we lived on Huntington Street, just west of Connecticut Avenue), found a seat in the rear, and realized that it was sweltering back there. The driver had his window open, so he was not aware of how hot the riders were. A number of them complained to each other, but not to the driver. So I walked up the aisle and asked him to turn the air conditioning on, which he did – and walking back down the aisle I was thanked by several passengers. That made me realize that sitting on one’s hands, waiting for someone else to act, is more common than I had thought, and that I would take it upon myself to stand up when others didn’t.

Expanding research on the police. At that time, most of the researchers in criminal justice were sociologists and psychologists, which gave me free rein to develop a research program focused on other areas. Of course, I was given the more technical areas, like the use of aircraft in police operations, but I was also able to dig into other areas. That is, I didn’t focus on the sociological or psychological characteristics of offenders – my feeling about those studies was that so many seemed to be variations on a theme like this: “Take two parts poor education, one part impulsivity, and three parts poverty and you cook up a delinquent.” In other words, they focused on personal and environmental variables, and assumed that they were linearly related to each other. The presumed linear (or log-linear) relationship was not because they had evidence of it, but because the prevailing methodologies and analytic techniques were based on linearity of one sort or another – and that was what they were taught in graduate school.

I became interested in the what and how of crime, more than the who. I developed a program at NIJ called “Analysis of Criminal Activity,” which sought to look for patterns in robbery and burglary – focusing on the kinds of targets that are selected, when the crimes occur, and the environmental characteristics of the locations where the crimes are committed. I also applied the same approach in the study of organized crime. In short, I had a lot of leeway and some fun in looking at the operational analysis of crime rather than the characteristics of offenders.

My original look at police aircraft operations led me to focus on evaluation. As I wrote in an article in The Police Chief (“Evaluation of Police Air Mobility Programs”), aircraft had become the “post-computer status symbol” among police departments, promoted by helicopter manufacturers looking, post-Vietnam, to increase sales in the civilian sector. My point in the article was that their cost and the inability to deploy them quickly in an emergency made them less than useful: a helicopter could cost as much as three or four fully staffed patrol cars. I went on helicopter ride-alongs in Kansas City and Los Angeles, two of the best police air programs in the country, which didn’t change my mind. [Of course, with drones so inexpensive nowadays, it would be relatively easy to launch one from a police vehicle to follow and record a car instead of getting into a car chase. To accomplish this, of course, it would be necessary to provide the officers with relevant training.]

When I first wrote the article on evaluating aircraft in police work, I sent it up the hierarchy to the LEAA’s public affairs office, which gave its approval to its publication. But after it was published, the manufacturers complained to the agency administrators (this was during the Nixon administration), and I was cautioned not to publish anything else that was anti-business.

Some of the other areas I was involved with included:

  • Program manager, police mobility systems, 1969-1972. As noted, this was primarily focused on the use of aircraft in policing.

  • Program manager, analysis of criminal activity, 1969-1972. I got interested in how burglaries and robberies were committed, and I funded (and closely followed) projects that went into the way burglaries and robberies were committed and handled by the police. I suppose that my operations research training got me interested in exploring the various ways those crimes were committed.

  • Acting program manager, organized crime, 1970. I tried, somewhat successfully, to change the focus from going after mobsters to looking at the business aspects of organized crime. I looked upon OC as an activity dealing in commodities that were in demand but illegal; I focused on the steps those running the businesses (bootlegging in the ’30s; gambling and numbers, drugs, prostitution, etc., later on) took to protect their operations (payoffs to cops, for example) and to deal with disputes (they couldn’t use legal processes, so violence was often the recourse).

  • LEAA representative to the Interagency Committee on Transportation Security, 1971-1972. This committee never really got off the ground.

  • Director, task force on Offender-Based Transaction Statistics, 1971-1972. The object was to look at offender data longitudinally rather than cross-sectionally, to follow the different trajectories of individuals who pass through the criminal justice system. The problem here was the different levels of government that were involved: municipalities (police), counties (prosecutors and courts), and states (prisons). Moreover, the data systems they used were incompatible at that time. And most states still don’t have good ways of tracking offenders through their systems.

  • Director, task force to develop a long-range research program on “Policing the City,” 1972. This was one of my last projects, and I left NIJ before it came to fruition.

Too big for my britches. I also got a little too big for my britches, as happens all too often in government. Here’s the way it happened. I was invited to give a talk at the Airborne Law Enforcement Association about how to evaluate aircraft effectiveness in police work. At the end of my talk, one of the officers there said, “When Walter Key (the LEAA person heading the mobile radio programs) gives us funding for mobile radios, we don’t have to evaluate them.” I responded, “Well, if I were in charge of that program, you would have to.” There I was, a snot-nosed kid telling off these experienced police officers. After that I realized that I should be a little more circumspect and stop mouthing off.

At that time, NIJ as an organization was flailing around. In the three years I was there I had five different supervisors. Much of the staff’s time was spent trying to figure out what was going on and who was moving up or down; gossip was rampant. I decided to ignore the gossip and instead put some effort into establishing ground rules for the grants I was monitoring. I realized that very little was known about how to evaluate programs aimed at crime, and started to write what became a government publication, Evaluation of Crime Control Programs. It was lucky that I had written this and my air mobility publication, because they later helped bail me out of a sticky situation.

  1. Becoming a Former Federal Bureaucrat

As I noted earlier, I joined the Justice Department in the fall of 1969, a few months after the (Republican) Nixon administration replaced the (Democratic) Johnson administration. The new administration changed the top people; for example, the new head of DOJ’s Criminal Division was Will Wilson, a former Texas Attorney General. One of the people Wilson replaced was Herb Edelhertz, who had been head of the Fraud Section of DOJ’s Criminal Division. He came to NIJ and wrote an excellent monograph published by NIJ, The Nature, Impact, and Prosecution of White-Collar Crime. And the three LEAA administrators, a bipartisan group by statute, comprised two Republicans and one Democrat, Charles Rogovin.

First indication. While I was acting program manager of organized crime research, I received an unsolicited proposal, via Wilson, from a faculty member of the University of Houston’s Bates College of Law, from (I believe) a professor named White. He proposed to develop model statutes for the prosecution of organized crime. But to determine the nature of the criminal activity on which to base the statutes, he would read Mario Puzo’s (fictional) books on organized crime – The Godfather and its sequels. This seemed to me to be a laughable idea; I felt that this might just have been a way for Wilson to promote/fund his friend. But I also realized that I had to be careful. So I asked two colleagues who had been involved in organized crime at the local level, Marty Danziger (New York) and George Higgins (Boston), to read the proposal and send me comments. They panned the proposal, so I was able to use their critiques to reject it. But to be courteous I called Wilson to let him know of the decision. His angry reply was, “What’s your name? How do you spell it?”, so I knew I was in for it. A colleague at NIJ, who was more knowledgeable in matters bureaucratic, told me to write up the conversation in a memo to myself, to protect myself should the need arise. Fortunately, at that point it didn’t.

Second indication. At that time, the House of Representatives was controlled by Democrats, and we occasionally had staff members from the Legal and Monetary Affairs Subcommittee of the House Committee on Government Operations, which oversaw our budget, come in and ask us about our work. In particular, Charlie Intriago, the administrative assistant to Rep. Dante Fascell, came to NIJ and asked me questions. I was happy to answer them, and also did so when he followed up with phone calls.

Then one day, in the spring or summer of 1972, I was called into the office of the newly appointed NIJ director, Marty Danziger. He asked me if I had been talking to Intriago, and I answered “Yes” – I had no idea how they found out, or that they even cared. Then he slammed his hand down on the desk and reamed me out for doing so; he told me in no uncertain terms that all calls had to be vetted by the Public Information Office. He implied that I was disloyal, that I was essentially consorting with the enemy (a Democratic aide).

I was pretty shaken up by that encounter and wondered if my time at NIJ was limited; I had been there for less than three years and was therefore still subject to being fired for no cause. So I was looking to get out before my third anniversary date that fall, when a decision had to be made about my tenure there. My immediate boss, John Gardiner, commiserated with me and suggested that I consider applying for a position that opened up in Chicago, at a fairly new university, the University of Illinois at Chicago Circle (UICC), in a newly formed department, the Department of Criminal Justice.

I decided to apply and prepared a talk, but since it was over the summer, only three people were there: Jim Osterburg (department head), Joe Nicol, and Steve Schiller. It went well: I was soon made an offer, accepted it, and we moved to Evanston in the fall of 1972. The position was as a visiting associate professor, that is, without tenure. I was also given a courtesy appointment in Systems Engineering, which later morphed into an appointment in the Department of Information and Decision Sciences. [An aside: when I went from ADL to DOJ, I took a salary cut, but standard Federal increases made up the difference and a bit more. And when I went from DOJ to UICC (now called UIC – Circle was dropped from its name) I also took a salary cut, with the same result. Even now, Illinois state pensions have constitutionally mandated three percent annual increases. Salary-wise I’ve been fortunate in my career changes.]

  1. Making My Way in the Social Sciences

Having never taught (or even taken!) a social science course, here I was, having to put together courses in a social science curriculum. Two of the courses I was to teach were on research methods: first, how to do research; and second, how to analyze research data. But I had no idea as to what textbooks to use for the courses, so I relied on the books that others had used. After a few years I realized that those books were pretty limited, so I began to look for better ones. In particular, I was puzzled by the fact that the focus of statistical analysis was on getting a low p-value, that “statistical significance” was the be-all and end-all for assessing relationships among variables.

Moreover, it was considered cheating if you looked at the data (called data dredging or data snooping) to see if you could find interesting relationships, instead of putting the data, unseen, through the meatgrinders of what I termed the Four Horsemen of the Statisticalypse (SAS, SPSS, Stata, and SYSTAT) to obtain “significant” findings. For example, I was asked by NIJ to review a paper on recidivism. The authors found that every finding was “significant” because the number of cases was so large. So they decided to take a ten-percent sample of the data, just so that some of the findings did not rise to the level of statistical (.05) significance. Imagine, they threw out 90 percent of the data instead of using it to do a more in-depth study! So I finally found a good statistics textbook that didn’t rely on p-values alone to determine relationships.5

You can’t average everything. Moreover, all too many of the meatgrinders were based on averages, with additional variables thrown in to “control” for some factors. But how can you base your assessments on averages when the groups being averaged over are very heterogeneous? If they’re very heterogeneous, on what basis can they be compared? As an example, how would you compare two different meals? Suppose one consisted of onion soup, filet mignon, Caesar salad, a fine Bordeaux wine, peach Melba, and a glass of port – and the other consisted of guacamole and tortilla chips, cheese quesadilla, carne asada, Dos Equis beer, nopal salad, tres leches cake, and a shot of Don Julio tequila. The meatgrinder approach is to put each meal in a blender and compare them that way – after all, they both go into the stomach, don’t they?

Specifically, evaluating two such meals in this way, one could compare them in terms of their carbohydrate and alcohol levels, but not in terms of the way we would ordinarily evaluate them – how they taste and how satisfying they seem. In other words, averaging over a heterogeneous sample or population restricts the ways that they can be compared – and don’t tell me that you can “control” for the various flavors! Francis Galton had it right when he criticized those who focused on averages as being "as dull to the charm of variety as that of the native of one of our flat English counties, whose retrospect of Switzerland was that, if its mountains could be thrown into its lakes, two nuisances would be got rid of at once."

Getting published. As for writing articles, I updated and condensed my evaluation monograph and wrote an article for Operations Research entitled “Measures of Effectiveness for Crime Reduction Programs.” And I was asked by John Gardiner, who also had left NIJ for UIC, to write a chapter on organized crime research (which I had dealt with at NIJ) for a book he was coediting. Since my focus in organized crime research had been on how illegal markets operate and on policies and operations rather than on people, I expanded it to include white-collar crime.

One of the illegal markets I became interested in was cigarette smuggling, because Herb Edelhertz, who by then had left NIJ to run the Battelle Seattle office, asked me to work on it with him. I made the assumption that there would be two kinds of cigarette smuggling: one between bordering states with tax differences (which I termed “casual smuggling”) and the other from very low-tax states to high-tax states (which I called “organized smuggling”). And I found good evidence for these two, based on state tax rates and sales data, and on the distance between states (for which I used the distance calculations I had learned from George Kimball). That is, sales in a low-tax state like North Carolina were far above what one would expect, and those in Illinois were far below expectations, suggesting organized (truckload-sized) smuggling. And Indiana’s were slightly above expectations, suggesting casual smuggling between it and Illinois. This study, published in an Operations Research article, produced an accolade from one of the top statisticians in the US, W. Edwards Deming, of which I am particularly proud. It was also published in a monograph, Law Enforcement Guide: Combatting Cigarette Smuggling, cowritten with Edelhertz and Harvey Chamberlain.

Deming letter
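The logic of that smuggling analysis – compare each state's actual per-capita sales to what its tax rate alone would predict, then read the residuals – can be sketched in a few lines of Python. The numbers below are invented for illustration; they are not the actual tax or sales figures from the study:

```python
# Toy sketch (invented numbers): flag possible cigarette smuggling by
# comparing each state's per-capita sales to what its tax rate predicts.
# Sales far above expectation suggest a smuggling *source*; sales far
# below expectation suggest a smuggling *destination*.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Hypothetical data: (state, tax in cents/pack, packs sold per capita)
states = [
    ("NC", 2, 200),   # low tax, sales far above the trend
    ("IN", 6, 150),
    ("OH", 15, 108),
    ("NY", 23, 75),
    ("IL", 12, 85),   # sales far below the trend
]

a, b = fit_line([t for _, t, _ in states], [q for _, _, q in states])

residuals = {name: round(q - (a + b * t), 1) for name, t, q in states}
source = max(residuals, key=residuals.get)       # biggest positive residual
destination = min(residuals, key=residuals.get)  # biggest negative residual
print(source, destination)
```

With these made-up numbers the largest positive residual flags North Carolina as a likely source of organized (truckload-sized) smuggling and the largest negative residual flags Illinois as a destination, mirroring the pattern the article describes; a real analysis would also bring in distances between states to separate casual from organized smuggling.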

On being a smartass. I had a few issues with my department head, Jim Osterburg. At one point he told me that he had been asked to review a book manuscript on organized crime, and asked if I was interested in doing it instead. He said that if I gave him the review, I could select two or three of the publisher’s books as an inducement. I said that I’d be happy to, but that I would correspond directly with the publisher. Which I did – and found out that part of the inducement was a check for $50 or $100, which Jim had neglected to tell me about.

I had another run-in with Jim sometime afterwards. We were planning to mount a PhD program and began recruiting. Jim had a good friend and excellent scholar who was teaching at Rosary College, and Jim was promoting him for one of the positions. But he was essentially a solitary (almost monastic) scholar; I didn’t see him as someone who could help in terms of dealing with PhD students and the research issues that they would encounter. I wrote a memo to the faculty, with a copy to the dean of Liberal Arts and Sciences, called something like “Attributes to Consider in Hiring Faculty for a PhD Program.” Jim was furious, but we didn’t hire him. And, as you might have guessed, by then I was rolling in terms of getting publications, and had gotten tenure. So I could afford to be a smartass. [One of the persons who reviewed my application for tenure, I later found out, was Al Reiss, a well-known Yale sociologist, who liked the articles I had produced but wished that I had published more of them in criminology journals.]

Additional topics. Around that time DOJ began to look into the way criminal records were being used to deny ex-offenders credit, jobs, or housing. None of these uses was necessarily appropriate, and very often they were made possible by police officers with access to the records, who (often illegally) checked them for friends (and for money). So Congress and DOJ began to solicit ideas as to how best to curtail such practices. First I wrote a letter to the editor of Harper’s Magazine (January 1974) about the topic. Then I wrote a response to their solicitation, and later turned it into a chapter, “Privacy, Criminal Records, and Information Systems,” for a book edited by Sidney Brounstein and Murray Kamrass, Operations Research in Law Enforcement, Justice, and Societal Security. My contention was that information was a valuable commodity, but rather than ban its use altogether (which would not stop it, but would increase the market price for such data), the information could be made appropriate to the need: an embezzlement record should prevent a person from getting a job in a bank, but not foreclose a job as a truck driver; a reckless driving record should preclude the driver position but not the bank position. In other words, the questions one could ask about a person’s criminal record should be tailored to the specifics of the situation.

A data maven, I also wanted to understand the origin and development of the Uniform Crime Reporting System (UCR), the nation’s primary way of looking at crime. After all, I was teaching about crime and wanted to be clear about how it was measured. So I got into the details of when and how the UCR was developed; why it focused on seven crimes (murder, forcible rape, robbery, aggravated assault, burglary, larceny over $50, and auto theft) to form a Crime Index, and how it changed over the years since its inception in 1930. In fact, a 1977 paper I wrote describing its origins (“Crime Statistics: A Historical Perspective”) is my most downloaded article.

Recidivism research. At around that time we hired Dick McCleary, who was just finishing his PhD in Sociology from Northwestern University, under Donald Campbell. He had been doing research on parole and had data on the date of release and (in some cases) rearrest of parolees. We looked at it together, and then I decided to determine the time difference between date of release and date of rearrest, or, for those not rearrested, the date the data was compiled. We then plotted it, a simple two-dimensional plot showing the number of those rearrested vs. the length of time from release. At that point we had an “Aha!” moment, since the number of those rearrested followed a negative exponential distribution, but the curve clearly showed (Figure 3) that a significant number of them would never be rearrested. That is, it portrayed an incomplete exponential distribution.

Recidivism data compared to two recidivism models

But we didn’t know how to estimate the value of the exponent (how fast the failures rose and tapered off) or the value of the asymptote (the level to which they rose). To answer that question I called Steve Pollock, who by then had started teaching at the University of Michigan. He said, “use a likelihood function” – and then we had to figure out what a likelihood function was and how to apply it to this problem. Steve suggested that we look at Jan Kmenta’s book on econometrics, and that’s how we learned about likelihood functions. Our initial paper was entitled “The Mathematics of Behavioral Change: Recidivism and Construct Validity.” And with that result, we started a strand of research that lasted quite a while for us.

That paper changed the way criminologists measured recidivism. Prior to our paper, recidivism was measured at one point in time, the “one-year recidivism rate.” That is, people based their assessment of, say, a particular correctional program on the percentage of people going through the program who had recidivated within 12 months of release. It did not matter whether 60 percent of them failed in the first month and there were no more failures in the 11 succeeding months, or five percent failed in each of the 12 months – just that one point was used to determine it.

And we thought that we were the first to discover the method, but soon found that it was first developed by John Boag in his 1949 analysis of patients cured by cancer therapy. Sic transit gloria mundi.
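For readers curious about the mechanics, here is a minimal sketch (not our original code, and with invented parameters) of fitting the incomplete, or split-population, exponential model by maximum likelihood: a fraction p of releasees will eventually fail, their failure times are exponential with rate lam, and anyone still “surviving” at the end of the follow-up window is treated as censored:

```python
# A sketch (invented parameters) of the split-population ("incomplete
# exponential") recidivism model: a fraction p of releasees will
# eventually fail; their failure times are exponential with rate lam;
# the rest never fail.  Anyone not failing by the end of follow-up
# is censored.
import math
import random

random.seed(1)

T = 60.0                       # follow-up window, in months
P_TRUE, LAM_TRUE = 0.6, 0.1    # "true" values used to simulate data

# Simulate 800 releasees.
times, failed = [], []
for _ in range(800):
    t = random.expovariate(LAM_TRUE) if random.random() < P_TRUE else float("inf")
    if t < T:
        times.append(t); failed.append(True)
    else:
        times.append(T); failed.append(False)

def log_lik(p, lam):
    """A failure at time t contributes p*lam*exp(-lam*t);
    a case censored at T contributes 1 - p*(1 - exp(-lam*T))."""
    ll = 0.0
    for t, f in zip(times, failed):
        if f:
            ll += math.log(p * lam) - lam * t
        else:
            ll += math.log(1.0 - p * (1.0 - math.exp(-lam * T)))
    return ll

# Crude grid search standing in for a proper optimizer.
p_hat, lam_hat = max(((p / 100.0, lam / 1000.0)
                      for p in range(5, 96, 2)          # p: 0.05 .. 0.95
                      for lam in range(10, 301, 5)),    # lam: 0.010 .. 0.300
                     key=lambda pl: log_lik(*pl))
print(round(p_hat, 2), round(lam_hat, 3))
```

With real data one would replace the simulated times with observed release-to-rearrest intervals and use a proper numerical optimizer instead of the grid; the estimated asymptote 1 − p is the fraction who would never be rearrested.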

Based on that method, Dick and I applied for and received a grant from NIJ to do a more in-depth study of recidivism. But within a year Dick left to go to Arizona State University, so I carried on with the grant. We wrote a few additional articles on the topic, answering criticisms; and I ended up writing the grant’s final report, which I soon turned into my first book, Recidivism.

Recidivism rethought. Now with a few more years under my belt, and a little more perspective, I realize that we should not only follow the lead of biostatisticians in the methodology to use, but also in the perspective we bring to evaluating correctional programs. Specifically, we should look at survival rather than failure as the more important consideration.

This is an important point that is often overlooked: the words we use have an impact, and these two words promote different views of the context in which a failure may occur. When we say a person recidivates, we frame the situation as a deliberate action on the part of the offender. That is, we ascribe to the offender a willfulness to do bad. On the other hand, when we talk about correctional failure from the standpoint of survival analysis, we frame the situation as an action on the part of the offender due to one or more of the following:

  • the inadequacy of the programs that were used to stave off additional criminality,

  • his/her inability to withstand the pressures preventing rehabilitation, or

  • the individual’s propensity to commit crimes regardless of any support given the offender.

I realize, of course, that the individual actors have a major part to play in their failure to stay on the straight and narrow, but it is not just their actions that are in play: staying on the straight and narrow (a good metaphor in this case) can’t be accomplished by people who don’t have good balance – and who weren’t given the tools (or encouragement) to improve their balance and help keep them from straying off the path.

Other correctional programs do not ascribe personal willfulness to failure. As Alan Marlatt noted in his book Relapse Prevention, we realize that it may take a few tries before a person quits smoking, or drinking, or drugs. In these situations we say a person has thus far survived, and that s/he has not yet relapsed.

Obviously, there are people out there who are truly beyond correction and who, regardless of the circumstances, will continue to reoffend. But they are far from the norm, and assuming otherwise tars every other person released from incarceration with the same brush.

Additional research areas. As I noted, with my engineering/OR background I had a different viewpoint from most other researchers in criminology. My approach was to look at data patterns to get some idea as to what the data could tell us – sometimes about the phenomenon under study, sometimes about the differences in the way agencies collected the data – rather than about the personal or social characteristics of the individuals represented in the data.

This also led to my being asked to look at organized crime from the same standpoint. Charlie Rogovin, a former LEAA administrator, was then teaching at the Temple University law school. He had initiated a program of research into organized crime while there, and asked me to look into how such programs might be evaluated. The result was a monograph, one that was subsequently published by UIC’s Office of International Criminal Justice, Measuring the Effectiveness of Organized Crime Control Efforts.

My interest in organized and white-collar crime was given another boost in 1977 when Steve Pollock was asked to look into the possibly collusive bidding behavior in Genesee County, Michigan, on standard off-the-shelf road material, corrugated culvert pipe. The National District Attorneys Association funded the project, probably with LEAA money. Steve brought me into the project, and we collected bidding data over a number of years. [BTW, it was not easy to put all the data together at that time. Bids were all on paper, requiring a great deal of manual data entry; spreadsheets were limited in what they could handle; and graphing programs and equipment were far from easy to deal with.]

When we plotted the data (Figure 4), it was obvious that there was turn-taking among the bidders: most of the bids were at the same high level, while just a few were lower, and the lower bids got the contracts. However, in one of the years we plotted, the data seemed less collusive. We soon found out that this was known as the “year of the price war.” Before and after that year the collusive patterns were obvious. Although we presented the data to the Genesee County prosecutor, we never were able to testify about our finding: it turned out that the prosecutor was indicted for misuse of Federal funds!

1972 and collusive bidding, and 1973 and the year of the price war

Around that time a report on Illinois’ Unified Delinquency Intervention Services (UDIS), prepared by Charles Murray and Louis Cox, was published. It showed that those selected for the intervention had been increasingly (almost exponentially!) delinquent just prior to receiving the UDIS treatment, and that their post-treatment delinquency rates dropped back down to the level where they had been before the rise began (see Figure 5). To me the cause of this precipitous rise and subsequent (to treatment) fall was a variant of “regression to the mean,” which we called “selection of the extreme.” Steve Pollock, Dick McCleary, Andy Gordon, David McDowall, and I showed that if kids are randomly delinquent, say once every week, but are only randomly (say, ten percent of the time) detected by officials as being delinquent, a decision rule like “Select those kids who have four detected incidents in the past six months” can make it look as though their activity was growing, when it was just the luck of the draw that their activity in the past few months was more frequently detected.

UDIS data overlaid with our model

What this model shows is that a group of offenders with an average arrest rate of about .88 arrests per month (10.6 per year), but alternating between 1.69 arrests per month (20 per year) and .08 arrests per month (1 per year), can appear to have arrest rates growing uncontrollably when viewed retrospectively. But this exponential buildup is merely due to the fact that most of them were in their active state when intervention occurred. The apparent suppression effect is obtained by comparing this artifactual buildup in arrest rate to the average rate that would characterize arrest behavior after release. We got a few publications out of this.
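The “selection of the extreme” artifact can be demonstrated with a short simulation along the lines of the weekly-offending example above (the parameters are illustrative): every kid offends once a week, each offense is detected with probability 0.1, and officials select the kids with at least four detected incidents in the past six months. The selected kids’ detected rate then falls with no intervention at all:

```python
# A short simulation of "selection of the extreme" (illustrative
# parameters): every kid offends once a week, each offense is detected
# with probability 0.1, and officials select kids with >= 4 detected
# incidents in the past six months (26 weeks).  With no intervention,
# the selected kids' detected rate falls back toward the overall mean.
import random

random.seed(2)

WEEKS = 26       # six months of weekly offending
P_DETECT = 0.1   # chance any single offense is detected
N_KIDS = 10000

def detected():
    """Detected offenses in one six-month period: Binomial(26, 0.1)."""
    return sum(random.random() < P_DETECT for _ in range(WEEKS))

before, after = [], []
for _ in range(N_KIDS):
    b = detected()
    if b >= 4:                     # the selection rule
        before.append(b)
        after.append(detected())   # same kid, unchanged behavior

mean_before = sum(before) / len(before)
mean_after = sum(after) / len(after)
# mean_before sits well above the unconditional mean of 2.6 detected
# incidents; mean_after regresses back to about 2.6 -- an apparent
# "treatment effect" produced by selection alone.
print(round(mean_before, 2), round(mean_after, 2))
```

Every kid's underlying behavior is identical before and after selection, yet the selected group's detected incidents drop substantially, which is exactly the kind of artifactual "suppression effect" we argued could explain the UDIS results.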

Charles Murray later gained more notice when he and Richard Herrnstein published The Bell Curve, which purported to prove that blacks had innately lower intelligence, another statistical miscalculation.

In terms of the doing of research, I was concerned that surveys were considered the primary data sources in the field. I later wrote (in Recidivism, 1984):

“When I was an undergraduate in engineering school there was a saying: An engineer measures it with a micrometer, marks it with a piece of chalk, and cuts it with an axe. This expression described the imbalance in precision one sometimes sees in engineering projects. A similar phenomenon holds true for social scientists, although the imbalance is in the opposite direction. It sometimes seems that a social scientist measures it with a series of ambiguous questions, marks it with a bunch of inconsistent coders, and cuts it to within three decimal places. Some balance in precision is needed, from the initial measurement process to the final preparation of results.”

Crime mapping. Sometime later a colleague in engineering, Bill O’Neill, asked me why, if mapping could be used by agricultural researchers to find hot spots that required attention, the same procedure couldn’t be used in crime mapping to find hot spots that required police attention. I replied that we already knew when and where the hot spots were: inner city neighborhoods on weekend evenings. But it suggested to me that it might be useful to map those neighborhoods more closely to see if incidents were related to each other or to characteristics of the blocks where they occurred.

I was at that time (1986) on the Chicago PD’s research advisory committee, run by Dennis Nowicki, head of administrative services, so I asked him if we could begin to do some crime mapping. He said that it was already being done by Andy Gordon, professor of sociology at Northwestern University, and Warren Friedman, head of the Chicago Alliance for Neighborhood Safety; he suggested that I join up with them. Which I did.

They were using the first-generation Macintosh computer to draw a map of Chicago on the screen and put different icons, representing different crime types, on the screen. The commander of CPD’s 25th District, Matt Casey, was willing to let us play around with the data his officers generated. They filled out forms in duplicate; one was sent to headquarters to be entered into a database, and the other was kept at the district.

I thought that the effort would be worth an in-depth analysis so, with the approval and encouragement of Dennis and his superiors, Fred Rice and John Jemilo, CPD Superintendent and Deputy Superintendent respectively, we applied for a federal grant to study its use in police work. We truly didn’t know how we should deal with the data, but threw out some possibilities (crime pattern analysis, dynamic mapping, adding contextual information, etc.) to make it sound like we knew what we were doing. The grant application was submitted by the CPD, with me as project director. And lo and behold, we were funded (Mapping Crime in Its Community Setting, 86-IJ-CX-0074)! Then we had to figure out what we really should do.

We went over to the CPD and met with Jemilo and Casey to figure out our next steps. Matt invited us to the district and called in a member of his tac team, Marc Buslik, who was in the office writing up an arrest report. He said, “Marc, you know something about computers, don’t you?” Marc, now a retired District Commander himself, replied that he did, so he was assigned to work with us. For a start, Marc suggested that we take the activity reports from the previous day and enter the locations into the Macintosh, with a different icon representing each crime type and the time of occurrence attached to the icon. Three things came out of this: First, we had a one-day turnaround time for computerized data, while getting such data from headquarters would take weeks. Second, Matt was able to deploy his officers based on much more current data than if he had waited for the headquarters printout. Third, prior to going out on patrol, officers could take a look at the previous day’s (and week’s) activity to see what they should concentrate on.

Our book, Mapping Crime in Its Community Setting, detailed what we found. Marc wrote a paper describing how a neighborhood’s crime maps could be provided to its residents. He and I later wrote a chapter, “Power to the People: Crime Mapping and Information Sharing in the Chicago Police Department,” published in 1998.

Soon after the end of our project, NIJ launched a new program to deal with drugs, using mapping techniques. It was called DMAP, for Drug Market Analysis Program, which nicely underscored the utility of mapping in understanding the nature of drug markets. Five cities received grants under the program, and Craig Uchida, its NIJ monitor, formed an advisory committee to oversee them. The committee consisted of Al Reiss, Steve Mastrofski, and me. There were projects in Hartford, Jersey City, Pittsburgh, Kansas City, and San Diego. The most successful one was in Jersey City, whose evaluation was directed by David Weisburd and Lorraine (Green) Mazerolle; the JCPD’s then research director, Frank Gajewski was also involved.

Since then, of course, virtually every city police department maps its crimes, generally modeled on the NYPD’s CompStat, which in turn was based on our mapping project. Unfortunately, however, they don’t use it the way it was supposed to be used – for all district commanders and analysts to look at patterns of activity and share ideas as to how to deal with them. Instead, it appears to be used most often as a bludgeon for the top brass to compare each district’s crime statistics with the others and to ream out those district commanders whose numbers are creeping up, ignoring the district-to-district differences in income, ethnicity, and other contextual factors that generate differences in criminal activity.

Crime mapping seemed to me, a person who likes to visualize patterns in data, a much more natural way of finding relationships, either causal or correlational. But the reigning method of the time in criminology and in virtually all of the social sciences was to crunch a data set and look for low p-values, which translated, the authors asserted, into “significant” relationships among the variables. Curious as to why and how this method/approach gained popularity, I again looked for its historical antecedents. What I found convinced me that it was often a mistaken approach to data analysis, and in 1994 I wrote an article, one of my favorites (“Deviating from the Mean: The Declining Significance of Significance”), criticizing this approach. And from that point on, although I still taught students about significance tests (after all, they had to understand papers that used it), I explained its deficiencies as well.

Additional appointments. In 1995 my wife, Marcia Farr (also at UIC, in English and Linguistics), received a full-year Fulbright to do research in sociolinguistics at el Colegio de Michoacán in Zamora, Michoacán. So I applied for a Fulbright there as well, but only received a half-year appointment. [We joke that she got a Fulbright and I got a Halfbright.] I attempted to teach the same principles about looking at data rather than stuffing it into a computer program, and was partially successful. The main result of my stay there, however, was my own increased fluency in Spanish.

In 1997 I was recruited by Jamie Fox at Northeastern to teach and to become editor of the Journal of Quantitative Criminology, which Jamie had started. The then editor, John Laub, was leaving to join the University of Maryland, and Jamie wanted to keep the journal at Northeastern, using it as an inducement for me to leave UIC. The position came with a salary increase and an additional stipend for being the editor, and I accepted. Jamie tried to find a position for my wife Marcia at NEU or another nearby college, but was unable to. So instead of moving to Boston permanently, I turned it into a visiting professorship and returned to UIC – with the salary increase, the stipend, and the journal editorship!

It was both hard work and fun, editing the journal. I tried to get the journal to focus less on statistical analyses and more on visual representations of data, with limited success. Now, of course, it’s a lot easier to do so, since word processing software, online journals, and even printed journals are set up to print graphical output.

Other stuff. In 2000 I was asked to consult on the statistical analysis submitted by the defendant, the Village of Mt. Prospect, in a lawsuit. The village was accused, by some of its officers, of targeting Latino drivers for DUI arrests to fill their arrest quotas. My job was to look at the statistical evidence prepared by another statistician (the suburb’s expert witness) and evaluate its merits. I was able to show that the analysis had no merit (the data set was hopelessly corrupted), and the case was settled before I had a chance to testify.

What struck me after the settlement, however, was the geography and timing of the arrests. Most of them occurred on weekend nights on the road between the bars where most of the Latinos went to drink and the areas where they lived. None were located on the roads near the Elks or Lions clubs, where the “good people” bent their elbows.

I blame myself for not seeing this immediately, but it helped to underscore the necessity of going beyond the given data and looking for other clues and cues that motivate those actions that are officially recorded. While it may not be as necessary in some fields of study, in criminology it certainly is.

Other publications. Steve Pollock and others created a series of handbooks on operations research and management science in different fields. He was tasked with the job of putting together a handbook on the use of operations research and management science in the public sector, and I worked with him on it. We cowrote the first chapter, “Operations Research in the Public Sector: An Introduction and a Brief History,” and I authored a chapter entitled “Operations Research in Studying Crime and Justice: Its History and Accomplishments.” As usual, I started with a historical view, from Quetelet to Poisson to Shaw and McKay, and brought it up to the present.

  1. Involvement with the Bureau of Justice Statistics

I had interacted with Jan and Marcia Chaiken in the late 1970s, when they were on the RAND Corporation’s staff and working on criminal justice programs. We stayed in touch and worked together on additional projects. So when Jan became director of the Bureau of Justice Statistics (BJS) in 1993, he asked me to get involved with the organization. Rather than leave UIC and return to DC, we arranged for me to become a Visiting Fellow, and I held that position from 1995 to 2000. During most of that time I would spend about half a week in DC every other week, staying in a hotel around the corner from BJS. I worked on various projects while there, focusing for the most part on crime statistics, both survey data (the National Crime Victimization Survey, NCVS) and police-recorded data (the UCR). During that time I also served as Editor of the Journal of Quantitative Criminology.

Working with crime data. Although at BJS I did some work on the NCVS, my attention was focused primarily on UCR data. The main reason for this was that in 1994 Congress, in reauthorizing the 1968 Omnibus Crime Control and Safe Streets Act, appropriated additional funding to be distributed to police departments based on the number of violent crimes they had experienced over the previous three years, as recorded in their UCR statistics. Jan asked me to figure out how to do it.

This meant that I had to understand the ins and outs of how agencies reported their data to the FBI and what the FBI then did with it. I got in touch with FBI statisticians in DC, primarily Yoshio Akiyama, Jim Nolan, and Vickie Major; I also traveled to the FBI’s Charleston, West Virginia, facility, where the data was processed, to meet with Ken Candell and others to determine how they dealt with it once they received it. I then discovered just how problematic the data might be, depending on how meticulous the agency was.

For example, there were occasional unexpected jumps (either up or down) in crime counts, some of which coincided with new police chiefs and/or new policies. And there were occasional gaps in the data. This happened sometimes when a new person in charge of sending the data to the FBI had not yet learned the ropes, sometimes because of computer problems, sometimes for no apparent reason – and sometimes the agency stopped sending data for over a year. To compensate for small gaps (fewer than ten missing months), the FBI assumed that the monthly count would be the same for the missing months as for the reported months. So the estimated annual count for agencies reporting three or more months was just the actual count for that year times 12/N, where N was the number of reported months.

But if an agency reported two or fewer months for a year, the FBI treated it as it would a non-reporter, and estimated its crime counts by using the crime rate of another, presumed “similar” (i.e., having close to the same population) agency. That is, if Evanston, IL were non-reporting for a year, a similar agency (say, Skokie, IL) would be selected, and Evanston’s crime count would be estimated by multiplying Skokie’s count by the ratio of Evanston’s population to Skokie’s – regardless of the known differences in age, ethnicity, and income distributions.
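Taken together, the two imputation rules can be summarized in a few lines of code. This is my own paraphrase of the procedure described above, not the FBI's actual implementation; the function name, signature, and example figures are all illustrative.

```python
def impute_annual_count(reported_monthly_counts, own_population=None,
                        donor_annual_count=None, donor_population=None):
    """Sketch of the FBI's two UCR imputation rules (illustrative only).

    - Three or more reported months: scale the partial-year total by 12/N.
    - Two or fewer reported months: treat the agency as a non-reporter and
      borrow a "similar" agency's count, scaled by the population ratio.
    """
    n = len(reported_monthly_counts)
    if n >= 3:
        return sum(reported_monthly_counts) * 12 / n
    # Non-reporter: use the donor agency's count scaled by population.
    return donor_annual_count * own_population / donor_population

# An agency reporting 5 crimes a month for 8 months is credited with
# 40 * 12/8 = 60 crimes for the year.
print(impute_annual_count([5] * 8))  # 60.0

# A non-reporter with population 75,000 borrows from a donor agency with
# 600 crimes and population 60,000: 600 * 75000/60000 = 750.
print(impute_annual_count([], own_population=75_000,
                          donor_annual_count=600, donor_population=60_000))  # 750.0
```

The second branch makes visible why the rule is troubling: the imputed count depends entirely on the donor's crime rate and a raw population ratio, with no adjustment for the demographic differences mentioned above.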

So getting good figures on violent crime, required by the legislation, meant that we had to decide whether to use FBI-imputed figures for the missing data. We decided to not use the imputed data, but to count only those violent crimes that the agencies reported in their monthly submissions. In fact, we hoped that by doing so, agencies with missing data would be encouraged to improve their reporting practices.

One of the ways I tried to understand the reporting practices of police agencies was to put together a working group looking at imputation practices, for both the UCR and its companion report, the Supplementary Homicide Report (SHR), which describes individual homicides in much greater detail than does the UCR homicide report. We held a two-day meeting in Washington in 1996, bringing in police practitioners, people from the FBI and state agencies, and criminologists. This led to a BJS report, Bridging Gaps in Police Crime Data, and to my abiding interest in improving crime data.

I also began to get interested in looking at criminality from a different standpoint. Rather than just looking at types of offenses and offenders, I began to look at how events occurred over time and in neighborhoods. I put this together in a chapter, “Criminality in Space and Time: Life Course Analysis and the Micro-Ecology of Crime,” which was published in 1996.

Additional areas. In 1998 John Lott wrote the book More Guns, Less Crime, an elaboration of an article that he and David Mustard had published a year earlier (“Crime, Deterrence, and Right-to-Carry Concealed Handguns”). In it he used county-level data obtained from the Inter-university Consortium for Political and Social Research (ICPSR) to “prove” that counties with right-to-carry (RTC) laws had lower homicide and violent crime rates. I hadn’t realized that ICPSR published county-level UCR crime rates, or how bad its imputation of missing data was, until I encountered Lott’s analyses.

Joe Targonski, then a graduate student of mine, and I explained (in “A Note on the Use of County-Level UCR Data”) that the fact that rural jurisdictions have poorer reporting practices can bias the results in such a way that the results cannot be taken as valid. Lott and John Whitley countered, in the article “Measurement Error in County-Level UCR Data,” which we then re-countered with the article “Measurement and Other Errors in County-Level UCR Data: A Reply to Lott and Whitley.” The end result: don’t trust Lott’s analyses.

And another graduate student, Jacqueline Mullany, and I explored the use of life course trajectories to understand how and when people get involved in crime (“Visualizing Lives: New Pathways for Analyzing Life Course Trajectories”). This line of research ran counter to the spreadsheet-like analysis of crime, since we looked into when in a person’s life certain things happened and how they might have affected subsequent behavior. Most research at that time dealt with how many things happened rather than with their trajectory over time.

As you might expect, with all the publishing we academics do, the royalties certainly do mount up. Here’s a typical example:

  1. Retirement and Beyond

I retired from UIC in 2002, as did my wife Marcia. However, she was offered, and accepted, a position at the Ohio State University. So for the next ten years we split our time between Ohio and Illinois. I became affiliated with OSU’s Criminal Justice Research Center, an affiliation I still have. While there I worked on three studies: imputation of the UCR (2006), domestic violence in Ohio (2007), and analyzing FBI-collected arrest data (2007).

In addition to these projects, I also worked with Randy Roth (History and Sociology, OSU) on different projects dealing with homicide. In 1998 I had published an article on visualizing homicide, in which I showed how necessary it is to look at the data before doing any analyses. When I got to OSU I started working with Randy. He was interested in child homicide and in homicide patterns throughout the country and the world. We had an interesting finding that we never published: it turned out that those counties with a higher percentage of Republican votes in 2008 (when Obama won) than in 2004 (when Bush won) also had higher homicide rates in subsequent years. This jibed with Randy’s theory that homicide rates go up when and where discontent with the government is higher. And, with Doug Eckberg, we subsequently showed that in the nineteenth century the “wild west” was actually wilder, in terms of homicide rates, than the rest of the country.

I also had a contract with the California Department of Correction and Rehabilitation (CDCR) to forecast California’s prison population (2009). For that contract I enlisted the services of Jan Chaiken as consultant; I needed his expertise, but his name could not be on the contract because at the time his daughter was a staff psychologist at California’s Pelican Bay correctional facility.

After a while a researcher is no longer on the front lines in the doing of research, and becomes a bit more reflective. I started to get requests to write more global articles, and pieces for handbooks or encyclopedias. For the Journal of Experimental Criminology I wrote “Some p-Baked Thoughts (p > 0.5) on Experiments and Statistical Significance,” my view of some of the issues I experienced in dealing with statistical significance.

Police stops. I was also concerned that the ACLU went too far in insisting that the Chicago Police Department pull back in its efforts to control crime; my 2017 blog post, “Police Stops,” described my misgivings. A year later Paul Cassell and Richard Fowles provided empirical evidence that the pullback caused a spike in homicide. Cassell presented their findings at the Illinois Academy of Criminology, to which I responded; here is the (90-minute) video of that presentation.

With regard to police actions, let’s put a few things in their correct order. Prior to “the inherent racism and classism in our justice system” (undeniably so) is the inherent racism and classism in governmental (and corporate) policies that kept African-Americans out of many communities, thus segregating them. This led inevitably to fewer resources (education, social services, infrastructure maintenance, etc.) provided to the neighborhoods to which they were relegated, which in turn led to increased crime and violence in those neighborhoods.

Yes, more police attention is devoted to minority neighborhoods, but let’s not forget that the overwhelming majority of the neighborhood’s residents do not commit crimes. In fact, these residents are at greater risk of victimization, which is why the police are more active in these neighborhoods.

In other words, police resources are usually allocated, not on the basis of race, but on the basis of crime rates, violent acts, and shootings, all of which are more prevalent in those neighborhoods, stemming from the aforementioned biased policies of the past. True, there is heightened scrutiny of minority neighborhoods – because it is in fact a heightened scrutiny of more dangerous neighborhoods. And this is true not because of police policies, but because of policies established by public officials and by organizations (e.g., redlining) well above their pay grade.

This does not excuse the execrable behavior of individual police officers, who have detained, assaulted, and killed unarmed minority civilians with impunity; only the (more recent) presence of cameras has given the lie to their assertions of imminent danger.

Overviews. I was still being asked to do overviews in some areas. I wrote a piece with Kathleen Frey on the “History of the Statistics of Crime and Criminal Justice” for the Encyclopedia of Criminology and Criminal Justice. For an edited book, Putting Crime in Its Place, I wrote “Waves, Particles, and Crime,” about how looking at crime through different units of time and space provides different perspectives on how to deal with it. For the Handbook of Quantitative Criminology I wrote a chapter, “Look Before You Analyze: Visualizing Data in Criminal Justice.” And for the journal I used to edit, the Journal of Quantitative Criminology, in 2010 I wrote a think-piece on data visualization, “Picturing JQC’s Future.”

  1. Looking Back

This is a sort of crammed-together description of my research career. In retrospect, I see that I focused primarily on technical issues, but in a data-centric way, applying technical thinking to relatively “soft” areas. That is, I was not so much focused on what our policies should be with respect to crime, corrections, law enforcement, or criminal justice in general, but rather on the data and methods we use to determine what our policies should be: how can our knowledge of how data is collected in these fields inform us of best practices? how good/useful is the crime data we use? what's the best way to measure recidivism? what’s the best way to extract information from data? how can data visualization help to find patterns in data? Occasionally I drifted into policy-related issues, but it was usually in the context of getting the most information we could out of the data we had.

I think I helped move the field into new ways of looking at issues, and I'm pleased with what I was able to accomplish.

