CPRC: A response to Mother Jones’ mistake-filled article on John Lott and the Crime Prevention Research Center :: 08/02/2015

The altered image above is from a new Mother Jones article, written in an attempt to discredit John Lott and the Crime Prevention Research Center.


Mother Jones is willing to do anything to push for gun control.  The magazine is very left-leaning and is funded by individuals such as George Soros.  Even academics such as James Alan Fox, who supports gun control, have taken Mother Jones to task for its misleading use of data on guns.

For several years, Mother Jones and John Lott have had a running feud.  Unable to win on the facts, Mother Jones this week tried to hit back in a more personal way.

Julia Lurie, the reporter Mother Jones assigned to the story on Lott, asked him about 50 specific questions, yet she still wrote about many issues she had never asked him about, and when she did ask about points she wrote on, she ignored Lott’s responses.  Her piece is filled with simple factual errors that even a brief look at Lott’s book More Guns, Less Crime or at his original research paper with David Mustard would have prevented.

Lurie somehow couldn’t manage to talk to any researchers who have found results similar to Lott’s, and she never mentioned what most of the peer-reviewed studies have found.

Mother Jones claims that the Crime Prevention Research Center’s work isn’t “academic quality.”  According to the article, Professor Gary Kleck said that he doesn’t know of any “credible criminologist” who believes that “with more guns there are less crimes.”  “Garbage in and garbage out” is how Kleck reportedly described Lott’s work.  Lurie also cites The National Research Council, claiming that the organization declared there to be “no credible evidence” that right-to-carry laws affect violent crime.  The attacks are misleading and out of context.

Take Lurie’s points in order.

1) “The National Research Council, a branch of the National Academy of Sciences, assembled a panel to look into the impact of concealed-carry laws; 15 of 16 panel members concluded that the existing research, including Lott’s, provided “no credible evidence” that right-to-carry laws had any effect on violent crime.”

The National Research Council report actually concluded as follows: “The committee concludes that with the current evidence it is not possible to determine that there is a causal link between the passage of right-to-carry laws and crime rates.” Lurie somehow manages not to mention that, despite evaluating every type of gun law, the Council found no evidence that any law had any impact.  The Council was noncommittal about every policy.  The panel’s standard response was simply to advocate that more money be made available to academics to fund additional research.

In fact, right-to-carry laws were actually the only type of law where there was dissent. James Q. Wilson, who at the time was possibly “the most influential criminal justice scholar of the 20th century,” concluded: “I find that the evidence presented by Lott and his supporters suggests that [right-to-carry] laws do in fact help drive down the murder rate.” 

2) “that Lott had drawn inaccurate correlations: Cities had experienced a spike in crime in the 80’s and 90’s in part because of the crack epidemic, not because of strict gun laws.”

But from the very start, Lott’s research has addressed the crack cocaine issue.  As Florenz Plassmann and John Whitley (Stanford Law Review, 2003) summarized the research at that point:

“One of Ayres and Donohue’s greatest concerns is the apparent failure of previous research to account for the differential geographic impact of cocaine on crime. Lott’s book (and the Lott and Mustard paper) reported that including price data for cocaine did not alter the results. Using yearly county-level pricing data (as opposed to short-run changes in prices) has the advantage of picking up cost but not demand differences between counties, thus measuring the differences in availability across counties. Research conducted by Steve Bronars and John Lott examined the crime rates for neighboring counties . . . on either side of a state border. When the counties adopting the law experienced a drop in violent crime, neighboring counties directly on the other side of the border without right-to-carry laws experienced an increase. . . . Ayres and Donohue argue that different parts of the country may have experienced differential impacts from the crack epidemic. Yet, if there are two urban counties next to each other, how can the crack cocaine hypothesis explain why one urban county faces a crime increase from drugs, when the neighboring urban county is experiencing a drop? Such isolation would be particularly surprising as criminals can easily move between these counties. . . . Even though Lott gave Ayres and Donohue the cocaine price data from 1977 to 1992, they have never reported using it.”

In the third edition of More Guns, Less Crime (2010), Lott uses new data from Fryer et al. to attempt to measure the impact of crack cocaine from 1980 to 2000.

Critics such as Ayres and Donohue claim that the results can be explained away by the impact of crack cocaine, but they haven’t done any concrete analysis to show this.

While Ms. Lurie didn’t ask Lott directly about this issue, even a quick look at the appendix of More Guns, Less Crime would have shown that the claim that crack cocaine was ignored is false.

3) “When [Ayres and Donohue] extended their survey by five years, they found that more guns were linked to more crime, with right-to-carry states showing an eight percent increase in aggravated assault.”

This is a simple counting error.  Ayres and Donohue made the false claim, and Lurie never bothered to verify it.  The second edition of More Guns, Less Crime (2000) uses data from 1977 to 1996.  The 2003 paper by Ayres and Donohue uses data from 1977 to 1997.  John Lott provided Ayres and Donohue with his data from 1977 to 1996.  Adding the data for one year, 1997, did not make a notable difference.  Ayres and Donohue obtained somewhat different results because they used a different specification and ended up misinterpreting their results.

Even the first edition of More Guns, Less Crime (1998) had some estimates with data up through 1994, and thus even compared to that edition, Ayres and Donohue only added three years of data.

Again, a quick look at the first, second, or third edition of More Guns, Less Crime would have let Lurie realize that this claim was incorrect.

4) Gary Kleck claims that John Lott hadn’t “accounted for missing data” and that “It was garbage in and garbage out.”  The problem is simple: in some counties, not all of the cities report crime rate data every year.  This introduces some randomness into the number of crimes reported for those counties.  The problem used to be particularly prevalent in low-population counties, but it has improved considerably over time.

Take Georgia, a state whose data was singled out as particularly flawed due to this problem.  From 1980 to 1993, 16 of Georgia’s least populous counties (out of 159 total) received crime reports from only 65% of their police departments.  By contrast, the 127 most populous counties (with 97.2% of the state’s population) averaged a non-reporting rate of 5.6%.  In his regressions, Lott weighted county data by population, so the counties with the most significant problems had little effect on the results.

All data contain some errors.  The question is whether those errors are random or whether they systematically bias the results.  This data error has been accounted for in many different ways (a brief illustrative sketch follows the list below):

— In their original paper, Lott and Mustard first looked at all counties.  They then narrowed their scope – first to counties with more than 50,000 people, and then to those with more than 100,000 people.  The results stayed much the same, showing that the low-population counties with these errors were not creating a bias in favor of right-to-carry laws.

— The second edition of More Guns, Less Crime studied city-, county-, and state-level data.  Even if that particular error existed in the county-level data, it did not exist in the city- or state-level data.  And, again, the results were similar.

— A paper with John Whitley, written in 2002, explicitly examined errors in the county-level data and found no evidence of any systematic bias; it was published in 2003 in the Journal of Quantitative Criminology.
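
To make the weighting and sample-restriction checks described above concrete, here is a minimal sketch in Python using randomly generated, purely hypothetical county data; the variable names and numbers are illustrative assumptions, not Lott’s actual data or code.

import numpy as np

# Hypothetical county-level data, generated at random purely for illustration.
rng = np.random.default_rng(0)
n = 500
population = rng.integers(1_000, 1_000_000, size=n).astype(float)  # county population
rtc = rng.integers(0, 2, size=n).astype(float)                     # 1 if a right-to-carry law is in effect
crime_rate = 500.0 - 30.0 * rtc + rng.normal(0, 50, n)             # violent crimes per 100,000

def rtc_coefficient(mask, weights):
    # Weighted least squares of crime_rate on an RTC dummy for the counties
    # selected by `mask`: rows are scaled by sqrt(weight), so high-population
    # counties dominate the fit and noisy small counties contribute little.
    X = np.column_stack([np.ones(mask.sum()), rtc[mask]])
    w = np.sqrt(weights[mask])
    beta, *_ = np.linalg.lstsq(X * w[:, None], crime_rate[mask] * w, rcond=None)
    return beta[1]

# Check 1: use all counties, weighted by population.
print("all counties, population-weighted:", rtc_coefficient(np.ones(n, dtype=bool), population))

# Check 2: drop the small counties entirely, where reporting is spottiest.
for cutoff in (50_000, 100_000):
    keep = population > cutoff
    print(f"counties over {cutoff}:", rtc_coefficient(keep, population))

If the estimated right-to-carry coefficient barely moves across these variants, then noisy reporting in small counties is not what is driving the results, which is the point of the checks above.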

All of this information was provided to Lurie.

5) “Kleck, who conducted a controversial, yet often-cited survey on defensive gun use, observes, “Do I know anybody who specifically believes with more guns there are less crimes and they’re a credible criminologist? No.”

So, does Gary Kleck not believe that James Q. Wilson was a credible criminologist?   Gary Mauser just completed a survey of researchers who have published on firearms issues in refereed criminology journals between 2000 and 2014. Mauser found that 31% of these scholars thought that right-to-carry laws lowered murder rates.  Fifteen percent said that these laws increased murder rates, 46% said that the laws had no effect, and 5.1% said that they didn’t know.  Clearly, a very significant proportion of criminologists believe the more guns, less crime hypothesis.

6) What John Lott actually claimed about the views of economists and criminologists was that the vast majority of published peer-reviewed papers looking at the impact of right-to-carry laws on US crime rates found that the laws reduced violent crime rates, while the remaining papers found no effect on murder, rape, and robbery (see also here).

7) “The organization . . . proceeds and publishes ‘academic quality’ reports that have yet to be published in peer-reviewed journals.”

To put things in perspective, John Lott has published over 100 peer-reviewed academic journal articles.  The CPRC was only started in October 2013, and it takes time to produce research.  It takes even more time for the peer-review process to conclude.  Nevertheless, we supported research that was published last year: “The Impact of Right-to-Carry Laws on Crime: An Exercise in Replication” (Review of Economics and Finance; Carlisle Moody, Thomas Marvell, Paul Zimmerman, and Fasil Alemante).  The CPRC co-authored a paper that was published in the peer-reviewed journal Public Choice.  In addition, as Lurie was informed, one paper by the CPRC has been revised and resubmitted to a journal.  Another paper, showing errors in a recent FBI report on active shooters, was published in Academy of Criminal Justice Sciences Today.

The CPRC’s academic advisory board members are at the top of their fields and are affiliated with the University of Chicago, Harvard, and the Wharton Business School.

8) “one of the small number of very pro-gun researchers like Gary Kleck or John Lott”

This statement makes two mistakes.  First, most economists who have published research on firearms in peer-reviewed journals believe that there is a net safety benefit from people carrying guns.  For example, worldwide, 83% of economists who have published on this topic believe that guns are more likely to be used in self-defense than in crime, and 74% believe that concealed handgun laws lower the murder rate.  As noted earlier, those who publish in criminology journals are more divided on the issue, so researchers do not take the monolithic position that the article describes.

It is strange that Kleck is labeled “pro-gun” in the same article where he is quoted as saying: “Do I know anybody who specifically believes with more guns there are less crimes and they’re a credible criminologist? No.” Kleck believes guns have no net effect on crime rates, and thus he doesn’t think that it matters whether guns are banned or licensed or regulated in some other way.  Gary Kleck and John Lott clearly have very different views on guns, and it is surprising that the article lumps the two of us together.

9) “Lott claimed that it was based on data from a survey he had conducted—but that the data had been lost in a computer crash.”

The hard disk crash was widely documented by people at the time it occurred, on July 3, 1997.  The crash destroyed data for all the papers that Lott was working on up to that point.  A number of co-authors lost data for papers that they had been working on with him (Larry Kenny at Florida State, Richard Manning, then at BYU, Jonathan Karpoff at the University of Washington, and David Mustard at the University of Georgia), and others had contemporaneous knowledge of the crash (including Geoffrey Huck, an editor at the University of Chicago Press; Dan Kahan at Yale; and John Whitley, who was at the time at the University of Adelaide in Australia).

10)  Mother Jones botched the timeline and seems unable to accurately report the numbers from a 2002 follow-up survey that confirmed the results of the 1997 survey.  The 2002 survey sought to determine whether there had been any changes in the rate of defensive gun use since 1997.  When controversy erupted and the earlier results were questioned, the follow-up survey served as a way of showing that the 1997 results had been replicated.  The survey was designed differently from other surveys: it asked people only about crimes in the past year rather than about events over the past decade or more.  Preparation of the survey was begun in June 2002 by research assistant James Knowles, well before controversy arose over the first survey at the very end of 2002.

11)  “Rosh and Lott shared an internet address.”  This is simply false.  Lott had a dynamic IP address.  In a blog post, Julian Sanchez noted that “maryrosh” had an IP address in southeastern Pennsylvania and asked for help from anyone who might know who the person was.  After seeing the blog post, Lott emailed Sanchez and admitted to using his kids’ email address in putting posts up on an internet chatroom.  Lott had originally used his own email address in the chatroom postings.  Unfortunately, some individuals continued the discussions outside the chatroom in unpleasant ways, and Lott found it more convenient to use a pseudonym.  Since the vast majority of chatroom participants were using pseudonyms, it seemed appropriate to follow that example.

12)  The Mother Jones story originally made fun of the fact that Milwaukee Sheriff David Clarke is on the Crime Prevention Research Center’s Board of Directors (the article has since been modified).  We are very proud of our relationship with David Clarke and believe that he brings an important real-world perspective to the Center.  The article also fails to note that Professor Edgar Browning, who is also on our board, has been one of the top public finance economists in the world.

This article by Mother Jones is part of a pattern of similar attacks.  For a decade, Media Matters attacked John Lott in over 110 posts, and it was so unwilling to discuss the claims it made that it would systematically remove any responses posted in its comment section.  More recently, a new group named “Armed with Reason,” which has connections to Bloomberg’s gun control groups, rehashed old criticisms but completely ignored the academic responses that Lott had already published.  A more productive approach would have been to explain why Lott’s responses were wrong.

http://crimepreventionresearchcenter.org/2015/08/a-response-to-mother-jones-mistake-filled-effort-to-discredit-john-lott/