As soon as the NRA show rumbles to an end, our friends in the gun-control world can get back to business and celebrate the latest news about gun violence from the CDC. Because the CDC has just published the numbers for how many Americans were wounded but not killed by guns in 2016, and the number is the highest it has ever been – 116,114 – an increase over the 2015 figure of nearly 40 percent!
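To put that jump in perspective, here is a quick back-of-the-envelope sketch in Python. The post doesn't quote the 2015 total, so the figure below is only what the stated "nearly 40 percent" increase implies, not the CDC's published 2015 number.

```python
# Back-of-the-envelope check of the CDC figures quoted above.
# The 2015 total is not given in the post, so it is inferred here
# from the stated ~40 percent year-over-year increase.

injuries_2016 = 116_114      # CDC non-fatal gun injuries, 2016 (as quoted)
stated_increase = 0.40       # "nearly 40 percent" increase over 2015

implied_2015 = injuries_2016 / (1 + stated_increase)
print(f"Implied 2015 total: {implied_2015:,.0f}")               # roughly 83,000
print(f"Increase: {injuries_2016 / implied_2015 - 1:.0%}")      # 40%
```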
The only problem with the numbers reported by the CDC is that they probably aren’t correct. How can I say something like that? I mean, we’re not talking about numbers from this survey outfit or that; we’re not talking about Pew, or Gallup, or even the vaunted gun researchers at Rand. We’re talking about the U.S. Government and, even more to the point, about the agencies responsible for medical research, which we all know is science-based. This data is also unstintingly used by gun-violence researchers at major academic institutions like Harvard and Johns Hopkins, so it has to be correct, right?
If by using the word ‘correct’ we mean that the numbers on gun injuries collected and published by the CDC are accurate enough to withstand serious scrutiny, either in terms of how the numbers are gathered or how they are used, then I would be wrong to characterize these numbers as correct. And I’m not saying I would be a little bit wrong. I’m saying I would be wrong to the degree that anyone who uses these numbers to support any argument about gun violence is making an argument out of whole cloth. Which happens to be a polite way of saying that the numbers are nichtsnutzig, pas bien, non buono, zilch – whatever works, okay?
The CDC numbers on non-fatal gun injuries come from the National Electronic Injury Surveillance System (NEISS), a data system run by the Consumer Product Safety Commission (CPSC). The data collected by NEISS “is based on a nationally representative probability sample of hospitals in the U.S.” Those happen to be my italics, and if you can show me a single place where these numbers are used by any gun-control organization with the caveat that they are a ‘sample’ rather than what the real numbers might be, I’ll send a hundred bucks to the charity of your choice. Don’t waste your time looking, I already did.
Hey! Wait a minute! I thought the gun industry was exempt from any consumer regulation by the CPSC or anyone else. That happens to be true, thanks to an exemption written into the original Consumer Product Safety Act of 1972. But this law doesn’t prevent the CPSC from collecting information about consumer injuries from guns, which it does through the same NEISS reporting system before sending the numbers to the CDC. The NEISS numbers for gun injuries also come from the same ‘nationally representative’ hospitals that furnish the injury data for every other product group: toys, kitchen appliances, ATVs, amusement-park rides and just about everything else.
I don’t know about injuries from hair dryers or chain saws, but when it comes to gun injuries, the ‘nationally representative’ list of hospitals isn’t representative at all. How do you compute a national estimate of gun violence when the hospital you use in Virginia is located in the little town of Danville, whereas Richmond is left out? How do you have any idea about gun violence in Florida without including at least one hospital from Dade County?
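To show why the choice of hospitals matters so much, here is a minimal sketch of how a weighted probability sample gets inflated into a national total. The hospitals, case counts, and weights below are invented for illustration only; they are not actual NEISS data, and NEISS’s real weighting scheme is considerably more involved.

```python
# Illustrative sketch: a handful of sampled hospitals, each reporting
# gun-injury cases, each case carrying a sampling weight (roughly, how
# many U.S. cases that one case is taken to represent).
# All figures are hypothetical, NOT actual NEISS data.

sample = {
    # hospital: (reported gun-injury cases, average case weight)
    "small-town hospital A": (40, 90.0),
    "suburban hospital B":   (120, 75.0),
    "urban trauma center C": (600, 110.0),
}

national_estimate = sum(cases * weight for cases, weight in sample.values())
print(f"National estimate: {national_estimate:,.0f}")

# Swap the big urban trauma center for another small hospital and the
# 'national' number moves dramatically -- which is the point about
# sampling Danville while leaving Richmond and Dade County out.
sample["urban trauma center C"] = (55, 90.0)
print(f"Estimate with a different sample: "
      f"{sum(c * w for c, w in sample.values()):,.0f}")
```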
The CDC says that the margin for error they employ for gun injuries means that the actual number might be 30% higher or lower than the specific number they publish each year. Which means that the real 2016 gun-injury number might have been as low as 85,000 or as high as 150,000 – take your pick.
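Applied as a flat ±30 percent to the 2016 point estimate, that margin works out roughly as follows. This is only a simple sketch; the exact bounds the CDC publishes may be computed differently, but they land in the same ballpark as the rounded range above.

```python
# Sanity check: apply a flat +/-30 percent margin to the published
# 2016 point estimate, as described in the text.

point_estimate = 116_114
margin = 0.30

low = point_estimate * (1 - margin)
high = point_estimate * (1 + margin)
print(f"Plausible range: {low:,.0f} to {high:,.0f}")   # ~81,000 to ~151,000
```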
Whether they know it or not, when gun-control advocates talk about the number of gun injuries, they are talking about nothing but a guess. You would think that the gun-violence researchers, on whose work the gun-control movement depends, would at least try to point this out.