
B&G REPORT.

greenebarrett

MOUSETRAPS FOR FLAWED DATA

It may seem a little heavy-handed, but for years now we’ve been writing about the endless reams of bad data that are used to manage and to make policy. For the most part, we’ve pointed to issues that require careful examination of the information to determine whether it’s trustworthy or not.


But as time has passed, we’ve come across a great many easily spotted signals that make it quicker to recognize when information deserves scrutiny. Here are a half dozen examples:

1) Beware comparisons between absolute figures that come from different-sized cities or states. If, for example, something criminal happens to hundreds of people in California, that may not be nearly as alarming as when the same thing happens to dozens of residents of Wyoming or North Dakota; California’s population is roughly 67 times Wyoming’s, so the smaller count can represent the far higher per-capita rate.
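To make that arithmetic concrete, here is a minimal Python sketch. The population figures are rough and the incident counts are hypothetical, chosen only to illustrate the point:

```python
# Per-capita rates turn raw counts into comparable figures.
# Populations are rough; incident counts are hypothetical.
populations = {"California": 39_000_000, "Wyoming": 580_000}
incidents = {"California": 400, "Wyoming": 40}

for state, count in incidents.items():
    rate = count / populations[state] * 100_000  # per 100,000 residents
    print(f"{state}: {count} incidents = {rate:.1f} per 100,000")
```

Run it, and California’s 400 incidents work out to about 1 per 100,000 residents, while Wyoming’s 40 work out to about 7: the state with a tenth of the cases has roughly seven times the rate.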


2) Sometimes reports or articles use numbers that are so precise as to be unbelievable. It seems to us that when project spending is reported as $1,436,432.15, there’s no legitimate way to figure costs down to the cent, the dollar, or even the nearest hundred dollars. A tight range is often more useful and believable.


3) Speaking of ranges, it’s self-evidently problematic when an expense is reported as somewhere between $100 million and $500 million. Either not enough due diligence has been done, or the estimators are living in the Land of the Wild Guess.


4) If you’re relying on data for which no assumptions are provided, dig deeper. When assumed discount rates vary between two state pension plans, it’s entirely possible that their reported liability figures are not comparable: a higher assumed rate shrinks the present value of the very same promised benefits.
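To see why, here is a minimal Python sketch of standard present-value arithmetic. This is illustrative, not any plan’s actual actuarial method; the dollar figure, horizon, and rates are hypothetical:

```python
# The same promised benefit shows up as a smaller liability today
# when a plan assumes a higher discount rate.
promised_payment = 1_000_000_000  # hypothetical: $1B payable in 20 years
years = 20

for rate in (0.04, 0.07):  # two plans, two assumed discount rates
    present_value = promised_payment / (1 + rate) ** years
    print(f"At {rate:.0%}: reported liability = ${present_value:,.0f}")
```

Both plans owe the identical $1 billion, yet the plan assuming 7% reports a liability of roughly $258 million against roughly $456 million at 4%. Only the assumption differs, which is why comparing the reported figures without the underlying rates is misleading.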


5) Watch out for figures that are huge beyond common sense. Some years ago, there was a lot of talk about one million children being abducted each year. At the time, for those of us living in New York City, news reports were full of the story of just one little boy, Etan Patz, who was last seen at a bus stop in lower Manhattan. How could it be that, if such huge numbers of children were disappearing, one child was getting so very much attention? It turned out, according to the Denver Post in 1986, that the “national paranoia” raised by the one-million figure wasn’t really a reflection of scary men luring children into their cars with candy, but rather of children taken in custody battles.


(And the often-repeated one million figure was also an exaggeration. In 2017, the Justice Department reported that the number of serious parental abductions is closer to 156,000 annually, of which about 30,500 reach the level of a police report.)


6) Information that is self-reported to the federal government by states, or by cities and counties to the states, can be questionable. A question like “does your city use performance information?” can get “yes” answers regardless of differing definitions and degrees of use. In the past, a big-city mayor told us that his community used measurements to make decisions about road conditions. When we pursued the question, it turned out that the only data the city had was an accumulation of citizen complaints about potholes.


1 Comment


douglas.jones
Nov 06, 2019

Something else to think about is the scale used in a graph. The data may be good, but the presentation of the data not so much. The slope of a line looks a lot different when the axis starts at 30% and ends at 65% rather than running from 0% to 100%. Increases that are real and show steady improvement look far more dramatic and impressive on the truncated scale. I've seen this happen.
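To illustrate the commenter’s point, here is a minimal matplotlib sketch with hypothetical numbers: the same series plotted on a truncated 30%–65% axis and on a full 0%–100% axis:

```python
# The identical data looks dramatic on a truncated axis
# and modest on a zero-based one. Numbers are hypothetical.
import matplotlib.pyplot as plt

years = [2015, 2016, 2017, 2018, 2019]
pct = [32, 35, 37, 40, 42]  # steady improvement

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3), sharex=True)
for ax, ylim, title in [(ax1, (30, 65), "Truncated axis"),
                        (ax2, (0, 100), "Zero-based axis")]:
    ax.plot(years, pct)
    ax.set_ylim(*ylim)
    ax.set_title(title)
plt.tight_layout()
plt.show()
```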
