Back in 2015, we wrote a major feature for Governing magazine titled “The Causes, Cost and Consequence of Bad Government Data.” As far as we know, aside from the pieces we did for the Government Performance Project, it was the most popular single feature we wrote over the twenty years we were contributors to that magazine prior to becoming columnists and senior advisors for Route Fifty.
In the years since, our interest in the topic has continued to be fed by state and local government performance audits (which we read the way other people read Stephen King novels). They contain a steady stream of complaints about data quality, and the findings grow more distressing as cities, counties and states rely ever more heavily on data to make decisions.
A striking example comes from an advisory issued late last year by the Chicago Interim Inspector General, William Marback, to that city’s Chief Data Officer, Nick Lucius. It listed numerous examples of data quality audit issues that had come up over the last several years.
This included fire department data that inadequately measured emergency response times; missing information about employee leave time; poor and missing data on municipal license plates; inaccurate lists of city-owned lots; contradictory information on gang membership; and incomplete documentation of lead hazards for Chicago’s Low-Income Housing Trust Fund. “The inconsistent quality of the City’s data hinders it from effectively allocating resources, measuring performance and achieving objectives,” the public advisory letter said.
The IG advisory went on to expand on the importance of this issue. “Data is a key strategic governmental asset. Yet data can only serve its purposes if it is accurate and reliable.”
Chicago is hardly alone.
Our digital files are bulging with other accounts of recent audits with similar complaints. A few examples:
An independent 2021 audit in Boston raised questions about the accuracy of the city’s data on high school graduation rates. For example, the audit found limited documentation to support why 16 of 40 sampled students had been listed as transferring out of high school rather than dropping out – a problem that meant Boston Public Schools was “potentially misstating” data used by Massachusetts to determine four-year adjusted graduation rates.
A memo late last year from the Multnomah County auditor in Oregon explained that she had to halt an audit of the placement of individuals experiencing homelessness into permanent housing because the data was unreliable. Examining address data from the Joint Office of Homeless Services, the auditor found that among program participants listed as being placed in permanent housing, “approximately 60% were missing address data or had address data that were not actual addresses.”
A January 2022 performance audit of Building and Zoning Enforcement by the Atlanta Auditor noted that the Inspections and Enforcement Division “had not maintained accurate and reliable data” in its computer system regarding complaints. A review of complaint data “found that staff entered incorrect information, that some fields were left blank, and some cases were left unassigned to an inspector.”
Unquestionably, the problem of poor data quality needs continued – and more aggressive – attention. We know there have been improvements in many places since we wrote our 2015 feature, and people dedicated to state and local government are taking steps in the right direction. But in the words of Robert Frost, they have “miles to go before they sleep.”
(Extra non-governmental B&G bonus: Listen to Robert Frost reciting that delightful poem, “Stopping by Woods on a Snowy Evening.” It’s wonderful.)