
Documents, says Mike Flowers, possess "predictive value for bad things." Todd Heisler/The New York Times

In mid-2011, New York found itself gripped by a series of horrible tragedies: Five people, including several children, had died in blazes that broke out in overcrowded, decrepit apartments – disasters waiting to happen.

Facing pressure from firefighters who risked their lives responding to such emergency calls, senior officials in Mayor Michael Bloomberg's office got to work trying to figure out how the city could prevent such catastrophes by focusing its inspection efforts on the riskiest buildings.

But as Mike Flowers, a former district attorney who then led Mr. Bloomberg's newly formed Office of Data Analytics, well knew, the challenge would come down to resources: The city employed 200 inspectors who had to triage 20,000 complaints about illegally converted apartments each year, only a tiny fraction of which proved to be genuine death traps. New York, moreover, contains a million buildings. The problem, as Mr. Flowers now observes, "is one of too much body, and too little blanket."

Mr. Flowers's team of data scientists began to scour city records for patterns – what traits did the buildings have in common? It was during this formidable exercise in bureaucratic sleuthing that they discovered the most unlikely sort of smoking gun: an obscure form known as a lis pendens that is filed "like clockwork" in the land-registry office when a lender forecloses on a landlord in arrears.

"Those documents ended up being extremely valuable in their predictive value for bad things that happen on a property," observes Mr. Flowers. With the benefit of hindsight, the reason seems obvious. Landlords who let their properties fall apart, cram in illegal tenants and stop making mortgage payments are almost certainly ignoring safety-code violations. "The city," he says, "didn't know that it knew that."

Mr. Flowers, now a visiting scholar at New York University's Center for Urban Science and Progress, is at the forefront of what he rightly describes as a "revolution" in the way local governments deliver services.

By making smarter use of their vast storehouses of operational information – everything from traffic counts and readings gathered by air-quality sensors to date stamps on business-licence applications – municipalities may be able to prevent deaths, boost quality of life, improve their operations, and reduce costs. In such sprawling city-regions as Greater Toronto, planners are trying to go a step further by using extensive transportation surveys, granular census data and sophisticated computer forecasts to model demand for multibillion-dollar transit lines.

It's a sea change in thinking that could rival the shift to professional municipal management that marked the dawn of the Progressive Era over a century ago, according to Stephen Goldsmith, a professor at Harvard's Kennedy School of Government and Mr. Bloomberg's former deputy mayor. "Whenever we're talking about data, we're talking about modernizing how government works," says Mr. Goldsmith, the co-author (with Harvard visiting law professor Susan Crawford) of The Responsive City.

As Mr. Goldsmith notes, the growing fascination with the potential applications for all sorts of data shouldn't obscure the reality that long-serving front-line workers still possess a valuable street-level understanding of the way cities work that isn't available to consultants or data scientists. Still, there's no question that cities now have access to troves of digital data that no one could have envisioned even a decade ago. Transit vehicles equipped with GPS devices allow planners to fine-tune schedules. Some data experts believe that billions of smartphone signals can be pressed into service to help planners understand how people move through congested urban areas.

Meanwhile, the thousands of citizen complaints that flow into 3-1-1 call centres can be transformed into maps that not only show where certain types of problems – from overflowing garbage bins to broken water mains – are occurring, but also provide clues to their underlying causes and to areas of greater risk.
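As a rough illustration of how such maps get built, the sketch below bins hypothetical 3-1-1 records (a made-up calls_311.csv with latitude, longitude and complaint_type columns) into coarse grid cells and lists the worst hotspots for each complaint type; real systems would use proper geospatial tooling.

```python
# A rough sketch, assuming a hypothetical calls_311.csv with latitude,
# longitude and complaint_type columns; real systems would use proper
# geospatial tooling rather than this coarse binning.
import pandas as pd

calls = pd.read_csv("calls_311.csv")

# Bin each call into roughly 250-metre cells (~0.0025 degrees at mid-latitudes).
cell = 0.0025
calls["cell"] = (
    (calls["latitude"] // cell).astype(str)
    + ","
    + (calls["longitude"] // cell).astype(str)
)

# Count complaints per cell and type, then list the five worst cells for each type.
hotspots = (
    calls.groupby(["complaint_type", "cell"]).size().rename("count").reset_index()
    .sort_values(["complaint_type", "count"], ascending=[True, False])
    .groupby("complaint_type").head(5)
)
print(hotspots)
```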

Yet Mr. Flowers warns that city officials shouldn't rush into huge investments in "smart city" information technology in order to foster such insights. Indeed, his group relied on off-the-shelf spreadsheets to compile the data that led to New York's dramatic analytics breakthroughs. "I cannot caution enough against calling IBM as your first step, because then you're doomed," he says, noting that his team spent just $5-million initially. "None of this has to be expensive."

While most municipalities in recent years have released large tranches of raw information – road-closure locations, transit schedules, and other intelligence – through so-called "open data" portals, the game-changing potential lies in interpreting those mountains of quotidian facts and finding new ways of putting them together. The analysis is equal parts art and science. As Mr. Flowers says, "Nobody's good at this."

But a handful of cities, led by New York, Chicago and Boston, have made far more progress than most, thanks to concerted efforts by activist mayors, most notably Mr. Bloomberg, who made a fortune on packaging and processing financial information; and Rahm Emanuel, former chief of staff to President Barack Obama and now mayor of Chicago.

Most Canadian cities, however, are not yet playing in those leagues, says Vancouver-based open-data expert David Eaves, who notes that U.S. mayors tend to have far more formal authority to spearhead such reforms. "Cities are often the most data-rich and data-dumb organizations out there."

Edmonton, under Mayor Don Iveson, is one of the few Canadian exceptions. The city has aggressively released data in recent years. But Kate Rozmahel, general manager of corporate services with the city, says the "really big next step" is to undertake the kind of analytics that Mr. Flowers's team pursued in New York.

Improved public safety is one of the first goals. Stephane Contre and Kris Andreychuk, both analysts for the city, worked with the Edmonton Police Service to compile a list of 233 location-specific factors associated with personal or property crime – everything from complaints about loud parties to reports of abandoned vehicles. They combined that information with a database of reported crimes and a map of the city divided into 11,000 units (each about 250 metres square), and then used an algorithm to look for patterns.

The result: 42 "rules" or combinations of factors that were strongly associated with certain types of crime. As an example, Mr. Contre says that when they identify an area with high levels of recovered stolen vehicles, noise complaints, nearby youth-services offices and an absence of picnic tables, there's a strong correlation with property crime. Indeed, notes Mr. Andreychuk, even the absence of picnic tables or front-yard gardens can point to areas with higher crime rates. The goal isn't simply to predict crime but to identify factors that may be correlated with increased criminal activity, and use that knowledge in city-planning efforts.

In Toronto, a small team of data analysts is looking for insights by combining maps showing the locations of power outages during the 2013 ice storm with areas that have large concentrations of vulnerable individuals – for example, seniors living alone in high-rises. The goal, says Mark Bekkering, a city official, is to find ways of prioritizing investment in electrical-grid infrastructure improvements to reduce the risks for residents least able to manage during disasters.
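A toy version of that overlay might look like the sketch below, which merges two hypothetical per-neighbourhood extracts (outages.csv and vulnerable.csv) and ranks areas by a simple priority score; the scoring heuristic is an assumption, not the city's actual method.

```python
# A toy illustration, assuming hypothetical per-neighbourhood extracts
# outages.csv (area_id, outage_hours) and vulnerable.csv (area_id,
# seniors_alone_highrise); the scoring heuristic is an assumption.
import pandas as pd

outages = pd.read_csv("outages.csv")
vulnerable = pd.read_csv("vulnerable.csv")

areas = outages.merge(vulnerable, on="area_id")
# Simple priority score: outage duration weighted by the at-risk population.
areas["priority"] = areas["outage_hours"] * areas["seniors_alone_highrise"]
print(areas.sort_values("priority", ascending=False).head(10))
```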

With many cities working to make themselves more resilient in the face of climate change and extreme weather, this kind of analysis has become more pressing, adds Mr. Bekkering. Toronto has joined New York, Chicago and San Francisco in requiring building owners to report annual energy and water consumption. That data, he says, helps municipalities manage overall power use while directing retrofit services and incentives at the owners of buildings with poor performance.
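As a rough illustration, benchmarking data of this kind could be used to flag retrofit candidates along the lines of the sketch below, which assumes a hypothetical energy_reports.csv with building_id, annual_kwh and floor_area_m2 columns; the actual reporting schema will differ.

```python
# A quick sketch, assuming a hypothetical energy_reports.csv with building_id,
# annual_kwh and floor_area_m2 columns; the actual reporting schema differs.
import pandas as pd

reports = pd.read_csv("energy_reports.csv")

# Energy use intensity: annual consumption normalized by floor area.
reports["eui"] = reports["annual_kwh"] / reports["floor_area_m2"]

# The worst-performing decile is an obvious target for retrofit incentives.
threshold = reports["eui"].quantile(0.9)
worst = reports[reports["eui"] >= threshold].sort_values("eui", ascending=False)
print(worst[["building_id", "eui"]])
```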

New York, Mr. Flowers says, also found ways to use data analytics to boost local economic development. He gives the example of small restaurants. The period of greatest financial risk for owners is the time that elapses between the signing of a lease and opening day – a period when the owner not only has to fit out the new establishment, but also obtain permits from various city departments. "It can take six months," he says, "to get all the regulatory tickets punched."

After analyzing the paper trails of hundreds of restaurant-licence applications, Mr. Flowers's team was able to identify the bureaucratic bottlenecks that slowed the approvals process. The solution: Create joint inspection teams to expedite the paperwork. "We were able to shave 60 days off the time-to-opening period," he says. "Failed business is bad for New York City. There was no downside to the agencies [that process licence applications] to adopt this approach."
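A back-of-the-envelope version of that bottleneck analysis might look like the sketch below, which assumes a hypothetical licence_steps.csv of timestamped application milestones (the step names and columns are illustrative) and simply measures which regulatory step takes longest on average.

```python
# A back-of-the-envelope sketch, assuming a hypothetical licence_steps.csv
# with application_id, step, start and end columns; step names are illustrative.
import pandas as pd

steps = pd.read_csv("licence_steps.csv", parse_dates=["start", "end"])
steps["days"] = (steps["end"] - steps["start"]).dt.days

# Average duration of each regulatory step across all applications:
# the slowest steps are the candidates for joint inspection teams.
print(steps.groupby("step")["days"].mean().sort_values(ascending=False))
```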

Such stories, however, underscore both the potential of savvy data mining and the bureaucratic risks. Vancouver's Mr. Eaves and others with experience working with municipal governments point out that this kind of work can surface uncomfortable insights – inspectors who are missing warning signs, departments that aren't responding to complaints in a timely way.

Mr. Eaves points out that mayors and city officials who are interested in leveraging municipal data should always begin by focusing on real problems – sluggish ambulance response times, for example – and work backward as they try to isolate the variables that may affect outcomes. "Initially, this is all about asking good questions."

Mr. Flowers, a veteran public official who respects the civil service, offers some other trenchant advice to municipal officials: Don't trumpet these efforts until you've got successes to brag about, and always make sure to gut-check the data analysts' theories against the real-world experience of front-line civic workers.

He tells a story about how his group was developing a system for focusing building-inspection resources on high-risk buildings as a way of preventing fires. Mr. Flowers's group had set up a massive spreadsheet with every structure in the city, and poured in all the building-specific information – including inspections, tax liens, permits, 9-1-1 calls – that they could get their hands on. The result was a mostly complete record of what the city knew about every single structure.

Looking at the records of buildings that had experienced calamitous fires, Mr. Flowers tried to come up with common denominators. The group eventually isolated a series of traits that predicted fires in smaller residential buildings. But the factors that seemed to point to hazards in larger buildings eluded them.

So Mr. Flowers decided to spend three days on the road with a veteran New York building official. At one stop, an apartment building that Mr. Flowers's data scientists reckoned to be a fire trap, the inspector got out of his car, took one look and said, "There's nothing wrong with this place."

"How do you know that?" Mr. Flowers asked.

"Dude," the inspector replied, "look at the brick."

Indeed, the walls revealed fresh repairs: From years of experience, the inspector surmised that the owner was likely looking after the inside of the building as well. Studying the database, Mr. Flowers realized that a recent building permit indicates a reduced risk of fire. That insight, he says, "wouldn't have happened if we hadn't talked to the guy in the field."
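In simplified form, the trait comparison that surfaced signals like that one can be sketched as below: compare how prevalent each trait is among buildings that burned versus those that didn't, with ratios below 1 – such as a recent permit – flagging possibly protective factors worth checking against field experience. The buildings.csv file and its boolean columns are assumptions for illustration; the real spreadsheet held far more fields.

```python
# A condensed sketch, assuming a hypothetical buildings.csv with one row per
# structure, boolean trait columns (open_violations, tax_lien, recent_permit,
# and so on) and a had_fire flag; the real spreadsheet held far more fields.
import pandas as pd

buildings = pd.read_csv("buildings.csv")
had_fire = buildings["had_fire"].astype(bool)
traits = [c for c in buildings.columns if c not in ("building_id", "had_fire")]

# For each trait, compare how common it is in buildings that burned
# versus those that did not.
ratios = {
    t: buildings.loc[had_fire, t].mean()
    / max(buildings.loc[~had_fire, t].mean(), 1e-9)
    for t in traits
}

# Ratios well above 1 suggest risk factors; ratios below 1 (a recent permit,
# say) suggest protective signals worth checking against field experience.
for trait, ratio in sorted(ratios.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{trait}: {ratio:.2f}")
```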
