Friday, June 15, 2012

Some Notes on the Quality of Planning Research

The American Planning Association just released a report that makes a lot of bold claims about what Americans want from planning. See here for a representative story. These types of reports are released all the time, and usually they are little more than an opportunity to abuse pie charts (which you should never use). This new report, called "Planning in America," abuses pie charts nicely but also makes a number of claims that simply sound like a bunch of baloney. For instance, there is a claim that 50% of Americans think that having transit available is a high priority for an ideal community (Finding 5). Then the following finding (6) claims that local bus service ranks as a priority for only 36% of respondents and local train service is a priority for only 21%. I don't see how you reconcile finding 5 with the results presented in finding 6. This calls into question the methodology used to collect the survey responses. Transit usage is sufficiently rare in the U.S. that there is pretty much no way that half of American adults (as claimed in the report) think transit is a high priority. I suspect the truth is closer to finding 6, where most Americans simply don't care that much about transit because the overwhelming majority of Americans never use transit. This isn't to say that people are hostile to transit, just that they likely don't spend much time thinking about it at all.

Back to the methodology: it is too bad that it is so poorly described, and what is described is weak. Here is all the report says about its methods:

This research was conducted in the spring of 2012. Collective Strength, an Austin–based firm specializing in outreach and communications, designed the questionnaire and performed the analytics. Harris Interactive, one of the world's foremost survey research firms, reviewed the questionnaire to ensure objectivity and fielded the study during the month of March 2012.
This survey was conducted online within the United States by Harris Interactive on behalf of Collective Strength and their client the American Planning Association between March 8-12, 2012, among 1,308 U.S. residents age 18 years or older. 
Online-only surveys are potentially problematic and unlikely to be representative of the overall population. More critically, the methodology explains precisely nothing about how subjects were recruited. This is a major potential source of error. An additional problem is that online surveys suffer a high level of attrition as the survey goes on. This is a pretty long survey, and I doubt all 1,308 respondents finished the whole thing. Since the report relies heavily on meaningless pie charts and does not supply the 'n' for any chart or table, we simply don't know how complete the responses are. None of the companies listed in the section provide any additional details on their websites. When the APA releases reports like this, they are assumed to be high-quality, authoritative reports. "Planning in America" is not an example of good planning research, however.
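To see why the missing per-question 'n' matters, here is a minimal sketch of the standard 95% margin-of-error calculation for a sample proportion. The full-sample figure of 1,308 comes from the report; the smaller completion count is purely hypothetical, since the report never tells us how many respondents actually answered any given question:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a sample proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

# The reported 50% transit figure at the full sample of 1,308 respondents
full = margin_of_error(0.50, 1308)

# If attrition left only 400 completions for a late question
# (a hypothetical number -- the report supplies no per-question n)
reduced = margin_of_error(0.50, 400)

print(f"n=1308: +/- {full * 100:.1f} points")   # about +/- 2.7 points
print(f"n= 400: +/- {reduced * 100:.1f} points")  # about +/- 4.9 points
```

The point is not the exact numbers but that the uncertainty nearly doubles under plausible attrition, which is exactly why reputable survey reports print the 'n' under every chart.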

Reports like this bother me in part because I teach planning research courses and would be distraught if any of my students turned in a report of this quality (without additional explanation, anyway). But the larger issue is that low quality research--whether it confirms or opposes your personal preferences--reduces the signal-to-noise ratio. Reports like "Planning in America" are noise that clouds our ability to understand critical issues and policy (the signal in this case). At the very least the full methodology should be explained, pie charts jettisoned, and sample sizes included in tables and graphs. As for planning research, reports like this are why I argue planning education should focus primarily on numerical literacy and well-crafted basic research with descriptive statistics rather than advanced regression analysis. We should train planners to communicate with data rather than claim to be pseudo-econometricians. Many of the greatest failures of planning can be directly attributed to planners' inability to understand the fundamentals of quantitative data. (See here for an explanation of the most egregious example.) Reports like "Planning in America" make the situation worse, at least as currently presented. Let's not get excited about the claims made in it.