Back to the methodology: unfortunately it is poorly described, and what is described is weak. Here is all the report says about its methods:
This research was conducted in the spring of 2012. Collective Strength, an Austin-based firm specializing in outreach and communications, designed the questionnaire and performed the analytics. Harris Interactive, one of the world's foremost survey research firms, reviewed the questionnaire to ensure objectivity and fielded the study during the month of March 2012.

Online-only surveys are potentially problematic and unlikely to be representative of the overall population. More critically, the methodology explains precisely nothing about how subjects were recruited, which is a major potential source of error. An additional problem is that online surveys suffer a high level of attrition as the survey goes on. This is a fairly long survey, and I doubt all 1,308 respondents finished the whole thing. Since the report relies heavily on meaningless pie charts and does not supply the 'n' for any chart or table, we simply don't know how complete the responses are. None of the companies listed in the section provide any additional details on their websites. When the APA releases reports like this, they are assumed to be high-quality, authoritative reports. "Planning in America," however, is not an example of good planning research.
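To see concretely why the missing n matters, here is a quick sketch of the worst-case margin of error for a simple random sample (which an opt-in online panel is not, so these figures are a best case; the specific smaller-n comparison is my illustration, not a figure from the report):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case (p = 0.5) margin of error for a proportion
    at 95% confidence, assuming simple random sampling."""
    return z * math.sqrt(p * (1 - p) / n)

# With the full reported sample, the margin is modest...
print(round(margin_of_error(1308) * 100, 1))  # ~2.7 percentage points

# ...but if attrition left only a few hundred respondents on a
# late question, the uncertainty nearly doubles.
print(round(margin_of_error(400) * 100, 1))   # ~4.9 percentage points
```

This is exactly why per-question sample sizes belong in every table and chart: a result based on 400 completions carries much wider uncertainty than one based on 1,308, and the reader has no way to know which they are looking at.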
This survey was conducted online within the United States by Harris Interactive on behalf of Collective Strength and their client the American Planning Association between March 8-12, 2012, among 1,308 U.S. residents age 18 years or older.
Reports like this bother me in part because I teach planning research courses and would be distraught if any of my students turned in a report of this quality (without additional explanation, anyway). But the larger issue is that low-quality research--whether it confirms or opposes your personal preferences--reduces the signal-to-noise ratio. Reports like "Planning in America" are noise that clouds our ability to understand critical issues and policy (the signal in this case). At the very least, the full methodology should be explained, the pie charts jettisoned, and sample sizes included in tables and graphs. As for planning research, reports like this are why I argue planning education should focus primarily on numerical literacy and well-crafted basic research with descriptive statistics rather than advanced regression analysis. We should train planners to communicate with data rather than claim to be pseudo-econometricians. Many of the greatest failures of planning can be directly attributed to planners' inability to understand the fundamentals of quantitative data. (See here for an explanation of the most egregious example.) Reports like "Planning in America" make the situation worse, at least as currently presented. Let's not get excited about the claims made in it.