Stop spending money on bogus surveys.
Instead of conducting surveys that are unscientific and easy to manipulate, conduct a scientific poll with a large sample and really find out what citizens want from their government.
L. M. PytlikZillig commented
It may be useful to keep in mind that scientific surveys and web-based open surveys have different strengths and weaknesses.
The city has been conducting scientific surveys as well, and they cost much more than a web-based survey. However, the scientific survey is often quite short and conducted by phone, which is not a great medium for sharing a lot of information (e.g., the implications of cutting various programs). So you end up getting the off-the-cuff opinions of a random set of people who may not know much about the implications of their choices. This then leads people to ask, "What would people decide if they knew more about those implications?" Web-based surveys allow you to provide that information to those interested enough to take the survey (rather than to random persons who may or may not have much interest) and then see what people say. The community conversation on Saturday, June 18, will take that even further by having people discuss and ask questions about the implications of different choices.
While online surveys can be manipulated, they are getting more and more sophisticated at distinguishing serious responses from questionable ones. Our software, for example, will tell us if responses were given in just a few seconds and from the same computer (which may mean ballot stuffing). While we do not quickly jump to conclusions about which responses are valid and which are not, we do look carefully at whether the overall picture painted by the data changes if different types of data are included or excluded. We can also look at the data over time and see how respondents' answers change (e.g., we did see an increase in funding preferences for economic development during the few days after a certain notice was sent out, but then it tapered off again, and overall the ranking of economic development did not rise above other favored programs).
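The kind of screening described above can be sketched in a few lines. This is only an illustration of the general idea, not the actual software the city uses; the field names, the 30-second threshold, and the "machine id" are all assumptions made for the example:

```python
# Illustrative sketch: flag web-survey responses that were completed
# suspiciously fast or that came from a computer that has already
# submitted one. Field names and the threshold are assumptions.

def flag_responses(responses, min_seconds=30):
    """Return, for each response id, a list of reasons it was flagged
    (an empty list means the response was not flagged)."""
    seen_machines = set()
    flags = {}
    for r in responses:
        reasons = []
        if r["duration_seconds"] < min_seconds:
            reasons.append("too fast")
        if r["machine_id"] in seen_machines:
            reasons.append("duplicate machine")
        seen_machines.add(r["machine_id"])
        flags[r["id"]] = reasons
    return flags

responses = [
    {"id": 1, "machine_id": "A", "duration_seconds": 240},
    {"id": 2, "machine_id": "A", "duration_seconds": 5},
    {"id": 3, "machine_id": "B", "duration_seconds": 180},
]
print(flag_responses(responses))
```

The point of flagging rather than deleting is exactly what the comment describes: you can then compute the summary results with and without the flagged responses and see whether the overall picture changes.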
In addition, on this year's survey we've asked for written explanations, which we are analyzing against the survey results to see whether the emphases in the written answers match up with the funding preferences.
But truly, no method is perfect, so our strategy has been to use multiple methods and see whether similar pictures emerge from each. When different pictures emerge, we question how accurate each method is and have to investigate further. But when the same picture emerges under different conditions, we become more confident that we really are getting an accurate picture of the results.
Does anyone remember the license plate survey? I think that's happening here...