Blog post for 9/18 readings
First off: Please take a moment to complete my silly survey (inspired by Richard) and I’ll be sure and report back on results next week: Joni’s Silly Survey
I had a total flashback to November 2000 while reading this week's article ("The value of online surveys"). I was a reporter covering the Florida recount and delving into the differences in performance among voting methods: touch-screen, punch card and optical scan. The verdict was a revelation to me in the technology age. The best way to get accurate results that truly reflect voters' intent is old-fashioned written ballots, on which a voter circles the name of a candidate and the ballot is then counted by two individuals – one reading the ballot and one tallying it, each checking the other. Why? The variables of choice are small (just circle one name; if you don't, the ballot is rejected), and human eyes can discern nuances of voter intent that a computer may never even register (e.g., the voter puts a check mark next to the name instead of circling it). Hence Florida went with optical scan after the 2000 recount, as the technology that most closely simulates that structure – though concerns remain about implementation. For example, while paper ballots are created, new recount rules make it highly unlikely they will ever be reviewed directly to double-check the machines' counting accuracy.
BIAS: Reading the Hofstra University article, the same tests apply to surveys, of course. Surveys conducted by human interviewers would seem to come closer to the ideal voting model I described above – but only if the answer choices are similarly limited and not subjective. Interviewer-administered surveys always run the risk of introducing questioner's bias. Online surveys need to account for Internet-user sampling bias (not everyone is on the web). And here is another source of bias that isn't unique to the Internet: the pollster's own. The City of St. Petersburg is contemplating using online polling tools to collect citizen input on a new pier (after a previous plan was voted down in the August election). But there are big concerns about the volunteer pollster's bias, since that pollster helped the opposition in the August campaign. Tampa Bay Times article on the plan
NEW STANDARD: The upsides of online surveys – cost, reach and completion rates – point to their becoming the default standard for most clients' survey needs. The Consumer Reports anecdote alone spoke to their power even in early adoption: half the cost and double the response.
PRIVACY: I also think concern about security and privacy in these surveys will only grow over time. How does the industry truly assure respondents that their answers are anonymous, when clearly so much of the power of this technology lies in connecting data? On more than one occasion I have started a so-called anonymous survey only to abandon it on further thought that my answers might somehow be traced back to me in a way I would not want. The NYTimes Magazine Target story suggests I am not paranoid…
1) Online surveys have amazing potential for political polling, which is currently considered most valid when done by telephone. Do you think we're close to having the technology and/or broad enough demographics on the web to do it, considering that seniors are the most likely voters but also the most likely non-Internet users?
2) Considering the services we explored, when do you think feedback tools are better than surveys for an Internet vendor? And why?
3) And for the more technologically savvy: What software exists to cloak me on the Internet so I can, for example, fill out surveys while blocking my URL info?