I don't suppose I'm the only one whose heart sinks at the end of every course, activity, indeed just about anything these days, when the ubiquitous "feedback" forms appear. Or when requests to spend anything from "5-10 minutes" to "should take no more than 30-40 minutes" of (precious!) time filling in surveys pop into my inbox - or, infuriatingly, dance across my screen when I land on a website (no! I haven't got time! and I haven't even had a chance to look at your website yet, because you interrupted me before I could!). On occasion I have had the equivalent of half a day's work's worth of feedback and surveys to do - each one may only take a little time, but taken together they add up to too much.
Once you start noticing, you will realise just how many "news" stories are actually based on surveys - either produced by consultants or, just possibly (surely not), by lazy journalists asking a couple of people in the office. These surveys can't be representative (you won't be asked to take part in a telephone survey if you are registered with the TPS, for instance). They are a written version of the vox pop - a cheap way of filling a newspaper or website.
How reliable are the responses? Put another way, how are the questions phrased? Surveys with an end in view often contain leading questions (easy to spot if you are looking for them - but plenty of people aren't). Thus, a survey on behalf of a local authority which asks its residents to "list in order of importance: child protection, social services for the elderly, public libraries" has an agenda behind it. Don't be surprised if such a survey is used as evidence that residents won't mind if their library closes.
We most certainly do need to listen to our users or customers, and the very ubiquity of the feedback forms should ensure that we get a broader view than if we only had complaints to consider. But, a plea from the other side of the fence - feedback fatigue! Regurgitation, even! Ill-considered responses rushed in a hurry to get the form-filling done are not really very much use to anyone (and I've been as guilty as anyone of galloping through them). I wonder (from the point of view of the person who might be reading the feedback or survey) whether it ought to be impossible to get away with a sweeping generalisation or nebulous reply - should we always ask for concrete examples to back up what is being said? (I'd find this useful from the library point of view - though maybe not when I'm the one filling in the surveys!)
Those of you who, like me, work in higher education institutions will be aware that it's the time of year when the results of the National Student Survey are about to burst upon us. Food, or feedback, for thought!