I noticed yet another poll being commissioned on people's views on independence, and which way they were likely to vote come the day. This time the poll was commissioned by the Mail on Sunday, and the polling was carried out by somebody called Progressive Scottish Opinion.
In this poll they claimed that a total of 1,134 people were questioned, with 56% saying they did not support independence, while 27% wanted Scotland to leave the UK and 17% said they did not know.
When asked if their household would be better or worse off if Scotland was independent, 49% said they thought they would be worse off while 23% believed they would be better off. A total of 13% thought there would be no impact and 16% did not know if they would be better or worse off.
In effect, No has a 29-point lead over Yes at this stage.
Interestingly enough, the Times had commissioned a similar poll from Panelbase just a week before, and it showed remarkably different results, albeit with No still in the lead, by 9 points. That is not a lot at all: since a swing moves voters from one side to the other, a five-point swing would cut a nine-point lead to minus one, and see Yes carry the day.
So given that the polls seem to be wildly all over the place depending on who the pollster is, what on earth is going on, and can we gather anything from all these polls which are constantly trumpeted at us by the media?
Taking these last two polls, both seemingly polling over a thousand people, we can immediately say that Progressive Scottish Opinion is not primarily a polling company but a marketing company; it is not registered in the same way as other polling companies are, and it does not give any methodology for us to examine. That in itself is perhaps not all that important for most folk; it just means we have no information other than these figures to go by.
Panelbase is a recognised pollster, and must lay out its findings and methodology for examination.
Other recognised pollsters will come up with different results again, depending on the questions asked and the way those questions are presented.
It is because methodology and weightings change from pollster to pollster that I rarely take much notice of polls, except to look for trends and the direction of travel in public opinion.
We can do this not by comparing different pollsters' findings, but by examining each individual pollster's previous polling and seeing what difference there has been between their most recent findings and the last comparable poll they produced.
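The within-pollster comparison described above can be sketched in a few lines. The figures below are invented for illustration only; they are not real poll results from Panelbase or anyone else:

```python
# Illustrative only: track the direction of travel within a single
# pollster's own series of polls, rather than comparing across pollsters.
# The percentages here are made up for the example.

panelbase_series = [
    {"Yes": 36, "No": 46, "Don't know": 18},  # earlier poll
    {"Yes": 38, "No": 45, "Don't know": 17},  # most recent poll
]

previous, latest = panelbase_series[-2], panelbase_series[-1]

# Point change for each option between the two comparable polls.
movement = {option: latest[option] - previous[option] for option in latest}

print(movement)  # {'Yes': 2, 'No': -1, "Don't know": -1}
```

Because both snapshots come from the same pollster with the same methodology, the movement figure is meaningful even if the headline percentages disagree with other companies' polls.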
All in all though, I am disinclined to go shouting from the rooftops a positive result for what I support, or weep in disconsolation at what appears to be massive support for the opposing point of view.
The media of course will make hay and come to all sorts of spurious conclusions about each poll that comes out, but we would be wise to take that all with a pinch of salt.
Recently I got a lot more insight into polling when I trained briefly with Ipsos Mori as a field market researcher. It was an eye-opening experience.
Each market research company has its own preferred methods of polling. Some, like Ipsos Mori, rely heavily on face-to-face or telephone polling, focussing on specific areas of the country. Others rely on online polling, and some carry out polls with the same people time and time again, so they aren't actually getting any fresh perspectives from other people.
Ipsos Mori works on a quota system, where interviewers are asked to seek out quotas of people in all the various social groupings. They may be asked to speak to X number of under-24s in the CDE social classes, X number of 60+ year-olds in the AB class, X number of women working part time, or any number of other combinations. Once interviewers have reached their quota of interviews for a particular grouping, they must ignore any further interviews with that group and focus on the others. In effect, a lot of opinion is discarded if deemed outwith quota.
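The quota mechanism described above amounts to a simple filter, which can be sketched as follows. The group labels and quota sizes are invented for illustration; they are not Ipsos Mori's actual quotas:

```python
# Hypothetical sketch of quota-based interviewing: a response is kept only
# while the quota for the respondent's grouping is still open; anything
# beyond the quota is discarded, however the person would have answered.

quotas = {
    "under-24 CDE": 3,   # illustrative quota sizes, not real figures
    "60+ AB": 2,
    "women part-time": 2,
}

counts = {group: 0 for group in quotas}
kept, discarded = [], []

# A day's interviews: (grouping, answer), in the order they happened.
interviews = [
    ("under-24 CDE", "No"),
    ("60+ AB", "Yes"),
    ("under-24 CDE", "Yes"),
    ("under-24 CDE", "No"),
    ("60+ AB", "No"),
    ("under-24 CDE", "Yes"),   # quota for this group already full
    ("women part-time", "Yes"),
]

for group, answer in interviews:
    if counts[group] < quotas[group]:
        counts[group] += 1
        kept.append((group, answer))
    else:
        # opinion deemed outwith quota is simply dropped
        discarded.append((group, answer))

print(len(kept), len(discarded))  # 6 1
```

Note that the fourth under-24 CDE interview is thrown away purely because the quota was already full, which is exactly the discarding of opinion described above.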
We also have to bear in mind that polling organisations are well aware of who is hiring them, and have a good idea of what sort of results their clients would like to see. So while questions are not meant to be leading, depending on how a series of questions is asked, the responses may be marginally influenced.
Polling companies often have a specific pool of people they can turn to in order to guarantee an interview, such as those registered with Panelbase… so all we can really gain from that is how that specific group's opinions change over a period of time. For Scottish independence, as an instance, the results would show a direction of travel from No to Yes or vice versa, an increase in Don't Knows, or a movement by either side towards Don't Know. But it doesn't bring any new people's opinions into play.
So… if the polls are hyped, take them with the pinch of salt that they deserve, and don't be influenced by what political groups and the media might trumpet about them.