Following on from some thoughts on how to select one, here are some pointers on your respective roles.
When it comes to quant, there's a big difference between 'full service' and 'field and tab'.
Full service is doing pretty much everything from your initial brief to their final presentation of findings and analysis.
A word about 'analysis'. If they're going to present stuff to you and the client, make sure both of you are very aware of where analysis ends and thoughts on how to APPLY that analysis start.
If they present well, they'll have the client in the palm of their hand. Right at that moment, they have the knowledge and the authority; you're the charlatan. If they make any recommendations on what to do next, it's very likely to stick. So I'd be very clear about where you want them to stop - and if they're going to do some recommending, make sure you're aware of what they are going to say, and hopefully have a hand in it.
I got caught out by a very good researcher who wasn't used to working with planners. First project, he went too far with his recommendations - and lost us the client. 'Why do we need to pay you when we can just go to the researcher direct?' was the reasoning. Their campaign bombed. I take no pleasure from this; it's just that the research guy concluded that the objective should be brand awareness. That was a natural assumption, since not enough people went for the brand as a matter of course. He hadn't appreciated that emotional 'image' advertising wouldn't work, since people were after evidential assurances in a market with little trust between consumer and brand.
That goes for a quant agency or a qual one.
Field and tab is the service where you write the questionnaire and analyse the raw data. It's labour intensive and, while it's cheaper, I really wouldn't try to write the questions. Leave it to the experts. But if you've had lots of practice analysing data, the saving is one thing; it might make sense for 'craft' purposes too. It's more work, but can be well worth it.
Analysis is not an exact science. It's interpretation - and that relies on what you look for, and your background knowledge. You may find something the research agency will not. So, if they're doing the analysis and report, I would still ask for the raw data and go over it with a fine-tooth comb. Not only could you find something they've missed, you don't know where the project may go... and something may suddenly become relevant. You won't get that chance if you just get the summary. Look harder.
Here's an example of looking a little bit deeper. I pitched for Stanley Tools many moons ago. We won through a bit more rigour. Nothing else.
They'd spent a fortune on an in-depth qual/quant study that was used primarily for NPD. They'd used the summary to highlight where the tools available were falling down - like staple guns needing too much force on the trigger, sacrificing accuracy. But they didn't have a clue about what the communications objective should be. The answer was in all the data they hadn't even looked at. And the other agencies didn't bother looking.
In case you're interested, the trade pros thought the gear was for amateurs; they didn't know it was all developed in tandem with professionals, to solve professional problems. And since the pros influenced the DIY market - without getting them to re-appraise the brand first, a DIY campaign would be doomed. So we talked to the pros in places the amateur could see.
And further down the line, we had to encourage growth in the single female DIY market. Back to the data. Those same numbers showed that women felt very empowered doing it for themselves, more than blokes, not least since their experience (and prejudice) was that male 'experts' were all talk and no trousers. And they never finished on time. The campaign sort of wrote itself.
Anyway. As far as quant is concerned, expect them to:
Guide on methodology, sample size, stimulus material. Expect them to DO the questionnaire, fieldwork and admin, along with data processing.
Decide if you want them to do the analysis and report.
YOU need to supply the research objectives and timing. Decide the sample size and target. Make sure the things the client wants to find out are included. You will supply the stimulus material. And it's up to you if you do the analysis and report. But make sure THEY analyse. YOU recommend.
With qualitative, we'll just cover groups for now.
THEY guide on number and composition of groups, stimulus and any projective techniques.
They DO the recruitment and all the admin. They do the fieldwork.
These days, they do the group moderation, analysis and debrief. And the same goes here for what's analysis and what's recommendation.
YOU make sure they know the objectives and timings, and you decide the final structure and number of groups. You make sure they know what your hypothesis is and what the current knowledge is (and that means YOU NEED A HYPOTHESIS!). You organise stimulus - and deliver it on time.
Two big points on qual:
The right stimulus can make or break the group. Work hard on getting it right. More on this in a later post.
These days it's less common, and it's even more to think about, but it's really worth shouldering the extra responsibility and doing some of your own groups. Seriously - if you can, you should. You can't get a proper feel for emotions and reactions unless you're in there with them. This matters more if you're getting feedback on strategy ideas, or creative work, rather than gathering new information.
No one should know how the ideas are meant to work better than you, so no one should be able to use feedback better. And welcome negative feedback. It will be the start of being able to make it better - as in more effective.
I didn't do this once and saw a campaign die behind a viewing facility. It was work about people liking different things - each execution had a different person with a different quirk. They said they hated it, since they didn't like what that person did. But the moderator didn't probe about how we could make individualism work... and it was crucial, since they said their selection criteria were highly personal. I know you know the solution's simple, but many (our client included) take respondent quotes very literally.
And finally, coming back to quant, if you are analysing the data, here are some pointers.
- Mark up a questionnaire with the 'total' results. It will give a broad picture of topline findings, and an idea of where to probe first.
- Go through each table and 'read across' from the 'Total' figure - watch for anything that stands out as wildly different. Like men's results being different to women's, Hello magazine readers different to non-readers. (There's a rough sketch of this step after the list.)
- After you've looked at a few, look for any patterns beginning to emerge. You'll soon see different 'trends' coming out. Why is that? What are the connections? What are the contradictions?
- In short - read the data, analyse it into information - the patterns.
- What story does that begin to tell you?
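If it helps to picture that 'read across' step, here's a minimal sketch in Python with pandas. Everything in it is invented for illustration - the respondent data, the column names and the ten-point threshold - the point is simply comparing each subgroup's figure with the Total and flagging the big gaps.

# A rough sketch of 'reading across' a table, using made-up survey data.
import pandas as pd

# Hypothetical respondent-level data: one row per respondent, one breakout
# column ('gender') and one yes/no answer we care about ('prefers_brand').
df = pd.DataFrame({
    "gender":        ["male", "female", "female", "male", "female", "male"],
    "prefers_brand": [True,   False,    False,    True,   False,    True],
})

# The 'Total' figure: percentage of all respondents who prefer the brand.
total_pct = df["prefers_brand"].mean() * 100

# Read across: the same figure for each subgroup, compared with the Total.
by_group = df.groupby("gender")["prefers_brand"].mean() * 100
for group, pct in by_group.items():
    diff = pct - total_pct
    flag = "  <-- stands out" if abs(diff) >= 10 else ""  # threshold is arbitrary
    print(f"{group:8s} {pct:5.1f}%  (Total {total_pct:.1f}%, diff {diff:+.1f}){flag}")

In practice you'd run the same comparison for every breakout in the tables - readership, age, region and so on - and note which gaps keep turning up. Those are the patterns the next two points are about.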
A great and highly useful post for a lot of people I would imagine....not just me. I've just browsed through it quickly and will have to come back to it later when I have more time. Cheers NP.
Posted by: fredrik sarnblad | October 03, 2007 at 08:32 AM
Thanks Fred. Like all these posts it's probably too long, but I want it to be a proper reference; sometimes there are no short cuts.
Posted by: np | October 03, 2007 at 09:45 AM
"The answer was in all the data they hadn't even looked at. And the other agencies didn't bother looking."
I've made a career out of this. I call it turd polishing.
Posted by: Lee | October 04, 2007 at 11:28 AM
You said it Lee
Posted by: np | October 05, 2007 at 03:44 PM
"it's really worth shouldering more responsibility and to do some of your own groups ... if you can, you should. You can't get a proper feel for emotions and reactions unless you're in there with them."
Good that you say IF YOU CAN, because while most planners can run groups it's not as easy as it might seem ...
... and the flip side of getting in touch with emotions etc is being rather biased about one idea or other - some planners act as if they're there to sell, not to understand (sorry, but that is our experience)
We do lots of ad research and we share moderating with planners sometimes (ie they do some groups, we do others) - which works well when planner and researcher have a good (open, honest) relationship
Good, practical points about the planning and research issue - there's a few good papers on the subject, can look them out for you...?
Posted by: kevin | October 08, 2007 at 01:19 PM
Thanks for the feedback Kevin, the more views the better.
I think you've a good point about planners thinking their job is just getting the work through... personally I think much of that comes down to craft. Proper digging and thinking about what you find and what it means - as opposed to making sure you find what you need to prove your point.
Whichever way you cut it, you're dead right when you say it comes down to open relationships.
And any papers you have would be greatly appreciated.
Posted by: np | October 08, 2007 at 02:17 PM