If Men Are From Mars, Women from Venus, Does Research from Jupiter Need to be Questioned? What Questions Should We Ask of Research Reports?

by Marianne Richmond on July 5, 2006

In light of the recent questions raised by Toby Bloomberg's efforts to clarify the methodology used by Jupiter Research to support the information in its June 26, 2006 press release, it seemed to me that there are standards we should hold such research reports to in order to evaluate their usefulness. Jupiter's PR agency, Peter Arnold Associates, released a document stating that Jupiter had conducted research indicating that "35% of large companies plan to institute corporate Weblogs this year….{and} nearly 70% of all site operators will have implemented corporate blogs by the end of 2006."

The press release listed Greg Dowling as the analyst who conducted the research, and David Schatsky, President of Jupiter Kagan, was quoted as saying, "By engaging prospective customers in active dialogue, companies can showcase their expertise and domain knowledge, creating a forum for communication of their strategies and visions. In doing so, companies can generate buzz around their products or services, while eliciting feedback and collaboration from product evangelists." The contact phone number was that of Peter Arnold Associates. No other support, such as definitions, sample composition, sample size, or survey methods, was provided. Toby Bloomberg used the contact information provided to seek additional support for the data, but Jupiter's PR agency refused, indicating that it would provide further information free only to accredited members of the press or to clients. Fard Johnmar then purchased a copy of the report but did not find any further information about the research methodology.

Although Toby questioned the reasonableness of the research results, many bloggers merely passed them along, seemingly accepting them as stated in the Jupiter report. Toby cautioned her readers to "look at Jupiter Research's conclusions with a few grains of salt."

So, when we are reading research results and before we pass them along, what should we look for to support the conclusions provided in a report? Well, as Toby indicated, it is standard operating procedure to release the methodology used in any type of research. Jupiter did not do this in its press release. This should have been red flag #1 for everyone. Knowing the methodology is necessary in order to know whether the results meet the gold standards for designing and evaluating research: adequacy of the sample, and measurement validity and reliability. Simply put: does the research measure what it is intended to measure (adequacy of sample and validity), and can the results be replicated (reliability)?

Most research reports released by Jupiter and other firms seeking to determine how many of those within a certain population behave in a certain manner, plan to behave in a certain manner, or hold certain attitudes, beliefs, or expectations are based on survey research. Since Jupiter does not state this in its press release, it is an assumption that a survey was used. Further, the press release does not indicate who was in the sample or the size of the sample. From the information provided, we might assume that Jupiter surveyed "large companies," but for all we know it could have surveyed its own employees and asked their opinions. We really don't know. We also don't know Jupiter's definition of large companies.
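
To see why sample size matters so much, here is a minimal sketch in Python of the standard margin-of-error calculation for a survey proportion. The sample sizes below are hypothetical; nothing here comes from the Jupiter report.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Margin of error for a sample proportion p with sample size n,
    at the 95% confidence level (z = 1.96)."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical: a survey reports that 35% of respondents plan to blog.
p = 0.35
for n in (50, 200, 1000):  # hypothetical sample sizes
    moe = margin_of_error(p, n)
    print(f"n = {n:4d}: 35% plus or minus {moe * 100:.1f} points")
```

With 50 respondents, the true figure could plausibly fall anywhere between roughly 22% and 48%; with 1,000, the range narrows to about 32% to 38%. Without knowing the sample size, a headline figure like "35%" cannot be evaluated at all.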

Fard Johnmar was able to shed some light on some of these issues by buying the report, but even then the survey questions were apparently not provided. Other issues remained: was the person responsible for the blogging decision the person who was interviewed, and was this held constant across respondents? There is, of course, an important difference between facts and what people believe to be facts, so a statement about the involvement of the person answering the question with the specific issue would be important.

Jupiter provides a methodology page on its web site that mentions the use of executive surveys in its research in general: "Jupiter executive surveys summarize the perspectives of top executives in dozens of market sectors." Again, however, the claim in the corporate blogging research was "… 35 percent of large companies plan to institute corporate Weblogs this year. Combined with the existing deployed base of 34 percent, nearly 70 percent of all site operators will have implemented corporate blogs by the end of 2006." So, is this perspective or fact? Further, regarding the adequacy of the sample: within what industries? Is it skewed toward specific industries? As Fard Johnmar notes, we don't even have a definition for a "site operator."
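
Note also the arithmetic buried in that claim: percentages can only be added when they share the same base, and the release uses "large companies" for one figure and "all site operators" for the other. Here is a rough illustration in Python, using invented population counts purely to show how mixing bases distorts a combined total:

```python
# Jupiter's claim: 35% (plan to blog) + 34% (already blog) = nearly 70%.
# That addition only works if both percentages describe the same population.
# The counts below are invented, purely for illustration.

large_companies = 1000   # hypothetical number of "large companies"
site_operators = 5000    # hypothetical, broader pool of "site operators"

planning = 0.35 * large_companies  # 350 large companies plan to blog
deployed = 0.34 * site_operators   # 1,700 site operators already blog

# As a share of all site operators, the combined figure is nowhere near 70%:
combined = (planning + deployed) / site_operators
print(f"Combined share of site operators: {combined:.0%}")  # prints 41%
```

If the two percentages really do come from different populations, the "nearly 70 percent" figure is an artifact of the addition rather than a finding.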

Again, the basics of research methodology are not provided by Jupiter. So what should we look for when we read such reports? The important questions for evaluation concern the sampling procedures and measurement validity and reliability: does the measurement instrument accurately measure the concept being studied, and can the results be generalized to the population as a whole (validity)? Can the results be replicated (reliability)?

Someone once suggested that one standard for evaluating ethics is whether you would be willing to tell your mother what you did. Similarly, one very simple way to evaluate survey research is to ask yourself whether you could explain how the results were obtained. Beyond that, here are my recommended top 10 questions to ask when evaluating the usefulness of survey reports released by research companies.

1. Are the survey results consistent with other data that is available? Toby noted the inconsistencies between the Jupiter report and the Fortune 500 Business Blogging Wiki. OK, so that covers only Fortune 500 companies, and it is based upon observations made by bloggers and self-reports, while Jupiter used a survey of "large companies." Fard Johnmar noted other inconsistencies with a Makovsky & Company study, also a survey, of defined "senior executives" in Fortune 1000 companies. But this leads to…

2. Are definitions provided for terms pertinent to the research, such as "large companies," so that we know we are all talking about the same companies? And regarding the survey itself…

3. How was the survey delivered? A mail questionnaire? A phone survey? Person to person? Why do we care? We care because if the questions are delivered by mail, we don't really know that they are understood, and this can be a source of error. If the survey is delivered by phone or in person, are the questions closed-ended or open-ended? It matters in terms of how the questions are asked by the interviewer and the way the answers are scored. Again, this is most relevant when the results conflict with other surveys or even our own observations or experiences.

4. Is the sample a representative sample? Are we told the sample size and the relationship of those surveyed to the issue? Sample size is obviously important, as the margin-of-error sketch above shows. If the issue is something the person being interviewed is directly responsible for or directly involved with, then we can give the answer more credence. Are they measuring perspective, or are they measuring concrete plans? What else would it be helpful to know about the sample? Is it representative, so that the results are generalizable? Are the results even meant to be generalized? Jupiter said only that its executive surveys "summarize the perspectives of top executives," which suggests perspective rather than measured fact.

5. Does the research company have a vested interest in the results of the survey?

6. Was the research funded by someone with a vested interest in the results of the study?

7. Are the research results part of a larger study, or were they pulled from a different study? Context is important. Knowing whether the survey was specifically designed to answer the questions whose results are being released is also important.

8. What are the qualifications of those analyzing the data? Are they available?

9. Have there been instances when research results from this company were questioned, and if so, what was the company's response?

10. Do I have a vested interest in the research results that might cause me to overlook inconsistencies or inadequacies of the research?
