theprof00 said:
r505Matt said:

Because that's what reputable researchers do: they define the parameters of their studies in very specific terms and leave no questions or speculation as to how the research was done. Maybe that's not how they did their survey; I just gave the most likely example, but it could be anything.

I'm only talking about defining the parameters of a study, not whatever other tracking or sales tracking they might do; that's irrelevant to this point, so why do you keep bringing it up?

The whole point of publishing studies is to NOT hide anything. When something as basic as the parameters and variables of the study isn't clearly defined, the study becomes harder to trust.

But yes, there is a big difference between a study with a sample size of 1000 and one with 22 not-so-random individuals. His point, though, is that maybe they extrapolated the data and twisted it to fit their needs. Kind of like if I asked 100 doctors whether they prefer medicine X over medicine Y, only 9 said yes, and I then reported just those 9 plus a single dissenter: 9 out of 10 doctors agree, medicine X is better than medicine Y. I'm not saying this is the case, but I think (and hope) this is the kind of thing Nikkom was getting at. It's not so far-fetched when the study reports NO data, only the conclusion.
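Just to make that "twisting" concrete, here's a rough sketch with completely made-up numbers (nothing to do with GamePlan's actual data): survey 100 people, get 9 who agree, then report only a hand-picked 10 of them.

```python
# Purely hypothetical numbers, not GamePlan's actual data.
# 100 doctors surveyed; only 9 actually prefer medicine X.
responses = [True] * 9 + [False] * 91

honest = sum(responses)
print(f"Honest result: {honest} of {len(responses)} prefer X "
      f"({honest / len(responses):.0%})")          # 9 of 100 (9%)

# Now "twist" it: report only the 9 who agreed plus a single dissenter.
reported = [r for r in responses if r] + [r for r in responses if not r][:1]
agree = sum(reported)
print(f"Reported result: {agree} of {len(reported)} prefer X "
      f"({agree / len(reported):.0%})")            # 9 of 10 (90%)
```

Same raw responses, two very different headlines, and without the methodology published you can't tell which one you're being shown.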

I'm not saying the conclusion is false based on the data they collected, I'm just saying it can't be taken as valid without more substantial proof. They could just lie about the data, but publishing it would at least make them seem more reputable (weird how that works). Most importantly, where did they conduct this survey (other than online)? Only in the USA? Only the EU? Worldwide? The implications of the study change dramatically based on that, and that's a pretty important parameter to neglect to mention. Then again, the article not only got the percentage wrong, but some of its facts also don't line up with what the website says, so maybe it's not GamePlan's fault at all.

I'm glad that you are trying to be understanding here, but every single company does it this way. Either you distrust all of them or you accept them; there's really no middle ground in this. If a company ever tells you exactly how they did something, it's because they have contract information that nobody else can ever get, or because they're a massive corporation that can afford to hire a fleet of surveyors. Another reason would be if validity mattered so much that withholding any information would have severe repercussions on the outcome.

Voting, for example. Polling methodology is explained very clearly, because otherwise polls would cause a lot of distrust and could even change voter behavior. There will always be people who see a poll saying "Obama is in the lead" with no explanation attached, and the conservative base would then rally against what they consider a liberal company lying to the population in an attempt to influence others.

What you expect is unreasonable for what this is, and I wish I could somehow make you see that.

Well, with that info, it does seem quite possible that GamePlan released such data and the article just left it out (though I don't know why). That's just another possibility, though; I'm not assuming any of them is the truth.

No, I'm well aware that what I expect might be unreasonable, no enlightening needed. I'm used to reading scientific studies that very thoroughly explain their results. I've come to trust that approach, even when so many of those studies conflict with each other; the conflicts are usually attributable to short-sightedness, i.e. a missing variable. When I encounter studies that aren't transparent, it makes me think there's a reason inherent in the study for that lack of openness. True, it could just be the company protecting trade secrets (I won't deny the possibility), but it seems sketchy to me. Maybe I'm just not well-versed enough in the non-scientific research community, though.