thismeintiel said:
r505Matt said:
thismeintiel said:

After looking at their site, I have no reason not to believe this survey they conducted. In 8 years they have become one of the top 25 trackers in the world (#24, to be exact). In fact, their weekly surveys remind me of the weekly political polls conducted here in the US. Everyone seems to quote them all the time without questioning their exact methods. In fact, I know of no tracking site/service that releases its exact sources and methods. What would be the point? Then anyone could do it themselves. They always just release a general, one-sentence explanation of their survey, something like "OXT’s GamePlan weekly tracking study surveys 1,000 U.S. gamers and buyers including hardcore gamers, casual gamers and everyone in between."

I find it ridiculous that people on a tracking site that does the exact same thing wish to criticize another tracking site. Maybe the survey is saying something people don't like. Something makes me believe that if the PS3 were swapped with the 360 or Wii, and the game were Halo: Reach or SMG2, there would be a different response, more along the lines of "Awesome, that game is going to sell tons of 360s/Wiis!"

Research studies ≠ sales tracking. They offer studies of buyers' habits so their clients can market effectively and plan release dates and such. That is NOT the same as sales tracking at all, and if they want to keep their methods behind closed doors, that's all well and good. But when a study surfaces (not a sales-tracking estimate), like this one, and gives only a conclusion, then the validity of that study is suspect.

If they had said, "The study was done by asking participants whether they planned to buy GT5 and, if so, whether they already owned a PS3; 39% of those who answered yes to the first question answered no to the second," that would be clear. Now maybe the study was done differently, but the omission of such an explanation makes me feel there is something to hide, and makes the study harder to trust. There are false studies EVERYWHERE, and withholding simple information is a great way to earn mistrust.

 

How is "39% (actually 33%) of all gamers (surveyed) who plan to buy GT5 do not yet own a PS3" any different from "the study was done by asking participants if they planned to buy GT5, and if so, if they already owned a PS3; 39% of those that responded yes to the first query responded no to the second"? I'll give you the answer: there is no difference except the wording. Both statements give you exactly the same info.

Also, GamePlan is a tracking service; it just offers much more info than sales alone. As such, they are not going to release exactly how they do their research, so as not to be copied, just like this site. Here is their website if you wish to do any further research on them: http://www.gameplaninsights.com/

@ saicho

There is a huge difference between Nikkom surveying 22 of his friends and family, most of whom probably have similar tastes in gaming, and an unbiased company surveying 1,000 different gamers of all tastes and walks of life each week. Also, the number of gamers who said they plan to buy it truly doesn't matter, as not every gamer plans on buying it. And even if the number of people who said they would buy it is small, that would only further prove it is accurate. I say this because if GT5 sells 10 mil, that's still a very small share of the gamers/consoles out there (Wii + 360 + PS3 as of now = 142.86 mil, so about 7% plan to buy it). And of that 10 mil, about 3.3 mil will probably be buying a PS3 between now and around the time it releases.
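The arithmetic above is easy to check. A short Python sketch, using the poster's own figures (which are estimates, not confirmed sales data):

```python
# Sanity-check the back-of-the-envelope figures from the post above.
# All inputs are the poster's estimates, not confirmed sales data.

installed_base_m = 142.86   # Wii + 360 + PS3 combined, in millions (poster's figure)
gt5_sales_m = 10.0          # hypothetical GT5 lifetime sales, in millions
no_ps3_share = 0.33         # survey's share of intending buyers who don't own a PS3

share_of_base = gt5_sales_m / installed_base_m        # fraction of all consoles
new_ps3_buyers_m = gt5_sales_m * no_ps3_share         # millions of new PS3 owners

print(f"{share_of_base:.0%} of the combined installed base")
print(f"{new_ps3_buyers_m:.1f} million potential new PS3 owners")
```

This reproduces the ~7% and 3.3 million figures in the post, assuming the survey's 33% holds across all 10 million buyers.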

Of course, 10 mil is just an estimate, and it could very well go on to sell more than that. Also, surveys are never 100% accurate and usually allow themselves a 5-10% margin of error. However, they are still the only way to get a better picture of things as a whole.

Because that's what reputable researchers do: they define the parameters of their studies in very specific terms and leave no questions or speculation as to how the research was done. Maybe that's not how they did their survey; I just gave the most likely example, but it could be anything.

I'm only talking about defining the parameters of a study. I'm not talking about whatever other kind of tracking or sales tracking they might do; that's irrelevant to this point, so why do you keep bringing it up?

The whole point of publishing studies is to NOT hide anything. When something basic isn't clearly defined, such as the parameters and variables of the study, it becomes harder to trust the study.

But yes, there is a big difference between a study with a sample size of 1,000 and one of 22 not-so-random individuals. His point, though, is that maybe they extrapolated the data and twisted it to fit their needs. Kind of like I could ask 100 doctors if they prefer medicine X over medicine Y. Maybe 9 say yes, so: "9 out of 10 doctors agree, medicine X is better than medicine Y." I'm not saying this is the case, but I think (and hope) this is the kind of thing Nikkom was getting at. Not so far-fetched when the study reports NO data, only the conclusion.

I'm not saying the conclusion is false based on the data they collected; I'm just saying it's invalid without more substantial proof. They could just lie about the data, but at least that would make them seem more reputable (weird how that works). Most importantly, where did they conduct this survey (other than online)? Only in the USA? Only the EU, or worldwide? The implications of the study change dramatically based on that, and that's a pretty important parameter to neglect to mention. Then again, the article not only got the percentage wrong, but some of its facts don't line up with what the website says, so maybe it's not GamePlan's fault at all.