
39% Of Gamers Who Plan To Buy GT5 Do Not Yet Own A PS3

r505Matt said:
thismeintiel said:

After looking at their site, I have no reason not to believe this survey they have conducted. In 8 years they have become one of the top 25 trackers in the world (#24, to be exact). In fact, their weekly surveys remind me of the weekly political polls conducted here in the US. Everyone seems to quote them all the time without questioning their exact methods. In fact, I know of no tracking site/service that releases its exact sources and methods. What would be the point? Then anyone could do it themselves. They always just release a general, one-sentence explanation about their survey. Oh, something like "OXT’s GamePlan weekly tracking study surveys 1,000 U.S. gamers and buyers including hardcore gamers, casual gamers and everyone in between."

I find it ridiculous that people on a tracking site that does the exact same thing wish to criticize another tracking site. Maybe the survey is saying something people don't like. Something makes me believe that if the PS3 were switched with the 360 or Wii, and the game were Halo Reach or SMG2, there would be a different response. More along the lines of "Awesome, that game is going to sell tons of 360s/Wiis!"

Research studies =/= sales tracking. They offer studies into buyers' habits so their clients can market effectively and plan for release dates and such. That is NOT the same as sales tracking at all, and if they want to keep it all behind closed doors, that's all well and good. But if a study like this one surfaces (not a sales tracking estimate) and just gives a conclusion, then the validity of that study is suspect.

If they said, "The study was done by asking participants if they planned to buy GT5, and if so, if they already owned a PS3. 39% of those that responded yes to the first query responded no to the second," there would be nothing to question. Now maybe the study was done differently, but the omission of such an explanation makes me feel like there is something to hide, and makes me think the study is not truthful. There are false studies EVERYWHERE, and hiding simple information is a great way to earn mistrust.

 

You don't even know how sales tracking works. Do you really think that Brett checks some crazy internet database or calls every single store and asks how many they sold? Do you think he has contracts to have info sent to him every week? NO. It is done through statistics, which is exactly what these guys do.

Dude, c'mon. Any study that isn't based on fact will cost its host firm/company all accreditation. The most they can do is spin. As has been said 10 times already, a company that is 24th in the US in stat tracking is not going to tell you how it collected or analyzed the data. It is a trade secret. The way it works (yes, I will lower myself to sit here and explain statistics to you) is that they gather information from several different sources: some by survey, some by GameFly pre-ordering, some by other methods. Each of these methods is then assigned a score based on how reliable it is, usually from looking at past data. They then look through everyone planning to buy GT5 without having a PS3. This is possible in several ways: 1) surveys with a list of games and systems and check bubbles, asking the recipient to mark which they own and which they plan to own in several different time slots (next month, next 6 months, next year); or 2) GameFly accounts, where this is easy to see because all members have their consoles listed. All of these methods are then examined for probable falsification or errors in collection through a series of mathematical/statistical analyses.

Then they look at the number of people who plan to buy and the number who don't, and measure how far the sample deviates from the population, to see if these people actually belong to the entire population or whether they are a small circle who have accidentally been treated as a real population. Testing and analysis is done on this again: first to make sure their sample can be extrapolated to the total population, and second to determine exactly how much of the population it represents. Through this, 300 people out of a sample of 1,000 can be turned into 1 million, for example.
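To make the extrapolation step concrete, here's a rough Python sketch (the sample counts and the target population are invented for illustration; the real weighting scheme is exactly the trade secret I mentioned):

```python
# Toy extrapolation: scale a sample proportion up to a target population.
# All numbers are invented; a real firm's weighting would be far fancier.

sample_size = 1000         # respondents in one weekly wave
positives = 300            # say they plan to buy GT5 but own no PS3
population = 3_300_000     # gamers the sample is weighted to represent

proportion = positives / sample_size   # 0.30 of the sample
estimate = proportion * population     # ~1 million people

print(f"Sample proportion: {proportion:.0%}")
print(f"Population estimate: {estimate / 1e6:.1f} million people")
```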

Lastly, they test the validity and error. A valid test will be accurate some 95% of the time and be off by no more than 5% in either direction. Then they run tests on the theoretical data by surveying another few sets of 1,000 people, calling, or whatnot. If they predicted that 4 in 10 people who want to buy GT5 don't own a PS3, then for the next few weeks this should stay the same, give or take a certain number of people. This is all then fed back into the validity and error testing to make sure they sync up, and then they release an announcement about what they found.
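For the error part, this is roughly the standard margin-of-error calculation for a sample proportion (illustrative only; I obviously don't know what GamePlan actually runs):

```python
# 95% margin of error for a sample proportion, normal approximation.
# Illustrative only; GamePlan's actual procedure is unknown to me.
import math

n = 1000   # respondents per weekly wave
p = 0.4    # e.g. 4 in 10 intending buyers lack a PS3

moe = 1.96 * math.sqrt(p * (1 - p) / n)   # z ~= 1.96 at 95% confidence
print(f"95% margin of error: +/- {moe:.1%}")   # about +/- 3 points

# A follow-up wave should land within p +/- moe most of the time;
# if it keeps drifting outside that band, something is off.
```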

That is how statistics works, and, more than 95% likely, how this study was done. I wouldn't expect any less from such a high-profile company.



thismeintiel said:
r505Matt said:
thismeintiel said:

After looking at their site, I have no reason not to believe this survey they have conducted. In 8 years they have become one of the top 25 trackers in the world (#24, to be exact). In fact, their weekly surveys remind me of the weekly political polls conducted here in the US. Everyone seems to quote them all the time without questioning their exact methods. In fact, I know of no tracking site/service that releases its exact sources and methods. What would be the point? Then anyone could do it themselves. They always just release a general, one-sentence explanation about their survey. Oh, something like "OXT’s GamePlan weekly tracking study surveys 1,000 U.S. gamers and buyers including hardcore gamers, casual gamers and everyone in between."

I find it ridiculous that people on a tracking site that does the exact same thing wish to criticize another tracking site. Maybe the survey is saying something people don't like. Something makes me believe that if the PS3 were switched with the 360 or Wii, and the game were Halo Reach or SMG2, there would be a different response. More along the lines of "Awesome, that game is going to sell tons of 360s/Wiis!"

Research studies =/= sales tracking. They offer studies into buyers' habits so their clients can market effectively and plan for release dates and such. That is NOT the same as sales tracking at all, and if they want to keep it all behind closed doors, that's all well and good. But if a study like this one surfaces (not a sales tracking estimate) and just gives a conclusion, then the validity of that study is suspect.

If they said, "The study was done by asking participants if they planned to buy GT5, and if so, if they already owned a PS3. 39% of those that responded yes to the first query responded no to the second," there would be nothing to question. Now maybe the study was done differently, but the omission of such an explanation makes me feel like there is something to hide, and makes me think the study is not truthful. There are false studies EVERYWHERE, and hiding simple information is a great way to earn mistrust.

 

How is "39% (actually 33%) of all gamers (surveyed) who plan to buy GT5 do not yet own a ps3" any different from "the study was done by asking participants if they planned to buy GT5, and if so, if they already owned a PS3; 39% of those that responded yes to the first query responded no to the second"?  I'll give you the answer, there is no difference except for the wording.  Both of those statements give you the exact same info. 

Also, GamePlan is a tracking service.  It just offers much more info than just sales.  As such, they are not going to release the info on how exactly they do their research, so as not to be copied.  Just like this site.  Here is their website if you wish to do any further research on them.  http://www.gameplaninsights.com/

@ saicho

There is a huge difference between Nikkom doing a survey of 22 of his friends and family, most of whom probably have similar tastes in gaming, and a completely unbiased company surveying 1,000 different gamers of all tastes and walks of life each week. Also, the number of gamers who said they plan to buy it truly doesn't matter, as not every gamer plans on buying it. And even if the number of people who said they would buy it is small, that would only further prove it is accurate. I say this because if GT5 sells 10 mil, that's still a very small share of the gamers/consoles out there (Wii+360+PS3 as of now = 142.86 mil, so 7% plan to buy it). And of that 10 mil, about 3.3 mil will probably be buying a PS3 between now and around the time it releases.

Of course, 10 mil is just an estimate and it could very well go on to sell more than that.  Also, surveys are never 100% accurate and usually give themselves a 5-10% margin of error.  However, they are still the only way to get a better picture of things as a whole.
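If you want to sanity-check that arithmetic, here's a quick back-of-the-envelope in Python (the 10 mil figure is just my estimate, as I said, and the 33% is the site's corrected number):

```python
# Back-of-the-envelope check on the numbers above.
# Assumptions: ~10 million GT5 sales (an estimate, not a fact) and
# 33% of intending buyers without a PS3 (the corrected survey figure).

installed_base = 142.86e6   # Wii + 360 + PS3 combined
gt5_sales = 10e6            # estimated GT5 sales
no_ps3_share = 0.33         # intending buyers who don't yet own a PS3

print(f"Share of installed base: {gt5_sales / installed_base:.0%}")              # ~7%
print(f"Implied new PS3 buyers: {gt5_sales * no_ps3_share / 1e6:.1f} million")   # ~3.3
```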

Because that's what reputable researchers do: they define the parameters of their studies in very specific terms and leave no questions or speculation as to how the research was done. Maybe that's not how they did their survey; I just gave the most likely example, but it could be anything.

I'm only talking about defining the parameters of a study. I'm not talking about whatever other kind of tracking or sales tracking they might do; that's irrelevant to this point, so why do you keep bringing it up?

The whole point of publishing studies is to NOT hide anything. When the basics aren't clearly defined, such as the parameters and the variables studied, it makes the study harder to trust.

But yes, there is a big difference between a study with a sample size of 1,000 and 22 not-so-random individuals. But his point is that maybe they extrapolated the data and twisted it to fit their needs. Kind of like I could ask 100 doctors if they prefer medicine X over medicine Y. Maybe only 9 say yes, but I quietly report a hand-picked subsample of 10 containing those 9, so: 9 out of 10 doctors agree, medicine X is better than medicine Y. I'm not saying this is the case, but I think (and hope) this is the kind of thing Nikkom was getting at. Not so far-fetched when the study has NO data reported, only the conclusion.
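To show how easy that kind of twisting is, here's a toy simulation (everything fabricated, purely to illustrate the trick):

```python
# Toy illustration of the "9 out of 10 doctors" trick: survey 100 doctors,
# get 9 yeses, then report only a hand-picked subsample of 10.
# All data here is fabricated to show the mechanism, nothing more.

doctors = ["yes"] * 9 + ["no"] * 91        # honest result: 9% prefer X

honest_rate = doctors.count("yes") / len(doctors)

# The twist: a "subsample" of 10 that happens to contain all 9 yeses.
cherry_picked = [d for d in doctors if d == "yes"] + ["no"]
reported_rate = cherry_picked.count("yes") / len(cherry_picked)

print(f"Honest result:   {honest_rate:.0%} of doctors prefer X")   # 9%
print(f"Reported result: {reported_rate:.0%} of doctors agree!")   # 90%
```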

I'm not saying the conclusion is false based on the data they collected; I'm just saying it's invalid without more substantial proof. They could just lie about the data, but then at least it makes them seem more reputable (weird how that works). Most importantly, where did they conduct this survey (other than online)? Only in the USA? Only the EU, or everywhere? The implications of the study change dramatically based on that, and that's a pretty important parameter to neglect to mention. Then again, the article not only got the % wrong, but some of the facts don't line up with what the website says, so maybe it's not GamePlan's fault at all.



r505Matt said:
thismeintiel said:
r505Matt said:
thismeintiel said:

After looking at their site, I have no reason not to believe this survey they have conducted. In 8 years they have become one of the top 25 trackers in the world (#24, to be exact). In fact, their weekly surveys remind me of the weekly political polls conducted here in the US. Everyone seems to quote them all the time without questioning their exact methods. In fact, I know of no tracking site/service that releases its exact sources and methods. What would be the point? Then anyone could do it themselves. They always just release a general, one-sentence explanation about their survey. Oh, something like "OXT’s GamePlan weekly tracking study surveys 1,000 U.S. gamers and buyers including hardcore gamers, casual gamers and everyone in between."

I find it ridiculous that people on a tracking site that does the exact same thing wish to criticize another tracking site. Maybe the survey is saying something people don't like. Something makes me believe that if the PS3 were switched with the 360 or Wii, and the game were Halo Reach or SMG2, there would be a different response. More along the lines of "Awesome, that game is going to sell tons of 360s/Wiis!"

Research studies =/= sales tracking. They offer studies into buyers' habits so their clients can market effectively and plan for release dates and such. That is NOT the same as sales tracking at all, and if they want to keep it all behind closed doors, that's all well and good. But if a study like this one surfaces (not a sales tracking estimate) and just gives a conclusion, then the validity of that study is suspect.

If they said, "The study was done by asking participants if they planned to buy GT5, and if so, if they already owned a PS3. 39% of those that responded yes to the first query responded no to the second," there would be nothing to question. Now maybe the study was done differently, but the omission of such an explanation makes me feel like there is something to hide, and makes me think the study is not truthful. There are false studies EVERYWHERE, and hiding simple information is a great way to earn mistrust.

 

How is "39% (actually 33%) of all gamers (surveyed) who plan to buy GT5 do not yet own a ps3" any different from "the study was done by asking participants if they planned to buy GT5, and if so, if they already owned a PS3; 39% of those that responded yes to the first query responded no to the second"?  I'll give you the answer, there is no difference except for the wording.  Both of those statements give you the exact same info. 

Also, GamePlan is a tracking service.  It just offers much more info than just sales.  As such, they are not going to release the info on how exactly they do their research, so as not to be copied.  Just like this site.  Here is their website if you wish to do any further research on them.  http://www.gameplaninsights.com/

@ saicho

There is a huge difference between Nikkom doing a survey of 22 of his friends and family, most of whom probably have similar tastes in gaming, and a completely unbiased company surveying 1,000 different gamers of all tastes and walks of life each week. Also, the number of gamers who said they plan to buy it truly doesn't matter, as not every gamer plans on buying it. And even if the number of people who said they would buy it is small, that would only further prove it is accurate. I say this because if GT5 sells 10 mil, that's still a very small share of the gamers/consoles out there (Wii+360+PS3 as of now = 142.86 mil, so 7% plan to buy it). And of that 10 mil, about 3.3 mil will probably be buying a PS3 between now and around the time it releases.

Of course, 10 mil is just an estimate and it could very well go on to sell more than that.  Also, surveys are never 100% accurate and usually give themselves a 5-10% margin of error.  However, they are still the only way to get a better picture of things as a whole.

Because that's what reputable researchers do: they define the parameters of their studies in very specific terms and leave no questions or speculation as to how the research was done. Maybe that's not how they did their survey; I just gave the most likely example, but it could be anything.

I'm only talking about defining the parameters of a study. I'm not talking about whatever other kind of tracking or sales tracking they might do; that's irrelevant to this point, so why do you keep bringing it up?

The whole point of publishing studies is to NOT hide anything. When the basics aren't clearly defined, such as the parameters and the variables studied, it makes the study harder to trust.

But yes, there is a big difference between a study with a sample size of 1,000 and 22 not-so-random individuals. But his point is that maybe they extrapolated the data and twisted it to fit their needs. Kind of like I could ask 100 doctors if they prefer medicine X over medicine Y. Maybe only 9 say yes, but I quietly report a hand-picked subsample of 10 containing those 9, so: 9 out of 10 doctors agree, medicine X is better than medicine Y. I'm not saying this is the case, but I think (and hope) this is the kind of thing Nikkom was getting at. Not so far-fetched when the study has NO data reported, only the conclusion.

I'm not saying the conclusion is false based on the data they collected; I'm just saying it's invalid without more substantial proof. They could just lie about the data, but then at least it makes them seem more reputable (weird how that works). Most importantly, where did they conduct this survey (other than online)? Only in the USA? Only the EU, or everywhere? The implications of the study change dramatically based on that, and that's a pretty important parameter to neglect to mention. Then again, the article not only got the % wrong, but some of the facts don't line up with what the website says, so maybe it's not GamePlan's fault at all.

I'm glad that you are trying to be understanding here, but every single company does it this way. Either you deny all of them or accept all of them; there's really no middle ground in this. If a company ever tells you exactly how it did something, it is because it has contract information that nobody else can ever get, or because it is a massive corporation that can afford to hire a fleet of surveyors. Another reason would be if the validity were so important that withholding any information would have severe repercussions on the outcome.

Voting, for example. Voting surveys are explained very clearly because, if they weren't, they would cause a lot of distrust and even change voter behavior. There will always be people who would look at a survey saying "Obama is in the lead" without any explanation, and the conservative base would then rally against what it considered to be a liberal company lying to the population in an attempt to influence others.

What you expect is unreasonable for what it is, and I wish I could somehow enlighten you to that fact.



theprof00 said:
r505Matt said:
thismeintiel said:

After looking at their site, I have no reason not to believe this survey they have conducted. In 8 years they have become one of the top 25 trackers in the world (#24, to be exact). In fact, their weekly surveys remind me of the weekly political polls conducted here in the US. Everyone seems to quote them all the time without questioning their exact methods. In fact, I know of no tracking site/service that releases its exact sources and methods. What would be the point? Then anyone could do it themselves. They always just release a general, one-sentence explanation about their survey. Oh, something like "OXT’s GamePlan weekly tracking study surveys 1,000 U.S. gamers and buyers including hardcore gamers, casual gamers and everyone in between."

I find it ridiculous that people on a tracking site that does the exact same thing wish to criticize another tracking site. Maybe the survey is saying something people don't like. Something makes me believe that if the PS3 were switched with the 360 or Wii, and the game were Halo Reach or SMG2, there would be a different response. More along the lines of "Awesome, that game is going to sell tons of 360s/Wiis!"

Research studies =/= sales tracking. They offer studies into buyers' habits so their clients can market effectively and plan for release dates and such. That is NOT the same as sales tracking at all, and if they want to keep it all behind closed doors, that's all well and good. But if a study like this one surfaces (not a sales tracking estimate) and just gives a conclusion, then the validity of that study is suspect.

If they said, "The study was done by asking participants if they planned to buy GT5, and if so, if they already owned a PS3. 39% of those that responded yes to the first query responded no to the second," there would be nothing to question. Now maybe the study was done differently, but the omission of such an explanation makes me feel like there is something to hide, and makes me think the study is not truthful. There are false studies EVERYWHERE, and hiding simple information is a great way to earn mistrust.

 

You don't even know how sales tracking works. Do you really think that Brett checks some crazy internet database or calls every single store and asks how many they sold? Do you think he has contracts to have info sent to him every week? NO. It is done through statistics, which is exactly what these guys do.

Dude, c'mon. Any study that isn't based on fact will cost its host firm/company all accreditation. The most they can do is spin. As has been said 10 times already, a company that is 24th in the US in stat tracking is not going to tell you how it collected or analyzed the data. It is a trade secret. The way it works (yes, I will lower myself to sit here and explain statistics to you) is that they gather information from several different sources: some by survey, some by GameFly pre-ordering, some by other methods. Each of these methods is then assigned a score based on how reliable it is, usually from looking at past data. They then look through everyone planning to buy GT5 without having a PS3. This is possible in several ways: 1) surveys with a list of games and systems and check bubbles, asking the recipient to mark which they own and which they plan to own in several different time slots (next month, next 6 months, next year); or 2) GameFly accounts, where this is easy to see because all members have their consoles listed. All of these methods are then examined for probable falsification or errors in collection through a series of mathematical/statistical analyses.

Then they look at the number of people who plan to buy and the number who don't, and measure how far the sample deviates from the population, to see if these people actually belong to the entire population or whether they are a small circle who have accidentally been treated as a real population. Testing and analysis is done on this again: first to make sure their sample can be extrapolated to the total population, and second to determine exactly how much of the population it represents. Through this, 300 people out of a sample of 1,000 can be turned into 1 million, for example.

Lastly, they test the validity and error. A valid test will be accurate some 95% of the time and be off by no more than 5% in either direction. Then they run tests on the theoretical data by surveying another few sets of 1,000 people, calling, or whatnot. If they predicted that 4 in 10 people who want to buy GT5 don't own a PS3, then for the next few weeks this should stay the same, give or take a certain number of people. This is all then fed back into the validity and error testing to make sure they sync up, and then they release an announcement about what they found.

That is how statistics works, and, more than 95% likely, how this study was done. I wouldn't expect any less from such a high-profile company.

Ugh, can we stop talking about sales tracking? Behavioral research =/= sales tracking. Talking about the validity of a research study on gamers who plan to buy GT5 is NOT the same as sales tracking.

Everything you're talking about there, I understand, but where's your source? You can speculate and assume all you want, but if they didn't come out and say that, why would you just assume that's their method? If there's a source, post it up front next time, and don't just start speculating as if it's fact. And no, you're not "explaining statistics" to me, you are speculating.

They don't need to make studies based on gamers fit the whole population anyway; they just need to make them fit gamers. I mean, they have to make sure it's not a subgroup, yes, but it doesn't have to fit the general population, though that also depends on where they get the data from. Unless a company wants some kind of information on how to market to non-gamers, maybe? I don't know. Please note, I'm referring to this specific case with the GT5 study, nothing else.

Besides, pulling data from multiple sources for the same sample is a big no-no in this kind of research. Big no-no. They can do multiple studies and combine the data, but they can't pull a bit here and a bit there. That's about as unscientific as you can get. Assigning scores based on reliability is all well and good, but the most reliable method would be to separate the different sources into different sets of data and not combine them. Otherwise they are adding variables; sure, they can assign "relevancy scores," but why even take that chance? There's no guarantee they won't mess up the data that way.
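Roughly what I mean by keeping sources separate (numbers invented, just to show how pooling can hide a split between sources):

```python
# Toy example: two sources disagree, and pooling them hides the split.
# All figures are invented purely to illustrate the point.

sources = {
    "online survey":   {"yes": 200, "n": 500},   # 40% intend to buy
    "rental accounts": {"yes": 50,  "n": 500},   # 10% intend to buy
}

for name, s in sources.items():
    print(f"{name}: {s['yes'] / s['n']:.0%} intend to buy")

# Pooled, the disagreement collapses into a single 25% figure:
total_yes = sum(s["yes"] for s in sources.values())
total_n = sum(s["n"] for s in sources.values())
print(f"pooled: {total_yes / total_n:.0%} intend to buy")
```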

Back to one of your earlier points: I know it seems stupid for them to lie about something like this, since it hurts them in the end, but that doesn't mean you should just assume it's true.

It's just that the defenders of this are assuming the study is accurate, and the offenders (heh) are just questioning its validity. The only thing you can expect from any high-profile company is that it'll do almost whatever it takes to turn a profit. Even that shouldn't be assumed. You take too much for granted and assume it's fact. And please stop being condescending ("yes I will lower myself to sit here and explain statistics to you"); it's gross and hurts your case.



Gran Turismo is Playstation.

Just wait and see what's gonna hit you...



theprof00 said:
r505Matt said:

Because that's what reputable researchers do: they define the parameters of their studies in very specific terms and leave no questions or speculation as to how the research was done. Maybe that's not how they did their survey; I just gave the most likely example, but it could be anything.

I'm only talking about defining the parameters of a study. I'm not talking about whatever other kind of tracking or sales tracking they might do; that's irrelevant to this point, so why do you keep bringing it up?

The whole point of publishing studies is to NOT hide anything. When the basics aren't clearly defined, such as the parameters and the variables studied, it makes the study harder to trust.

But yes, there is a big difference between a study with a sample size of 1,000 and 22 not-so-random individuals. But his point is that maybe they extrapolated the data and twisted it to fit their needs. Kind of like I could ask 100 doctors if they prefer medicine X over medicine Y. Maybe only 9 say yes, but I quietly report a hand-picked subsample of 10 containing those 9, so: 9 out of 10 doctors agree, medicine X is better than medicine Y. I'm not saying this is the case, but I think (and hope) this is the kind of thing Nikkom was getting at. Not so far-fetched when the study has NO data reported, only the conclusion.

I'm not saying the conclusion is false based on the data they collected; I'm just saying it's invalid without more substantial proof. They could just lie about the data, but then at least it makes them seem more reputable (weird how that works). Most importantly, where did they conduct this survey (other than online)? Only in the USA? Only the EU, or everywhere? The implications of the study change dramatically based on that, and that's a pretty important parameter to neglect to mention. Then again, the article not only got the % wrong, but some of the facts don't line up with what the website says, so maybe it's not GamePlan's fault at all.

I'm glad that you are trying to be understanding here, but every single company does it this way. Either you deny all of them or accept all of them; there's really no middle ground in this. If a company ever tells you exactly how it did something, it is because it has contract information that nobody else can ever get, or because it is a massive corporation that can afford to hire a fleet of surveyors. Another reason would be if the validity were so important that withholding any information would have severe repercussions on the outcome.

Voting, for example. Voting surveys are explained very clearly because, if they weren't, they would cause a lot of distrust and even change voter behavior. There will always be people who would look at a survey saying "Obama is in the lead" without any explanation, and the conservative base would then rally against what it considered to be a liberal company lying to the population in an attempt to influence others.

What you expect is unreasonable for what it is, and I wish I could somehow enlighten you to that fact.

Well, with that info, it does seem quite possible that GamePlan released such data and the article just left it out (though I don't know why). That's just another possibility here, though; I'm not assuming any of them is the truth.

No, I'm well aware that what I expect might be unreasonable; no enlightening needed. I'm used to reading scientific studies that explain their results very thoroughly. I've come to trust that, even when so many of those studies conflict with each other, which is usually attributed to short-sightedness in the form of a missed variable. When I encounter studies that are not transparent, it makes me think there's a reason inherent in the study for that lack of openness. True, it could just be the company protecting trade secrets, and I won't deny the possibility, but it seems sketchy to me. Then again, maybe I'm not well-versed enough in the non-scientific research community.



And we should stop quoting, it takes up too much space =P



I think you also have to take into account the number of people buying a PS3 regardless of GT5. I mean, I might plan on buying a PS3 tomorrow and plan on getting GT5, yet not be buying the PS3 for GT5 itself.

So while I'm not saying these percentages are wrong, I think some people are reading them wrong. It's not saying GT5 will sell that many systems; as it says, these buyers just don't have a PS3 yet.

These figures would mean slightly more if the questions asked were:
1. Are you buying GT5?
2. If yes, Do you own a PS3?
3. If No, is GT5 the reason you're buying a PS3?
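Tallying those three questions would get at the number that actually matters. A hypothetical sketch (responses invented):

```python
# Hypothetical tally of the three questions above, to isolate PS3s bought
# *because of* GT5. The responses are invented for illustration.

responses = [
    # (buying_gt5, owns_ps3, gt5_is_reason_for_ps3)
    ("yes", "yes", None),
    ("yes", "no",  "yes"),
    ("yes", "no",  "no"),
    ("no",  None,  None),
    ("yes", "no",  "yes"),
]

buyers = [r for r in responses if r[0] == "yes"]
no_ps3 = [r for r in buyers if r[1] == "no"]
because_of_gt5 = [r for r in no_ps3 if r[2] == "yes"]

print(f"{len(no_ps3) / len(buyers):.0%} of intending buyers lack a PS3")
print(f"{len(because_of_gt5) / len(buyers):.0%} would buy a PS3 for GT5")
```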

Personally, whenever I get a PS3 I'll buy GT5, but I'm not gonna buy a PS3 for GT5 itself.



Shorty11857 said:
I think you also have to take into account the number of people buying a PS3 regardless of GT5. I mean, I might plan on buying a PS3 tomorrow and plan on getting GT5, yet not be buying the PS3 for GT5 itself.

So while I'm not saying these percentages are wrong, I think some people are reading them wrong. It's not saying GT5 will sell that many systems; as it says, these buyers just don't have a PS3 yet.

These figures would mean slightly more if the questions asked were:
1. Are you buying GT5?
2. If yes, Do you own a PS3?
3. If No, is GT5 the reason you're buying a PS3?

Personally, whenever I get a PS3 I'll buy GT5, but I'm not gonna buy a PS3 for GT5 itself.

Interesting point that we've all missed.

Must stop responding to these comments and sleeeeep.



First lesson of consumer psychology: just because someone says they plan to buy something, it doesn't mean they will. Additionally, don't be surprised if a lot of people who do plan on getting it don't get it right away. Numbers like these should always be taken with a grain of salt and expected to be exaggerated.