r505Matt said:
Research studies =/= sales tracking. They offer studies into buyers' habits so their clients can market effectively and plan for release dates and such. That is NOT the same as sales tracking at all, and if they want to keep it all behind closed doors, that's all well and good. But if a study like this one surfaces (not a sales tracking estimate) and gives only a conclusion, then the validity of that study is suspect. If they had said, "The study was done by asking participants whether they planned to buy GT5, and if so, whether they already owned a PS3; 39% of those who answered yes to the first question answered no to the second," then I'd have far fewer doubts. Now maybe the study was done differently, but the absence of any such explanation makes me feel like there is something to hide, and makes me think the study is not truthful. There are false studies EVERYWHERE, and withholding simple methodology is a great way to earn mistrust.
|
You don't even know how sales tracking works. Do you really think that Brett checks some crazy internet database or calls every single store and asks how many they sold? Do you think he has contracts to have info sent to him every week? NO. It is done through statistics, which is exactly what these guys do.
Dude, c'mon. Any study that isn't based on fact will cost its host firm/company all accreditation. The most they can do is spin. Like it has been said 10 times already, a company that is 24th in the US in stat tracking is not going to tell you how it collected or analyzed the data; it is a trade secret. The way it works (yes, I will lower myself to sit here and explain statistics to you) is that they gather information from several different sources: some by survey, some from GameFly pre-orders, some by other methods. Each source is then assigned a reliability weight, usually based on how well it has matched past data. They then look for everyone planning to buy GT5 without owning a PS3. This is possible in several ways: 1) surveys that list games and systems with check bubbles, asking recipients to mark which they own and which they plan to own in several time slots (next month, next 6 months, next year); or 2) GameFly accounts, where this is easy to see because all members have their consoles listed. All of these responses are then screened for probable falsification or collection errors through a series of mathematical/statistical analyses.
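To make the "each source gets a reliability weight" step concrete, here's a minimal sketch. The source names, estimates, and weights are all made up for illustration; this is just one plausible way a firm might blend several noisy sources into one figure.

```python
# Hypothetical sketch: combine per-source estimates of the fraction of
# would-be GT5 buyers who don't yet own a PS3, weighting each source
# by how reliable it has been historically. All numbers are invented.

def weighted_estimate(estimates, weights):
    """Weighted average of per-source estimates using reliability weights."""
    total_weight = sum(weights.values())
    return sum(estimates[s] * weights[s] for s in estimates) / total_weight

# Each (hypothetical) source's independent estimate of the fraction.
estimates = {"mail_survey": 0.42, "gamefly_profiles": 0.37, "phone_poll": 0.40}

# Weights reflecting how well each source matched past outcomes.
weights = {"mail_survey": 0.5, "gamefly_profiles": 0.3, "phone_poll": 0.2}

combined = weighted_estimate(estimates, weights)
print(round(combined, 3))  # 0.401
```

A more trusted source (the mail survey here) pulls the blended number toward its own estimate; a flakier one contributes less.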
Then they compare the number of people who plan to buy against the number who don't, and check how the sample varies against the population, to see whether these people actually represent the entire population or are just a small circle who have accidentally been treated as one. Testing and analysis is done on this again: first to make sure their sample can be extrapolated to the total population, and second to determine exactly what share of the population it represents. Through this, in a sample of 1,000 people, 300 responses can be scaled up to 1 million, for example.
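The scaling step above is just multiplying the sample proportion by the population size. A tiny sketch, reusing the post's 300-out-of-1,000 example (the population figure is an assumption picked so the numbers match):

```python
# Hypothetical sketch: extrapolate a sample proportion to a population
# total. 300 "yes" answers out of 1,000 sampled, scaled to an assumed
# buyer population of ~3.33 million, gives roughly 1 million people.

sample_size = 1000
positive = 300
population = 3_333_333  # assumed size of the relevant buyer population

p_hat = positive / sample_size          # sample proportion
projected = round(p_hat * population)   # scaled to the whole population

print(p_hat, projected)  # 0.3 1000000
```

The hard part in practice isn't this arithmetic; it's the representativeness testing that justifies treating 1,000 respondents as a stand-in for millions.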
Lastly, they test the validity and error. A valid test will be accurate some 95% of the time and be off by no more than 5% in either direction. Then they check the theoretical data by surveying another few sets of 1,000 people, calling, or whatnot. If they predicted that 4 in 10 people who want to buy GT5 don't own a PS3, then for the next few weeks this should stay the same, give or take a certain number of people. All of this is then fed back into the validity and error testing to make sure the numbers sync up, and then they release an announcement about what they found.
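The "95% of the time, within 5%" claim corresponds to a standard 95% confidence interval for a proportion. A minimal sketch using the normal approximation (not necessarily the firm's actual method), which shows that a 1,000-person sample lands comfortably inside that 5% bound:

```python
import math

# Sketch: 95% confidence interval for a sample proportion via the
# normal approximation (z ≈ 1.96). With p_hat = 0.4 and n = 1000,
# the margin of error comes out to about ±3 percentage points.

def margin_of_error(p_hat, n, z=1.96):
    """Half-width of the approximate 95% CI for a proportion."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

p_hat = 0.4   # predicted: 4 in 10 would-be GT5 buyers lack a PS3
n = 1000      # respondents per follow-up survey wave

moe = margin_of_error(p_hat, n)
print(f"{p_hat:.0%} ± {moe:.1%}")  # 40% ± 3.0%
```

So if follow-up waves keep landing between roughly 37% and 43%, the original estimate "syncs up"; if they drift outside that band, something in the collection or weighting is off.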
That is how statistics works, and, more than 95% likely, how this study was done. I wouldn't expect any less from such a high-profile company.