| damndl0ser said: Good god man, you proved nothing by posting an online poll. Anyone with half a brain cell will tell you that they aren't exactly scientific. But hay, if it makes you feel better, you got me. LOL. |
Anyone who knows anything about polling would see that the way they ran this one makes it accurate. People always say online polls aren't accurate, but that's just an assumption; nobody ever shows any proof that online polls are inaccurate. Here, read this:
http://www.ehow.com/how_4500487_calculate-margin-error.html explains how to work out how accurate a poll is.
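The basic idea there is the standard margin-of-error formula, MOE = z * sqrt(p(1-p)/n). Here's a quick Python sketch with this survey's sample size plugged in (my own example, not taken from that page), just to show how small the sampling error is at half a million responses:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Standard margin of error for a simple random sample.

    n -- number of respondents
    p -- observed proportion (0.5 is the worst case)
    z -- z-score for the confidence level (1.96 is roughly 95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

# With the 500,000 submissions the site says it kept:
print(round(margin_of_error(500_000) * 100, 3), "%")  # about 0.139%
```

Bear in mind that figure only covers sampling error; it assumes the respondents are a random sample.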
Also, I think you missed this the first time around:
UPDATE
Due to the overwhelming number of questions about this survey (our PR man is pulling out what's left of his hair), we have decided to post answers to the most frequently asked questions on this page.
Was the survey only on No Fuss Reviews?
No. It was mainly delivered from No Fuss Reviews, but also from other satellite sites we own.
How long was the survey conducted over?
The survey was conducted over 280 days. It started on the 7th of July 2009.
Under what criteria was the survey presented?
It was presented on a random 1-in-3 basis (sketched below) to any person who:
a) arrived at our site from any search engine while looking for any console-specific game
b) had not already taken the survey
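In other words, roughly one in three eligible visitors was offered the survey. A rough sketch of that kind of gate (purely illustrative; the field names are made up):

```python
import random

def should_show_survey(visitor):
    """Illustrative 1-in-3 gate for the criteria described above."""
    from_search = visitor.get("referrer_is_search_engine", False)
    console_query = visitor.get("query_is_console_game", False)
    already_taken = visitor.get("has_taken_survey", False)

    if not (from_search and console_query) or already_taken:
        return False
    # Roughly one in three eligible visitors sees the survey
    return random.random() < 1 / 3
```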
How many submissions were logged?
In total we logged 512,322 submissions; submissions after the 500,000 mark were discarded from the final result.
What was the demographic of the survey?
35% answered questions relating to the Wii
34% answered questions relating to the PS3
31% answered questions relating to the 360
How did you try to stop duplicate submissions?
We asked users to enter their email address, which was stored in our database for the sole purpose of checking for duplicate submissions.
We also applied a session limiter, which only allowed users to take the survey once per browser session. Finally, we used randomised cookies to record that a user had already taken the survey, again to help prevent duplicates.
Users who entered a duplicate submission were not notified. The survey also relied on JavaScript.
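So that's three layers of duplicate checking: the stored email, a per-session flag, and a cookie. In very rough terms it would look something like this (illustrative only, not their actual code; plain dicts and a set stand in for the database, session and cookie stores):

```python
def is_duplicate(email, session, cookies, seen_emails):
    """Three-layer duplicate check: stored email, session flag, cookie."""
    if email.strip().lower() in seen_emails:  # email already in the database
        return True
    if session.get("survey_taken"):           # session limiter
        return True
    if cookies.get("survey_token"):           # randomised cookie marker
        return True
    return False

def record_submission(email, session, cookies, seen_emails):
    """Store all three markers after a successful submission."""
    seen_emails.add(email.strip().lower())
    session["survey_taken"] = True
    cookies["survey_token"] = "random-value-per-user"
```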
How was the survey presented?
The survey was presented as an opt-in 'lightbox' dialogue.
Questions were presented one at a time, with each question depending on the previous answer.
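So the questions branch on the previous answer, like a simple decision tree. A toy sketch of that flow (the questions here are made up, just to show the branching):

```python
# Hypothetical branching survey: each answer picks the next question.
QUESTIONS = {
    "start": ("Which console do you own?",
              {"Wii": "wii_q1", "PS3": "ps3_q1", "360": "x360_q1"}),
    "wii_q1": ("How many Wii games did you buy this year?", {}),
    "ps3_q1": ("How many PS3 games did you buy this year?", {}),
    "x360_q1": ("How many 360 games did you buy this year?", {}),
}

def run_survey(ask):
    """Walk the question graph; `ask` is any function that returns an answer."""
    key, answers = "start", {}
    while key:
        text, branches = QUESTIONS[key]
        answer = ask(text)
        answers[text] = answer
        key = branches.get(answer)  # no matching branch ends the survey
    return answers

# e.g. run_survey(input) for an interactive run
```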
Was any other information collected?
IP addresses and browser versions were logged live to try to prevent 'bots' or predefined scripts from filling in the forms.
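Presumably that means something like rate-limiting repeat IPs and throwing out obviously scripted clients; a crude sketch of the idea (my guess at the logic, not their implementation):

```python
from collections import Counter

ip_counts = Counter()

def looks_like_bot(ip, user_agent, max_per_ip=5):
    """Crude live check: too many hits from one IP, or a scripted client."""
    ip_counts[ip] += 1
    if ip_counts[ip] > max_per_ip:
        return True
    ua = (user_agent or "").lower()
    return any(token in ua for token in ("bot", "curl", "wget", "python-requests"))
```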







