This test is performed by a robot with an artificial finger that makes hundreds of precise taps across the entire display. The location of each tap is compared against where the device registered it. If the actual location and the registered location are within 1mm of each other, the tap is displayed as a green dot (a pass). If they differ by 1mm or more, the tap is displayed as a red dot (a failure).
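The pass/fail rule is just a distance check. A minimal sketch in Python (the point values are made up for illustration; the 1mm threshold is the one stated in the test):

```python
import math

# Threshold from the test: a tap passes if the registered location
# is within 1 mm of the actual (robot-commanded) location.
PASS_THRESHOLD_MM = 1.0

def classify_tap(actual, registered, threshold=PASS_THRESHOLD_MM):
    """Return 'green' (pass) or 'red' (fail) for one tap.

    `actual` and `registered` are (x, y) points in millimetres.
    """
    dx = registered[0] - actual[0]
    dy = registered[1] - actual[1]
    return "green" if math.hypot(dx, dy) < threshold else "red"

# Example taps: first is off by ~0.5 mm, second by 1.5 mm.
taps = [((10.0, 20.0), (10.3, 20.4)),
        ((50.0, 80.0), (50.0, 81.5))]
results = [classify_tap(a, r) for a, r in taps]
```

Run over a grid of robot taps, this is exactly what produces the green/red dot maps in the test results.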
Both the iPhone 5s and the iPhone 5c had lots of red dots. The Samsung Galaxy S3 had very few red dots and lots of green dots. But Arnott observes a peculiarity:
I haven’t been able to find official documentation on this, but I think this behavior is intentional compensation being done by Apple. Have you ever tried tapping on an iPad or iPhone while it’s upside-down to you, like when you’re showing something to a friend and you try tapping while they’re holding the device? It seems nearly impossible. The device never cooperates. If the iPhone is compensating for taps based on assumptions about how it is being held and interacted with, this would make total sense. If you tap on a device while it’s upside-down, not only would you not receive the benefit of the compensation, but it would be working against you. Tapping on the device, the iPhone would assume you meant to tap higher, when in reality, you’re upside down and likely already tapping higher than you mean to, resulting in you completely missing what you’re trying to tap.
Now I get why the iPhone doesn’t let you use it upside-down. I agree; I believe this offset in iOS touch recognition was deliberately designed this way by Apple. (I wouldn’t be surprised if, after a subsequent update, the Galaxy S3 or S4 performs just as poorly on this test as the iPhone 5s and 5c did.)
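Arnott's theory can be sketched numerically. Assume (and these numbers are purely hypothetical; Apple documents no such compensation) that fingers tend to land about 1.5mm below where the user is aiming, and that the touch system shifts every registered tap that far back toward the top of the screen. Right-side-up, the two errors cancel; upside-down, the finger error flips sign in device coordinates while the correction doesn't, so the miss doubles:

```python
# All values are assumptions for illustration, not documented iOS behavior.
COMPENSATION_MM = 1.5  # hypothesized upward shift applied by the touch system

def registered_y(aim_y, finger_error_mm=1.5, upside_down=False):
    """Device-coordinate y (mm, growing toward the device's bottom)
    at which a tap aimed at `aim_y` is registered."""
    # Fingers land slightly below the aim point in the *user's* frame;
    # in device coordinates that error flips sign when the device is
    # upside-down relative to the user.
    landing_y = aim_y + (-finger_error_mm if upside_down else finger_error_mm)
    # Hypothesized compensation: shift the registered point toward the
    # device's top, assuming a right-side-up user.
    return landing_y - COMPENSATION_MM

# Right-side-up: the 1.5 mm finger error and the 1.5 mm shift cancel.
miss_normal = abs(registered_y(100.0) - 100.0)                      # 0.0 mm
# Upside-down: the shift adds to the error instead of cancelling it.
miss_flipped = abs(registered_y(100.0, upside_down=True) - 100.0)   # 3.0 mm
```

Under these toy numbers, the compensation turns a tap that would have missed by 1.5mm into a 3mm miss when the device is upside-down, which is consistent with the "working against you" effect Arnott describes.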