optofidelity

Latest

  • Here's how Google checks for lag on your Android phone

    by Jon Fingas
    06.28.2015

    Yes, Google hates lag on smartphones as much as you do -- enough so that the search giant has a robot dedicated to spotting that delay between your finger input and what happens on screen. Meet the Chrome TouchBot, an OptoFidelity-made machine that gauges the touchscreen latency on Android and Chrome OS devices. As you can see in the clip below, the bot's artificial digit pokes, prods and swipes the display in a series of web-based tests (which you can try yourself) that help pinpoint problems in both code and hardware. This isn't the only gadget monitoring device lag at Google, but it could be the most important given how much the company's software revolves around touch. Don't be surprised if this automaton boosts the responsiveness of Mountain View's future platforms.
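
    The web-based tests the bot runs essentially time how long the browser takes to respond to a touch. As a rough, hypothetical illustration of that idea (not Google's or OptoFidelity's actual test code), a page script can compare a touch event's timestamp with the timestamp of the next animation frame; note that this only captures in-browser delay, whereas the TouchBot measures the full finger-to-pixel lag from outside the device:

    ```typescript
    // Hypothetical sketch of a touch-to-frame latency probe, assuming the browser
    // reports high-resolution event timestamps comparable to requestAnimationFrame's.
    const samples: number[] = [];

    window.addEventListener("touchstart", (event: TouchEvent) => {
      const touchTime = event.timeStamp; // when the browser received the touch

      // Measure the delay until the next frame is produced after the touch.
      requestAnimationFrame((frameTime: DOMHighResTimeStamp) => {
        samples.push(frameTime - touchTime);
      });
    }, { passive: true });

    // Median input-to-frame delay, in milliseconds, over the collected taps.
    function medianLatencyMs(): number {
      const sorted = [...samples].sort((a, b) => a - b);
      return sorted.length ? sorted[Math.floor(sorted.length / 2)] : NaN;
    }
    ```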

  • About that iPhone touchscreen accuracy test

    by John-Michael Bond
    10.28.2013

    Recently, you may have seen a report from the automated testing company OptoFidelity regarding the accuracy of the touch panel on the iPhone 5c, 5s and the Samsung Galaxy S3. The test was performed by having a robotic finger make hundreds of taps across each phone's display. The tap location was then compared against where the touch panel registered the tap. If the tap was registered within 1 mm of where the robot finger actually hit, the tap was shown on OptoFidelity's display as a green dot. You can see the results in the image above. But that may not be the end of the story.

    Nick Arnott over at Neglected Potential has written a blog post questioning the roughly 75 percent inaccuracy rate OptoFidelity's testing showed for the iPhones. Someone reading the study would get the impression that the iPhone 5c and 5s have incredibly inaccurate screens, but something about the results didn't sit well with Arnott. So he took a closer look. He noticed that the main areas where the iPhone was getting the lowest scores were the left and right edges of the keyboard, areas OptoFidelity specifically mentions in its study.

    Looking more closely at where exactly the issues arose, Arnott noticed something that could easily explain the robot finger's odd tapping issues. Looking at the displacement of taps as you move away from the green area, there's a definite pattern: the farther you move from the easily tappable area, the greater the "inaccuracy" of the tap. But the inaccuracy skews in a way that puts the target slightly closer to the starting position of your thumb (likely the most frequently used digit for tapping). As your thumb stretches out from your hand, which is probably positioned near the bottom of the phone, the portion of your thumb that actually comes into contact with the screen changes. Your perception of the screen also shifts slightly; the higher you tap on the screen, the less likely you are to be viewing it at exactly a 90-degree angle.

    These are factors the automated test does not account for. The robot doing the test views its tap target perpendicular to the screen, and it taps at a perpendicular angle every time. That isn't generally how people interact with their phones.

    While OptoFidelity's test does raise some questions about the iPhone's touch panel, Arnott makes an interesting point: testers should always be asking questions. When OptoFidelity researchers received the test results, did they look at the methodology and how it affected touch recognition on the panel? Did the test take into account the different sizes of the iOS devices and the Galaxy S3, and how they fit differently in users' hands? Arnott is not saying that OptoFidelity's tests are necessarily wrong, though he has issues with the results. He is merely reminding all of us that when we test a product, or even just read a report, we should be questioning the findings. That's how products get better, and how the questions a report raises get answered.
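
    For a sense of what the pass/fail rule above boils down to, here is a hypothetical sketch of the 1 mm check as a simple distance comparison. The 326 ppi figure used for the pixel-to-millimetre conversion is an assumption for illustration, not a number taken from OptoFidelity's report:

    ```typescript
    // Hypothetical sketch of the 1 mm accuracy rule described above.
    interface Point { x: number; y: number; } // coordinates in pixels

    const PPI = 326;                  // assumed display density, for illustration only
    const MM_PER_INCH = 25.4;
    const PX_PER_MM = PPI / MM_PER_INCH;

    // Distance, in millimetres, between where the robot tapped and where the
    // touch panel reported the tap.
    function tapErrorMm(actual: Point, registered: Point): number {
      return Math.hypot(registered.x - actual.x, registered.y - actual.y) / PX_PER_MM;
    }

    // Green dot if the registered tap lands within 1 mm of the true contact point.
    function isAccurate(actual: Point, registered: Point): boolean {
      return tapErrorMm(actual, registered) <= 1;
    }
    ```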

  • The machine apocalypse can wait; robot busy playing Angry Birds

    by Michael Grothaus
    05.08.2011

    I've got to admit, I've never seen the appeal of Angry Birds. The game just isn't that fun to me. I know a lot of readers disagree with that, and now I know of one robot that would disagree as well. Yes, Finnish company OptoFidelity has built a robot with the sole purpose of playing Angry Birds. Impressive? Yes. Kinda stupid? Yes. But then again, it's better than building robots that can kill you, which other companies are doing right at this moment. Speaking of which, OptoFidelity has created a video, appropriately titled "Mac vs. Machine," which shows the robot getting its game on. Check it out below. [via TechCrunch]