Duke University has introduced "Autism & Beyond", an app that uses an emotion-detection algorithm to track visible signs of autism in children. The team wants to use the front-facing iPhone camera to test whether a child's reactions to videos can support an early diagnosis. Another app, from Oregon Health & Science University, will study moles and melanoma using iPhone images. Participants from around the world will be able to contribute to the research by photographing their moles over time to document any changes. From these collections of photographs, the goal is to build an algorithm that could screen for and detect melanoma.
Johns Hopkins, on the other hand, has moved away from the iPhone. Its app, EpiWatch, will test whether the wearable sensors on the Apple Watch can predict and detect seizures. The first leg of this epilepsy study gives users quick, one-touch access to the app, which will collect data from the watch's accelerometer and heart rate sensor while simultaneously alerting a designated contact or caregiver. EpiWatch will maintain a log of epileptic episodes and will also let patients compare notes with other participants. For the millions of epilepsy patients across the country, the researchers hope the app will offer a new way to monitor seizures.