There's one big, glaring reason why you don't see neural networks in mobile devices right now: power. Many of these brain-like artificial intelligence systems depend on large, many-core graphics processors to work, which just isn't practical for a device meant for your hand or wrist. MIT has a solution in hand, though. It recently revealed Eyeriss, a chip that promises to bring neural networks to very low-power devices. Although it has 168 cores, it consumes a tenth the power of the graphics processors you find in phones -- you could stuff one into a phone without worrying that it will kill your battery.
Eyeriss' trick is to minimize data movement wherever possible. Each core (which effectively serves as a neuron) has its own memory, and compresses data before sending it anywhere else. It also keeps the amount of work to a minimum: nearby cores can talk directly to each other, so they don't need to go to a central source (say, main memory) if the data they need is close at hand. On top of that, a special delegation circuit gives each core as much work as it can handle without having to go back and fetch more data.
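The reuse idea can be sketched in a toy model. This is purely illustrative, not Eyeriss' actual design: the `Core` class, its scratchpad, and the fetch counting are all invented here to show why keeping data near the cores cuts trips to main memory.

```python
# Toy illustration (hypothetical, not the real Eyeriss architecture):
# each core checks its own scratchpad, then its neighbors, and only
# falls back to expensive main-memory reads as a last resort.

class Core:
    def __init__(self):
        self.local = {}              # per-core scratchpad memory
        self.main_memory_reads = 0   # count of costly off-core fetches

    def read(self, addr, main_memory, neighbors=()):
        # 1. Check this core's own scratchpad first.
        if addr in self.local:
            return self.local[addr]
        # 2. Ask nearby cores directly, skipping main memory.
        for n in neighbors:
            if addr in n.local:
                value = n.local[addr]
                self.local[addr] = value   # keep a local copy for reuse
                return value
        # 3. Only now pay for a main-memory access.
        self.main_memory_reads += 1
        value = main_memory[addr]
        self.local[addr] = value
        return value

main_memory = {i: i * i for i in range(8)}
a, b = Core(), Core()

# Core "a" fetches some values; core "b" then reuses them straight
# from its neighbor, never touching main memory.
for addr in range(4):
    a.read(addr, main_memory)
for addr in range(4):
    b.read(addr, main_memory, neighbors=(a,))

print(a.main_memory_reads, b.main_memory_reads)
```

In this sketch only the first core ever touches main memory; the second gets everything from its neighbor, which is the kind of traffic reduction that lets a many-core chip run on a phone-class power budget.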
There's no mention of how soon you could expect Eyeriss' technology in something you can buy, but the impact on machine learning could easily be huge. Your smartphone (or any other low-power device) could handle AI-based processing locally, rather than farming it out to an internet server where latency and security are problems. Many of the devices you own would be better at adapting to new situations or learning about their environments. And it's worth noting that one of NVIDIA's senior researchers helped make the chip -- this tech could easily become a practical reality before long.