Public Access

Community storytelling.

This post was created by a member of the Public Access community. It has not been edited for accuracy or truthfulness and does not reflect the opinions of Engadget or its editors.

Machine Learning Will Define Technology Trends in 2017

Genia Stevens, @geniastevens
10.04.16

Image credit: Pixabay.com

The increasing sophistication of machine learning and its applications will be a defining trend for 2017 in fields that extend far beyond basic technology.

Machine learning is already pervasive in the more pedestrian sectors of the internet. Search engines and social networks use machine learning algorithms to analyze user data and to match users with products, services, and new social connections. Machine learning is also used to create tailored news feeds and to auto-correct spelling errors.

The finance industry and financial markets have used machine learning for many years in basic consumer services, including to determine applicants' creditworthiness and to detect fraud. Machine learning has enriched traders and arbitrage experts who develop sophisticated algorithms to manage stock portfolios and to automate trading with split-second decision-making.

The field of space exploration is placing greater reliance on unmanned robots to explore distant worlds and the far reaches of space. Artificial intelligence algorithms in planetary rovers allow those vehicles to read complex terrain and to maneuver around obstacles without sacrificing functionality or reliability. This technology can be easily adapted to improve the capabilities of driverless cars in urban environments.

The continued expansion of machine learning into different technologies will be a function of developers' understanding of the basic components of machine learning, namely, artificial intelligence algorithms, computing power, and data. The algorithms and computing power are available to practically everyone.

A second and more subtle influence on any machine learning trend will be developers' understanding of how machine learning differs from artificial intelligence and deep learning. Alan Turing is considered by many the father of artificial intelligence, but for forty years after Turing helped break the Enigma cipher, artificial intelligence was little more than a fringe effort by academics to replicate human logic in machines.

Machine learning came into its own in the 1980s and 1990s as researchers developed neural networks that could train themselves, from data, to perform specific tasks. Newer deep learning techniques grew from those machine learning efforts as developers stacked neural networks in layers, using not just an individual network trained for one task but multiple layered networks to make intelligent decisions. Any developer who seeks to apply machine learning to a particular industry needs to consider where on this continuum their efforts should be concentrated.
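To make that continuum concrete, here is a minimal sketch of the kind of trainable, layered network described above: two stacked layers learning the XOR function, a classic task a single linear layer cannot solve. The architecture, learning rate, and training data are illustrative assumptions for this sketch, not anything specified in the post.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: solvable only by a network with at least one hidden layer.
# Inputs are the four 2-bit combinations; targets are their XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Two stacked layers: input (2) -> hidden (4) -> output (1).
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

def forward(X):
    h = sigmoid(X @ W1 + b1)         # hidden-layer activations
    return h, sigmoid(h @ W2 + b2)   # network output

lr = 1.0
losses = []
for _ in range(5000):
    h, out = forward(X)
    losses.append(float(np.mean((out - y) ** 2)))
    # Backpropagation: push the squared-error gradient through both layers.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

h, out = forward(X)
print("initial loss:", losses[0], "final loss:", losses[-1])
```

The point of the sketch is the layering: remove the hidden layer and no amount of training will fit XOR, which is exactly the kind of limitation that stalled early neural network research before multi-layer training became practical.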
