

    MIT finds smaller neural networks that are easier to train

    by Christine Fisher
    05.06.2019

    Despite all the advancements in artificial intelligence, most AI-based products still rely on "deep neural networks," which are often extremely large and prohibitively expensive to train. Researchers at MIT are hoping to change that. In a paper presented today, the researchers show that neural networks contain "subnetworks" that are up to 10 times smaller and could be cheaper and faster to train.
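
    The article does not detail the method, but the underlying paper identifies these subnetworks by pruning away the smallest-magnitude weights of a trained network. A minimal sketch of that idea in NumPy, assuming a simple global magnitude-pruning rule (the function name and keep fraction are illustrative, not from the paper):

    ```python
    import numpy as np

    def magnitude_prune_mask(weights, keep_fraction=0.1):
        """Return a binary mask that keeps only the largest-magnitude weights.

        Keeping ~10% of the weights corresponds to a subnetwork roughly
        10 times smaller than the original network.
        """
        flat = np.abs(weights).ravel()
        k = max(1, int(len(flat) * keep_fraction))
        # Threshold at the k-th largest magnitude; everything below is pruned.
        threshold = np.sort(flat)[-k]
        return (np.abs(weights) >= threshold).astype(weights.dtype)

    rng = np.random.default_rng(0)
    w = rng.normal(size=(100, 100))       # stand-in for one trained weight matrix
    mask = magnitude_prune_mask(w, keep_fraction=0.1)
    print(mask.mean())                    # fraction of weights kept, ~0.1
    ```

    In the paper's procedure, the surviving weights are then reset to their original initial values and the small subnetwork is retrained from scratch, which is where the potential training savings come from.
    
    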