Meta-Learner with Sparsified Backpropagation

Paithankar, Rohan and Verma, Ayushi and Agnihotri, Manish and Singh, Sanjay (2019) Meta-Learner with Sparsified Backpropagation. In: International Conference on Cloud Computing, Data Science & Engineering (Confluence), 10/01/2019, Amity University.



In today's world, Deep Learning is an area of research with ever-increasing applications. It uses neural networks to bring improvements in areas such as speech recognition, computer vision, natural language processing, and several automated systems. Training deep neural networks involves careful selection of appropriate training examples, tuning of hyperparameters, and scheduling of step sizes; finding a proper combination of all these is a tedious and time-consuming task. In recent times, a few learning-to-learn models have been proposed that can learn such settings automatically. The time and accuracy of the model are exceedingly important. A technique named meProp was proposed to accelerate Deep Learning with reduced over-fitting: it sparsifies back propagation, which reduces the computational cost. In this paper, we propose an application of meProp to learning-to-learn models, focusing learning on the most significant, consciously chosen parameters. We demonstrate an improvement in accuracy of the learning-to-learn model with the proposed technique and compare its performance with that of the unmodified learning-to-learn model.
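The core idea behind meProp's sparsified back propagation is to keep only the k largest-magnitude components of a backpropagated gradient vector and zero out the rest, so that only the corresponding parameters are updated. A minimal sketch of that top-k step, assuming a NumPy vector representation (the function name `meprop_topk` is illustrative, not from the paper's code):

```python
import numpy as np

def meprop_topk(grad, k):
    """Illustrative sketch of meProp's sparsification step:
    retain the k largest-magnitude entries of a gradient vector,
    zero out the rest, so only those parameters are updated."""
    sparse = np.zeros_like(grad)
    # Indices of the k entries with the largest absolute value.
    top_idx = np.argsort(np.abs(grad))[-k:]
    sparse[top_idx] = grad[top_idx]
    return sparse

# Example: keep the 2 largest-magnitude gradient components.
g = np.array([0.10, -0.50, 0.30, 0.05])
print(meprop_topk(g, 2))  # only -0.50 and 0.30 survive
```

In the full method this selection is applied during the backward pass, so the cost of propagating gradients through earlier layers also shrinks with k.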

Item Type: Conference or Workshop Item (Paper)
Uncontrolled Keywords: Meta-Learner, Back Propagation
Subjects: Engineering > MIT Manipal > Information and Communication Technology
Depositing User: MIT Library
Date Deposited: 13 Sep 2019 09:08
Last Modified: 13 Sep 2019 09:08
