ApDeepSense : Deep Learning Uncertainty Estimation Without the Pain for IoT Applications
Introduction :
The paper focuses on providing uncertainty estimates for deep learning outputs. Uncertainty estimates
are important for IoT applications, which interact directly with the real world. Existing approaches
require intensive computation at test time, which becomes a bottleneck in resource-constrained
IoT environments. To tackle this challenge, the authors propose ApDeepSense, which is shown
to reduce execution time by 88.9% and energy consumption by 90% compared with the
state-of-the-art methods.
Method :
The paper proposes novel layer-wise approximations and builds on previous results that establish an equivalence between the layers of dropout-trained neural networks and the layers of deep Gaussian processes. The resulting output distribution is approximated as a multivariate Gaussian, which is used to quantify the uncertainty of predictions.
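To make the layer-wise idea concrete, here is a minimal sketch (NumPy/SciPy, not taken from the paper) of propagating a mean and a diagonal variance through one dropout + fully-connected + ReLU layer in closed form, so a single deterministic forward pass produces both a prediction and an uncertainty estimate. The layer sizes, keep probability, and the diagonal-Gaussian and moment-matching simplifications are illustrative assumptions, not the authors' exact derivation.

import numpy as np
from scipy.stats import norm

def linear_dropout_moments(mu, var, W, b, keep_prob=0.9):
    # Mean/variance of W @ (dropout(x) / keep_prob) + b when x ~ N(mu, diag(var)).
    # Inverted dropout leaves the mean unchanged; the Bernoulli mask adds variance.
    mean_out = W @ mu + b
    second_moment_in = (var + mu ** 2) / keep_prob   # E[(x_i * m_i / keep_prob)^2]
    var_out = (W ** 2) @ (second_moment_in - mu ** 2)
    return mean_out, var_out

def relu_moments(mu, var):
    # Moment-match a = max(0, z) for z ~ N(mu, var), elementwise (rectified Gaussian).
    sigma = np.sqrt(np.maximum(var, 1e-12))
    alpha = mu / sigma
    mean_out = mu * norm.cdf(alpha) + sigma * norm.pdf(alpha)
    second = (mu ** 2 + var) * norm.cdf(alpha) + mu * sigma * norm.pdf(alpha)
    return mean_out, np.maximum(second - mean_out ** 2, 0.0)

# Toy two-layer network with made-up weights: one deterministic sweep yields a
# predictive mean and variance (the uncertainty estimate) with no sampling.
rng = np.random.default_rng(0)
x = rng.normal(size=8)                                   # deterministic input
W1, b1 = 0.3 * rng.normal(size=(16, 8)), np.zeros(16)
W2, b2 = 0.3 * rng.normal(size=(1, 16)), np.zeros(1)

mu, var = relu_moments(*linear_dropout_moments(x, np.zeros(8), W1, b1))
mu, var = linear_dropout_moments(mu, var, W2, b2)
print("predictive mean:", mu, "predictive variance:", var)

The appeal of such a closed form is that the cost stays close to one ordinary forward pass, which is what makes the approach attractive on resource-constrained devices.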
Thoughts/Comments :
1. The paper considers the resource challenges of IoT and brings attention to an overlooked area of
research.
2. It demonstrates a large improvement in inference time, but there is little discussion of the actual training
required, in particular the increase in training time needed for convergence with dropout training.
3. The authors discuss the challenge of deriving a closed-form output distribution as a potential roadblock to
extending the approach to convolutional and recurrent neural networks.
4. The most important gain from the paper is that it removes the need for sampling and re-training of
the neural networks to provide uncertainty estimates, which can be very expensive in IoT environments (see the sketch below for what the sampling-based baseline involves).
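For context on point 4, here is a rough sketch (illustrative layer sizes and sample count, not from the paper) of the Monte Carlo dropout baseline that ApDeepSense avoids: uncertainty comes from repeating the forward pass with fresh dropout masks, so inference cost grows roughly linearly with the number of samples, which is exactly what is expensive on IoT devices.

import numpy as np

def mc_dropout_predict(x, W1, b1, W2, b2, keep_prob=0.9, n_samples=100, rng=None):
    # n_samples full forward passes per input, each with a fresh Bernoulli mask.
    if rng is None:
        rng = np.random.default_rng(0)
    outputs = []
    for _ in range(n_samples):
        mask = (rng.random(x.shape) < keep_prob) / keep_prob   # inverted dropout
        h = np.maximum(W1 @ (x * mask) + b1, 0.0)
        outputs.append(W2 @ h + b2)
    outputs = np.stack(outputs)
    return outputs.mean(axis=0), outputs.var(axis=0)           # predictive mean / variance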
-Aakhila Shaheen
Good point about the impact of dropout on training time. The main idea, that stochastic nets trained with dropout can give you Gaussian estimates, seems to have been published elsewhere, but they have tweaked it for the IoT environment. A hard paper to evaluate given the amount of mathematics!