The brittleness of deep learning models hampers their deployment in real-world applications such as transportation and airport security. Most work focuses on developing accurate models that deliver only point estimates, with no further information on model uncertainty or confidence. Ideally, a learning model should compute the posterior predictive distribution, which contains all information about the model output. We cast density tracking in neural networks as a particle filtering problem; particle filters are a powerful class of numerical methods for optimal estimation in non-linear, non-Gaussian systems, and a well-established alternative to Markov chain Monte Carlo algorithms with proven convergence and performance guarantees. In this paper, we advance a particle filtering framework for neural networks in which the predictive output is a distribution: its mean serves as the point-estimate decision, and its variance quantifies the model's confidence in that decision. Our framework shows increased robustness under noisy conditions. Additionally, the predictive variance increases monotonically with decreasing signal-to-noise ratio (SNR), reflecting lower confidence, i.e., higher uncertainty. This paper serves as a pioneering proof of concept intended to enable the development of a theoretical understanding of robust neural networks.
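To illustrate the core idea, the sketch below runs a minimal bootstrap particle filter on a hypothetical scalar toy system (this toy model, its dynamics, and all parameters are illustrative assumptions, not the paper's neural-network formulation). It shows how an approximate predictive distribution, represented by particles, yields both a point estimate (its mean) and a confidence measure (its variance).

```python
import numpy as np

# Toy bootstrap particle filter (illustrative only, not the paper's model):
# track a scalar state x_t with nonlinear dynamics and noisy observations,
# then report the predictive mean (point estimate) and variance (confidence).
rng = np.random.default_rng(0)

N = 1000                       # number of particles
T = 20                         # time steps
process_std, obs_std = 0.5, 1.0

def dynamics(x):
    # an assumed nonlinear transition, chosen for illustration
    return 0.9 * x + np.sin(x)

# Simulate a ground-truth trajectory and noisy observations.
x_true, ys = 0.0, []
for _ in range(T):
    x_true = dynamics(x_true) + rng.normal(0.0, process_std)
    ys.append(x_true + rng.normal(0.0, obs_std))

# Bootstrap filter: propagate, weight by likelihood, resample.
particles = rng.normal(0.0, 1.0, N)
for y in ys:
    particles = dynamics(particles) + rng.normal(0.0, process_std, N)
    log_w = -0.5 * ((y - particles) / obs_std) ** 2   # Gaussian log-likelihood
    w = np.exp(log_w - log_w.max())                   # stabilize before normalizing
    w /= w.sum()
    idx = rng.choice(N, size=N, p=w)                  # multinomial resampling
    particles = particles[idx]

mean = particles.mean()        # point-estimate decision
var = particles.var()          # model confidence in that decision
print(f"estimate {mean:.2f} +/- {np.sqrt(var):.2f} (truth {x_true:.2f})")
```

In the full framework the particles would live in the network's output (or weight) space rather than a scalar state, but the mean/variance readout of the predictive distribution works the same way.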