Research Interests

I am interested in randomized iterative methods for optimization and in analyzing machine learning algorithms that operate on compressed data. I have studied data completion for structured data, classification methods that use binary data, and sketch-and-project methods, a framework that includes popular optimization methods such as coordinate descent, randomized Kaczmarz, and stochastic gradient descent. Recently, I have been investigating extensions of these methods to tensors.
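To make the sketch-and-project connection concrete, the following is a minimal illustrative sketch (not code from any of the papers below) of the randomized Kaczmarz method for a consistent linear system Ax = b, with rows sampled proportionally to their squared norms; the function name and parameters are my own choices for this example:

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=1000, seed=0):
    """Iteratively solve a consistent system Ax = b: at each step, pick one
    row i (with probability proportional to its squared norm) and project
    the current iterate onto the hyperplane {y : A[i] @ y = b[i]}."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms2 = np.sum(A**2, axis=1)
    probs = row_norms2 / row_norms2.sum()
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        # Orthogonal projection onto the solution set of row i.
        x += (b[i] - A[i] @ x) / row_norms2[i] * A[i]
    return x
```

Each update touches only a single row of A, which is what makes the method (and its sketch-and-project generalizations) attractive for very large systems.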

While a research intern at Google, I built tools to diagnose sources of accuracy loss arising from integer quantization of machine learning models.
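The internal tooling itself is not public; as a hedged sketch of the underlying idea, the snippet below simulates symmetric per-tensor int8 quantization of a weight tensor and measures the reconstruction error it introduces (all function names and parameters here are illustrative, not Google APIs):

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor int8 quantization: map max |w| to 127."""
    scale = np.max(np.abs(w)) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float tensor from int8 values."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal(1000).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
# Per-element rounding error is bounded by scale / 2.
mse = np.mean((w - w_hat) ** 2)
```

Comparing such per-tensor (or per-layer) reconstruction errors is one simple way to localize where quantization hurts a model's accuracy.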

Preprints
  • R. Gower, D. Molitor, J. Moorman, and D. Needell. “Adaptive sketch-and-project methods for solving linear systems.” arXiv preprint arXiv:1909.03604, Sept. 2019.
  • A. Ma and D. Molitor. “Randomized Kaczmarz for Tensor Linear Systems.” arXiv preprint arXiv:2006.01246, 2020.

Journal Papers
  • J. Moorman, T. Tu, D. Molitor, and D. Needell. “Randomized Kaczmarz with averaging.” BIT Numerical Mathematics, to appear, 2020.
  • D. Molitor and D. Needell. “An iterative method for classification of binary data.” Information and Inference, April 2020.
  • D. Molitor, D. Needell, and R. Ward. “Bias of gradient descent for the hinge loss.” Applied Mathematics and Optimization, April 2020.
  • R. Strichartz, N. Ott, and D. Molitor. “Using Peano curves to construct Laplacians on fractals.” Fractals, Vol. 23, No. 4, Dec. 2015.
  • D. Molitor, M. Steel, and A. Taylor. “The structure of symmetric N-player games when influence and independence collide.” Advances in Applied Mathematics, Vol. 62, 15-40, Jan. 2015.
  • D. Maxin, L. Berec, A. Bingham, D. Molitor, and J. Pattyson. “Is more better? Higher sterilization of infected hosts need not result in reduced pest population.” Journal of Mathematical Biology, June 2014.
Conference Papers
  • M. Gao, J. Haddock, D. Molitor, D. Needell, E. Sadovnik, T. Will, and R. Zhang. “Neural nonnegative matrix factorization for hierarchical multilayer topic modeling.” Proc. IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP), Dec. 2019.
  • J. Haddock, D. Molitor, D. Needell, S. Sambandam, J. Song, and S. Sun. “On inferences from completed data.” Proc. Information Theory and Approximation Workshop, Feb. 2019.
  • G. Plumb, D. Molitor, and A. Talwalkar. “Model agnostic supervised local explanations.” Proc. Neural Information Processing Systems (NeurIPS), Dec. 2018.
  • D. Molitor and D. Needell. “A simple approach to hierarchical classification.” Proc. International Traveling Workshop on Interactions between Low-complexity Data Models and Sensing Techniques (iTWIST), Nov. 2018.
  • D. Molitor and D. Needell. “Matrix completion for structured observations.” Proc. Information Theory and Approximation Workshop, Feb. 2018.
Reports and Articles
  • D. Molitor, D. Needell, A. Nelson, R. Saab, and P. Salanevich. “Classification scheme for binary data with extensions.” Chapter in Compressed Sensing and its Applications, to appear, 2018.