About me

I am a Founding Scientist at Liquid AI. I earned my PhD in computational mathematics from the Institute of Computational and Mathematical Engineering (ICME) at Stanford University, where I was advised by Scott Linderman. My research leverages dynamical systems to improve the capability, efficiency, and explainability of sequence models.

Prior to my time at Stanford, I completed master’s degrees in the Leaders for Global Operations (LGO) program at MIT and obtained my bachelor’s degree from Georgia Tech. My industry experience includes machine learning research roles at NVIDIA on the Learning and Perception team, as well as at Dell and Goodyear.

Publications

Birdie: Advancing State Space Models with Reward-Driven Objectives and Curricula
Sam Blouir, Jimmy T.H. Smith, Antonios Anastasopoulos, Amarda Shehu.
Conference on Empirical Methods in Natural Language Processing (EMNLP) 2024.

Towards Scalable and Stable Parallelization of Nonlinear RNNs
Xavier Gonzalez, Andrew Warrington, Jimmy T.H. Smith, Scott W. Linderman.
Advances in Neural Information Processing Systems (NeurIPS) 2024.

Towards a theory of learning dynamics in deep state space models
Jakub Smekal, Jimmy T.H. Smith, Michael Kleinman, Dan Biderman, Scott W. Linderman.
Next Generation of Sequence Modeling Architectures Workshop at ICML 2024.
Selected for Spotlight Presentation (top 10% of accepted papers)

State-Free Inference of State-Space Models: The Transfer Function Approach
Rom N. Parnichkun, Stefano Massaroli, Alessandro Moro, Jimmy T.H. Smith, Ramin Hasani,
Mathias Lechner, Qi An, Christopher Ré, Hajime Asama, Stefano Ermon, Taiji Suzuki,
Atsushi Yamashita, Michael Poli.
International Conference on Machine Learning (ICML) 2024.

Convolutional State Space Models for Long-Range Spatiotemporal Modeling
Jimmy T.H. Smith, Shalini De Mello, Jan Kautz, Scott W. Linderman, Wonmin Byeon.
Advances in Neural Information Processing Systems (NeurIPS) 2023.

Simplified State Space Layers for Sequence Modeling
Jimmy T.H. Smith, Andrew Warrington, Scott W. Linderman.
International Conference on Learning Representations (ICLR) 2023.
Selected for Oral Presentation (top 5% of accepted papers, top 1.5% of all submissions)

Reverse engineering recurrent neural networks with Jacobian switching linear dynamical systems
Jimmy T.H. Smith, Scott W. Linderman, David Sussillo.
Advances in Neural Information Processing Systems (NeurIPS) 2021.

Bayesian Inference in Augmented Bow Tie Networks
Jimmy T.H. Smith, Dieterich Lawson, Scott W. Linderman.
Bayesian Deep Learning Workshop, NeurIPS 2021.