🌎

Kyle R. Chickering’s Homepage



I am a research scientist at Luma working on pre-training and dynamics. I completed my PhD in applied mathematics in the summer of 2025 under the supervision of Prof. Muhao Chen at UC Davis, where I was a member of the LUKA Lab.

My research interests are roughly as follows:

  1. Scaling model training efficiently and effectively from small (<100M parameters) to large (1T+ parameters) models.
  2. Understanding the mathematical dynamics of training large models.
  3. Improving both the computational and statistical efficiency of model training.

I was the founding research engineer at an R&D startup building efficient algorithms for generative vision models. I have worked in industry as a researcher and engineer building foundation models at scale for video and text. I was also an intern on the LLM pre-training team at the MBZUAI Institute of Foundation Models, where I studied μP and scaling laws.

I previously worked on shock formation in analytic fluid mechanics (partial differential equations, mathematical analysis). Before that I was a software engineer working on, variously, data-center automation and classical computer vision.


🐙 GitHub

🔗 LinkedIn


💬

“Mathematics is what is done by mathematicians and mathematicians are those who do mathematics.” - Richard W. Hamming

Other Links

🐎 UCD Math Homepage

📝 UCD Math Prelim Notes

🌊 John Hunter PDE Notes

🎓 Past teaching