The role
You will invent, analyze and deploy the mathematical and algorithmic core of our design engine. Your mandate:
- Design and train physics-informed foundation models: equivariant architectures, conditional generative modeling, physics-informed neural networks (PINNs), learned non-equilibrium importance sampling, and more.
- Derive new objective functions and samplers that marry quantum-level fidelity with GPU-level throughput.
- Ship production-grade code: models that run on our cluster today, not “interesting ideas” parked on a branch.
- Collaborate with simulation physicists and wet-lab scientists to close the design-build-test-learn cycle in weeks, not quarters.
You might be a fit if you
- Hold a PhD (or equivalent experience) in Physics, Applied Math, CS, or a related field.
- Have a strong publication or project record in top ML or life-science venues (NeurIPS, ICML, ICLR, Nature, Science, Nature Methods, etc.).
- Have meaningful experience in several of the following: atomistic ML models, generative protein modeling, molecular dynamics and quantum-chemistry calculations, differential-equation modeling, enhanced-sampling simulation, Monte Carlo methods.
- Write clean, tested code and value reproducibility.
- Thrive in ambiguity, hunt down first principles, and communicate crisply across disciplines.
- Can derive a variational bound before breakfast and implement it in JAX or PyTorch before lunch.
Bonus points
- Publications on symplectic or energy-conserving neural networks, continuous normalizing flows (CNFs), diffusion modeling, or Boltzmann generators.
- An understanding of molecular biology, biocatalysis, and/or protein science.
Why us
- Top-tier cash compensation plus generous equity.
- Hardware to match your ambitions (all the compute you need!).
- Full medical/dental, 401(k) with match, unlimited PTO.