Collaborate with our ADAS stack engineering team to test and deploy end-to-end algorithms for mass-production vehicles
Collaborate with our simulation team to deploy data-driven generation algorithms that enable highly efficient, automated tooling
Conduct research on architecture design, robustness, safety, and the incorporation of language modalities into our differentiable stack
Work closely with Research Scientists and interns on high-quality research publications to submit to top-tier conferences
Key Requirements:
Experience with end-to-end autonomy, multi-view perception, 3D reconstruction, diffusion models for sensor data generation, and/or closed-loop behavior generation
A passion for scalable, data-driven autonomy for real-world systems
Strong research skills and the ability to work both independently and collaboratively on projects
Nice to Have:
MSc or PhD in machine learning or computer vision with applications in autonomy and robotics, or a closely related field
Hands-on experience with imitation/reinforcement learning, behavior prediction, and closed-loop simulation
Experience training ML models on large-scale data and/or applying language modalities in AV stacks
Passion for building and shipping customer-focused software frameworks or tools