We're looking for a data scientist to deploy machine learning models on edge devices, wearables, and mobile robotics platforms. You'll optimise our models to run efficiently on resource-constrained hardware, focusing on minimal latency, reduced power consumption, and compact model architectures.
Key Responsibilities
- Deploy and optimise machine learning models for embedded systems and edge devices
- Implement efficient inference pipelines in Rust using any of the following frameworks: Burn, Candle, tch-rs, Rust-BERT, nnl, rustorch, or rustyml
- Reduce model size and computational requirements through quantisation, pruning, and distillation
- Minimise latency in real-time systems whilst maintaining model performance
- Profile and optimise memory usage and power consumption for battery-powered devices
- Collaborate with robotics engineers to integrate ML models into hardware platforms
- Benchmark performance across different embedded processors and accelerators
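To give a flavour of the model-compression work involved, here is a minimal sketch of symmetric int8 post-training quantisation, one of the techniques listed above. It uses only the Rust standard library; the function names and the single-scale-per-tensor scheme are illustrative assumptions, not taken from any of the frameworks named in this posting.

```rust
// Illustrative sketch: symmetric per-tensor int8 quantisation.
// Maps f32 weights into [-127, 127] using one shared scale factor,
// shrinking storage 4x at the cost of bounded rounding error.

fn quantise(weights: &[f32]) -> (Vec<i8>, f32) {
    // Choose the scale so the largest-magnitude weight maps to 127.
    let max_abs = weights.iter().fold(0.0f32, |m, w| m.max(w.abs()));
    let scale = if max_abs == 0.0 { 1.0 } else { max_abs / 127.0 };
    let q = weights
        .iter()
        .map(|w| (w / scale).round().clamp(-127.0, 127.0) as i8)
        .collect();
    (q, scale)
}

fn dequantise(q: &[i8], scale: f32) -> Vec<f32> {
    q.iter().map(|&v| v as f32 * scale).collect()
}

fn main() {
    let weights = [0.5f32, -1.2, 0.03, 0.9];
    let (q, scale) = quantise(&weights);
    let restored = dequantise(&q, scale);
    // Per-weight error is bounded by half the scale (one rounding step).
    for (w, r) in weights.iter().zip(&restored) {
        assert!((w - r).abs() <= scale / 2.0 + f32::EPSILON);
    }
    println!("scale = {scale}, quantised = {q:?}");
}
```

In practice the same idea is applied per-channel rather than per-tensor, and frameworks such as Burn or Candle provide their own quantised tensor types; this sketch only shows the arithmetic the role would be tuning.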
Essential Requirements
- Strong data science background with practical ML model development experience
- Proficiency in Rust for systems programming and embedded applications
- Strong Python proficiency for rapid experimentation and prototyping
- Experience with any of Burn, Candle, tch-rs, Rust-BERT, nnl, rustorch, or rustyml, or willingness to work extensively with Rust-native deep learning frameworks
- Experience with model optimisation techniques (quantisation, pruning, knowledge distillation)
- Understanding of embeddings and their efficient representation
- Knowledge of edge ML frameworks and deployment strategies
- Experience with robotics platforms or embedded Linux systems
- Understanding of hardware constraints (memory, compute, power)
- Experience with cloud platforms such as Azure or AWS