Senior Machine Learning Researcher

Remote, USA · Full-time
About the position

Are you an experienced machine learning researcher ready to push the limits of AI in one of the toughest domains: maritime autonomy? At Tocaro Blue, your expertise in designing, training, and deploying custom ML models will directly advance our foundational perception stack, ProteusCore. As a Senior ML Researcher, you will be the lead architect of Radar (and secondary EO/IR) models for object detection, semantic segmentation, and tracking. You'll design algorithms capable of distinguishing vessels, land, shoreline structures, wakes, and markers in dynamic maritime environments where off-the-shelf models fall short.

Your work will fuel products used by:

• Defense customers developing USVs/ASVs for the U.S. Navy.
• Commercial OEMs bringing advanced marine ADAS and autopilot features to market.

This role is an opportunity to define the ML foundations of maritime autonomy, where perception evolves from situational awareness, to navigation assistance, to full autonomy.

Responsibilities

• Invent and refine custom deep learning architectures for Radar and EO/IR imagery, with an emphasis on semantic segmentation and temporal tracking, not just detection.
• Develop multi-stage ML pipelines (context + characteristic models, segmentation + classification) tailored to low-SNR Radar returns.
• Train models on proprietary large-scale datasets (millions of Radar samples and camera sequences) with design-of-experiment methods for data collection and annotation.
• Optimize and deploy models to resource-constrained edge hardware (CPU-only and ARM64 platforms), including C++ inference layers.
• Advance fusion-aware ML models that integrate Radar with EO/IR, AIS, and cartography for robust classification in GPS-denied or cluttered environments.
• Collaborate with fusion and autonomy engineers to ensure ML outputs integrate seamlessly into multi-target tracking and SLAM pipelines.
• Contribute to ML-Ops workflows: data management, large-scale training, continuous integration of new field data, and automated evaluation pipelines.

Requirements

• Advanced degree (MS/PhD) in Electrical Engineering, Computer Science, Robotics, or related field.
• 7+ years applying machine learning and signal processing to real-world dynamic systems (graduate research counts if directly applicable).
• Demonstrated mastery of semantic segmentation and object classification models, ideally applied to non-vision sensor modalities.
• Expert-level Python skills with ML frameworks (TensorFlow/Keras, PyTorch, or equivalent).

Nice-to-haves

• Track record of developing ML models beyond standard YOLO-style detectors, particularly for segmentation of noisy or sparse data (Radar, sonar, or medical imaging).
• Strong background in computer vision and temporal modeling (CNNs, transformers, RNNs for sequential sensor data).
• Experience deploying ML to embedded/edge platforms with optimized C++ inference.
• Knowledge of marine, automotive, or aerial robotics systems.
• Contributions to large-scale ML data pipelines: annotation strategies, dataset balancing, simulation-to-real transfer.
• Passion for pushing the boundaries of AI in GPS-denied, cluttered, and low-visibility environments.

Benefits

Competitive Compensation & Growth
• $132,000–$160,000 base salary with potential equity in a rapidly growing company.
• Comprehensive benefits: 401(k) with 4% company matching, full health/dental/vision, life & disability insurance, generous PTO.
• Continuous learning via conferences, training, and professional growth.

Innovation-First Culture
• Direct impact on defining the AI backbone of maritime autonomy.
• Work on problems unsolved in automotive AI: Radar segmentation, maritime multi-object tracking, sensor fusion in GPS-denied waters.
• Collaborative environment with elite engineers and researchers.
Flexible Work Environment
• Hybrid and remote options for Southeastern US-based candidates. Offices in Pensacola (FL), Birmingham (AL), and Atlanta (GA).
• Hands-on field validation through monthly data collection trips at our Pensacola test facility.
• A culture that balances innovation with personal growth.