Senior MLOps and Data Architect

Kettle · Remote (USA)

Data + Analytics
Posted 6 days ago

Data Science
Machine Learning

The Organization

Kettle's mission is to balance risk in a changing climate. Kettle uses deep learning and proprietary algorithms to reshape the reinsurance industry and better protect people from the growing risks associated with climate change. Kettle's first product protects Californians' businesses, homes, and livelihoods with wildfire reinsurance.

You have probably never heard of the $300bn reinsurance industry, but it is the single most important industry in the world for protecting people and helping society recover from climate change disasters. This industry is failing due to a 3x increase in $1B+ crises caused by climate change. Kettle's aim is to use deep learning to ensure that people's lives are not destroyed when these events happen. By building the most sophisticated machine learning models to predict when and where wildfires happen, Kettle accurately prices the cost of covering wildfire-prone areas in California. We are a machine-learning-powered reinsurer: we sell reinsurance to insurers; we don't sell software to reinsurers.

Who we are/what we value:

  • Obsessive Fanatics - We obsess over big problems. We are fanatical about our business and mission. We drive the company toward the areas people describe as ‘impossible to understand or stop’.
  • Initiative - Obsessive people don’t need micromanagement.
  • Data Focused - With the world burning worse every year, we can’t afford to let opinions and bias derail data-driven decisions.
  • Questioning Renegades - We work in a 600-year-old industry. If everyone is running in one direction, we generally sprint in the other.
  • Impact Driven - We are a mission-driven business.


The Role

We are looking for a passionate MLOps and Data Architect to join a growing team of engineers, deep learning experts, and data scientists, modeling risk using the most advanced tools available.

We strongly encourage people from populations traditionally underrepresented in tech - such as women, People of Color, People with Disabilities, and LGBTQ+ people - to apply!


Role Primary Responsibilities

As a member of Kettle, you:

  • Design and develop a cloud-based data architecture supporting complex workflows for data collection, preparation, and validation.
  • Design and develop strategies for monitoring data flow, prediction quality, and other performance metrics.
  • Design and develop a research pipeline, and support data scientists with training, serving, and monitoring models.
  • Work with diverse data formats of varying data quality.
  • Explore the data space to continually improve the quality of data sources.
  • Advise data scientists on optimal data engineering practices.
  • Create a data catalog and data documentation.
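To give a concrete flavor of the data-validation work described above, here is a minimal, hypothetical sketch of a record-level check on incoming risk data. The `RiskRecord` schema, field names, and thresholds are illustrative assumptions, not Kettle's actual data model:

```python
from dataclasses import dataclass


# Hypothetical schema for an incoming wildfire-risk record; the fields
# are assumptions for illustration, not Kettle's real data model.
@dataclass
class RiskRecord:
    latitude: float
    longitude: float
    predicted_loss: float  # model output, in USD


def validate(record: RiskRecord) -> list[str]:
    """Return a list of validation errors for one record (empty if clean)."""
    errors = []
    if not (-90.0 <= record.latitude <= 90.0):
        errors.append("latitude out of range")
    if not (-180.0 <= record.longitude <= 180.0):
        errors.append("longitude out of range")
    if record.predicted_loss < 0:
        errors.append("negative predicted loss")
    return errors


records = [
    RiskRecord(38.5, -122.8, 125_000.0),  # plausible California point
    RiskRecord(99.0, -122.8, -5.0),       # bad latitude and negative loss
]
# Map each failing record's index to its errors; clean records are skipped.
bad = {i: errs for i, rec in enumerate(records) if (errs := validate(rec))}
print(bad)
```

In a production pipeline, checks like these would typically run as a validation stage between collection and model training, with the error counts feeding the monitoring metrics mentioned above.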



We are looking for a highly motivated, successful MLOps and Data Architect to lead a growing ML infrastructure team at Kettle. They should be a demonstrated self-starter with at least five years' experience in a data analysis or data engineering profession. This person needs the drive to ensure our technology and models are efficient, well-run systems that produce the most accurate predictions of risk in the world. Below we have listed some things we are looking for; you do not need to check all these boxes to be eligible for this position.

Essential Experience of a Successful Candidate

  • Bachelor’s, Master’s, or PhD in computer science or a related field, or equivalent work experience
  • In-depth experience designing and developing data systems
  • In-depth experience with satellite image data
  • In-depth knowledge of Amazon Web Services
  • Experience building data science research pipelines
  • Some knowledge of Google Earth Engine data
  • Excellent code proficiency in Python and JavaScript
  • Familiarity with machine learning
  • Self-starter with the ability to work in a fast-paced, rapidly evolving startup
  • Eagerness to learn new skills and help with the task at hand

Useful Experience

  • Proficiency building serverless APIs
  • Familiarity with database design and architecture
  • Experience with SageMaker, TensorFlow, and other machine learning tools
  • Experience developing simulation pipelines
  • Code proficiency in Go and C++


We offer a competitive package that is based on location and experience. We also offer the following benefits:

  • Stock: Ownership in a fast-growing venture-backed company.
  • 401k matching: We care about your ability to save for your future.
  • Family Focus: Parental leave and flexibility for families.
  • Time Off: Flexible vacation policy to encourage people to get out and see the world.
  • Healthcare: Platinum-level medical, dental, and vision policies.
  • Goodies: Whatever hardware and software you need to get the job done.
  • Team Fun: Regularly scheduled events, annual retreat, and celebrations.
  • Learning: Learning & Development Opportunities to grow your skills and career.
  • Great team: Working with fun, hard-working, kind people committed to making a difference!
  • Flexible culture: We are results-focused. We don’t work at the office every day.
  • ...And much more! Lots of other perks make this company an incredible place to work.
