About the Role
Title: Staff Data Engineer
Location: Remote, USA
Job Description:
About Life360
Life360’s mission is to keep people close to the ones they love. Our category-leading mobile app and Tile tracking devices empower members to protect the people, pets, and things they care about most with a range of services, including location sharing, safe driver reports, and crash detection with emergency dispatch. Life360 serves approximately 77 million monthly active users (MAU), as of November 2024, across more than 170 countries.
Life360 delivers peace of mind and enhances everyday family life with seamless coordination for all the moments that matter, big and small. By continuing to innovate and deliver for our customers, we have become a household name and the must-have mobile-based membership for families (and those friends that basically are family).
Life360 has more than 500 (and growing!) remote-first employees. For more information, please visit life360.com.
Life360 is a Remote First company, which means a remote work environment will be the primary experience for all employees. All positions, unless otherwise specified, can be performed remotely (within the US) regardless of any specified location above.
About the Team
The Data and Analytics team is looking for a high-intensity individual who is passionate about driving our data platform forward and changing the way we do data. Your mission will be to analyze our current ways of doing things and architect new solutions using cutting-edge tools. You will be expected to become the go-to member of our team for our data platform strategy, implementation, and use. We want open-minded individuals who are highly collaborative and communicative. We work together, celebrate our wins as a team, and are committed to building a welcoming team where everyone can perform their best.
About the Job
At Life360, we collect a lot of data: 60 billion unique location points, 12 billion user actions, 8 billion miles driven every single month, and so much more. As a Staff Data Engineer, you will contribute to enhancing and maintaining our data processing and storage pipelines/workflows for a robust and secure finance data lake. You should have a strong engineering background and, even more importantly, a desire to take ownership of our data systems to make them world class.
Salary Range: $166,500 – $245,000
What You’ll Do
- Design, implement, and manage scalable data processing platforms used for real-time analytics and exploratory data analysis.
- Manage our financial data from ingestion through ETL to storage and batch processing.
- Automate, test, and harden all data workflows.
- Architect logical and physical data models to ensure the needs of the business are met.
- Collaborate across the data, engineering, data science, and analytics teams to understand their needs, while applying engineering best practices.
- Architect and develop systems and algorithms for distributed real-time analytics and data processing.
- Implement strategies for acquiring data to develop new insights.
- Mentor junior engineers, imparting best practices and fostering their growth.
- Champion data engineering best practices and institutionalize efficient processes to foster growth and innovation within the team.
What We’re Looking For
- Minimum 7 years of experience working with high-volume data infrastructure.
- Experience with Databricks and AWS.
- Experience with dbt.
- Experience with job orchestration tooling like Airflow.
- Proficient in Python programming.
- Proficient with SQL and the ability to optimize complex queries.
- Proficient with large-scale data processing using Spark and/or Presto/Trino.
- Proficient in data modeling and database design.
- Experience with streaming data with a tool like Kinesis or Kafka.
- Experience working with high-volume, event-based data architectures like Amplitude and Braze.
- Experience with a modern development lifecycle, including Agile methodology, CI/CD, and automated deployments using Terraform, GitHub Actions, etc.
- Knowledge of and proficiency in the latest open-source data frameworks, modern data platform tech stacks, and tools.
- Always learning and staying up to speed with the fast-moving data world.
- Good communication and collaboration skills, with the ability to work independently.
- BS in Computer Science, Software Engineering, Mathematics, or equivalent experience.