Job Description

Data Engineer

Remote / Technology – Engineering / Full-time

Hi from Pumpkin!

Pumpkin is a pet care company on a mission to help make ‘the best pet care possible fur all’. We want to revolutionize pet healthcare by making it easier for families to provide their pets with the wellness and medical care they need throughout their lives. Launched by Zoetis (ZTS), a Fortune 500 company and the world’s largest animal health company, Pumpkin is an early-stage startup with big dreams! As a pack, we share agility, guts, collaboration, and a relentless drive to create a healthier, happier world for pets and their people.

Pumpkin is looking for an experienced Data Engineer to join our growing data and analytics team. In this role, you’ll enhance data accessibility and help establish a dependable platform that enables decentralized teams to discover insights and build impactful product features. As part of our data team, you will significantly influence the evolution of our data platform, and your input will directly shape Pumpkin’s ability to surface insights and deepen our understanding of pet owners and their pets. If you’re passionate about data, enjoy problem-solving, and are excited to improve pets’ lives, we would love to hear from you!

This role is remote for candidates outside a commutable distance from the New York City office; some travel will be required (quarterly visits to the NYC office). Candidates located in the NYC area will be considered hybrid, with a requirement of two days a week in the office.

#LI-Remote

What You’ll Do:

  • Collaborate with the team to create reliable pipelines that efficiently gather, transform, and load data into our data platform, ensuring seamless information flow between systems.
  • Apply your skills in SQL, Python, and tools like dbt to structure data for accurate insights that meet business needs (a brief illustrative sketch of this kind of work appears after this list).
  • Maintain clear documentation for pipelines and databases, fostering an informed and efficient team.
  • Implement checks to identify and correct errors, ensuring the integrity of data for analysis and modeling.
  • Drive results through clear communication with colleagues and stakeholders, working closely with them to exchange ideas.
  • Champion innovative data practices, staying curious about emerging tools and identifying enhancements to our data engineering processes.
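
For context, here is a minimal, hypothetical sketch of the kind of extract–validate–load step the bullets above describe, written in Python against an in-memory SQLite database. The record fields, table name, and quality rule are invented for illustration only; they are not Pumpkin’s actual schema, tooling, or pipeline.

    import sqlite3

    # Hypothetical raw records pulled from an upstream source; field names are invented for illustration.
    raw_records = [
        {"pet_id": "p-001", "owner_email": "ana@example.com", "species": "dog"},
        {"pet_id": "p-002", "owner_email": None, "species": "cat"},  # fails the quality check
        {"pet_id": "p-003", "owner_email": "lee@example.com", "species": "dog"},
    ]

    REQUIRED_FIELDS = ("pet_id", "owner_email", "species")

    def is_valid(record):
        """Basic integrity check: every required field must be present and non-empty."""
        return all(record.get(field) for field in REQUIRED_FIELDS)

    def load(records, conn):
        """Insert records that pass validation; return the rejected ones for review."""
        conn.execute(
            "CREATE TABLE IF NOT EXISTS pets ("
            "pet_id TEXT PRIMARY KEY, owner_email TEXT, species TEXT)"
        )
        rejected = [r for r in records if not is_valid(r)]
        conn.executemany(
            "INSERT OR REPLACE INTO pets (pet_id, owner_email, species) VALUES (?, ?, ?)",
            [(r["pet_id"], r["owner_email"], r["species"]) for r in records if is_valid(r)],
        )
        conn.commit()
        return rejected

    if __name__ == "__main__":
        conn = sqlite3.connect(":memory:")
        bad_rows = load(raw_records, conn)
        row_count = conn.execute("SELECT COUNT(*) FROM pets").fetchone()[0]
        print(f"loaded {row_count} rows, rejected {len(bad_rows)} for review")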

What We’re Looking For:

  • 4+ years of hands-on experience in designing and implementing diverse data pipelines and transformations, including data integrations, ETL/ELT pipelines, streaming analytics, and/or big data analytics.
  • Proficiency in SQL and Python for data querying and manipulation.
  • Familiarity with ETL tools like Fivetran or similar, experience with cloud platforms and managed databases (e.g., AWS, Aurora, Postgres), and experience with relevant data services (e.g., Airflow).
  • Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
  • Strong understanding of database concepts, with practical experience in both relational and NoSQL databases.
  • Demonstrated ability to learn and adapt to new tools and technologies effectively.
  • Skilled problem solver with a track record of resolving complex data-related issues.
  • Excellent collaboration skills, capable of effectively engaging with both technical and non-technical stakeholders to gather requirements and deliver valuable insights.

Bonus Points:

  • Previous experience creating data assets for reporting and visualization tools like Tableau, Looker, etc.
  • Familiarity with customer-level data sources like Segment/mParticle, Stripe, Salesforce, etc.
  • Experience working in an agile startup environment, with hybrid/remote teams across a variety of functions (marketing, sales, operations, engineering, etc.)

APPLY HERE