About the Role
Title: Data Architect
Location: USA
Remote Type: Remote
Time Type: Full time
Job Requisition ID: JR105805
Job Description:
About Us
We are a global climate technologies company engineered for sustainability. We create sustainable and efficient residential, commercial, and industrial spaces through HVACR technologies. We protect temperature-sensitive goods throughout the cold chain. And we bring comfort to people globally. Best-in-class engineering, design, and manufacturing combined with category-leading brands in compression, controls, software, and monitoring solutions result in next-generation climate technology that is built for the needs of the world ahead.
Whether you are a professional looking for a career change, an undergraduate student exploring your first opportunity, or a recent graduate with an advanced degree, we have opportunities that will allow you to innovate, be challenged, and make an impact. Join our team and start your journey today!
Job Overview: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have deep expertise in designing, implementing, and maintaining efficient and scalable data architectures, with a focus on MDM (Master Data Management), data governance, and data quality. This role is crucial for driving the strategic direction of data management systems and ensuring that data solutions meet business needs while maintaining integrity, consistency, and performance.
Key Responsibilities:
- Data Architecture Design: Lead the design and implementation of robust, scalable, and high-performance data models in Snowflake. Ensure alignment with business requirements and technology strategies.
- Master Data Management (MDM): Collaborate closely with the MDM team to design and optimize the data architecture, ensuring the data hub effectively supports the MDM model and facilitates seamless integration and performance within the MDM hub.
- Data Integration & ETL Pipelines: Design and manage end-to-end ETL/ELT pipelines integrating Snowflake with various business and analytical systems, such as MDM hub, APIs, databases, and third-party services.
- Data Governance & Security: Establish and enforce data governance policies, rules, and standards, ensuring data quality, security, and compliance with regulations.
- Collaboration: Work closely with business analysts, data engineers, and data scientists to understand business requirements and translate them into efficient data solutions.
- Data Modeling: Design and optimize enterprise data models, schemas, and structures to support current and future business reporting and analytics needs.
- Performance Optimization: Monitor and tune enterprise data models and transformations to ensure high performance and scalability.
- Troubleshooting & Issue Resolution: Investigate and resolve data-related issues and inefficiencies in the data pipeline, ensuring minimal downtime and disruption to business operations.
Required Skills and Qualifications:
- Education & Experience: Bachelor’s or Master’s degree in Computer Science, Data Science, Information Systems, or a related field. 10+ years of experience in the data domain, including 5+ years as a Data Architect or in a similar role.
- Cloud Platforms: Expert-level experience with cloud-based data platforms such as Snowflake and Databricks. Demonstrable experience working with cloud platforms such as AWS, Azure, or GCP, and with cloud data architectures.
- dbt: Strong experience with dbt for building and managing data transformation pipelines, optimizing workflows, and ensuring efficient data processing.
- MDM Process Knowledge: Deep understanding of Master Data Management (MDM) processes, including data governance, data quality management, and the integration of MDM with other systems.
- Data Storage: Strong experience designing and managing data warehouses, data lakes, and other storage solutions.
- SQL & Scripting: Advanced knowledge of SQL to manage and manipulate large datasets efficiently. Proficient in scripting languages (e.g., Python, Shell) for automating data workflows. Solid understanding of version control systems such as Git, GitHub, and Bitbucket.
- Data Modeling & Design: Expertise in designing complex data models (dimensional, star/snowflake schemas), query optimization, and performance tuning.
- Data Integration & ETL Tools: Experience with modern data integration tools such as ADF, Fivetran, Estuary.
- Collaboration Skills: Ability to work in close collaboration with teams from various functions, displaying solid communication and leadership skills.
- Analytical Thinking: Strong problem-solving and analytical skills to find opportunities for data optimization and efficiency.
Preferred Skills:
- Experience with MDM technologies and platforms (e.g., Informatica MDM, Profisee, or Reltio).
- Familiarity with data visualization tools such as Tableau, Power BI, or Sigma.
- Experience with CI/CD pipelines for data engineering.
- Understanding of data quality frameworks and standard methodologies.
- Certification in relevant technologies such as Snowflake, dbt, or Microsoft Azure is a plus.