Title: Sr Data Engineer
Location: United States
Brief Description of Sunnova
Sunnova (NYSE: NOVA) is revolutionizing the energy industry as a leading Energy as a Service (EaaS) provider of solar, battery storage, EV charging, and other energy solutions, with customers spanning the U.S. and its territories. Founded in 2012, our goal is to provide homeowners, businesses, and communities with better energy service at a better price, making clean, renewable energy more accessible, reliable, and affordable.
At Sunnova, we believe that our success comes from the diversity and creativity of our people. Our team is made up of forward-thinkers who are passionate about changing the energy industry for the better, and we’re looking for like-minded individuals to join us. We encourage our people to push beyond traditional limits and explore new horizons because only then can we truly transform the world for the better.
If you’re excited about being a part of the fastest-growing segment of the energy industry, we want you on our team!
The Sr Data Engineer Position
Sunnova Energy is currently searching for a Sr Data Engineer responsible for data collection, modeling, and integration strategies; developing solutions for the enterprise using Python, AWS, and the Informatica Cloud platform; and applying data science principles to solve business problems. This individual will combine software development skills with business acumen to ingest data from APIs, files, and databases; transform and organize data at the database level; analyze and interpret the meaning of data; and produce reporting systems.
Sr Data Engineer Responsibilities
- Solves business-related problems using data-driven techniques through collaboration with business and IT colleagues. Regularly influences technical design and process.
- Collects large amounts of data from at-rest or streamed sources and, following data best practices, transforms it into a more usable structure.
- Performs peer reviews of SQL, reports, and dashboards created by colleagues.
- Looks for order, patterns, and trends in data and translates them into business insights.
- Develops data integration solutions using cloud-centric data tools and other programming languages, preferably Python, in accordance with business requirements and technical specifications.
- Troubleshoots and supports implemented applications.
- Participates in daily scrums, works with the Scrum Master and QA team on projects, and supports delivery timelines and priorities.
- Responsible for designing, documenting, and presenting solutions to senior leaders in IT.
- Works effectively in a matrix environment where 1) day-to-day tasks are determined and executed on an agile/scrum team under the leadership of a scrum master, and 2) career development and coaching, goal setting, performance evaluations, and issue escalation are coordinated through a separate manager.
Sr Data Engineer Minimum Requirements
- Bachelor’s degree or equivalent.
- 4-6 years of experience in data engineering.
- Ability to understand the company’s data model and how it fits into various organizational functions.
- Ability to analyze substantial amounts of data and draw meaningful trends and conclusions.
- Expertise in Python programming with a strong understanding of Pythonic principles and practices.
- Experience using SQL to efficiently extract and work with large volumes of time-series data.
- Proficient in designing and implementing omni-channel API integrations with external partners for seamless communication with various devices.
- In-depth knowledge of AWS architecture and services, especially Lambda functions, EC2, RDS, S3, DynamoDB, IAM, and CloudFormation for comprehensive cloud solutions.
- Strong understanding of Internet of Things (IoT) concepts and how to manage a fleet of internet-connected devices.
- Competence in using code version control systems, such as Git, and continuous integration/continuous deployment (CI/CD) pipelines.
- Familiarity with microservices architecture and containerization tools like Docker, Kubernetes, or similar technologies.
- Experience with test-driven development (TDD) and automated testing frameworks to ensure robust and reliable software components.
- Capable of writing clean, maintainable, and efficient code, following industry best practices and coding standards.
- Proven ability to create and maintain scalable, high-availability systems that can handle rapid growth and data volume expansion.
- Experience with Agile/Scrum development methodologies, with the ability to lead sprints and manage a technical team through various project phases.
- Strong written and verbal communication skills.
Sr Data Engineer Preferred Qualifications
- Certifications in AWS, Python, or data warehouse/lake/lakehouse technologies.
- Background or experience with edge computing telemetry data.
- Proficient with software development using cloud-based infrastructure and database services from AWS, such as RDS, Redshift, Kinesis, and Timestream.
- Familiar with data catalog, data quality, master data management (MDM), and data governance best practices.
- Familiar with data science concepts such as time-series forecasting, stochastic optimization, classification, and regression analysis.