Data & Integrations Engineer

Job Locations US-NC-Clemmons
Posted Date 2 days ago (7/30/2025 6:24 PM)
Job ID 2025-4902

Overview

Hayward Holdings Inc. (NYSE: HAYW) is the largest manufacturer of residential swimming pool equipment in the world, with a significant and growing presence in the commercial pool market. Hayward designs, manufactures, and markets a full line of residential and commercial pool and spa equipment, including pumps, filters, heating, cleaners, salt chlorinators, automation, lighting, safety, flow control, and energy solutions, at our company-owned facilities. Headquartered in Charlotte, North Carolina, Hayward also has facilities in Tennessee, Arizona, and Rhode Island, as well as in Canada, Spain, France, Australia, and China. This role can be based out of our facility in Clemmons, NC.

 

We’re seeking a Data & Integrations Engineer to support our growing data initiatives, including integration, analytics, and modeling. This is an excellent opportunity for someone early in their career who is eager to work with cloud technologies like AWS and contribute to building reliable, scalable data solutions.

Responsibilities

· Perform daily support of Snowflake and AWS infrastructure, including administrative activities such as creating accounts, monitoring jobs, and data modeling.

· Ensure reliability, uptime, and effective backup of software tools.

· Translate business requirements and end-user needs into data architecture blueprints that deliver cost-effective, high-performing, high-quality data solutions in compliance with IT standards, policies, and long-term technical strategies.

· Design, develop, manage, and update logical and physical data models; catalog data structures, relationships, and models based on industry standards, enforcing data quality, reliability, and governance.

· Migrate datasets between existing and future database platforms.

· Perform integration development and support tasks for our Microsoft Dynamics 365 Finance & Supply Chain Management (FSCM) program.

· Understand and implement best-practice solutions in accordance with development standards.

· Produce ad-hoc data extracts to answer business questions quickly and thoroughly.

· Collaborate with Project Managers, IT Architecture, Business Analysts, and Report Developers to achieve business objectives.

· Develop technology documentation to support production deployments as well as ongoing maintenance of reporting and data solutions.

· Troubleshoot, diagnose, and resolve data quality and performance issues related to data integration and transformation.

· Communicate complex topics and analyses to non-technical business personnel.

· Present and participate in Architecture Review Board meetings. Review solutions presented by other IT architects and provide constructive feedback.

· Provide management with accurate level-of-effort estimates for data integration development.

Qualifications

 

· Bachelor’s degree in Computer Science, Information Technology, or a related field.

· 1–2 years of hands-on experience writing advanced SQL queries, preferably in Snowflake or another cloud-based data warehouse.

· Introductory experience with programming languages such as Python, Java, or Groovy (via coursework, personal projects, or internships).

· Solid understanding of data warehousing concepts, with some experience in star-schema design and implementation

· Exposure to DevOps environments, including basic support for production issues and troubleshooting workflows

· Familiarity with API and web services integration (REST, SOAP, or bulk data exchange).

· Understanding of distributed data architectures and processing frameworks (coursework, internships, or project experience acceptable)

· Some experience with CI/CD pipelines using tools like GitHub Actions, AWS CodeBuild, or similar

· Comfort working across different operating systems (Unix, Linux, Windows).

· Strong analytical and problem-solving skills with a willingness to learn and grow.

· Collaborative mindset with the ability to communicate effectively with technical and non-technical team members.

· Self-motivated, curious, and committed to continuous learning and professional development.

 

Preferred Qualifications:

· Exposure to working or interning in a global or cross-functional team environment

· Exposure to leading BI reporting tools, preferably cloud-based, such as Power BI (academic projects or self-learning acceptable)

· Exposure to workflow orchestration and job scheduling tools like Apache Airflow, dbt, or similar

· Foundational knowledge of statistical concepts and how they apply to data analysis

· Awareness of data science principles, including topics like machine learning and artificial intelligence
