Job Description
<h3>Description</h3>
• You are someone who takes full ownership of your work and thrives on delivering solutions.
• If there's a problem in our system, no matter where the root cause lies, you pursue it until it's resolved, ensuring our customers experience seamless service.
• You never compromise on quality and are always looking for ways to improve performance, scalability, and efficiency.
• With a proactive mindset, you aim not just to meet but to exceed customer expectations.
• Your can-do attitude drives you to find answers and make an impact, no matter the challenge.
• Lead the development, testing, and maintenance of scalable Python code for complex data ingestion, transformation, and processing across streaming and batch architectures.
• Architect and implement sophisticated data transformation models using dbt for AWS Redshift, ensuring data quality and performance.
• Drive the design and optimization of robust data models that support advanced analytics and evolving business intelligence needs.
• Spearhead the development and optimization of high-performance ETL/ELT workflows using AWS DMS, Kinesis, and other modern data integration technologies.
• Proactively identify, diagnose, and resolve critical data pipeline issues, implementing preventative measures to guarantee data integrity and reliability.
• Collaborate with and mentor junior engineers, fostering best practices in data engineering and data modeling.
• Establish comprehensive documentation standards for code, models, and processes to enable seamless team collaboration and knowledge transfer.
<h3>Requirements</h3>
• Advanced Programming Proficiency: Expert-level skills in Python and SQL (including advanced optimization techniques in MySQL, Postgres, and SQL Server), coupled with extensive experience using dbt for complex transformations and a strong command of AWS services and related tooling (DMS, S3, Airflow, Kinesis, DocumentDB, CloudWatch, Kubernetes, Glue).
• Advanced Data Warehousing Principles: Comprehensive understanding of data warehousing best practices for highly scalable, performant, and secure implementations in complex multi-tenant environments, including data governance and data quality frameworks.
• Data Engineering Mastery: 5+ years of progressive full-time experience in data engineering, with demonstrated expertise in designing and managing complex, large-scale ETL/ELT pipelines and in conducting rigorous code reviews.
• Expertise in Cloud Data Platforms: Deep understanding and hands-on experience with enterprise-grade cloud data platforms such as AWS Redshift, BigQuery, and Snowflake, including their architectural nuances and performance optimization strategies.
• CI/CD and Infrastructure-as-Code Leadership: Proven ability to establish and maintain robust version control (GitHub) workflows and to architect and implement sophisticated CI/CD pipelines for data infrastructure, along with strong advocacy for infrastructure-as-code principles and tools.
<h3>Benefits</h3>
• Medical with Blue Cross Blue Shield NC (2 options)
• Dental and Vision with Unum
• Company-paid Life Insurance, Short-Term Disability (STD), and Long-Term Disability (LTD)
• Voluntary benefits including Supplemental Life Insurance, HSA, FSA and Dependent Care, Critical Illness, Accident, and Pet Insurance
• 401(k) with up to 3% employer match and NO vesting period
• Flexible PTO policy
• 10 company holidays
• Parental Leave
• Community Impact Program (Volunteer)
• Tech and Wellness Stipend