Job Description
<h3>Description</h3>
• We are seeking a Senior Data Engineer to lead the design, development, and optimization of complex data systems. This role builds reliable, scalable data pipelines and infrastructure to support analytics and data-driven decision-making across the organization. The ideal candidate is a technical leader with deep expertise in data architecture, performance tuning, and mentoring junior engineers.
• Shift Structure: Night shift (9 AM to 6 PM EST)
• Responsibilities: Lead end-to-end design and development of robust, scalable data pipeline projects across multiple platforms and data sources; Optimize data systems for high performance, reliability, and scalability; Conduct thorough code reviews and provide constructive feedback; Mentor and guide junior data engineers; Collaborate with cross-functional teams; Troubleshoot complex data issues and ensure data quality, availability, and consistency.
• Primary Focus Areas: Advanced data engineering and pipeline architecture; Data performance tuning and system optimization; Technical leadership and mentorship; Cross-functional collaboration and support.
• Decision-Making Authority: The Senior Data Engineer will independently make technical decisions regarding data system design, architecture, and optimization strategies, and will set engineering standards and tools.
• Communication & Collaboration: Requires close coordination with the Data Engineering Lead and other teams; strong communication skills are essential.
• Requirements and Qualifications: ... (see the Requirements section below)
• Compensation and Benefits: 2 rest days per week; 13th Month Pay; employees must be able to perform essential functions with or without reasonable accommodation.
<h3>Requirements</h3>
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field
• 5+ years of hands-on experience in data engineering or backend data systems development
• Data Pipeline Orchestration and Workflow Management
• Apache Airflow: Popular for scheduling and monitoring complex data pipelines.
• Prefect: A modern orchestration tool gaining adoption.
• Apache NiFi, Talend, AWS Glue, Azure Data Factory: ETL tools for data integration and transformation.
• Datadog or similar tools for tracking and monitoring data pipelines.
• DevOps and CI/CD Tools: Docker, Git, CircleCI (or other CI/CD pipelines) for containerization, version control, and automated deployments.
• Experience building data solutions at enterprise scale
<h3>Benefits</h3>
• High Compensation Tied to Business Outcomes: You will be well-rewarded for meeting critical objectives and deadlines.
• Empowerment & Resources: We will provide the right tools and teams to help you succeed on your own terms.
• Flexibility & Accountability: You will have autonomy to structure your approach, but must deliver results.
• Paid Time Off: Relax and recharge with paid vacation and sick leave.
• Bonus Boost: Enjoy an extra bonus with our 13th month pay.
• Health Matters: Get comprehensive HMO coverage upon regularization.
• Work-Ready: We provide the essential work device for seamless productivity.
• Work From Home: Enjoy the flexibility and convenience of a fully remote position; work from anywhere in the world without ever needing to commute to an office.