

Data Engineer, Data Center Capacity Delivery at Amazon Data Services, Inc.

Required Skills

Python, SQL, Spark, AWS, ETL, data modeling, data warehousing, data lake, big data

About the Role

AWS Data Center Capacity Delivery is hiring a Data Engineer to support global data center construction. The role centers on building and maintaining ETL pipelines, data warehousing, and analytics solutions on AWS services, working with a diverse team to deliver scalable data infrastructure that informs business decisions.

Key Responsibilities

  • Develop and maintain automated ETL pipelines with monitoring using Python, Spark, SQL, and AWS services
  • Implement and support reporting and analytics infrastructure for internal business customers
  • Develop and maintain data security and permissions solutions for data warehouse and data lake implementations
  • Develop data objects for business analytics using data modeling techniques
  • Develop and optimize data warehouse and data lake tables using best practices for DDL, partitioning, and compression (see the sketch after this list)
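
As a rough illustration of the ETL and table-layout work described above, here is a minimal PySpark sketch. The bucket paths, dataset name, and column names (construction_milestones, milestone_id, site_id, event_ts) are hypothetical stand-ins, not an actual team pipeline.

    # Minimal ETL sketch: read raw events, clean them, and write a
    # partitioned, compressed warehouse table. All names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("capacity-delivery-etl").getOrCreate()

    # Extract: read raw JSON events from a hypothetical landing bucket.
    raw = spark.read.json("s3://example-landing-bucket/construction_milestones/")

    # Transform: normalize types, derive a partition column, and
    # drop duplicate or incomplete records.
    cleaned = (
        raw.withColumn("event_ts", F.to_timestamp("event_ts"))
           .withColumn("event_date", F.to_date("event_ts"))
           .dropDuplicates(["milestone_id"])
           .filter(F.col("site_id").isNotNull())
    )

    # Load: write a date-partitioned, Snappy-compressed Parquet table,
    # the kind of layout the partitioning/compression bullet refers to.
    (
        cleaned.write.mode("overwrite")
               .partitionBy("event_date")
               .option("compression", "snappy")
               .parquet("s3://example-warehouse-bucket/fact_construction_milestones/")
    )

Partitioning on a date column and writing compressed Parquet keeps downstream analytical scans cheap, which is the motivation behind the DDL, partitioning, and compression responsibility.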

Required Skills & Qualifications

Must Have:

  • 1+ years of data engineering experience
  • Experience with data modeling, warehousing, and building ETL pipelines
  • Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
  • Experience with one or more scripting languages (e.g., Python, KornShell)
  • Bachelor's degree in Computer Science, Computer Engineering, or a related field

Nice to Have:

  • Experience with big data technologies such as Hadoop, Hive, Spark, or EMR
  • Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage

Benefits & Perks

  • Inclusive culture empowering bold ideas and ownership
  • Total compensation including equity, sign-on payments, and benefits
  • Full range of medical, financial, and other benefits
  • Workplace accommodations for applicants with disabilities during the application and hiring process