Job Description
<h3>📋 Description</h3>
• Assist in building and optimizing data pipelines and data architectures on cloud technologies.
• Support the development and maintenance of database models, enabling the expansion and scalability of our data warehouse.
• Participate in data quality assurance, data migration, integration, and deployment activities to support business needs.
• Collaborate with internal stakeholders to gather requirements and translate them into data solutions.
• Write and maintain ETL processes, and support the creation of real-time and offline analytics tools.
• Contribute to the continuous improvement of data infrastructure and tooling.
• Learn from senior data engineers and actively participate in team knowledge sharing.
<h3>🎯 Requirements</h3>
• Mandatory: Proficient in Hive or Spark SQL, with SQL tuning/optimization capabilities.
• Basic experience or coursework in data engineering (e.g., SQL, Python).
• Familiarity with cloud technologies, data pipeline tools (e.g., Airflow), and data warehouse concepts.
• Some knowledge of CI/CD processes and version control systems.
• Strong problem-solving skills and attention to detail.
• Eager to learn and adapt to new technologies in a dynamic environment.
• Collaborative mindset with a willingness to support team objectives.
• Excellent communication skills and ability to work in a team-oriented environment.
• A proactive and curious attitude, willing to take on challenges and seek continuous improvement.
• Preferred: Familiarity with exchange or financial services business.
• A technical university background or relevant internship experience is a plus.
<h3>🏖️ Benefits</h3>
• Enjoy work flexibility
• Supportive team
• Environment that nurtures your ideas
• Performance-based annual bonus for all contributors at WOO 💪