Job Description
<h3>π Description</h3> β’ Gathers requirements and business process knowledge in order to transform the data in a way that is geared towards the needs of end users.
• Partner with business and data analysts, as well as business power users, to understand needs and create repeatable solutions and insights.
• Architect automated data workflows, database views, and data marts that streamline data processing and reduce manual intervention.
• Propose appropriate design solutions for building impactful data products.
• Use expert knowledge to design, implement, and continuously expand data pipelines by performing extraction, transformation, and loading activities, ensuring that the data architecture is scalable and maintainable.
• Review and revise existing data flows, views, and workflows to improve processing speed, scalability, and performance.
• Collaborate with the Data Governance team to ensure that data assets adhere to established data governance standards and align with business definitions for system-approved data assets.
• Set up appropriate monitoring and notification for potential issues within data pipelines, flows, and processes.
• Investigate, troubleshoot, and resolve issues while communicating with relevant parties.
• Propose and implement processes and structures for thorough testing of data solutions and production processes, following DevOps and DataOps best practices under departmental guidelines.
• Participate in the Agile scrum development process to build data products in a fast yet reliable way for iterative delivery, realizing immediate business value.
• Train and mentor junior developers on the team, promoting best practices in data modeling, transformation, and automation.
• On an as-needed and assigned basis, provide expert maintenance and support of legacy and production processes and products.
<h3>Requirements</h3>
• 7 years of experience with SQL, Python, and related data programming languages, as well as ETL methodologies and tools such as Informatica and SSIS.
• 5 years of experience with data warehousing and database modeling/design using industry-standard tools and concepts.
• 3 years of experience with cloud data tools and technologies, especially Microsoft Azure and Databricks.
• 5 years of experience with Epic EMR databases (Clarity and Caboodle) and/or Workday ERP Reporting.
• Epic or other application certification or accreditation (certification exam passed in no more than three attempts): Epic Cogito Fundamentals and Epic Clarity and/or Caboodle data model certification, or a related certification, if and as required by the role.