Job Description
<h3>Description</h3>
• Build & optimize data models (dbt or equivalent) for Uniswap, hook protocols and broader DEX metrics, ensuring accuracy, consistency and performance
• Develop & maintain pipelines to ingest onchain events, API feeds and third-party sources into Dune/BigQuery/Snowflake, with monitoring and alerting
• Optimize pipeline health: Implement monitoring, alerting and root-cause workflows to quickly detect and resolve data issues
• Collaborate & iterate: Partner with Data Analysts, Growth and Research teams to refine schemas, metrics and dashboards, making data intuitive to query and interpret
• Centralize data sources: Merge disparate feeds into a unified repository while provisioning data to where it's needed
• Plan & build in-house models: As needed, gradually transition transformations into BigQuery or Snowflake; design schemas, materializations and deployment workflows
• Champion best practices: Contribute to open standards in the Uniswap and DEX communities
• Stay current: Evaluate emerging data-engineering tools and cloud services (BigQuery, Snowflake, AWS/GCP) and recommend enhancements to our stack
<h3>Requirements</h3>
• Proficiency with modern cloud platforms (e.g., BigQuery, Snowflake, AWS, GCP, or Azure) and experience with both OLTP and analytical databases such as PostgreSQL or ClickHouse
• Experience building subgraphs or equivalent custom indexers (e.g., The Graph, Ponder)
• Experience building and exposing internal/external Data APIs and deploying containerized workloads using Docker and Kubernetes
• Advanced degree in Computer Science, Data Engineering, or a related technical field