Job Description
📋 Description
• We’re seeking a GCP Data Analyst with deep expertise in BigQuery, strong SQL and Python skills, and a sharp analytical mindset to support both data validation initiatives and ongoing analytics work.
• You’ll work across a variety of data workflows, from validating metrics during system migrations to supporting day-to-day data analysis and reporting needs.
• You’ll leverage advanced BigQuery features—such as authorized views, materialized views, UDFs, partitioning strategies, and time series analysis—to ensure data integrity and surface meaningful insights.
• Comfort working in Python with data frames and relevant packages is also essential, particularly for tasks involving data manipulation, anomaly detection, or prototyping workflows.
• A solid understanding of data engineering fundamentals and GCP infrastructure is important, as is the ability to read and interpret code in Java or Scala when collaborating with engineering teammates.
• Familiarity with Airflow (Cloud Composer) will help you understand orchestration logic, though this won’t be a core responsibility.
• Experience with BigQuery ML, anomaly detection frameworks, or Vertex AI is a plus.
🎯 Requirements
• Bachelor’s degree in Computer Science, Data Science, Engineering, or a related field.
• 5+ years of experience in data analyst or analytics engineering roles with strong BigQuery, SQL, and Python skills.
• 5+ years of experience building and operating solutions on Google Cloud Platform (GCP).
• Strong ability to write and optimize SQL queries to validate data, analyze trends, and detect inconsistencies.
• Proficient in Python, including use of data frames and common analytical libraries.
• Experience with advanced BigQuery features such as authorized views, materialized views, UDFs, partitions, and time series analysis.
• Strong analytical skills and experience validating data across systems during migrations and ongoing operations.
• Basic ability to read and understand Java or Scala code to support engineering collaboration.
• Familiarity with Airflow (Cloud Composer) to interpret and trace data pipeline workflows.