Data Engineer
We’re looking for a data engineer who not only builds pipelines but also understands why the data matters. This role is ideal for someone who can translate business goals into reliable, meaningful data models, ensuring every transformation, metric, and dashboard accurately reflects real-world performance.
You’ll work collaboratively across analytics, engineering, and business teams to design, implement, and maintain high-quality data pipelines and models using the Google Cloud Platform stack.
Key Responsibilities:
- Ensure Data Accuracy & Business Relevance
  - Consistently validate outputs against expected business outcomes, spotting anomalies and ensuring the data makes sense in a real-world context.
  - Partner with analysts and stakeholders to interpret data requirements and ensure transformations reflect business logic and intent.
- Design & Maintain Modern Data Infrastructure
  - Build and optimize ELT pipelines using Airflow (Cloud Composer), Dataform, and BigQuery.
  - Support the design and implementation of a robust GCP data warehouse architecture, ensuring scalability and governance.
- Model & Transform Data for Insight
  - Develop efficient SQL transformations and modular workflows in Dataform (or dbt).
  - Build and maintain semantic-layer models in LookML to power Looker and Looker Studio dashboards.
- Uphold Data Quality, Observability & Governance
  - Implement proactive data quality checks, anomaly detection, and pipeline monitoring.
  - Own data integrity through version control, DevOps practices, and QA of production outputs.
- Solve Problems Collaboratively
  - Work closely with analysts and business stakeholders to align metrics, data definitions, and use cases.
  - Provide input on data management strategy, master data, and architecture evolution.
- Pursue Continuous Improvement & Innovation
  - Stay current on advancements in GCP, data modeling, and machine learning.
  - Recommend new tools and techniques to improve data reliability, observability, and usability.
Required Skills & Experience:
- 3+ years building and maintaining data pipelines using GCP tools (BigQuery, Composer, Dataform/dbt, Dataflow, Pub/Sub, etc.)
- Strong proficiency in SQL (especially BigQuery SQL) and Python
- Proven experience validating and reconciling data against business logic or benchmarks
- Experience designing data models and transformations that reflect business processes and KPIs
- Experience orchestrating complex data workflows (Airflow/Composer or similar)
- Familiarity with GCP best practices, Cloud Monitoring, and Cloud Storage
- Prior experience integrating data from GA4, Salesforce, or similar SaaS platforms
- Strong communication skills and a curiosity to understand business needs deeply
Preferred Skills:
- Knowledge of Salesforce objects and APIs
- Experience with Looker / LookML development
- Familiarity with JIRA, Git, and DevOps practices
- Understanding of data governance, lineage, and MDM concepts
Our ideal candidate is the kind of data engineer who:
- Thinks critically about whether the data reflects reality, not just whether the query runs.
- Enjoys troubleshooting ambiguous data issues and connecting them to business context.