The Opportunity
As a Data Engineer, you will:
- Build and maintain scalable, best-in-class data infrastructure and pipelines that serve as core components of a multi-tenant data platform.
- Ensure our data pipelines and data warehouse are optimized for accuracy, performance, and accessibility.
- Own architecture frameworks and collaborate with cross-functional partners in the Product and Engineering organizations to develop data, experimentation, and analytics solutions.
- Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
- Test and clearly document data assets and warehouse implementations so that others can easily understand how data is defined and implemented.
- Design data integrations and a data quality framework.
- Work closely with Product and Engineering teams to develop a strategy for long-term data platform architecture.
To be successful in this role, you’ll need:
- Demonstrated ability to build, manage, and optimize core data infrastructure at scale in a multi-tenant environment.
- A propensity to independently identify opportunities for optimization and drive forward high-impact projects with minimal guidance.
- Proficiency in SQL and strong programming skills in Python, with experience in Bash scripting for automation and workflow management.
- Deep knowledge of and experience with Snowflake, including advanced features such as Snowpipe, storage integrations, stages, streams, and tasks.
- Experience with the AWS ecosystem and securely deploying and managing applications using container and serverless services like ECS and Lambda.
- Experience building and maintaining custom ingestion pipelines using tools like dlt or requests.
- Experience ingesting data from third-party APIs into Snowflake (Snowpipe, streams, and tasks) with pipelines orchestrated in Dagster.
- Proficiency with workflow orchestration tools (Dagster, or similar tools like Airflow or Prefect) and data transformation tools (dbt).
- Experience with DataOps tools, such as Docker, GitHub Actions, and Terraform.
- Experience with AI or ML is a plus.
You’ll love this role if you are:
- Excited to build foundational data infrastructure that powers many e-commerce brands.
- Energized by the opportunity to abstract repeated data problems into platform-level solutions.
- Passionate about working cross-functionally across engineering, product, and data teams.
- Motivated by working in a fast-paced and iterative environment.
- Excited by the opportunity to be an early, critical member of a rapidly growing organization.
- Personally aligned with our mission to make commerce accessible.