About this role
About the Team:
You will be joining a newly launched Forward Deployed Engineering team dedicated to client engagement and platform adoption. This team operates with a startup mentality within the company – high collaboration, agility, and a focus on delivering results. As a principal FDE, you’ll serve as a key leader in this team, helping establish its practices and culture. You’ll work alongside seasoned data platform engineers and product managers, directly contributing to our mission of empowering users to unlock maximum value from data.
Job Summary:
We are seeking a Principal-level Forward Deployed Engineer (FDE) Lead to spearhead client-facing technical engagements for our Enterprise Data Platform (EDP). In this high-impact individual contributor role, you will bring product-grade data engineering expertise directly to internal (and later external) clients to drive adoption of our EDP.
This is not primarily a pipeline engineering role. Instead, you will act as a technical authority, advisor, and enablement lead, working with our Data Platform as a Service (DPaaS) adopters to design high-quality data pipelines, teach best practices, and guide teams to build and operate pipelines effectively themselves. You will help clients understand how to build pipelines well on EDP, review designs and implementations, and unblock complex issues, while leaving day-to-day pipeline construction and ownership with the adopting teams.
This role combines deep hands-on technical work (in Airflow, Snowflake, dbt, Great Expectations, DataHub, etc.) with strategic advisory responsibilities. As a founding member of the Forward Deployed Engineering team, you will play a pivotal role in shaping how we accelerate data platform adoption and deliver insights in the investment management domain. The position is remote-friendly with minimal travel (occasional client visits if needed).
Key Responsibilities:
Direct Client Engagement & Solution Delivery: Serve as a trusted technical advisor for DPaaS adopters, partnering with teams to understand their business goals and data needs. Guide clients in designing EDP-native data pipelines, helping them select appropriate patterns, architectures, and platform features. Enable teams to become self-sufficient and effective builders by transferring knowledge, patterns, and best practices rather than owning pipeline delivery.
Data Pipeline Design & Best Practices: Provide expert guidance on EDP data pipeline design, including ingestion, transformation, orchestration, and consumption patterns. Advise on Airflow DAG design, dbt project structure, dependency management, and operational best practices (a minimal illustrative DAG sketch follows this list), without routinely building or owning pipelines yourself. Help teams adopt scalable, maintainable approaches aligned to medallion (bronze–silver–gold) architectures, data mesh principles, and EDP standards.
Snowflake Architecture & Optimization Guidance: Act as a Snowflake subject-matter expert, advising teams on data modeling, SQL design, warehouse sizing, and performance optimization. Review and provide feedback on query patterns, schema designs, and workload configurations to help adopters achieve cost-efficient, performant solutions. Support teams in diagnosing performance or cost issues, guiding them toward effective remediation.
Design Review, Quality & Governance: Review pipeline designs, data models, and implementation approaches to ensure alignment with enterprise data quality, governance, and security standards. Coach teams on implementing automated data quality checks with Great Expectations and on improving metadata, lineage tracking, and discoverability via DataHub or similar metadata/catalog tools, so that solutions meet enterprise data governance standards and sustain trust in data for decision-making. Promote consistent engineering standards, documentation practices, and operational readiness.
Cross-Functional Collaboration: Collaborate with platform engineering, product management, analytics teams, and business units to align technical solutions with business needs. Provide feedback from client engagements to internal product and engineering teams to influence the EDP’s roadmap and enhancements. Act as a bridge between technical teams and client stakeholders, ensuring clarity and mutual understanding of requirements and outcomes.
Technical Leadership & Mentorship: As a principal-level expert, provide thought leadership in data engineering and analytics. Mentor and guide junior forward deployed engineers or analytics engineers in the team (though this role has no direct managerial duties). Lead by example in following agile methodologies, maintaining a data product mindset (delivering iterative improvements and focusing on end-user value), and upholding engineering best practices such as version control, CI/CD for data pipelines, and documentation.
Problem Solving & Support: Tackle complex technical challenges in real time. Troubleshoot critical issues across the EDP stack, from pipeline failures to data inconsistencies, and drive them to resolution. Serve as the go-to expert for diagnosing problems in SQL queries, data models, Airflow DAGs, and Snowflake performance, ensuring high reliability and responsiveness for client-facing data services.
Continuous Improvement: Stay up to date with the latest developments in data engineering, analytics, and the investment management industry. Proactively identify opportunities to improve our platform and its practices (e.g., new features in Snowflake, emerging tools in the data ecosystem) and help integrate them to continually enhance the value we deliver to clients.
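To make the advisory work above concrete, the following is a minimal sketch of the kind of Airflow DAG an adopting team might build on EDP, assuming Airflow 2.4+ with the TaskFlow API. Every name in it (the edp_ingest_trades DAG, the landing path, the dbt selectors) is a hypothetical illustration, not an actual EDP convention.

```python
# A minimal sketch, assuming Airflow 2.4+ with the TaskFlow API; all names
# below (DAG id, landing path, dbt selectors) are hypothetical illustrations.
from datetime import datetime

from airflow.decorators import dag, task
from airflow.operators.bash import BashOperator


@dag(
    dag_id="edp_ingest_trades",  # hypothetical pipeline name
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["edp", "medallion"],
)
def edp_ingest_trades():
    @task
    def land_raw_files() -> str:
        # Bronze layer: land raw source data unmodified so loads are replayable.
        return "s3://example-bucket/raw/trades/"  # placeholder landing path

    # Silver/gold layers: delegate transformations to dbt so that models,
    # tests, and lineage live in one version-controlled project.
    run_dbt_models = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt build --select staging+ marts+",  # hypothetical selectors
    )

    land_raw_files() >> run_dbt_models


edp_ingest_trades()
```

In a design review, the value of a sketch like this lies in its shape rather than its specifics: clear layer boundaries, idempotent loads, and transformations kept in dbt instead of scattered across DAG code.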
Role Impact and Opportunities:
In this role, you will be at the forefront of how we enable data-driven transformation both within our company and for our clients. As an FDE Lead, you will operate at the critical intersection of data platform engineering and real-world client needs. Your work will directly influence how effectively our internal teams and future external customers harness the Enterprise Data Platform to drive investment insights, operational efficiency, and innovation. By translating the EDP’s technical capabilities into real client success stories, you will help shape best practices, refine platform features through feedback, and amplify the overall value of data across the organization. This position offers the opportunity to make a significant impact without managing a team, making it ideal for a senior professional who thrives on solving complex data problems, building trust with clients, and leading through expertise.
Required Qualifications:
Extensive Data Engineering Experience: 8+ years of hands-on experience in data engineering, data architecture, or related fields, with a track record of designing and delivering large-scale data solutions.
Expert SQL Skills: Deep proficiency in SQL development and optimization, including writing and tuning complex SQL queries for both ETL processing and analytical reporting/BI use cases. Proven ability to refactor SQL code for efficiency and readability.
Pipeline & ETL Orchestration: Strong experience building and orchestrating data pipelines using tools such as Apache Airflow. Ability to develop, schedule, and monitor complex ETL workflows in a production environment.
Data Transformation & Modeling: Proficiency with dbt (data build tool) for data transformations and schema management. Solid background in data modeling (relational, dimensional modeling) and designing data architectures following medallion (layered data lake) and/or data mesh principles.
Snowflake Expertise: In-depth knowledge of Snowflake’s Data Cloud platform, including its internal architecture, unique features, and best practices. Experience with Snowflake performance tuning (e.g., optimizing warehouses, clustering keys, query profiling) and understanding of how to leverage Snowflake’s capabilities (like zero-copy cloning, data sharing, etc.) in solution designs.
Data Quality & Governance Tools: Hands-on experience with data quality/validation frameworks (e.g., Great Expectations) and familiarity with metadata management or data catalog tools (e.g., DataHub) to ensure transparency and trust in data; an illustrative validation sketch follows this list.
Problem Solving & Analytical Thinking: Exceptional analytical and troubleshooting skills. Demonstrated ability to solve complex data engineering problems, optimize performance, and handle large data sets and complex data integration challenges in real-world scenarios.
Communication & Stakeholder Engagement: Excellent communication and interpersonal skills. Comfortable working directly with non-technical stakeholders and senior client leaders to gather requirements, explain technical concepts in business terms, and ensure solutions meet business objectives.
Agile & Product Mindset: Experience working in Agile/Scrum teams with an iterative delivery approach. Strong product mindset – ability to think of data pipelines and models as products that need to deliver ongoing value and adapt to changing requirements.
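As a purely illustrative example of the automated data quality checks referenced above, the sketch below uses Great Expectations' classic from_pandas API (available in pre-1.0 releases); the dataset, column names, and thresholds are all hypothetical.

```python
# A minimal sketch, assuming the classic great_expectations.from_pandas API
# (pre-1.0 releases); the data, columns, and thresholds are hypothetical.
import great_expectations as gx
import pandas as pd

df = pd.DataFrame(
    {
        "trade_id": [1, 2, 3],
        "notional_usd": [1_000_000.0, 250_000.0, 75_000.0],
    }
)

# Wrap the frame so expectations can be declared and validated against it.
batch = gx.from_pandas(df)
batch.expect_column_values_to_not_be_null("trade_id")
batch.expect_column_values_to_be_between("notional_usd", min_value=0)

result = batch.validate()
print(result.success)  # True only if every expectation passed
```

In production, checks like these would typically run from a checkpoint invoked by the pipeline's orchestration layer, failing the run before bad data reaches downstream consumers.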