About this role
As an Associate on the Enterprise Data Platform team, you will help build and modernize the core framework of BlackRock's next-generation enterprise data platform. This platform is designed to automate and standardize data acquisition, ingestion, transformation, orchestration, and distribution at enterprise scale.
This is a hands-on engineering role for a strong software developer with experience in Python and backend API development, along with exposure to data platforms and workflow-driven systems. The role sits at the intersection of data platform engineering and backend platform development. You will help build and enhance modular, API-driven platform capabilities that make core services easier to access, operate, and scale.
A key focus of this role is designing and developing internal platform services and APIs that expose capabilities such as metadata, logging, orchestration, workflow generation, onboarding, and operational control. These services will support more flexible and standardized interaction patterns across the platform, including internal tools, automation, scripts, UIs, and AI-enabled workflows.
You will work closely with senior engineers, product partners, and platform teams to deliver scalable, reliable, and maintainable solutions that advance the firm's enterprise data capabilities. The ideal candidate brings strong Python engineering skills, practical experience building backend services and APIs, and an interest in applying those skills within a modern data platform environment.
Responsibilities
Design, develop, and maintain scalable platform services and APIs that support metadata, logging, orchestration, workflow generation, onboarding, and other core enterprise data platform capabilities.
Build backend components and service interfaces that help evolve the platform toward a more modular, API-driven architecture.
Contribute to the development and enhancement of shared platform components used across enterprise data workflows.
Participate across the full software development lifecycle, including requirements analysis, design, development, testing, deployment, and ongoing enhancement.
Develop backend services using Python-based frameworks such as FastAPI or Flask, and contribute to contract-first API development using gRPC and Protocol Buffers.
Partner with senior engineers and cross-functional teams to translate business and operational requirements into robust technical solutions.
Improve performance, scalability, resilience, and observability across platform services and data workflows.
Contribute to engineering standards, API conventions, code quality, and development best practices.
Support the integration of AI-enabled capabilities and external AI services into platform workflows where they improve automation, quality, or developer productivity.
Required Qualifications
Bachelor's degree in Computer Science, Software Engineering, or a related field.
3+ years of experience in software engineering, backend engineering, data engineering, or data platform engineering roles.
Strong hands-on programming experience in Python.
Strong SQL skills and experience working with relational or analytical data platforms.
Experience designing and developing backend services and APIs using REST-based frameworks such as FastAPI or Flask.
Experience with gRPC and Protocol Buffers, or other contract-first API development patterns.
Experience designing APIs that interact with orchestration systems, cloud storage, metadata services, or platform-control functions.
Experience building scalable, maintainable solutions in cloud-native or distributed environments.
Working experience with containers and Kubernetes-based deployment environments.
Experience with at least one major cloud platform; Azure experience is strongly preferred.
Strong understanding of object-oriented design, software engineering best practices, and production-grade development practices.
Strong communication and collaboration skills, with the ability to work effectively across engineering and non-engineering partners.
Ability to operate effectively in a fast-paced environment and take ownership of assigned deliverables.
Preferred Qualifications
Experience with Apache Airflow or comparable workflow orchestration technologies.
Experience with Snowflake and related cloud data warehouse capabilities.
Experience with dbt or similar transformation frameworks.
Exposure to enterprise data platforms, ETL/ELT workflows, or reusable data framework components.
Experience with real-time, event-driven, or message-based platforms and technologies such as Kafka or streaming ingestion patterns.
Familiarity with authentication, authorization, and related API security concepts.
Experience integrating AI/ML or LLM-based services into enterprise platforms, developer workflows, or operational processes.
Understanding of prompt engineering, model evaluation, or responsible AI practices.
Exposure to Java or experience working in mixed Python/Java enterprise environments.
What success looks like
You build high-quality platform services and APIs that make core enterprise data platform capabilities more accessible, reusable, and easier to operate.
You contribute meaningfully to the platform's evolution toward a more modular, API-driven architecture.
You deliver reliable, maintainable code and grow into increasingly complex platform components over time.
You improve developer experience and platform efficiency through strong engineering practices, reusable services, and effective collaboration.
You build a strong understanding of the enterprise data platform and become a trusted hands-on developer within the team.