The Opportunity:
CACI is seeking an HCM Data Migration Middleware Developer to support a large-scale federal HCM modernization program. This transformational initiative will deploy Oracle Fusion Cloud HCM as the unified human capital management platform for federal civilian employees across numerous agencies. In this role, you will design, build, and maintain the ETL pipelines and middleware that transform legacy HR data extracted from agency source systems into Oracle Fusion HCM Data Loader (HDL) and HCM Spreadsheet Data Loader (HSDL) formats for loading into the target platform. You will process agency migrations ranging from small agencies with 10,000 employees to extra-large federal agencies, building scalable data transformation pipelines with robust error handling, data cleansing, and reconciliation capabilities. Your work will directly determine the data quality and migration success for every agency transitioning to the new Oracle Fusion Cloud HCM platform. This is a primarily remote position with up to 10% travel. This position is at the T3 level, suited for a mid-career technical professional who applies solid working knowledge and established methods to independently perform a range of technical and functional tasks, resolving problems of moderate scope and complexity.
Responsibilities:
• Design and develop ETL pipelines using Oracle Integration Cloud (OIC), Informatica, Talend, or equivalent middleware tools to transform extracted legacy HR data into Oracle Fusion HCM HDL and HSDL load file formats
• Build data transformation rules that map legacy data values (codes, descriptions, hierarchies) to Oracle Fusion HCM reference data, value sets, and lookup values for each agency migration
• Develop data cleansing pipelines that identify and remediate data quality issues including missing required fields, invalid date formats, duplicate records, and referential integrity violations before target system loading
• Design and implement batch processing frameworks capable of handling large-volume agency migrations ranging from small to very large agency employee populations with configurable parallelism and restart/recovery capabilities
• Build comprehensive error handling and exception management within ETL pipelines including error categorization, automated retry logic, and error reporting dashboards for migration teams
• Develop data reconciliation processes that compare source record counts, key data element values, and business rule validations between legacy extracts, transformation outputs, and target system loaded records
• Create and maintain data quality dashboards providing real-time visibility into migration pipeline status, error rates, data completeness metrics, and reconciliation results for each agency migration wave
• Design reusable transformation templates and parameterized mapping configurations that accelerate successive agency migrations by leveraging patterns from completed migrations
• Collaborate with legacy system export developers (PeopleSoft/EBS team) to define extraction file formats, handoff procedures, and data interface specifications
• Support iterative mock migration cycles (typically 3-4 per agency) and final cutover data loads, optimizing pipeline performance and resolving data quality issues identified in each cycle
• Document ETL design specifications, transformation rules, data flow diagrams, and operational runbooks for each agency migration in accordance with SAFe Agile artifact standards
• Apply established methods, standards, and practices to independently resolve functional and technical issues of moderate scope, contributing to team knowledge bases and consulting with senior staff on complex or unfamiliar problems as they arise
• Communicate effectively with project team members and direct stakeholders to report progress, explain technical approaches, and support collaborative problem-solving within assigned workstreams and Agile Release Train ceremonies
Qualifications:
Required:
• Bachelor's degree in Computer Science, Information Technology, Software Engineering, or related field
• Applicable combination of education and experience:
T3 - 4+ years of professional experience in ETL development, middleware development, or data integration for enterprise system implementations
T4 - 6+ years of professional experience in ETL development, middleware development, or data integration for enterprise system implementations
• 3+ years of hands-on experience with ETL/middleware tools such as Oracle Integration Cloud (OIC), Informatica PowerCenter/Cloud, Talend, or equivalent data integration platforms
• Strong SQL proficiency with experience writing complex transformation queries, data validation scripts, and reconciliation procedures
• Experience with batch data processing for large-volume datasets (100,000+ records) including performance optimization, error handling, and restart/recovery design patterns
• Knowledge of data quality management principles including data profiling, cleansing, standardization, and validation
• Experience with file-based data integration formats (CSV, XML, JSON) and enterprise data loader specifications
• Familiarity with Agile/SAFe development methodologies and iterative delivery practices
• Must be able to obtain and maintain a Public Trust clearance (US Citizenship required)
• Additional experience may substitute for degree
Desired:
• 6+ years of ETL/middleware development experience with specific experience on HCM or ERP data migration projects
• Oracle Integration Cloud (OIC) certification or demonstrated proficiency with OIC adapters, orchestrations, and monitoring
• Direct experience with Oracle Fusion HCM Data Loader (HDL) file specifications and HCM Spreadsheet Data Loader (HSDL) templates
• Informatica PowerCenter/Cloud certification or Talend Open Studio/Cloud certification
• Experience with federal government HR data migrations including EHRI data standards and federal personnel data formats
• Knowledge of Oracle Fusion HCM business o