Job Description
Key Responsibilities
- Design, develop, and maintain GoldenSource vendor file implementations, ensuring reliable and efficient data ingestion.
- Develop and maintain ETL pipelines for large-scale financial datasets.
- Implement and optimize Shell/Python scripts for file monitoring, automation, and load-status tracking (see the monitoring sketch after this list).
- Manage data loads into the Cloudera Hadoop data lake, ensuring data quality and operational reliability.
- Analyze and fix GoldenSource data-load exceptions, addressing data discrepancies and ingestion failures.
- Execute new data reconciliations, reconciliation remediation, and periodic full data loads across multiple systems.
- Collaborate with data operations and business teams to troubleshoot and resolve data quality issues.
- Work with DevOps teams to integrate deployment pipelines and ensure smooth CI/CD processes.
- Provide database-level expertise for performance optimization, query tuning, and schema design (preferably Oracle or PostgreSQL).
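
To make the scripting expectations above concrete, here is a minimal sketch of a vendor-file monitor with load-status tracking. The drop directory, file pattern, and status-log path are illustrative assumptions, not part of any specific GoldenSource deployment.

```python
#!/usr/bin/env python3
"""Minimal sketch of a vendor-file monitor with load-status tracking.
All paths and the *.csv pattern are illustrative assumptions."""

import csv
import hashlib
import time
from datetime import datetime, timezone
from pathlib import Path

DROP_DIR = Path("/data/vendor/incoming")           # assumed vendor drop directory
STATUS_LOG = Path("/data/vendor/load_status.csv")  # assumed status-tracking file
POLL_SECONDS = 60

def checksum(path: Path) -> str:
    """SHA-256 of the file, used to detect re-delivered or changed files."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def record_status(path: Path, status: str) -> None:
    """Append one row per file event so load status can be audited later."""
    is_new = not STATUS_LOG.exists()
    with STATUS_LOG.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["timestamp_utc", "file", "sha256", "status"])
        writer.writerow([
            datetime.now(timezone.utc).isoformat(),
            path.name,
            checksum(path),
            status,
        ])

def main() -> None:
    seen: set[str] = set()
    while True:
        for path in sorted(DROP_DIR.glob("*.csv")):  # assumed vendor file pattern
            if path.name in seen:
                continue
            seen.add(path.name)
            record_status(path, "RECEIVED")
            # A real implementation would hand the file to the ETL loader here
            # and record subsequent LOADED / FAILED statuses.
        time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    main()
```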
Required Skills & Experience
- 3-7 years of hands-on experience in GoldenSource or Markit EDM implementation and support.
- Proven experience with Vendor File Integration and Data Management workflows.
- Strong knowledge of ETL processes, Shell/Python scripting, and automation frameworks.
- Experience with Hadoop / Cloudera Data Lake environments.
- Expertise in data reconciliation, exception handling, and remediation within data management platforms (a minimal reconciliation sketch follows at the end of this section).
- Strong SQL and database experience (Oracle, PostgreSQL, or similar).
- Working knowledge of DevOps and CI/CD pipelines (e.g., Jenkins, GitHub Actions, or Azure DevOps).
- Excellent problem-solving, analytical, and communication skills.
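
As a flavor of the reconciliation work described above, here is a minimal key-level reconciliation sketch using pandas. The file names, the instrument_id key, and the assumption that both extracts share the same columns are illustrative; this is not a GoldenSource API.

```python
"""Minimal sketch of a key-level reconciliation between a source extract
and a target load. Column names and file paths are assumptions."""

import pandas as pd

def reconcile(source_csv: str, target_csv: str, key: str = "instrument_id") -> pd.DataFrame:
    """Return one exception row per key that is missing on either side
    or whose non-key attributes differ (assumes unique keys and a
    shared column layout on both sides)."""
    src = pd.read_csv(source_csv).set_index(key)
    tgt = pd.read_csv(target_csv).set_index(key)

    exceptions = []
    for k in src.index.difference(tgt.index):
        exceptions.append({"key": k, "issue": "missing_in_target"})
    for k in tgt.index.difference(src.index):
        exceptions.append({"key": k, "issue": "missing_in_source"})

    # For keys present on both sides, flag rows where any attribute differs.
    common = src.index.intersection(tgt.index)
    diff = src.loc[common].ne(tgt.loc[common]).any(axis=1)
    for k in diff[diff].index:
        exceptions.append({"key": k, "issue": "attribute_mismatch"})

    return pd.DataFrame(exceptions, columns=["key", "issue"])

if __name__ == "__main__":
    report = reconcile("source_extract.csv", "target_load.csv")
    report.to_csv("recon_exceptions.csv", index=False)
    print(f"{len(report)} exception(s) written to recon_exceptions.csv")
```

In practice the exception report would feed the remediation workflow, with each row triaged as a re-delivery request, a mapping fix, or a manual correction.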