Responsibilities (including but not limited to):
Spearhead the architectural design and implementation of scalable data engineering solutions using advanced cloud data warehouse technologies (e.g., Snowflake, AWS Redshift, Google BigQuery, Databricks, or Azure Synapse Analytics), and advocate the adoption of ELT patterns over traditional ETL processes to improve data agility and efficiency.
Drive the development and assessment of proof-of-concept (POC) initiatives for incorporating an Operational Data Store (ODS) and other contemporary data processing frameworks, such as the Medallion Architecture, ensuring our approach remains technology-agnostic and aligned with best practices.
Oversee the optimization of data flows, using ELT processes to streamline data loading and transformation in cloud data warehouses while ensuring high data quality and accessibility.
Lead and refine CI/CD processes for seamless data pipeline deployments, incorporating best practices in version control with Git.
Collaborate with cross-functional teams to capture and address comprehensive data requirements, ensuring robust support for business analytics and decision-making.
Uphold rigorous data security and compliance standards, aligning with financial industry regulations and evolving data privacy best practices.
Key Requirements:
Experience: A minimum of 5 years in Data Engineering, including 2+ years in a senior or leadership role, with a preference for experience in the financial services sector.
Technical Expertise: Proficiency in at least one major cloud data warehouse solution (e.g., Snowflake, AWS Redshift, Google BigQuery, Databricks, Azure Synapse Analytics), with a strong emphasis on implementing ELT patterns and familiarity with modern data architecture frameworks like the Medallion Architecture.
Leadership and Innovation: Demonstrated leadership in driving the adoption of modern data processing strategies, with the ability to manage complex projects and innovate within the data engineering space.
Programming Skills: Strong proficiency in programming languages such as Python or Java, along with demonstrable advanced knowledge of SQL on a cloud data warehouse platform, essential for developing and managing ELT processes.
Certifications: Cloud platform certification (e.g., AWS Solutions Architect, Google Cloud Professional Data Engineer, Snowflake SnowPro) is highly desirable.
Communication: Excellent verbal and written communication skills, essential for effective collaboration across teams and with stakeholders.
Minimum Qualifying Attributes:
Hands-on experience with change data capture (CDC) based data ingestion tools and methodologies.
Comprehensive understanding of data modeling, ETL/ELT processes, and ensuring data security and privacy, especially within the financial industry.
Apply now and be a driving force in shaping the future of data engineering.