Group Technology COE: Data Engineer X2

Centurion, Gauteng, South Africa

Job Description

Introduction



Through its client-facing brands, including Multiply (a wellness and rewards programme) and specialist brands such as Guardrisk and Eris Property Group, the Momentum Metropolitan Group enables businesses and people from all walks of life to achieve their financial goals and life aspirations.

We help people grow their savings, protect what matters to them, and invest for the future. We help companies and organisations care for and reward their employees and members. Through our own network of advisers, via independent brokers, and by utilising new platforms, the Momentum Group provides practical financial solutions for people, communities, and businesses.



We build and protect our clients' financial dreams.



Visit us at www.momentummetropolitan.co.za





Disclaimer

As an applicant, please verify the legitimacy of this job advert on our company career page.


Role Purpose


The Data Engineer is responsible for designing, developing, and maintaining scalable data pipelines and storage solutions that support the organisation's business intelligence, analytics, and operational data requirements. The role ensures that stakeholders across the organisation have access to reliable, high-quality data.



As a member of the Centre of Excellence (COE), the Data Engineer executes the organisation's data strategy by adhering to the governance policies, best practices, and frameworks established by senior leadership. The role calls for technical execution, problem-solving, and collaboration with cross-functional teams to deliver high-quality data solutions within these predefined policies and strategies.



Requirements



Bachelor's degree in Computer Science, Information Systems, Engineering, or a related discipline, or equivalent relevant experience.
Microsoft Azure Data Engineer or AWS Data Analytics certification.
3 to 5 years of experience in data engineering or a related field.
Practical experience with cloud-based data solutions and big data ecosystems.
Demonstrated history of successfully designing and implementing scalable data pipelines.
A master's degree in a related field, such as data engineering or analytics (advantageous).
Experience with AI- or machine learning-driven data processing (advantageous).
Proven experience in an Agile DataOps environment or a Centre of Excellence (COE) (advantageous).




Duties & Responsibilities


Data Ingestion and Integration

Develop, implement, and optimise efficient, scalable data ingestion processes to extract data from various sources, such as relational databases, APIs, cloud services, and external data providers (a minimal ingestion sketch follows this list).
Ensure seamless integration of structured and unstructured data into centralised storage environments, including data warehouses and data lakes.
Apply data validation techniques to ensure data accuracy, integrity, and completeness.
Automate data ingestion workflows to minimise manual intervention and support both real-time and batch processing.
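
For illustration, a minimal Python sketch of a batch ingestion job of the kind described above; the API endpoint, key column, and data lake landing path are hypothetical placeholders.

    # Minimal batch ingestion sketch: pull records from a REST API and land
    # them in a data lake as Parquet. Endpoint, fields, and path are
    # hypothetical placeholders, not part of any specific system.
    import requests
    import pandas as pd

    API_URL = "https://example.com/api/policies"          # hypothetical source API
    LANDING_PATH = "lake/raw/policies/policies.parquet"   # hypothetical lake path

    def ingest_batch() -> None:
        response = requests.get(API_URL, timeout=30)
        response.raise_for_status()                       # fail fast on HTTP errors
        records = response.json()

        df = pd.DataFrame(records)

        # Basic validation before landing: reject empty or malformed batches.
        if df.empty or "policy_id" not in df.columns:
            raise ValueError("Batch failed validation: no rows or missing key column")

        df.to_parquet(LANDING_PATH, index=False)          # write to the raw zone

    if __name__ == "__main__":
        ingest_batch()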


Data Transformation and Processing



Build ETL/ELT pipelines that cleanse, transform, and organise raw data for analytics and business reporting.
Use big data processing frameworks (e.g. Apache Spark, Hadoop) for large-scale data processing (see the sketch after this list).
Optimise data transformation tasks for performance, efficiency, and cost-effectiveness.
Implement data enrichment processes that add context to datasets from internal or external sources.
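
For illustration, a minimal PySpark sketch of the kind of ELT work described above; the application name, paths, and column names are hypothetical placeholders. Partitioning the curated output by year is one common way to let downstream reads prune data and run faster.

    # Minimal PySpark ELT sketch: read raw data, cleanse and transform it,
    # and write a curated, partitioned table.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("claims_etl").getOrCreate()

    raw = spark.read.parquet("lake/raw/claims")            # hypothetical raw zone

    curated = (
        raw.dropDuplicates(["claim_id"])                   # remove duplicate claims
           .filter(F.col("claim_amount") > 0)              # drop invalid rows
           .withColumn("claim_date", F.to_date("claim_date"))
           .withColumn("claim_year", F.year("claim_date")) # enrichment column
    )

    # Partition by year so downstream queries only scan the years they need.
    (curated.write
            .mode("overwrite")
            .partitionBy("claim_year")
            .parquet("lake/curated/claims"))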


Data Storage and Management



Design and maintain data storage architectures that are scalable, reliable, and efficient.
Optimise database schemas, indexing, and partitioning strategies to improve data access speed and reduce storage costs.
Ensure that data storage solutions adhere to organisational data governance policies.
Manage metadata, schema definitions, and data dictionaries to support transparency and data discoverability (a minimal data dictionary sketch follows this list).
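
For illustration, a minimal Python sketch of a data dictionary used for schema validation; the table and its columns are hypothetical placeholders.

    # Minimal data dictionary sketch: keep column names, types, and
    # descriptions in one place and validate incoming data against it.
    import pandas as pd

    POLICY_DATA_DICTIONARY = {
        "policy_id":   {"dtype": "int64",          "description": "Unique policy identifier"},
        "client_name": {"dtype": "object",         "description": "Full name of the policy holder"},
        "start_date":  {"dtype": "datetime64[ns]", "description": "Policy start date"},
        "premium":     {"dtype": "float64",        "description": "Monthly premium in ZAR"},
    }

    def validate_schema(df: pd.DataFrame) -> list[str]:
        """Return a list of schema problems; an empty list means the frame conforms."""
        problems = []
        for column, meta in POLICY_DATA_DICTIONARY.items():
            if column not in df.columns:
                problems.append(f"Missing column: {column}")
            elif str(df[column].dtype) != meta["dtype"]:
                problems.append(f"{column}: expected {meta['dtype']}, got {df[column].dtype}")
        return problems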


Performance Optimisation and Data Pipeline Monitoring



Establish monitoring frameworks to track data pipeline health, reliability, and performance (a minimal health check sketch follows this list).
Promptly identify and resolve data processing bottlenecks, failures, and latency issues.
Implement proactive alerting to detect anomalies in data quality or data movement.
Continuously improve pipeline efficiency through automation and tuning.
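
For illustration, a minimal Python sketch of a pipeline health check with simple alerting; the monitored file (reusing the hypothetical landing path from the ingestion sketch), thresholds, and alert hook are placeholders.

    # Minimal pipeline health check sketch: verify that the latest batch
    # landed recently and is not unexpectedly small, and raise an alert
    # otherwise.
    import datetime as dt
    import os

    LANDING_FILE = "lake/raw/policies/policies.parquet"   # hypothetical ingestion output
    MAX_AGE_HOURS = 24
    MIN_SIZE_BYTES = 1_000

    def send_alert(message: str) -> None:
        # Placeholder: in practice this would post to a monitoring or chat tool.
        print(f"ALERT: {message}")

    def check_pipeline_health() -> None:
        if not os.path.exists(LANDING_FILE):
            send_alert("Landing file is missing; ingestion may have failed")
            return

        age = dt.datetime.now() - dt.datetime.fromtimestamp(os.path.getmtime(LANDING_FILE))
        if age > dt.timedelta(hours=MAX_AGE_HOURS):
            send_alert(f"Data is stale: last written {age} ago")

        if os.path.getsize(LANDING_FILE) < MIN_SIZE_BYTES:
            send_alert("Latest batch is unexpectedly small; possible data quality issue")

    if __name__ == "__main__":
        check_pipeline_health()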


Data Security, Governance, and Compliance



Implement data security policies consistent with company compliance standards and industry regulations.
Work with data governance teams to establish and enforce data quality standards.
Implement appropriate encryption protocols and role-based access controls (RBAC) (a minimal RBAC sketch follows this list).
Support compliance with POPIA and other relevant data protection laws.
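
For illustration, a minimal Python sketch of a role-based access control check; the roles, actions, and permissions are hypothetical placeholders.

    # Minimal RBAC sketch: map roles to the dataset actions they may perform
    # and check requests against that map.
    ROLE_PERMISSIONS = {
        "data_engineer": {"read", "write", "delete"},
        "analyst":       {"read"},
        "auditor":       {"read"},
    }

    def is_allowed(role: str, action: str) -> bool:
        """Return True if the role is permitted to perform the action."""
        return action in ROLE_PERMISSIONS.get(role, set())

    # Example usage: an analyst may read but not write curated datasets.
    assert is_allowed("analyst", "read")
    assert not is_allowed("analyst", "write")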


Stakeholder Engagement and Collaboration



Collaborate closely with data scientists, business analysts, IT teams, and key stakeholders to understand their data requirements and translate them into technical solutions.
Contribute to cross-functional initiatives by providing expertise on data infrastructure and engineering best practices.
Mentor junior team members, share knowledge, and help foster a data-driven culture within the organisation.
Liaise between technical and business teams to keep data strategies and use cases aligned.



Competencies



Examining Information
Team Working
Following Procedure
Documenting Facts
Meeting Timescales
Managing Tasks
Adopting Practical Approaches
Checking Things




Job Detail

  • Job Id
    JD1398485
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Centurion, Gauteng, South Africa
  • Education
    Not mentioned