AWS Data Engineer/JHB

Johannesburg, Gauteng, South Africa

Job Description


AWS Data Engineer/JHB

Contract Type: 12 months

Our client is seeking a highly motivated and experienced AWS Data Engineer to join their Global Markets division. The role involves working closely with traders, quantitative analysts, and data scientists to design, develop, and maintain data infrastructure and pipelines for the global markets business.

The ideal candidate should have a strong background in C# or Python programming and in building data pipelines on AWS, especially with AWS Glue Jobs using PySpark or AWS Glue Spark. This role offers an exciting opportunity to collaborate with leading financial institutions, contributing to the design and implementation of data pipelines that support the global markets business. It is advantageous if the candidate holds the AWS Certified Machine Learning Specialty Certificate or can develop machine learning (ML) models and automate ML pipelines.

Minimum Experience:

5 Years Python/C# Development

3 Years AWS Data Engineering

Education Requirements:

Bachelor's degree in Computer Science, Information Systems, or a related field.

Advantageous: AWS Certified Machine Learning Specialty Certificate

Responsibilities differ across client engagements but may include:

Creating data models that can be used to extract information from various sources and store it in a usable format.

Leading the design, implementation, and delivery of large-scale, critical, or complex data solutions involving significant volumes of data.

Applying expertise in SQL and a strong understanding of ETL and data modelling.

Ingesting data into Amazon S3 and performing ETL into RDS or Redshift.

Using AWS Lambda (C# or Python) for event-driven data transformations (a minimal sketch follows below).
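
For illustration only, a minimal sketch of the event-driven Lambda pattern referenced above, written in Python with boto3. The bucket prefixes, column names, and filter rule are assumptions for the example, not details taken from this advert.

# Hypothetical AWS Lambda handler triggered by S3 object-created events.
# Names such as "raw/", "curated/", and the "status" column are placeholders.
import csv
import io

import boto3

s3 = boto3.client("s3")


def handler(event, context):
    """Read a newly landed CSV from S3, filter rows, and write the result
    to a curated prefix for downstream loading into RDS or Redshift."""
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Fetch the raw object and parse it as CSV.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        rows = list(csv.DictReader(io.StringIO(body)))
        if not rows:
            continue

        # Example transformation: keep only settled trades (placeholder rule).
        cleaned = [r for r in rows if r.get("status") == "SETTLED"]

        # Write the curated output to a separate prefix in the same bucket.
        out = io.StringIO()
        writer = csv.DictWriter(out, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(cleaned)
        s3.put_object(
            Bucket=bucket,
            Key=key.replace("raw/", "curated/"),
            Body=out.getvalue().encode("utf-8"),
        )
    return {"processed": len(records)}

A handler like this would typically be wired to S3 object-created notifications, with the curated output later loaded into RDS or Redshift via a COPY or bulk-insert step.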

Competencies:

Designing and implementing security measures to protect data from unauthorized access or misuse.

Maintaining the integrity of data by designing backup and recovery procedures.

Work on automating the migration process in AWS from development to production.

You will deliver digestible, contemporary, and immediate data content to support and drive business decisions.

You will be involved in all aspects of data engineering from delivery planning, estimating and analysis, all the way through to data architecture and pipeline design, delivery, and production implementation.

From day one, you will be involved in the design and implementation of complex data solutions ranging from batch to streaming and event-driven architectures, across cloud, on-premises, and hybrid client technology landscapes.

Optimize cloud workloads for cost, scalability, availability, governance, compliance, etc.

Must have experience with AWS Glue Jobs using PySpark or AWS Glue Spark (see the sketch after this list).

Real-time ingestion using Kafka is an added advantage.

Strong SQL and C# or Python programming knowledge.

Object-oriented principles in C# or Python: classes and inheritance.

Expert knowledge of data engineering packages, libraries, and related functions in C# or Python.

AWS technical certifications (Developer Associate or Solutions Architect).

Experience with development and delivery of microservices using serverless and managed AWS services (S3, RDS, Aurora, DynamoDB, Lambda, SNS, SQS, Kinesis, IAM).

Ability to understand and articulate requirements to technical and non-technical audiences.

Experience working with relational databases such as PostgreSQL, SQL Server, and MySQL.

Apply knowledge of scripting and automation using tools like PowerShell, Python, Bash, Ruby, Perl, etc.

Stakeholder management and communication skills, including prioritising, problem solving and interpersonal relationship building.

Effectively and efficiently troubleshoot data issues and errors.

Strong experience in SDLC delivery, including waterfall, hybrid, and Agile methodologies.

Experience delivering in an agile environment.

Experience in implementing and delivering data solutions and pipelines on AWS Cloud Platform.

A strong understanding of data modelling, data structures, databases, and ETL processes.

An in-depth understanding of large-scale data sets, including both structured and unstructured data.

Knowledge and experience in delivering CI/CD and DevOps capabilities in a data environment.

Ability to clearly communicate complex technical ideas.

Experience in the financial industry is a plus.

An AWS Certified Machine Learning Specialty Certificate is an advantage.
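
For candidates gauging the Glue Jobs requirement above, the following is a minimal AWS Glue PySpark job sketch showing the S3-to-curated-output pattern. The catalog database, table, column mappings, and output path are illustrative assumptions only, not details from this advert.

# Hypothetical AWS Glue (PySpark) job: read from the Glue Data Catalog,
# remap columns, and write curated Parquet to S3 for loading into Redshift.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])

sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw trade data registered in the Glue Data Catalog (placeholder names).
raw = glue_context.create_dynamic_frame.from_catalog(
    database="global_markets_raw",
    table_name="trades",
)

# Rename and retype columns so they match the target schema.
mapped = ApplyMapping.apply(
    frame=raw,
    mappings=[
        ("trade_id", "string", "trade_id", "string"),
        ("notional", "double", "notional", "double"),
        ("trade_date", "string", "trade_date", "date"),
    ],
)

# Write curated data to S3 as Parquet (placeholder path); from here it can be
# queried via Redshift Spectrum or loaded into Redshift with a COPY step.
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/trades/"},
    format="parquet",
)

job.commit()

In practice the job would be parameterised further (job bookmarks, partitioning, or a Redshift connection) depending on the engagement.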



Job Detail

  • Job Id
    JD1290092
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Johannesburg, Gauteng, South Africa
  • Education
    Not mentioned