Essential Skills Requirements:
- Terraform
- Python 3.x
- SQL (Oracle/PostgreSQL)
- PySpark
- Boto3
- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell / Bash
- Enterprise Collaboration tools (e.g., Confluence, JIRA)
- Experience with data formats: Parquet, AVRO, JSON, XML, CSV
- Experience with Data Quality Tools (e.g., Great Expectations)
- Experience with REST APIs
- Basic Networking and troubleshooting skills
- Agile Working Model (AWM) responsibilities
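To make several of the essential skills above concrete (Python, PySpark, ETL, the Parquet format, and a basic data quality gate standing in for a tool such as Great Expectations), here is a minimal sketch of the kind of pipeline step this role involves. All bucket, path, and column names are hypothetical illustrations, not details from this posting.

```python
# Minimal PySpark ETL sketch: read Parquet, apply a basic data quality
# gate, and write the validated result back out. Bucket/path/column
# names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Extract: read raw Parquet data (hypothetical S3 location).
raw = spark.read.parquet("s3://example-raw-bucket/orders/")

# Transform: normalise column types and drop obviously bad rows.
orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .filter(F.col("order_id").isNotNull())
)

# Basic data quality gate (a simple stand-in for a dedicated tool such
# as Great Expectations): fail the run if duplicate keys slip through.
total = orders.count()
distinct_ids = orders.select("order_id").distinct().count()
if total != distinct_ids:
    raise ValueError(f"Duplicate order_id values: {total - distinct_ids}")

# Load: write the validated data to the curated zone as Parquet.
orders.write.mode("overwrite").parquet("s3://example-curated-bucket/orders/")
```

On EMR or Glue the s3:// scheme resolves natively; self-managed Spark clusters typically use s3a:// instead.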
Advantageous Skills Requirements:
- Expertise in data modeling (Oracle SQL)
- Strong analytical skills for large data sets
- Testing and data validation experience
- Precise documentation skills
- Self-driven with ability to work independently and in teams
- Experience with AWS Glue, Data Pipeline, or similar
- Familiarity with AWS S3, AWS RDS, DynamoDB
- Knowledge of software design patterns
- Ability to prepare specifications for programming work
- Organizational skills
- AWS component knowledge: Glue, CloudWatch, SNS, Athena, S3, Kinesis Data Streams, Lambda, DynamoDB, Step Functions, Parameter Store, Secrets Manager, CodeBuild/CodePipeline, CloudFormation, EMR, Redshift
- BI experience and technical data modeling
- Kafka
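As a brief illustration of how a few of the AWS components above are typically driven from Python, the following Boto3 sketch reads a pipeline setting from Parameter Store and publishes a run notification to SNS. The region, parameter name, and topic ARN are hypothetical.

```python
# Boto3 sketch touching two of the AWS components listed above:
# SSM Parameter Store and SNS. All names are hypothetical.
import boto3

ssm = boto3.client("ssm", region_name="eu-west-1")
sns = boto3.client("sns", region_name="eu-west-1")

# Parameter Store: fetch a pipeline setting, decrypting it in case it
# is stored as a SecureString parameter.
param = ssm.get_parameter(
    Name="/example/etl/source-bucket",
    WithDecryption=True,
)
source_bucket = param["Parameter"]["Value"]

# SNS: notify an operations topic that a pipeline run has started.
sns.publish(
    TopicArn="arn:aws:sns:eu-west-1:123456789012:example-etl-events",
    Subject="ETL run started",
    Message=f"Processing objects from {source_bucket}",
)
```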
Qualifications/Experience Needed:
- Relevant IT/Business/Engineering Degree
- Certifications: AWS Certified Cloud Practitioner, AWS Certified SysOps Administrator Associate, AWS Certified Developer Associate, AWS Certified Solutions Architect Associate, AWS Certified Solutions Architect Professional, HashiCorp Certified Terraform Associate
Role and Responsibilities:
Data Engineers at our IT Hub are responsible for building and maintaining Big Data pipelines on our data platforms. They ensure data integrity and security in line with information classification requirements, enabling secure data sharing on a need-to-know basis.
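One common AWS pattern for that kind of need-to-know sharing, shown here purely as a sketch with hypothetical names rather than this team's actual method, is to store objects encrypted under a KMS key and hand out short-lived pre-signed URLs instead of bucket-wide access.

```python
# Sketch: need-to-know data sharing via KMS-encrypted S3 objects and a
# short-lived pre-signed URL. Bucket, key, and alias are hypothetical.
import boto3

s3 = boto3.client("s3", region_name="eu-west-1")

# Write the object with server-side encryption under a specific KMS key.
s3.put_object(
    Bucket="example-classified-bucket",
    Key="reports/2024/q1-summary.parquet",
    Body=b"...binary parquet payload...",
    ServerSideEncryption="aws:kms",
    SSEKMSKeyId="alias/example-data-key",
)

# Share a single object for one hour; the link expires on its own and
# the recipient never needs bucket-wide permissions.
url = s3.generate_presigned_url(
    "get_object",
    Params={
        "Bucket": "example-classified-bucket",
        "Key": "reports/2024/q1-summary.parquet",
    },
    ExpiresIn=3600,
)
print(url)
```

Because the URL is signed with the issuing engineer's credentials and expires after the ExpiresIn window, recipients need no IAM access of their own.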