We are looking to hire a skilled AWS Cloud ETL Developer to design and build AWS cloud ETL pipelines for ingesting and processing big data.
Your duties will include working with the AWS data lake development team, developing and implementing solutions that facilitate the movement of data, testing those solutions, and providing third-line support for them.

Responsibilities:
- Consult with business teams to understand data ingestion and processing requirements
- Develop effective, secure, fit-for-purpose ETL capabilities to ingest and process data to, from, and within the cloud
- Select suitable cloud components based on performance, cost-effectiveness, security, and other relevant criteria
- Develop patterns that meet business objectives, follow best practices, and are extensible and reusable
- Follow the software development lifecycle
- Test and troubleshoot solutions
- Provide documentation on how your solutions should be used, and train business teams to use them
- Provide third-line support for developed solutions, including troubleshooting and corrective measures, tuning and optimisation, and monitoring
- Identify and assist with improvements to developed solutions, guided by operational excellence, security, reliability, performance efficiency, and cost optimisation
- Manage time effectively through planning and prioritisation of activities
- Discuss and communicate progress with the development team and stakeholders
Requirements:

- Familiarity with modern cloud concepts and best practices
- Development experience with traditional relational databases and with big data technologies
- Extensive knowledge of coding languages, especially Python, PySpark, and SQL
- Experience working with AWS cloud technologies for processing business data
- Solid knowledge of and experience with Business Intelligence architectures, principles, techniques, and best practices
- Proven work experience as a Data Engineer developing ETL code and data pipelines
- Ability to understand and integrate with common enterprise IT components such as Active Directory, virtual machines, and file shares
- Ability to troubleshoot and solve complex technical problems
- Experience managing the software development lifecycle
- Creative problem-solving skills around data engineering and data integration.
- The ability to prioritise activities and deliver on time
- Willingness and ability to learn and adapt to the changing technology landscape
- A positive attitude, strong written and verbal communication, and good interpersonal skills
Desirable:

- Data warehousing experience
- Big data experience
- One or more relevant AWS cloud certifications
- Experience with Terraform
- Bachelor’s degree in Computer Science, Information Technology, Data Engineering or related field