SUMMARY:
Data Engineer/Metahub Developer
POSITION INFO:
Our client, a global tech firm, is seeking a Data Engineer/Metahub Developer to join their team in Rivonia on a contract basis with a hybrid working model. They offer stability, growth, attractive rates and a great working environment.
The role:
- Designs, builds and oversees the deployment and operation of technology architecture, solutions and software to capture, manage, store and utilize structured and unstructured data from internal and external sources.
- Establishes and builds processes and structures based on business and technical requirements to channel data from multiple inputs, route it appropriately and store it using any combination of distributed (cloud) structures, local databases and other applicable storage forms as required.
- Develops technical tools and programming that leverage artificial intelligence, machine learning and big-data techniques to cleanse, organize and transform data, and to maintain, defend and update data structures and integrity on an automated basis.
- Creates and establishes design standards and assurance processes for software, systems and applications development to ensure compatibility and operability of data connections, flows and storage requirements.
- Reviews internal and external business and product requirements for data operations and activity, and suggests changes and upgrades to systems and storage to accommodate ongoing needs.
The Ab Initio Metadata Hub Developer prepares data products and advances technical thought leadership for the enterprise, delivering fit-for-purpose metadata products for consumption across the enterprise.
Key roles and responsibilities:
- Develops, tests and deploys information extraction, analysis and management solutions
- Provides direct and responsive support for urgent analytical needs
- Collaborates with the development team to address issues within the framework of the project requirements
- Participates in architecture, design, testing and deployment of the applications
- Uses coding languages, scripting methodologies and sophisticated tools to solve problems with custom workflows
- Performs incremental testing on code, processes and deployments to identify ways to streamline execution and minimize errors
- Installs and maintains Apache Spark, Erlang and Spark clusters
- Manages the day-to-day operations of a Spark environment, including provisioning, maintenance and monitoring
- Uses Python for debugging and custom data mining
Experience and Qualifications required:
- Bachelor’s degree or diploma in Engineering or Computer Science
- Must have 8 years’ experience as a business analyst involved in assessing, mapping and optimizing business processes
- Understanding of data modelling and the ability to perform basic SQL database queries would be an advantage
- Experience within the financial services environment would be an advantage
The required skillset for the Metadata Hub Developer is as follows:
- Ab Initio Graph Developer
- Metadata Hub extractors
- Metadata Hub Imports – Lineage and application assignment
- Reports in Metadata Hub
- Metadata Hub Configurations