POSITION INFO:
We are seeking an exceptionally talented, hands-on Intermediate Data Engineer who brings deep technical expertise in data pipelines, data modelling, and integration.
This role is accountable for designing and delivering scalable, reliable, and high-performance data solutions that support both analytics and application use cases. The successful candidate will play a key role in shaping the organization’s data architecture, ensuring that data is structured, governed, and accessible in a way that supports business decision-making and product capabilities.
The ideal candidate is a self-starter who values clean, maintainable data solutions, understands trade-offs between normalized and analytical models, and continuously improves the quality and performance of the data platform.
What you'll do:
- Data Pipeline Development & Integration
- Design, develop, and maintain scalable data pipelines using Azure Data Factory (ADF), including pipelines, data flows, triggers, and parameterization.
- Integrate data from APIs, flat files, databases, and cloud/on-prem systems.
- Implement robust ingestion patterns for structured and semi-structured data (JSON, XML, CSV).
- Ensure reliable, efficient, and secure movement of data across systems.
- Data Modelling & Transformation
- Design and maintain both normalized (OLTP-aligned) and denormalized (analytical / reporting) data models.
- Apply best practices in dimensional modelling (fact/dimension tables) as well as normalized relational design.
- Implement transformations using SQL (T-SQL), stored procedures, and data flows to prepare analytics-ready datasets.
- Ensure data models are scalable, reusable, and aligned with business requirements.
- Manage historical data tracking, including slowly changing dimensions and auditability.
- Performance, Reliability & Scalability
- Optimize SQL queries, ETL pipelines, and data storage for large datasets (millions of rows or more).
- Implement indexing strategies, partitioning, and efficient data access patterns.
- Ensure pipelines are resilient with proper error handling, retry logic, and monitoring.
- Design solutions that minimize impact on transactional systems (clear separation of OLTP and reporting workloads).
- Proactively identify and resolve performance bottlenecks.
- Application & API Integration
- Collaborate closely with backend (.NET) teams to support data access patterns and integration with application services.
- Design and deliver aggregated datasets and data structures optimized for API consumption.
- Support frontend (e.g., Vue.js) data requirements by enabling efficient querying, filtering, and pagination.
- Contribute to embedded analytics and application-driven reporting use cases.
- Collaboration & Continuous Improvement
- Work closely with BI developers, analysts, and stakeholders to translate data requirements into scalable solutions.
- Continuously improve data platform architecture, tooling, and processes.
- Support CI/CD practices for data pipelines and deployments.
- Stay current with evolving data engineering tools, patterns, and best practices.
Education & Qualifications:
- Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field, or equivalent practical experience.
Your Expertise:
- 5+ years in data engineering, ETL development, or related roles.
- Azure Data Factory (ADF): Strong hands-on experience with pipeline orchestration, data flows, triggers, parameterization, and monitoring.
- SQL / T-SQL: Advanced querying, performance tuning, indexing strategies, and stored procedure development.
- Data Modelling:
- Strong experience with both normalized (3NF) and denormalized (star/snowflake) data models.
- Understanding of when to apply each approach.
- Experience designing scalable and maintainable data schemas.
- Data Platforms: Experience with Azure SQL, Synapse Analytics, or Data Lake architectures.
- ETL / ELT:
- Strong understanding of data pipeline design, incremental loading, and transformation strategies.
- Exposure to SSIS, Informatica, Talend, dbt, or similar tools.
- Data Warehousing: Solid knowledge of dimensional modelling (star/snowflake schemas) and data lifecycle management.
- Performance & Scalability:
- Experience working with large-scale datasets and high-volume data pipelines.
- Strong understanding of indexing, partitioning, and query optimization techniques.
- Experience designing solutions that separate transactional and analytical workloads.
- Data Governance & Quality:
- Experience implementing data validation, reconciliation, and quality controls.
- Strong understanding of data lineage, auditability, and consistency.
- Integration: Experience working with APIs and handling JSON/XML data formats.
- Familiarity with Power BI, Tableau, or similar platforms.
- Experience with Azure data services (e.g., Synapse, Fabric) is advantageous.
- Azure DevOps, Git, and CI/CD experience is a plus.
- Demonstrated experience delivering end-to-end data engineering solutions in production environments.
- Proven experience contributing to code reviews, enforcing standards, and improving engineering practices.
Other information applicable to the opportunity:
- Permanent position
- Location: Cape Town
Why work for us?
Want to work for an organization that solves complex real-world problems with innovative software solutions? At iOCO, we believe anything is possible with modern technology, software, and development expertise. We are continuously pushing the boundaries of innovative solutions across multiple industries using an array of technologies. 
You will be part of a consultancy, working with some of the most knowledgeable minds in the industry on interesting solutions across different business domains. 
Our culture of continuous learning will ensure that you will have all the opportunities, tools, and support to hone and grow your craft. 
By joining iOCO you will have an open invitation to inspiring developer forums: a place where you will be able to connect with and learn from your peers by sharing ideas, experiences, practices, and solutions.
iOCO is an equal opportunity employer with an obligation to achieve its own unique EE objectives in the context of Employment Equity targets. Therefore, our employment strategy gives primary preference to previously disadvantaged individuals or groups.