Provide BAU support for the Enterprise Data Warehouse (EDW), including monitoring, troubleshooting, performance tuning, and issue resolution
Support Qlik Replicate processes and ensure reliable data ingestion across systems
Implement enhancements and modification requests to the EDW based on business needs
Perform data validation, quality checks, and root cause analysis for data issues
Ensure operational stability, security, and performance of live data pipelines and applications
Document processes, configurations, and support procedures to maintain platform resilience
Project & Modernization Exposure
While this is primarily a BAU-focused role, the Data Platform Engineer will also:
Contribute to the migration of the EDW to Microsoft Fabric
Gain exposure to modern cloud data technologies and evolving data architecture patterns
Support development work related to platform modernization initiatives
Work under the guidance and mentorship of senior data engineers on strategic transformation efforts.
Follow coding best practices and ensure that all development work complies with best-practice guidelines.
Maintenance & Continuous Improvement
Follow ITIL Incident, Problem, and Change Management practices in accordance with SLAs and established processes.
Identify key problem areas within the application and implement improvements; evaluate and improve existing data analytics systems.
Data Expertise
Understand the IB’s main business processes and how they relate to the data that is generated or captured.
Understand associated data flows and dependencies between different enterprise systems
Independently research, test, and problem-solve technical issues or blockers, obtaining guidance from Leads or Architects when required.
Drive the troubleshooting of key technical issues or escalate and work with appropriate teams.
Identify key improvement areas and discuss with the technical lead where needed.
BSc/BA in Computer Science, Engineering or relevant field
Building the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources, applications and platforms.
Able to integrate multiple data sources and end-user applications with databases into one system for storing and retrieving data.
Experience in designing and implementing robust data pipelines and ETL/ELT frameworks
Proven experience as a data warehouse developer, including full implementation of a data warehousing solution.
Experience in data engineering solutions built on modern data lake or lakehouse architectures, including Delta Lake or equivalent frameworks (e.g. Microsoft Fabric)
Good understanding of enterprise design concepts: re-usability, continuous integration, security, scheduling, monitoring, etc.
Expertise with Azure Resource Management and templates is an added advantage
Exposure to cloud technologies (MS Azure, AWS) and a desire to learn and deliver new capabilities (big data, BI, data science, etc.) on an as-needed basis
Expertise in data warehouse design methodologies and technologies, data modelling (Data Vault modelling methodology experience is preferable), data quality and metadata
Effective oral and written communication and presentation skills.
Strong interpersonal skills. Self-motivated with a keen attention to detail and quality of work.
Digile is a future-forward technology partner, weaving AI into the fabric of enterprise systems. We design and deliver intelligent, scalable, and secure platforms that turn complexity into clarity, amplify decision-making, and accelerate digital growth. We don’t just adapt to the future - we engineer it.