We are looking for a Data Analytics Engineer to join our team based in Cape Town (on-site).
Main Responsibilities:
- Define a structured approach to problem solving and deliver against it.
- Create role-specific design standards, patterns, and principles.
- Assist with and advise on the planning and management of the team's workload.
- Take the lead in analytical and information-layer design solutions and provide guidance to other data analytics engineers in the team.
- Collaborate with other data engineers and data modellers to design, implement, and manage the data pipeline that makes data available in the information layer.
- Create and automate reports for use by a large user base across our branch network.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing models for greater scalability.
- Monitor and fine-tune the data pipeline, reports and information layer for optimal performance.
- Monitor and manage hardware and software in the SOH Analytics environment, including maintenance, patching, upgrades, and access.
- Use modern development and modelling techniques and tools to implement BI and data management solutions, including data quality, metadata, and reference data.
- Engage with a wide range of technical stakeholders including data scientists, data analysts, business analysts, other data engineers and solution architects.
- Support data stewards in establishing and enforcing guidelines for data collection, quality improvements, integration, and processes.
- Design and implement dimensional models or other structures in the analytical layer.
Role Requirements:
Qualifications:
Bachelor's degree in Computer Science, Statistics, Informatics, Information Systems, Engineering, or another quantitative field, or a National Diploma in an Information Technology-related discipline, preferred.
Work Experience:
Must have 2-5 years of relevant experience in a similar environment, working with the applicable tools and techniques.
Technical Knowledge and Experience:
- Strong understanding of data, data structures, and data sources.
- Application and data engineering background with a solid knowledge of SQL.
- Knowledge of database management system (DBMS) physical implementation, including tables, joins and SQL querying.
- Experience in database technologies (e.g. SAP HANA and Web IDE, or similar) or Hadoop components including HDFS, Hive, Spark, Oozie & Impala is preferred and highly advantageous.
- Knowledge & experience of all BOBJ components is desirable: creating & maintaining universes & business layers through IDT, and BOBJ report building, maintenance, scheduling & publications with Web Intelligence and CMC.
- Object-orientated/object-functional scripting languages (Python, Java, Shell, or related).
- Knowledge & experience of structured data, such as entities, classes, hierarchies, relationships & metadata.
- Strong Data Engineering background with a specific focus on staging high quality data.
- Understanding of data warehousing principles (Kimball and Data Vault).
- Experience in agile development.
- Ability to comply with & manage data assets under a strict governance framework.
Preferred Skills include:
- Data Warehousing (Kimball & Data Vault patterns preferred) & dimensional data modelling (OLAP & MDX experience)
- Solid background in SQL, Information Architecture & ETL procedures.
- Experience with object-orientated/functional/scripting languages (e.g. Python, Unix Shell Scripting, Java, Scala, etc.) is preferred but not essential.
- Data Management technologies (Informatica Data Quality (IDQ), Informatica Enterprise Data Catalog (EDC), Axon, EBX).