Responsible for the requirements, design, development, testing, and documentation of new and legacy data ingestion, integration, processing, and storage software for the company's IoT Platform Solution.
This IoT solution is a big data analytics platform for the mining industry. It focuses on collecting, processing, contextualising, and managing data assets, turning them into information, and then leveraging that information to address the four pillars of analytics: descriptive, diagnostic, predictive, and prescriptive.
Responsibilities:
- Data ingestion
- Data integration
- Data processing
- Data storage
- Digital twin management
- Data management and governance
Experience and Qualifications:
- Bachelor's degree in computer/software engineering or computer science.
- Master's or doctoral-level degree (advantageous).
- 2-5 years' experience in computer science, software or computer engineering, applied mathematics, physics, statistics, or related fields.
- Experience with wireless and network communication technologies such as Wi-Fi, GSM, LoRaWAN, Bluetooth, and TCP/IP.
- 2-5 years' experience with data lake and data warehousing solutions.
- 2-5 years' experience with one or more of Python, Java, C++, C#, SPARQL, or SQL databases.
- 2-5 years' experience with one or more of Apache Kafka (producers/consumers/connectors), Apache NiFi, Apache Spark, Apache ActiveMQ, MQTT, Modbus, or REST APIs.
- Experience developing, building, and releasing containerised services and microservice architectures using Docker.
- Experience with big data platforms and with structured, unstructured, and semi-structured data management.
- Experience with data analytics, machine learning, and AI (advantageous).
- Experience with workflow management tools and BPMN, e.g. Camunda (advantageous).
- Experience with distributed systems and cluster orchestration systems such as Kubernetes.