First Quantum Minerals
At First Quantum, we free the talent of our people by taking a very different approach, one underpinned by a very definite culture – the “First Quantum Way”.
Working with us is not like working anywhere else, which is why we recruit people who will take a bolder, smarter approach to spot opportunities, solve problems and deliver results.
Our culture is all about encouraging you to think independently and to challenge convention to deliver the best result. That’s how we continue to achieve extraordinary things in extraordinary locations.
Job description:
Job title: Data Architect
Site: Kansanshi Mining Plc.
Department: IT & Digital
Section: Business Analytics and Engagement
Position reports to: IT Enterprise Architect Lead
PURPOSE
Manage site data governance and contribute to regional and group data engineering teams in pursuing the vision of analytics-driven mining. Provide expertise in building and deploying databases.
The role includes the development and design of data and software strategies, as well as monitoring and improving system and data performance.
The Data Architect is responsible for planning for future upgrades, expansion and capacity requirements. In addition, the Data Architect plans, coordinates, and implements security measures to safeguard the data and related environment.
The role also includes setting the design, configuration, and development standards for all databases on site.
The Data Architect determines database structural and functional requirements by analysing operations, applications and programming.
KEY RESPONSIBILITIES
Develop and oversee the creation and updating of database solutions by designing proposed systems and enhancements. Define database structure and functional capabilities, security, and backup and recovery specifications; document the design through to implementation.
Design, develop, deploy, and support the enterprise data platform based on Microsoft Azure services, aligned to regional and group data engineering and analytics guidelines
Create and enforce policies for effective data management, including techniques for data accuracy and legitimacy
Maintain database performance across the technical environment by identifying and resolving production and application development problems; calculating optimum values for parameters; evaluating, integrating and installing new releases following an established version control and change management methodology; ensuring proper organization, indexing, and optimization for efficient data retrieval and storage; completing maintenance activities; publishing release notes; and addressing user training and information needs.
Design, develop, integrate, and review real-time and bulk data pipelines from a variety of internal and external sources (streaming data, APIs, data warehouses, messages, images, video, etc.)
Perform data transformation and normalization tasks to prepare data for analytics, modelling, and reporting purposes. This also includes the development of data models and structures that facilitate efficient data analysis and retrieval
Implement data quality checks, monitoring, and error handling mechanisms to ensure data accuracy, completeness, and consistency (see the illustrative sketch after this list)
Ensure the IT & Digital team follows established design patterns for data ingest, transformation, and egress
Develop documentation of Data Lineage and Data Dictionaries to create a broad awareness of the enterprise data model and its applications
Apply best practices within DataOps (Version Control, PR-Based Development, Schema Change Control, CI/CD, Deployment Automation, Test Automation, Shift-Left on Security, Loosely Coupled Architectures, Monitoring, Proactive Notifications)
Provide thought leadership in problem solving to enrich possible solutions by constructively challenging paradigms and actively soliciting other opinions. Actively participate in R&D initiatives
Architecture: Utilize modern cloud technologies and employ best practices from DevOps/DataOps to produce enterprise-quality production Python and SQL code with minimal errors. Identify and direct the implementation of code optimization opportunities during code review sessions, and proactively pull in external experts as needed.
Develop interactive dashboards, reports, and visualizations using tools such as Power BI and Python, presenting data in a user-friendly and insightful manner
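By way of illustration of the data quality checks called for above, a minimal check might look like the following Python sketch. The use of pandas, the column names (tonnes_milled, copper_grade_pct) and the value ranges are assumptions made for illustration only, not part of the role specification.

# Minimal, hypothetical data quality check; column names and ranges are assumed.
import pandas as pd

def check_quality(df: pd.DataFrame) -> dict:
    """Return simple completeness, accuracy and consistency indicators."""
    return {
        # Completeness: no missing values in the tonnage column (assumed name)
        "tonnes_complete": bool(df["tonnes_milled"].notna().all()),
        # Accuracy: grades fall inside a plausible physical range (assumed 0-100%)
        "grade_in_range": bool(df["copper_grade_pct"].between(0, 100).all()),
        # Consistency: one reading per shift, i.e. no duplicated keys
        "no_duplicate_shifts": not df.duplicated(subset=["shift_date", "shift"]).any(),
    }

if __name__ == "__main__":
    sample = pd.DataFrame({
        "shift_date": ["2024-01-01", "2024-01-01", "2024-01-02"],
        "shift": ["day", "night", "day"],
        "tonnes_milled": [5200.0, 4980.0, None],
        "copper_grade_pct": [1.2, 1.1, 1.3],
    })
    print(check_quality(sample))  # flags the missing tonnage reading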
QUALIFICATIONS
Bachelor’s degree in engineering, computer science, or an analytical field (Statistics, Mathematics, etc.); a Master’s or PhD will be an added advantage.
EXPERIENCE
Minimum of 5 years’ related experience
Knowledgeable Practitioner of SQL development, with experience designing high-quality production SQL codebases
Knowledgeable Practitioner of Python development, with experience designing high-quality production Python codebases
Knowledgeable in object-oriented programming languages such as C#
Knowledgeable Practitioner in data engineering, software engineering, and ML systems architecture
Knowledgeable Practitioner of data modelling
Experience applying software development best practices in data engineering projects, including Version Control, PR-Based Development, Schema Change Control, CI/CD, Deployment Automation, Test-Driven Development/Test Automation, Shift-Left on Security, Loosely Coupled Architectures, Monitoring, and Proactive Notifications, using Python and SQL
Data science experience in data wrangling, model selection, model training, model validation (e.g., Operational Readiness Evaluator and Model Development and Assessment Framework), and deployment at scale
Working knowledge of Azure Stream Architectures, dbt, Schema Change tools, Data Dictionary tools, the Azure Machine Learning environment, and GIS data
Working knowledge of Software Engineering and Object-Oriented Programming principles
Working knowledge of Distributed Parallel Processing Environments such as Spark or Snowflake
Working knowledge of problem solving/root cause analysis on Production workloads
Working knowledge of Agile, Scrum, and Kanban
Working knowledge of workflow orchestration using tools such as Airflow, Prefect, Dagster, or similar tooling (see the sketch after this list)
Working knowledge of CI/CD and automation tools such as Jenkins or Azure DevOps
Experience with containerization tools such as Docker
Member of ICTAZ or EIZ
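As a hypothetical illustration of the workflow orchestration experience noted above, a minimal Apache Airflow DAG might be sketched as follows; the DAG id, schedule and task bodies are invented for illustration and do not describe any pipeline referenced by this role.

# Hypothetical sketch: a minimal Airflow DAG with a linear extract-transform-load flow.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull source data")               # placeholder extract step

def transform():
    print("normalise and model the data")   # placeholder transform step

def load():
    print("publish to the warehouse")       # placeholder load step

with DAG(
    dag_id="daily_plant_data_pipeline",     # invented name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_transform >> t_load      # extract, then transform, then load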
BEHAVIOURAL TRAITS
Effective communication
Ability to influence managers and employees
Ability to demonstrate leadership
Critical thinking
Conflict management
Problem solving skills
Ability to work in pressured and deadline-driven operating environment
Detail-orientated with the technical aptitude and ability to perform tasks accurately and comprehensively
Expert in multi-tasking, time management and planning of work
Excellent presentation skills
To apply for this job please visit zinstablog.com.