Senior Analytics Engineer


Remote job description

You must have at least five years of data or software experience and live within a UTC -1 to +3 time zone to be considered for this new and exciting Senior Analytics Engineer opportunity. This role would suit a senior software engineer, data engineer, or technical data scientist.

Role Summary

M-KOPA serves over a million customers across multiple (African) geographies and is growing fast. From lights to phones, we are powering progress through internet-connected devices. The Analytics Engineering team's mission is to increase the value our Data Science, BI and Engineering teams get from the data that we have. We want you to join the team and:

  • Increase the velocity of our transition to a modern data stack (dbt, Airflow, Python, Spark, Synapse, Kubernetes, Docker, + ?)
  • Influence ongoing data stack choice collaboratively
  • Implement the chosen tools, building first versions with effective default patterns in place, so that other teams are empowered to work with each tool or process too
  • Abstract logic into libraries and patterns of work that enable teams to build value from our data independently
  • Demonstrate best practices across the data stack, and pursue the highest-leverage opportunities to improve data processes, whether at data load or through data model redesigns
  • Build systems that enable us and other teams to deliver end-to-end value more easily

Specific Responsibilities

Ultimately, the responsibility of the Senior Analytics Engineer is to be an enabler: to build systems and processes that provide reliable, easy access to our data and let the wider data team work more efficiently.

Below are some examples of how this can be achieved:

  • Design and collaborate on efficient data ingestion to our data warehouse from data sources including published events, data lakes, databases and API integrations.
  • Develop automation frameworks for data pipelines and analytics processes.
  • Set high standards of work, including security, infra as code, documentation, etc.
  • Integrate and maintain the infrastructure used in data analytics workstreams (data lakes, data warehouses, automation frameworks).
  • Contribute to the design and implementation of a clear, concise data model.
  • Contribute to the efficient storage of our data in the data warehouse, identifying performance improvements from table and query redesign.
  • Write quality ELT code with an eye towards performance and maintainability and empower other analysts to use and contribute to ELT frameworks and tools.
  • Improve the Data Team's overall workflow through knowledge sharing, proper documentation, and code review.
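As a flavour of what "abstracting logic into libraries" can look like in practice, here is a minimal, hypothetical sketch (not M-KOPA's actual tooling; the names and SQL dialect are assumptions): a shared helper that renders an idempotent MERGE (upsert) statement, so analysts do not hand-write the same warehouse boilerplate in every pipeline.

```python
def render_merge_sql(target: str, source: str,
                     key_cols: list[str], value_cols: list[str]) -> str:
    """Render an idempotent MERGE (upsert) statement.

    Hypothetical library helper illustrating how repeated ELT boilerplate
    can be abstracted into shared code other teams reuse.
    """
    # Match target rows to source rows on the business key(s).
    on_clause = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    # Overwrite non-key columns when a matching row already exists.
    update_set = ", ".join(f"t.{c} = s.{c}" for c in value_cols)
    # Insert the full row when no match exists.
    all_cols = key_cols + value_cols
    insert_cols = ", ".join(all_cols)
    insert_vals = ", ".join(f"s.{c}" for c in all_cols)
    return (
        f"MERGE INTO {target} AS t\n"
        f"USING {source} AS s\n"
        f"ON {on_clause}\n"
        f"WHEN MATCHED THEN UPDATE SET {update_set}\n"
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) "
        f"VALUES ({insert_vals});"
    )

print(render_merge_sql("dim_customer", "stg_customer",
                       ["customer_id"], ["name", "phone"]))
```

Wrapping patterns like this in a tested library is one way a Senior Analytics Engineer turns one-off pipeline code into defaults the whole data team can build on.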

Knowledge / Skills:

The list below describes the key knowledge and skills we ideally require at a senior level in this role; however, there is some flexibility, and there are opportunities to develop skillsets.


  • You enjoy abstracting, generalizing, and creating efficient, scalable solutions.
  • You like creating patterns and processes, as well as solving presented problems.
  • Strong foundation in software development best practices (collaborative development via git, testing, continuous integration, deployment pipelines, and infrastructure as code).
  • Strong SQL skills.
  • Experience with Python.
  • Experience deploying code to production via automated deployments on the cloud.
  • Experience working with cloud platforms (Azure, AWS, etc.).

Additional assets:

  • Experience building ingestion and/or reporting from streaming data sets and event architectures.
  • Experience with distributed compute tools such as Spark and Databricks.
  • Experience with dbt, or a good basis from which to learn it.
  • Experience with orchestration tools, such as Airflow.
  • Familiarity with using analytics or working with analytics teams.
  • Experience with Kubernetes, not expected but a plus.
  • Experience with data visualization tools such as PowerBI, Looker or Tableau, not expected but a plus.

In this remote role, we can offer a full-time permanent contract of employment to candidates resident in the UK, Kenya, Uganda or Nigeria, where our offices are based, together with a competitive salary (dependent on skills and experience), company bonus (dependent on company performance and OKRs), and a range of company benefits. For applicants who are resident in countries outside of where our offices are based and within the required time zone (UTC -1 to +3), we can offer a three-year consultancy contract, together with a freelance daily rate and an annual bonus incentive.

When COVID travel restrictions allow, you may be required on an occasional basis to travel as part of your role.

M-KOPA is an equal opportunity and affirmative action employer committed to assembling a diverse, broadly trained staff. Women, minorities and people with disabilities are strongly encouraged to apply. In compliance with applicable laws and in furtherance of its commitment to fostering an environment that welcomes and embraces diversity, M-KOPA does not discriminate on the basis of race, colour, creed, religion, national origin, sex (including pregnancy and parenting status), disability, age, sexual orientation, gender identity or expression, marital status or genetic information in its programs or activities, including employment.

Company: M-KOPA
Job title: Senior Analytics Engineer at M-KOPA (Nairobi, Kenya) (allows remote)
Job tags: python, dbt, spark, airflow, docker