About Wasoko:
Wasoko (“people of the market” in Swahili) is transforming communities across Africa by revolutionizing access to essential goods and services. We are East Africa’s largest digitized retail distribution platform, powered by our own in-house logistics network. Tens of thousands of informal retailers across five countries (Kenya, Tanzania, Rwanda, Uganda, and Zambia) use Wasoko to order everyday essential goods and receive working capital financing.
Africa’s informal retailers are today the primary, if not the only, channel through which consumers purchase essential goods worth over $600 billion per year. Fragmented infrastructure across a large land mass, a widely distributed and diverse population, and relatively small basket sizes do not yet lend themselves to big-basket retail or mass consumer e-commerce. Instead, the informal retail ecosystem is the channel for building the plumbing of digital and consumer commerce across Africa.
This is Wasoko’s opportunity. Wasoko with its brand, scale and logistics network is best positioned to build the technology-leveraged rails to serve the 1+ billion African consumers through informal retailers. We are building a digital-first operating system for informal retailers, focusing initially on B2B distribution but quickly incorporating a host of other tools and services to help communities across Africa get more for less.
Summary:
We are seeking a skilled and motivated Data Engineering Manager to join our Business Intelligence team. As the Data Engineering Manager, you will play a crucial role in driving data-driven decision-making and providing valuable insights to optimize our business operations. You will lead the analytics engineering department and collaborate closely with cross-functional teams to deliver high-quality analytics solutions.
Duties & Responsibilities:
- Collaborate with stakeholders to understand business requirements and translate them into actionable analytics projects and initiatives.
- Lead and mentor a team of two talented data engineers.
- Identify opportunities for process improvement and automation within the analytics function to increase efficiency and scalability.
- Ensure data accuracy, integrity, and security by implementing best practices and data governance policies.
- Further develop our Data Warehouse by building analytical data models and data marts.
- Build robust data pipelines using ETL/ELT tools such as Dataform, Talend, or dbt.
- Develop code that adheres to our standards for style, maintainability, and best practices.
- Perform performance tuning and optimization to enhance the efficiency of analytics processes.
- Ensure data security and compliance measures are implemented effectively.
- Create and maintain architecture and systems documentation.
- Maintain a data catalog to support self-service analytics and a single source of truth.
Requirements:
- Identify with our mission to revolutionize access to essential goods and services for communities across Africa, and want to make a meaningful contribution.
- Excitement about the challenges of an e-commerce business, including forecasting demand, optimizing delivery schedules and routing, managing inventory, and increasing customer lifetime value.
- Possess a positive, can-do attitude.
- Have 8+ years of experience building data marts and analytical data models, preferably in an agile and fast-growing start-up environment.
- Have 8+ years of experience working with version control systems.
- Have 8+ years of experience working with ETL/ELT tools such as Dataform, Talend, GCP Composer, GCP Datastream, GCP Dataflow, and dbt.
- Have 3+ years of experience leading data engineering and data science teams.
- Hands-on experience with cloud data warehouses such as BigQuery, Amazon Redshift, or Snowflake.
- Strong attention to detail and the ability to work in a fast-paced, deadline-driven environment.
- Familiarity with DevOps and automation practices, including containerization (e.g., Docker) and orchestration (e.g., Kubernetes) for managing containerized applications.
- Knowledge of data security and compliance measures.
- Working knowledge of cloud infrastructure and networking.
- Familiarity with streaming technologies like Apache Kafka or Google Cloud Datastream.
- Experience in performance tuning and optimization.
- Python skills and experience with Apache Airflow are highly desirable.
- Experience with data visualization tools such as Looker, Tableau, or Power BI is a plus.
- Knowledge of B2B ecommerce or a similar industry is a plus.