About the Company:
Interswitch is an Africa-focused integrated digital payments and commerce company that facilitates the electronic circulation of money as well as the exchange of value between individuals and organisations on a timely and consistent basis. We started operations in 2002 as a transaction switching and electronic payments processing company, and have progressively evolved into an integrated payment services company, building and managing payment infrastructure as well as delivering innovative payment products and transactional services throughout the African continent. At Interswitch, we offer unique career opportunities for individuals capable of playing key roles and adding value in an innovative and fun environment.
Job Description:
We are seeking a visionary Lead Data Engineer to architect and manage the data backbone of our MVNO operations. This is a mission-critical role where you will design a secure, reliable, and governable Data Lakehouse capable of processing massive volumes of BSS/OSS data and Call Detail Records (CDRs). You will bridge the gap between raw data ingestion and advanced analytics, ensuring our data strategy is compliant with NCC regulations while enabling real-time decision-making. If you are an expert in streaming architectures and have a proven track record of scaling data infrastructure in high-intensity industries like Telecom or Fintech, we want you to lead our data evolution.
Requirements:
1. Lakehouse Architecture & Pipeline Engineering
End-to-End Ingestion: Architect and maintain high-performance batch and real-time pipelines using Kafka, Flink, and Airflow to ingest data from BSS/OSS, CRM, and CDR sources (see the ingestion sketch after this list).
Format Mastery: Manage and optimize modern table formats (Delta Lake, Iceberg, or Hudi) to ensure atomicity and consistency across the lakehouse.
Productionization: Partner with Data Scientists to transform raw data into production-ready features and analytical models.
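For illustration, here is a minimal sketch of the kind of micro-batched CDR ingestion described in the "End-to-End Ingestion" item, pairing a Kafka consumer with an append to a Delta Lake table. The topic name, brokers, batch size, and table path are hypothetical placeholders, not a description of Interswitch's actual pipeline:

```python
# Minimal ingestion sketch (assumed names throughout): micro-batch CDR
# events from a Kafka topic into a Delta Lake table.
import json

import pyarrow as pa
from kafka import KafkaConsumer          # pip install kafka-python
from deltalake import write_deltalake    # pip install deltalake

consumer = KafkaConsumer(
    "cdr.raw",                                   # hypothetical topic name
    bootstrap_servers="localhost:9092",          # placeholder brokers
    value_deserializer=lambda v: json.loads(v),  # assumes JSON-encoded CDRs
    auto_offset_reset="earliest",
)

BATCH_SIZE = 10_000  # illustrative; tune to your latency/throughput SLA
buffer = []
for message in consumer:
    buffer.append(message.value)
    if len(buffer) >= BATCH_SIZE:
        # Each append is one atomic commit in the Delta transaction log,
        # which is what gives readers a consistent view of the table.
        write_deltalake("./lake/bronze/cdr",     # object storage in production
                        pa.Table.from_pylist(buffer), mode="append")
        buffer.clear()
```

In practice this consume-batch-commit loop would run under Airflow or Flink with checkpointing and retries; the point here is only the shape of the micro-batch boundary and the atomic table append.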
2. Regulatory Compliance & Metadata Governance
NCC Data Sovereignty: Implement robust schema validation and quality frameworks (e.g., Great Expectations) to ensure compliance with strict telecom data regulations.
Privacy-First Monitoring: Guarantee data fitness and security by monitoring pipeline health purely through metadata (latency, row counts, job status), ensuring sensitive data remains untouched during the monitoring process (see the metadata-only sketch after this list).
SLA Management: Develop and surface SLA dashboards to maintain high availability and performance across all data platforms.
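As a concrete illustration of the "Privacy-First Monitoring" item, the sketch below judges pipeline health from run statistics alone, so no subscriber-identifying CDR fields are ever read. The RunStats shape, thresholds, and job names are illustrative assumptions, not a prescribed interface:

```python
# Metadata-only monitoring sketch: SLA checks derived purely from
# run metadata (status, row counts, latency), never from record payloads.
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class RunStats:
    job_name: str
    status: str            # e.g. "success" or "failed"
    rows_written: int
    started_at: datetime
    finished_at: datetime


def check_run(stats: RunStats,
              min_rows: int = 1_000,                        # assumed floor
              max_latency: timedelta = timedelta(minutes=15)) -> list[str]:
    """Return SLA violations computed from metadata only."""
    violations = []
    if stats.status != "success":
        violations.append(f"{stats.job_name}: job failed")
    if stats.rows_written < min_rows:
        violations.append(f"{stats.job_name}: rows {stats.rows_written} below {min_rows}")
    if stats.finished_at - stats.started_at > max_latency:
        violations.append(f"{stats.job_name}: latency above {max_latency}")
    return violations
```

Violations like these are exactly what the SLA dashboards mentioned in the next item would surface, without the monitor ever holding sensitive data.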
3. Infrastructure, DevOps & Cost Control
GitOps & Automation: Automate the CI/CD deployment of complex data pipelines using Terraform, Docker, and Kubernetes.
Storage Optimization: Manage object storage and warehouse schemas, optimizing partitioning and clustering to reduce latency and cloud costs (see the partitioning sketch after this list).
Access Governance: Own the security architecture, implementing granular access control policies and cost-governance frameworks across the data ecosystem.
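To make the "Storage Optimization" point concrete, here is a minimal sketch of a date-partitioned Delta Lake write; the path, schema, and partition column are hypothetical:

```python
# Partitioning sketch: writing CDRs partitioned by call date so that
# date-bounded queries prune non-matching files.
import pyarrow as pa
from deltalake import write_deltalake    # pip install deltalake

table = pa.table({
    "call_id": ["a1", "b2"],
    "event_date": ["2024-05-01", "2024-05-02"],  # partition column
    "duration_s": [42, 180],
})

# One directory per event_date value; a query filtered on event_date
# scans only the matching partition instead of the whole table.
write_deltalake("./lake/silver/cdr", table,
                partition_by=["event_date"], mode="append")
```

Most of the latency and object-storage cost savings come from this pruning: the query engine skips whole partitions rather than scanning and discarding rows.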
Qualifications and Skills:
Technical Toolkit
Streaming & Orchestration: Hands-on expertise with Kafka, Flink, Pulsar, and Airflow/Prefect.
Languages & Formats: Expert-level Python/Scala and SQL; proficiency in Delta Lake, Iceberg, or Hudi.
Cloud & Warehousing: Advanced experience with Snowflake, BigQuery, or Redshift and cloud object storage.
DevOps: Strong grasp of CI/CD pipelines, Terraform, and Kubernetes.
Professional Background
Education: Bachelor’s or Master’s Degree in Computer Science or Software Engineering.
Experience: Minimum of 7 years in data engineering, specifically within Telecom, Fintech, or other high-data-intensity sectors.
Leadership: Proven ability to mentor junior engineers and lead complex technical projects from design through to deployment.
Salary
Very attractive
Application Closing Date: Not specified
Job Information
Deadline: Not specified
Job Type: Full-time
Industry: Engineering
Work Level: Experienced
State: Lagos
Country: Nigeria