Data Engineer at HRD Solutions
Confidential
Posted 1 month ago · Expires in 3 weeks
Job Description
Job Summary
Design, build, and maintain robust, scalable, high-performance ETL/ELT data pipelines that power reporting, business intelligence, and machine learning initiatives. This role ensures the quality, lineage, and governance of all critical data assets.
Responsibilities
- Build and optimize data pipelines using tools like Airflow/Prefect to ingest data from core banking, payment, and third-party sources
- Design and implement dimensional and denormalized data models within the Data Warehouse (e.g., Postgres/Oracle/BigQuery)
- Use streaming technologies such as Kafka and transformation tools such as dbt to process data in real time or near real time
- Implement data quality checks and maintain data lineage documentation for governance
- Leverage Python and SQL extensively for scripting, data manipulation, and pipeline development
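As a flavor of the Python scripting this role involves, a data quality check of the kind mentioned above might be sketched as follows. This is a minimal illustrative sketch only; the column names and rule are hypothetical, not taken from the posting.

```python
from typing import Iterable

def check_non_null(rows: Iterable[dict], column: str) -> list[int]:
    """Return indices of rows where `column` is missing or None.

    A pipeline would typically fail or alert when this list is non-empty.
    """
    return [i for i, row in enumerate(rows) if row.get(column) is None]

# Hypothetical transaction records for illustration.
rows = [
    {"txn_id": "t1", "amount": 120.0},
    {"txn_id": "t2", "amount": None},  # violates the non-null rule
]

failures = check_non_null(rows, "amount")
```

In practice such checks are usually expressed declaratively (for example as dbt tests) rather than hand-rolled, but the underlying assertion is the same.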
Key Performance Indicators
| KPI Area | Measure | Target |
|---|---|---|
| Data Pipeline Reliability | Pipeline SLA Adherence (Uptime) & Percentage of automated tests (e.g., dbt tests) | >99.9% Uptime, 90% of Critical Data Models Tested |
| Data Freshness & Delivery | Mean Latency for critical reports/data sets | — |
| Efficiency | Data Warehouse query run time (p95) | — |