Join Us and Build a Cutting-Edge Data Platform!
As a Data Platform Engineer, you will work for our client, a leader in the finance sector, developing a cloud-based data platform that enhances their data processing and analytics capabilities. This greenfield project focuses on building an ELT pipeline with Azure Databricks or BigQuery on GCP, leveraging native cloud services for data ingestion, transformation, and orchestration. You will play a key role in designing and optimizing the data infrastructure, ensuring scalability, performance, and seamless integration with business processes.
Your main responsibilities:
- Design and implement scalable ELT pipelines on Azure Databricks or BigQuery
- Develop and maintain cloud-native data ingestion, transformation, and orchestration workflows
- Optimize database structures and data models for performance and scalability
- Ensure data quality, governance, and compliance with industry standards
- Collaborate with data scientists and analysts to enable advanced analytics
- Automate deployment and monitoring of data pipelines
- Troubleshoot and resolve data infrastructure issues
- Implement security best practices for data processing and storage
- Work in an Agile environment, contributing to continuous improvement
- Document processes, architectures, and best practices
You're ideal for this role if you have:
- Experience with cloud-based data platforms (Azure Databricks or BigQuery)
- Strong knowledge of ELT/ETL processes and data pipeline development
- Hands-on experience with data ingestion, transformation, and orchestration tools
- Proficiency in SQL and data modeling techniques
- Familiarity with programming languages such as Python or Scala
- Understanding of data governance, security, and compliance requirements
- Experience with infrastructure as code (Terraform, ARM templates, or similar)
- Ability to troubleshoot and optimize large-scale data workflows
- Strong problem-solving skills and the ability to work in a cross-functional team
- Good English, both written and spoken