We are looking for an enthusiastic Middle Data Engineer ready to work on projects from scratch and help build the infrastructure for a large-scale solution.
The company, headquartered in Germany, is one of Europe’s leading logistics providers. Its expertise lies particularly in the development and implementation of integrated supply chain systems and in the use of data to optimize logistics processes.
- At least 3 years of experience in data engineering and with cloud-based data and analytics services; Azure is preferred
- Conceptual knowledge of data analytics fundamentals such as dimensional modeling, ETL, reporting tools, data governance, data warehousing, and structured and unstructured data
- Knowledge of SQL and experience with Python
- Experience working in cross-functional teams
- Experience in database development and data modeling, ideally with Databricks/Spark
- Excellent communication and interpersonal skills
- Strong problem-solving and decision-making skills, with a focus on delivering results and meeting deadlines
- Upper-Intermediate level of English
- Subject matter expertise in the logistics domain
- Implement architectures based on the Azure cloud platform (Data Factory, Databricks, etc.)
- Design, develop, optimize and maintain team-specific data architecture and pipelines that adhere to defined ETL and Data Lake principles
- Work closely with the client and other stakeholders to clarify technical requirements and expectations
- Discover, understand, and organize disparate data sources and structure them into clean data models with clear, understandable schemas
- Contribute to the evaluation of new tools for analytical data engineering on the project