
    Senior Data Engineer

    49 USD/h net per hour - B2B
    Type of work: Full-time
    Experience: Senior
    Employment type: B2B
    Operating mode: Hybrid

    Tech stack

      Polish – C1
      English – B2
      Python – advanced
      SQL – advanced
      Databricks – regular
      Apache Spark – regular
      Azure Data stack – regular
      CI/CD – regular
      Terraform – regular
      Kubernetes – regular
      AKS – regular
      Docker – regular

    Job description

    Project information:

    • Industry: Insurance
    • Rate: up to 185 PLN/h net + VAT, B2B
    • Location: Warsaw – first 2-3 months hybrid, then remote
    • Project language: Polish, English


    About the Role

    As a Senior Data Engineer, you will design and build scalable Data Hubs that support analytical, reporting, operational, and Generative AI use cases. This role involves defining standards, best practices, and architecture patterns, mentoring engineers, and ensuring high-quality data solutions.


    Responsibilities:

    • Data Hub Architecture – Design and implement efficient, scalable Data Hubs integrating multiple data sources.
    • Technical Leadership – Define best practices, patterns, and standards, ensuring consistency across implementations.
    • Mentoring & Strategy – Guide engineers, review code, and provide technical direction.
    • Data Pipelines – Develop batch and real-time data processing workflows.
    • Quality & Monitoring – Implement data validation, anomaly detection, and automated monitoring, taking action when needed.
    • Automation & CI/CD – Enable seamless deployment and infrastructure automation.
    • Security & Compliance – Ensure data governance, security, and regulatory compliance.
    • Documentation – Maintain clear, structured technical documentation.


    Requirements:

    • Python, SQL – Strong data engineering and automation expertise.
    • Databricks & Spark – Deep knowledge of Databricks (primary tool) and Apache Spark.
    • Azure Data Stack – Experience with Data Factory, ADLS, Synapse, Azure SQL, Event Hubs.
    • Real-Time Processing – Expertise in streaming data and real-time analytics.
    • Automation & DevOps – Proficiency in CI/CD, Terraform, Kubernetes/AKS, Docker.
    • Leadership & Mentoring – Ability to set direction, mentor engineers, and drive best practices.
    • Documentation & Communication – Clear, structured technical writing and knowledge sharing.
    • Language Skills – English proficiency, minimum B2 level.


    Technology Stack

    • Primary Tools – Databricks, Apache Spark, Delta Lake.
    • Azure Services – Data Factory, ADLS, Azure SQL, Synapse, Azure DevOps, Event Hubs.
    • Development & Automation – Python, SQL, GitHub, Terraform, Docker, Kubernetes/AKS.
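
    For illustration only, here is a minimal sketch of a batch ingestion step using the primary tools named above (Databricks, Apache Spark via PySpark, Delta Lake). The storage path, table name, and column name below are hypothetical placeholders, not details taken from this offer.

        # Illustrative sketch, not part of the offer: a minimal PySpark + Delta Lake
        # batch step as it might run on a Databricks cluster. The ADLS path, the
        # "policy_id" column, and the "curated.policy_events" table are hypothetical.
        from pyspark.sql import SparkSession, functions as F

        spark = SparkSession.builder.getOrCreate()  # Databricks provides this session

        # Read raw events from an ADLS Gen2 container (placeholder path).
        raw = spark.read.json("abfss://raw@examplestorage.dfs.core.windows.net/policy_events/")

        # Simple data-quality gate: drop rows without a key and fail if too many are lost.
        clean = raw.filter(F.col("policy_id").isNotNull())
        dropped_ratio = 1.0 - clean.count() / max(raw.count(), 1)
        if dropped_ratio > 0.05:
            raise ValueError(f"{dropped_ratio:.1%} of rows are missing policy_id")

        # Persist the curated layer as a Delta table for reporting and analytics consumers.
        clean.write.format("delta").mode("overwrite").saveAsTable("curated.policy_events")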
