#1 Job Board for tech industry in Europe

    Data Engineer - Allegro Pay

    Warszawa
    3 774 - 5 233 USD/month (gross) - Permanent
    Type of work: Full-time
    Experience: Mid
    Employment Type: Permanent
    Operating mode: Hybrid
    Allegro

    At Allegro, we build and maintain some of the most distributed and scalable applications in Central Europe. Work with us on e-commerce solutions to be used (and loved) by your friends, family and millions of our customers.


    Tech stack

      Polish: B2
      English: B2
      MS SQL Server: regular
      Spark: regular
      BigQuery: regular
      Snowflake: regular
      SQL: regular
      Oracle: regular
      Python: regular
      Big Data: regular
      DevOps: nice to have
      CI/CD: nice to have

    Job description

    The salary range for this position is (contract of employment):

    Mid role: 14 200 - 19 690 PLN in gross terms

     

    A hybrid work model requires 1 day a week in the office.

    Allegro Pay is the largest fintech in Central Europe – we are growing fast and need engineers who want to learn and develop while solving problems related to serving thousands of requests per second (RPS). If, like us, you enjoy flexing your mental muscles on complex problems and would be happy to co-create the infrastructure that underpins our solutions, make sure you apply!

    In this role, you will be a contributor, helping us expand our modern cloud-based analytical solutions. We embrace challenging and interesting projects and take quality very seriously. Depending on your preference, your position may be more business-oriented or platform-oriented.


    We are looking for people who:

    • Have 2+ years of experience in building data-driven solutions using Python
    • Have practical knowledge in creating efficient data processing applications 
    • Simply enjoy seeing data processed efficiently and feel satisfied when they spin up a lot of cores to quickly process terabytes of data
    • Can optimize SQL queries in traditional engines (SQL Server, Oracle), Big Data (Spark) or cloud engines (BigQuery, Snowflake)
    • Have experience working with large data sets and understand database algorithms and data structures (e.g. they know the difference between a merge join and a hash join)
    • Can independently make decisions in the areas entrusted to them and take responsibility for the code they create
    • Are not afraid of new technologies and want to expand their range of skills
    • Know how to build and deploy containerized applications on the cloud
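
    The merge join vs. hash join distinction mentioned above can be illustrated with a toy Python sketch (illustrative only, not how any specific engine implements it): a hash join builds a hash table over one input and probes it with the other, while a merge join walks two key-sorted inputs in lockstep.

```python
# Toy equi-join algorithms over lists of (key, value) pairs.
# Output rows are (key, left_value, right_value) tuples.

def hash_join(left, right):
    # Build a hash table over one side, then probe it with the other.
    table = {}
    for key, val in left:
        table.setdefault(key, []).append(val)
    return [(key, lv, rv) for key, rv in right for lv in table.get(key, [])]

def merge_join(left, right):
    # Sort both inputs by key, then advance two cursors in lockstep.
    left, right = sorted(left), sorted(right)
    out, i = [], 0
    for lk, lv in left:
        while i < len(right) and right[i][0] < lk:
            i += 1
        j = i
        while j < len(right) and right[j][0] == lk:
            out.append((lk, lv, right[j][1]))
            j += 1
    return out

# Hypothetical sample data (not from the job posting).
orders = [(1, "order-a"), (2, "order-b"), (2, "order-c")]
users = [(1, "alice"), (2, "bob"), (3, "carol")]
```

    Both algorithms produce the same rows; the trade-off is that a hash join needs memory for the build side, while a merge join needs sorted inputs.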


    Nice to have:

    • The ideal candidate has DevOps experience: knows how to set up CI/CD pipelines and has worked with IaC tools such as Terraform or Pulumi to deploy and maintain cloud infrastructure
    • Experience in programming in statically typed languages (Java, Scala, C#) will be an advantage


    What will your responsibilities be?

    • Design, monitor and improve data flow processes implemented in Python, SQL, Airflow and Snowpark
    • Implement and maintain Data Mesh processes collecting data from many microservices and cloud sources
    • Work with various data formats and sources, utilizing novel storage solutions 
    • Optimize the costs associated with the cloud operations in Snowflake, Azure Cloud and GCP
    • Work with the latest technologies, such as Snowflake, Airflow, dbt, .NET, Azure, GCP and GitHub Actions
    • Play an active role in decision-making processes regarding the selection and implementation of data frameworks
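
    The orchestration style behind these responsibilities (data flows wired together as dependent tasks, as in Airflow) can be sketched with a minimal DAG runner in plain Python. The task names and logic below are hypothetical, not Allegro's actual pipeline:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical tasks; each receives the results of tasks that ran before it.
def extract(results):
    return [1, 2, 3]                                   # pretend: pull rows from a source

def transform(results):
    return [r * 10 for r in results["extract"]]        # pretend: enrich the rows

def load(results):
    return f"loaded {len(results['transform'])} rows"  # pretend: write to a warehouse

TASKS = {"extract": extract, "transform": transform, "load": load}
# Each task maps to the set of tasks it depends on (Airflow-style DAG edges).
DEPS = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}

def run(deps, tasks):
    # Execute tasks in dependency order, accumulating results as we go.
    results = {}
    for name in TopologicalSorter(deps).static_order():
        results[name] = tasks[name](results)
    return results
```

    Real schedulers such as Airflow add scheduling, retries and backfills on top of this core idea: a topologically ordered graph of tasks passing data downstream.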


    What we offer

    • We have well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)
    • A wide selection of fringe benefits in a cafeteria plan – you choose what you like (e.g. medical, sports or lunch packages, insurance, purchase vouchers)
    • A 16" or 14" MacBook Pro with an M1 processor and 32GB RAM, or a corresponding Dell with Windows (if you don’t like Macs), and other gadgets that you may need
    • Hackathons, team tourism, training budget and an internal educational platform, MindUp (including training courses on work organization, means of communications, motivation to work and various technologies and subject-matter issues)
    • English classes, paid for by us, related to the specific nature of your job


    Why would you like to work with us:

    • You will work with an experienced team that carries out complex and demanding projects related to real-time processing of data produced by back-end and front-end systems. We design our data processes with software engineering rigour.
    • You will work on projects in the area of finance, where the scale, sophistication of the algorithms, business impact and technical requirements will be a key challenge
    • You will directly influence data processes that change in real time how millions of users use Allegro
    • Our employees regularly attend and present at conferences in Poland and abroad (Europe and the USA)


    Apply to Allegro and see why it is #dobrzetubyć (#goodtobehere)

