    Big Data Engineer (Spark Developer)

    Location: Kraków
    Type of work: Full-time
    Experience: Mid
    Employment Type: Permanent
    Operating mode: Hybrid
    StoneX Poland

    • We are a member of the Fortune 100 with 4,500 employees.
    • StoneX Group Inc. offers 140+ currencies across 185 countries.
    • StoneX Group Inc. connects with clients in nearly 80 offices across 6 continents.

    Tech stack

    • Spark: advanced
    • Data: regular
    • Hadoop: regular
    • Agile: regular
    • Python: regular
    • Big Data: regular
    • PySpark: regular
    • Scala: regular

    Job description

    Permanent, full-time, hybrid (3 days per week in an office).

    Connecting clients to markets – and talent to opportunity.

    With 4,300 employees and over 400,000 retail and institutional clients from more than 80 offices spread across five continents, we’re a Fortune-100, Nasdaq-listed provider, connecting clients to the global markets – focusing on innovation, human connection, and providing world-class products and services to all types of investors.

    At StoneX, we offer you the opportunity to be part of an institutional-grade financial services network that connects companies, organizations, and investors to the global markets ecosystem. As a team member, you'll benefit from our unique blend of digital platforms, comprehensive clearing and execution services, personalized high-touch support, and deep industry expertise. Elevate your career with us and make a significant impact in the world of global finance.

    Business Segment Overview: Empower individual investors – and yourself – in the world of retail through a range of different financial products rooted in innovation and market intelligence. From FX and CFDs to precious metals, master an exciting world of wealth management tools.


    Responsibilities


    Position Purpose: This role involves designing and developing Databricks applications in pySpark. Our team is rewriting an existing on-prem SQL data warehouse as a Data Lakehouse built on Databricks.

    Primary duties will include:

    • Migrating from the on-prem SQL Data Warehouse to Databricks (a rough sketch of such a step follows this list).
    • Developing pySpark applications and Spark jobs.
    • Maintaining Databricks workspaces, clusters, and jobs.
    • Integrating Databricks applications with various technologies.
    • Keeping the Databricks environment healthy.
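
    The migration work above follows a classic extract-transform-load shape. As a purely illustrative sketch (the JDBC URL, credentials, and table names below are invented for the example, not taken from this posting), one such pySpark step on Databricks might look like this:

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.getOrCreate()  # provided by the Databricks runtime

        # Extract: read a table from the on-prem SQL warehouse over JDBC
        # (hypothetical host, database, and account).
        orders = (
            spark.read.format("jdbc")
            .option("url", "jdbc:sqlserver://onprem-dwh:1433;databaseName=sales")
            .option("dbtable", "dbo.Orders")
            .option("user", "etl_user")
            .option("password", "<secret>")
            .load()
        )

        # Transform: a representative aggregation of the kind a Spark job performs.
        daily_totals = orders.groupBy("order_date").sum("amount")

        # Load: persist the result as a Delta table in the lakehouse.
        daily_totals.write.format("delta").mode("overwrite").saveAsTable("sales.daily_totals")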


    Qualifications


    To land this role you will need:

    • Subject-matter expertise in Spark.
    • Proficiency with Big Data processing technologies (Hadoop, Spark, Databricks).
    • Experience building data pipelines and analysis tools using Python, pySpark, and Scala.
    • Experience creating Scala/Spark jobs for data transformation and aggregation.
    • Experience producing unit tests for Spark transformations and helper methods (see the sketch after this list).
    • Experience designing data processing pipelines.
    • Experience with Hadoop / Databricks (good to have).
    • A passion for learning new technologies, with the ability to pick up new concepts and software quickly.
    • An analytical approach to problem-solving and the ability to use technology to solve business problems.
    • Familiarity with database-centric applications.
    • The ability to communicate effectively in both a technical and non-technical manner, and to interact with all levels of personnel within the organization, including senior management and other departments.
    • The ability to work in a fast-paced environment.
    • Experience working in an agile environment using the SCRUM methodology.
    • A results-oriented team player with strong attention to detail.
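
    For the unit-testing point above, a minimal pytest-style sketch is shown below; it is only an assumption of how such a test could look (the add_total function and its columns are invented), with a small local SparkSession standing in for a Databricks cluster:

        import pytest
        from pyspark.sql import DataFrame, SparkSession
        from pyspark.sql import functions as F

        def add_total(df: DataFrame) -> DataFrame:
            # Pure transformation under test: derive total = quantity * unit_price.
            return df.withColumn("total", F.col("quantity") * F.col("unit_price"))

        @pytest.fixture(scope="session")
        def spark():
            # A small local session is enough to unit-test transformations.
            return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()

        def test_add_total(spark):
            df = spark.createDataFrame([(2, 5.0), (3, 1.5)], ["quantity", "unit_price"])
            totals = {row["total"] for row in add_total(df).collect()}
            assert totals == {10.0, 4.5}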


    What makes you stand out:

    • Relevant experience in the financial services industry; FX or brokerage experience is a plus.
    • Practical and theoretical knowledge of DW concepts (the Kimball approach) is good to have (illustrated after this list).
    • A Spark certification is good to have.
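
    On the Kimball point: a Kimball-style warehouse models data as fact tables joined to conformed dimension tables through surrogate keys. As a tiny, hypothetical illustration (table and column names are invented), a typical dimensional query in pySpark could look like this:

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.getOrCreate()

        # Star schema: a fact table plus a conformed date dimension.
        fact_sales = spark.table("dwh.fact_sales")  # grain: one row per order line
        dim_date = spark.table("dwh.dim_date")      # date dimension keyed by date_key

        # Roll the facts up along a dimension attribute.
        revenue_by_month = (
            fact_sales.join(dim_date, "date_key")
            .groupBy("calendar_month")
            .sum("revenue")
        )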


    Education / Certification Requirements: 

    • Bachelor’s degree or relevant work experience in Computer Science, Mathematics, Data Engineering, or a related technical discipline.


    Working environment:

    • Hybrid (2 days from home, 3 days from the office) at ul. Mogilska 35, Kraków.
