Hello! It's nice to have you here! Let us introduce ourselves 😊
At HiQo we focus on developing entire IoT systems. We are a US-based company headquartered in Atlanta, with delivery centers in Europe.
We design devices, write software that brings them to life, and create applications that connect them to end users. We solve technological challenges from all over the world, often by creating something from scratch. And at the end of the day, we get that special sense of tangible achievement you don’t always find elsewhere.
We are seeking an ETL Developer to design, build, and optimize data pipelines for efficient data processing and integration. This role focuses on ensuring data reliability, scalability and performance while maintaining high-quality standards. If you enjoy working with data and solving complex challenges, this opportunity is for you!
‼️ We don't yet know the exact requirements for your first project, but we are looking for a versatile person for long-term collaboration on various projects. So, even if you don’t meet all the requirements below, apply and let’s talk! ‼️
Requirements:
- Hands-on experience with ETL development and data integration for large-scale systems.
- Expertise in ETL/ELT tools (e.g., Informatica, Talend, SSIS, AWS Glue, Azure Data Factory).
- Advanced SQL skills: query optimization, complex joins, window functions, and execution plan analysis.
- Programming experience in Python for data manipulation and workflow automation; familiarity with PySpark or Scala is a plus.
- In-depth knowledge of data modeling methodologies (Kimball, Inmon) and schema design (star schema, snowflake schema).
- Familiarity with real-time data processing tools like Apache Kafka, Apache Flink, or Spark Streaming.
- Proficiency in cloud platforms (AWS, Azure, or GCP) and their data services (e.g., Redshift, Glue, Synapse).
- Familiarity with data quality frameworks (e.g., Great Expectations) and monitoring tools (e.g., Datadog, Prometheus).
- Strong analytical and problem-solving skills to design scalable, fault-tolerant ETL workflows.
- Very good English proficiency (minimum B2; daily communication with the team and clients).
Key Responsibilities:
- Design, develop, and optimize complex ETL/ELT pipelines for large-scale data integration and processing.
- Implement real-time and batch data workflows using AWS Glue, Informatica, Azure Data Factory, or Talend.
- Architect and maintain data warehouses and lakehouses using platforms such as Snowflake, Redshift, or BigQuery.
- Implement Change Data Capture (CDC) strategies using tools like Debezium or AWS DMS to ensure real-time data synchronization.
- Ensure data quality and reliability through data validation, error handling, and idempotent workflows.
- Troubleshoot and resolve complex issues in ETL pipelines, optimizing performance and scalability.
- Leverage workflow orchestration tools like Airflow, Prefect, or Dagster to automate and monitor data processes.
How can we interest you?
- It’s a unique opportunity to join our offices in Krakow or Wroclaw, or to work 100% remotely; it's up to you!
- We work only with international customers, and our team is multicultural, a great way to practice and improve your English.
- We have a simple structure and no bureaucracy.
Other benefits:
- Possibility to choose the type of contract.
- Flexible working hours, agreed on within your team.
- Necessary tools and equipment.
- Mentorship programs.
- Full-time English teachers.
- Medical insurance (for permanent employees).
- HIQO COINS: a system that rewards loyal employees for extracurricular activities.
Our credo is “Do The Right Thing”. What’s yours? 😊