Responsibilities:
• Design, build, and optimize scalable data pipelines and architectures on Google Cloud Platform (GCP)
• Develop and maintain ETL/ELT workflows using tools like Dataflow, BigQuery, and Apache Beam
• Work with streaming data technologies (Pub/Sub, Kafka, Flink) to enable real-time analytics
• Implement data modeling strategies for structured and unstructured data
• Ensure data quality, governance, and security across all pipelines and storage solutions
• Optimize data warehouse performance using BigQuery partitioning, clustering, and query tuning (a brief sketch follows this list)
• Collaborate with data scientists, analysts, and software engineers to enable data-driven decision-making
• Automate deployment and monitoring of data workflows using Terraform, Cloud Composer (Airflow), and CI/CD pipelines
• Architect Level: Define and drive enterprise data strategies, lead cloud migration projects, and guide best practices in data lakehouse architectures
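For context on the BigQuery optimization point above, the sketch below shows one way a day-partitioned, clustered table might be created with the google-cloud-bigquery Python client. The project, dataset, table, and column names are illustrative assumptions, not part of this role's actual stack.

# Illustrative only: create a day-partitioned, clustered BigQuery table.
# Project, dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

schema = [
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("country", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("my-project.analytics.events", schema=schema)

# Partition by event timestamp so queries can prune old data cheaply.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)

# Cluster on common filter columns to further reduce bytes scanned.
table.clustering_fields = ["customer_id", "country"]

client.create_table(table)

Partitioning plus clustering is the usual first lever for both query cost and performance in BigQuery, which is why it appears explicitly among the responsibilities.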
Requirements:
• Senior: 5+ years of experience in data engineering and cloud-based data solutions
• Architect: 8+ years of experience with data architecture, cloud computing, and large-scale data processing
• Strong experience with Google Cloud services (BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub)
• Proficiency in SQL and Python for data processing and automation
• Hands-on experience with Apache Spark, Apache Beam, or similar big data frameworks (a minimal Beam sketch follows this list)
• Familiarity with modern data lake, lakehouse, and warehouse architectures (e.g., Delta Lake, Snowflake)
• Experience with Terraform, Kubernetes, and containerized data workloads
• Strong knowledge of data security, governance, and compliance best practices
• Excellent problem-solving skills and ability to work on complex, high-scale data challenges
• For Architect Level: Proven experience in enterprise data architecture, cloud migrations, and strategic leadership
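As a rough illustration of the Apache Beam experience mentioned above, the sketch below outlines a minimal batch pipeline in Python that reads CSV lines from Cloud Storage and appends parsed rows to BigQuery. The bucket, table, and field names are assumptions made for the example.

# Minimal Apache Beam batch pipeline (illustrative names throughout).
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line):
    # Expect "event_ts,customer_id,amount" per line (hypothetical format).
    event_ts, customer_id, amount = line.split(",")
    return {"event_ts": event_ts, "customer_id": customer_id, "amount": float(amount)}


with beam.Pipeline(options=PipelineOptions()) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")
        | "Parse" >> beam.Map(parse_line)
        | "KeepPositive" >> beam.Filter(lambda row: row["amount"] > 0)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:analytics.transactions",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )

The same pipeline can be executed on Dataflow by supplying the DataflowRunner pipeline option, which is how Beam code typically reaches production on GCP.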
Offer:
• Workplace: 100% remote
• MultiSport Plus
• Group insurance
• Medicover Premium
• e-learning platform