Data Engineer (m/f/d)
Engineering
Remote
Full-time
You work closely with product managers and customers to design data-driven solutions that meet concrete business requirements and strategic goals.
You analyze existing processes, identify opportunities for optimization and develop technical designs that are both functional and scalable.
You focus on balancing technical feasibility, user needs and business impact, from the first idea through to productive deployment.
Your Tasks
- Build, maintain and evolve scalable data pipelines in a hybrid cloud infrastructure (Azure and GCP)
- Develop and optimize ETL/ELT processes with technologies such as Databricks, PySpark, SQL and BigQuery
- Integrate data from a variety of sources (cloud, APIs, internal systems)
- Implement data governance, monitoring and automated tests along the data value chain
- Use Infrastructure as Code (e.g. Terraform, OpenTofu, Ansible) to provision and manage cloud resources
Requirements
- Completed degree in Computer Science, Business Informatics, Mathematics or a comparable qualification
- At least 3 years of experience in Data Engineering in cloud-based environments (especially Azure and GCP)
- In-depth knowledge of Apache Spark, GCP components and Azure Data Services
- Experience with CI/CD, Git, Docker and Kubernetes is a plus
- Very good German and English skills
- Independent, structured and results-oriented way of working
Benefits
- Work on innovative data products with a modern tech stack
- Lots of creative freedom and independent work
- Flexible working hours & remote-first work
- Home office equipment: we provide a standard setup for your remote working environment
- Training and development opportunities in cloud and data
Tech Stack
Python, SQL, Apache Spark, Databricks, PySpark, Azure, GCP, BigQuery, Terraform, Docker, Kubernetes, CI/CD