Join the AI & Data Platform team to drive operational excellence for cutting-edge applied AI challenges in retail. We are seeking a skilled DataOps Engineer to design, build, and maintain the robust data infrastructure essential for our AI and automation initiatives. This critical role blends deep cloud expertise with an unwavering commitment to operational rigor, applying software engineering principles (DevOps, CI/CD) directly to the data ecosystem.

We offer a hybrid working set-up in either Hirschau, Munich, (Berlin) or Kraków.

Your contribution to success

- Guarantee data reliability: Own the DataOps and CI/CD implementation to ensure all production data pipelines are reliable, traceable, and scalable for AI initiatives.
- Optimize data velocity: Architect high-performance Airflow and platform solutions that dramatically accelerate data delivery and time-to-value for automation projects.
- Establish data trust: Drive quality and governance standards using dbt, transforming raw data into trusted, documented assets critical for robust ML applications.
- Orchestration & reliability: Design, implement, and rigorously monitor production ETL/ELT pipelines using Apache Airflow (DAGs, custom operators). Implement advanced monitoring, logging, and error recovery to guarantee pipeline stability and data freshness for the AI platform.
- DataOps & CI/CD: Apply full DataOps and DevOps principles (CI/CD, version control, automated testing) across the entire data development lifecycle. Enforce rigorous code-quality standards for all transformation logic (SQL/Python).
- Infrastructure & performance: Manage and scale core data components (data warehouse, Airflow) using Infrastructure as Code (IaC). Proactively optimize AI & Data platform performance (query tuning, partitioning) to support low-latency AI inference, and resolve critical production issues.
- Documentation: Maintain clear and comprehensive documentation for data pipelines and architecture.

Your success factors

- 3+ years in a DataOps or closely related data engineering role.
- Expertise in Apache Airflow (complex DAGs, custom operators, production management).
- Deep hands-on experience with dbt (model development, data quality testing, package management).
- Expert-level SQL and deep experience with public cloud data services (GCP preferred), e.g. BigQuery, Snowflake, Redshift.
- Strong Python skills for ETL/Airflow development.
- Experience with Infrastructure as Code (IaC) tools such as Terraform.
- Practical knowledge of software engineering principles (testing, CI/CD).
- Excellent collaboration and analytical skills.
- Fluent English is necessary; German is a plus.

Your benefits

We Are a Great Place to Work
Our tolerant and empathetic work environment enables you to help shape our company’s future, giving you plenty of room for personal development. Our unique company culture has been officially recognised by the Great Place to Work institute.

We Improve Your Work-Life Balance
Flexitime and hybrid working arrangements as well as a special leave allowance enable you to organise your workday around your needs. And we care about your children: our head office in Hirschau operates a child day-care centre that looks after your little ones.

We Communicate at a Peer Level
We are a family-run business, which means friendly interaction is paramount to us. And when it comes to being a trust-based horizontal organisation, around here we practise what we preach.

We Promote Health
That is why we offer a variety of services and activities centred around health and fitness. Regular medical check-ups are top of the agenda.
Moreover, for all those who like to keep fit, the Conrad Sports Club at our Hirschau office provides a range of activities, coaching sessions and even an in-house tennis court. Being healthy isn’t just about exercising; it’s also about a balanced diet. This is why the food served in our cafeterias in Hirschau and Wernberg uses local produce and is freshly prepared.

Your Contact Person

Stefan Graf
Team Lead HR Recruiting & Employer Branding
+49 1755710155