We are looking for an experienced Platform Engineer with strong expertise in Databricks and the Microsoft Azure ecosystem. In this permanent role, you will build and maintain a scalable Databricks platform for Reporting & Analytics, focusing on CI/CD pipelines, Lakeflow processes, and data integrations. You will collaborate closely with data engineers, BI analysts, and architects to ensure a secure, standardized, and high-performing data platform. This role requires 2–3 days per week on-site.

Key Responsibilities
- Develop metadata-driven CI/CD pipelines in Databricks
- Implement Lakeflow and the medallion architecture
- Manage Azure components (Key Vault, ADLS Gen2, AAD)
- Integrate data via APIs
- Deploy using Databricks Asset Bundles (DABs)
- Monitor and optimize cluster performance
- Manage reporting integrations (Power BI/Fabric) and publishing to the Lakehouse
- Contribute to documentation and knowledge sharing

What You Bring
- 5+ years as a Platform Engineer or Data Engineer within Azure
- Strong programming skills in Python and PySpark
- Hands-on experience with Databricks (Unity Catalog, Lakeflow, DABs)
- Experience with testing, version control, and Azure DevOps
- Relevant certifications (e.g., DP-203, Databricks Associate/Professional)
- Strong analytical and problem-solving skills

What We Offer
- Competitive salary and excellent benefits
- Opportunities for continuous learning and development
- Hybrid working model with 2–3 days in the office