


We are TXP. We help businesses and organisations move forward, at pace and at scale. We believe in the transformative power of combining technology and people. By providing consulting expertise, development services and resourcing, we work closely with organisations to solve their most complex business problems.

Our work transforms organisations, and we take that responsibility seriously. We focus on success, pursue excellence and take ownership of everything we do. But achieving that level of performance requires an inclusive and supportive working environment. We believe in the power of technology and people, and we help everyone here to succeed. At TXP, you can multiply your potential.

Role Purpose

The Data Engineer is a client-facing delivery role responsible for designing, building and maintaining data pipelines and platform infrastructure on Microsoft Fabric. Operating within a consulting model, the role is deployed across engagements of varying scope and duration, delivering production-grade data solutions that meet client requirements. The Data Engineer works within cross-functional delivery teams to transform raw data into reliable, governed and analytically ready assets.

Key Responsibilities

Data Pipeline Development
- Design and build end-to-end data pipelines using Microsoft Fabric Data Factory, dataflows and notebooks.
- Implement ingestion patterns for batch and near-real-time data from diverse source systems, including APIs, databases, flat files and event streams.
- Develop transformation logic using PySpark, Spark SQL or T-SQL within Fabric lakehouses and warehouses.
- Build and maintain medallion architecture (bronze, silver, gold) to structure data processing layers.

Platform Engineering
- Configure and manage Microsoft Fabric workspaces, capacities and compute resources across client tenants.
- Implement OneLake storage strategies, including shortcuts and mirroring, to enable unified data access.
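Purely for illustration, the medallion layering referenced above (bronze raw, silver cleaned, gold aggregated) can be sketched in plain Python; in Fabric this logic would run as PySpark over delta tables, and the schema and field names here are hypothetical:

```python
# Sketch of a medallion flow: bronze (raw) -> silver (cleaned) -> gold (aggregated).
# Plain dicts stand in for delta tables so the layering is easy to follow.
from collections import defaultdict

def to_silver(bronze_rows):
    """Clean and deduplicate raw rows: drop records missing a key, keep the latest per id."""
    latest = {}
    for row in bronze_rows:
        if row.get("order_id") is None:          # reject malformed records
            continue
        prev = latest.get(row["order_id"])
        if prev is None or row["ingested_at"] > prev["ingested_at"]:
            latest[row["order_id"]] = row        # last write wins
    return list(latest.values())

def to_gold(silver_rows):
    """Aggregate cleaned rows into an analytics-ready summary per customer."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["customer"]] += row["amount"]
    return dict(totals)

bronze = [
    {"order_id": 1, "customer": "acme", "amount": 10.0, "ingested_at": 1},
    {"order_id": 1, "customer": "acme", "amount": 12.0, "ingested_at": 2},  # late correction
    {"order_id": None, "customer": "??", "amount": 0.0, "ingested_at": 3},  # malformed
    {"order_id": 2, "customer": "beta", "amount": 5.0, "ingested_at": 4},
]
silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'acme': 12.0, 'beta': 5.0}
```

Each layer only ever reads from the one below it, which is what keeps reprocessing and lineage tractable at scale.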
- Establish CI/CD pipelines for Fabric assets using Azure DevOps or GitHub Actions.
- Manage deployment across development, test and production environments with appropriate governance controls.

Data Quality and Governance
- Implement data quality checks, validation rules and monitoring within pipelines.
- Apply data cataloguing, lineage tracking and metadata management practices using Purview integration.
- Design and enforce access control models, row-level security and sensitivity labelling.
- Document data models, pipeline logic and operational procedures to consulting-grade standards.

Client Delivery
- Translate client requirements into technical designs and implementation plans.
- Operate within consulting delivery frameworks, managing scope, timelines and stakeholder expectations.
- Contribute to estimation, solution architecture and proposal development for data workstreams.
- Present technical approaches and progress updates to client stakeholders at varying levels of seniority.
- Conduct knowledge transfer sessions and produce handover documentation for client teams.

Collaboration and Standards
- Work within multidisciplinary delivery teams alongside analytics engineers, data scientists and business analysts.
- Contribute to internal capability development through reusable accelerators, templates and reference architectures.
- Participate in code reviews, design reviews and retrospectives.
- Stay current with Microsoft Fabric platform updates, features and best practices.

Required Skills and Experience
- Demonstrable experience building data pipelines in production environments.
- Proficiency with Microsoft Fabric, including Data Factory, lakehouses, warehouses, notebooks and semantic models.
- Strong skills in PySpark, Spark SQL and T-SQL for data transformation.
- Experience with medallion architecture or equivalent layered data processing patterns.
- Working knowledge of OneLake, delta tables and Fabric capacity management.
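As a simplified illustration of the in-pipeline data quality checks mentioned above, a validation step can be modelled as a set of named rules that quarantine failing rows instead of loading them; the rule names and example schema are hypothetical, not a Fabric API:

```python
# Sketch of an in-pipeline quality gate: each rule is a predicate over a row;
# rows failing any rule are quarantined with the names of the rules they broke.

def run_quality_checks(rows, rules):
    """Split rows into (passed, quarantined), where quarantined pairs each row with its failures."""
    passed, quarantined = [], []
    for row in rows:
        failures = [name for name, check in rules.items() if not check(row)]
        if failures:
            quarantined.append((row, failures))
        else:
            passed.append(row)
    return passed, quarantined

rules = {
    "amount_non_negative": lambda r: r["amount"] >= 0,
    "customer_present": lambda r: bool(r.get("customer")),
}

rows = [
    {"customer": "acme", "amount": 10.0},
    {"customer": "", "amount": -3.0},   # fails both rules
]
passed, quarantined = run_quality_checks(rows, rules)
print(len(passed), quarantined[0][1])  # 1 ['amount_non_negative', 'customer_present']
```

Recording which rule failed, rather than a bare pass/fail flag, is what makes downstream monitoring and lineage reporting useful.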
- Proficiency with version control (Git) and CI/CD practices for data platform assets.
- Understanding of data governance principles, including cataloguing, lineage and access control.
- Experience working in a consulting, professional services or client-facing delivery environment.
- Strong written and verbal communication skills, with the ability to explain technical concepts to non-technical audiences.

Desirable Skills
- Experience with Azure Data Lake Storage, Azure Synapse Analytics or Azure Data Factory prior to Fabric migration.
- Familiarity with Microsoft Purview for data governance and compliance.
- Exposure to Power BI semantic models and report development within Fabric.
- Experience with event-driven architectures using Azure Event Hubs or Kafka.
- Knowledge of infrastructure-as-code tools such as Terraform or Bicep for Azure resource provisioning.
- DP-600 (Fabric Analytics Engineer) certification or equivalent Microsoft certifications.

Qualifications

Relevant Microsoft certifications are advantageous.

Benefits
- 25 days annual leave (plus bank holidays).
- An additional day of paid leave for your birthday (or Christmas Eve).
- Salary sacrifice pension with matched employer contributions (4%).
- Life assurance (3x).
- Access to an Employee Assistance Programme (EAP).
- Private medical insurance through our partner Aviva.
- Cycle to work scheme.
- Corporate eye-care vouchers.
- Access to an independent financial advisor.
- 2 x social value days per year to give back to local communities.

Grow with us

Work on exciting new projects. If you want to avoid getting stuck with the mundane, you're in the right place. We work in many sectors with fantastic clients, so you'll always be working on something exciting and challenging.

Career growth: we've got you. We recognise that you might have a career path planned out and might need some support to help you move forward.
We're here to support you and help you make the most of your time with us, through challenging work, opportunities to grow, and learning and development.

Be part of the TXP growth journey. We are a high-growth, fast-paced environment with 200+ employees, working with clients across the UK. Joining TXP means you'll be part of that.