Senior Data Engineer – Dublin (Hybrid, 3 days in the office)

We are seeking a Senior Data Engineer to join our growing team as part of an exciting Security Innovation program. This is an opportunity to design and deliver scalable, high-performance data solutions that leverage AI and Machine Learning to combat financial fraud. You'll work with cutting-edge technologies across big data, cloud, and modern data platforms to build pipelines, optimize workflows, and support analytics at scale.

What You'll Do
- Build, optimize, and maintain ETL pipelines using Hadoop ecosystem tools (HDFS, Hive, Spark).
- Assemble, process, and manage large, complex datasets to support analytics, BI, and AI-driven applications.
- Collaborate with Software Engineers, Data Scientists, and Architects to deliver efficient, reliable, and scalable data processing solutions.
- Perform data modelling, quality checks, and system performance tuning to ensure accuracy and efficiency.
- Design and implement process improvements for automation, workflow orchestration, and data scalability.
- Support modernization efforts, including cloud adoption and Databricks integration.
- Take ownership of clarifying requirements and proposing scalable, secure solutions before implementation.

What You Bring
- Strong experience with SQL and relational/NoSQL databases (Postgres, Oracle, CosmosDB).
- Hands-on expertise in big data frameworks such as Spark, Hadoop, Hive, Impala, Oozie, Airflow, and HDFS.
- Proficiency in Java, Scala, or Python for data engineering and automation.
- Solid understanding of distributed data processing, stream processing, and message queuing (Kafka, Spark Streaming, Storm).
- Cloud experience (preferably AWS) with services such as S3, Athena, EMR, Redshift, Glue, and Lambda.
- Experience with Snowflake or similar data warehousing solutions.
- Familiarity with Databricks and AI/ML integration for data platforms.
- CI/CD, containerization (Docker), and workflow management experience.
- Comfort working in Agile environments (Scrum, SAFe) and collaborating across cross-functional teams.
- Strong problem-solving skills and the ability to extract value from large, complex datasets.

Tech Stack You'll Work With
- Programming: Java / Scala / Python
- Big Data: Spark, Hadoop (Hive, Impala, Oozie, Airflow, HDFS)
- Cloud: AWS (S3, Athena, EMR, Redshift, Glue, Lambda); Azure or GCP experience a plus
- Data Platform: Databricks, Snowflake
- Tools: Kafka, Docker, CI/CD pipelines