Senior Data Engineer

Location: City of London, London | Salary: £75,000 - £85,000 per annum | Type: Permanent

Azure, Databricks, PySpark, Azure Data Factory

London - Hybrid (2 days in the office)
£75,000 to £85,000 + bonus & benefits

We are working with a leading Lloyd's & London Market insurer that continues to invest heavily in its Azure-based data platform. This role sits within a specialist data engineering team responsible for building & evolving a modern cloud data architecture.

You will work across a Databricks-led Azure data stack, contributing to the design & development of scalable data pipelines & helping to shape how the platform evolves. The team operates in a collaborative environment with strong technical capability, where engineers are encouraged to take ownership & influence solutions.

This role is suited to a hands-on Senior Data Engineer who is comfortable working across both development & design, with strong experience in Databricks & modern data engineering practices.

Responsibilities

  • Design & develop scalable data solutions using Azure technologies including Databricks, Azure Data Factory, ADLS & Synapse
  • Build & optimise data pipelines using PySpark & SQL within a lakehouse architecture
  • Work with Delta Lake & contribute to the implementation of structured medallion data layers (bronze, silver, gold)
  • Contribute to solution design, translating business requirements into technical data models & pipelines
  • Collaborate with engineers, architects & product owners to deliver new data capabilities
  • Implement & support CI/CD pipelines & DevOps practices across multiple environments
  • Monitor & optimise performance of data pipelines, identifying improvements in scalability & efficiency
  • Support data quality, validation & governance practices across the data platform

Requirements

  • Strong experience as a Data Engineer within an Azure environment
  • Hands-on experience with Databricks, including PySpark & Spark SQL
  • Experience working with Azure Data Factory & Azure Data Lake (ADLS Gen2)
  • Good understanding of Delta Lake & modern lakehouse architectures
  • Strong SQL skills, including complex transformations & analytical queries
  • Experience building & maintaining ETL or ELT pipelines at scale
  • Exposure to CI/CD & DevOps tooling (e.g. Azure DevOps, GitHub Actions)
  • Experience working in Agile teams with strong stakeholder engagement

Nice to have

  • Experience with data modelling (dimensional / semantic models)
  • Exposure to data governance or tooling such as Unity Catalog
  • Experience with streaming or near real-time data pipelines
  • Experience within Financial Services or the London Market

Why apply

  • Opportunity to work on a modern Azure & Databricks data platform
  • Strong technical team with real ownership & influence
  • Exposure to both engineering & solution design
  • Clear opportunity to further develop within a cloud-first environment