Technical Data Architect


Overview

Join to apply for the Technical Data Architect role at Death with Dignity.

We are hiring a Technical Data Architect. Location: Central London. Type: Permanent. Hybrid role (2-3 days per week at the client location).

We are seeking a highly skilled Technical Data Architect with expertise in Databricks, PySpark, and modern data engineering practices. The ideal candidate will lead the design, development, and optimization of scalable data pipelines, while ensuring data accuracy, consistency, and performance across the enterprise Lakehouse platform. This role requires strong leadership, technical depth, and the ability to collaborate with cross-functional teams.

Key Responsibilities

  • Lead the design, development, and maintenance of scalable, high-performance data pipelines on Databricks.
  • Architect and implement data ingestion, transformation, and integration workflows using PySpark, SQL, and Delta Lake.
  • Guide the team in migrating legacy ETL processes to modern cloud-based data pipelines.
  • Ensure data accuracy, schema consistency, row counts, and KPIs during migration and transformation.
  • Collaborate with Data Engineering, BI Engineering, and Security teams to define data standards, governance, and compliance.
  • Optimize Spark jobs and Databricks clusters for performance and cost efficiency.
  • Support real-time and batch data processing for downstream systems (e.g., BI tools, APIs, reporting consumers).
  • Mentor junior engineers, conduct code reviews, and enforce best practices in coding, testing, and deployment.
  • Validate SLAs for data processing and reporting, ensuring business requirements are consistently met.
  • Stay updated with industry trends and emerging technologies in data engineering, cloud platforms, and analytics.

Required Skills & Qualifications

  • 10-12 years of experience in data engineering, with at least 3 years in a technical lead role.
  • Strong expertise in Databricks, PySpark, Delta Lake, and DBT.
  • Advanced proficiency in SQL, ETL/ELT pipelines, and data modelling.
  • Experience with Azure Data Services (ADLS, ADF, Synapse) or other major cloud platforms (AWS/GCP).
  • Strong knowledge of data warehousing, transformation logic, SLAs, and dependencies.
  • Hands-on experience optimising Databricks, DBT, and Delta Lake workloads; experience with real-time streaming and near-real-time batch processing is a plus.
  • Familiarity with CI/CD pipelines, DevOps practices, and Git-based workflows.
  • Knowledge of data security, encryption, and compliance frameworks (GDPR, SOC2, ISO).
  • Excellent problem-solving skills, leadership ability, and communication skills.

Preferred Qualifications

  • Certifications in Databricks or Azure.
  • Experience with DBT, APIs, or BI integrations (Qlik, Power BI, Tableau).

Seniority level

  • Mid-Senior level

Employment type

  • Full-time

Job function

  • Information Technology

Industries

  • IT System Design Services
Location:
City Of London, England, United Kingdom
Salary:
£125,000 - £150,000
Job Type:
Full-time
Category:
IT & Technology