Contract: Outside IR35
Location: UK-based, Hybrid / Remote
Clearance: Active SC clearance required (minimum)
Start: June
Day Rate: Competitive market rate
Overview
We are recruiting a contract Databricks Data Engineer for a large-scale UK Government programme delivering critical, cloud-native data platforms. This is a long-term engagement within a high-profile public sector transformation, working with sensitive, mission-critical data at national scale.
You must already hold active SC clearance to be considered.
Responsibilities
- Design, build and maintain Databricks data pipelines using Apache Spark (PySpark & Spark SQL)
- Implement Delta Lake / Lakehouse architectures (Bronze, Silver and Gold medallion layers)
- Develop scalable ELT/ETL workflows for batch and streaming data
- Optimise performance, reliability and cost across cloud data platforms
- Apply data quality, governance and security controls aligned to government standards
- Collaborate with architects, analysts and platform teams in an agile delivery environment
- Contribute to CI/CD pipelines and automated testing practices
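For illustration, the medallion pattern in the first two bullets comes down to staged refinement: raw records land in Bronze, are cleansed into Silver, then aggregated into Gold. A minimal sketch in plain Python (standing in for the PySpark/Delta Lake jobs this role would actually write; all record fields and function names here are hypothetical):

```python
# Simplified medallion-style refinement: Bronze (raw) -> Silver (cleansed)
# -> Gold (aggregated). In practice this logic would be implemented as
# PySpark transforms over Delta tables; plain Python is used for clarity.

# Bronze: raw records as ingested, duplicates and nulls included
bronze = [
    {"id": 1, "dept": "health", "amount": "120.50"},
    {"id": 1, "dept": "health", "amount": "120.50"},  # duplicate row
    {"id": 2, "dept": "transport", "amount": None},   # missing value
    {"id": 3, "dept": "health", "amount": "80.00"},
]

def to_silver(rows):
    """Cleanse: drop null amounts, deduplicate on id, cast amount to float."""
    seen, silver = set(), []
    for r in rows:
        if r["amount"] is None or r["id"] in seen:
            continue
        seen.add(r["id"])
        silver.append({**r, "amount": float(r["amount"])})
    return silver

def to_gold(rows):
    """Aggregate: total amount per department."""
    totals = {}
    for r in rows:
        totals[r["dept"]] = totals.get(r["dept"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'health': 200.5}
```

The same shape scales up directly: in Databricks, each stage becomes a Spark job writing to its own Delta table, with the cleansing and aggregation expressed as DataFrame operations rather than Python loops.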
Essential Skills & Experience
- Proven commercial experience as a Databricks Data Engineer
- Strong PySpark and SQL capability
- Hands-on delivery within AWS, Azure or GCP
- Experience with Delta Lake, data lakes and data warehouses
- Strong understanding of data modelling and scalable pipeline design
- Experience with Git, CI/CD and agile ways of working
- Active SC clearance (mandatory)
Desirable Experience
- Previous UK Government or public sector contracts
- Experience with Airflow, Azure Data Factory, AWS Glue or Kafka
- Exposure to Snowflake or hybrid data architectures
- Knowledge of secure, regulated data environments