Outside Spy

2 data engineer jobs found

Current search: data engineer, £400 - £500 per day, England
Refine by city: Newbury (1)
Feb 18, 2026
Duration not stated
Data Engineer at Staffing Connect, Newbury, £400 per day
Newbury, Berkshire

Details
  • Location: Newbury, hybrid (2 days onsite mandatory)
  • Client: Gamma
  • RRF ID: 2026-18851
  • Rate: £400 per day, Outside IR35
  • Duration: 3 months initial, start ASAP

Overview
We are looking for an experienced Data Engineer to design and deliver scalable, secure, and high-performance data solutions. The role focuses heavily on Snowflake architecture, cloud data platforms, and modern data pipeline development.

Key Responsibilities
  • Design and architect data solutions using Snowflake
  • Optimise workflows and integrations using Openflow
  • Develop and maintain ETL and ELT pipelines from Salesforce, relational databases, files, and NoSQL sources
  • Implement dimensional modelling to support data warehousing and BI reporting
  • Build and optimise data pipelines across Azure and AWS
  • Ensure data quality, governance, and security best practices
  • Enable reporting and analytics through Power BI and Tableau integration
  • Collaborate with cross-functional teams to translate business requirements into...
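To give candidates a feel for the dimensional-modelling responsibility above, here is a minimal, self-contained Python sketch. It is not from the client's stack: there is no real Snowflake connection, and all table and column names are hypothetical. A production pipeline would typically express this in SQL or dbt against staged data; the sketch just shows raw records being split into a dimension table and a fact table linked by surrogate keys.

```python
# Star-schema sketch: split raw order records into a customer dimension
# and an orders fact table. All names here are hypothetical examples.

RAW_ORDERS = [
    {"order_id": 1, "customer": "Acme Ltd", "region": "England", "amount": 400.0},
    {"order_id": 2, "customer": "Beta PLC", "region": "Wales", "amount": 250.0},
    {"order_id": 3, "customer": "Acme Ltd", "region": "England", "amount": 150.0},
]

def build_star_schema(raw_orders):
    """Return (dim_customer, fact_orders), joined via surrogate keys."""
    dim_customer = {}   # natural key -> dimension row with surrogate key
    fact_orders = []
    for rec in raw_orders:
        key = rec["customer"]
        if key not in dim_customer:
            dim_customer[key] = {
                "customer_sk": len(dim_customer) + 1,
                "customer_name": key,
                "region": rec["region"],
            }
        fact_orders.append({
            "order_id": rec["order_id"],
            "customer_sk": dim_customer[key]["customer_sk"],
            "amount": rec["amount"],
        })
    return list(dim_customer.values()), fact_orders

dims, facts = build_star_schema(RAW_ORDERS)
print(len(dims), len(facts))  # 2 3 — two distinct customers, three orders
```

The same fact/dimension split is what enables the Power BI and Tableau reporting mentioned in the listing: BI tools join the slim fact table to conformed dimensions rather than scanning denormalised raw data.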
Feb 23, 2026
Duration not stated
Senior Databricks Consultant at IO Associates, South West England, £350 - £400 per day
South West England, UK

We're supporting a major financial-systems transformation programme and are looking for a Senior Databricks Engineer to help build and enhance a scalable data-engineering framework used across critical reporting processes. You'll operate as the senior hands-on engineer, shaping reusable libraries, optimising PySpark pipelines, and guiding offshore developers to deliver high-quality, production-ready code.

You'll work across ingestion, validation, transformation, and mapping layers within a Databricks-on-Azure environment, helping to establish consistent engineering patterns and ensuring all deliveries meet high standards of performance, traceability, and maintainability.

What you'd be doing:
  • Building and extending reusable Databricks/PySpark libraries (ingestion, validation, transformation, mapping).
  • Developing scalable, optimised PySpark pipelines aligned to metadata-driven or hybrid data models.
  • Implementing robust validation and control logic suitable for...
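The "metadata-driven" validation pattern named in this listing can be sketched in plain Python, with no Spark dependency and with column names and rules that are purely hypothetical. The idea is that validation rules live in a small rules table (metadata) and one generic function applies them, rather than each pipeline hard-coding its own checks; a Databricks implementation would apply the same pattern over PySpark DataFrames.

```python
# Metadata-driven validation sketch: the rules are data, not code.
# Column names and checks below are hypothetical examples.

VALIDATION_RULES = [
    {"column": "account_id", "check": "not_null"},
    {"column": "amount", "check": "non_negative"},
]

CHECKS = {
    "not_null": lambda v: v is not None,
    "non_negative": lambda v: v is not None and v >= 0,
}

def validate(rows, rules):
    """Split rows into (valid, rejected); rejected rows keep their failures."""
    valid, rejected = [], []
    for row in rows:
        failures = [r["column"] for r in rules
                    if not CHECKS[r["check"]](row.get(r["column"]))]
        if failures:
            rejected.append((row, failures))
        else:
            valid.append(row)
    return valid, rejected

rows = [
    {"account_id": "A1", "amount": 100.0},
    {"account_id": None, "amount": 50.0},
    {"account_id": "A3", "amount": -5.0},
]
good, bad = validate(rows, VALIDATION_RULES)
print(len(good), len(bad))  # 1 2 — one clean row, two rejected with reasons
```

Keeping the failure reasons alongside rejected rows is what supports the traceability requirement the listing mentions: rejects can be landed to a quarantine table with the failed rule names attached.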
© 2019-2026 Powered by Contract Intelligence