Data Engineer (Mid Level)
London – 4 Days in Office
Reporting to: Director, Special Projects
Note: We are unable to provide Visa Sponsorship for this role
Data at Pubity moves fast. We’re looking for an engineer who wants to take real ownership early in their career and become the first dedicated data hire at the company.
You will help build the foundations of how data works across the business. That means creating reliable pipelines, structuring messy inputs, and building systems the team can trust to make decisions.
If you enjoy building things from the ground up, working closely with stakeholders, and developing scalable data systems as you grow into the role, this is a rare opportunity to do all three.
About the Role
As our first Data Engineer, you will help shape Pubity Group’s data function from the ground up.
This is a hands-on role where you will design and build the first generation of our data pipelines and models. You will work closely with the Director of Special Projects and with teams across Social, Commercial and Studio to ensure data is reliable, accessible and useful across the business.
You won’t be expected to arrive with a fully built playbook. Instead, we want someone capable, curious and ambitious who wants to take ownership of the data layer and grow into the role as the function scales.
Key Responsibilities
You will:
* Take ownership of the company’s early-stage data infrastructure and help shape how it evolves
* Build and maintain data pipelines and models in GCP using BigQuery, SQL and Python
* Develop reliable Python pipeline code using tools such as requests and pandas (see the sketch after this list)
* Build and maintain API integrations across platforms including Meta, TikTok, YouTube and X
* Expand pipelines into internal systems such as CRM platforms, project management tools and Slack
* Implement data validation, monitoring and quality checks to ensure reliability
* Help establish metric definitions and consistent reporting standards across platforms
* Schedule and orchestrate data workflows using GCP tools such as Cloud Composer
* Work with teams across Social, Editorial, Commercial and Studio to deliver reporting and dashboards
* Document pipelines, definitions and processes so the data function can scale properly over time
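To give a flavour of the day-to-day work, here is a minimal sketch of the kind of ingestion step described above, assuming a hypothetical platform endpoint and BigQuery table; every name in it is illustrative rather than a description of our actual setup.

```python
import requests
import pandas as pd
from google.cloud import bigquery

# Hypothetical endpoint and table names, used for illustration only.
API_URL = "https://graph.example.com/v1/posts"
TABLE_ID = "example-project.social.post_metrics"

def fetch_post_metrics(access_token: str) -> pd.DataFrame:
    """Pull raw post metrics from a platform API and normalise them into a DataFrame."""
    response = requests.get(
        API_URL,
        params={"access_token": access_token, "limit": 100},
        timeout=30,
    )
    response.raise_for_status()  # fail loudly so monitoring can flag a bad pull
    records = response.json().get("data", [])
    df = pd.json_normalize(records)
    # Basic validation: drop rows missing fields that downstream models rely on.
    return df.dropna(subset=["id", "timestamp"])

def load_to_bigquery(df: pd.DataFrame) -> None:
    """Append the cleaned rows to a BigQuery table."""
    client = bigquery.Client()
    job = client.load_table_from_dataframe(
        df,
        TABLE_ID,
        job_config=bigquery.LoadJobConfig(write_disposition="WRITE_APPEND"),
    )
    job.result()  # block until the load finishes, surfacing any errors
```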
What We’re Looking For
Must Haves
* Experience building and maintaining production data pipelines
* Strong SQL skills and experience with BigQuery or a similar warehouse
* Strong Python skills for data pipelines and API integrations
* Experience working with APIs and ingesting external platform data
* Familiarity with modern data modelling tools such as dbt or Dataform
* A mindset focused on reliability, monitoring and good engineering practices
Important
* Experience with orchestration tools such as Cloud Composer, Airflow or similar
* Experience connecting data models to BI platforms such as Power BI, Looker or similar
* Ability to work with stakeholders and prioritise requests across teams
Nice to Have
* Experience optimising queries and managing warehouse costs
* Postgres or Supabase experience
* Node.js scripting for integrations
* Exposure to multiple cloud environments such as AWS or Azure
Platforms and Pipeline Scope
You will help build and scale pipelines across:
* Meta (Facebook and Instagram), which already has integrations in place
* Next platforms, including TikTok, YouTube, X, Snap, LinkedIn and Google Ads
* Internal systems such as CRM platforms, project management tools, meeting trackers and Slack
Much of this infrastructure will be built for the first time, and you will play a key role in defining how it works.
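To illustrate the orchestration side of that work, here is a minimal Airflow-style DAG of the kind Cloud Composer runs; the DAG name, schedule and task callables are placeholders, not a description of our production pipelines.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical callables standing in for real extract and load steps.
def extract_platform_metrics(**context):
    ...

def load_to_warehouse(**context):
    ...

with DAG(
    dag_id="social_metrics_daily",   # illustrative name
    schedule_interval="0 6 * * *",   # run daily at 06:00
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_platform_metrics",
        python_callable=extract_platform_metrics,
    )
    load = PythonOperator(
        task_id="load_to_warehouse",
        python_callable=load_to_warehouse,
    )
    extract >> load  # the load step runs only after extraction succeeds
```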
You’ll Thrive Here If
* You enjoy building things from scratch
* You want ownership and impact early in your career
* You like solving messy data problems and making systems reliable
* You work well with non-technical teams and can translate data into useful outputs