Evergreen Passive Opening

We're always interested in talking to strong candidates, even if we're not hiring for this role right this second.

We are Manufacturing the Future!

Geomiq is revolutionizing traditional manufacturing by providing engineers worldwide with instant access to reliable production methods through our digital platform. As the UK's leading Digital Manufacturing Marketplace, we offer an AI-powered B2B MaaS (Manufacturing as a Service) solution, seamlessly connecting buyers and suppliers to drive efficiency and innovation.

With our headquarters in London and quality branches in India, China, and Portugal, we collaborate with leading brands like BMW, Rolls-Royce, Brompton Bikes, and Google, and have even contributed to space exploration.

Check out our website!

Our platform:

Geomiq offers a revolutionary platform that completely digitizes the quoting and ordering process for custom manufactured parts, ensuring the highest operational and quality outcomes. Our primary customers include Design Engineers, Mechanical Engineers, and Procurement teams, all of whom are involved in creating the world's most innovative products.

See our platform in action!

About the role:

We're looking for a hands-on Senior Data Engineer to own the full data lifecycle - from pipelines and transformations to backend tooling and BI. This role is perfect for someone who thrives in fast-paced environments, operates comfortably across both engineering and analytics, and is excited about building internal tools that directly improve product and customer experiences.

You'll be working with a mature stack (Python, BigQuery, dbt, FastAPI, Metabase), and your day-to-day will include both writing production-level code and making data actually useful for decision-makers.

Main responsibilities:

- Build, maintain, and optimize data pipelines using Python and dbt
- Own and evolve the backend codebase (FastAPI, Docker)
- Ensure pipeline reliability, code quality, and proper testing/documentation
- Maintain and extend data models and the BI layer (Metabase)
- Collaborate closely with product, data science, and leadership on strategic data tools
- Design and deliver internal tools, potentially leveraging LLMs and OpenAI APIs
- Write clean, production-grade code with version control (GitLab)

Experience Required:

- 5 years of Python, including writing production-level APIs
- Strong SQL and dbt for data transformation and modeling
- Experience with modern data stack components: BigQuery, GCS, Docker, FastAPI
- Solid understanding of data warehousing principles
- Proven ability to work cross-functionally with both technical and non-technical stakeholders
- Comfortable maintaining and optimizing BI dashboards (Metabase preferred)

Nice-to-have Experience:

- Familiarity with Streamlit or other lightweight internal tooling UIs
- Exposure to LLMs, OpenAI tools, or agent-based systems
- Experience using Google Analytics or reverse ETL tools like Hightouch
- Building and scaling data products from scratch
- Prior experience in startup or scale-up environments

Benefits:

- Competitive Salary: We offer pay that reflects your skills and the value you bring.
- Stocked Kitchen: Enjoy snacks, fresh fruit, and drinks all day.
- 23 Days Annual Leave: Recharge with 23 days off, plus bank holidays.
- Birthday Off: Take an extra day to celebrate your birthday.
- Christmas Shutdown: Relax over the holidays with additional company-wide time off.
- Pet-Friendly Office: Bring your dog to our pet-friendly workspace.
- Team Events: Connect with colleagues through monthly team-building activities.
- Career Growth: Benefit from our focus on internal promotions and development.
- Cycle to Work Scheme: Save on commuting, reduce emissions, and stay active.
- Expanding Perks: Look forward to more benefits as we grow.