AI Deployment Engineer
Location: Bristol/London - Hybrid, 3 days in the office
Salary: £80,000 - £130,000 p/a dependent on experience + excellent benefits
You will be the deep technical engine behind our internal AI deployment team. Where the AI Deployment Strategist scopes and builds agents, you own the infrastructure underneath: data pipelines, system integrations, and everything that ensures deployed AI solutions work reliably at scale.
You will take prototypes to production, ensure all business systems communicate correctly, and build the technical backbone that our AI tooling depends on.
Key duties & responsibilities:
• Own the data infrastructure underpinning AI deployments — pipelines, storage, and data serving
• Integrate AI solutions into the existing business ecosystem: CRMs, ERPs, SaaS tools, and internal systems
• Build and maintain APIs, webhooks, and middleware that allow AI agents to interact with business systems
• Take Strategist-built prototypes to production-grade — hardening, scaling, and ensuring reliability
• Set up monitoring, logging, and alerting across deployed pipelines and agent infrastructure
• Manage data models, schemas, and storage supporting current and future AI deployments
• Troubleshoot integration failures, data inconsistencies, and production issues
Key skills:
• Python & SQL (production-grade pipeline development)
• REST APIs, webhooks, OAuth, event-driven architecture
• Orchestration tools: Airflow, Prefect, or Dagster
• Cloud platforms: AWS, GCP, or Azure
• Docker & Kubernetes
• Microsoft 365 & Microsoft Copilot
Desirable skills:
• Vector databases and embedding pipelines
• Real-time streaming (Kafka, Flink)
• RPA tooling (UiPath, Power Automate)
• dbt for data transformation
• Claude Code, Claude Cowork, or Claude Skills
Experience & qualifications:
• 3–5 years in software or data engineering with strong exposure to system integrations, data pipelines, and production infrastructure.
• Strong Python and SQL skills; experienced building robust, production-grade data pipelines from scratch.
• Deep familiarity with integration patterns: REST APIs, webhooks, OAuth, and event-driven architectures.
• Experience with orchestration tools (Airflow, Prefect, or Dagster) and transformation frameworks (dbt or similar).
• Comfortable across cloud platforms (AWS, GCP, or Azure) and with containerisation (Docker, Kubernetes).
• Experience connecting disparate business systems — SaaS platforms, internal databases, and third-party APIs — and making them work reliably.
• Strong debugging instincts and a high bar for reliability and data integrity.
• Comfortable with Microsoft 365 and Microsoft Copilot. Familiarity with AI productivity tools including Claude Code, Claude Cowork, and Claude Skills is a plus.
• Bachelor's degree in Computer Science, Engineering, or equivalent practical experience.
Application:
Please submit your CV along with a cover letter to .
Smartstream is an equal opportunities employer. We are committed to promoting equality of opportunity and following practices which are free from unfair and unlawful discrimination.