The Data Engineering Principal leads the AWS design, build, and implementation of processes to capture, manage, store, and utilise structured and unstructured data from internal and external sources, turning the most complex business needs into the data that supports the DD Ops and BT group's data strategies.
Responsibilities
* Lead the design and implementation of robust, scalable, and secure data solutions using AWS services such as S3, Glue, Lambda, Redshift, EMR, and Kinesis, covering data pipelines, warehousing, and lakehouse architectures.
* Drive the migration of legacy data workflows to Lakehouse architectures, leveraging Apache Iceberg to enable unified analytics and scalable data management.
* Operate as a subject matter expert across multiple data projects, providing strategic guidance on best practices in design, development, and implementation.
* Build and optimise data pipelines for ingestion, transformation, and loading from diverse sources, ensuring high standards of data quality, reliability, and performance.
* Own the development of automation and monitoring frameworks that capture operational KPIs and pipeline health metrics, enabling proactive performance management.
* Identify and resolve performance bottlenecks in data workflows, ensuring optimal resource utilisation and cost-efficiency.
* Collaborate closely with architects, Product Owners, and development teams to decompose solutions into Epics, leading the design and planning of technical components.
* Mentor and coach engineering professionals, fostering a culture of continuous learning, innovation, and technical excellence.
* Champion an inclusive and open team culture, leading complex projects autonomously and facilitating high-impact technical discussions.
* Define and manage service level agreements (SLAs) for data products and production processes, ensuring reliability and accountability.
* Develop and optimise data science procedures, including storage strategies that use distributed data structures, databases, and other scalable technologies.
* Lead the implementation of continuous improvement initiatives, enhancing team processes and delivery capabilities.
* Serve as a trusted advisor to internal stakeholders, including data science and product teams, translating complex technical concepts into actionable solutions.
Qualifications
* Possess deep technical expertise in data engineering, with a strong command of modern practices and methodologies.
* Recognised as an expert in AWS cloud services, particularly in designing and implementing scalable data engineering solutions.
* Bring extensive experience in software architecture and solution design, ensuring robust and future-proof systems.
* Hold specialised proficiency in Python and Apache Spark, enabling efficient processing of large-scale data workloads.
* Demonstrate the ability to set technical direction, uphold high standards for code quality, and optimise performance in data-intensive environments.
* Adept at using automation tools and CI/CD pipelines to streamline development, testing, and deployment processes.
* An exceptional communicator, capable of translating complex technical concepts for diverse audiences including engineers, product managers, and senior leadership.
* Provide thought leadership within engineering teams, fostering a culture of quality, efficiency, and collaboration.
* Experienced in mentoring engineers, guiding them in advanced coding practices, architectural thinking, and strategic problem-solving to elevate team capabilities.
* A proven track record as a Principal Engineer leading teams in best practices across design, development, and implementation, mentoring engineers and cultivating a culture of continuous learning and innovation.
* Extensive background in software architecture and solution design, with deep expertise in microservices, distributed systems, and cloud-native architectures.
* Advanced proficiency in Python and Apache Spark, with a strong focus on ETL data processing and scalable data engineering workflows.
* In-depth technical knowledge of AWS data services, with hands-on experience implementing data pipelines using tools such as EMR, AWS Glue, AWS Lambda, Step Functions, API Gateway, and Athena.
* Proven experience in designing and delivering Lakehouse architectures, enabling unified analytics across structured and unstructured data.