Job Information
* Job Title: Senior Application Support Engineer
* Department: IT
* Reports To: Head of IT
* Location: Aghalee (base location, travel expected)
* Salary Range: £45k to £55k depending on experience
Job Purpose
We’re hiring a senior, hands‑on engineer to own data and application integrations across the group. You’ll combine strong T‑SQL and Python skills to support and improve our operational systems (TMS/WMS/FMS), design and build a scalable BI/data platform (AWS + Microsoft), and drive automation that replaces spreadsheet‑based, siloed processes. You will also help integrate AI into BI and operational workflows, working closely with stakeholders across Logistics, Operations, Finance, and Compliance.
Data Platform & BI (AWS)
* Design and implement modern data architectures: S3 data lake, Glue Data Catalog/ETL, Athena, Redshift (or equivalent), Lake Formation (governance), and QuickSight or an equivalent BI tool, covering datasets, in‑memory acceleration, and row‑/column‑level security.
* Build reliable data pipelines (batch/stream) using Glue/Lambda/Step Functions, event‑driven patterns (SQS/SNS/EventBridge), and CI/CD for data (IaC with Terraform/CloudFormation).
* Create dimensional models (star/snowflake), conformed dimensions, and robust SQL/Python transformations with strong testing and documentation.
* Implement monitoring, lineage, and cost optimisation; define and track RPO/RTO for analytics platforms.
Applications & Integrations (TMS/WMS/FMS)
* Provide Level 3 application support across transport, warehouse, and fleet systems; triage incidents, tune SQL, and deliver permanent fixes.
* Build and maintain system integrations via REST/SOAP APIs, EDI (XML/JSON) schemas, and flat‑file interfaces (SFTP/AS2).
* Author integration runbooks, error‑handling/retry patterns, and observability (dashboards/alerts) for data/app interfaces.
SQL & Python Engineering
* Write high‑quality T‑SQL (procedures, functions, views), optimise queries/indexes, and manage release/change control.
* Develop Python services and utilities (ETL jobs, API clients, data quality checks, automation tools), packaged and containerised where suitable.
* Use Git and CI pipelines for versioning, testing, and releases.
AI in BI & Operations
* Integrate AI services to augment BI (e.g., automated insights, anomaly detection, natural‑language Q&A, intelligent alerting) using platforms such as AWS Bedrock (and/or Azure equivalents), embedding safely into apps/dashboards.
* Implement RAG/semantic search on approved datasets; enforce guardrails, access controls, and auditability.
Automation & Process Improvement
* Replace spreadsheet/offline processes with governed datasets, apps, or workflows; standardise master data and reference tables.
* Build lightweight internal tools (portals, scripts, APIs) to streamline business workflows; document and train users.
Security, Governance & DR
* Enforce least‑privilege, RBAC, and secrets management (KMS/SSM Parameter Store/Key Vault).
* Apply data governance (naming, cataloguing, classification, retention), access reviews, and audit logging across AWS/Microsoft estates.
* Contribute to backup/restore strategies and DR tests for data platforms and critical applications.
Knowledge, Skills and Experience Required
Essential
* Bachelor's degree in Computer Science/Information Systems (or equivalent experience).
* 5+ years' professional experience across SQL (T‑SQL preferred), application support, and BI/data platforms.
* Strong Python skills; proven experience delivering production ETL/ELT pipelines.
* Hands‑on with AWS analytics (S3/Glue/Athena/Redshift), and experience with a mainstream BI platform (Power BI is currently in use; QuickSight experience is also valuable).
* Solid understanding of data modelling, performance tuning, and testing; comfortable with Git/CI and infrastructure as code.
* Excellent stakeholder engagement, documentation, and problem‑solving; able to translate business needs into data solutions.
* Experience supporting and integrating operational systems (TMS/WMS/FMS) and working with APIs/EDI/XML/JSON.
Desirable
* AWS: Data Analytics Specialty, Solutions Architect, Developer or SysOps; QuickSight Author/Admin.
* Microsoft: DP-203 (Data Engineering), PL-300 (Power BI), Fabric Analytics Engineer.
* Databases: Microsoft SQL Server (MCSA/DP-300); PostgreSQL and MySQL experience beneficial.
* Integration/Security: ITIL v4, experience with Purview, Azure Data Factory, Kafka/Kinesis, or message brokers.
* Familiarity with Bedrock‑hosted LLMs (e.g., Claude) or Azure OpenAI, and vector databases.
Personal Attributes
* Analytical mindset with a solutions‑focused approach
* Strong communication and collaboration skills
* Ability to manage multiple tasks and deliver to deadlines
* Proactive, innovative, and adaptable in a changing environment
Why join Hannon Transport
* Join a company focused on innovation, sustainability and growth
* Exposure to large‑scale digital transformation projects in a leading logistics organisation
* Professional development and training opportunities
* A competitive salary and company benefits are available for the right candidate
Note: This description is intended as a guide to the duties most likely to be involved and should not be taken as a definitive list. Hannon may adapt duties as deemed necessary.