Overview
Are you ready to revolutionise the world with TEKEVER?
At TEKEVER, we are the European leader in unmanned technology, where cutting-edge advances meet unparalleled innovation.
Digital | Defence | Security | Space
We operate across four strategic areas, combining artificial intelligence, systems engineering, data science, and aerospace technology to tackle global challenges - from protecting people and critical infrastructure to exploring space.
We offer a unique surveillance-as-a-service solution that delivers real-time intelligence, enhancing maritime safety and saving lives. Our products and services support strategic and operational decisions in the most demanding environments - whether at sea, on land, in space, or in cyberspace.
Become part of a dynamic, multidisciplinary, and mission-driven team that is transforming maritime surveillance and redefining global safety standards.
At TEKEVER, our mission is to provide limitless support through mission-oriented game-changers, delivering the right information at the right time to empower critical decision-making.
If you're passionate about technology and eager to shape the future - TEKEVER is the place for you.
Mission
As a Data Engineer, you will play a critical role in designing, building and maintaining the data pipelines and systems that support our data-driven initiatives, as well as supporting the evolution of our Data & Analytics Platform. You will work closely with data scientists, analysts and other stakeholders to ensure that our data & AI infrastructure is robust, scalable and efficient. The ideal candidate will have a strong background in data engineering, with experience in data integration, ETL processes, database management and Data & Analytics Platform development.
What will be your responsibilities
* Data Pipeline Development: Design, develop and maintain scalable and efficient data pipelines to collect, process and store large volumes of data from various sources.
* ETL Processes: Implement ETL (Extract, Transform, Load) processes to ensure data is accurately and efficiently transformed and loaded into data storage systems.
* Database Management: Manage and optimize databases and data warehouses to ensure data integrity, performance and availability.
* Data Integration: Integrate data from multiple sources, including APIs, databases and external data providers, to create unified datasets for analysis.
* Data & Analytics Platform Development: Support the development and expansion of our Data & Analytics Platform.
* Data Quality Assurance: Implement data validation and quality assurance processes to ensure the accuracy and consistency of data.
* Collaboration: Work closely with data scientists, analysts and other stakeholders to understand data requirements and provide the necessary data infrastructure and support.
* Performance Optimization: Monitor and optimize the performance of data pipelines and databases to ensure efficient data processing and retrieval.
* Documentation: Maintain comprehensive documentation of data pipelines, ETL processes and database schemas.
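As a purely illustrative sketch of the kind of ETL work described in the responsibilities above, the Python example below extracts a data feed, applies basic quality checks and loads the result into a PostgreSQL table using pandas and SQLAlchemy. The source URL, column names, table name and connection string are hypothetical placeholders, not part of TEKEVER's actual systems.

# Minimal ETL sketch: extract a CSV feed, clean it, and load it into PostgreSQL.
# All names below (URL, columns, table, connection string) are illustrative placeholders.
import pandas as pd
from sqlalchemy import create_engine

SOURCE_URL = "https://example.com/vessel_positions.csv"          # hypothetical data source
TARGET_TABLE = "vessel_positions"                                # hypothetical target table
DB_URI = "postgresql://user:password@localhost:5432/analytics"   # hypothetical connection

def extract(url: str) -> pd.DataFrame:
    """Pull the raw feed into a DataFrame."""
    return pd.read_csv(url)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Basic quality checks: drop duplicates and rows missing key fields."""
    df = df.drop_duplicates()
    df = df.dropna(subset=["timestamp", "latitude", "longitude"])
    df["timestamp"] = pd.to_datetime(df["timestamp"], utc=True)
    return df

def load(df: pd.DataFrame, table: str, uri: str) -> None:
    """Append the cleaned data to the warehouse table."""
    engine = create_engine(uri)
    df.to_sql(table, engine, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract(SOURCE_URL)), TARGET_TABLE, DB_URI)

In practice such steps would typically be scheduled and monitored by an orchestrator such as Airflow rather than run as a standalone script.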
Profile and requirements
* Education: Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
* Preferred working location: UK, Portugal, France and/or Spain
* Experience: 3+ years of experience in data engineering or a similar role.
* Technical Skills:
o Proficiency in programming languages such as Python, Java, or Scala.
o Experience with SQL and database management systems (e.g., MySQL, PostgreSQL, SQL Server).
o Familiarity with big data technologies (e.g., Hadoop, Spark) and data warehousing solutions (e.g., Redshift, Snowflake).
o Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services, with a focus on Google Cloud. Google Cloud certification is preferred.
o Knowledge of data integration tools and frameworks (e.g., Apache Nifi, Talend, Informatica).
o Experience with data modeling and schema design.
o Experience with Infrastructure as Code (e.g., Ansible, Terraform), data pipeline orchestration (e.g., Airflow), dashboarding tools for data and log exploration (e.g., Streamlit, Dash), data extraction and serving (e.g., PostGIS, Kafka, FastAPI), pandas, scikit-learn and Docker.
o Basic understanding of DevOps best practices and tools: Git, CI/CD, telemetry and monitoring, etc.
* Analytical Skills: Strong analytical and problem-solving skills with a focus on delivering scalable and efficient data solutions.
* Communication: Excellent verbal and written communication skills, with the ability to effectively collaborate with technical and non-technical stakeholders.
* Language Requirements: Proven C2-level fluency in both Portuguese and English.
* Attention to Detail: High attention to detail and a commitment to ensuring data quality and accuracy.
* Adaptability: Ability to work in a fast-paced, dynamic environment and manage multiple priorities simultaneously.
What we have to offer you
* An excellent work environment and an opportunity to create a real impact in the world;
* A truly high-tech, state-of-the-art engineering company with a flat structure and no politics;
* Working with the very latest technologies in Data & AI, including Edge AI and swarming, both within our software platforms and within our embedded on-board systems;
* Flexible work arrangements;
* Professional development opportunities;
* Collaborative and inclusive work environment;
* Salary commensurate with proven experience.
Do you want to know more about us?
Visit our LinkedIn page at https://www.linkedin.com/company/tekever/
Department: Data & AI | Location: TEKEVER Bristol (UK) | Remote status: Hybrid