Salary: £40,000 - £80,000 per year

Requirements:
- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field
- 3-5 years of experience as a Data Engineer or in a similar role
- Proficiency in Python for web crawling (e.g., using libraries like Scrapy, BeautifulSoup, or Selenium)
- Strong knowledge of data cleaning, standardization, and normalization techniques
- Experience with data analysis and modeling using libraries such as Pandas, NumPy, Scikit-learn, or TensorFlow
- Familiarity with SQL and database management systems (e.g., PostgreSQL, MySQL)
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and big data tools (e.g., Spark, Hadoop) is a plus
- Prior experience in financial data analysis is highly preferred
- Understanding of financial datasets, metrics, and industry trends
- Experience with API integrations and working with RESTful APIs (preferred)
- Knowledge of data visualization tools (e.g., Tableau, Power BI, or Matplotlib/Seaborn) (preferred)
- Familiarity with version control systems (e.g., Git) (preferred)
- Experience with containerization tools (e.g., Docker, Kubernetes) (preferred)
- Past experience working in Fintech, Financial Services, or related industries (preferred)

Responsibilities:
- Develop, deploy, and maintain web crawlers using Python to extract data from websites and social media platforms
- Ensure the scalability, reliability, and efficiency of web scraping processes
- Perform data cleaning, standardization, and normalization to ensure data quality and consistency
- Handle missing data, outliers, and inconsistencies in large datasets
- Analyze extracted data using advanced statistical and machine learning models
- Collaborate with data scientists to implement state-of-the-art models for predictive and prescriptive analytics
- Leverage past experience in financial data analysis to provide insights and support decision-making processes
- Work with financial datasets to identify trends, patterns, and anomalies
- Design and maintain ETL (Extract, Transform, Load) pipelines to streamline data workflows
- Integrate data from multiple sources and ensure seamless data flow across systems
- Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders
- Communicate findings and insights effectively through visualizations, reports, and presentations

Technologies: AI, API, AWS, Azure, Big Data, Cloud, Docker, ETL, GCP, Git, GitHub, Hadoop, Kubernetes, Machine Learning, MySQL, NumPy, Pandas, PostgreSQL, Power BI, Python, Selenium, Spark, SQL, Tableau, TensorFlow

More:
Verityv is an innovative, fast-growing Fintech start-up based in London and Cyprus, revolutionizing the way non-traditional financial risks are delivered to the market. We are dedicated to leveraging cutting-edge machine learning and artificial intelligence technologies to evolve our product into an agentic AI system that seamlessly integrates into clients' systems, automating compliance and portfolio risk analysis processes.

We offer a competitive salary and benefits package, opportunities for professional growth and development, and a collaborative and innovative work environment. This position is based in Cyprus, for our UK subsidiary.

Last updated: week 4 of 2026