The Samsung Research and Development Institute, UK (SRUK) conducts cutting-edge applied research on the foundations and challenges of artificial intelligence (AI) to develop state-of-the-art solutions for real-world, large-scale problems. The department regularly disseminates its research outputs to local and international research communities through research seminars and top-tier conferences and journals. Successful candidates will have the opportunity to work on diverse fields of AI, including machine learning, deep learning, and natural language processing (NLP), as members of the AI Research Team.

Role and Responsibilities

As a forward-thinking company, we are at the forefront of innovation and seek an individual with a passion for pushing the boundaries of AI:

- Conduct cutting-edge research to develop state-of-the-art solutions to existing problems and/or propose novel research challenges based on real-world case studies in AI.
- Develop high-quality code with detailed documentation to support reproducible machine learning research in local and international research communities.
- Review state-of-the-art research papers and develop prototype solutions.
- Publish in top-tier conferences and journals, such as NeurIPS, ICML, ICLR, EMNLP, CVPR, ICCV, ECCV, AAAI, ACL, IEEE TPAMI, IEEE TNNLS, IJCV, and JMLR.

Skills and Qualifications

Required Skills:

- Academic Background: Pursuing a PhD in ML/AI, Computer Science/Engineering, or a related field.
- Mathematical and Computer Science Fundamentals: Proficiency in calculus, probability, statistics, linear algebra, optimization, algorithms, data structures, and parallel/distributed computing.
- Machine Learning and Deep Learning: Strong understanding of ML and DL concepts.
- Research and Implementation:
  - First-author publications in top ML/AI conferences/journals (e.g., ICML, NeurIPS, ICLR, EMNLP, CVPR, ECCV, IEEE TPAMI, AAAI).
  - Hands-on experience in at least one of these areas: Generative AI, Parameter-Efficient Fine-Tuning (PEFT), Foundation Models, Large Language Models (LLMs), Data/Model Privacy, Prompting Methods, and Model Compression (e.g., Quantization, Pruning, Knowledge Distillation).
- Technical Experience:
  - Familiarity with Linux environments.
  - Proficiency in programming languages such as Python, Java, or C++.
  - Experience with ML libraries such as PyTorch, scikit-learn, and NumPy.
- Soft Skills: Excellent communication, teamwork, problem-solving, and debugging abilities.

Desirable Skills

Advanced expertise in one or more of:

- Text Generation, PEFT-LoRA, Autoregressive Models, and Prompting Methods.
- Training/Fine-tuning Foundation Models and LLMs.
- Data/Model Privacy and Model Compression techniques.
- Retrieval-Augmented Generation.

Contract Type: 2-3 month internship starting ASAP and running until 31st December 2025
Location: Samsung R&D Centre, Staines-upon-Thames, Surrey, UK
Hybrid Working: Minimum of 3 days per week onsite and 2 days working from home