We are hiring for a VC-backed AI startup in stealth, building the most accurate AI on very complex data. The company is headquartered in London but operates with a Silicon Valley mindset: everyone has high agency, is obsessed with the problem being solved, embraces the intensity of a six-day week, and has the ambition to build something world-changing. We are hiring exceptional research engineers and scientists to invent new foundation model architectures at the intersection of graph representation learning, transformers, and reinforcement learning.

What you will do
- Prototype new encoders and objectives.
- Build reward models in ambiguous domains.
- Adapt cutting-edge research in test-time compute and post-training.
- Develop scalable data pipelines and evaluation frameworks.

What we're looking for
- PhD or MSc in CS, AI, Mathematics, or Physics.
- Hands-on experience with foundation models (PyTorch, Ray).
- Strong background in graph neural networks and transformer architectures.
- Publications at top conferences (NeurIPS, ICML, ICLR, ACL).
- Experience with RL and post-training of LLMs.

What we offer
- Founding-level equity and a competitive salary.
- The chance to shape the direction of a new foundation model from day one.
- In-person work with a small, world-class research team in central London.