Jason Weston is a Senior Director and Research Scientist at Meta Fundamental AI Research (FAIR) and a Visiting Research Professor at NYU. Prior to this, he held Research Scientist positions at the Max Planck Institute for Biological Cybernetics (2002-2003), NEC Labs (2003-2009), and Google (2009-2014), after receiving his PhD from Royal Holloway, University of London in 2000. Dr. Weston's research has earned best paper awards at ICML'06 and ECML'10, a test of time award at ICML'18, and an outstanding paper award at ACL'25. He was part of the YouTube team that won a National Academy of Television Arts & Sciences Emmy Award for Technology and Engineering for Personalized Recommendation for Video Discovery. Dr. Weston has an h-index of 123 and 139,162 citations. Some of his notable work influencing the field of NLP includes the "NLP from scratch" work starting in 2008, which introduced pretraining and fine-tuning of language models; Memory Networks in 2014-2015, which introduced multi-layer attention before Transformers; DrQA in 2017, which introduced RAG-like methods; BlenderBot 1-3 and other LLM dialogue research pre-ChatGPT in 2018-2022; and more recently work such as Self-Rewarding LLMs for self-improvement.