About Me
Hi, I’m Younesse Kaddar, a French PhD student in Computer Science at the University of Oxford, working on category theory, probabilistic programming, and deep learning for reasoning (especially program synthesis and automated theorem proving). Ultimately, what I’m passionate about is understanding cognition, and I see two paths toward this goal:
- A more algebraic path, through formal methods and category theory (deeper mathematical understanding, but less immediately applicable); a natural first step on this front (in my very biased opinion) is the categorical semantics of probabilistic programming languages and, more recently, Markov categories.
- And a more empirical path, through state-of-the-art transformer-based reasoning (which has produced remarkable results lately, but still lacks reliability, steerability, safety, and formal theory).
Last year, I did research internships to explore the latter direction. I spent six months at Mila (Quebec AI Institute) working in Yoshua Bengio’s team on GFlowNets for steering the outputs of large language models (LLMs), trying to make LLMs reason more reliably through amortized sampling of high-quality chains of thought. I then joined Cohere For AI as a research scholar, focusing on detecting and mitigating LLM hallucinations.
Currently, I’m back to my PhD, working on LLM-guided probabilistic program synthesis: the idea is to use LLMs to automatically turn natural-language descriptions into probability distributions over statistical models (expressed as probabilistic programs) that model real-world phenomena. The dream is to make complex statistical modeling more accessible and interpretable while ensuring safety, by using GFlowNet-finetuned LLMs as “world model compilers” rather than unrestricted agents.
I am also a co-maintainer of LazyPPL, a Haskell probabilistic programming library for Bayesian nonparametrics, within Sam Staton’s team.
More broadly, my academic work tries to bridge two approaches: formal methods and category theory on one hand, and practical machine learning (especially LLM research) on the other, with the long-term goal of building AI systems we can trust and, ultimately, of better understanding the nature of intelligence.
Outside of academia, I’m also the cofounder and CTO of RightPick, a startup that helps alumni from a network of European universities find top jobs in tech, finance, and consulting. Our iOS app is available on the App Store, and RightPick has been featured in several sections of the Oxford University Careers website, including the Management Consultancy, Technology, Data, Machine Learning & AI, Business & Management, and Banking & Investment pages.
Publications and Research Experience
2025
- Uncertainty-Aware Step-wise Verification with Generative Reward Models
Z. Ye, L. C. Melo, Y. Kaddar, P. Blunsom, S. Staton, Y. Gal
Preprint 2025
Paper
2024
- Can a Bayesian Oracle Prevent Harm from an Agent?
Y. Bengio, M. K. Cohen, N. Malkin, M. MacDermott, D. Fornasiere, P. Greiner, Y. Kaddar
Paper Code
- Amortizing Intractable Inference in Large Language Models
E. J. Hu, M. Jain, E. Elmoznino, Y. Kaddar, G. Lajoie, Y. Bengio, N. Malkin
International Conference on Learning Representations (ICLR) 2024 (Honourable Mention)
Paper Code
- Probabilistic Programming Interfaces for Random Graphs: Markov Categories, Graphons, and Nominal Sets
N. Ackerman, C. Freer, Y. Kaddar, J. Karwowski, S. Moss, D. Roy, S. Staton, H. Yang
Principles of Programming Languages (POPL) 2024
Paper
2023
- A Model of Stochastic Memoization and Name Generation in Probabilistic Programming: Categorical Semantics via Monads on Presheaf Categories
Y. Kaddar and S. Staton
Mathematical Foundations of Programming Semantics (MFPS) 2023
Paper Slides
- Affine Monads and Lazy Structures for Bayesian Programming
S. Dash, Y. Kaddar, H. Paquet and S. Staton
Principles of Programming Languages (POPL) 2023
Paper Website Code
Conference Presentations
- HOPE 2022: Higher-order programming with probabilistic effects: A model of stochastic memoization and name generation
Y. Kaddar and S. Staton
Link Video
- Applied Category Theory (ACT) 2022 (Aug. 2022): Statistical Programming with Categorical Measure Theory and LazyPPL (demo)
S. Dash, Y. Kaddar, H. Paquet and S. Staton
Slides Video
Conference Panels
- Panelist at PADL 2024 (POPL workshop)
Topic: Declarative Languages for Safe AI
Chair: Ekaterina Komendantskaya (Heriot-Watt University & University of Southampton)
Website Video
Projects
RightPick (2023-present)
Cofounder & CTO
Website App Store
Open Source Projects
LazyPPL (2021-present)
Co-maintainer of this Haskell-based probabilistic programming library for Bayesian nonparametrics (Sam Staton’s team)
Website GitHub
Awards
2023
- 1st Prize & Impact Prize, Bio x ML Hackathon 2023, Hugging Face, OpenBioML, Lux Capital & LatchBio
Project: SVM - Generate unified protein embeddings across multiple protein modalities
Code 🤗 Model
2022
- 1st Prize, OxfordHack 2022, LENS Main Challenge
Project: CheckyBoty, an idealised probabilistic model for anonymous spoofing detection
Report
Education
2020-present
DPhil in Computer Science, University of Oxford, UK
Scholarship: Oxford-DeepMind
Teaching and tutoring: Principles of Programming Languages, Bayesian Statistical Probabilistic Programming
2019-2020
Visiting researcher (Predoctoral research year), University of Cambridge, UK
Department of Computer Science and Technology
Supervisor: Marcelo Fiore
2018-2019
Parisian Master of Research in Computer Science (MPRI): 2nd year (M2R; Master 2 Research)
École Normale Supérieure Paris-Saclay, Paris
Honours: Summa cum laude
2017-2018
Parisian Master of Research in Computer Science (MPRI): 1st year (M1; Master 1)
École Normale Supérieure Paris-Saclay, Cachan / Paris
Overall rank: 1st (out of 27)
Courses: Category theory & λ-calculus, Advanced Complexity, Statistical Learning, Computer Vision, Robot Motion Planning, Introduction to Research, English
Extra courses: Proof assistants (LMFI Master, Paris-Diderot), Modules and finite groups (Math Master at École Polytechnique)
Cogmaster (Cognitive Science)
École Normale Supérieure Paris
Courses: Computational Neuroscience, Neuromodeling, Neurorobotics, Machine Learning applied to Neuroscience
2016-2017
Bachelor of Computer Science
École Normale Supérieure Paris-Saclay, Cachan
Overall rank: 1st (out of 28)
Courses: λ-calculus & Logic, Logic Projects (DPLL algorithm & Coq project), Discrete Mathematics, Programming & Semantics, Advanced Programming, Compiler Project, Formal Languages, Computability & Complexity, Algorithmics, Advanced Algorithms, Abstract Algebra, English, Computer Architecture
2013-2016
Classes Préparatoires aux Grandes Écoles, MPSI-MP*
Lycée Henri Poincaré, Nancy
Preparatory courses for the nationwide competitive entrance exams in mathematics, physics, and computer science
2012-2013
Baccalauréat S, Lycée Henri Poincaré, Nancy
Major in mathematics, with highest honours
Research Internships
- Cohere For AI Scholars Programme (Jan. 2024 – Aug. 2024)
Research Topic: LLM Hallucinations
Mentor: Beyza Ermiş
- PhD Internship at Mila (June 2023 – Jan. 2024), Quebec Artificial Intelligence Institute, Université de Montréal
Supervisor: Yoshua Bengio
Research Topic: GFlowNets for reasoning & AI safety
- Pre-doctoral Internship (Oct. 2019 – Aug. 2020), University of Cambridge, Department of Computer Science and Technology
Title: Ideal Distributors
Supervisor: Marcelo Fiore
Cambridge Internship Report
- M2 Internship (Apr. – Aug. 2019), Macquarie University
Title: Tricocycloids, Effect Monoids and Effectuses
Supervisor: Richard Garner
Sydney Internship Report
- M1 Internship (June – Aug. 2018), University of Oxford
Title: Event Structures as Presheaves
Supervisor: Ohad Kammar
Oxford Internship Report
- L3 Internship (June – Aug. 2017), University of Nottingham
Title: Type Theory Forms a Weak Omega-Groupoid
Supervisors: Thorsten Altenkirch, Paolo Capriotti, Nicolai Kraus
Nottingham Internship Report
Teaching Experience
University of Oxford, Department of Philosophy, 2023
- Topics in Minds and Machines: Perception, Cognition, and ChatGPT
Philosophy Seminar, University of Oxford
Role: Lecturer on Deep Learning and Large Language Models
University of Oxford, Department of Computer Science, 2022
- Bayesian Statistical Probabilistic Programming, University of Oxford
Role: Class tutor and marker
- Imperative Programming in Scala III, University of Oxford
Role: Demonstrator
University of Oxford, Department of Computer Science, 2020-2021
- Principles of Programming Languages, University of Oxford
Role: Class tutor and marker, personal tutor (Exeter College)
Tutorial Teaching at ENS Paris-Saclay (2017-2018)
- Subjects: Computability and Complexity Theory, Algorithms, Automata Theory, Formal Language Theory
Role: Personal tutor
Online Research Programme at Immerse Education (Jul. 2020, Dec. 2020, Jul. 2021)
- Topics: Bayesian probabilistic programming, Algorithms, Supervised learning
Role: Personal tutor and supervisor
Additional Training & Professional Development
Online Courses and Reading Groups
- AI Alignment Course (12 weeks), BlueDot Impact
Comprehensive technical AI safety curriculum focused on reducing risks from advanced AI systems
- AI Governance Reading Group, University of Oxford
Summer Schools
- Summer School in Neurosymbolic Programming (June 2024), Salem, Massachusetts, USA
- MIT Probabilistic Programming Mini School (August 2023)
- Caltech Neurosymbolic Programming Summer School (July 2022)
- Oregon Programming Languages Summer School (June-July 2022)
Languages
- French: First language
- English: Fluent (Cambridge Certificate in Advanced English, Score: 201/210, CEFR Level: C2)
- German: Intermediate (CEFR Level: B2)