Andreas Opedal

andreas.opedal@inf.ethz.ch

I am a PhD student at ETH Zurich, advised by Mrinmaya Sachan, Ryan Cotterell, and Bernhard Schölkopf, and supported by the Max Planck ETH Center for Learning Systems. I work on natural language processing, machine learning, and computational linguistics.

I am currently interested in problems related to deductive reasoning in language models, particularly as it relates to efficiency. Other topics of interest include linguistic and cognitive evaluation of LLMs, understanding and modeling human reading, test-time adaptation and scaling, context-free parsing, and constrained generation with LLMs.

Prior to my doctoral studies, I obtained an MSc in Data Science from ETH Zurich and a BSc in Industrial Engineering from Chalmers University of Technology. I have also spent time at the University of California, Berkeley.

I am always open to discussing ideas and techniques in my areas of interest, so feel free to reach out!

selected publications

  1. NeurIPS
    Are Language Models Efficient Reasoners? A Perspective from Logic Programming
    Andreas Opedal, Yanick Zengaffinen, Haruki Shirakami, Clemente Pasti, Mrinmaya Sachan, Abulhair Saparov, Ryan Cotterell, and Bernhard Schölkopf
    In The Thirty-ninth Annual Conference on Neural Information Processing Systems, 2025
  2. ICLR
    MathGAP: Out-of-Distribution Evaluation on Problems with Arbitrarily Complex Proofs
    Andreas Opedal*, Haruki Shirakami*, Bernhard Schölkopf, Abulhair Saparov, and Mrinmaya Sachan
    In The Thirteenth International Conference on Learning Representations, 2025
  3. EMNLP
    On the Role of Context in Reading Time Prediction
    Andreas Opedal, Eleanor Chodroff, Ryan Cotterell, and Ethan Wilcox
    In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, Nov 2024
  4. ICML
    Do Language Models Exhibit the Same Cognitive Biases in Problem Solving as Human Learners?
    Andreas Opedal*, Alessandro Stolfo*, Haruki Shirakami, Ying Jiao, Ryan Cotterell, Bernhard Schölkopf, Abulhair Saparov, and Mrinmaya Sachan
    In Forty-first International Conference on Machine Learning, Jul 2024
  5. ACL
    Efficient Semiring-Weighted Earley Parsing
    Andreas Opedal, Ran Zmigrod, Tim Vieira, Ryan Cotterell, and Jason Eisner
    In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Jul 2023