Ruprecht-Karls-Universität Heidelberg
Institut für Computerlinguistik


Efficient Methods in NLP

Module Description

Course               Module Abbreviation   Credit Points
BA-2010[100%|75%]    CS-CL                 6 LP
BA-2010[50%]         BS-CL                 6 LP
BA-2010[25%]         BS-AC, BS-FL          4 LP
BA-2010              AS-CL                 8 LP
Master               SS-CL-TAC             8 LP
Lecturer Jakob Schuster
Module Type Hauptseminar / Proseminar
Language English
First Session 16.04.2025
Time and Place Wednesdays, 13:15 - 14:45, INF 326, SR 27
Commitment Period tbd.

Participants

All advanced Bachelor students and all Master students. Students from the MSc Data and Computer Science or the MSc Scientific Computing with Anwendungsgebiet Computational Linguistics are welcome after obtaining permission from the lecturer.

Prerequisites for Participation

  • Completion of Introduction to Computational Linguistics, Introduction to Programming
  • Mathematical Foundations of Computational Linguistics or Programming II strongly recommended
  • Solid understanding of machine learning (e.g. Statistical Methods for NLP, Introduction to Neural Networks, or similar)

Assessment

  • Active Participation
  • Presentation
  • A second presentation, implementation project, or exam (depending on the number of participants)

Contents

Neural language models now comprise many billions of parameters, are trained on datasets that are terabytes in size, and have achieved remarkable success across a wide range of tasks. However, this constant upscaling increases computational costs and puts these models out of reach for anyone without the required hardware.

In this seminar, we will discuss different methods for increasing efficiency through model architecture, data usage, and application.

This includes, but is not limited to, Mixture-of-Experts systems, LoRA, quantization, active learning, curriculum learning, and speculative decoding.
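To give a concrete flavour of one of these topics, below is a minimal sketch of a LoRA-style (Low-Rank Adaptation) layer, assuming PyTorch. The class name LoRALinear and the hyperparameters r and alpha are illustrative choices, not part of the course material: the idea is to freeze a pretrained weight matrix and train only a small low-rank correction.

    # Illustrative LoRA sketch (assumes PyTorch); names and defaults are hypothetical.
    import torch
    import torch.nn as nn

    class LoRALinear(nn.Module):
        """Wraps a frozen linear layer with a trainable low-rank update
        W x + (alpha / r) * B A x, where A is (r x in) and B is (out x r)."""
        def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
            super().__init__()
            self.base = base
            self.base.weight.requires_grad_(False)   # freeze pretrained weights
            if self.base.bias is not None:
                self.base.bias.requires_grad_(False)
            # Only r * (in_features + out_features) new parameters are trained.
            self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
            self.B = nn.Parameter(torch.zeros(base.out_features, r))  # zero init: update starts at zero
            self.scale = alpha / r

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # Frozen path plus scaled low-rank correction.
            return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

    # Usage: wrap a projection in a pretrained model, then fine-tune only A and B.
    layer = LoRALinear(nn.Linear(768, 768))
    out = layer(torch.randn(2, 768))

Because the B factor is zero-initialized, the wrapped layer initially behaves exactly like the frozen original, and fine-tuning updates only the two small factors rather than the full weight matrix.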

Literature

Will be announced at the beginning of the course.
