Contacts

Course Abstract

The course will begin with a review of the basics of Deep Learning, NLP and PyTorch. The Transformer architecture and its self-attention mechanism will then be introduced and coded. We will build a small but complete autoregressive generative language model in the style of GPT-2, which will allow us to understand several relevant aspects of more sophisticated pre-trained LLMs such as GPT-4, Mistral or Llama. Afterwards, we will experiment with open-source pre-trained LLMs and, if possible, fine-tune one of them. In the last part of the course, we will explore some emergent abilities of LLMs that are interesting also from a physical point of view, and touch upon multi-agent systems and their collective behaviour.
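
As a taste of the hands-on part, below is a minimal sketch of a single causal self-attention head in PyTorch, the core building block of GPT-style models of the kind built in the course. The class name, dimensions and toy usage are illustrative assumptions, not the course's actual material.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttentionHead(nn.Module):
    """One head of causal (masked) self-attention, as used in GPT-style models.
    Names and sizes are illustrative, not taken from the course material."""

    def __init__(self, embed_dim: int, head_dim: int, block_size: int):
        super().__init__()
        self.query = nn.Linear(embed_dim, head_dim, bias=False)
        self.key = nn.Linear(embed_dim, head_dim, bias=False)
        self.value = nn.Linear(embed_dim, head_dim, bias=False)
        # Lower-triangular mask: each token may attend only to itself
        # and to earlier positions (autoregressive constraint).
        self.register_buffer("mask", torch.tril(torch.ones(block_size, block_size)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape  # batch, sequence length, embedding dimension
        q, k, v = self.query(x), self.key(x), self.value(x)
        # Scaled dot-product attention scores.
        att = (q @ k.transpose(-2, -1)) / math.sqrt(k.size(-1))
        att = att.masked_fill(self.mask[:T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        return att @ v  # weighted sum of the value vectors

# Quick check on random data.
head = SelfAttentionHead(embed_dim=32, head_dim=16, block_size=64)
out = head(torch.randn(2, 10, 32))
print(out.shape)  # torch.Size([2, 10, 16])
```

In the course, several such heads are combined into multi-head attention and stacked with feed-forward layers to form a complete Transformer block.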

Technical Prerequisites

Lecture Material

Further Readings