Numerical Optimization

Master 1, Université Grenoble Alpes, UFR IM2AG, January 2026 - April 2026

Overview

This is the webpage of the course Numerical Optimization, Master 1 of Applied Mathematics, at the Université Grenoble Alpes, France.
The course aims to equip students with a basic knowledge of numerical optimization: mathematical background, important concepts, the main algorithms, their implementation, and their theoretical guarantees.

Course organisation

This course takes place over 11 weeks. Each week, there is one lecture, one tutorial (TD), and one practical session (TP), each lasting 1 hour 30 minutes.
Unless specified otherwise, the places and time slots for these sessions are:
  • TP: Monday at 9:45 AM, room F202-IM2AG, given by Thomas Guillaume
  • Lecture and TD: Thursday at 9:45 AM, room F321-IM2AG, given by myself

Course evaluation

The final evaluation will be based on the practical sessions and a final exam. Only a few practical sessions will be graded, and you will be informed of which ones in advance.
Note for graduate-school students: your final evaluation will additionally include a mini-project.

Course materials

It is worth noting that the lectures and TDs are designed to complement each other closely. While the TPs are designed somewhat independently, they also follow the lectures closely.
Materials for TPs can be found at: https://github.com/tGuilmeau/numericalOptimization-M1AM.
Materials for lectures and TDs can be found below:
  • Lecture 1: Introduction and refresher course (Lecture, Exercises)
  • Lecture 2: Convex functions and sets
  • Lecture 3: Gradient descent and theoretical properties
  • Lecture 4: Lower bounds for first-order methods and Nesterov's optimal algorithm
  • Lecture 5: Stochastic gradient descent and theoretical properties
  • Lecture 6: Adaptive methods - AdaGrad and variants
  • Lecture 7: Second-order methods - Newton's algorithm
  • Lecture 8:
  • Lecture 9:
  • Lecture 10:
  • Lecture 11:

Course references

While I do not follow a specific textbook when preparing the lectures, you are welcome to consult the following references to follow the course more easily.

  • Numerical Optimization - Jorge Nocedal & Stephen J. Wright
  • Convex Optimization - Stephen Boyd & Lieven Vandenberghe
  • Introduction to Optimization - Boris T. Polyak
  • Introductory Lectures on Convex Optimization: A Basic Course - Yurii Nesterov