# TEACHING

## CONVEX OPTIMIZATION

Summer Semester 2016 (Sommersemester 2016)

### GENERAL INFORMATION

Convex optimization problems arise quite naturally in many application areas such as signal processing, machine learning, image processing, communications and networking, and finance.

The course gives an introduction to convex analysis, the theory of convex optimization (in particular duality theory), algorithms for solving convex optimization problems such as interior-point methods, the basic methods of general unconstrained nonlinear minimization, and recent first-order methods in non-smooth convex optimization. We will also cover related non-convex problems such as d.c. (difference of convex) programming, biconvex optimization problems, and hard combinatorial problems together with their relaxations into convex problems. While the emphasis is on mathematical and algorithmic foundations, several example applications together with their modeling as optimization problems will be discussed.

The course requires a **good background in linear algebra and multivariate calculus**,
but no prior knowledge of optimization.
The course can be seen as complementary to the core lecture "Optimization",
which also takes place during the summer semester.

Students who intend to do their master thesis in machine learning are encouraged to take this course.

**Type: Advanced course (Vertiefungsvorlesung), 9 credit points**

### LECTURE MATERIAL

The course follows in the first part the book of Boyd and Vandenberghe.

Lecture notes (will be updated; current coverage up to convex sets): Lecture notes

The practical exercises will be in Matlab and will make use of CVX.
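CVX is a Matlab modeling layer for stating convex programs declaratively. As a purely illustrative sketch (not course material, and independent of CVX), the prototypical convex problem such tools handle is least-squares; in NumPy it can be solved directly, and convexity guarantees that the vanishing gradient certifies optimality:

```python
import numpy as np

# Illustrative sketch only: minimize ||Ax - b||_2^2 over x,
# the prototypical convex (least-squares) problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

# Minimizer via np.linalg.lstsq (solves the normal equations A^T A x = A^T b).
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)

# For a convex differentiable objective, the first-order condition is
# sufficient: the gradient A^T (A x - b) vanishes at the minimizer.
grad = A.T @ (A @ x_star - b)
print(np.linalg.norm(grad))  # numerically close to 0
```

The point of CVX is that constraints (e.g. `x >= 0` or norm bounds) can be added to such problems with one extra line, with the solver chosen automatically.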

### SLIDES AND EXERCISES

### TIME AND LOCATION

Lecture: Tuesday, 14-16, E2 4, SR6 - Room 217, Thursday, 10-12, E1 3, HS 003

Tutorials: Friday, 16-18, E2 4, SR6 - Room 217

### EXAMS AND GRADING

End-term: 10.8.2016, 13-15, HS 001; Re-exam: 11.10.2016, 14-16, HS 003

Grading:

- 50% of the points in the exercises are needed to take part in the exams.
- An exam is passed if you get at least 50% of the points.
- The grading is based on the better result of the end-term and re-exam.
- Exams may be oral or written, depending on the number of participants.

### LECTURER

Office Hours: Thursday, 16-18

Organization: Quynh Nguyen Ngoc

### LITERATURE AND OTHER RESOURCES

- D. P. Bertsekas: Convex Optimization Theory (2009). Link to the free chapter on optimization algorithms.
- J.-B. Hiriart-Urruty, C. Lemaréchal: Fundamentals of Convex Analysis (2013).
- S. Boyd and L. Vandenberghe: Convex Optimization, Cambridge University Press (2004). The book is freely available.
- D. P. Bertsekas: Nonlinear Programming, Athena Scientific (1999).

Other resources:

- Matlab is available on cip[101-114] and cip[220-238].studcs.uni-sb.de and on gpool[01-27].studcs.uni-sb.de. The path is /usr/local/matlab/bin. On the Sun workstations, select Applications/studcsApplications/Matlab in the menu. Access from outside should be possible via ssh: `ssh -X username@computername.studcs.uni-sb.de`
- Matlab tutorial by David F. Griffiths


### NEWS

Re-exam will be on October 11 at 14.00 in HS 003.

Results of the exam can be found here

Exercise 12 is the last exercise of this lecture.

The Matlab file to minimize the softmax/cross-entropy loss via Newton descent is in the material for Exercise 8.
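For a flavor of what that exercise file does, here is a minimal NumPy sketch of Newton's method on the (convex) L2-regularized binary cross-entropy loss. This is an illustrative assumption on my part, not the course's Matlab code, which treats the multi-class softmax case; the function name, data, and regularization weight `lam` are hypothetical:

```python
import numpy as np

def newton_logistic(X, y, lam=0.1, steps=20):
    """Minimize the L2-regularized binary cross-entropy (logistic) loss
    with Newton's method.  Illustrative sketch only; the course material
    handles the multi-class softmax loss instead."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        s = 1.0 / (1.0 + np.exp(-X @ w))            # sigmoid predictions
        grad = X.T @ (s - y) + lam * w               # gradient of the loss
        # Hessian = X^T diag(s(1-s)) X + lam*I, positive definite for lam > 0,
        # so the Newton direction is always a descent direction.
        H = X.T @ (X * (s * (1 - s))[:, None]) + lam * np.eye(d)
        w -= np.linalg.solve(H, grad)                # Newton step
    return w
```

Because the regularized loss is strongly convex, Newton's method converges quadratically near the optimum, typically reaching machine precision in a handful of iterations.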

List of students admitted to the exam: pdf