
Enrol
10.12.2018 at 09:00 - 17.12.2019 at 11:00

Timetable

Here is the course’s teaching schedule. Check the description for possible other schedules.

Date            Time
Wed 16.1.2019   14:15 - 15:45
Fri 18.1.2019   14:15 - 15:45
Wed 23.1.2019   14:15 - 15:45
Fri 25.1.2019   14:15 - 15:45
Wed 30.1.2019   14:15 - 15:45
Fri 1.2.2019    14:15 - 15:45
Wed 6.2.2019    14:15 - 15:45
Fri 8.2.2019    14:15 - 15:45
Wed 13.2.2019   14:15 - 15:45
Fri 15.2.2019   14:15 - 15:45
Wed 20.2.2019   14:15 - 15:45
Fri 22.2.2019   14:15 - 15:45
Wed 27.2.2019   14:15 - 15:45
Fri 1.3.2019    14:15 - 15:45

Description

This is an introduction to deep learning for students in language technology. The course is aimed at master's level students who already have some experience in machine learning.

It is recommended to take this course after LDA-T3105, Models and Algorithms in NLP-applications. However, all students with a reasonable background in linear classifiers, such as the perceptron and logistic regression, and in the NumPy Python library can probably follow the course.
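
As a rough illustration of the kind of background assumed, and not part of the course material, the sketch below trains a binary logistic regression classifier with plain NumPy gradient descent; the synthetic data and all names are invented for this example.

    import numpy as np

    # Toy illustration of the assumed background: binary logistic regression
    # trained with batch gradient descent in plain NumPy on synthetic data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))              # 200 examples, 2 features
    y = (X[:, 0] + X[:, 1] > 0).astype(float)  # linearly separable toy labels

    w, b, lr = np.zeros(2), 0.0, 0.1
    for epoch in range(100):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid of the linear scores
        w -= lr * X.T @ (p - y) / len(y)        # gradient of the mean log loss
        b -= lr * np.mean(p - y)

    print("training accuracy:", np.mean((p > 0.5) == y))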

  • study track: language technology
  • modules: Studies in Language Technology (LDA-T3100), Essentials in Language Technology (LDA-TA500), Comprehensive specialization in Language Technology (LDA-TB500)

This is an optional course.

The course is available to students from other study tracks and degree programmes.

After completing this course, students will:

  • understand the structure of neural networks
  • know the difference between linear and non-linear models
  • understand how neural networks are applied and trained
  • know the common neural models applied in language technology
  • be able to implement their own neural models using deep learning libraries.

This course is suitable for 2nd year master's students.

The course will cover a selection from the following topics:

  • training linear classifiers using gradient methods
  • multi-layer feedforward networks
  • embedding layers and the continuous bag-of-words model
  • recurrent networks, including long short-term memory networks and gated recurrent units
  • convolutional networks
  • the encoder-decoder architecture
  • mini-batch training and GPUs
  • the PyTorch Python library for deep learning (a minimal sketch follows this list)
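
To give a flavour of the last few topics, here is a minimal, illustrative sketch only (not official course material): a tiny feedforward network trained on random mini-batches in PyTorch, moved to a GPU when one is available. All sizes and the random data are invented for the example; real assignments would use actual language data.

    import torch
    import torch.nn as nn

    # Illustrative only: a tiny feedforward classifier trained on random
    # mini-batches. Dimensions and data are made up for this sketch.
    model = nn.Sequential(
        nn.Linear(100, 32),   # input features -> hidden layer
        nn.ReLU(),            # non-linearity
        nn.Linear(32, 2),     # hidden layer -> two output classes
    )
    loss_fn = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model.to(device)

    for step in range(100):
        # One random mini-batch of 16 examples with 100 features each.
        x = torch.randn(16, 100, device=device)
        y = torch.randint(0, 2, (16,), device=device)

        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()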

This is a lecture course with one weekly lecture and optional exercise sessions. There will be a final assignment but no exam.

The course is completed through weekly exercises and a final assignment.