
CLT seminar: Mikael Kågebäck - Deep Learning for NLP


Who is the female counterpart of Justin Bieber?

Recent advances in the field of neural networks have enabled the training of deep networks, i.e. networks that are able to learn multiple levels of representation. This allows the network to learn highly non-linear mappings, making it less dependent on clever feature engineering to achieve good performance. In fact, these networks automatically learn feature representations, directly from the data, that may be extracted and used in other applications. When trained on text, the learned word representations have been shown to express multiple dimensions of similarity, encoded as a simple superposition of semantic and syntactic basis vectors.

I will describe how these word representations may be derived using deep learning, and then use them to answer the question posed at the beginning of this text.
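The analogy task hinted at above is typically solved with vector arithmetic over the learned word representations: the answer to "a is to b as c is to ?" is the vocabulary word whose vector lies closest to v(a) - v(b) + v(c) under cosine similarity. A minimal sketch, using hypothetical hand-made toy vectors in place of embeddings learned from a real corpus:

```python
# Hypothetical toy embeddings; real vectors would come from training a
# deep network (e.g. word2vec-style models) on a large text corpus.
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.8],
    "man":   [0.1, 0.9, 0.1],
    "woman": [0.1, 0.1, 0.9],
    "apple": [0.0, 0.2, 0.0],
}

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def norm(u):
    return dot(u, u) ** 0.5

def analogy(a, b, c, vocab):
    """Return the word whose vector is closest (cosine similarity)
    to v(a) - v(b) + v(c), excluding the query words themselves."""
    target = [va - vb + vc for va, vb, vc in zip(vocab[a], vocab[b], vocab[c])]
    best, best_sim = None, -2.0
    for word, vec in vocab.items():
        if word in (a, b, c):
            continue
        sim = dot(target, vec) / (norm(target) * norm(vec))
        if sim > best_sim:
            best, best_sim = word, sim
    return best

print(analogy("king", "man", "woman", vectors))  # -> queen
```

With real embeddings, the same arithmetic applied to "Justin Bieber" minus a male-gender direction plus a female-gender direction yields the seminar's teaser question.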

Date: 2013-10-31 10:30 - 11:30

Location: EDIT Room 3364, E-building, Chalmers (Johanneberg)



Page updated: 2013-10-24 22:12
