Applied Machine Learning: Deep Multi-Task Learning to Extract Knowledge from Text

Date:

Many online systems rely on knowledge bases containing nuggets of information (facts or relationships between entities) about the world, which are generated through analysis of static text corpora, such as a dump of Wikipedia pages. However, over time such a knowledge base becomes outdated as the state of the world changes, and hence we need technologies to update it with fresh information. Deep multi-task models such as BERT have shown that it is possible to encode generalisable knowledge about language use very effectively within deep neural network models by co-training the model on a range of related (sub-)tasks. Hence, we are looking to investigate whether we can create a general multi-task model that can accurately and reliably extract facts and relationships between entities from text streams in real time.
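
To make the idea concrete, below is a minimal sketch of what such a multi-task extractor might look like: a shared BERT encoder (via the Hugging Face transformers library) with two illustrative task heads, one for entity tagging and one for relation classification. The class name, head names, label counts, and example sentence are assumptions for illustration, not the project's actual design.

```python
# Minimal multi-task sketch (illustrative, not the project's implementation):
# a shared BERT encoder feeding two task-specific heads.
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class MultiTaskExtractor(nn.Module):
    def __init__(self, model_name="bert-base-uncased",
                 num_entity_tags=9, num_relation_labels=5):  # assumed label counts
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)  # shared encoder
        hidden = self.encoder.config.hidden_size
        # Token-level head: BIO-style entity tag for each subword token.
        self.entity_head = nn.Linear(hidden, num_entity_tags)
        # Sequence-level head: relation label for the sentence.
        self.relation_head = nn.Linear(hidden, num_relation_labels)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        token_states = out.last_hidden_state      # (batch, seq_len, hidden)
        cls_state = token_states[:, 0]            # [CLS] summary vector
        return {
            "entity_logits": self.entity_head(token_states),
            "relation_logits": self.relation_head(cls_state),
        }


tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = MultiTaskExtractor()
batch = tokenizer(["Ada Lovelace worked with Charles Babbage."],
                  return_tensors="pt")
outputs = model(batch["input_ids"], batch["attention_mask"])
# During co-training, the losses from both heads would be summed so that the
# shared encoder learns representations useful for both entity and relation
# extraction.
```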

Video
Slides