Multiple NLP tasks
-
In recent years, many systems (e.g. search engines and dialogue systems) have required machines to read and understand human text in order to perform tasks within an application. Machine Reading Comprehension (MRC) challenges the Natural Language Processing (NLP) community to teach machines to understand the meaning of human text in order to answer questions about it.
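To make the task's input/output shape concrete, here is a toy extractive reading-comprehension baseline: it scores each passage sentence by lexical overlap with the question and returns the best one. This is only an illustrative sketch; a real MRC model predicts an answer span with a neural reader, and the passage and question below are made up.

```python
def answer(passage: str, question: str) -> str:
    """Return the passage sentence sharing the most tokens with the question."""
    q_tokens = set(question.lower().split())
    sentences = [s.strip() for s in passage.split(".") if s.strip()]
    # Pick the sentence with the largest lexical overlap with the question.
    return max(sentences, key=lambda s: len(q_tokens & set(s.lower().split())))

passage = ("Machine reading comprehension asks a model to answer questions "
           "about a text. Search engines use it to surface direct answers.")
print(answer(passage, "What do search engines use reading comprehension for?"))
# prints "Search engines use it to surface direct answers"
```

Lexical overlap is a weak heuristic (it ignores synonyms and word order), which is exactly the gap neural MRC models are meant to close.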
9p · viberkshire · 09-08-2023
-
In recent times, we have witnessed dramatic progress and the emergence of advanced deep neural architectures in the natural language processing (NLP) domain. Advanced sequence-to-sequence (seq2seq) and transformer-based architectures have demonstrated remarkable improvements on multiple NLP tasks, including text categorization.
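The core operation behind the transformer architectures mentioned above is scaled dot-product attention. The following minimal sketch implements it in pure Python on hand-written token vectors; a real model would use learned projection matrices and batched tensor math.

```python
import math

def attention(queries, keys, values):
    """Scaled dot-product attention over lists of plain-Python vectors."""
    d = len(keys[0])
    outputs = []
    for q in queries:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        # Softmax turns the scores into attention weights.
        exps = [math.exp(s) for s in scores]
        weights = [e / sum(exps) for e in exps]
        # The output is a weighted sum of the value vectors.
        outputs.append([sum(w * v[i] for w, v in zip(weights, values))
                        for i in range(len(values[0]))])
    return outputs

x = [[1.0, 0.0], [0.0, 1.0]]   # two toy token vectors
out = attention(x, x, x)       # self-attention: queries = keys = values
print(out)
```

Each output row mixes the value vectors with weights that sum to 1, so every token's representation is informed by every other token in the sequence.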
10p · viannee · 02-08-2023
-
Although most NLP researchers agree that a level of "logical form" is a necessary step toward representing the meaning of a sentence, few agree on the content and form of this level of representation. Even fewer have considered the complex action sentences that are often expressed in task-oriented dialogues. Most existing logical-form representations have been developed for single-clause sentences that express assertions about properties or actual actions, and in which time is not a main concern.
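As one illustration of what such a representation might need to carry, here is a sketch of a logical form for an action sentence with an explicit temporal constraint, the kind of information that single-clause assertion formats leave out. The class and field names are illustrative assumptions, not any standard formalism.

```python
from dataclasses import dataclass, field

@dataclass
class Action:
    """A toy logical form for an action, with an explicit temporal slot."""
    predicate: str
    agent: str
    theme: str
    temporal: dict = field(default_factory=dict)

# "Load the cargo onto the ship before noon."
lf = Action(predicate="load", agent="crew", theme="cargo",
            temporal={"relation": "before", "reference": "noon"})
print(lf.predicate, lf.temporal["relation"], lf.temporal["reference"])
# prints "load before noon"
```

Keeping time as a first-class slot, rather than folding it into the predicate, is what lets a dialogue system reason about ordering and deadlines across clauses.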
2p · bunmoc_1 · 20-04-2013
-
Unsupervised word representations are very useful in NLP tasks, both as inputs to learning algorithms and as extra word features in NLP systems. However, most such models are built with only local context and a single representation per word. This is problematic because words are often polysemous, and global context can also provide useful information for learning word meanings.
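The polysemy problem can be sketched with a toy example: a single context distribution for "bank" mixes its financial and river senses, while clustering occurrences by context yields one prototype per sense. The tiny corpus and the two seed senses below are made-up assumptions for illustration only.

```python
from collections import Counter

occurrences = [
    "deposit money at the bank branch",
    "the bank approved my loan",
    "we fished from the river bank",
    "grass grew on the muddy bank",
]

def context(sentence):
    """Bag of context words around the target word 'bank'."""
    return Counter(w for w in sentence.split() if w != "bank")

# Single-prototype view: one distribution mixes both senses of "bank".
mixed = sum((context(s) for s in occurrences), Counter())

# Multi-prototype view: assign each occurrence to the nearest seed sense
# by context overlap, keeping one distribution per sense cluster.
seeds = {"finance": Counter(["money", "loan"]),
         "river": Counter(["river", "muddy"])}
clusters = {sense: Counter() for sense in seeds}
for s in occurrences:
    c = context(s)
    best = max(seeds, key=lambda sense: sum((c & seeds[sense]).values()))
    clusters[best] += c

print("river" in mixed, "river" in clusters["finance"])
# prints "True False"
```

Multi-prototype embedding models automate this idea by clustering each word's occurrence contexts and learning one vector per cluster instead of one per word type.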
10p · nghetay_1 · 07-04-2013