Most previous work on trainable language generation has focused on two paradigms: (a) using a statistical model to rank a set of generated utterances, or (b) using statistics to inform the generation decision process. Both approaches rely on the existence of a handcrafted generator, which limits their scalability to new domains. This paper presents BAGEL, a statistical language generator which uses dynamic Bayesian networks to learn from semantically-aligned data produced by 42 untrained annotators. ...
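To make paradigm (a) concrete, here is a minimal sketch of reranking candidate utterances with a statistical model. The bigram counts, candidate strings, and add-one smoothing are toy assumptions for illustration only, not BAGEL's actual model or training data.

```python
import math
from collections import Counter

# Toy corpus standing in for real training data.
corpus = [
    "the restaurant serves cheap food".split(),
    "the restaurant serves italian food".split(),
]

unigrams = Counter(w for sent in corpus for w in sent)
bigrams = Counter(pair for sent in corpus for pair in zip(sent, sent[1:]))
V = len(unigrams)

def log_prob(sentence):
    """Add-one smoothed bigram log-probability, used as the ranking score."""
    return sum(
        math.log((bigrams[(prev, word)] + 1) / (unigrams[prev] + V))
        for prev, word in zip(sentence, sentence[1:])
    )

# Hypothetical candidates emitted by some underlying generator.
candidates = [
    "the restaurant serves cheap italian food".split(),
    "food cheap serves the restaurant".split(),
]
print(" ".join(max(candidates, key=log_prob)))  # the fluent ordering wins
```

Paradigm (b), by contrast, would consult such statistics while building the utterance rather than after generating a full candidate set.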
Sitting at the intersection of statistics and machine learning, Dynamic Bayesian Networks have been applied with much success in many domains, such as speech recognition, vision, and computational biology. While Natural Language Processing increasingly relies on statistical methods, we believe the field has yet to use Graphical Models to their full potential. In this paper, we report on experiments in learning edit distance costs using Dynamic Bayesian Networks and present results on a pronunciation classification task. ...
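As a rough illustration of the quantity being learned, the sketch below implements an edit distance whose operation costs are parameters rather than fixed constants. The vowel-based cost function is a hand-set assumption standing in for costs that the paper's DBNs would estimate from pronunciation data.

```python
def weighted_edit_distance(source, target, sub_cost, ins_cost, del_cost):
    """Levenshtein distance with parameterized operation costs."""
    m, n = len(source), len(target)
    # d[i][j] = cheapest cost of turning source[:i] into target[:j]
    d = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = d[i - 1][0] + del_cost(source[i - 1])
    for j in range(1, n + 1):
        d[0][j] = d[0][j - 1] + ins_cost(target[j - 1])
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            d[i][j] = min(
                d[i - 1][j] + del_cost(source[i - 1]),                     # delete
                d[i][j - 1] + ins_cost(target[j - 1]),                     # insert
                d[i - 1][j - 1] + sub_cost(source[i - 1], target[j - 1]),  # substitute/match
            )
    return d[m][n]

# Toy assumption: vowels substitute for each other cheaply, mimicking the
# kind of regularity a learned cost model might capture.
VOWELS = set("aeiou")
def sub_cost(a, b):
    if a == b:
        return 0.0
    return 0.3 if a in VOWELS and b in VOWELS else 1.0

print(weighted_edit_distance("tomato", "tomate", sub_cost,
                             lambda c: 1.0, lambda c: 1.0))  # 0.3
```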
A number of Russian verbs lack first-person singular (1sg) nonpast forms. These paradigmatic gaps are puzzling because they seemingly contradict the highly productive nature of inflectional systems. We model the persistence and spread of Russian gaps via a multi-agent model with Bayesian learning. We ran three simulations: no grammar learning, learning with arbitrary analogical pressure, and morphophonologically conditioned learning. We compare the results to the attested historical development of the gaps.
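The following is a minimal, hypothetical sketch of this simulation style: agents hold Beta-Bernoulli beliefs about a verb form's usability and learn from samples of the previous generation's productions. The population size, prior, and avoidance threshold are illustrative assumptions, not the paper's parameters.

```python
import random

GENERATIONS = 20
AGENTS = 50
USES_PER_AGENT = 10
THRESHOLD = 0.5  # agents avoid the 1sg form if estimated usability falls below this

def simulate(initial_use_rate):
    """Track the community-wide usage rate of a 1sg form across generations."""
    use_rate = initial_use_rate
    for _ in range(GENERATIONS):
        productions = 0
        for _ in range(AGENTS):
            # Bayesian learning: each agent observes elder speech and forms a
            # Beta(1, 1)-prior posterior over the form's usability.
            observed = sum(random.random() < use_rate
                           for _ in range(USES_PER_AGENT))
            posterior_mean = (observed + 1) / (USES_PER_AGENT + 2)
            if posterior_mean >= THRESHOLD:
                productions += USES_PER_AGENT
        use_rate = productions / (AGENTS * USES_PER_AGENT)
    return use_rate

print(simulate(0.4))  # a rarely used form tends to collapse into a stable gap
```

In this toy version, avoidance by uncertain learners reduces the evidence available to the next generation, so an initially marginal form can spiral into a persistent gap.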
Chance events are commonplace in our daily lives. Every day we face situations where the result is uncertain, and, perhaps without realizing it, we guess about the likelihood of one outcome or another. Fortunately, mastering the concepts of probability can cast new light on situations where randomness and chance appear to rule. In this fully revised second edition of Understanding Probability, the reader can learn about the world of probability in an appealing way.
Event recognition methods can be roughly categorized into model-based methods and appearance-based techniques. Model-based approaches rely on various models, including HMMs, coupled HMMs, and Dynamic Bayesian Networks, to model the temporal evolution of an event. The relationships among different body parts and regions have also been modeled in prior work, in which object tracking must be performed before model learning.
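As a concrete illustration of the model-based idea, the sketch below scores an observation sequence under a toy two-state HMM using the forward algorithm. The states, probability tables, and observation alphabet are assumptions for illustration, not taken from any of the surveyed systems.

```python
import numpy as np

# Hypothetical states: 0 = "reaching", 1 = "withdrawing".
initial = np.array([0.6, 0.4])
transition = np.array([[0.7, 0.3],
                       [0.2, 0.8]])
# Rows: states; columns: discrete symbols, e.g. quantized motion features.
emission = np.array([[0.5, 0.4, 0.1],
                     [0.1, 0.3, 0.6]])

def forward_likelihood(observations):
    """P(observations | model) computed by the forward algorithm."""
    alpha = initial * emission[:, observations[0]]
    for obs in observations[1:]:
        alpha = (alpha @ transition) * emission[:, obs]
    return alpha.sum()

print(forward_likelihood([0, 1, 2, 2]))
```

An event recognizer in this style would train one such model per event class and label a sequence with the class whose model assigns it the highest likelihood.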
This paper introduces a machine learning method based on Bayesian networks, which is applied to the mapping between deep semantic representations and lexical semantic resources. A probabilistic model comprising Minimal Recursion Semantics (MRS) structures and lexicalist-oriented semantic features is acquired. Lexical semantic roles enriching the MRS structures are inferred, which are useful for improving the accuracy of deep semantic parsing.
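A minimal sketch of the inference step might look as follows: a tiny discrete Bayesian network (naive-Bayes shaped for brevity) infers a lexical semantic role from MRS-derived features. All variable names, feature values, and probability tables here are invented for illustration and are not the paper's model.

```python
ROLES = ["agent", "patient"]
prior = {"agent": 0.5, "patient": 0.5}
# P(feature value | role) for two hypothetical MRS-derived features.
likelihood = {
    "arg_position": {("first", "agent"): 0.8, ("first", "patient"): 0.3,
                     ("second", "agent"): 0.2, ("second", "patient"): 0.7},
    "animacy":      {("animate", "agent"): 0.9, ("animate", "patient"): 0.4,
                     ("inanimate", "agent"): 0.1, ("inanimate", "patient"): 0.6},
}

def infer_role(features):
    """Posterior over roles given observed features, by exact enumeration."""
    scores = {}
    for role in ROLES:
        p = prior[role]
        for name, value in features.items():
            p *= likelihood[name][(value, role)]
        scores[role] = p
    total = sum(scores.values())
    return {role: p / total for role, p in scores.items()}

print(infer_role({"arg_position": "first", "animacy": "animate"}))
```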
The turn of the millennium has been described as the dawn of a new scientific revolution, which will have as great an impact on society as the industrial and computer revolutions before it. This revolution was heralded by a large-scale DNA sequencing effort in July 1995, when the entire 1.8 million base pairs of the genome of the bacterium Haemophilus influenzae were published, the first complete genome sequence of a free-living organism. Since then, the amount of DNA sequence data in publicly accessible databases has been growing exponentially, including a working draft of the complete 3.