Showing 1-20 of 402 results for Models of grammar
  • At least since Chomsky, the usual response to the projection problem has been to characterize knowledge of language as a grammar, and then proceed by restricting so severely the class of grammars available for acquisition that the induction task is greatly simplified - perhaps trivialized. The work reported here describes an implemented LISP program that explicitly reproduces this methodological approach to acquisition - but in a computational setting.

    pdf, 6 pages | bungio_1 | 03-05-2013 | 18 views | 2 downloads
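
A minimal sketch of the severe-restriction idea in the entry above, assuming a toy hypothesis space of two binary word-order parameters; the paper's actual LISP program and grammar class are not reproduced here:

```python
# Toy illustration (not the paper's LISP system): when the grammar class is
# restricted to a few word-order parameter settings, "acquisition" reduces
# to filtering out hypotheses inconsistent with the observed sentences.
from itertools import product

def generates(params, sentence):
    """Check a (subject-first, verb-before-object) hypothesis against S/V/O order."""
    subj_first, v_before_o = params
    order = list(sentence)  # e.g. ("S", "V", "O")
    ok_subj = order.index("S") == 0 if subj_first else order.index("S") != 0
    ok_vo = (order.index("V") < order.index("O")) if v_before_o \
            else (order.index("V") > order.index("O"))
    return ok_subj and ok_vo

hypotheses = list(product([True, False], repeat=2))  # 4 grammars, not infinitely many
observed = [("S", "V", "O"), ("S", "V", "O")]
consistent = [h for h in hypotheses if all(generates(h, s) for s in observed)]
print(consistent)  # [(True, True)] -> the SVO grammar is identified
```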

  • An investment of effort over the last two years has begun to produce a wealth of data concerning computational psycholinguistic models of syntax acquisition. The data is generated by running simulations on a recently completed database of word order patterns from over 3,000 abstract languages. This article presents the design of the database which contains sentence patterns, grammars and derivations that can be used to test acquisition models from widely divergent paradigms.

    pdf, 8 pages | bunbo_1 | 17-04-2013 | 23 views | 1 download
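
As a hedged sketch of what one record in such a database might contain, the dataclass below uses field names that are illustrative guesses, not the article's actual schema:

```python
from dataclasses import dataclass, field

# Hypothetical record layout for one of the ~3,000 abstract languages;
# every field name here is illustrative, not taken from the article.
@dataclass
class LanguageRecord:
    language_id: int
    parameter_settings: dict          # e.g. {"head_final": True, "v2": False}
    grammar_rules: list               # CFG-style rules licensed by the settings
    sentence_patterns: list = field(default_factory=list)  # word-order strings
    derivations: dict = field(default_factory=dict)        # pattern -> derivation trace

example = LanguageRecord(
    language_id=42,
    parameter_settings={"head_final": False, "v2": False},
    grammar_rules=["S -> NP VP", "VP -> V NP"],
    sentence_patterns=["S V O", "S Aux V O"],
)
print(example.language_id, example.sentence_patterns)
```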

  • The design, implementation, and use of grammar formalisms for natural language have constituted a major branch of computational linguistics throughout its development. By viewing grammar formalisms as just a special case of computer languages, we can take advantage of the machinery of denotational semantics to provide a precise specification of their meaning.

    pdf, 7 pages | bungio_1 | 03-05-2013 | 18 views | 1 download

  • This paper describes a computational model of human sentence processing based on the principles and parameters paradigm of current linguistic theory. The syntactic processing model posits four modules, recovering phrase structure, long-distance dependencies, coreference, and thematic structure. These four modules are implemented as meta-interpreters over the relevant components of the grammar, permitting variation in the deductive strategies employed by each module.

    pdf, 6 pages | buncha_1 | 08-05-2013 | 14 views | 1 download

  • In the past decade, Cognitive Linguistics has developed into one of the most dynamic and attractive frameworks within theoretical and descriptive linguistics. With about fifty chapters written by experts in the field, the Oxford Handbook of Cognitive Linguistics intends to provide a comprehensive overview of the entire domain of Cognitive Linguistics, from basic concepts to practical applications.

    pdf, 1365 pages | denngudo | 21-06-2012 | 94 views | 38 downloads

  • In the past decade, Cognitive Linguistics has developed into one of the most dynamic and attractive frameworks within theoretical and descriptive linguistics. With about fifty chapters written by experts in the field, the Oxford Handbook of Cognitive Linguistics intends to provide a comprehensive overview of the entire domain of Cognitive Linguistics, from basic concepts to practical applications.

    pdf, 1325 pages | 123859674 | 28-06-2012 | 223 views | 25 downloads

  • This paper establishes a connection between two apparently very different kinds of probabilistic models. Latent Dirichlet Allocation (LDA) models are used as “topic models” to produce a low-dimensional representation of documents, while Probabilistic Context-Free Grammars (PCFGs) define distributions over trees. The paper begins by showing that LDA topic models can be viewed as a special kind of PCFG, so Bayesian inference for PCFGs can be used to infer topic models as well. Adaptor Grammars (AGs) are a hierarchical, non-parametric Bayesian extension of PCFGs. ...

    pdf, 10 pages | hongdo_1 | 12-04-2013 | 22 views | 2 downloads
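
A hedged sketch of the LDA-to-PCFG encoding the entry describes: per-document rules carry the document-topic probabilities and per-topic rules carry the topic-word probabilities. Rule shapes and termination handling are simplified relative to the paper's construction:

```python
# Sketch of encoding LDA as a PCFG: theta is LDA's document-topic distribution,
# phi its topic-word distribution. Probabilities are shown unnormalized; a
# proper PCFG would fold a stop probability into the Doc_d expansions so that
# each nonterminal's rules sum to one.
def lda_as_pcfg(docs, topics, vocab, theta, phi):
    rules = []
    for d in docs:
        for t in topics:
            # attach one more word, drawn from topic t, to document d
            rules.append((f"Doc_{d} -> Doc_{d} Topic_{t}", theta[d][t]))
        rules.append((f"Doc_{d} -> End_{d}", 1.0))  # stop generating words
    for t in topics:
        for w in vocab:
            rules.append((f"Topic_{t} -> {w}", phi[t][w]))
    return rules

theta = {"d1": {"t1": 0.7, "t2": 0.3}}
phi = {"t1": {"cat": 0.9, "dog": 0.1}, "t2": {"cat": 0.2, "dog": 0.8}}
for rule, p in lda_as_pcfg(["d1"], ["t1", "t2"], ["cat", "dog"], theta, phi):
    print(f"{p:.2f}  {rule}")
```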

  • This paper investigates transforms of split dependency grammars into unlexicalised context-free grammars annotated with hidden symbols. Our best unlexicalised grammar achieves an accuracy of 88% on the Penn Treebank data set, which represents a 50% reduction in error over previously published results on unlexicalised dependency parsing.

    pdf, 4 pages | hongphan_1 | 15-04-2013 | 11 views | 2 downloads
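
An illustrative sketch of the general direction, not the paper's exact transform: a projective dependency tree is rewritten as nested constituency brackets, one phrase per head. The hidden-symbol annotation that the paper adds is omitted:

```python
# Turn a projective dependency tree into nested CFG-style brackets so that a
# constituency parser can be trained on dependency data.
def to_brackets(heads, words, h):
    """heads[i] = index of word i's head (-1 for the root); returns a bracket string."""
    deps = [i for i, hd in enumerate(heads) if hd == h]
    left = [to_brackets(heads, words, d) for d in deps if d < h]
    right = [to_brackets(heads, words, d) for d in deps if d > h]
    if not deps:
        return words[h]
    return "(" + " ".join(left + [words[h]] + right) + ")"

words = ["the", "dog", "barked"]
heads = [1, 2, -1]            # "the" <- "dog" <- "barked" (root)
root = heads.index(-1)
print(to_brackets(heads, words, root))   # ((the dog) barked)
```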

  • We present LLCCM, a log-linear variant of the constituent context model (CCM) of grammar induction. LLCCM retains the simplicity of the original CCM but extends robustly to long sentences. On sentences of up to length 40, LLCCM outperforms CCM by 13.9% bracketing F1 and outperforms a right-branching baseline in regimes where CCM does not.

    pdf, 6 pages | nghetay_1 | 07-04-2013 | 7 views | 1 download
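
A simplified sketch of the log-linear idea: each candidate span is scored by a weighted sum of CCM-style boundary-context features, and a bracketing's score sums over its spans. The feature names and weights below are invented, and the induction procedure is not shown:

```python
import math

def span_features(tags, i, j):
    """CCM-style context features for span [i, j): the tags flanking the span."""
    left = tags[i - 1] if i > 0 else "<s>"
    right = tags[j] if j < len(tags) else "</s>"
    return {f"ctx={left}_{right}": 1.0, f"len={j - i}": 1.0}

def span_score(weights, tags, i, j):
    """Log-linear score w . f(span); a bracketing's score sums over its spans."""
    return sum(weights.get(k, 0.0) * v for k, v in span_features(tags, i, j).items())

weights = {"ctx=<s>_VBD": 1.2, "len=2": 0.4}   # toy weights, not learned
tags = ["DT", "NN", "VBD"]
print(math.exp(span_score(weights, tags, 0, 2)))  # unnormalized span weight
```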

  • We propose CMSMs, a novel type of generic compositional model for syntactic and semantic aspects of natural language, based on matrix multiplication. We argue for the structural and cognitive plausibility of this model and show that it is able to cover and combine various common compositional NLP approaches ranging from statistical word space models to symbolic grammar formalisms.

    pdf, 10 pages | hongdo_1 | 12-04-2013 | 22 views | 1 download
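
A minimal sketch of the matrix-multiplication composition that CMSMs are built on: each word denotes a square matrix and a phrase denotes the ordered product of its word matrices, so composition is associative but order-sensitive. Random matrices stand in for learned ones:

```python
import numpy as np

# Compositional matrix-space model (CMSM) sketch: word meanings are square
# matrices; phrase meaning is the product of word matrices in surface order.
rng = np.random.default_rng(0)
dim = 4
lexicon = {w: rng.normal(size=(dim, dim)) for w in ["not", "very", "good"]}

def compose(words):
    m = np.eye(dim)
    for w in words:
        m = m @ lexicon[w]     # matrix product; order-sensitive
    return m

# Word order matters: "not very good" != "very not good" in matrix space.
a = compose(["not", "very", "good"])
b = compose(["very", "not", "good"])
print(np.allclose(a, b))  # False
```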

  • We present a new approach to stochastic modeling of constraint-based grammars that is based on log-linear models and uses EM for estimation from unannotated data. The techniques are applied to an LFG grammar for German. Evaluation on an exact match task yields 86% precision for an ambiguity rate of 5.4, and 90% precision on a subcat frame match for an ambiguity rate of 25. Experimental comparison to training from a parsebank shows a 10% gain from EM training.

    pdf, 8 pages | bunrieu_1 | 18-04-2013 | 14 views | 1 download

  • Insights from multiple disciplines provide the foundation for the model of language in social context used in this study. We begin by delineating the central questions animating this project, and then proceed to locate these questions within frameworks provided by previous scholarship. English belongs to the western branch of the Germanic group of the Indo-European language family; it was brought to England by waves of invaders in the 6th century.

    pdf, 327 pages | thuthuy | 28-07-2009 | 214 views | 102 downloads

  • Most statistical machine translation systems rely on composed rules (rules that can be formed out of smaller rules in the grammar). Though this practice improves translation by weakening independence assumptions in the translation model, it nevertheless results in huge, redundant grammars, making both training and decoding inefficient. Here, we take the opposite approach, where we only use minimal rules (those that cannot be formed out of other rules), and instead rely on a rule Markov model of the derivation history to capture dependencies between minimal rules. ...

    pdf, 9 pages | hongdo_1 | 12-04-2013 | 13 views | 4 downloads
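
A toy sketch of a rule Markov model over a derivation history: bigram probabilities over sequences of minimal rules, with add-one smoothing. The paper conditions on richer derivation context; a flat rule sequence is used here only to show the idea:

```python
from collections import Counter

# Estimate bigram probabilities over the sequence of minimal rules used in
# each derivation, so that dependencies between minimal rules are captured
# without composing them into larger rules.
def train_rule_bigrams(derivations):
    bigrams, unigrams = Counter(), Counter()
    for rules in derivations:
        seq = ["<root>"] + rules
        for prev, cur in zip(seq, seq[1:]):
            bigrams[(prev, cur)] += 1
            unigrams[prev] += 1
    vocab = {r for rules in derivations for r in rules}
    def prob(cur, prev):
        # add-one smoothing over the rule vocabulary
        return (bigrams[(prev, cur)] + 1) / (unigrams[prev] + len(vocab))
    return prob

p = train_rule_bigrams([["r1", "r2"], ["r1", "r3"], ["r1", "r2"]])
print(p("r2", "r1"), p("r3", "r1"))  # 0.5 vs 0.33: r2 is likelier after r1
```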

  • This paper reports on the recognition component of an intelligent tutoring system that is designed to help foreign language speakers learn standard English. The system models the grammar of the learner, with this instantiation of the system tailored to signers of American Sign Language (ASL). We discuss the theoretical motivations for the system, various difficulties that have been encountered in the implementation, as well as the methods we have used to overcome these problems. Our method of capturing ungrammaticalities involves using malrules (also called 'error productions'). ...

    pdf, 7 pages | bunrieu_1 | 18-04-2013 | 19 views | 4 downloads
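
A toy CKY recogniser illustrating malrules (error productions): the grammar deliberately licenses certain ungrammatical strings, and any parse that uses a malrule is reported as a diagnosed error. The grammar, tags, and error message below are invented:

```python
# CNF grammar plus one malrule that captures a learner error on purpose.
GRAMMAR = {                       # (B, C) -> LHS
    ("NP", "VP"): "S",
    ("DT", "NN"): "NP",
}
UNARY = {"the": "DT", "dog": "NN", "barks": "VP", "bark": "VP_pl"}
MALRULES = {("NP", "VP_pl"): ("S", "subject-verb agreement error")}

def parse(words):
    n = len(words)
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    errors = []
    for i, w in enumerate(words):
        chart[i][i + 1].add(UNARY[w])
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            for k in range(i + 1, i + span):
                for b in chart[i][k]:
                    for c in chart[k][i + span]:
                        if (b, c) in GRAMMAR:
                            chart[i][i + span].add(GRAMMAR[(b, c)])
                        if (b, c) in MALRULES:
                            lhs, msg = MALRULES[(b, c)]
                            chart[i][i + span].add(lhs)   # accept, but flag it
                            errors.append(msg)
    return "S" in chart[0][n], errors

print(parse(["the", "dog", "bark"]))  # (True, ['subject-verb agreement error'])
```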

  • Although adequate models of human language for syntactic analysis and semantic interpretation are of at least context-free complexity, for applications such as speech processing in which speed is important, finite-state models are often preferred. These requirements may be reconciled by using the more complex grammar to automatically derive a finite-state approximation, which can then be used as a filter to guide speech recognition or to reject many hypotheses at an early stage of processing.

    pdf, 8 pages | bunthai_1 | 06-05-2013 | 18 views | 4 downloads
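
A sketch of the filtering setup the entry describes, with a hand-written finite-state acceptor standing in for one derived automatically from a richer grammar; hypotheses the FSA rejects never reach the expensive parser:

```python
# Finite-state filter: state -> {symbol: next_state}. This toy FSA accepts
# "DT (JJ)* NN VB"-like tag strings, standing in for a machine-derived
# approximation of a context-free (or richer) grammar.
FSA = {
    0: {"DT": 1},
    1: {"NN": 2, "JJ": 1},    # adjective loop, as an approximation might allow
    2: {"VB": 3},
}
ACCEPT = {3}

def fsa_accepts(tags):
    state = 0
    for t in tags:
        state = FSA.get(state, {}).get(t)
        if state is None:
            return False
    return state in ACCEPT

hypotheses = [["DT", "NN", "VB"], ["NN", "DT", "VB"], ["DT", "JJ", "NN", "VB"]]
survivors = [h for h in hypotheses if fsa_accepts(h)]
print(survivors)   # the second hypothesis is rejected before full parsing
```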

  • We propose a model that incorporates an insertion operator in Bayesian tree substitution grammars (BTSG). Tree insertion is helpful for modeling syntax patterns accurately with fewer grammar rules than BTSG. The experimental parsing results show that our model outperforms a standard PCFG and BTSG for a small dataset. For a large dataset, our model obtains comparable results to BTSG, making the number of grammar rules much smaller than with BTSG.

    pdf, 6 pages | hongdo_1 | 12-04-2013 | 18 views | 3 downloads

  • This paper presents empirical studies and closely corresponding theoretical models of the performance of a chart parser exhaustively parsing the Penn Treebank with the Treebank’s own CFG grammar. We show how performance is dramatically affected by rule representation and tree transformations, but little by top-down vs. bottom-up strategies.

    pdf, 8 pages | bunrieu_1 | 18-04-2013 | 19 views | 3 downloads
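
One concrete example of the kind of rule-representation choice the paper measures (the specific choice shown is illustrative, not the paper's): indexing rules by their first right-hand-side symbol turns a per-edge linear scan into a dictionary lookup:

```python
from collections import defaultdict

RULES = [("S", ("NP", "VP")), ("NP", ("DT", "NN")), ("VP", ("VB", "NP"))]

# Naive representation: scan every rule for each candidate edge.
def lookup_scan(first_sym):
    return [r for r in RULES if r[1][0] == first_sym]

# Indexed representation: one dict lookup per candidate edge.
INDEX = defaultdict(list)
for r in RULES:
    INDEX[r[1][0]].append(r)

def lookup_indexed(first_sym):
    return INDEX[first_sym]

print(lookup_scan("NP") == lookup_indexed("NP"))   # True, but O(|G|) vs O(1)
```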

  • There is increasing recognition of the fact that the entire range of dependencies that transformational grammars in their various incarnations have tried to account for can be satisfactorily captured by classes of rules that are non-transformational and at the same time highly constrained in terms of the classes of grammars and languages that they define.

    pdf, 9 pages | bungio_1 | 03-05-2013 | 17 views | 3 downloads

  • … or trigram Hidden Markov Model (HMM). Ravi and Knight (2009) achieved the best results thus far (92.3% word token accuracy) via a Minimum Description Length approach using an integer program (IP) that finds a minimal bigram grammar that obeys the tag dictionary constraints and covers the observed data.

    pdf, 9 pages | hongdo_1 | 12-04-2013 | 22 views | 2 downloads
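
A minimal sketch of the tag-dictionary-constrained bigram HMM tagging setup the entry refers to, with Viterbi decoding; the dictionary and all probabilities below are made up for illustration:

```python
import math

# Each word may only receive tags its dictionary entry allows; the Viterbi
# search then picks the best tag sequence under bigram transition and
# emission probabilities (all values here are invented).
TAG_DICT = {"the": ["DT"], "can": ["MD", "NN"], "rusts": ["VB", "NN"]}
TRANS = {("<s>", "DT"): 0.9, ("DT", "NN"): 0.6, ("DT", "MD"): 0.1,
         ("NN", "VB"): 0.7, ("NN", "NN"): 0.2, ("MD", "VB"): 0.8, ("MD", "NN"): 0.1}
EMIT = {("DT", "the"): 1.0, ("NN", "can"): 0.5, ("MD", "can"): 0.5,
        ("VB", "rusts"): 0.6, ("NN", "rusts"): 0.4}

def viterbi(words):
    paths = {"<s>": (0.0, [])}          # tag -> (log prob, best sequence)
    for w in words:
        new = {}
        for tag in TAG_DICT[w]:          # tag-dictionary constraint
            new[tag] = max(
                ((lp + math.log(TRANS.get((pt, tag), 1e-9))
                     + math.log(EMIT.get((tag, w), 1e-9)), seq + [tag])
                 for pt, (lp, seq) in paths.items()),
                key=lambda x: x[0])
        paths = new
    return max(paths.values(), key=lambda x: x[0])[1]

print(viterbi(["the", "can", "rusts"]))   # ['DT', 'NN', 'VB']
```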

  • We present an approach to multilingual grammar induction that exploits a phylogeny-structured model of parameter drift. Our method does not require any translated texts or token-level alignments. Instead, the phylogenetic prior couples languages at a parameter level. Joint induction in the multilingual model substantially outperforms independent learning, with larger gains both from more articulated phylogenies and from increasing numbers of languages.

    pdf, 10 pages | hongdo_1 | 12-04-2013 | 10 views | 2 downloads
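
A hedged sketch of a phylogeny-structured prior: a single grammar parameter drifts from parent to child language in logit space, so sibling languages stay coupled at the parameter level. The tree, drift scale, and parameter are invented:

```python
import numpy as np

# Each language's parameter (here, one head-direction probability) is a
# Gaussian perturbation of its parent's value in logit space; relatives
# therefore end up with correlated parameters. All specifics are invented.
rng = np.random.default_rng(1)
TREE = {"proto": None, "west": "proto", "east": "proto",
        "lang_a": "west", "lang_b": "west", "lang_c": "east"}

def sample_params(tree, drift=0.5):
    logits = {}
    for node in tree:                   # insertion order puts parents first
        parent = tree[node]
        mean = 0.0 if parent is None else logits[parent]
        logits[node] = rng.normal(mean, drift)
    return {n: 1 / (1 + np.exp(-l)) for n, l in logits.items()}

params = sample_params(TREE)
print({n: round(float(p), 2) for n, p in params.items()})
# siblings lang_a and lang_b tend to be closer than lang_a and lang_c
```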
