Showing results 1-20 of 3,463 for Dependency
  • Reference material on Lab - DEPENDENT DROPDOWN NEWS...

    pdf 0p tuan_mit 12-05-2011 49 10   Download

  • Bowen’s formula relates the Hausdorff dimension of a conformal repeller to the zero of a ‘pressure’ function. We present an elementary, self-contained proof to show that Bowen’s formula holds for C¹ conformal repellers. We consider time-dependent conformal repellers obtained as invariant subsets for sequences of conformally expanding maps within a suitable class.

    pdf 55p dontetvui 17-01-2013 25 6   Download
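
    As a companion to the abstract above, a hedged statement of Bowen’s formula as it is usually written (the symbols J for the repeller, Df for the derivative of the conformal map, and P for topological pressure are assumptions, since the abstract fixes no notation):

        \dim_H(J) \;=\; s^{*}, \qquad \text{where } s^{*} \text{ is the unique zero of } \; s \mapsto P\bigl(-s \log \lVert Df \rVert\bigr).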

  • ECONOMIC DEVELOPMENT IN GREENLAND: A TIME SERIES ANALYSIS OF DEPENDENCY, GROWTH, AND INSTABILITY. Column A presents the coefficient from a regression of school average scores on an index of student quality, pooling all metropolitan schools in the NELS sample and including a fixed effect for each MSA. As in the SAT data, peer effects and effectiveness sorting are together substantial, inflating the school-level background index coefficient by 90 percent relative to the coefficient of a within-school regression of individual scores on own characteristics...

    pdf 361p mualan_mualan 25-02-2013 20 6   Download
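
    As a rough illustration of the school-level regression described above (the symbols are illustrative assumptions, not taken from the paper):

        \bar{y}_{sm} \;=\; \beta \, \bar{q}_{sm} + \alpha_m + \varepsilon_{sm},

    where \bar{y}_{sm} is the average score of school s in metropolitan area m, \bar{q}_{sm} is the school’s student-quality (background) index, and \alpha_m is the MSA fixed effect; the reported 90 percent inflation is relative to the corresponding within-school regression of individual scores on own characteristics.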

  • We live in an oil-dependent world, arriving at this level of dependency in a very short space of time by treating petroleum as if it were in infinite supply. Most of us avoid thinking about what happens when oil runs out (or becomes prohibitively expensive), but The Transition Handbook shows how the inevitable and profound

    pdf 134p mymi0809 17-01-2013 24 5   Download

  • We develop a general dynamic programming technique for the tabulation of transition-based dependency parsers, and apply it to obtain novel, polynomial-time algorithms for parsing with the arc-standard and arc-eager models. We also show how to reverse our technique to obtain new transition-based dependency parsers from existing tabular methods. Additionally, we provide a detailed discussion of the conditions under which the feature models commonly used in transition-based parsing can be integrated into our algorithms. ...

    pdf 10p hongdo_1 12-04-2013 23 5   Download
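
    To make the transition systems named above concrete, here is a minimal sketch of the arc-standard model (shift / left-arc / right-arc) that such tabulation techniques operate over; the data structures and the choose_action callback are illustrative assumptions, not the paper’s implementation.

      # Minimal arc-standard transition system (sketch, not the paper's tabulation).
      # A state is a stack, a buffer of remaining token indices, and a set of arcs (head, dependent).

      def shift(stack, buffer, arcs):
          stack.append(buffer.pop(0))

      def left_arc(stack, buffer, arcs):
          dep = stack.pop(-2)            # second-topmost becomes a dependent of the top
          arcs.add((stack[-1], dep))

      def right_arc(stack, buffer, arcs):
          dep = stack.pop()              # topmost becomes a dependent of the new top
          arcs.add((stack[-1], dep))

      def parse(sentence, choose_action):
          """Greedy arc-standard parsing; choose_action stands in for a trained model."""
          stack, buffer, arcs = [0], list(range(1, len(sentence) + 1)), set()
          while buffer or len(stack) > 1:
              action = choose_action(stack, buffer, arcs)
              action(stack, buffer, arcs)
          return arcs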

  • This paper presents a detailed study of the integration of knowledge from both dependency parses and hierarchical word ontologies into a maximum-entropy-based tagging model that simultaneously labels words with both syntax and semantics. Our findings show that information from both these sources can lead to strong improvements in overall system accuracy: dependency knowledge improved performance over all classes of word, and knowledge of the position of a word in an ontological hierarchy increased accuracy for words not seen in the training data. ...

    pdf 8p hongvang_1 16-04-2013 23 5   Download
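
    A small sketch of the kind of feature extraction the abstract describes, combining a token’s dependency head with an ontological (hypernym-style) class for a maximum-entropy tagger; the feature templates and the hypernym_class lookup are hypothetical, not the paper’s feature set.

      def features(i, words, heads, hypernym_class):
          """Feature templates mixing dependency-parse and ontology information (sketch)."""
          w = words[i]
          head = words[heads[i]] if heads[i] is not None else "ROOT"
          return {
              "word=" + w: 1.0,
              "head_word=" + head: 1.0,                              # from the dependency parse
              "word_class=" + hypernym_class.get(w, "UNK"): 1.0,     # from the word ontology
              "head_class=" + hypernym_class.get(head, "UNK"): 1.0,  # backs off for unseen words
          }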

  • We consider the problem of learning context-dependent mappings from sentences to logical form. The training examples are sequences of sentences annotated with lambda-calculus meaning representations. We develop an algorithm that maintains explicit, lambda-calculus representations of salient discourse entities and uses a context-dependent analysis pipeline to recover logical forms. The method uses a hidden-variable variant of the perceptron algorithm to learn a linear model used to select the best analysis.

    pdf 9p hongphan_1 14-04-2013 13 4   Download
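
    A rough sketch of a hidden-variable perceptron update of the kind mentioned above; best_analysis and features are placeholders standing in for the context-dependent analysis pipeline and its feature function, not the paper’s code.

      def perceptron_update(w, sentence, gold_logical_form, best_analysis, features):
          """One hidden-variable perceptron step (sketch).

          best_analysis(w, sentence, constraint) is assumed to return the highest-scoring
          analysis under weights w, optionally constrained to yield a given logical form.
          """
          predicted = best_analysis(w, sentence, constraint=None)
          if predicted.logical_form != gold_logical_form:
              # Best analysis that still derives the gold logical form (the hidden variable).
              target = best_analysis(w, sentence, constraint=gold_logical_form)
              for f, v in features(target).items():
                  w[f] = w.get(f, 0.0) + v
              for f, v in features(predicted).items():
                  w[f] = w.get(f, 0.0) - v
          return w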

  • Most previous studies of morphological disambiguation and dependency parsing have been pursued independently. Morphological taggers operate on n-grams and do not take into account syntactic relations; parsers use the “pipeline” approach, assuming that morphological information has been separately obtained. However, in morphologically-rich languages, there is often considerable interaction between morphology and syntax, such that neither can be disambiguated without the other.

    pdf 10p hongdo_1 12-04-2013 14 3   Download

  • In this paper, we present a novel approach which incorporates the web-derived selectional preferences to improve statistical dependency parsing. Conventional selectional preference learning methods have usually focused on word-to-class relations, e.g., a verb selects as its subject a given nominal class.

    pdf 10p hongdo_1 12-04-2013 12 3   Download
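
    One hedged way to turn web-derived selectional preferences into a parsing feature is a pointwise mutual information score over web counts, sketched below; the web_count lookup and the PMI formulation are assumptions, not necessarily the paper’s method.

      import math

      def selectional_preference_pmi(head, relation, dependent, web_count, total):
          """PMI between a head and a candidate dependent under a relation (sketch).

          web_count(query) is a hypothetical lookup returning hit counts, and total is
          the normalising count for the collection being queried.
          """
          joint = web_count(head + " " + relation + " " + dependent)
          if joint == 0:
              return 0.0
          p_joint = joint / total
          p_head = web_count(head) / total
          p_dep = web_count(dependent) / total
          return math.log(p_joint / (p_head * p_dep))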

  • We explore the contribution of morphological features – both lexical and inflectional – to dependency parsing of Arabic, a morphologically rich language. Using controlled experiments, we find that definiteness, person, number, gender, and the undiacritized lemma are most helpful for parsing on automatically tagged input.

    pdf 11p hongdo_1 12-04-2013 13 3   Download

  • We propose a generative model based on Temporal Restricted Boltzmann Machines for transition-based dependency parsing. The parse tree is built incrementally using a shift-reduce parse and an RBM is used to model each decision step. The RBM at the current time step induces latent features with the help of temporal connections to the relevant previous steps which provide context information. Our parser achieves labeled and unlabeled attachment scores of 88.72% and 91.65% respectively, which compare well with similar previous models and the state-of-the-art. ...

    pdf 7p hongdo_1 12-04-2013 24 3   Download
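
    A minimal numpy sketch of the conditioned RBM step the abstract describes: hidden-unit activations at the current decision depend on the current input and on temporal connections from earlier, relevant steps. The parameter names W, U and b are generic assumptions, not the paper’s parameterisation.

      import numpy as np

      def hidden_activations(x_t, h_context, W, U, b):
          """Hidden-unit probabilities for one decision step of a temporal/conditional RBM (sketch).

          x_t       : visible features of the current decision step
          h_context : hidden vector carried over from the relevant previous steps
          W, U, b   : visible-to-hidden weights, temporal weights, and hidden biases
          """
          return 1.0 / (1.0 + np.exp(-(W @ x_t + U @ h_context + b)))

      # Example with random parameters, only to show the shapes involved.
      rng = np.random.default_rng(0)
      x_t, h_context = rng.random(20), rng.random(10)
      W, U, b = rng.normal(size=(10, 20)), rng.normal(size=(10, 10)), np.zeros(10)
      print(hidden_activations(x_t, h_context, W, U, b))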

  • We consider a very simple, yet effective, approach to cross-language adaptation of dependency parsers. We first remove lexical items from the treebanks and map part-of-speech tags into a common tagset. We then train a language model on tag sequences in otherwise unlabeled target data and rank labeled source data by perplexity per word of tag sequences from least similar to most similar to the target. We then train our target language parser on the most similar data points in the source labeled data. ...

    pdf 5p hongdo_1 12-04-2013 18 3   Download
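
    The ranking step described above can be sketched in a few lines: score each labeled source sentence by the per-word perplexity of its POS-tag sequence under a language model trained on target-language tag sequences, then keep the most similar (lowest-perplexity) data points. The tag_lm_logprob function is a placeholder for whatever tag language model is used.

      import math

      def rank_by_tag_perplexity(source_tag_sequences, tag_lm_logprob):
          """Sort source sentences from most to least similar to the target language (sketch).

          source_tag_sequences : POS-tag sequences (lexical items removed, tags mapped
                                 to the common tagset)
          tag_lm_logprob(tags) : assumed to return the total log-probability of the
                                 sequence under the target-language tag LM
          """
          def perplexity(tags):
              return math.exp(-tag_lm_logprob(tags) / max(len(tags), 1))
          return sorted(source_tag_sequences, key=perplexity)   # lowest perplexity first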

  • This paper presents the introduction of WordNet semantic classes in a dependency parser, obtaining improvements on the full Penn Treebank for the first time. We tried different combinations of some basic semantic classes and word sense disambiguation algorithms. Our experiments show that selecting the adequate combination of semantic features on development data is key for success.

    pdf 5p hongdo_1 12-04-2013 15 3   Download

  • In this paper, we propose a novel method for semi-supervised learning of nonprojective log-linear dependency parsers using directly expressed linguistic prior knowledge (e.g. a noun’s parent is often a verb). Model parameters are estimated using a generalized expectation (GE) objective function that penalizes the mismatch between model predictions and linguistic expectation constraints.

    pdf 9p hongphan_1 14-04-2013 24 3   Download
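
    Schematically, a generalized expectation (GE) objective of the kind described above adds to the training criterion a term penalising the distance between model expectations of constraint features and their linguistically specified targets (the exact penalty and regulariser in the paper may differ):

        O(\theta) \;=\; \sum_{(x,y)\in\mathcal{L}} \log p_\theta(y \mid x)
        \;-\; \lambda\, \Delta\!\bigl(\tilde{g},\; \mathbb{E}_{x\in\mathcal{U},\, y\sim p_\theta(\cdot\mid x)}[\,g(x,y)\,]\bigr)
        \;-\; \frac{\lVert\theta\rVert^{2}}{2\sigma^{2}},

    where g are constraint features (e.g. an indicator that a noun’s parent is a verb), \tilde{g} their target expectations, \Delta a divergence such as squared distance or KL, and the labeled set \mathcal{L} may be empty in the purely constraint-driven setting.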

  • We would like to draw attention to Hidden Markov Tree Models (HMTM), which are, to our knowledge, still unexploited in the field of Computational Linguistics, in spite of the high success of Hidden Markov (Chain) Models. In dependency trees, the independence assumptions made by HMTM correspond to the intuition of linguistic dependency. Therefore we suggest using HMTM and a tree-modified Viterbi algorithm for tasks interpretable as labeling nodes of dependency trees.

    pdf 4p hongphan_1 15-04-2013 21 3   Download
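
    A compact sketch of a tree-modified Viterbi pass over a dependency tree, as one might implement it for an HMTM: an upward (leaves-to-root) pass that scores each node-state pair, followed by a downward pass that reads back pointers. The probability tables and node encoding are placeholders.

      def tree_viterbi(children, emit, trans, root_prior, states, root=0):
          """Most likely hidden-state labeling of a dependency tree under an HMTM (sketch).

          children[n]   : list of child nodes of node n
          emit[n][s]    : probability of node n's observation given hidden state s
          trans[s][t]   : probability that a child is in state t given its parent is in s
          root_prior[s] : prior over the root's hidden state
          """
          best, back = {}, {}

          def upward(n):
              for c in children[n]:
                  upward(c)
              best[n], back[n] = {}, {}
              for s in states:
                  prob, choice = emit[n][s], {}
                  for c in children[n]:
                      t = max(states, key=lambda t: trans[s][t] * best[c][t])
                      prob *= trans[s][t] * best[c][t]
                      choice[c] = t
                  best[n][s], back[n][s] = prob, choice

          def downward(n, s, labels):
              labels[n] = s
              for c, t in back[n][s].items():
                  downward(c, t, labels)
              return labels

          upward(root)
          s_root = max(states, key=lambda s: root_prior[s] * best[root][s])
          return downward(root, s_root, {})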

  • The paper investigates the problem of providing a formal device for the dependency approach to syntax, and to link it with a parsing model. After reviewing the basic tenets of the paradigm and the few existing mathematical results, we describe a dependency formalism which is able to deal with long-distance dependencies. Finally, we present an Earley-style parser for the formalism and discuss the (polynomial) complexity results.

    pdf 7p bunrieu_1 18-04-2013 25 3   Download

  • Rambow, Weir and Vijay-Shanker (Rambow et al., 1995) point out the differences between TAG derivation structures and semantic or predicate-argument dependencies, and Joshi and Vijay-Shanker (Joshi and Vijay-Shanker, 1999) describe a monotonic compositional semantics based on attachment order that represents the desired dependencies of a derivation without underspecifying predicate-argument relationships at any stage.

    pdf 8p bunrieu_1 18-04-2013 26 3   Download

  • Hungarian is a stereotype of morphologically rich and non-configurational languages. Here, we introduce results on dependency parsing of Hungarian that employ an 80K, multi-domain, fully manually annotated corpus, the Szeged Dependency Treebank. We show that the results achieved by state-of-the-art data-driven parsers on Hungarian and English (which is at the other end of the configurational-nonconfigurational spectrum) are quite similar to each other in terms of attachment scores.

    pdf 11p bunthai_1 06-05-2013 15 3   Download

  • In this paper we study spectral learning methods for non-deterministic split head-automata grammars, a powerful hidden-state formalism for dependency parsing. We present a learning algorithm that, like other spectral methods, is efficient and not susceptible to local minima. We show how this algorithm can be formulated as a technique for inducing hidden structure from distributions computed by forward-backward recursions.

    pdf 11p bunthai_1 06-05-2013 20 3   Download
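
    A very rough sketch of the linear-algebra core that spectral methods of this kind share: take a singular value decomposition of a matrix of empirical statistics and use its leading singular vectors as a basis for the hidden-state space. The matrix construction below is generic, not the paper’s forward-backward formulation.

      import numpy as np

      def spectral_basis(stats_matrix, n_hidden):
          """Rank-n_hidden basis from an empirical statistics matrix (sketch).

          stats_matrix : matrix of co-occurrence / expectation statistics estimated from
                         data (assumed here to be observations x observations)
          Returns the projection whose columns are the leading left singular vectors;
          spectral algorithms use such a basis to build hidden-state operator estimates
          without EM-style local minima.
          """
          U, S, Vt = np.linalg.svd(stats_matrix, full_matrices=False)
          return U[:, :n_hidden]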

  • Results of computational complexity exist for a wide range of phrase structure-based grammar formalisms, while there is an apparent lack of such results for dependency-based formalisms. We here adapt a result on the complexity of ID/LP-grammars to the dependency framework. Contrary to previous studies on heavily restricted dependency grammars, we prove that recognition (and thus, parsing) of linguistically adequate dependency grammars is NP-complete.

    pdf 7p bunthai_1 06-05-2013 23 3   Download
