On the computational complexity

Showing 1-20 of 162 results for "On the computational complexity"
  • Dominance links were introduced in grammars to model long distance scrambling phenomena, motivating the definition of multiset-valued linear indexed grammars (MLIGs) by Rambow (1994b), and inspiring quite a few recent formalisms. It turns out that MLIGs have since been rediscovered and reused in a variety of contexts, and that the complexity of their emptiness problem has become the key to several open questions in computer science. We survey complexity results and open issues on MLIGs and related formalisms, and provide new complexity bounds for some linguistically motivated restrictions.


  • Meta-theoretical results on the decidability, generative capacity, and recognition complexity of several syntactic theories are surveyed. These include context-free and lexical-functional grammars […] a computer or a parallel array of neurons. These results over whole classes of machines are very difficult to obtain, and none of any significance exist for parsing problems. Restricting ourselves to a specific machine model and an algorithm M for f, we can ask about the cost (e.g., time or space) c(z) of executing M on a specific input z.
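The notion of a cost c(z) for running a fixed algorithm M on a specific input z can be made concrete with a small sketch (my own illustration, not from the paper): counting the comparisons insertion sort performs on particular inputs.

```python
# Illustrative sketch: the concrete cost c(z) of running a fixed algorithm M
# on a specific input z, here measured as the number of key comparisons
# performed by insertion sort.

def insertion_sort_cost(z):
    """Return (sorted copy of z, number of comparisons performed)."""
    a = list(z)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # one comparison of key against a[j]
            if a[j] > key:
                a[j + 1] = a[j]       # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, comparisons

# The cost depends on the specific input z, not just its size:
best, c_best = insertion_sort_cost([1, 2, 3, 4, 5])    # already sorted: 4
worst, c_worst = insertion_sort_cost([5, 4, 3, 2, 1])  # reversed: 10
```

Both inputs have the same size, yet c(z) differs, which is why worst-case and average-case analyses over input classes diverge.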


  • We give an algorithm to compute the generalized division polynomials for elliptic curves with complex multiplication. These polynomials can be used to generate the ray class fields of imaginary quadratic fields over the Hilbert class field with no restriction on the conductor.
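As background, here is a sketch of the *classical* division polynomials ψₙ for a short Weierstrass curve y² = x³ + ax + b, a simpler relative of the generalized division polynomials the paper computes (which are not reproduced here); the base cases and recurrences below are textbook material, not taken from the paper.

```python
# Classical division polynomials psi_n for y^2 = x^3 + a*x + b, via the
# standard recurrences.  psi_n vanishes at the n-torsion points.
import sympy as sp

x, y, a, b = sp.symbols('x y a b')

def _reduce(p):
    """Expand and rewrite even powers of y using y^2 = x^3 + a*x + b."""
    p = sp.expand(p)
    while True:
        q = sp.expand(p.subs(y**2, x**3 + a*x + b))
        if q == p:
            return p
        p = q

_psi = {0: sp.Integer(0), 1: sp.Integer(1), 2: 2*y,
        3: 3*x**4 + 6*a*x**2 + 12*b*x - a**2,
        4: 4*y*(x**6 + 5*a*x**4 + 20*b*x**3
                - 5*a**2*x**2 - 4*a*b*x - 8*b**2 - a**3)}

def psi(n):
    """n-th division polynomial (memoized)."""
    if n not in _psi:
        m = n // 2
        if n % 2:      # n = 2m + 1
            p = psi(m + 2) * psi(m)**3 - psi(m - 1) * psi(m + 1)**3
        else:          # n = 2m; the division by 2y is exact
            p = sp.cancel((psi(m) * (psi(m + 2) * psi(m - 1)**2
                                     - psi(m - 2) * psi(m + 1)**2)) / (2*y))
        _psi[n] = _reduce(p)
    return _psi[n]
```

For odd n, ψₙ is a polynomial in x alone of degree (n² − 1)/2 with leading coefficient n, e.g. degree 12 and leading coefficient 5 for ψ₅.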


  • Modern linguistic theory attributes surface complexity to interacting subsystems of constraints. For instance, the ID/LP grammar formalism separates constraints on immediate dominance from those on linear order. Shieber's (1983) ID/LP parsing algorithm shows how to use ID and LP constraints directly in language processing, without expanding them into an intermediate "object grammar." However, Shieber's purported O(|G|² · n³) runtime bound underestimates the difficulty of ID/LP parsing. ID/LP parsing is actually NP-complete, and the worst-case runtime of…
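The source of the difficulty can be sketched with a toy example (my own illustration, not Shieber's algorithm): expanding one ID rule into an ordered "object grammar" enumerates up to k! orderings of its k daughters, with LP constraints pruning only some of them. The categories and LP rule below are hypothetical.

```python
# Sketch: expanding a single ID rule into the ordered rules of an
# "object grammar".  An ID rule with k unordered daughters yields up to
# k! ordered rules; LP precedence constraints prune some, but the blow-up
# is what makes parsing ID/LP constraints directly attractive.
from itertools import permutations

def expand_id_rule(daughters, lp_constraints):
    """All orderings of distinct `daughters` consistent with LP pairs
    (a, b), meaning 'a must precede b whenever both occur'."""
    valid = []
    for order in permutations(daughters):
        pos = {cat: i for i, cat in enumerate(order)}
        if all(pos[a] < pos[b] for a, b in lp_constraints
               if a in pos and b in pos):
            valid.append(order)
    return valid

# Hypothetical rule VP -> V, NP, PP (unordered) with the LP rule V < NP:
orders = expand_id_rule(('V', 'NP', 'PP'), [('V', 'NP')])
# 3! = 6 orderings; the LP rule leaves 3 of them.
```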


  • An important goal of computational linguistics has been to use linguistic theory to guide the construction of computationally efficient real-world natural language processing systems. At first glance, generalized phrase structure grammar (GPSG) appears to be a blessing on two counts. First, the precise formalisms of GPSG might be a direct and transparent guide for parser design and implementation. Second, since GPSG has weak context-free generative power and context-free languages can be parsed in O(n³) by a wide range of algorithms, GPSG parsers would appear to run in polynomial time.
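The O(n³) context-free bound mentioned above is achieved, for example, by the CYK algorithm. A minimal recognizer sketch for a grammar in Chomsky normal form (the toy grammar is invented for illustration, not a GPSG fragment):

```python
# Minimal CYK recognizer for a grammar in Chomsky normal form -- one of the
# classic O(n^3) context-free parsing algorithms (times a grammar-dependent
# factor).  Not an optimized implementation.

def cyk_recognize(words, terminal_rules, binary_rules, start='S'):
    """terminal_rules: {word: set of LHS}; binary_rules: {(B, C): set of LHS}."""
    n = len(words)
    # table[i][j] = set of nonterminals deriving words[i:j+1]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        table[i][i] = set(terminal_rules.get(w, ()))
    for span in range(2, n + 1):                 # O(n) span lengths
        for i in range(n - span + 1):            # O(n) start positions
            j = i + span - 1
            for k in range(i, j):                # O(n) split points
                for B in table[i][k]:
                    for C in table[k + 1][j]:
                        table[i][j] |= binary_rules.get((B, C), set())
    return start in table[0][n - 1]

# Toy grammar: S -> NP VP, VP -> V NP
trules = {'she': {'NP'}, 'him': {'NP'}, 'saw': {'V'}}
brules = {('NP', 'VP'): {'S'}, ('V', 'NP'): {'VP'}}
```

The three nested loops over span, start, and split point give the n³ factor; the inner nonterminal loops contribute the grammar-size factor.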


  • The Freeman chain code histogram and gradient feature extraction techniques perform better: the Freeman chain code approach is robust to small variations and easy to implement, while the gradient technique can easily be applied to grayscale images and is robust against image noise and edge-direction fluctuation. The SVM classifier outperforms the KNN and ANN classifiers in training complexity, flexibility, and classification accuracy, as given below.
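For illustration, a minimal sketch of the Freeman chain code and its histogram feature (my simplification, not the paper's exact pipeline; direction-numbering conventions vary):

```python
# Illustrative sketch: the 8-direction Freeman chain code of a contour and
# its normalized histogram feature vector.
# Directions: 0=E, 1=NE, 2=N, 3=NW, 4=W, 5=SW, 6=S, 7=SE (x right, y up).

DIRS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
        (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def freeman_chain_code(contour):
    """Chain code of consecutive 8-connected points (x, y)."""
    return [DIRS[(x1 - x0, y1 - y0)]
            for (x0, y0), (x1, y1) in zip(contour, contour[1:])]

def chain_code_histogram(code):
    """Normalized 8-bin histogram, invariant to the choice of start point."""
    hist = [0.0] * 8
    for d in code:
        hist[d] += 1
    total = len(code) or 1
    return [h / total for h in hist]

# A unit square traversed counter-clockwise:
square = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
code = freeman_chain_code(square)        # [0, 2, 4, 6]
```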


  • To overcome this challenge, this paper investigates jointly optimal routing, scheduling, channel assignment, and power control in multi-radio multi-channel wireless mesh networks using a MAC layer based on Time Division Multiple Access (TDMA). A discrete form of power control and a physical interference model are considered to find an optimal solution for given rate demands. Computational complexity is reduced by applying the column generation algorithm to the optimization problem.


  • With Internet applications spreading like wildfire, the field of software testing is increasingly challenged by the brave new networked world of e-business. This book brings you up to speed on the technologies, testing concepts, and tools you'll need to run e-business applications on the Web.


  • Nowadays it is difficult to imagine an area of knowledge that can continue developing without the use of computers and informatics. It is no different with biology, which has seen unpredictable growth in recent decades with the rise of a new discipline, bioinformatics, bringing together molecular biology, biotechnology, and information technology.


  • "The Weather Channel is making cloud computing a cornerstone in its architecture to support severe weather events like hurricanes and nor'easter blizzards. Business in the Cloud is a concise but informative insight into cloud computing, is a great tutorial to quickly educate yourself (without vendor biases) on the options and capabilities of cloud computing, and should be read by all business and IT leaders responsible for their organization's infrastructure." Dan Agronow, Chief Technology Officer, The Weather Channel Interactive, Inc.


  • Over the past few weeks, we've taken you on a guided tour of the intricacies of JSP, beginning with basics like conditional statements and loops, and quickly moving on to more complex things like form processing, session management, and error handling. But all good things must come to an end, and so, in this final episode of The JSP Files, we'll be briefly touching on a few other facets of this powerful server-side scripting language.


  • The publication of the Cooley-Tukey fast Fourier transform (FFT) algorithm in 1965 opened a new era in digital signal processing by reducing the order of complexity of crucial computational tasks such as the Fourier transform and convolution from N² to N log₂ N, where N is the problem size. The development of the major algorithms (Cooley-Tukey and split-radix FFT, the prime factor algorithm, and the Winograd fast Fourier transform) is reviewed. An attempt is then made to indicate the state of the art on the subject, showing the standing of research, open problems, and implementations.
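The N² to N log₂ N reduction can be sketched with a minimal radix-2 Cooley-Tukey FFT beside the naive DFT it replaces (an illustration only, not one of the optimized algorithms the survey reviews):

```python
# Minimal radix-2 Cooley-Tukey FFT (N a power of two), next to the naive
# O(N^2) DFT it improves on.
import cmath

def dft_naive(x):                      # O(N^2): N outputs, N terms each
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N)) for k in range(N)]

def fft(x):                            # O(N log N); N must be a power of 2
    N = len(x)
    if N == 1:
        return list(x)
    even = fft(x[0::2])                # recurse on even-indexed samples
    odd = fft(x[1::2])                 # ...and odd-indexed samples
    out = [0j] * N
    for k in range(N // 2):            # combine with twiddle factors
        t = cmath.exp(-2j * cmath.pi * k / N) * odd[k]
        out[k] = even[k] + t
        out[k + N // 2] = even[k] - t
    return out

x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0]
```

The recursion halves the problem at each of the log₂ N levels while doing O(N) work per level, which is exactly the complexity reduction described above.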


  • "Anything you can do in classical physics, we can do better in quantum physics." I heard that remark in Boulder, Colorado, a few years ago, when Dan Kleppner, a distinguished MIT quantum physicist, was giving a lecture to a group of scientists on the subject of quantum chaos.


  • Pursuant to a request by the Department of Defense, the National Research Council (NRC) convened a study committee under the auspices of the Computer Science and Telecommunications Board (CSTB) to assess the nature of the U.S. national investment in software research and, in particular, to consider ways to enhance the knowledge and human resource base needed to design, produce, and employ software-intensive systems for tomorrow's weapons and operations systems.


  • In real-world optimization applications, stakeholders impose multiple, complex constraints, which are difficult to satisfy and make it complicated to find satisfactory solutions. In most cases we face over-constrained optimization problems (no satisfactory solution can be found) because of the stakeholders' multiple requirements and the various complex constraints to be satisfied. Solving over-constrained problems is based on relaxing some constraints according to preference values, in order to favour the satisfaction of the most relevant…
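Preference-based relaxation can be sketched on a toy over-constrained instance (an invented example, solved by brute force rather than a real soft-constraint solver): each constraint carries a preference weight, and the chosen assignment maximizes the total weight of satisfied constraints, implicitly relaxing the least-preferred ones.

```python
# Toy sketch of preference-based relaxation for an over-constrained problem.
# Brute force over all assignments; real solvers use branch-and-bound or
# dedicated soft-constraint frameworks instead.
from itertools import product

def best_relaxed_assignment(domains, constraints):
    """domains: {var: iterable of values};
    constraints: list of (weight, predicate over the assignment dict)."""
    names = list(domains)
    best, best_score = None, -1.0
    for values in product(*(domains[v] for v in names)):
        assignment = dict(zip(names, values))
        score = sum(w for w, pred in constraints if pred(assignment))
        if score > best_score:
            best, best_score = assignment, score
    return best, best_score

# Over-constrained toy instance: the first and last constraints cannot both
# hold, so the weight-1 constraint is the one that gets relaxed.
domains = {'x': [0, 1, 2], 'y': [0, 1, 2]}
constraints = [
    (5.0, lambda s: s['x'] + s['y'] == 2),   # most relevant
    (3.0, lambda s: s['x'] >= 1),
    (1.0, lambda s: s['x'] + s['y'] >= 4),   # incompatible with the first
]
sol, score = best_relaxed_assignment(domains, constraints)
```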


  • We prove the Minimum Vertex Cover problem to be NP-hard to approximate to within a factor of 1.3606, extending on previous PCP and hardness-of-approximation techniques. To that end, one needs to develop a new proof framework, and to borrow and extend ideas from several fields. 1. Introduction. The basic purpose of computational complexity theory is to classify computational problems according to the amount of resources required to solve them. In particular, the most basic task is to classify computational problems into those that are efficiently solvable and those that are not.
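For contrast with the 1.3606 hardness bound, the classic factor-2 upper bound is easy to sketch (a textbook algorithm, not from this paper): take both endpoints of every edge of a greedily built maximal matching.

```python
# Textbook 2-approximation for Minimum Vertex Cover: the selected edges form
# a maximal matching; since matching edges are disjoint and any cover must
# pick at least one endpoint of each, the result is at most twice optimal.

def vertex_cover_2approx(edges):
    """Return a vertex cover of size at most twice the optimum."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.add(u)               # edge not yet covered:
            cover.add(v)               # take both endpoints
    return cover

# Star graph: the optimum cover is {0}; the approximation returns 2 vertices.
edges = [(0, 1), (0, 2), (0, 3)]
cover = vertex_cover_2approx(edges)
```

The hardness result says no polynomial-time algorithm can beat factor 1.3606 (unless P = NP), so this simple factor-2 guarantee is not far from the best possible.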


  • The book Introduction to Automata Theory, Languages and Computation: the authors have used chapters 1 through 8 for a senior-level course, omitting only the material on inherent ambiguity in chapter 4 and a portion of chapter 8. Chapters 7, 8, 12 and 13 form the nucleus of a course on computational complexity. An advanced course on language theory could be built around chapters 2 through 7, 9 through 11, and 14.


  • As computer networks become cheaper and more powerful, a new computing paradigm is poised to transform the practice of science and engineering. Driven by increasingly complex problems and propelled by increasingly powerful technology, today's science is as much based on computation, data analysis, and collaboration as on the efforts of individual experimentalists and theorists. But even as computer power, data storage, and communication continue to improve exponentially, computational resources are failing to keep up with what scientists demand of them.


  • Since the first production of tools at the beginning of human presence on Earth, human evolution is linked to the invention of new tools, usually combined with new environmental adaptations. The symbiosis of man with tools and environments represents one of the main factors in human evolutionary processes. It is evident how this coupling is based on the biophysics of our bodies and the development of the social memory system called culture.


  • ‘Project Management’ is an important topic because all organisations, large and small, are involved in implementing new undertakings as diverse as the development of a new product or service or a public relations campaign. To keep ahead of their competitors, every organisation faces the development of complex services and processes, which need cross-functional expertise. The justification for undertaking project management in any organisation lies at two levels, namely the macro and the micro levels.


