In every text, some words appear frequently and are considered keywords because they have a strong relationship with the subjects of the text; the frequencies of these words change over time within a given period. However, traditional text-processing methods and text-search techniques do not consider the importance of frequency change over time. As a result, traditional methods cannot correctly determine an index of a word's popularity in a given period.
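A time-sensitive popularity index along these lines can be sketched as follows. This is an illustrative measure (relative frequency per period, with later periods weighted more heavily), not the specific index proposed in the work above; the function name and weighting scheme are assumptions for the example.

```python
from collections import Counter

def popularity_index(docs, term, periods):
    """Relative frequency of `term` in each time period, plus a simple
    trend score that weights later periods more heavily, so that rising
    terms score higher than flat or declining ones.

    `docs` is a list of (period, tokens) pairs; `periods` is the ordered
    list of period labels.
    """
    freq = {p: 0 for p in periods}
    total = {p: 0 for p in periods}
    for period, tokens in docs:
        freq[period] += Counter(tokens)[term]
        total[period] += len(tokens)
    # Relative frequency of the term in each period.
    rel = [freq[p] / total[p] if total[p] else 0.0 for p in periods]
    # Linearly increasing weights favor terms whose frequency grows.
    n = len(periods)
    weights = [(i + 1) / n for i in range(n)]
    trend = sum(w * r for w, r in zip(weights, rel))
    return rel, trend
```

With this weighting, a term whose frequency rises across the period receives a higher trend score than one whose frequency declines, even when their average frequencies are equal.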
This book comprises the contributions of several authors in the area of polymers: characterization by atomic force microscopy of the polymer network structure formed in ferroelectric liquid crystal cells; polymerization of starch/acrylic acid/acrylamide by the microwave irradiation method; polymerization of olefins; emulsion polymerization; ring-opening polymerization; cationic polymerization of vinyl monomers; block and graft copolymerization by controlled/living polymerization; fabrication of doped microstructures by two-photon polymerization; rheology of biomaterials; plant cell wall ...
Computational fluid dynamics (CFD) is concerned with the efficient numerical solution of the partial differential equations that describe fluid flow. CFD techniques are commonly used in the many areas of engineering where fluid behavior is an important factor. Traditional fields of application include aerospace and automotive design, and more recently, bioengineering and consumer and medical electronics.
Bringing together the recent and relevant contributions of over 125 scientists from industry, government, and academia in North America and Western Europe, Alternative Toxicological Methods explores the development and validation of replacement, reduction, and refinement alternatives (the 3Rs) to animal testing. Internationally recognized scientists present what has been accomplished thus far in developing acceptable alternatives to traditional animal toxicological assessment and provide potentially new initiatives.
Each generation has its unique needs and aspirations. When Charles Wiley first
opened his small printing shop in lower Manhattan in 1807, it was a generation
of boundless potential searching for an identity. And we were there, helping to
define a new American literary tradition. Over half a century later, in the midst
of the Second Industrial Revolution, it was a generation focused on building
the future. Once again, we were there, supplying the critical scientific, technical,
and engineering knowledge that helped frame the world.
Dehydrogenases and reductases are enzymes of fundamental metabolic importance that often adopt a specific structure known as the Rossmann fold. This fold, consisting of a six-stranded β-sheet surrounded by α-helices, is responsible for coenzyme binding. We have developed a method to identify Rossmann folds and predict their coenzyme specificity (NAD, NADP or FAD) using only the amino acid sequence as input.
This second edition presents new chapters on (a) the utilization of mutants as high-resolution nanosensors of short-lived protein structures and protein nanophysics (Chap. 11) and (b) the recently developed method of evolutionary computer programming (Chap. 12). In the latter method, computer programs evolve themselves towards higher performance. In contrast to simple self-learning programs, the code of the evolved program differs significantly from that of the original "wild-type" program.
This commonly used method of procuring furniture and equipment requires a pool of qualified, interested bidders and clear, detailed specifications in order to achieve the desired result. Since the required paperwork is substantial, it calls for more effort by the bidding agency and sometimes discourages prospective bidders. The open-advertisement policy, typical of competitive bidding, can attract bidders who lack the experience, expertise, and financial stability required for this highly specialized type of project.
Traditional Active Learning (AL) techniques assume that the annotation of each datum costs the same. This is not the case when annotating sequences; some sequences will take longer than others. We show that the AL technique which performs best depends on how cost is measured. Applying an hourly cost model based on the results of an annotation user study, we approximate the amount of time necessary to annotate a given sentence. This model allows us to evaluate the effectiveness of AL sampling methods in terms of time spent in annotation. ...
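Cost-sensitive sample selection of the kind described above can be sketched as follows. The time model here is a hypothetical linear function of sentence length with placeholder coefficients, not the model fitted in the cited user study, and the benefit-per-cost criterion is one common way to combine uncertainty with annotation cost, shown only for illustration.

```python
import math

def estimated_seconds(num_tokens, base=5.0, per_token=2.0):
    # Hypothetical linear time model: annotation time grows with
    # sentence length. The coefficients are placeholders, not values
    # fitted from any actual annotation study.
    return base + per_token * num_tokens

def entropy(probs):
    # Shannon entropy of the model's label distribution (in nats):
    # higher entropy means the model is more uncertain.
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_by_benefit_per_cost(pool):
    """Pick the unlabeled sentence with the highest model uncertainty
    per estimated second of annotation time.

    `pool` is a list of (sentence_tokens, label_distribution) pairs,
    where the distribution comes from the current model.
    """
    def score(item):
        tokens, probs = item
        return entropy(probs) / estimated_seconds(len(tokens))
    return max(pool, key=score)
```

Under this criterion, a short sentence the model is unsure about can be preferred over a long sentence the model already labels confidently, which is exactly the trade-off a uniform per-datum cost assumption misses.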
The molecular analysis of human cancer is complicated by the difficulty in
obtaining pure populations of tumor cells to study. One traditional method of
obtaining a pure representation has been establishing cancer cell lines from
primary tumors. However, this technique is time consuming and of low yield.
Artifacts of cell culture include the selection of genetic alterations not present
in primary tumors (1,2) and the alteration of gene expression as compared to
primary tumors (3).
A generic sample normalization method applicable in relative comparison
of mRNAs quantified with real-time polymerase chain reaction (PCR) is
proposed. The method was applied to samples obtained from tomato seeds
after osmopriming and aging treatments and from untreated seeds at early
imbibition stage, when seeds had not completed germination.
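For context, the standard 2^-ΔΔCt (Livak) calculation for relative quantification of real-time PCR data, normalizing a target gene against a reference gene, can be sketched as follows. This is background only, not the generic normalization method proposed above, whose details are not reproduced here; it also assumes roughly 100% amplification efficiency.

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative expression of a target gene in a treated sample versus
    a control sample, each normalized against a reference gene, using
    the standard 2^-ddCt calculation. Ct values are PCR threshold
    cycles; lower Ct means more starting template.
    """
    # Normalize each sample against its reference gene.
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    # Assuming a doubling per cycle, convert the cycle difference
    # into a fold change in expression.
    return 2 ** -(d_ct_treated - d_ct_control)
```

For example, a treated sample whose target crosses threshold two cycles earlier than in the control, relative to the same reference gene, corresponds to a four-fold increase in expression.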
The computation of selectional preferences, the admissible argument values for a relation, is a well-known NLP task with broad applicability. We present LDA-SP, which utilizes LinkLDA (Erosheva et al., 2004) to model selectional preferences. By simultaneously inferring latent topics and topic distributions over relations, LDA-SP combines the benefits of previous approaches: like traditional class-based approaches, it produces human-interpretable classes describing each relation’s preferences, but it is competitive with non-class-based methods in predictive power. ...
Constructing an encoding of a concept lattice using short bit vectors allows for efficient computation of join operations on the lattice. Join is the central operation any unification-based parser must support. We extend the traditional bit vector encoding, which represents join failure using the zero vector, to count any vector with fewer than a fixed number of one bits as failure.
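The thresholded variant of the bit-vector join can be sketched as follows, assuming (as in the traditional encoding) that the join of two elements is computed by bitwise AND of their codes. The function name and the use of Python integers as bit vectors are choices made for this illustration.

```python
def join(x, y, min_bits=1):
    """Join of two lattice elements under a bit-vector encoding:
    bitwise AND of their codes. With min_bits=1 this is the
    traditional scheme, where only the all-zero vector signals
    failure; larger min_bits treats any result with fewer than
    that many one bits as failure, as in the extension above.
    Codes are plain Python ints used as bit vectors.
    """
    code = x & y
    if bin(code).count("1") < min_bits:
        return None  # join failure
    return code
```

Raising `min_bits` makes more AND results count as failure, which is the knob the extension adds on top of the zero-vector convention.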
It is traditionally assumed that various sources of linguistic knowledge and their interaction should be formalised in order to be able to convert words into their phonemic representations with reasonable accuracy. We show that using supervised learning techniques, based on a corpus of transcribed words, the same and even better performance can be achieved, without explicit modeling of linguistic knowledge. In this paper we present two instances of this approach.
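One common instantiation of corpus-based grapheme-to-phoneme conversion trains a classifier on fixed-width letter windows; the abstract does not specify which two instances the paper presents, so the feature extraction below is a generic sketch with assumed names, and it presumes a one-to-one letter/phoneme alignment that real data requires preprocessing to obtain.

```python
def letter_windows(word, phonemes, width=1):
    """Turn an aligned (word, phonemes) pair into per-letter training
    instances: each letter with `width` letters of context on either
    side, labeled with its phoneme. Boundary positions are padded
    with '#'. These (window, phoneme) pairs can then be fed to any
    supervised classifier.
    """
    assert len(word) == len(phonemes), "requires one phoneme per letter"
    padded = "#" * width + word + "#" * width
    instances = []
    for i, ph in enumerate(phonemes):
        window = padded[i:i + 2 * width + 1]
        instances.append((window, ph))
    return instances
```

Each letter thus becomes an independent classification instance, letting a standard learner recover pronunciation rules from transcribed words without any explicit linguistic modeling.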
To address the problem of evaluating computer-produced summaries, a number of automatic and manual methods have been proposed. Manual methods evaluate summaries accurately, because humans perform the evaluation, but are costly. Automatic methods, which use evaluation tools or programs, are low cost, although they cannot evaluate summaries as accurately as manual methods. In this paper, we investigate an automatic evaluation method that reduces the errors of traditional automatic methods by using several evaluation results obtained manually. ...
News stories are typically rich in NEs, and comparable news corpora can therefore be expected to contain NETEs (Klementiev and Roth, 2006; Tao et al., 2006). The large quantity and perpetual availability of news corpora in many of the world’s languages make mining of NETEs a viable alternative to traditional approaches. It is this opportunity that we address in our work. In this paper, we detail an effective and scalable mining method, called MINT (MIning Named-entity Transliteration equivalents), for mining NETEs from large comparable corpora. ...
Learning objectives for chapter 2: Explain the purpose and various phases of the systems development life cycle (SDLC); explain the differences between a model, a tool, a technique, and a methodology; describe the two overall approaches used to develop information systems: the traditional method and the object-oriented method,...
This chapter covers oral reporting and presentation techniques. After studying this chapter you will be able to understand: How the oral research presentation differs from and is similar to traditional public speaking, why historical rhetorical theory has practical influence on business presentation skills in the 21st century, how to plan for the research presentation,...
The method of electrical discharge machining (EDM), one of the processing methods based on non-traditional manufacturing procedures, is gaining increased popularity, since it does
not require cutting tools and allows machining involving hard, brittle, thin and complex geometry