Showing results 1-20 of 239 for Data description
  • The UML can be used to describe the complete development of relational and object-relational databases, from business requirements through the physical data model. However, modeling at the physical level must express a detailed description of the database. This is done using Rational's Data Modeling Profile for the UML.

    pdf11p trannhu 08-08-2009 240 89

  • This Solution Reference Network Design (SRND) provides a description of the design issues related to optimizing server and application environments. Server farms are built to provide a highly available, scalable infrastructure on which applications run. As enterprises grow, so do their application needs, which means that more and more expenditure is allotted to additional application-specific software and server requirements. This increase in bottom-line expenses heightens sensitivity to the overall return on investment.

    pdf94p kieuoanh 11-08-2009 177 55

  • This compendium aims at providing a comprehensive overview of the main topics that appear in any well-structured course sequence in statistics for business and economics at the undergraduate and MBA levels. The idea is to supplement either formal or informal statistics textbooks, such as “Basic Statistical Ideas for Managers” by D.K. Hildebrand and R.L. Ott and “The Practice of Business Statistics: Using Data for Decisions” by D.S. Moore, G.P. McCabe, W.M. Duckworth and S.L. Sclove, with a summary of theory as well as a couple of extra examples.

    pdf0p sofia11 15-05-2012 83 36

  • The EViews workfile is defined as a file in EViews which provides many convenient visual ways to (i) enter and save data sets, (ii) create new series or variables from existing ones, (iii) display and print series, and (iv) carry out and save the results of statistical analysis, as well as each equation of the models applied in the analysis. Using EViews, each statistical model applied previously can be recalled and modified easily and quickly to obtain the best-fit model, based on personal judgment in an interactive process.

    pdf635p namde02 08-03-2013 51 29

  • Java and XML Data Binding, Brett McLaughlin. Publisher: O'Reilly, first edition May 2002, ISBN 0-596-00278-5, 214 pages. This title provides an in-depth technical look at XML data binding. The book offers complete documentation of all features in both the Sun Microsystems JAXB API and popular open-source alternative implementations (Enhydra Zeus, Exolab's Castor and Quick).

    pdf200p nuoiheocuoivo 04-05-2010 89 18

  • With some 200 million hosts generating traffic on the Internet, TCP/IP (transmission control protocol/Internet protocol) has become the protocol suite of choice to support the exchange of messages in commercial operations and residential activities. This hands-on resource provides professionals with a comprehensive picture of the Internet protocol stack and the role of TCP/IP in data communication. It serves as a detailed guide to the protocols, networks, codes, signals, and equipment that make it possible to communicate using TCP/IP.

    pdf275p ken333 06-07-2012 59 18
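The TCP/IP exchange of messages described in the entry above can be exercised directly from a standard socket API. A minimal sketch in Python (the loopback address and the message are arbitrary illustration values, not anything from the book):

```python
import socket
import threading

def echo_server(sock):
    # Accept one connection and echo back whatever the client sends.
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

# Bind a TCP listening socket to an ephemeral port on the loopback interface.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=echo_server, args=(server,))
t.start()

# TCP client: connect, send a message, read the echo.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello tcp")
    reply = client.recv(1024)

t.join()
server.close()
print(reply.decode())
```

The three-way handshake, segmentation, and acknowledgements the book describes all happen beneath this API; the application only sees a reliable byte stream.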

  • Having opened SPSS, you will get a dialogue box, which you can cancel the first time you enter SPSS. Enlarge the window. SPSS is like a spreadsheet, but it does not update calculations, tables or charts if you change the data. At the top of the screen is a series of menus which can be used to instruct SPSS to do something. SPSS uses two windows: the Data Editor, which is what you are looking at and which has two tabs at the bottom, and the Viewer. The Viewer is not visible yet, but opens automatically as soon as you open a file or run...

    pdf0p sofia11 15-05-2012 52 17

  • The progress of data mining technology and its broad public popularity establish a need for a comprehensive text on the subject. The series of books entitled "Data Mining" addresses this need by presenting in-depth descriptions of novel mining algorithms and many useful applications. In addition to helping the reader understand each section deeply, the two books present useful hints and strategies for solving problems in the following chapters.

    pdf0p cucdai_1 20-10-2012 44 13

  • This compendium aims at providing a comprehensive overview of the main topics that appear in any well-structured course sequence in statistics for business and economics at the undergraduate and MBA levels. The idea is to supplement either formal or informal statistics textbooks, such as “Basic Statistical Ideas for Managers” by D.K. Hildebrand and R.L. Ott and “The Practice of Business Statistics: Using Data for Decisions” by D.S. Moore, G.P. McCabe, W.M. Duckworth and S.L. Sclove, with a summary of theory as well as a couple of extra examples.

    pdf150p sn_buon 29-11-2012 51 12

  • SAS defines data mining as the process of uncovering hidden patterns in large amounts of data. Many industries use data mining to address business problems and opportunities such as fraud detection, risk and affinity analyses, database marketing, householding, customer churn, bankruptcy prediction, and portfolio analysis. The SAS data mining process is summarized in the acronym SEMMA, which stands for sampling, exploring, modifying, modeling, and assessing data.

    pdf153p transang3 30-09-2012 46 6
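The SEMMA stages named in the entry above can be sketched as a toy pipeline. The data set, the one-rule "model", and the fraud-detection framing are invented stand-ins for illustration — this is not SAS Enterprise Miner code:

```python
import random
import statistics

random.seed(0)

# Toy transaction data: (amount, is_fraud) records; fraud amounts run higher.
population = [(random.gauss(100, 30), 0) for _ in range(900)] + \
             [(random.gauss(300, 50), 1) for _ in range(100)]

# Sample: draw a manageable subset of the full data.
sample = random.sample(population, 200)

# Explore: summary statistics of the sampled amounts.
amounts = [a for a, _ in sample]
mean, stdev = statistics.mean(amounts), statistics.stdev(amounts)

# Modify: standardize the amount feature.
standardized = [((a - mean) / stdev, y) for a, y in sample]

# Model: a one-rule classifier — flag records more than 2 SDs above the mean.
def predict(z):
    return 1 if z > 2.0 else 0

# Assess: accuracy of the rule on the sample.
accuracy = sum(predict(z) == y for z, y in standardized) / len(standardized)
print(f"accuracy = {accuracy:.2f}")
```

Each comment marks one SEMMA stage; in practice each stage is far richer (stratified sampling, visual exploration, feature engineering, multiple competing models, held-out assessment).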

  • The sub-series Ternary Alloy Systems of the Landolt-Börnstein New Series provides reliable and comprehensive descriptions of the materials constitution, based on critical intellectual evaluations of all data available at the time; the different findings are critically weighed, also with respect to their compatibility with today's accepted binary phase diagrams. Selected are ternary systems of importance to alloy development and systems which have gained scientific interest in recent years.

    pdf722p tom_123 14-11-2012 25 3

  • Built-In Database Objects. In addition to the database files, several other structures are created: the data dictionary, which contains descriptions of the objects in the database; dynamic performance tables, which contain information used by the database administrator (DBA) to monitor and tune the database and instance; and PL/SQL packages, program units adding functionality to the database. These packages are created when the catproc.sql script is run after the CREATE DATABASE command. PL/SQL packages will not be discussed within the scope of this course.

    ppt20p trinh02 28-01-2013 20 3

  • To learn the white space availability at a given location, the database could use one of two schemes: use spectrum measurements for that location, or compute spectrum availability using RF propagation models. The former, a data-driven approach, requires extensive wardriving measurements at low sensitivity thresholds and may take a long time to complete. Furthermore, the measurements will have to be repeated whenever the primary user's transmission characteristics, such as transmit power, antenna height, license terms, etc., change.

    pdf12p yasuyidol 02-04-2013 16 3
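The model-based alternative mentioned in the entry above is typically built on a path-loss model. A minimal sketch using the free-space path loss formula (distances in km, frequencies in MHz); the transmit power, frequency, and -114 dBm threshold below are illustrative assumptions — real white-space databases use terrain-aware propagation models, not free space:

```python
import math

def free_space_path_loss_db(distance_km, freq_mhz):
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

def received_power_dbm(tx_power_dbm, distance_km, freq_mhz):
    # Received power = transmit power minus path loss (isotropic antennas).
    return tx_power_dbm - free_space_path_loss_db(distance_km, freq_mhz)

def channel_available(tx_power_dbm, distance_km, freq_mhz, threshold_dbm=-114.0):
    # Treat the channel as free if the primary signal falls below the
    # detection threshold (-114 dBm is a commonly cited TV white-space value).
    return received_power_dbm(tx_power_dbm, distance_km, freq_mhz) < threshold_dbm

# A 600 MHz transmitter at 30 dBm, evaluated near and far.
print(channel_available(30.0, 1.0, 600.0))     # near the transmitter: occupied
print(channel_available(30.0, 1000.0, 600.0))  # far away: available
```

This also shows why the model-based scheme must be recomputed when the primary's transmit power or antenna parameters change: they enter the availability decision directly.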

  • International education has a considerable economic presence in Australia. This is most clearly seen in a comparison of export income data. For 2009, the ABS calculates that service exports attributable to education-related travel services were equal to around $18 billion. This includes all expenditure attributable to international students in Australia, be it fees or living expenses.

    pdf0p trinhcaidat 19-04-2013 24 3

  • The problem addressed in this paper is to segment a given multilingual document into segments for each language and then identify the language of each segment. The problem was motivated by an attempt to collect a large amount of linguistic data for non-major languages from the web.

    pdf10p nghetay_1 07-04-2013 15 2
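A crude version of the segmentation task in the entry above can be sketched by splitting text into runs that share a Unicode script family. This is a toy proxy (script is not language, and the three labels below are my own), not the paper's method:

```python
import unicodedata

def script_of(ch):
    # Map each letter to a coarse script label; non-letters return None
    # so that spaces and punctuation attach to the current run.
    if not ch.isalpha():
        return None
    name = unicodedata.name(ch, "")
    if name.startswith(("CJK", "HIRAGANA", "KATAKANA")):
        return "cjk"
    if name.startswith("CYRILLIC"):
        return "cyrillic"
    return "latin"

def segment(text):
    """Split text into maximal runs sharing one script label."""
    segments, current, label = [], "", None
    for ch in text:
        s = script_of(ch)
        if s is not None and label is not None and s != label:
            segments.append((label, current))  # close the previous run
            current = ""
        if s is not None:
            label = s
        current += ch
    if current:
        segments.append((label, current))
    return segments

print(segment("Hello мир 世界"))
```

Distinguishing languages that share a script (the hard case motivating the paper) requires statistical models such as character n-grams rather than this lookup.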

  • We have constructed a corpus of news articles in which events are annotated for estimated bounds on their duration. Here we describe a method for measuring inter-annotator agreement for these event duration distributions. We then show that machine learning techniques applied to this data yield coarse-grained event duration information, considerably outperforming a baseline and approaching human performance.

    pdf8p hongvang_1 16-04-2013 11 2

  • We address the problem of clustering words (or constructing a thesaurus) based on co-occurrence data, and using the acquired word classes to improve the accuracy of syntactic disambiguation. We view this problem as that of estimating a joint probability distribution specifying the joint probabilities of word pairs, such as noun-verb pairs. We propose an efficient algorithm based on the Minimum Description Length (MDL) principle for estimating such a probability distribution. Our method is a natural extension of those proposed in (Brown et al.

    pdf7p bunrieu_1 18-04-2013 10 2
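The MDL principle invoked in the entry above trades model complexity against fit: prefer the clustering minimizing model bits plus data bits. A toy two-part code-length computation over invented noun-verb counts (the coding scheme here is a crude illustration, not the paper's estimator):

```python
import math
from collections import Counter

# Toy noun-verb co-occurrence data.
pairs = (
    [("wine", "drink")] * 8 + [("beer", "drink")] * 7 +
    [("bread", "eat")] * 9 + [("rice", "eat")] * 6
)

def description_length(partition, pairs):
    """Two-part MDL score: model bits + data bits under a class-based model.

    partition maps each noun to a class id; verbs are left unclustered.
    """
    n = len(pairs)
    counts = Counter((partition[w], v) for w, v in pairs)
    class_size = Counter(partition.values())
    # Data cost: -log2 likelihood, with P(w, v) = P(class, v) * P(w | class)
    # and each class splitting its mass uniformly over member nouns.
    data_bits = 0.0
    for w, v in pairs:
        c = partition[w]
        p = (counts[(c, v)] / n) / class_size[c]
        data_bits += -math.log2(p)
    # Model cost: a crude penalty of log2(n)/2 bits per free parameter.
    model_bits = (len(counts) - 1) * math.log2(n) / 2
    return model_bits + data_bits

# Drinkables vs. eatables, against one class per word.
clustered = {"wine": 0, "beer": 0, "bread": 1, "rice": 1}
flat = {"wine": 0, "beer": 1, "bread": 2, "rice": 3}

dl_clustered = description_length(clustered, pairs)
dl_flat = description_length(flat, pairs)
print(dl_clustered < dl_flat)  # the merged classes win
```

The flat partition fits the counts slightly better, but its extra parameters cost more bits than they save, so MDL prefers the merged classes — the trade-off the abstract describes.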

  • Most algorithms dedicated to the generation of referential descriptions suffer from a fundamental problem: they make too strong assumptions about adjacent processing components, resulting in limited coordination with their perceptual and linguistic data, that is, the provider of object descriptors and the lexical expression by which the chosen descriptors are ultimately realized.

    pdf8p bunthai_1 06-05-2013 10 2

  • Acquiring information systems specifications from natural language description is presented as a problem class that requires a different treatment of semantics when compared with other applied NL systems such as database and operating system interfaces. Within this problem class, the specific task of obtaining explicit conceptual data models from natural language text or dialogue is being investigated. The knowledge brought to bear on this task is classified into syntactic, semantic and systems analysis knowledge.

    pdf8p buncha_1 08-05-2013 24 2

  • In the work of Baader and Distel, a method has been proposed to axiomatize all general concept inclusions (GCIs) expressible in the description logic ℰℒK and valid in a given interpretation ℐ. This provides us with an effective method to learn ℰℒK-ontologies from interpretations. In this work, we want to extend this approach in the direction of handling errors, which might be present in the data-set.

    pdf16p namdmcist 08-07-2015 9 2
