Recent Advances in Parsing Technology
Text, Speech and Language Technology, Volume 1


International Edition


About the Book

In Marcus (1980), deterministic parsers were introduced. These are parsers which satisfy the conditions of Marcus's determinism hypothesis, i.e., they are strongly deterministic in the sense that they do not simulate nondeterminism in any way. In later work (Marcus et al. 1983) these parsers were modified to construct descriptions of trees rather than the trees themselves. The resulting D-theory parsers, by working with these descriptions, are capable of capturing a certain amount of ambiguity in the structures they build. In this context, it is not clear what it means for a parser to meet the conditions of the determinism hypothesis. The object of this work is to clarify this and other issues pertaining to D-theory parsers and to provide a framework within which these issues can be examined formally. Thus we have a very narrow scope. We make no arguments about the linguistic issues D-theory parsers are meant to address, their relation to other parsing formalisms or the notion of determinism in general. Rather we focus on issues internal to D-theory parsers themselves.
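The central idea behind the tree descriptions mentioned above is that a D-theory parser asserts dominance relations between nodes rather than committing to a single concrete tree. The sketch below (Python; the node names, the edge encoding, and the helper functions are illustrative assumptions, not anything taken from the book) shows how one set of dominance statements can be satisfied by more than one concrete tree, which is how such a parser can leave, say, a PP attachment open without simulating nondeterminism.

    # Minimal illustrative sketch (hypothetical encoding, not from the book):
    # a D-theory style description asserts dominance statements that several
    # concrete trees may satisfy, so one analysis covers several attachments.

    def dominance_closure(edges):
        """Reflexive-transitive closure of (parent, child) edges, i.e. the
        dominance relation of a concrete tree."""
        closure = set(edges)
        changed = True
        while changed:
            changed = False
            for (a, b) in list(closure):
                for (c, d) in list(closure):
                    if b == c and (a, d) not in closure:
                        closure.add((a, d))
                        changed = True
        nodes = {n for edge in edges for n in edge}
        return closure | {(n, n) for n in nodes}

    def satisfies(tree_edges, description):
        """A concrete tree satisfies a description if every asserted
        dominance statement holds in the tree's dominance relation."""
        return description <= dominance_closure(tree_edges)

    # Description built for "... saw the man with the telescope": the PP is
    # only required to be dominated by VP, with no commitment to attachment
    # at the VP versus inside the object NP.
    description = {
        ("S", "NP_subj"), ("S", "VP"),
        ("VP", "V"), ("VP", "NP_obj"), ("VP", "PP"),
    }

    # Two concrete trees, one per attachment; both satisfy the description.
    vp_attachment = {("S", "NP_subj"), ("S", "VP"),
                     ("VP", "V"), ("VP", "NP_obj"), ("VP", "PP")}
    np_attachment = {("S", "NP_subj"), ("S", "VP"),
                     ("VP", "V"), ("VP", "NP_obj"), ("NP_obj", "PP")}

    print(satisfies(vp_attachment, description))  # True
    print(satisfies(np_attachment, description))  # True

The point of the sketch is only that a description underdetermines the tree; what it means for a parser that manipulates such descriptions to remain deterministic is exactly the question the work formalizes.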

Table of Contents:
1 Parsing Technologies, and Why We Need Them: 1.1 Parsing Technologies; 1.2 About this book
2 Fully Incremental Parsing: 2.1 Introduction; 2.2 The Change-Update Loop; 2.3 Reparsing; 2.4 The Affected and Regenerated Sets; 2.5 The Disturbance Set; 2.6 Primitive Changes; 2.7 Formal Properties; 2.8 Implementation; 2.9 Conclusions
3 Increasing the Applicability of LR Parsing: 3.1 Introduction; 3.2 Hidden left recursion and LR parsing; 3.3 Correctness of ε-LR parsing; 3.4 Calculation of items; 3.5 Memory requirements; 3.6 Conclusions
4 Towards a Formal Understanding of the Determinism Hypothesis in D-Theory: 4.1 Introduction; 4.2 D-Theory and the Determinism Hypothesis; 4.3 Formalization; 4.4 Consequences for the D-Theory Parsers; 4.5 Conclusion
5 Varieties of Heuristics in Sentence Parsing: 5.1 Kinds of grammar and their characteristics; 5.2 Varieties of heuristic components in parsing sentences; 5.3 Staged design for parsing sentences; 5.4 Conclusion
6 Parsing as Dynamic Interpretation of Feature Structures: 6.1 Introduction: the status of feature structures; 6.2 Feature structures as formal-language expressions; 6.3 Feature structures in the GEL language; 6.4 Parsing as model-theoretic interpretation; 6.5 Conclusions and future work
7 Proof Theory for HPSG Parsing: 7.1 Introduction; 7.2 An Overview of HPSG; 7.3 Binary-branching HPSG; 7.4 Deduction for HPSG; 7.5 Implementation; 7.6 Lexical entries; 7.7 Performance; 7.8 Conclusions
8 Efficient Parsing of Compiled Typed Attribute-Value Logic Grammars: 8.1 Compiling Type Definitions; 8.2 Compiling Basic Operations; 8.3 Compiling Descriptions; 8.4 Compiling Grammars and Programs; 8.5 Extensional Types; 8.6 Inequations; 8.7 Type Constraints; 8.8 Conclusion
9 Predictive Head-Corner Chart Parsing: 9.1 Introduction; 9.2 Chart parsing; 9.3 Left-corner chart parsing; 9.4 Head-Corner chart parsing; 9.5 Complexity analysis and further optimizations; 9.6 Extension with feature structures; 9.7 Related approaches; 9.8 Conclusions
10 GLR* — An Efficient Noise-Skipping Parsing Algorithm for Context-Free Grammars: 10.1 Introduction; 10.2 The GLR* Parsing Algorithm; 10.3 The Beam Search Heuristic; 10.4 Parsing of Spontaneous Speech Using GLR*; 10.5 Conclusions and Future Research Directions
11 Evaluation of the Tagged Text Parser, A Preliminary Report: 11.1 Overview; 11.2 Introduction to Tagged Text Parser; 11.3 The TTP Time-out Mechanism; 11.4 Parser Evaluation with Parseval; 11.5 Evaluation of TTP; 11.6 Conclusions
12 Learning to Parse with Transformations: 12.1 Introduction; 12.2 Transformation-based, Error-driven Learning; 12.3 Learning Phrase Structure; 12.4 Results; 12.5 Sample Output; 12.6 Assigning Nonterminal Labels; 12.7 Conclusions
13 Estimation of Verb Subcategorization Frame Frequencies based on Syntactic and Multidimensional Statistical Analysis: 13.1 Introduction; 13.2 Method; 13.3 Experiment on Wall Street Journal Corpus; 13.4 Statistical Analysis; 13.5 Conclusion and Future Direction
14 Monte Carlo Parsing: 14.1 Introduction; 14.2 The Data-Oriented Parsing Framework; 14.3 DOP as a Stochastic Tree-Substitution Grammar; 14.4 Parsing and Disambiguation in DOP; 14.5 Experiments with DOP; 14.6 Conclusions
15 Stochastic Lexicalized Tree-Insertion Grammar: 15.1 Motivation; 15.2 LTIG; 15.3 Stochastic LTIG; 15.4 Parsing SLTIG; 15.5 Training an SLTIG; 15.6 Conclusion
16 The Interplay of Syntactic and Semantic Node Labels in Parsing: 16.1 State of the Art Demands; 16.2 Partial Parsing; 16.3 Semantic Grammars; 16.4 Parsing to Objects; 16.5 Defining Parsing Rules Schematically; 16.6 A Parsing Example; 16.7 Form Rules; 16.8 Concluding Remarks
17 Integration of Morphological and Syntactic Analysis based on GLR Parsing: 17.1 Introduction; 17.2 Morphological Analysis of Japanese; 17.3 Generating An LR Table; 17.4 Algorithm; 17.5 A worked example; 17.6 Conclusion
18 Structural Disambiguation in Japanese by Case Structure Evaluation based on Examples in a Case Frame Dictionary: 18.1 Introduction; 18.2 Selecting a Proper Case Frame; 18.3 Structural Disambiguation using Case Structure Score; 18.4 Experiment; 18.5 Conclusions
19 Flowgraph Parsing: 19.1 Introduction and Motivation; 19.2 Notation and Definitions; 19.3 Chart Parsing of Context-free Flowgraphs; 19.4 Complexity Analysis; 19.5 Dealing With Attributes; 19.6 Dealing With Tie-Point Relationships; 19.7 Chart Parsing of Structure-Sharing Flowgraphs; 19.8 Conclusions
20 Predictive Parsing for Unordered Relational Languages: 20.1 Introduction; 20.2 Atomic Relational Grammars; 20.3 Parsing Preliminaries; 20.4 Preliminaries to a Parser; 20.5 Earley-style Recognition for ARGs; 20.6 Parse trace; 20.7 Conclusion




Product Details
  • ISBN-13: 9781402003714
  • Publisher: Kluwer Academic Publishers
  • Publisher Imprint: Kluwer Academic Publishers
  • Height: 235 mm
  • No of Pages: 432
  • Returnable: Y
  • Width: 155 mm
  • ISBN-10: 1402003714
  • Publisher Date: 30 Nov 2001
  • Binding: Paperback
  • Language: English
  • Series Title: Text, Speech and Language Technology (Volume 1)

