Trends in Parsing Technology, 2010
Dependency Parsing, Domain Adaptation, and Deep Parsing

Text, Speech and Language Technology Series, Vol. 43

Editors: Harry Bunt, Paola Merlo, Joakim Nivre

Language: English

Approximate price: 105.49 €

In Print (Delivery period: 15 days).

Publication date: 2010
298 p. · 15.5x23.5 cm · Paperback
Computer parsing technology, which breaks down complex linguistic structures into their constituent parts, is a key research area in the automatic processing of human language. This volume collects contributions from leading researchers in the field of natural language processing, each of whom details recent work, including new techniques as well as results. The book presents an overview of the state of the art in current research on parsing technologies, focusing on three important themes: dependency parsing, domain adaptation, and deep parsing.

Parsing technology has a variety of practical uses and is especially concerned with the methods, tools, and software that can be used to parse text automatically. Applications include extracting information from free text or speech, question answering, speech recognition and comprehension, recommender systems, machine translation, and automatic summarization. New developments in parsing technology are thus widely applicable, and researchers and professionals from a number of fields will find the material here required reading.

Like the other four volumes on parsing technology in this series, this book has a breadth of coverage that makes it suitable both as an overview of the field for graduate students and as a reference for established researchers in computational linguistics, artificial intelligence, computer science, language engineering, information science, and cognitive science. It will also be of interest to designers, developers, and advanced users of natural language processing systems, including applications such as spoken dialogue, text mining, multimodal human-computer interaction, and semantic web technology.
Contents:
Current Trends in Parsing Technology, Paola Merlo, Harry Bunt and Joakim Nivre
Single Malt or Blended? A Study in Multilingual Parser Optimization, Johan Hall, Jens Nilsson and Joakim Nivre
A Latent Variable Model for Generative Dependency Parsing, Ivan Titov and James Henderson
Dependency Parsing and Domain Adaptation with Data-Driven LR Models and Parser Ensembles, Kenji Sagae and Jun’ichi Tsujii
Dependency Parsing Using Global Features, Tetsuji Nakagawa
Dependency Parsing with Second-Order Feature Maps and Annotated Semantic Information, Massimiliano Ciaramita and Giuseppe Attardi
Strictly Lexicalised Dependency Parsing, Qin Iris Wang, Dale Schuurmans and Dekang Lin
Favor Short Dependencies: Parsing with Soft and Hard Constraints on Dependency Length, Jason Eisner and Noah A. Smith
Corrective Dependency Parsing, Keith Hall and Václav Novák
Inducing Lexicalised PCFGs with Latent Heads, Detlef Prescher
Self-Trained Bilexical Preferences to Improve Disambiguation Accuracy, Gertjan van Noord
Are Very Large Context-Free Grammars Tractable?, Pierre Boullier and Benoît Sagot
Efficiency in Unification-Based N-Best Parsing, Yi Zhang, Stephan Oepen and John Carroll
HPSG Parsing with a Supertagger, Takashi Ninomiya, Takuya Matsuzaki, Yusuke Miyao, Yoshimasa Tsuruoka and Jun’ichi Tsujii
Evaluating the Impact of Re-training a Lexical Disambiguation Model on Domain Adaptation of an HPSG Parser, Tadayoshi Hara, Yusuke Miyao and Jun’ichi Tsujii
Semi-supervised Training of a Statistical Parser from Unlabeled Partially-bracketed Data, Rebecca Watson, Ted Briscoe and John Carroll
Index

Collects contributions from many of today’s leading researchers in the area of natural language processing technology

Describes the contributors’ most recent work and a range of new techniques and results

Presents a state-of-the-art overview of current research in parsing technologies with a focus on three important themes in the field today: dependency parsing, domain adaptation, and deep parsing