Where Is Language

Where is Language?

Language is central to human experience and our understanding of who we are, whether written or unwritten, sung or spoken. But what is language, and how do we record it? Where does it reside? Does it exist and evolve within written sources, in performance, in the mind, or in speech? For too long, ethnographic, aesthetic and sociolinguistic studies of language have remained apart from analyses emerging from traditions such as literature and performance. Where is Language? argues for a more complex and contextualized understanding of language across this range of disciplines, engaging with key issues including orality, literacy, narrative, ideology, performance and the human communities in which these take place. Eminent anthropologist Ruth Finnegan draws together a lifetime of ethnographic case studies, reading and personal commentary to explore the roles and nature of language in cultures across the world, from West Africa to the South Pacific. Combining research and reflection, Finnegan discusses the multimodality of language to provide an account not simply of vocabulary and grammar but one that questions the importance of cultural settings and the essence of human communication itself.
Language and Automata Theory and Applications

This book constitutes the refereed proceedings of the 7th International Conference on Language and Automata Theory and Applications, LATA 2013, held in Bilbao, Spain, in April 2013. The 45 revised full papers presented together with 5 invited talks were carefully reviewed and selected from 97 initial submissions. The volume features contributions from both classical theory fields and application areas (bioinformatics, systems biology, language technology, artificial intelligence, etc.). Among the topics covered are algebraic language theory; algorithms for semi-structured data mining; algorithms on automata and words; automata and logic; automata for system analysis and program verification; automata, concurrency and Petri nets; automatic structures; cellular automata; combinatorics on words; computability; computational complexity; computational linguistics; data and image compression; decidability questions on words and languages; descriptional complexity; DNA and other models of bio-inspired computing; document engineering; foundations of finite state technology; foundations of XML; fuzzy and rough languages; grammars (Chomsky hierarchy, contextual, multidimensional, unification, categorial, etc.); grammars and automata architectures; grammatical inference and algorithmic learning; graphs and graph transformation; language varieties and semigroups; language-based cryptography; language-theoretic foundations of artificial intelligence and artificial life; parallel and regulated rewriting; parsing; pattern recognition; patterns and codes; power series; quantum, chemical and optical computing; semantics; string and combinatorial issues in computational biology and bioinformatics; string processing algorithms; symbolic dynamics; symbolic neural networks; term rewriting; transducers; trees, tree languages and tree automata; and weighted automata.
Algebraic Structures in Natural Language

Algebraic Structures in Natural Language addresses a central problem in cognitive science concerning the learning procedures through which humans acquire and represent natural language. Until recently, algebraic systems dominated the study of natural language in formal and computational linguistics, AI, and the psychology of language, with linguistic knowledge seen as encoded in formal grammars, model theories, proof theories and other rule-driven devices. Recent work on deep learning has produced an increasingly powerful set of general learning mechanisms that do not rely on rule-based algebraic models of representation. The success of deep learning in NLP has led some researchers to question the role of algebraic models in the study of human language acquisition and linguistic representation. Psychologists and cognitive scientists have also been exploring explanations of language evolution and language acquisition that rely on probabilistic methods, social interaction and information theory, rather than on formal models of grammar induction. This book addresses the learning procedures through which humans acquire natural language, and the way in which they represent its properties. It brings together leading researchers from computational linguistics, psychology, behavioral science and mathematical linguistics to consider the significance of non-algebraic methods for the study of natural language. The text represents a wide spectrum of views, from the claim that algebraic systems are largely irrelevant to the contrary position that non-algebraic learning methods are engineering devices for efficiently identifying the patterns that the underlying grammars and semantic models generate for natural language input. Interesting and important perspectives fall at intermediate points between these opposing approaches, and they may combine elements of both.
It will appeal to researchers and advanced students in each of these fields, as well as to anyone who wants to learn more about the relationship between computational models and natural language.