
Syntactic Networks—Kernel Memory Approach

  • Book
  • © 2024

Overview

  • Focuses on providing a framework for modeling a composite network system
  • Proposes a novel connectionist approach to the challenging topic of language modeling
  • Presents a conceptual framework of kernel memory, rather than pursuing individual topics one by one

Part of the book series: Studies in Computational Intelligence (SCI, volume 1157)


About this book

This book proposes a novel connectionist approach to the challenging topic of language modeling, within the context of kernel memory and the artificial mind system, both introduced by the author in the first volume of the series, Artificial Mind System—Kernel Memory Approach (Studies in Computational Intelligence, Vol. 1). The present volume focuses on how the syntactic structures of language are modeled by composite connectionist architectures, each embracing both a nonsymbolic and a symbolic part. These two parts are developed via inter-module processes within the artificial mind system and are eventually integrated under a unified framework of kernel memory.

The data representation within networks built on the kernel memory principle is essentially local, unlike that of conventional artificial neural network models such as the pervasive multilayer perceptron. Owing to this locality principle, kernel memory inherently exhibits many attractive features: topologically unconstrained network formation; straightforward network growing, shrinking, and reconfiguration; no need for arduous iterative parameter tuning; construction of transparent, hierarchical data structures; and multimodal and temporal data processing via the network representation.

Exploiting these multifaceted properties of kernel memory, interwoven with the notion of inter-module processing within the artificial mind system, provides coherent accounts of concept formation and of how various linguistic phenomena, viz. word compounding, morphology, and multiword constructions, are modeled. The description is then extended to more intricate network models of a context-dependent lexical network and of syntax-oriented processing, the latter being the central theme of the present study, and further to models representing a hybrid of nonverbal and verbal thinking as well as the semantic and pragmatic aspects of sentential meaning.

The book is intended for general readers engaged in various areas of study in cognitive science, computer science, engineering, linguistics, philosophy, psycholinguistics, and psychology.
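To illustrate the locality principle described above, the sketch below shows one possible reading of a kernel-memory-style network: each kernel unit stores a template pattern locally and responds through a Gaussian kernel, so the network grows simply by allocating new units rather than by iteratively retuning weights. This is a minimal illustrative sketch, not the author's formulation; the label association, the novelty threshold, and the winner-take-all recall used here are assumptions made for the example.

```python
import numpy as np

class KernelUnit:
    """A single kernel unit: locally stores a template vector and an associated label."""
    def __init__(self, template, label, width=1.0):
        self.template = np.asarray(template, dtype=float)
        self.label = label
        self.width = width

    def activation(self, x):
        # Gaussian (RBF) response: largest when x matches the stored template.
        d = np.linalg.norm(np.asarray(x, dtype=float) - self.template)
        return np.exp(-(d ** 2) / (2.0 * self.width ** 2))

class KernelMemory:
    """Illustrative locally represented network: grows by adding kernel units,
    with no iterative parameter tuning (assumed growth rule for illustration)."""
    def __init__(self, novelty_threshold=0.5):
        self.units = []
        self.novelty_threshold = novelty_threshold

    def present(self, x, label):
        # If no existing unit responds strongly enough, grow the network
        # by allocating a new kernel unit for this pattern.
        responses = [u.activation(x) for u in self.units]
        if not responses or max(responses) < self.novelty_threshold:
            self.units.append(KernelUnit(x, label))

    def recall(self, x):
        # Winner-take-all recall: return the label of the best-matching unit.
        if not self.units:
            return None
        best = max(self.units, key=lambda u: u.activation(x))
        return best.label

# Example usage
memory = KernelMemory()
memory.present([0.0, 0.0], "A")
memory.present([1.0, 1.0], "B")
print(memory.recall([0.1, -0.1]))  # -> "A"
```

Because each pattern is held by its own unit, shrinking or reconfiguring the network amounts to removing or relinking units, which is one way to picture the transparent, hierarchical structures the book attributes to kernel memory.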

Keywords

Table of contents (8 chapters)

Authors and Affiliations

  • Dept. Computer Eng., College of Science and Technology, Nihon University, Funabashi-City, Japan

    Tetsuya Hoya

Bibliographic Information
