Analogue Imprecision in MLP Training (Progress in Neural Processing v. 4)

By: A.F. Murray (editor), Peter J. Edwards (editor)

Availability: 1-2 weeks

£71.00 with FREE Saver Delivery


Hardware inaccuracy and imprecision are important considerations when implementing neural algorithms. This book presents a study of synaptic weight noise as a typical fault model for analogue VLSI realisations of MLP neural networks and examines its implications for learning and network performance. Its aim is to show how including an imprecision model in the learning scheme, as a "fault tolerance hint", can aid understanding of the accuracy and precision requirements of a particular implementation. The study also shows how such a scheme can give rise to significant performance enhancement.
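For readers unfamiliar with the technique the book studies, the sketch below illustrates the general idea of injecting weight noise during training so the network becomes tolerant of analogue imprecision. It is a minimal Python/NumPy illustration under stated assumptions, not the authors' implementation: the toy task, network size, multiplicative Gaussian noise model, and the sigma and lr values are all illustrative choices.

    # Illustrative sketch (not the authors' code): train a tiny MLP while
    # injecting multiplicative Gaussian noise into the synaptic weights at
    # every forward pass, so the network learns to tolerate imprecision.
    import numpy as np

    rng = np.random.default_rng(0)

    def noisy(w, sigma):
        # Zero-mean Gaussian noise proportional to each weight: a common
        # model of analogue synaptic weight imprecision (assumed here).
        return w * (1.0 + sigma * rng.standard_normal(w.shape))

    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    # Toy task (XOR) and an assumed 2-4-1 network; sigma and lr are
    # illustrative values, not figures taken from the book.
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([[0.], [1.], [1.], [0.]])
    W1 = rng.normal(0, 0.5, (2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(0, 0.5, (4, 1)); b2 = np.zeros(1)
    sigma, lr = 0.1, 0.5

    for epoch in range(5000):
        nW1, nW2 = noisy(W1, sigma), noisy(W2, sigma)  # the "hint"
        h = sigmoid(X @ nW1 + b1)
        out = sigmoid(h @ nW2 + b2)
        # Backpropagate through the noisy weights, update the clean ones.
        d2 = (out - y) * out * (1 - out)
        d1 = (d2 @ nW2.T) * h * (1 - h)
        W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(0)
        W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(0)

    # Evaluate with fresh noise to mimic a faulty analogue deployment.
    pred = sigmoid(sigmoid(X @ noisy(W1, sigma) + b1) @ noisy(W2, sigma) + b2)
    print(np.round(pred, 2))

Training with noise in this way acts as a regulariser: the network is pushed towards weight configurations whose outputs are insensitive to small perturbations, which is where the fault tolerance and generalisation benefits described above come from.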

Contents

  • Neural network performance metrics
  • Noise in neural implementations
  • Simulation requirements and environment
  • Fault tolerance
  • Generalisation ability
  • Learning trajectory and speed

Product Details

  • Publication Date: 08/01/1996
  • ISBN-13: 9789810227395
  • Format: Hardback
  • Number of Pages: 192
  • ISBN-10: 9810227396

Delivery Information

  • Saver Delivery: Yes
  • 1st Class Delivery: Yes
  • Courier Delivery: Yes
  • Store Delivery: Yes

Prices are for internet purchases only. Prices and availability in WHSmith Stores may vary significantly.