Analogue Imprecision in MLP Training (Progress in Neural Processing, Vol. 4)
By: A.F. Murray (author), Peter J. Edwards (author)
Format: Hardback
Usually despatched within 2 weeks
Hardware inaccuracy and imprecision are important considerations when implementing neural algorithms. This book presents a study of synaptic weight noise as a typical fault model for analogue VLSI realisations of MLP neural networks and examines the implications for learning and network performance. Its aim is to show how incorporating an imprecision model into the learning scheme, as a "fault tolerance hint", aids understanding of the accuracy and precision requirements of a particular implementation. The study also shows how such a scheme can give rise to significant performance enhancement.
Contents: neural network performance metrics; noise in neural implementations; simulation requirements and environment; fault tolerance; generalisation ability; learning trajectory and speed.
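The idea described above, injecting weight noise during training so that the learned network tolerates analogue imprecision, can be sketched in a few lines. This is a minimal illustration, not the book's exact scheme: the multiplicative noise level `sigma`, the network size, and the XOR task are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny XOR task as a stand-in problem (assumed for illustration).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# 2-4-1 MLP with sigmoid units.
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

sigma = 0.1  # relative weight-noise level (assumed parameter)
lr = 1.0     # learning rate (assumed)

for epoch in range(5000):
    # Sample a noisy copy of the weights each pass: w' = w * (1 + N(0, sigma^2)).
    # Training through these noisy weights acts as the "fault tolerance hint".
    n1 = W1 * (1 + sigma * rng.standard_normal(W1.shape))
    n2 = W2 * (1 + sigma * rng.standard_normal(W2.shape))

    # Forward pass through the noisy weights.
    h = sigmoid(X @ n1 + b1)
    out = sigmoid(h @ n2 + b2)

    # Backpropagate through the same noisy weights, update the clean ones.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ n2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

# Evaluate with the noise-free weights, i.e. the nominal implementation.
pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
```

Because every gradient step sees a differently perturbed weight set, the optimiser is pushed towards solutions whose outputs are insensitive to small weight perturbations, which is the sense in which the noise model doubles as a fault-tolerance hint.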
Number Of Pages:
- ID: 9789810227395
- Saver Delivery: Yes
- 1st Class Delivery: Yes
- Courier Delivery: Yes
- Store Delivery: Yes