Accelerators for Convolutional Neural Networks

by Arslan Munir, Joonho Kong and Mahmood Azhar Qureshi
Hardback
Publication Date: 01/11/2023

  $265.05
or 4 easy payments of $66.26 with Afterpay
This item qualifies your order for FREE DELIVERY
Accelerators for Convolutional Neural Networks

A comprehensive resource exploring different types of convolutional neural networks and the accelerators that complement them

Accelerators for Convolutional Neural Networks provides the basic deep learning knowledge and instructive content that Internet of Things (IoT) and edge computing practitioners need to build convolutional neural network (CNN) accelerators. It elucidates compressive coding for CNNs, presents a two-step lossless compression method for input feature maps, discusses an arithmetic coding-based lossless weight compression method and the design of an associated decoding method, describes contemporary sparse CNNs that consider sparsity in both weights and activation maps, and discusses hardware/software co-design and co-scheduling techniques that can lead to better optimization and utilization of the available hardware resources for CNN acceleration.
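
The weight compression mentioned above targets quantized weights. As a rough illustration of why such lossless coding pays off (this is a minimal sketch of my own, not the book's two-step or range-scaling method, and the weight distribution here is synthetic), the snippet below quantizes Gaussian-like weights to 5 bits and measures their empirical entropy, which lands well below the 5 bits/weight a fixed-width encoding would spend:

```python
# Minimal sketch (not the book's method): why 5-bit quantized CNN weights
# still compress losslessly -- their empirical entropy is below 5 bits/weight,
# headroom an entropy coder such as an arithmetic coder can reclaim.
import numpy as np

rng = np.random.default_rng(0)
# Illustrative stand-in for trained weights: roughly Gaussian, as CNN weights often are.
weights = rng.normal(0.0, 0.05, size=100_000)

# Uniform 5-bit quantization over the observed range (32 levels).
lo, hi = weights.min(), weights.max()
levels = np.clip(np.round((weights - lo) / (hi - lo) * 31), 0, 31).astype(np.int64)

# Empirical (Shannon) entropy of the 5-bit symbols.
counts = np.bincount(levels, minlength=32)
p = counts[counts > 0] / counts.sum()
entropy = -(p * np.log2(p)).sum()

print(f"entropy: {entropy:.2f} bits/weight vs. 5-bit fixed-width storage")
```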

The first part of the book provides an overview of CNNs along with the composition and parameters of different contemporary CNN models. Later chapters focus on compressive coding for CNNs and the design of dense CNN accelerators. The book also provides directions for future research and development for CNN accelerators.

Other sample topics covered in Accelerators for Convolutional Neural Networks include:

  • How to apply arithmetic coding and decoding with range scaling to losslessly compress 5-bit CNN weights, enabling CNN deployment in extremely resource-constrained systems
  • State-of-the-art research surrounding dense CNN accelerators, which are mostly based on systolic arrays or parallel multiply-accumulate (MAC) arrays
  • iMAC dense CNN accelerator, which combines image-to-column (im2col) and general matrix multiplication (GEMM) hardware acceleration (the im2col + GEMM view of convolution is sketched after this list)
  • Multi-threaded, low-cost, log-based processing element (PE) core, instances of which are stacked in a spatial grid to form the NeuroMAX dense accelerator
  • Sparse-PE, a multi-threaded and flexible CNN PE core that exploits sparsity in both weights and activation maps, instances of which can be stacked in a spatial grid to build sparse CNN accelerators (a toy sparsity-skipping example also follows this list)

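As referenced in the list above, the im2col + GEMM formulation is what lets a convolution layer be fed to a single large matrix-multiply engine. The sketch below is only a minimal NumPy illustration of that reshaping (all shapes and names are made up for the example, and it says nothing about the iMAC hardware itself); it cross-checks the GEMM result against a direct convolution:

```python
# Minimal sketch of the im2col + GEMM view of convolution (illustrative only,
# not the book's iMAC hardware design). Stride 1, no padding.
import numpy as np

def im2col(x, kh, kw):
    """Unfold an input of shape (C, H, W) into a matrix whose columns are the
    flattened kh x kw receptive fields."""
    c, h, w = x.shape
    out_h, out_w = h - kh + 1, w - kw + 1
    cols = np.empty((c * kh * kw, out_h * out_w), dtype=x.dtype)
    idx = 0
    for i in range(out_h):
        for j in range(out_w):
            cols[:, idx] = x[:, i:i + kh, j:j + kw].ravel()
            idx += 1
    return cols, out_h, out_w

def conv2d_gemm(x, weights):
    """Convolution expressed as one GEMM: (M, C*kh*kw) @ (C*kh*kw, out_h*out_w)."""
    m, c, kh, kw = weights.shape
    cols, out_h, out_w = im2col(x, kh, kw)
    w_mat = weights.reshape(m, -1)   # each filter becomes one row
    out = w_mat @ cols               # the single large matrix multiply
    return out.reshape(m, out_h, out_w)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal((3, 8, 8)).astype(np.float32)    # C=3, 8x8 input
    w = rng.standard_normal((4, 3, 3, 3)).astype(np.float32)  # 4 filters, 3x3

    # Cross-check against a naive direct convolution.
    ref = np.zeros((4, 6, 6), dtype=np.float32)
    for m in range(4):
        for i in range(6):
            for j in range(6):
                ref[m, i, j] = np.sum(x[:, i:i + 3, j:j + 3] * w[m])
    assert np.allclose(conv2d_gemm(x, w), ref, atol=1e-4)
    print("im2col + GEMM matches direct convolution")
```

Likewise, a toy view of the saving a sparsity-aware PE such as Sparse-PE targets (again my own illustration, not the book's design): perform a multiply-accumulate only where both the weight and the activation are nonzero, instead of marching through every position.

```python
# Toy illustration of sparsity-aware MAC skipping (not the Sparse-PE design).
import numpy as np

def dense_mac(w, a):
    """Baseline: every position costs one multiply-accumulate."""
    return sum(wi * ai for wi, ai in zip(w, a)), len(w)

def sparse_mac(w, a):
    """Skip any position where either operand is zero; count only useful MACs."""
    acc, macs = 0.0, 0
    for wi, ai in zip(w, a):
        if wi != 0 and ai != 0:
            acc += wi * ai
            macs += 1
    return acc, macs

rng = np.random.default_rng(0)
w = rng.standard_normal(64) * (rng.random(64) > 0.6)  # ~60% of weights pruned to zero
a = np.maximum(rng.standard_normal(64), 0)            # ReLU zeroes roughly half the activations

dense_out, dense_ops = dense_mac(w, a)
sparse_out, sparse_ops = sparse_mac(w, a)
assert np.isclose(dense_out, sparse_out)
print(f"same result, {sparse_ops} MACs instead of {dense_ops}")
```
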
For researchers in AI, computer vision, computer architecture, and embedded systems, along with graduate and senior undergraduate students in related programs of study, Accelerators for Convolutional Neural Networks is an essential resource for understanding the many facets of the subject and its relevant applications.

ISBN:
9781394171880
Category:
Electronics & communications engineering
Format:
Hardback
Publication Date:
01/11/2023
Language:
English
Publisher:
John Wiley & Sons, Incorporated
Country of origin:
United States
Dimensions (mm):
229 x 152 x 20
Weight:
0.68kg

This title is in stock with our Australian supplier and should arrive at our Sydney warehouse within 2-3 weeks of your order being placed.

Once received into our warehouse, we will despatch it to you with a Shipping Notification that includes online tracking.

Please check the estimated delivery times for your region below; these apply after your order is despatched from our warehouse:

ACT Metro: 2 working days
NSW Metro: 2 working days
NSW Rural: 2-3 working days
NSW Remote: 2-5 working days
NT Metro: 3-6 working days
NT Remote: 4-10 working days
QLD Metro: 2-4 working days
QLD Rural: 2-5 working days
QLD Remote: 2-7 working days
SA Metro: 2-5 working days
SA Rural: 3-6 working days
SA Remote: 3-7 working days
TAS Metro: 3-6 working days
TAS Rural: 3-6 working days
VIC Metro: 2-3 working days
VIC Rural: 2-4 working days
VIC Remote: 2-5 working days
WA Metro: 3-6 working days
WA Rural: 4-8 working days
WA Remote: 4-12 working days
