Accelerators for Convolutional Neural Networks
International Edition


About the Book

Accelerators for Convolutional Neural Networks is a comprehensive and thorough resource exploring different types of convolutional neural networks (CNNs) and their complementary hardware accelerators. The book provides foundational deep learning knowledge and instructive content for Internet of Things (IoT) and edge computing practitioners building CNN accelerators. It elucidates compressive coding for CNNs, presents a two-step lossless input feature map compression method, discusses an arithmetic coding-based lossless weight compression method and the design of an associated decoding method, describes contemporary sparse CNNs that consider sparsity in both weights and activation maps, and covers hardware/software co-design and co-scheduling techniques that can lead to better optimization and utilization of the available hardware resources for CNN acceleration. The first part of the book provides an overview of CNNs along with the composition and parameters of different contemporary CNN models. Later chapters focus on compressive coding for CNNs and the design of dense and sparse CNN accelerators. The book also provides directions for future research and development in CNN accelerators.
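To give a flavor of why lossless feature-map compression pays off, the sketch below run-length encodes the zeros that ReLU activations typically produce. This is not the book's two-step compression method, only a minimal illustrative stand-in; the function names `zrle_encode` and `zrle_decode` are invented for this example.

```python
# Minimal zero-run-length coding sketch for a sparse feature map.
# NOT the book's method -- just an illustration of lossless compression
# exploiting the long zero runs that ReLU activations produce.

def zrle_encode(values):
    """Encode a sequence as nonzero values and ('Z', run_length) tokens."""
    out, run = [], 0
    for v in values:
        if v == 0:
            run += 1
        else:
            if run:
                out.append(('Z', run))
                run = 0
            out.append(v)
    if run:
        out.append(('Z', run))
    return out

def zrle_decode(tokens):
    """Invert zrle_encode: expand zero-run tokens back into zeros."""
    out = []
    for t in tokens:
        if isinstance(t, tuple):
            out.extend([0] * t[1])
        else:
            out.append(t)
    return out

fmap = [0, 0, 0, 7, 0, 0, 3, 0, 0, 0, 0, 5]
enc = zrle_encode(fmap)
print(enc)                       # [('Z', 3), 7, ('Z', 2), 3, ('Z', 4), 5]
assert zrle_decode(enc) == fmap  # lossless round trip
```

The round-trip assertion is the defining property of any lossless scheme: the decoder must recover the feature map bit-exactly before convolution proceeds.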
Other sample topics covered in Accelerators for Convolutional Neural Networks include:
  • Applying arithmetic coding and decoding with range scaling for lossless compression of 5-bit CNN weights, enabling CNN deployment in extremely resource-constrained systems
  • State-of-the-art research on dense CNN accelerators, which are mostly based on systolic arrays or parallel multiply-accumulate (MAC) arrays
  • The iMAC dense CNN accelerator, which combines image-to-column (im2col) and general matrix multiplication (GEMM) hardware acceleration
  • A multi-threaded, low-cost, log-based processing element (PE) core, instances of which are stacked in a spatial grid to form the NeuroMAX dense accelerator
  • Sparse-PE, a multi-threaded and flexible CNN PE core that exploits sparsity in both weights and activation maps, instances of which can be stacked in a spatial grid to build sparse CNN accelerators

For researchers in AI, computer vision, computer architecture, and embedded systems, along with graduate and senior undergraduate students in related programs of study, Accelerators for Convolutional Neural Networks is an essential resource for understanding the many facets of the subject and its relevant applications.
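The im2col-plus-GEMM formulation that iMAC maps to hardware can be sketched in a few lines of Python. The snippet below is a single-channel, stride-1, no-padding software illustration of the idea (unroll patches into matrix rows, then convolve via one matrix multiply), not the iMAC design itself; all function names are invented for this example.

```python
# Illustrative sketch of convolution via im2col + GEMM, the formulation
# that the iMAC accelerator implements in hardware. Single channel,
# stride 1, no padding; a hardware design handles far more cases.

def im2col(x, k):
    """Unroll each k-by-k patch of 2-D input x into one row of a matrix."""
    h, w = len(x), len(x[0])
    rows = []
    for i in range(h - k + 1):
        for j in range(w - k + 1):
            rows.append([x[i + di][j + dj]
                         for di in range(k) for dj in range(k)])
    return rows

def gemm(a, b_t):
    """Multiply matrix a by the transpose of b_t (each row of b_t is a filter)."""
    return [[sum(x * y for x, y in zip(row, filt)) for filt in b_t]
            for row in a]

def conv2d(x, filt):
    """Convolve x with one k-by-k filter by lowering to im2col + GEMM."""
    k = len(filt)
    cols = im2col(x, k)                                   # patches as rows
    flat = [filt[i][j] for i in range(k) for j in range(k)]
    flat_out = [r[0] for r in gemm(cols, [flat])]         # one GEMM call
    out_w = len(x[0]) - k + 1
    return [flat_out[i:i + out_w] for i in range(0, len(flat_out), out_w)]

x = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
f = [[1, 0], [0, 1]]          # sums each pixel with its lower-right neighbor
print(conv2d(x, f))           # [[6, 8], [12, 14]]
```

The appeal of this lowering is that it reduces convolution to dense matrix multiplication, for which MAC arrays and systolic-array hardware are highly efficient; the cost is the duplicated data in the unrolled patch matrix.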

Table of Contents:
About the Authors xiii
Preface xv
Part I Overview 1
1 Introduction 3
1.1 History and Applications 5
1.2 Pitfalls of High-Accuracy DNNs/CNNs 6
1.2.1 Compute and Energy Bottleneck 6
1.2.2 Sparsity Considerations 9
1.3 Chapter Summary 11
2 Overview of Convolutional Neural Networks 13
2.1 Deep Neural Network Architecture 13
2.2 Convolutional Neural Network Architecture 15
2.3 Popular CNN Models 26
2.4 Popular CNN Datasets 30
2.5 CNN Processing Hardware 31
2.6 Chapter Summary 37
Part II Compressive Coding for CNNs 39
3 Contemporary Advances in Compressive Coding for CNNs 41
3.1 Background of Compressive Coding 41
3.2 Compressive Coding for CNNs 43
3.3 Lossy Compression for CNNs 43
3.4 Lossless Compression for CNNs 44
3.5 Recent Advancements in Compressive Coding for CNNs 48
3.6 Chapter Summary 50
4 Lossless Input Feature Map Compression 51
4.1 Two-Step Input Feature Map Compression Technique 52
4.2 Evaluation 55
4.3 Chapter Summary 57
5 Arithmetic Coding and Decoding for 5-Bit CNN Weights 59
5.1 Architecture and Design Overview 60
5.2 Algorithm Overview 63
5.3 Weight Decoding Algorithm 67
5.4 Encoding and Decoding Examples 69
5.5 Evaluation Methodology 74
5.6 Evaluation Results 75
5.7 Chapter Summary 84
Part III Dense CNN Accelerators 85
6 Contemporary Dense CNN Accelerators 87
6.1 Background on Dense CNN Accelerators 87
6.2 Representation of the CNN Weights and Feature Maps in Dense Format 87
6.3 Popular Architectures for Dense CNN Accelerators 89
6.4 Recent Advancements in Dense CNN Accelerators 92
6.5 Chapter Summary 93
7 iMAC: Image-to-Column and General Matrix Multiplication-Based Dense CNN Accelerator 95
7.1 Background and Motivation 95
7.2 Architecture 97
7.3 Implementation 99
7.4 Chapter Summary 100
8 NeuroMAX: A Dense CNN Accelerator 101
8.1 Related Work 102
8.2 Log Mapping 103
8.3 Hardware Architecture 105
8.4 Data Flow and Processing 108
8.5 Implementation and Results 118
8.6 Chapter Summary 124
Part IV Sparse CNN Accelerators 125
9 Contemporary Sparse CNN Accelerators 127
9.1 Background of Sparsity in CNN Models 127
9.2 Background of Sparse CNN Accelerators 128
9.3 Recent Advancements in Sparse CNN Accelerators 131
9.4 Chapter Summary 133
10 CNN Accelerator for In Situ Decompression and Convolution of Sparse Input Feature Maps 135
10.1 Overview 135
10.2 Hardware Design Overview 135
10.3 Design Optimization Techniques Utilized in the Hardware Accelerator 140
10.4 FPGA Implementation 141
10.5 Evaluation Results 143
10.6 Chapter Summary 149
11 Sparse-PE: A Sparse CNN Accelerator 151
11.1 Related Work 155
11.2 Sparse-PE 156
11.3 Implementation and Results 174
11.4 Chapter Summary 184
12 Phantom: A High-Performance Computational Core for Sparse CNNs 185
12.1 Related Work 189
12.2 Phantom 190
12.3 Phantom-2D 201
12.4 Experiments and Results 209
12.5 Chapter Summary 218
Part V HW/SW Co-Design and Co-Scheduling for CNN Acceleration 221
13 State-of-the-Art in HW/SW Co-Design and Co-Scheduling for CNN Acceleration 223
13.1 HW/SW Co-Design 223
13.2 HW/SW Co-Scheduling 228
13.3 Chapter Summary 230
14 Hardware/Software Co-Design for CNN Acceleration 231
14.1 Background of iMAC Accelerator 231
14.2 Software Partition for iMAC Accelerator 232
14.3 Experimental Evaluations 235
14.4 Chapter Summary 237
15 CPU-Accelerator Co-Scheduling for CNN Acceleration 239
15.1 Background and Preliminaries 240
15.2 CNN Acceleration with CPU-Accelerator Co-Scheduling 242
15.3 Experimental Results 251
15.4 Chapter Summary 257
16 Conclusions 259
References 265
Index 285


Product Details
  • ISBN-13: 9781394171880
  • Publisher: John Wiley & Sons Inc
  • Publisher Imprint: Wiley-IEEE Press
  • Language: English
  • Returnable: N
  • Weight: 630 gr
  • ISBN-10: 1394171889
  • Publisher Date: 16 Oct 2023
  • Binding: Hardback
  • No of Pages: 304

