Advancements in Knowledge Distillation: Towards New Horizons of Intelligent Systems


Synopsis

The book provides timely coverage of the paradigm of knowledge distillation, an efficient approach to model compression. Knowledge distillation is positioned within the general setting of transfer learning, in which a lightweight student model is learned from a large teacher model. The coverage spans a variety of training schemes, teacher–student architectures, and distillation algorithms, along with recent developments in vision and language learning, relational architectures, multi-task learning, and representative applications to image processing, computer vision, edge intelligence, and autonomous systems. The book is of relevance to a broad audience of researchers and practitioners active in machine learning who pursue fundamental and applied research in advanced learning paradigms.
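The teacher–student transfer the synopsis describes can be sketched, under common assumptions, as the classic temperature-softened distillation loss (Hinton-style): the student is trained to match the teacher's softened output distribution. The logit values below are purely illustrative, not drawn from the book.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher T gives a softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as is conventional in knowledge distillation."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# Illustrative (hypothetical) logits for one example:
teacher = [4.0, 1.0, 0.2]
student = [2.5, 1.5, 0.5]
loss = distillation_loss(teacher, student)
```

In practice this soft-target term is combined with the ordinary cross-entropy on ground-truth labels; the temperature controls how much of the teacher's "dark knowledge" about non-target classes is exposed to the student.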

Book details

Edition:
1st ed. 2023
Series:
Studies in Computational Intelligence (Book 1100)
Editors:
Witold Pedrycz, Shyi-Ming Chen
ISBN:
9783031320958
Related ISBNs:
9783031320941
Publisher:
Springer International Publishing
Pages:
N/A
Reading age:
Not specified
Includes images:
Yes
Date of addition:
2023-07-14
Usage restrictions:
Copyright
Copyright date:
2023
Copyright by:
The Editors
Adult content:
No
Language:
English
Categories:
Computers and Internet, Nonfiction, Technology