Abstract: Knowledge distillation is an effective method for training small, efficient deep learning models. However, the efficacy of a single method can degrade when transferring to other tasks, ...
Abstract: To address recognition problems in image recognition, such as facial changes and uneven grayscale caused by variations in expression and illumination, a two-dimensional Locality Preserving ...