Neural architecture search (NAS) automates the selection of neural network models for specific learning problems. NAS is a promising route to automating and democratising machine learning, but it is computationally expensive. Much of its early success was attributed to weight sharing, a technique that dramatically accelerated the evaluation of candidate architectures. So why is weight sharing now being criticised?
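
To make the idea concrete, here is a minimal, illustrative sketch of weight sharing in the single-path supernet style: all candidate operations keep their weights inside one "supernet", each training step samples a random sub-architecture that updates those shared weights, and candidate architectures are then ranked with the shared weights instead of being trained from scratch. The class names (`SharedOpLayer`, `SuperNet`) and the toy data are illustrative assumptions, not the specific method discussed in the article.

```python
import random
import torch
import torch.nn as nn

class SharedOpLayer(nn.Module):
    """One supernet layer: every candidate op stores its weights here,
    so all sampled architectures reuse (share) the same parameters."""
    def __init__(self, dim):
        super().__init__()
        self.candidates = nn.ModuleList([
            nn.Linear(dim, dim),                             # candidate op 0
            nn.Sequential(nn.Linear(dim, dim), nn.ReLU()),   # candidate op 1
            nn.Identity(),                                   # candidate op 2 (skip)
        ])

    def forward(self, x, choice):
        return self.candidates[choice](x)

class SuperNet(nn.Module):
    def __init__(self, dim=16, depth=3):
        super().__init__()
        self.layers = nn.ModuleList(SharedOpLayer(dim) for _ in range(depth))
        self.head = nn.Linear(dim, 2)

    def forward(self, x, arch):
        # `arch` is a list of per-layer op choices defining one sub-network.
        for layer, choice in zip(self.layers, arch):
            x = layer(x, choice)
        return self.head(x)

net = SuperNet()
opt = torch.optim.SGD(net.parameters(), lr=0.01)
x = torch.randn(8, 16)                 # toy inputs
y = torch.randint(0, 2, (8,))          # toy labels

# Training: each step samples one random sub-architecture; its ops
# update the shared weights, so no candidate trains from scratch.
for step in range(100):
    arch = [random.randrange(3) for _ in net.layers]
    loss = nn.functional.cross_entropy(net(x, arch), y)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Search: rank candidates using only the shared weights. This cheap
# proxy evaluation is exactly what the criticism of weight sharing
# targets: shared-weight scores may not reflect stand-alone accuracy.
with torch.no_grad():
    candidates = [[random.randrange(3) for _ in net.layers] for _ in range(5)]
    scores = [nn.functional.cross_entropy(net(x, a), y).item() for a in candidates]
    best = candidates[scores.index(min(scores))]
    print("best sampled architecture:", best)
```

The speed-up comes from the final block: evaluating a new architecture is a single forward pass over inherited weights rather than a full training run, which is also why the rankings it produces can be unreliable.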

Read more: https://analyticsindiamag.com/weight-sharing-deep-learning/

#deeplearning #artificial-intelligence

What Is Weight Sharing In Deep Learning And Why Is It Important