Introduces RigL, a method for training sparse neural networks with a fixed parameter count and a fixed computational cost throughout training, using parameter magnitudes and infrequent gradient calculations.


RigL starts with a random sparse network, and at regularly spaced intervals it removes a fraction of connections based on their magnitudes and activates new ones using instantaneous gradient information. After updating the connectivity, training continues with the updated network until the next update.
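To make the update rule concrete, here is a minimal NumPy sketch of a single RigL connectivity update for one weight matrix. The function name `rigl_update` and the flat `drop_fraction` are my own illustrative choices; the paper additionally decays the drop fraction with a cosine schedule over training, which is omitted here. This is a sketch under those assumptions, not the authors' implementation.

```python
import numpy as np

def rigl_update(weights, grads, mask, drop_fraction=0.3):
    """One RigL-style connectivity update for a single weight matrix (sketch).

    Drops the `drop_fraction` of active connections with the smallest
    magnitudes, then activates the same number of previously inactive
    connections with the largest gradient magnitudes, keeping the total
    number of active parameters fixed.
    """
    n_active = int(mask.sum())
    n_update = int(drop_fraction * n_active)

    # Drop: among active connections, pick the smallest-magnitude ones.
    drop_scores = np.where(mask, np.abs(weights), np.inf)
    drop_idx = np.argsort(drop_scores, axis=None)[:n_update]

    # Grow: among connections inactive before the drop, pick the ones
    # with the largest dense-gradient magnitudes.
    grow_scores = np.where(mask, -np.inf, np.abs(grads))
    grow_idx = np.argsort(grow_scores, axis=None)[::-1][:n_update]

    new_mask = mask.copy().ravel()
    new_mask[drop_idx] = False
    new_mask[grow_idx] = True
    new_mask = new_mask.reshape(mask.shape)

    # Surviving weights keep their values; newly grown connections are
    # initialized to zero, as in the paper.
    new_weights = np.where(new_mask, weights, 0.0)
    new_weights.ravel()[grow_idx] = 0.0
    return new_weights, new_mask

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mask = rng.random((4, 4)) < 0.5          # random initial sparse topology
    weights = rng.normal(size=(4, 4)) * mask
    grads = rng.normal(size=(4, 4))          # stand-in for a dense gradient
    weights, mask = rigl_update(weights, grads, mask)
    print(mask.sum(), "active connections")  # count is unchanged by the update
```

In a full training loop this update would run only at the regularly spaced intervals described above, with ordinary sparse SGD steps in between; because drops and grows are always equal in number, the parameter count and per-step cost stay fixed.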