i2dl (TUM Introduction to Deep Learning) homework:
  exercise_1:
    softmax_loss
    two_layer_net
  exercise_2:
    layers: affine_forward, affine_backward, relu_forward, relu_backward
    batch normalization
    dropout
projects:
  Segmentation (CNN)
  ResNet paper reproduction
  chinamoney web crawler
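
For orientation, a minimal NumPy sketch of two of the exercise pieces listed above (softmax_loss and the affine forward/backward pair). The function names come from the listing; the exact signatures and cache layout are assumptions in the style of typical i2dl assignments, not the repo's actual code.

```python
import numpy as np

def softmax_loss(scores, y):
    """Average cross-entropy loss and gradient w.r.t. the scores.

    scores: (N, C) raw class scores; y: (N,) integer class labels.
    """
    # Shift by the row max for numerical stability before exponentiating.
    shifted = scores - scores.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    probs = exp / exp.sum(axis=1, keepdims=True)
    N = scores.shape[0]
    loss = -np.log(probs[np.arange(N), y]).mean()
    # Gradient of mean cross-entropy: (probs - one_hot(y)) / N.
    dscores = probs.copy()
    dscores[np.arange(N), y] -= 1.0
    dscores /= N
    return loss, dscores

def affine_forward(x, w, b):
    """Fully connected layer: out = x @ w + b; cache inputs for backward."""
    out = x @ w + b
    return out, (x, w)

def affine_backward(dout, cache):
    """Gradients w.r.t. input, weights, and bias of the affine layer."""
    x, w = cache
    dx = dout @ w.T
    dw = x.T @ dout
    db = dout.sum(axis=0)
    return dx, dw, db
```

With all-zero scores the softmax is uniform, so the loss is log(C); the affine backward returns gradients with the same shapes as x, w, and b.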