NinjaUnlikely6343 OP t1_j5vps4g wrote
Reply to comment by ChingBlue in Efficient way to tune a network by changing hyperparameters? by NinjaUnlikely6343
Sweet! Thanks a lot!
NinjaUnlikely6343 OP t1_j5vkttj wrote
Reply to comment by thatpretzelife in Efficient way to tune a network by changing hyperparameters? by NinjaUnlikely6343
At neural networks specifically haha!
NinjaUnlikely6343 OP t1_j5udl0q wrote
Reply to comment by PsecretPseudonym in Efficient way to tune a network by changing hyperparameters? by NinjaUnlikely6343
Makes sense!
NinjaUnlikely6343 OP t1_j5tmucg wrote
Reply to comment by thatpretzelife in Efficient way to tune a network by changing hyperparameters? by NinjaUnlikely6343
Thanks for the advice! I'm actually already SSH tunneling into the immense computing resources at Compute Canada. Training still takes extremely long haha
NinjaUnlikely6343 OP t1_j5rjhvi wrote
Reply to comment by suflaj in Efficient way to tune a network by changing hyperparameters? by NinjaUnlikely6343
Thanks a lot! I'll try that and keep you posted
NinjaUnlikely6343 OP t1_j5r4d3x wrote
Reply to comment by suflaj in Efficient way to tune a network by changing hyperparameters? by NinjaUnlikely6343
Thanks a lot for the detailed response! I didn't know you could tune on a portion of the dataset and expect results close to what you'd get with the whole set. I'm currently just testing different learning rates, but I've thought about having a go at the dropout rate as well.
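Something like this is what I have in mind, a rough PyTorch sketch of searching over learning rate and dropout on a small slice of the data; the model, subset size, and grid values are just placeholders:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, Subset

def quick_eval(lr, dropout, dataset, epochs=3):
    """Train briefly on a small subset to rank hyperparameter settings."""
    subset = Subset(dataset, range(min(2000, len(dataset))))  # small slice
    loader = DataLoader(subset, batch_size=64, shuffle=True)
    model = nn.Sequential(                # toy architecture (assumed)
        nn.Linear(784, 256), nn.ReLU(),
        nn.Dropout(dropout),
        nn.Linear(256, 10),
    )
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(x.view(x.size(0), -1)), y)
            loss.backward()
            optimizer.step()
    return loss.item()  # last-batch loss as a cheap proxy score

# Coarse grid over learning rate and dropout; keep the best few settings,
# then re-train only the winners on the full dataset.
# results = {(lr, p): quick_eval(lr, p, train_set)
#            for lr in (1e-2, 1e-3, 1e-4) for p in (0.1, 0.3, 0.5)}
```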
Efficient way to tune a network by changing hyperparameters? Submitted by NinjaUnlikely6343 t3_10kecyc in deeplearning
NinjaUnlikely6343 OP t1_j5vq6vj wrote
Reply to comment by emad_eldeen in Efficient way to tune a network by changing hyperparameters? by NinjaUnlikely6343
I heard about it when I first started delving into deep learning, but it seemed too complex for me at the time. I'll check it out!