Submitted by [deleted] t3_xtpksa in MachineLearning
McMa t1_iqtcdfp wrote
Reply to comment by abystoma in [D] Hidden unit connected to each other in a single layer by [deleted]
Just a normal hidden layer. The connections between nodes of the same layer in your schema can be thought of as two layers, where each node forwards its exact value to the corresponding node of the next layer (weight frozen to 1), and the rest of the weights get trained.
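
A minimal PyTorch sketch of that idea (the class and parameter names here are just for illustration, not from the thread): the "second layer" uses a weight matrix whose diagonal is a frozen identity, so each node passes its own value straight through, while only the off-diagonal lateral weights are learned.

```python
import torch
import torch.nn as nn

class LaterallyConnectedLayer(nn.Module):
    """Sketch: same-layer connections modelled as a second linear map.
    The diagonal (each node forwarding its own value) is frozen to 1;
    only the off-diagonal lateral weights receive gradients."""

    def __init__(self, in_features, hidden_features):
        super().__init__()
        # ordinary trainable layer into the hidden units
        self.input_to_hidden = nn.Linear(in_features, hidden_features)
        # trainable lateral weights; their diagonal is masked out in forward()
        self.lateral = nn.Parameter(torch.zeros(hidden_features, hidden_features))
        # frozen parts: identity (weight 1 to the corresponding node) and a mask
        self.register_buffer("identity", torch.eye(hidden_features))
        self.register_buffer("off_diag_mask", 1.0 - torch.eye(hidden_features))

    def forward(self, x):
        h = torch.relu(self.input_to_hidden(x))
        # effective weight = frozen identity + trainable off-diagonal part
        w = self.identity + self.lateral * self.off_diag_mask
        return h @ w.t()

layer = LaterallyConnectedLayer(in_features=10, hidden_features=5)
out = layer(torch.randn(3, 10))  # shape (3, 5)
```

Because the diagonal entries of `lateral` are multiplied by zero in the forward pass, they never get a gradient, which is equivalent to freezing those weights to 1 and training everything else.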