
470512 (3)
#1
Hi,
Page 182, MEAP 11.
I think that the fifth line (definition of sigmoid) should be indented.

Furthermore, I do not understand this line:
layer_2 = sigmoid(np.dot(layer_1,weights_1_2)) # linear + softmax

Why are we applying the sigmoid function when the comment says softmax? The same pattern is repeated in the test-data block.

Thanks!
575487 (1)
#2
Yes, I noticed this bug as well. It should be softmax in that line, for both training and testing.
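As a minimal sketch of the fix (the variable names `layer_1` and `weights_1_2` follow the book; the shapes here are toy stand-ins, not the book's actual values), the corrected line would apply a softmax instead of a sigmoid:

```python
import numpy as np

def softmax(x):
    # subtract the max before exponentiating for numerical stability
    e = np.exp(x - np.max(x))
    return e / e.sum()

# hypothetical shapes standing in for the book's network
layer_1 = np.random.randn(100)          # hidden activations (hidden size 100)
weights_1_2 = np.random.randn(100, 5)   # hidden -> output weights

layer_2 = softmax(np.dot(layer_1, weights_1_2))  # linear + softmax, matching the comment
```

Unlike a sigmoid, which squashes each output independently, the softmax normalizes the outputs into a probability distribution that sums to 1, which is what the `# linear + softmax` comment implies.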

Furthermore, the description on pages 181-182 (Intro to the Embedding Layer) discusses one-hot encoding of words and how summing the selected vectors avoids a matrix-vector multiplication, but the code uses word indices instead of one-hot encodings. The concept of embeddings is also quite confusing, especially the significance of their hidden size (100 in the example code).
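For what it's worth, the two formulations are equivalent: multiplying a multi-hot vector by the embedding matrix selects and sums exactly the rows named by the word indices, so the index-based code is just a faster shortcut. The hidden size (100 in the book) is simply the number of columns of the embedding matrix, i.e. the length of each word's vector. A toy sketch with made-up sizes (the variable names mimic the book's style but are assumptions):

```python
import numpy as np

vocab_size, hidden_size = 10, 4   # toy sizes; the book uses hidden size 100
weights_0_1 = np.random.randn(vocab_size, hidden_size)  # embedding matrix

sentence_indices = [2, 5, 7]  # distinct word indices for one input example

# one-hot route: build the multi-hot vector, then do a matrix-vector multiply
one_hot = np.zeros(vocab_size)
one_hot[sentence_indices] = 1
layer_1_matmul = one_hot.dot(weights_0_1)

# index route: just sum the selected rows of the embedding matrix
layer_1_indexed = weights_0_1[sentence_indices].sum(axis=0)

assert np.allclose(layer_1_matmul, layer_1_indexed)
```

(One caveat: if a word index repeats, the row-sum counts it twice while the multi-hot vector would only mark it once, so the equivalence holds for distinct indices.)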