Hey,
Thanks for the book, it was a great read.
Chapter 9
9.4.1
In the description you talk about softmax_cross_entropy_with_logits, but the link to the official TensorFlow documentation points to sigmoid cross entropy, and even that link is dead.
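For what it's worth, here is how I understand the difference that the naming hides, sketched in NumPy. These are my own stand-in implementations for illustration, not the TensorFlow ones:

```python
import numpy as np

def softmax_cross_entropy_with_logits(labels, logits):
    """Cross-entropy for mutually exclusive classes: one loss per example row."""
    # Subtract the row max before exponentiating for numerical stability.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_softmax = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    return -(labels * log_softmax).sum(axis=-1)

def sigmoid_cross_entropy_with_logits(labels, logits):
    """Independent binary cross-entropy per logit: one loss per label entry."""
    # Stable form: max(x, 0) - x*z + log(1 + exp(-|x|))
    return (np.maximum(logits, 0) - logits * labels
            + np.log1p(np.exp(-np.abs(logits))))

logits = np.array([[2.0, 1.0, 0.1]])
labels = np.array([[1.0, 0.0, 0.0]])
print(softmax_cross_entropy_with_logits(labels, logits))  # shape (1,): one loss per example
print(sigmoid_cross_entropy_with_logits(labels, logits))  # shape (1, 3): one loss per logit
```

So the softmax variant is the one that fits the one-hot, single-class-per-image setting of this chapter, which is why the link pointing at the sigmoid version is confusing.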
I would also have liked some more detailed explanations of the TensorFlow functions you use, as well as some information about how certain shapes are chosen. For instance, Listing 9.11, "Set up CNN weights":
x = tf.placeholder(tf.float32, [None, 24 * 24])     # batch of flattened 24x24 images
y = tf.placeholder(tf.float32, [None, len(names)])  # one-hot labels, one column per class
W1 = tf.Variable(tf.random_normal([5, 5, 1, 64]))   # conv1: 5x5 kernel, 1 input channel, 64 filters
b1 = tf.Variable(tf.random_normal([64]))            # one bias per conv1 filter
W2 = tf.Variable(tf.random_normal([5, 5, 64, 64]))  # conv2: 5x5 kernel, 64 input channels, 64 filters
b2 = tf.Variable(tf.random_normal([64]))            # one bias per conv2 filter
W3 = tf.Variable(tf.random_normal([6*6*64, 1024]))  # fully connected: flattened feature map -> 1024 units
b3 = tf.Variable(tf.random_normal([1024]))          # one bias per hidden unit
We have here two convolutional layers with 64 filters each, max pooling, and then a fully connected layer, but it is not clear (at least not to me) how the dimensions are chosen.
The 64 in W1 is the number of filters in the first convolutional layer.
I guess that the 64, 64 in W2 are the number of filters in the second layer and the number of input channels coming out of the W1 layer.
But the shape of W3, [6*6*64, 1024], looks like pure magic.
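After staring at it, I think the 6*6*64 can be derived from the pooling, assuming the convolutions use SAME padding with stride 1 and each max pool is 2x2 with stride 2 (which is what I believe the chapter's model does). A quick sketch of the arithmetic:

```python
h = w = 24        # input image is 24 x 24 (flattened to 24*24 in the placeholder)
channels = 1      # grayscale input

# conv1 with SAME padding, stride 1: spatial size unchanged, 64 output channels
channels = 64     # matches W1 shape [5, 5, 1, 64]
# 2x2 max pool, stride 2: halves height and width
h, w = h // 2, w // 2        # now 12 x 12

# conv2: W2 shape [5, 5, 64, 64], again SAME padding, so size unchanged
# second 2x2 max pool, stride 2
h, w = h // 2, w // 2        # now 6 x 6

flattened = h * w * channels # the first dimension of W3
print(flattened)
```

If that is right, the flattened size is 6 * 6 * 64 = 2304, which is exactly the first dimension of W3, and the "magic" is just two halvings of 24. A sentence spelling this out in the text would have saved me a lot of head scratching.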
