Fixing wide-and-deep to use normalized inputs

Thanks for this great book/repo. I noticed the wide and deep example feeds the raw (non-normalized) inputs into the wide path of the concatenation, which leads to performance degradation.
Author: GarrettJenkinson
Date:   2022-06-27 09:27:51 -05:00
Committed by: GitHub
Parent: eabcc492aa
Commit: bdd246bf07


@@ -1568,7 +1568,7 @@
     "normalized = normalization_layer(input_)\n",
     "hidden1 = hidden_layer1(normalized)\n",
     "hidden2 = hidden_layer2(hidden1)\n",
-    "concat = concat_layer([input_, hidden2])\n",
+    "concat = concat_layer([normalized, hidden2])\n",
     "output = output_layer(concat)\n",
     "\n",
     "model = tf.keras.Model(inputs=[input_], outputs=[output])"