3.6. Autoencoding

3.6.1. One-hot Encoder

In [1]:
from conx import *
Using TensorFlow backend.
conx, version 3.5.15
In [3]:
size = 5
In [4]:
net = Network("Autoencoder")
net.add(Layer("input", size))  # optionally: minmax=(0,1)
net.add(Layer("hidden", 5, activation="relu", visible=True))
net.add(Layer("output", size, activation="sigmoid"))
net.config["font_family"] = "monospace"
In [5]:
net.connect()
In [6]:
net.compile(error='binary_crossentropy', optimizer="adam")
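With one-hot targets, `binary_crossentropy` treats each output unit as an independent 0/1 prediction. A minimal sketch of the per-pattern error (an illustration of the formula, not the Keras implementation):

```python
import math

def binary_crossentropy(targets, outputs, eps=1e-7):
    """Mean over units of -(t*log(o) + (1-t)*log(1-o))."""
    total = 0.0
    for t, o in zip(targets, outputs):
        o = min(max(o, eps), 1 - eps)  # clip so log() stays finite
        total += -(t * math.log(o) + (1 - t) * math.log(1 - o))
    return total / len(targets)
```

A perfect reconstruction scores near zero; a maximally uncertain output of all 0.5s scores log(2) per unit.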
In [7]:
net
Out[7]:
Autoencoder
Layer: output (output) shape = (5,) Keras class = Dense activation = sigmoid
    Weights from hidden to output
        output/kernel:0 has shape (5, 5)
        output/bias:0 has shape (5,)
Layer: hidden (hidden) shape = (5,) Keras class = Dense activation = relu
    Weights from input to hidden
        hidden/kernel:0 has shape (5, 5)
        hidden/bias:0 has shape (5,)
Layer: input (input) shape = (5,) Keras class = Input
In [8]:
patterns = [onehot(i, size) for i in range(size)]
In [9]:
patterns[0]
Out[9]:
[1, 0, 0, 0, 0]
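conx's `onehot(i, size)` builds exactly this kind of vector; a pure-Python equivalent is tiny (a sketch, not the conx source):

```python
def onehot(index, size):
    """Return a list of `size` zeros with a 1 at position `index`."""
    vector = [0] * size
    vector[index] = 1
    return vector

patterns = [onehot(i, 5) for i in range(5)]
```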
In [10]:
net.dataset.load([(p,p) for p in patterns])
In [11]:
net
Out[11]:
Autoencoder
Layer: output (output) shape = (5,) Keras class = Dense activation = sigmoid
    Weights from hidden to output
        output/kernel:0 has shape (5, 5)
        output/bias:0 has shape (5,)
Layer: hidden (hidden) shape = (5,) Keras class = Dense activation = relu
    Weights from input to hidden
        hidden/kernel:0 has shape (5, 5)
        hidden/bias:0 has shape (5,)
Layer: input (input) shape = (5,) Keras class = Input
In [12]:
import time
for i in range(size):
    net.propagate(net.dataset.inputs[i], visualize=True)
    time.sleep(1)
In [13]:
net.dataset.summary()

Dataset Split:
* training : 5
* testing  : 0
* total    : 5

Input Summary:
* shape : [(5,)]
* range : [(0.0, 1.0)]

Target Summary:
* shape : [(5,)]
* range : [(0.0, 1.0)]

In [14]:
net.reset()
net.train(accuracy=1, epochs=10000, report_rate=200, tolerance=0.4, plot=True)
[training plot: _images/Autoencoder_14_0.svg]
========================================================================
       |  Training |  Training
Epochs |     Error |  Accuracy
------ | --------- | ---------
# 1789 |   0.09234 |   1.00000
In [15]:
net.propagate(net.dataset.inputs[0])
Out[15]:
[0.8396602272987366,
 0.05913377180695534,
 0.001064726267941296,
 0.015739908441901207,
 0.01280010212212801]
In [16]:
net.test(tolerance=0.4, show=True)
========================================================
Testing validation dataset with tolerance 0.4...
# | inputs | targets | outputs | result
---------------------------------------
0 | [[1.00,0.00,0.00,0.00,0.00]] | [[1.00,0.00,0.00,0.00,0.00]] | [0.84,0.06,0.00,0.02,0.01] | correct
1 | [[0.00,1.00,0.00,0.00,0.00]] | [[0.00,1.00,0.00,0.00,0.00]] | [0.05,0.94,0.02,0.01,0.00] | correct
2 | [[0.00,0.00,1.00,0.00,0.00]] | [[0.00,0.00,1.00,0.00,0.00]] | [0.03,0.09,0.76,0.40,0.01] | correct
3 | [[0.00,0.00,0.00,1.00,0.00]] | [[0.00,0.00,0.00,1.00,0.00]] | [0.00,0.00,0.11,0.70,0.04] | correct
4 | [[0.00,0.00,0.00,0.00,1.00]] | [[0.00,0.00,0.00,0.00,1.00]] | [0.12,0.00,0.00,0.22,0.93] | correct
Total count: 5
      correct: 5
      incorrect: 0
Total percentage correct: 1.0
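A pattern counts as correct here when every output unit lands within `tolerance` of its target, which is how the 0.40 deviation in row 2 still passes. A sketch of that check (my reading of the tolerance semantics, not the conx source):

```python
def within_tolerance(target, output, tolerance=0.4):
    """True when every unit is within `tolerance` of its target."""
    return all(abs(t - o) <= tolerance for t, o in zip(target, output))

# Row 2 above: the worst unit is |0.00 - 0.40| = 0.40, exactly at the limit
row2_ok = within_tolerance([0, 0, 1, 0, 0], [0.03, 0.09, 0.76, 0.40, 0.01])
```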
In [17]:
for i in range(size):
    net.propagate(net.dataset.inputs[i], visualize=True)
    time.sleep(1)
In [18]:
net.dashboard()

3.6.2. MNIST Autoencoding

In [19]:
from conx import *
In [20]:
net = Network("MNIST-Autoencoder")
In [21]:
net.add(ImageLayer("input", (28,28), 1))
net.add(Conv2DLayer("conv", 3, (5,5), activation="relu"))
net.add(MaxPool2DLayer("pool", pool_size=(2,2)))
net.add(FlattenLayer("flatten"))
net.add(Layer("hidden3", 25, activation="relu"))
net.add(Layer("output", (28,28,1), activation="sigmoid"))
Out[21]:
'output'
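The dense weight shapes in the network summary follow from the convolution arithmetic: a 5x5 valid convolution shrinks 28x28 to 24x24, the 2x2 pool halves that to 12x12, and flattening 12*12*3 feature maps gives the 432 inputs feeding `hidden3`. A quick check of that arithmetic (assuming Keras defaults: stride 1, 'valid' padding):

```python
def conv_stack_shapes(h=28, w=28, filters=3, kernel=5, pool=2):
    """Trace shapes through conv ('valid', stride 1) -> maxpool -> flatten."""
    conv = (h - kernel + 1, w - kernel + 1, filters)      # (24, 24, 3)
    pooled = (conv[0] // pool, conv[1] // pool, filters)  # (12, 12, 3)
    flat = pooled[0] * pooled[1] * pooled[2]              # 432
    return conv, pooled, flat
```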
In [22]:
net.connect()
net.compile(error="mse", optimizer="adam")
net
Out[22]:
MNIST-Autoencoder
Layer: output (output) shape = (28, 28, 1) Keras class = Dense activation = sigmoid
    Weights from hidden3 to output
        output_2/kernel:0 has shape (25, 784)
        output_2/bias:0 has shape (784,)
Layer: hidden3 (hidden) shape = (25,) Keras class = Dense activation = relu
    Weights from flatten to hidden3
        hidden3/kernel:0 has shape (432, 25)
        hidden3/bias:0 has shape (25,)
Layer: flatten (hidden) Keras class = Flatten
    Weights from pool to flatten
Layer: pool (hidden) Keras class = MaxPooling2D pool_size = (2, 2)
    Weights from conv to pool
Layer: conv (hidden) Keras class = Conv2D activation = relu
    Weights from input to conv
        conv/kernel:0 has shape (5, 5, 1, 3)
        conv/bias:0 has shape (3,)
Layer: input (input) shape = (28, 28, 1) Keras class = Input
In [23]:
net.dataset.get('mnist')
net.dataset.set_targets_from_inputs()
net.dataset.targets.reshape(0, (28 * 28))
net.dataset.summary()
WARNING: network 'MNIST-Autoencoder' target bank #0 has a multi-dimensional shape, which is not allowed

Dataset name: MNIST

Original source: http://yann.lecun.com/exdb/mnist/

The MNIST database of handwritten digits, available from this page, has 70,000 examples. It is a subset of a larger set available from NIST. The digits have been size-normalized and centered in a fixed-size image. It is a good database for people who want to try learning techniques and pattern recognition methods on real-world data while spending minimal efforts on preprocessing and formatting.

Dataset Split:
* training : 70000
* testing  : 0
* total    : 70000

Input Summary:
* shape : [(28, 28, 1)]
* range : [(0.0, 1.0)]

Target Summary:
* shape : [(784,)]
* range : [(0.0, 1.0)]
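`set_targets_from_inputs` copies each (28, 28, 1) image as its own target, and the `reshape` flattens target bank 0 to a 784-vector so it matches the dense `output` layer, which is why the multi-dimensional-target warning no longer applies. The reshape itself is plain array flattening; a numpy sketch with a stand-in image:

```python
import numpy as np

image = np.zeros((28, 28, 1))       # stand-in for one MNIST input
target = image.reshape(28 * 28)     # flattened copy used as its own target
```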

In [24]:
net.dashboard()
In [25]:
net.propagate_to_features("pool", net.dataset.inputs[0], cols=1)
Out[25]:

[feature-map images for the three conv filters: Feature 0, Feature 1, Feature 2]
In [26]:
image = net.dataset.inputs[0]
output = net.propagate_to_image("output", image)
output.size
Out[26]:
(28, 28)
In [27]:
net.propagate_to("hidden3", image)
Out[27]:
[0.0,
 0.24985632300376892,
 0.0,
 0.30162084102630615,
 0.0,
 0.0,
 0.0,
 0.0,
 0.0,
 0.1399173140525818,
 0.0,
 0.0,
 0.0,
 0.0,
 0.0,
 0.0,
 0.0,
 0.1823977679014206,
 0.12886810302734375,
 0.21955928206443787,
 0.0,
 0.0,
 0.15523290634155273,
 0.0,
 0.3935135304927826]
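ReLU zeroes out negative pre-activations, so the 25-unit bottleneck representation above is sparse; counting the active units (values copied from the output above):

```python
hidden3 = [0.0, 0.24985632, 0.0, 0.30162084, 0.0, 0.0, 0.0, 0.0, 0.0,
           0.13991731, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.18239777,
           0.12886810, 0.21955928, 0.0, 0.0, 0.15523291, 0.0, 0.39351353]
active = sum(1 for v in hidden3 if v > 0)   # only 8 of the 25 units fire
```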
In [28]:
net.dataset.slice(10)
In [29]:
net.train(accuracy=0.5, epochs=1000, report_rate=100, tolerance=.4, plot=True)
[training plot: _images/Autoencoder_30_0.svg]
========================================================================
       |  Training |  Training
Epochs |     Error |  Accuracy
------ | --------- | ---------
#  819 |   0.00063 |   0.50000
In [30]:
for i in range(10):
    net.propagate(net.dataset.inputs[i], visualize=True)
In [31]:
net.test(show_inputs=False, show_outputs=False, show=True)
========================================================
Testing validation dataset with tolerance 0.4...
# | result
---------------------------------------
0 | X
1 | X
2 | correct
3 | correct
4 | X
5 | X
6 | correct
7 | X
8 | correct
9 | correct
Total count: 10
      correct: 5
      incorrect: 5
Total percentage correct: 0.5