XOR Multiple Inputs/Targets

This notebook trains a standard XOR network, and then a second network with two separate input banks and two target banks.

In [1]:
from conx import Network, Layer, SGD
Using Theano backend.
In [22]:
net = Network("XOR Network", 2, 4, 1, activation="sigmoid")
dataset = [
    ([0, 0], [0]),
    ([0, 1], [1]),
    ([1, 0], [1]),
    ([1, 1], [0])
]
net["output"].minmax = (0, 1)
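The dataset above is the XOR truth table; a quick plain-Python sanity check that each target is the exclusive-or of its input pair:

```python
# XOR truth table, mirroring the dataset defined above.
dataset = [
    ([0, 0], [0]),
    ([0, 1], [1]),
    ([1, 0], [1]),
    ([1, 1], [0]),
]

# Each target should equal the exclusive-or of the two input bits.
for (a, b), (target,) in dataset:
    assert target == (a ^ b)
```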
In [23]:
net.set_dataset(dataset)
net.dataset.summary()
Input Summary:
   count  : 4 (4 for training, 0 for testing)
   shape  : (2,)
   range  : (0.0, 1.0)
Target Summary:
   count  : 4 (4 for training, 0 for testing)
   shape  : (1,)
   range  : (0.0, 1.0)
In [24]:
net.dataset.targets[0]
Out[24]:
[0.0]
In [25]:
net.dataset._inputs.shape
Out[25]:
(4, 2)
In [26]:
net.compile(loss='mean_squared_error',
            optimizer=SGD(lr=0.3, momentum=0.9))
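The optimizer above is stochastic gradient descent with momentum. A minimal sketch of that update rule with the same hyperparameters (lr=0.3, momentum=0.9), applied to the toy objective f(x) = x² rather than to the network itself:

```python
# Sketch of SGD with momentum on f(x) = x**2, whose gradient is 2*x.
lr, momentum = 0.3, 0.9
x, v = 5.0, 0.0
for _ in range(200):
    grad = 2 * x
    v = momentum * v - lr * grad  # velocity accumulates past gradients
    x = x + v                     # parameter step; x approaches the minimum at 0
```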
In [27]:
net.dashboard()
In [32]:
net.propagate([0, 0])
Out[32]:
[0.06976541131734848]
In [29]:
net.train(epochs=2000, accuracy=1.0, report_rate=25)
Training...
Epoch #   25 | train error 0.24782 | train accuracy 0.50000 | validate% 0.00000
Epoch #   50 | train error 0.24208 | train accuracy 0.75000 | validate% 0.00000
Epoch #   75 | train error 0.23345 | train accuracy 0.75000 | validate% 0.00000
Epoch #  100 | train error 0.21946 | train accuracy 0.75000 | validate% 0.00000
Epoch #  125 | train error 0.20062 | train accuracy 0.75000 | validate% 0.00000
Epoch #  150 | train error 0.17957 | train accuracy 0.75000 | validate% 0.00000
Epoch #  175 | train error 0.15574 | train accuracy 0.75000 | validate% 0.00000
Epoch #  200 | train error 0.12277 | train accuracy 1.00000 | validate% 0.00000
Epoch #  225 | train error 0.08117 | train accuracy 1.00000 | validate% 0.00000
Epoch #  250 | train error 0.04477 | train accuracy 1.00000 | validate% 0.00000
Epoch #  275 | train error 0.02395 | train accuracy 1.00000 | validate% 0.00000
Epoch #  300 | train error 0.01448 | train accuracy 1.00000 | validate% 0.25000
Epoch #  325 | train error 0.00993 | train accuracy 1.00000 | validate% 0.75000
Epoch #  350 | train error 0.00742 | train accuracy 1.00000 | validate% 1.00000
========================================================================
Epoch #  350 | train error 0.00742 | train accuracy 1.00000 | validate% 1.00000
In [10]:
net.test()
Testing on training dataset...
# | inputs | targets | outputs | result
---------------------------------------
0 | [0.00, 0.00] | [0.00] | [0.05] | correct
1 | [0.00, 1.00] | [1.00] | [0.92] | correct
2 | [1.00, 0.00] | [1.00] | [0.91] | correct
3 | [1.00, 1.00] | [0.00] | [0.10] | correct
Total count: 4
Total percentage correct: 1.0
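The test above counts an output as correct when it lies within a tolerance of its target. A plain-Python sketch of that scoring rule, assuming a tolerance of 0.4 (a hypothetical value; the actual conx default may differ):

```python
# Hypothetical scoring: an output counts as correct if every component is
# within `tolerance` of the corresponding target component.
tolerance = 0.4  # assumed value, not taken from conx
rows = [([0.00], [0.05]), ([1.00], [0.92]), ([1.00], [0.91]), ([0.00], [0.10])]
correct = sum(
    all(abs(t - o) <= tolerance for t, o in zip(target, output))
    for target, output in rows
)
print(correct / len(rows))  # -> 1.0
```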
In [11]:
net.propagate_to("input", [0, 1])
Out[11]:
[0.0, 1.0]
In [12]:
net.propagate([0.5, 0.5])
Out[12]:
[0.19813156127929688]
In [13]:
net.propagate_to("hidden", [1, 0])
Out[13]:
[0.15031614899635315,
 0.030400902032852173,
 0.6464217305183411,
 0.000940606405492872]
In [21]:
net.propagate_to("output", [1, 1])
Out[21]:
[0.09999343007802963]
In [15]:
net.propagate_to("input", [0.25, 0.25])
Out[15]:
[0.25, 0.25]
In [16]:
net.propagate_from("input", [1.0, 1.0])
Out[16]:
[0.13129079]
In [17]:
net.propagate_from("hidden", [1.0, 0.0, 1.0, -1.0])
Out[17]:
[0.76549464]
In [18]:
net.test()
Testing on training dataset...
# | inputs | targets | outputs | result
---------------------------------------
0 | [0.00, 0.00] | [0.00] | [0.05] | correct
1 | [0.00, 1.00] | [1.00] | [0.92] | correct
2 | [1.00, 0.00] | [1.00] | [0.91] | correct
3 | [1.00, 1.00] | [0.00] | [0.10] | correct
Total count: 4
Total percentage correct: 1.0
In [1]:
from conx import Network, Layer, SGD
Using Theano backend.
In [59]:
net = Network("XOR2 Network")
net.add(Layer("input1", 1))
net.add(Layer("input2", 1))
net.add(Layer("hidden1", 10, activation="sigmoid"))
net.add(Layer("hidden2", 10, activation="sigmoid"))
net.add(Layer("shared-hidden", 5, activation="sigmoid"))
net.add(Layer("output1", 1, activation="sigmoid", minmax=(-1,1)))
net.add(Layer("output2", 1, activation="sigmoid", minmax=(-1,1)))
In [60]:
net
Out[60]:
<Network name='XOR2 Network' (uncompiled)>
In [61]:
net.connect("input1", "hidden1")
net.connect("input2", "hidden2")
net.connect("hidden1", "shared-hidden")
net.connect("hidden2", "shared-hidden")
net.connect("shared-hidden", "output1")
net.connect("shared-hidden", "output2")
In [62]:
net.layers[2].incoming_connections
Out[62]:
[<Layer name='input1', shape=(1,), act='None'>]
In [63]:
net.compile(loss='mean_squared_error',
            optimizer=SGD(lr=0.3, momentum=0.9))
In [64]:
net.config["hspace"] = 200
net.dashboard()
In [65]:
net.propagate([[0], [0]])
Out[65]:
[[0.729832649230957], [0.65898597240448]]
In [66]:
dataset = [
    ([[0],[0]], [[0],[0]]),
    ([[0],[1]], [[1],[1]]),
    ([[1],[0]], [[1],[1]]),
    ([[1],[1]], [[0],[0]])
]
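In this dataset each pattern has two single-value input banks and two single-value target banks; both targets carry the XOR of the two inputs. A plain-Python check:

```python
# Two-bank XOR dataset, mirroring the cell above.
dataset = [
    ([[0], [0]], [[0], [0]]),
    ([[0], [1]], [[1], [1]]),
    ([[1], [0]], [[1], [1]]),
    ([[1], [1]], [[0], [0]]),
]

# Both target banks should equal the XOR of the two input banks.
for (in1, in2), (t1, t2) in dataset:
    assert t1 == t2 == [in1[0] ^ in2[0]]
```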
In [67]:
net.set_dataset(dataset)
In [68]:
net.get_weights("hidden2")
Out[68]:
[[[-0.09656606614589691,
   0.6486210227012634,
   0.69124436378479,
   -0.7247172594070435,
   0.6375254988670349,
   -0.5482105016708374,
   0.42973917722702026,
   -0.49183517694473267,
   -0.7177578806877136,
   0.5769698023796082]],
 [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]]
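`get_weights` returns the layer's weight matrix (shape: inputs × units) followed by its bias vector, here all zeros before training. Each unit's activation is sigmoid(x · W + b); a numpy sketch with small made-up weights (not the values above):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical 1-input -> 3-unit dense layer; shapes mirror get_weights() output.
W = np.array([[0.5, -0.25, 1.0]])  # weight matrix: (n_inputs, n_units)
b = np.zeros(3)                    # bias vector, zero as in the fresh layer above
x = np.array([1.0])

activations = sigmoid(x @ W + b)   # one sigmoid activation per unit
```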
In [69]:
net.model.layers[-1].output
Out[69]:
sigmoid.0
In [70]:
import numpy as np
In [71]:
net.model.predict([np.array([[1]]), np.array([[1]])])
Out[71]:
[array([[ 0.74016315]], dtype=float32), array([[ 0.65678769]], dtype=float32)]
In [72]:
net.propagate([[1], [1]])
Out[72]:
[[0.7401631474494934], [0.6567876935005188]]
In [73]:
for i in range(20):
    (epoch_count, loss, acc, val_acc) = net.train(epochs=100, verbose=0)
    for index in range(4):
        net.propagate(dataset[index][0])
---------------------------------------------------------------------------
KeyboardInterrupt                         Traceback (most recent call last)
<ipython-input-73-337d5fd73182> in <module>()
      2     (epoch_count, loss, acc, val_acc) = net.train(epochs=100, verbose=0)
      3     for index in range(4):
----> 4         net.propagate(dataset[index][0])

/usr/local/lib/python3.5/dist-packages/conx/network.py in propagate(self, input, batch_size, visualize)
    625             if self._comm.kernel:
    626                 for layer in self.layers:
--> 627                     image = self.propagate_to_image(layer.name, input, batch_size)
    628                     data_uri = self._image_to_uri(image)
    629                     self._comm.send({'class': "%s_%s" % (self.name, layer.name), "href": data_uri})

/usr/local/lib/python3.5/dist-packages/conx/network.py in propagate_to_image(self, layer_name, input, batch_size)
    744             if self.num_input_layers == 1:
    745                 input = input[0]
--> 746         outputs = self.propagate_to(layer_name, input, batch_size)
    747         array = np.array(outputs)
    748         image = self[layer_name].make_image(array, self.config)

/usr/local/lib/python3.5/dist-packages/conx/network.py in propagate_to(self, layer_name, inputs, batch_size, visualize)
    732                     image = self[layer.name].make_image(np.array(out), self.config) # single vector, as an np.array
    733                     data_uri = self._image_to_uri(image)
--> 734                     self._comm.send({'class': "%s_%s" % (self.name, layer.name), "href": data_uri})
    735         outputs = outputs[0].tolist()
    736         return outputs

... (ipykernel and jupyter_client internals elided) ...
KeyboardInterrupt:
In [74]:
net.reset()
In [75]:
net.train(epochs=2000, accuracy=1.0, report_rate=25)
Training...
Epoch #   25 | train error 0.50349 | train accuracy 0.75000 | validate% 0.00000
Epoch #   50 | train error 0.50031 | train accuracy 1.25000 | validate% 0.00000
Epoch #   75 | train error 0.50008 | train accuracy 1.25000 | validate% 0.00000
Epoch #  100 | train error 0.50005 | train accuracy 1.25000 | validate% 0.00000
Epoch #  125 | train error 0.50003 | train accuracy 1.00000 | validate% 0.00000
Epoch #  150 | train error 0.50002 | train accuracy 1.00000 | validate% 0.00000
Epoch #  175 | train error 0.50001 | train accuracy 1.00000 | validate% 0.00000
Epoch #  200 | train error 0.50000 | train accuracy 1.00000 | validate% 0.00000
Epoch #  225 | train error 0.49998 | train accuracy 1.00000 | validate% 0.00000
Epoch #  250 | train error 0.49997 | train accuracy 1.00000 | validate% 0.00000
Epoch #  275 | train error 0.49996 | train accuracy 1.00000 | validate% 0.00000
Epoch #  300 | train error 0.49995 | train accuracy 1.00000 | validate% 0.00000
Epoch #  325 | train error 0.49993 | train accuracy 1.00000 | validate% 0.00000
Epoch #  350 | train error 0.49992 | train accuracy 1.00000 | validate% 0.00000
Epoch #  375 | train error 0.49990 | train accuracy 1.00000 | validate% 0.00000
Epoch #  400 | train error 0.49988 | train accuracy 1.00000 | validate% 0.00000
Epoch #  425 | train error 0.49986 | train accuracy 1.00000 | validate% 0.00000
Epoch #  450 | train error 0.49983 | train accuracy 1.00000 | validate% 0.00000
Epoch #  475 | train error 0.49980 | train accuracy 1.00000 | validate% 0.00000
Epoch #  500 | train error 0.49976 | train accuracy 1.25000 | validate% 0.00000
Epoch #  525 | train error 0.49972 | train accuracy 1.00000 | validate% 0.00000
Epoch #  550 | train error 0.49966 | train accuracy 1.00000 | validate% 0.00000
Epoch #  575 | train error 0.49959 | train accuracy 1.00000 | validate% 0.00000
Epoch #  600 | train error 0.49951 | train accuracy 1.00000 | validate% 0.00000
Epoch #  625 | train error 0.49939 | train accuracy 1.00000 | validate% 0.00000
Epoch #  650 | train error 0.49923 | train accuracy 1.25000 | validate% 0.00000
Epoch #  675 | train error 0.49902 | train accuracy 1.25000 | validate% 0.00000
Epoch #  700 | train error 0.49870 | train accuracy 1.25000 | validate% 0.00000
Epoch #  725 | train error 0.49823 | train accuracy 1.25000 | validate% 0.00000
Epoch #  750 | train error 0.49748 | train accuracy 1.25000 | validate% 0.00000
Epoch #  775 | train error 0.49619 | train accuracy 1.25000 | validate% 0.00000
Epoch #  800 | train error 0.49372 | train accuracy 1.50000 | validate% 0.00000
Epoch #  825 | train error 0.48831 | train accuracy 1.50000 | validate% 0.00000
Epoch #  850 | train error 0.47430 | train accuracy 1.50000 | validate% 0.00000
Epoch #  875 | train error 0.43509 | train accuracy 1.50000 | validate% 0.00000
Epoch #  900 | train error 0.35891 | train accuracy 1.50000 | validate% 0.00000
Epoch #  925 | train error 0.24351 | train accuracy 1.75000 | validate% 0.00000
Epoch #  950 | train error 0.07670 | train accuracy 2.00000 | validate% 0.00000
Epoch #  975 | train error 0.02634 | train accuracy 2.00000 | validate% 0.00000
========================================================================
Epoch #  993 | train error 0.01732 | train accuracy 2.00000 | validate% 1.00000
In [17]:
net.propagate_from("input1", [0.0])
Out[17]:
[[0.5], [0.5]]
In [18]:
net.propagate_from("shared-hidden", [0.0] * 5)
Out[18]:
[[0.5], [0.5]]
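Injecting all-zero activations at `shared-hidden` makes each output unit compute sigmoid(0 · w + b) = sigmoid(b), which is exactly 0.5 when the bias is zero (as in a freshly initialized layer). A quick check of that identity:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# With zero incoming activations, each output unit sees only its bias.
bias = 0.0
print(sigmoid(0.0 * 1.234 + bias))  # -> 0.5
```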
In [34]:
net.propagate_to("hidden1", [[1], [1]])
Out[34]:
[0.8598753213882446,
 0.7098890542984009,
 0.6287320852279663,
 0.26783427596092224,
 0.8600228428840637,
 0.7995230555534363,
 0.14518599212169647,
 0.23925507068634033,
 0.8647616505622864,
 0.18686796724796295]
In [19]:
net.test()
Testing on training dataset...
# | inputs | targets | outputs | result
---------------------------------------
0 | [[0.00], [0.00]] | [[0.00], [0.00]] | [[0.09], [0.09]] | correct
1 | [[0.00], [1.00]] | [[1.00], [1.00]] | [[0.91], [0.90]] | correct
2 | [[1.00], [0.00]] | [[1.00], [1.00]] | [[0.90], [0.91]] | correct
3 | [[1.00], [1.00]] | [[0.00], [0.00]] | [[0.09], [0.09]] | correct
Total count: 4
Total percentage correct: 1.0
In [36]:
net.dataset.slice(2)
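`dataset.slice(2)` appears to restrict the dataset to its first two patterns (a hypothetical reading; consult the conx documentation for the exact semantics). The plain-Python analogue of such a split:

```python
# The full two-bank XOR dataset from earlier in the notebook.
dataset = [
    ([[0], [0]], [[0], [0]]),
    ([[0], [1]], [[1], [1]]),
    ([[1], [0]], [[1], [1]]),
    ([[1], [1]], [[0], [0]]),
]

# Hypothetical analogue of dataset.slice(2): keep only the first two patterns.
train_patterns = dataset[:2]
```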
In [37]:
net.train(epochs=2000, accuracy=1.0, report_rate=25)
Training...
========================================================================
Epoch # 2048 | train error 0.00859 | train accuracy 2.00000 | validate% 1.00000

A conx model is an ordinary Keras `Model`, so Keras utilities such as `model_to_dot` can be applied to it directly:

In [38]:
from keras.utils.vis_utils import model_to_dot
from IPython.display import HTML
In [39]:
dot = model_to_dot(net.model, rankdir="BT")
In [40]:
HTML(dot.create_svg().decode())
Out[40]:
[SVG diagram of the model graph: input1 → hidden1 and input2 → hidden2; hidden1 and hidden2 merge at concatenate_1; concatenate_1 → shared-hidden; shared-hidden → output1 and output2]