3.13. AlphaZero

This notebook is based on the AlphaZero paper, “Mastering Chess and Shogi by Self-Play with a General Reinforcement Learning Algorithm” (Silver et al., 2017), with additional insight from open-source reimplementations of it.

This code uses the new conx layer that sits on top of Keras. Conx is designed to be simpler and more intuitive than Keras, with integrated visualizations.

Currently this code requires the TensorFlow backend, as it includes a function written specifically for TF.

3.13.1. The Game

First, let’s look at a specific game. We could use many, but for this demonstration we’ll pick ConnectFour. The aima3 package, based on the code from Artificial Intelligence: A Modern Approach, provides a game engine and a good code base of different games.

If you would like to install aima3, you can use something like this in a cell:

! pip install aima3 -U --user

aima3 includes other games you can play besides ConnectFour, such as TicTacToe, and it wraps up many AI algorithms for playing them. You can find more details about the game engine, ConnectFour, and other resources in the aima3 repository.

We import some of these that will be useful in our AlphaZero exploration:

In [2]:
from aima3.games import (ConnectFour, RandomPlayer,
                         MCTSPlayer, QueryPlayer, Player,
                         MiniMaxPlayer, AlphaBetaPlayer,
                         AlphaBetaCutoffPlayer)
import numpy as np

Let’s make a game:

In [3]:
game = ConnectFour()

and play a game between two random players:

In [4]:
game.play_game(RandomPlayer("Random-1"), RandomPlayer("Random-2"))
Random-2 is thinking...
Random-2 makes action (4, 1):
. . . . . . .
. . . . . . .
. . . . . . .
. . . . . . .
. . . . . . .
. . . . . . .
. . . X . . .
Random-1 is thinking...
Random-1 makes action (3, 1):
. . . . . . .
. . . . . . .
. . . . . . .
. . . . . . .
. . . . . . .
. . . . . . .
. . O X . . .
Random-2 is thinking...
Random-2 makes action (7, 1):
. . . . . . .
. . . . . . .
. . . . . . .
. . . . . . .
. . . . . . .
. . . . . . .
. . O X . . X
Random-1 is thinking...
Random-1 makes action (3, 2):
. . . . . . .
. . . . . . .
. . . . . . .
. . . . . . .
. . . . . . .
. . O . . . .
. . O X . . X
Random-2 is thinking...
Random-2 makes action (4, 2):
. . . . . . .
. . . . . . .
. . . . . . .
. . . . . . .
. . . . . . .
. . O X . . .
. . O X . . X
Random-1 is thinking...
Random-1 makes action (5, 1):
. . . . . . .
. . . . . . .
. . . . . . .
. . . . . . .
. . . . . . .
. . O X . . .
. . O X O . X
Random-2 is thinking...
Random-2 makes action (4, 3):
. . . . . . .
. . . . . . .
. . . . . . .
. . . . . . .
. . . X . . .
. . O X . . .
. . O X O . X
Random-1 is thinking...
Random-1 makes action (6, 1):
. . . . . . .
. . . . . . .
. . . . . . .
. . . . . . .
. . . X . . .
. . O X . . .
. . O X O O X
Random-2 is thinking...
Random-2 makes action (4, 4):
. . . . . . .
. . . . . . .
. . . . . . .
. . . X . . .
. . . X . . .
. . O X . . .
. . O X O O X
***** Random-2 wins!
Out[4]:
['Random-2']

We can also play a match (a series of games), or even a tournament among several players:

p1 = RandomPlayer("Random-1")
p2 = MiniMaxPlayer("MiniMax-1")
p3 = AlphaBetaCutoffPlayer("ABCutoff-1")

game.play_matches(10, p1, p2)

game.tournament(1, p1, p2, p3)
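
As seen in Out[4] above, play_game returns the list of winners for that game. Here is a minimal sketch (a hypothetical helper, not part of aima3) that uses this return value to tally win rates over repeated games; note that play_game will still print each move as it goes:

from collections import Counter

def win_rates(game, p1, p2, n=20):
    """Play n games and count how often each player wins."""
    counts = Counter()
    for _ in range(n):
        for winner in game.play_game(p1, p2):  # e.g., ['Random-2']
            counts[winner] += 1
    return {name: wins / n for name, wins in counts.items()}

# win_rates(game, RandomPlayer("Random-1"), RandomPlayer("Random-2"))

(play_matches does essentially this for you; the sketch just shows what the return value can be used for.)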

Can you beat RandomPlayer? Hope so!

Can you beat MiniMax? No! But it takes too long to compute its moves.

Humans enter their moves as (column, row), where columns are numbered from 1 starting at the left, and rows from 1 starting at the bottom. For example, (4, 1) is the bottom cell of the middle column.

In [6]:
# game.play_game(AlphaBetaCutoffPlayer("AlphaBetaCutoff"), QueryPlayer("Your Name Here"))

3.13.2. The Network

Next, we are going to build the same kind of network described in the AlphaZero paper.

Make sure to set your Keras backend to TensorFlow for now, as we have a function that is written at that level.

In [7]:
import conx as cx
from aima3.games import Game
from keras import regularizers
Using TensorFlow backend.
/usr/lib/python3.6/importlib/_bootstrap.py:219: RuntimeWarning: compiletime version 3.5 of module 'tensorflow.python.framework.fast_tensor_util' does not match runtime version 3.6
  return f(*args, **kwds)
conx, version 3.5.12
In [10]:
## NEED TO REWRITE THIS FUNCTION IN KERAS:

import tensorflow as tf

def softmax_cross_entropy_with_logits(y_true, y_pred):
    # y_pred holds the raw policy logits; y_true holds the target
    # distribution (the search probabilities), with zeros at moves
    # that were never selected.
    p = y_pred
    pi = y_true
    # Wherever the target is zero, replace the logit with a large
    # negative number so it receives ~0 probability under the softmax.
    zero = tf.zeros(shape=tf.shape(pi), dtype=tf.float32)
    where = tf.equal(pi, zero)
    negatives = tf.fill(tf.shape(pi), -100.0)
    p = tf.where(where, negatives, p)
    # Softmax cross-entropy between the targets and the masked logits.
    loss = tf.nn.softmax_cross_entropy_with_logits(labels=pi, logits=p)
    return loss
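
As a sketch of the rewrite flagged above, the same masking can be expressed with the backend-agnostic keras.backend API (assuming Keras 2, where K.categorical_crossentropy accepts from_logits):

from keras import backend as K

def softmax_cross_entropy_with_logits_k(y_true, y_pred):
    # Where the target distribution is zero, push the logit to a large
    # negative value so that position gets ~0 probability mass.
    mask = K.cast(K.equal(y_true, 0.0), K.floatx())
    logits = y_pred * (1.0 - mask) + (-100.0) * mask
    # Softmax cross-entropy between the targets and the masked logits.
    return K.categorical_crossentropy(y_true, logits, from_logits=True)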

3.13.2.1. Representations

The board state is the most important piece of information. How should we represent it? Possible ideas:

  • a vector of 42 values
  • a 6x7 matrix

We decided to represent the state of the board as two 6x7 matrices: one representing the current player’s pieces, and the other the opponent’s pieces.

We also need to represent actions. Possible ideas:

  • 7 outputs, one for each column a piece could be dropped into
  • two outputs, one representing the row and the other the column
  • a 6x7 matrix, each cell representing a position on the grid
  • 42 outputs, each representing a position on the grid

We decided to represent them as the final option: 42 outputs.
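
A minimal sketch of both encodings (hypothetical helpers, not part of aima3), assuming the board is a dict mapping (column, row) to 'X' or 'O', with rows numbered from the bottom as in the moves printed above:

import numpy as np

def state_to_planes(board, to_move, v=6, h=7):
    # Plane 0 holds the pieces of the player to move; plane 1 the opponent's.
    planes = np.zeros((v, h, 2))
    for (col, row), piece in board.items():
        plane = 0 if piece == to_move else 1
        planes[v - row, col - 1, plane] = 1  # row 1 is the bottom row
    return planes

def action_to_index(col, row, v=6, h=7):
    # Flatten a (column, row) action into one of the 42 policy outputs.
    return (v - row) * h + (col - 1)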

The network architecture in AlphaZero is quite large, and has repeating blocks of layers. To help construct the network, we define some functions:

In [8]:
def add_conv_block(net, input_layer):
    cname = net.add(cx.Conv2DLayer("conv2d-%d",
                    filters=75,
                    kernel_size=(4,4),
                    padding='same',
                    use_bias=False,
                    activation='linear',
                    kernel_regularizer=regularizers.l2(0.0001)))
    bname = net.add(cx.BatchNormalizationLayer("batch-norm-%d", axis=1))
    lname = net.add(cx.LeakyReLULayer("leaky-relu-%d"))
    net.connect(input_layer, cname)
    net.connect(cname, bname)
    net.connect(bname, lname)
    return lname

def add_residual_block(net, input_layer):
    prev_layer = add_conv_block(net, input_layer)
    cname = net.add(cx.Conv2DLayer("conv2d-%d",
        filters=75,
        kernel_size=(4,4),
        padding='same',
        use_bias=False,
        activation='linear',
        kernel_regularizer=regularizers.l2(0.0001)))
    bname = net.add(cx.BatchNormalizationLayer("batch-norm-%d", axis=1))
    aname = net.add(cx.AddLayer("add-%d"))
    lname = net.add(cx.LeakyReLULayer("leaky-relu-%d"))
    net.connect(prev_layer, cname)
    net.connect(cname, bname)
    net.connect(input_layer, aname)
    net.connect(bname, aname)
    net.connect(aname, lname)
    return lname

def add_value_block(net, input_layer):
    l1 = net.add(cx.Conv2DLayer("conv2d-%d",
        filters=1,
        kernel_size=(1,1),
        padding='same',
        use_bias=False,
        activation='linear',
        kernel_regularizer=regularizers.l2(0.0001)))
    l2 = net.add(cx.BatchNormalizationLayer("batch-norm-%d", axis=1))
    l3 = net.add(cx.LeakyReLULayer("leaky-relu-%d"))
    l4 = net.add(cx.FlattenLayer("flatten-%d"))
    l5 = net.add(cx.Layer("dense-%d",
        20,
        use_bias=False,
        activation='linear',
        kernel_regularizer=regularizers.l2(0.0001)))
    l6 = net.add(cx.LeakyReLULayer("leaky-relu-%d"))
    l7 = net.add(cx.Layer('value_head',
        1,
        use_bias=False,
        activation='tanh',
        kernel_regularizer=regularizers.l2(0.0001)))
    net.connect(input_layer, l1)
    net.connect(l1, l2)
    net.connect(l2, l3)
    net.connect(l3, l4)
    net.connect(l4, l5)
    net.connect(l5, l6)
    net.connect(l6, l7)
    return l7

def add_policy_block(net, input_layer):
    l1 = net.add(cx.Conv2DLayer("conv2d-%d",
        filters=2,
        kernel_size=(1,1),
        padding='same',
        use_bias=False,
        activation='linear',
        kernel_regularizer = regularizers.l2(0.0001)))
    l2 = net.add(cx.BatchNormalizationLayer("batch-norm-%d", axis=1))
    l3 = net.add(cx.LeakyReLULayer("leaky-relu-%d"))
    l4 = net.add(cx.FlattenLayer("flatten-%d"))
    l5 = net.add(cx.Layer('policy_head',
            42,
            use_bias=False,
            activation='linear',
            kernel_regularizer=regularizers.l2(0.0001)))
    net.connect(input_layer, l1)
    net.connect(l1, l2)
    net.connect(l2, l3)
    net.connect(l3, l4)
    net.connect(l4, l5)
    return l5
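Note the "%d" in the layer names: conx fills in an auto-incrementing number, which is why the summary below shows conv2d-1, conv2d-2, and so on. Each residual block computes the identity-skip pattern from ResNet, output = leaky_relu(x + BN(conv(leaky_relu(BN(conv(x)))))), so the block learns a residual correction to its input rather than an entirely new transformation.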
In [26]:
def make_network(game, residuals=5):
    net = cx.Network("Residual CNN")
    net.add(cx.Layer("main_input", (game.v, game.h, 2)))
    out_layer = add_conv_block(net, "main_input")
    for i in range(residuals):
        out_layer = add_residual_block(net, out_layer)
    add_policy_block(net, out_layer)
    add_value_block(net, out_layer)
    net.compile(loss={'value_head': 'mean_squared_error',
                  'policy_head': softmax_cross_entropy_with_logits},
            optimizer=cx.SGD(lr=0.1, momentum=0.9),
            loss_weights={'value_head': 0.5,
                          'policy_head': 0.5})
    return net
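This compile call mirrors the AlphaZero training objective: for predicted value v and policy logits p, with targets z (the game outcome) and π (the search probabilities), the loss is (z − v)² − πᵀ log softmax(p) + c‖θ‖², with the two heads weighted equally here and the c‖θ‖² term supplied by the per-layer L2 regularizers (c = 0.0001).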
In [27]:
game = ConnectFour()
net = make_network(game)
In [12]:
net.model.summary()
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
main_input (InputLayer)         (None, 6, 7, 2)      0
__________________________________________________________________________________________________
conv2d-1 (Conv2D)               (None, 6, 7, 75)     2400        main_input[0][0]
__________________________________________________________________________________________________
batch-norm-1 (BatchNormalizatio (None, 6, 7, 75)     24          conv2d-1[0][0]
__________________________________________________________________________________________________
leaky-relu-1 (LeakyReLU)        (None, 6, 7, 75)     0           batch-norm-1[0][0]
__________________________________________________________________________________________________
conv2d-2 (Conv2D)               (None, 6, 7, 75)     90000       leaky-relu-1[0][0]
__________________________________________________________________________________________________
batch-norm-2 (BatchNormalizatio (None, 6, 7, 75)     24          conv2d-2[0][0]
__________________________________________________________________________________________________
leaky-relu-2 (LeakyReLU)        (None, 6, 7, 75)     0           batch-norm-2[0][0]
__________________________________________________________________________________________________
conv2d-3 (Conv2D)               (None, 6, 7, 75)     90000       leaky-relu-2[0][0]
__________________________________________________________________________________________________
batch-norm-3 (BatchNormalizatio (None, 6, 7, 75)     24          conv2d-3[0][0]
__________________________________________________________________________________________________
add-1 (Add)                     (None, 6, 7, 75)     0           leaky-relu-1[0][0]
                                                                 batch-norm-3[0][0]
__________________________________________________________________________________________________
leaky-relu-3 (LeakyReLU)        (None, 6, 7, 75)     0           add-1[0][0]
__________________________________________________________________________________________________
conv2d-4 (Conv2D)               (None, 6, 7, 75)     90000       leaky-relu-3[0][0]
__________________________________________________________________________________________________
batch-norm-4 (BatchNormalizatio (None, 6, 7, 75)     24          conv2d-4[0][0]
__________________________________________________________________________________________________
leaky-relu-4 (LeakyReLU)        (None, 6, 7, 75)     0           batch-norm-4[0][0]
__________________________________________________________________________________________________
conv2d-5 (Conv2D)               (None, 6, 7, 75)     90000       leaky-relu-4[0][0]
__________________________________________________________________________________________________
batch-norm-5 (BatchNormalizatio (None, 6, 7, 75)     24          conv2d-5[0][0]
__________________________________________________________________________________________________
add-2 (Add)                     (None, 6, 7, 75)     0           leaky-relu-3[0][0]
                                                                 batch-norm-5[0][0]
__________________________________________________________________________________________________
leaky-relu-5 (LeakyReLU)        (None, 6, 7, 75)     0           add-2[0][0]
__________________________________________________________________________________________________
conv2d-6 (Conv2D)               (None, 6, 7, 75)     90000       leaky-relu-5[0][0]
__________________________________________________________________________________________________
batch-norm-6 (BatchNormalizatio (None, 6, 7, 75)     24          conv2d-6[0][0]
__________________________________________________________________________________________________
leaky-relu-6 (LeakyReLU)        (None, 6, 7, 75)     0           batch-norm-6[0][0]
__________________________________________________________________________________________________
conv2d-7 (Conv2D)               (None, 6, 7, 75)     90000       leaky-relu-6[0][0]
__________________________________________________________________________________________________
batch-norm-7 (BatchNormalizatio (None, 6, 7, 75)     24          conv2d-7[0][0]
__________________________________________________________________________________________________
add-3 (Add)                     (None, 6, 7, 75)     0           leaky-relu-5[0][0]
                                                                 batch-norm-7[0][0]
__________________________________________________________________________________________________
leaky-relu-7 (LeakyReLU)        (None, 6, 7, 75)     0           add-3[0][0]
__________________________________________________________________________________________________
conv2d-8 (Conv2D)               (None, 6, 7, 75)     90000       leaky-relu-7[0][0]
__________________________________________________________________________________________________
batch-norm-8 (BatchNormalizatio (None, 6, 7, 75)     24          conv2d-8[0][0]
__________________________________________________________________________________________________
leaky-relu-8 (LeakyReLU)        (None, 6, 7, 75)     0           batch-norm-8[0][0]
__________________________________________________________________________________________________
conv2d-9 (Conv2D)               (None, 6, 7, 75)     90000       leaky-relu-8[0][0]
__________________________________________________________________________________________________
batch-norm-9 (BatchNormalizatio (None, 6, 7, 75)     24          conv2d-9[0][0]
__________________________________________________________________________________________________
add-4 (Add)                     (None, 6, 7, 75)     0           leaky-relu-7[0][0]
                                                                 batch-norm-9[0][0]
__________________________________________________________________________________________________
leaky-relu-9 (LeakyReLU)        (None, 6, 7, 75)     0           add-4[0][0]
__________________________________________________________________________________________________
conv2d-10 (Conv2D)              (None, 6, 7, 75)     90000       leaky-relu-9[0][0]
__________________________________________________________________________________________________
batch-norm-10 (BatchNormalizati (None, 6, 7, 75)     24          conv2d-10[0][0]
__________________________________________________________________________________________________
leaky-relu-10 (LeakyReLU)       (None, 6, 7, 75)     0           batch-norm-10[0][0]
__________________________________________________________________________________________________
conv2d-11 (Conv2D)              (None, 6, 7, 75)     90000       leaky-relu-10[0][0]
__________________________________________________________________________________________________
batch-norm-11 (BatchNormalizati (None, 6, 7, 75)     24          conv2d-11[0][0]
__________________________________________________________________________________________________
add-5 (Add)                     (None, 6, 7, 75)     0           leaky-relu-9[0][0]
                                                                 batch-norm-11[0][0]
__________________________________________________________________________________________________
leaky-relu-11 (LeakyReLU)       (None, 6, 7, 75)     0           add-5[0][0]
__________________________________________________________________________________________________
conv2d-13 (Conv2D)              (None, 6, 7, 1)      75          leaky-relu-11[0][0]
__________________________________________________________________________________________________
batch-norm-13 (BatchNormalizati (None, 6, 7, 1)      24          conv2d-13[0][0]
__________________________________________________________________________________________________
conv2d-12 (Conv2D)              (None, 6, 7, 2)      150         leaky-relu-11[0][0]
__________________________________________________________________________________________________
leaky-relu-13 (LeakyReLU)       (None, 6, 7, 1)      0           batch-norm-13[0][0]
__________________________________________________________________________________________________
batch-norm-12 (BatchNormalizati (None, 6, 7, 2)      24          conv2d-12[0][0]
__________________________________________________________________________________________________
flatten-2 (Flatten)             (None, 42)           0           leaky-relu-13[0][0]
__________________________________________________________________________________________________
leaky-relu-12 (LeakyReLU)       (None, 6, 7, 2)      0           batch-norm-12[0][0]
__________________________________________________________________________________________________
dense-1 (Dense)                 (None, 20)           840         flatten-2[0][0]
__________________________________________________________________________________________________
flatten-1 (Flatten)             (None, 84)           0           leaky-relu-12[0][0]
__________________________________________________________________________________________________
leaky-relu-14 (LeakyReLU)       (None, 20)           0           dense-1[0][0]
__________________________________________________________________________________________________
policy_head (Dense)             (None, 42)           3528        flatten-1[0][0]
__________________________________________________________________________________________________
value_head (Dense)              (None, 1)            20          leaky-relu-14[0][0]
==================================================================================================
Total params: 907,325
Trainable params: 907,169
Non-trainable params: 156
__________________________________________________________________________________________________
In [13]:
len(net.layers)
Out[13]:
51
In [14]:
net.render(height="15000px")
Out[14]:
[Network diagram: conx renders the “Residual CNN” from the main_input layer, through the initial convolutional block and five residual blocks, to the policy_head and value_head outputs.]
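
A hedged usage sketch of the untrained network (assuming conx’s net.propagate takes one input sample and returns the activations of both output banks):

# An empty board: no pieces on either plane.
state = np.zeros((6, 7, 2)).tolist()
outputs = net.propagate(state)  # assumed order: [policy_head, value_head]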