Execution Time: 1.17s

Test: TMVA-DNN-BatchNormalization-Cpu (Passed)
Build: master-x86_64-centos7-gcc48-dbg (olhswep22.cern.ch) on 2020-01-21 09:01:00

Test Timing: Passed
Processors: 1


Test output
Testing Backpropagation:
addBNormLayer 1 , 1 , 2 , 3  2  1
DEEP NEURAL NETWORK:   Depth = 3  Input = ( 1, 3, 4 )  Batch size = 3  Loss function = R
	Layer 0	 DENSE Layer: 	 ( Input =     4 , Width =     2 ) 	Output = (  1 ,     3 ,     2 ) 	 Activation Function = Identity
	Layer 1	 BATCH NORM Layer: 	 Input/Output = ( 3 , 2 , 1 ) 	 Norm dim =     2	 axis = -1

	Layer 2	 DENSE Layer: 	 ( Input =     2 , Width =     1 ) 	Output = (  1 ,     3 ,     1 ) 	 Activation Function = Identity
input 
input network X  size = 12 shape = { 3 , 4 , 1 }  tensor count 1
{ { -0.66186 0.0464238 -1.04496 1.15911  } 
{ -0.127882 -0.409349 -0.448258 0.961617  } 
{ 0.396235 -1.29833 2.55896 1.92693  } 
 } 
weights layer 0

2x4 matrix is as follows

     |      0    |      1    |      2    |      3    |
---------------------------------------------------------
   0 |     0.4492      0.4183       1.813      -1.131 
   1 |     -1.783     0.06541     -0.7713     -0.7706 

Tensor shape : { 3 , 2 , 1 }  Layout : ColMajor
Tensor shape : { 3 , 2 , 1 }  Layout : ColMajor
Tensor shape : { 3 , 1 , 1 }  Layout : ColMajor
output of layer before BNorm layer 
input BN layer size = 6 shape = { 3 , 2 , 1 }  tensor count 1
{ { -3.48347 1.09606  } 
{ -2.12903 -0.19404  } 
{ 2.09505 -4.25027  } 
 } 
output BN 
Mean and std of BN inputs
output DL feature 0 mean -1.17248	output DL std 2.90968
output DL feature 1 mean -1.11609	output DL std 2.78988
output of BN 
output BN layer size = 6 shape = { 3 , 2 , 1 }  tensor count 1
{ { -0.972734 0.971111  } 
{ -0.402629 0.40477  } 
{ 1.37536 -1.37588  } 
 } 
Mean and std of BN outputs
output BN feature 0 mean -1.4803e-16	output BN std 1.22473
output BN feature 1 mean -7.40149e-17	output BN std 1.22473
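The per-feature statistics above can be reproduced with a short NumPy sketch on the logged BN-layer inputs. The epsilon value and the biased-variance normalization below are assumptions inferred from the printed numbers, not TMVA's actual implementation:

```python
import numpy as np

# BN-layer inputs from the log above: batch of 3 samples, 2 features
x = np.array([[-3.48347,  1.09606],
              [-2.12903, -0.19404],
              [ 2.09505, -4.25027]])
eps = 1e-4  # hypothetical epsilon, chosen to match the printed output std

mean = x.mean(axis=0)                 # ~ [-1.17248, -1.11609], as logged
var = x.var(axis=0)                   # biased (population) variance
y = (x - mean) / np.sqrt(var + eps)   # normalized outputs

print(x.std(axis=0, ddof=1))          # unbiased std ~ [2.90968, 2.78988]
print(y.std(axis=0, ddof=1))          # ~ 1.2247 per feature
```

Note the logged output std of 1.22473 rather than 1: normalizing with the biased variance and then reporting the unbiased std gives sqrt(n/(n-1)) = sqrt(3/2) ≈ 1.2247 for a batch of 3.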

Do backward pass.....
Computed BN dx size = 6 shape = { 3 , 2 , 1 }  tensor count 1
{ { -0.000239556 -0.00025004  } 
{ 0.000314856 0.000328291  } 
{ -7.53005e-05 -7.82512e-05  } 
 } 
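The dx tensor above is the batch-norm backward pass applied to the incoming loss gradient. A generic sketch of that computation, under the usual biased-variance convention — the eps value and the gamma/beta handling here are illustrative assumptions, not TMVA's exact code:

```python
import numpy as np

def bn_forward(x, gamma, beta, eps=1e-4):
    """Training-mode batch norm over the batch axis (axis 0)."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)                        # biased variance
    xhat = (x - mean) / np.sqrt(var + eps)
    return gamma * xhat + beta, xhat, var

def bn_backward(dy, xhat, var, gamma, eps=1e-4):
    """Gradient of the loss w.r.t. the BN input x.

    dx = (dxhat - mean(dxhat) - xhat * mean(dxhat * xhat)) / sqrt(var + eps)
    """
    dxhat = dy * gamma
    return (dxhat - dxhat.mean(axis=0)
            - xhat * (dxhat * xhat).mean(axis=0)) / np.sqrt(var + eps)
```

The subtraction of the two mean terms is what makes the logged dx values so small: gradients that are constant across the batch, or aligned with xhat, are projected out.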

Test gradients.....
Testing weight gradients   for    layer 2
weight gradient for layer 2 component 0
weights for layer 2
dW( 0 , 0 )  numeric = -0.429422 from BP = -0.429422
dW( 0 , 1 )  numeric = 0.429418 from BP = 0.429418
max error = 1.69731e-13
Testing weight gradients   for    layer 1
weight gradient for layer 1 component 0
weights for layer 1
dW( 0 , 0 )  numeric = 0.360484 from BP = 0.360484
dW( 0 , 1 )  numeric = -0.268279 from BP = -0.268279
Testing weight gradients   for    layer 1
weight gradient for layer 1 component 1
weights for layer 1
dW( 0 , 0 )  numeric = 9.71445e-13 from BP = -3.60822e-16
dW( 0 , 1 )  numeric = 2.77556e-13 from BP = -2.63678e-16
max error = 2.29492e-12
Testing weight gradients   for    layer 0
weight gradient for layer 0 component 0
weights for layer 0
dW( 0 , 0 )  numeric = 8.84513e-05 from BP = 8.84513e-05
dW( 0 , 1 )  numeric = -4.22427e-05 from BP = -4.22426e-05
dW( 0 , 2 )  numeric = -8.35027e-05 from BP = -8.35027e-05
dW( 0 , 3 )  numeric = -0.000119999 from BP = -0.000119999
dW( 1 , 0 )  numeric = 9.25031e-05 from BP = 9.25031e-05
dW( 1 , 1 )  numeric = -4.43979e-05 from BP = -4.43979e-05
dW( 1 , 2 )  numeric = -8.61204e-05 from BP = -8.61204e-05
dW( 1 , 3 )  numeric = -0.000124918 from BP = -0.000124918
max error = 1.37802e-08
Testing weight gradients:      maximum relative error: 1.37802e-08
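The "numeric = ... from BP = ..." comparisons above are a standard finite-difference gradient check: each weight is perturbed and a central-difference derivative is compared against the backpropagated gradient. A minimal sketch of the idea, using a hypothetical quadratic loss in place of the network:

```python
import numpy as np

def numeric_grad(f, w, i, h=1e-5):
    """Central-difference derivative of scalar loss f w.r.t. w[i]."""
    wp, wm = w.copy(), w.copy()
    wp[i] += h
    wm[i] -= h
    return (f(wp) - f(wm)) / (2 * h)

# Toy loss: L(w) = 0.5 * ||w||^2, whose analytic gradient is w itself
loss = lambda w: 0.5 * np.dot(w, w)
w = np.array([0.4492, -1.783, 1.813])
max_err = max(abs(numeric_grad(loss, w, i) - w[i]) for i in range(w.size))
print(max_err)   # limited only by floating-point rounding in the differences
```

A "max error" at the 1e-8 level, as reported above, is consistent with finite-difference truncation and rounding noise rather than a backpropagation bug.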
addBNormLayer 3 , 4 , 4 , 3  16  10
addBNormLayer 1 , 1 , 4 , 10  4  1
DEEP NEURAL NETWORK:   Depth = 6  Input = ( 2, 4, 4 )  Batch size = 10  Loss function = R
	Layer 0	 CONV LAYER: 	( W = 4 ,  H = 4 ,  D = 3 ) 	 Filter ( W = 3 ,  H = 3 ) 	Output = ( 10 , 3 , 3 , 16 ) 	 Activation Function = Relu
	Layer 1	 BATCH NORM Layer: 	 Input/Output = ( 3 , 16 , 10 ) 	 Norm dim =     3	 axis = 1

	Layer 2	 RESHAPE Layer 	 Input = ( 3 , 4 , 4 ) 	Output = ( 1 , 10 , 48 ) 
	Layer 3	 DENSE Layer: 	 ( Input =    48 , Width =     4 ) 	Output = (  1 ,    10 ,     4 ) 	 Activation Function = Sigmoid
	Layer 4	 BATCH NORM Layer: 	 Input/Output = ( 10 , 4 , 1 ) 	 Norm dim =     4	 axis = -1

	Layer 5	 DENSE Layer: 	 ( Input =     4 , Width =     1 ) 	Output = (  1 ,    10 ,     1 ) 	 Activation Function = Identity
Testing weight gradients   for    layer 5
weight gradient for layer 5 component 0
weights for layer 5
dW( 0 , 0 )  numeric = -0.593681 from BP = -0.593681
dW( 0 , 1 )  numeric = 0.633379 from BP = 0.633379
dW( 0 , 2 )  numeric = 0.33767 from BP = 0.33767
dW( 0 , 3 )  numeric = 0.012933 from BP = 0.012933
max error = 1.26956e-11
Testing weight gradients   for    layer 4
weight gradient for layer 4 component 0
weights for layer 4
dW( 0 , 0 )  numeric = 0.153573 from BP = 0.153573
dW( 0 , 1 )  numeric = 0.188585 from BP = 0.188585
dW( 0 , 2 )  numeric = -0.0196618 from BP = -0.0196618
dW( 0 , 3 )  numeric = 0.00141664 from BP = 0.00141664
Testing weight gradients   for    layer 4
weight gradient for layer 4 component 1
weights for layer 4
dW( 0 , 0 )  numeric = 4.16334e-13 from BP = -2.60209e-17
dW( 0 , 1 )  numeric = 0 from BP = 4.16334e-17
dW( 0 , 2 )  numeric = 4.62593e-14 from BP = -4.77049e-18
dW( 0 , 3 )  numeric = -4.62593e-14 from BP = 1.17094e-17
max error = 2.24121e-10
Testing weight gradients   for    layer 3
weight gradient for layer 3 component 0
weights for layer 3
dW( 0 , 0 )  numeric = 0.011693 from BP = 0.011693
dW( 0 , 1 )  numeric = -0.0141787 from BP = -0.0141787
dW( 0 , 2 )  numeric = -0.0194687 from BP = -0.0194687
dW( 0 , 3 )  numeric = 0.0364377 from BP = 0.0364377
dW( 0 , 4 )  numeric = 0.00552727 from BP = 0.00552727
dW( 0 , 5 )  numeric = -0.00984608 from BP = -0.00984608
dW( 0 , 6 )  numeric = 0.0124439 from BP = 0.0124439
dW( 0 , 7 )  numeric = 0.0178001 from BP = 0.0178001
dW( 0 , 8 )  numeric = 0.018108 from BP = 0.018108
dW( 0 , 9 )  numeric = -0.00876628 from BP = -0.00876628
dW( 0 , 10 )  numeric = -0.00613616 from BP = -0.00613616
dW( 0 , 11 )  numeric = 0.0165015 from BP = 0.0165015
dW( 0 , 12 )  numeric = -0.000341236 from BP = -0.000341236
dW( 0 , 13 )  numeric = 0.0093476 from BP = 0.0093476
dW( 0 , 14 )  numeric = 0.0204513 from BP = 0.0204513
dW( 0 , 15 )  numeric = 0.00442031 from BP = 0.00442031
dW( 0 , 16 )  numeric = -0.0275995 from BP = -0.0275995
dW( 0 , 17 )  numeric = 0.0195901 from BP = 0.0195901
dW( 0 , 18 )  numeric = -0.00515916 from BP = -0.00515916
dW( 0 , 19 )  numeric = 0.0012612 from BP = 0.0012612
dW( 0 , 20 )  numeric = -0.0213553 from BP = -0.0213553
dW( 0 , 21 )  numeric = -0.031119 from BP = -0.031119
dW( 0 , 22 )  numeric = 0.00626309 from BP = 0.00626309
dW( 0 , 23 )  numeric = 0.0319289 from BP = 0.0319289
dW( 0 , 24 )  numeric = -0.00782274 from BP = -0.00782274
dW( 0 , 25 )  numeric = 0.0172174 from BP = 0.0172174
dW( 0 , 26 )  numeric = 0.060891 from BP = 0.060891
dW( 0 , 27 )  numeric = -0.0134967 from BP = -0.0134967
dW( 0 , 28 )  numeric = 0.00522573 from BP = 0.00522573
dW( 0 , 29 )  numeric = -0.00214397 from BP = -0.00214397
dW( 0 , 30 )  numeric = -0.0426011 from BP = -0.0426011
dW( 0 , 31 )  numeric = -0.0251659 from BP = -0.0251659
dW( 0 , 32 )  numeric = 0.0241915 from BP = 0.0241915
dW( 0 , 33 )  numeric = -0.0780843 from BP = -0.0780843
dW( 0 , 34 )  numeric = -0.0166977 from BP = -0.0166977
dW( 0 , 35 )  numeric = -0.00963945 from BP = -0.00963945
dW( 0 , 36 )  numeric = 0.0307449 from BP = 0.0307449
dW( 0 , 37 )  numeric = -0.0465663 from BP = -0.0465663
dW( 0 , 38 )  numeric = -0.0424277 from BP = -0.0424277
dW( 0 , 39 )  numeric = 0.0175175 from BP = 0.0175175
dW( 0 , 40 )  numeric = 0.0895761 from BP = 0.0895761
dW( 0 , 41 )  numeric = 0.0274651 from BP = 0.0274651
dW( 0 , 42 )  numeric = 0.0124737 from BP = 0.0124737
dW( 0 , 43 )  numeric = 0.0247722 from BP = 0.0247722
dW( 0 , 44 )  numeric = -0.00153135 from BP = -0.00153135
dW( 0 , 45 )  numeric = -0.0313544 from BP = -0.0313544
dW( 0 , 46 )  numeric = -0.0184807 from BP = -0.0184807
dW( 0 , 47 )  numeric = 0.0247586 from BP = 0.0247586
dW( 1 , 0 )  numeric = -0.0183305 from BP = -0.0183305
dW( 1 , 1 )  numeric = -0.0346746 from BP = -0.0346746
dW( 1 , 2 )  numeric = 0.0996906 from BP = 0.0996906
dW( 1 , 3 )  numeric = -0.0800985 from BP = -0.0800985
dW( 1 , 4 )  numeric = -0.0202459 from BP = -0.0202459
dW( 1 , 5 )  numeric = -0.00720714 from BP = -0.00720714
dW( 1 , 6 )  numeric = 0.0304232 from BP = 0.0304232
dW( 1 , 7 )  numeric = -0.238044 from BP = -0.238044
dW( 1 , 8 )  numeric = -0.0303159 from BP = -0.0303159
dW( 1 , 9 )  numeric = -0.0346238 from BP = -0.0346238
dW( 1 , 10 )  numeric = 0.00651353 from BP = 0.00651353
dW( 1 , 11 )  numeric = -0.0475934 from BP = -0.0475934
dW( 1 , 12 )  numeric = 0.0590261 from BP = 0.0590261
dW( 1 , 13 )  numeric = -0.0932059 from BP = -0.0932059
dW( 1 , 14 )  numeric = -0.0519485 from BP = -0.0519485
dW( 1 , 15 )  numeric = -0.00184467 from BP = -0.00184467
dW( 1 , 16 )  numeric = 0.0160951 from BP = 0.0160951
dW( 1 , 17 )  numeric = -0.0419918 from BP = -0.0419918
dW( 1 , 18 )  numeric = -0.0972033 from BP = -0.0972033
dW( 1 , 19 )  numeric = 0.00287754 from BP = 0.00287754
dW( 1 , 20 )  numeric = 0.0291675 from BP = 0.0291675
dW( 1 , 21 )  numeric = -0.00650401 from BP = -0.00650401
dW( 1 , 22 )  numeric = -0.289884 from BP = -0.289884
dW( 1 , 23 )  numeric = 0.029863 from BP = 0.029863
dW( 1 , 24 )  numeric = -0.0069129 from BP = -0.0069129
dW( 1 , 25 )  numeric = -0.348227 from BP = -0.348227
dW( 1 , 26 )  numeric = -0.458504 from BP = -0.458504
dW( 1 , 27 )  numeric = -0.106376 from BP = -0.106376
dW( 1 , 28 )  numeric = -0.127682 from BP = -0.127682
dW( 1 , 29 )  numeric = -0.0164232 from BP = -0.0164232
dW( 1 , 30 )  numeric = -0.0337705 from BP = -0.0337705
dW( 1 , 31 )  numeric = -0.0613847 from BP = -0.0613847
dW( 1 , 32 )  numeric = -0.238793 from BP = -0.238793
dW( 1 , 33 )  numeric = 0.0646477 from BP = 0.0646477
dW( 1 , 34 )  numeric = 0.101816 from BP = 0.101816
dW( 1 , 35 )  numeric = -0.0194105 from BP = -0.0194105
dW( 1 , 36 )  numeric = 0.032816 from BP = 0.032816
dW( 1 , 37 )  numeric = 0.165735 from BP = 0.165735
dW( 1 , 38 )  numeric = 0.201773 from BP = 0.201773
dW( 1 , 39 )  numeric = -0.0114027 from BP = -0.0114027
dW( 1 , 40 )  numeric = 0.0692641 from BP = 0.0692641
dW( 1 , 41 )  numeric = -0.14882 from BP = -0.14882
dW( 1 , 42 )  numeric = 0.138361 from BP = 0.138361
dW( 1 , 43 )  numeric = 0.0297947 from BP = 0.0297947
dW( 1 , 44 )  numeric = -0.0622388 from BP = -0.0622388
dW( 1 , 45 )  numeric = 0.114628 from BP = 0.114628
dW( 1 , 46 )  numeric = 0.0604447 from BP = 0.0604447
dW( 1 , 47 )  numeric = 0.0557016 from BP = 0.0557016
dW( 2 , 0 )  numeric = 0.00416355 from BP = 0.00416355
dW( 2 , 1 )  numeric = -0.00329904 from BP = -0.00329904
dW( 2 , 2 )  numeric = -0.0105286 from BP = -0.0105286
dW( 2 , 3 )  numeric = 0.0148296 from BP = 0.0148296
dW( 2 , 4 )  numeric = -0.000229254 from BP = -0.000229254
dW( 2 , 5 )  numeric = 6.60124e-05 from BP = 6.60124e-05
dW( 2 , 6 )  numeric = 0.00436356 from BP = 0.00436356
dW( 2 , 7 )  numeric = 0.0161881 from BP = 0.0161881
dW( 2 , 8 )  numeric = 0.00724598 from BP = 0.00724598
dW( 2 , 9 )  numeric = -0.00775281 from BP = -0.00775281
dW( 2 , 10 )  numeric = -0.00249892 from BP = -0.00249892
dW( 2 , 11 )  numeric = 0.0103836 from BP = 0.0103836
dW( 2 , 12 )  numeric = -0.000901649 from BP = -0.000901649
dW( 2 , 13 )  numeric = 0.0110248 from BP = 0.0110248
dW( 2 , 14 )  numeric = 0.00485818 from BP = 0.00485818
dW( 2 , 15 )  numeric = 0.000311237 from BP = 0.000311237
dW( 2 , 16 )  numeric = -0.0149826 from BP = -0.0149826
dW( 2 , 17 )  numeric = 0.0053753 from BP = 0.0053753
dW( 2 , 18 )  numeric = -0.00373039 from BP = -0.00373039
dW( 2 , 19 )  numeric = -0.00709904 from BP = -0.00709904
dW( 2 , 20 )  numeric = -0.00256257 from BP = -0.00256257
dW( 2 , 21 )  numeric = 0.00412445 from BP = 0.00412445
dW( 2 , 22 )  numeric = 0.0227037 from BP = 0.0227037
dW( 2 , 23 )  numeric = 0.015258 from BP = 0.015258
dW( 2 , 24 )  numeric = 0.00093318 from BP = 0.00093318
dW( 2 , 25 )  numeric = 0.0198755 from BP = 0.0198755
dW( 2 , 26 )  numeric = 0.0339466 from BP = 0.0339466
dW( 2 , 27 )  numeric = -0.0130698 from BP = -0.0130698
dW( 2 , 28 )  numeric = 0.00641155 from BP = 0.00641155
dW( 2 , 29 )  numeric = 0.00903051 from BP = 0.00903051
dW( 2 , 30 )  numeric = 0.0045544 from BP = 0.0045544
dW( 2 , 31 )  numeric = 0.0070187 from BP = 0.0070187
dW( 2 , 32 )  numeric = 0.0168548 from BP = 0.0168548
dW( 2 , 33 )  numeric = -0.020651 from BP = -0.020651
dW( 2 , 34 )  numeric = -0.0125422 from BP = -0.0125422
dW( 2 , 35 )  numeric = 0.00228222 from BP = 0.00228222
dW( 2 , 36 )  numeric = 0.0103837 from BP = 0.0103837
dW( 2 , 37 )  numeric = -0.0299219 from BP = -0.0299219
dW( 2 , 38 )  numeric = -0.0263874 from BP = -0.0263874
dW( 2 , 39 )  numeric = 0.00855703 from BP = 0.00855703
dW( 2 , 40 )  numeric = 0.0213195 from BP = 0.0213195
dW( 2 , 41 )  numeric = 0.0198082 from BP = 0.0198082
dW( 2 , 42 )  numeric = -0.00321636 from BP = -0.00321636
dW( 2 , 43 )  numeric = 0.00600621 from BP = 0.00600621
dW( 2 , 44 )  numeric = 0.00085532 from BP = 0.00085532
dW( 2 , 45 )  numeric = -0.0160915 from BP = -0.0160915
dW( 2 , 46 )  numeric = -0.0114722 from BP = -0.0114722
dW( 2 , 47 )  numeric = 0.00945535 from BP = 0.00945535
dW( 3 , 0 )  numeric = -0.00658555 from BP = -0.00658555
dW( 3 , 1 )  numeric = -0.00303606 from BP = -0.00303606
dW( 3 , 2 )  numeric = 0.0261299 from BP = 0.0261299
dW( 3 , 3 )  numeric = -0.0258231 from BP = -0.0258231
dW( 3 , 4 )  numeric = -0.00987731 from BP = -0.00987731
dW( 3 , 5 )  numeric = -0.00788197 from BP = -0.00788197
dW( 3 , 6 )  numeric = -0.00546868 from BP = -0.00546868
dW( 3 , 7 )  numeric = -0.0464107 from BP = -0.0464107
dW( 3 , 8 )  numeric = -0.0145477 from BP = -0.0145477
dW( 3 , 9 )  numeric = 0.000306401 from BP = 0.000306401
dW( 3 , 10 )  numeric = -0.00115982 from BP = -0.00115982
dW( 3 , 11 )  numeric = -0.0138407 from BP = -0.0138407
dW( 3 , 12 )  numeric = 0.00838484 from BP = 0.00838484
dW( 3 , 13 )  numeric = -0.0364537 from BP = -0.0364537
dW( 3 , 14 )  numeric = -0.0246322 from BP = -0.0246322
dW( 3 , 15 )  numeric = -0.00257659 from BP = -0.00257659
dW( 3 , 16 )  numeric = 0.0116844 from BP = 0.0116844
dW( 3 , 17 )  numeric = -0.0108703 from BP = -0.0108703
dW( 3 , 18 )  numeric = 0.00147481 from BP = 0.00147481
dW( 3 , 19 )  numeric = 0.00435134 from BP = 0.00435134
dW( 3 , 20 )  numeric = 0.0103469 from BP = 0.0103469
dW( 3 , 21 )  numeric = 0.00034676 from BP = 0.00034676
dW( 3 , 22 )  numeric = -0.0874167 from BP = -0.0874167
dW( 3 , 23 )  numeric = -0.020503 from BP = -0.020503
dW( 3 , 24 )  numeric = -0.00297336 from BP = -0.00297336
dW( 3 , 25 )  numeric = -0.0636041 from BP = -0.0636041
dW( 3 , 26 )  numeric = -0.0979918 from BP = -0.0979918
dW( 3 , 27 )  numeric = 0.0146385 from BP = 0.0146385
dW( 3 , 28 )  numeric = -0.0369261 from BP = -0.0369261
dW( 3 , 29 )  numeric = -0.0119248 from BP = -0.0119248
dW( 3 , 30 )  numeric = -0.00144288 from BP = -0.00144288
dW( 3 , 31 )  numeric = -0.0132603 from BP = -0.0132603
dW( 3 , 32 )  numeric = -0.0408387 from BP = -0.0408387
dW( 3 , 33 )  numeric = 0.0416896 from BP = 0.0416896
dW( 3 , 34 )  numeric = 0.0285859 from BP = 0.0285859
dW( 3 , 35 )  numeric = -0.00353452 from BP = -0.00353452
dW( 3 , 36 )  numeric = -0.00627903 from BP = -0.00627903
dW( 3 , 37 )  numeric = 0.057868 from BP = 0.057868
dW( 3 , 38 )  numeric = 0.0604269 from BP = 0.0604269
dW( 3 , 39 )  numeric = 0.00107904 from BP = 0.00107904
dW( 3 , 40 )  numeric = -0.020104 from BP = -0.020104
dW( 3 , 41 )  numeric = -0.0459166 from BP = -0.0459166
dW( 3 , 42 )  numeric = 0.0172773 from BP = 0.0172773
dW( 3 , 43 )  numeric = -0.00107886 from BP = -0.00107886
dW( 3 , 44 )  numeric = -0.00209803 from BP = -0.00209803
dW( 3 , 45 )  numeric = 0.0309413 from BP = 0.0309413
dW( 3 , 46 )  numeric = 0.0192917 from BP = 0.0192917
dW( 3 , 47 )  numeric = -0.00630462 from BP = -0.00630462
max error = 3.47686e-09
Testing weight gradients   for    layer 1
weight gradient for layer 1 component 0
weights for layer 1
dW( 0 , 0 )  numeric = -0.0345151 from BP = -0.0345151
dW( 0 , 1 )  numeric = 0.0291101 from BP = 0.0291101
dW( 0 , 2 )  numeric = 0.0046172 from BP = 0.0046172
Testing weight gradients   for    layer 1
weight gradient for layer 1 component 1
weights for layer 1
dW( 0 , 0 )  numeric = 0.0754103 from BP = 0.0754103
dW( 0 , 1 )  numeric = -0.089697 from BP = -0.089697
dW( 0 , 2 )  numeric = -0.0209937 from BP = -0.0209937
max error = 3.47686e-09
Testing weight gradients   for    layer 0
weight gradient for layer 0 component 0
weights for layer 0
dW( 0 , 0 )  numeric = -0.0106997 from BP = -0.0106997
dW( 0 , 1 )  numeric = -0.0880306 from BP = -0.0880306
dW( 0 , 2 )  numeric = -0.0492752 from BP = -0.0492752
dW( 0 , 3 )  numeric = -0.0349176 from BP = -0.0349176
dW( 0 , 4 )  numeric = 0.142384 from BP = 0.142384
dW( 0 , 5 )  numeric = 0.0780165 from BP = 0.0780165
dW( 0 , 6 )  numeric = 0.0143353 from BP = 0.0143353
dW( 0 , 7 )  numeric = 0.185289 from BP = 0.185289
dW( 0 , 8 )  numeric = -0.0291604 from BP = -0.0291604
dW( 0 , 9 )  numeric = -0.174304 from BP = -0.174304
dW( 0 , 10 )  numeric = -0.0107562 from BP = -0.0107562
dW( 0 , 11 )  numeric = -0.158057 from BP = -0.158057
dW( 0 , 12 )  numeric = 0.223344 from BP = 0.223344
dW( 0 , 13 )  numeric = 0.132804 from BP = 0.132804
dW( 0 , 14 )  numeric = 0.130389 from BP = 0.130389
dW( 0 , 15 )  numeric = -0.0921615 from BP = -0.0921615
dW( 0 , 16 )  numeric = -0.0646975 from BP = -0.0646975
dW( 0 , 17 )  numeric = 0.00288743 from BP = 0.00288743
dW( 1 , 0 )  numeric = -0.0323214 from BP = -0.0323214
dW( 1 , 1 )  numeric = 0.132871 from BP = 0.132871
dW( 1 , 2 )  numeric = -0.0859449 from BP = -0.0859449
dW( 1 , 3 )  numeric = 0.00632787 from BP = 0.00632787
dW( 1 , 4 )  numeric = -0.0157296 from BP = -0.0157296
dW( 1 , 5 )  numeric = 0.0627083 from BP = 0.0627083
dW( 1 , 6 )  numeric = 0.121339 from BP = 0.121339
dW( 1 , 7 )  numeric = -0.0949467 from BP = -0.0949467
dW( 1 , 8 )  numeric = -0.00402903 from BP = -0.00402903
dW( 1 , 9 )  numeric = 0.0623545 from BP = 0.0623545
dW( 1 , 10 )  numeric = 0.0389294 from BP = 0.0389294
dW( 1 , 11 )  numeric = 0.15194 from BP = 0.15194
dW( 1 , 12 )  numeric = -0.0808012 from BP = -0.0808012
dW( 1 , 13 )  numeric = 0.103782 from BP = 0.103782
dW( 1 , 14 )  numeric = -0.0421424 from BP = -0.0421424
dW( 1 , 15 )  numeric = -0.0818798 from BP = -0.0818798
dW( 1 , 16 )  numeric = -0.015355 from BP = -0.015355
dW( 1 , 17 )  numeric = 0.0289404 from BP = 0.0289404
dW( 2 , 0 )  numeric = 0.0918059 from BP = 0.0918059
dW( 2 , 1 )  numeric = 0.00537019 from BP = 0.00537019
dW( 2 , 2 )  numeric = -0.0698671 from BP = -0.0698671
dW( 2 , 3 )  numeric = 0.0329446 from BP = 0.0329446
dW( 2 , 4 )  numeric = -0.110113 from BP = -0.110113
dW( 2 , 5 )  numeric = -0.055296 from BP = -0.055296
dW( 2 , 6 )  numeric = 0.112207 from BP = 0.112207
dW( 2 , 7 )  numeric = -0.0154783 from BP = -0.0154783
dW( 2 , 8 )  numeric = 0.0513045 from BP = 0.0513045
dW( 2 , 9 )  numeric = 0.014564 from BP = 0.014564
dW( 2 , 10 )  numeric = -0.0147 from BP = -0.0147
dW( 2 , 11 )  numeric = 0.0517792 from BP = 0.0517792
dW( 2 , 12 )  numeric = -0.0526463 from BP = -0.0526463
dW( 2 , 13 )  numeric = 0.018029 from BP = 0.018029
dW( 2 , 14 )  numeric = 0.040201 from BP = 0.040201
dW( 2 , 15 )  numeric = -0.0359345 from BP = -0.0359345
dW( 2 , 16 )  numeric = -0.0383778 from BP = -0.0383778
dW( 2 , 17 )  numeric = 0.0160556 from BP = 0.0160556
max error = 3.47686e-09
Testing weight gradients:      maximum relative error: 3.47686e-09