Deep Neural Network Training Performance
Model Name: MacBook Pro
Model Identifier: MacBookPro10,1
Processor Name: Intel Core i7
Processor Speed: 2.6 GHz
Number of Processors: 1
Total Number of Cores: 4
L2 Cache (per Core): 256 KB
L3 Cache: 6 MB
Memory: 16 GB
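The block above matches the format of the macOS hardware overview report; its provenance is an assumption (the gist does not say how it was captured), but the standard way to produce it is system_profiler SPHardwareDataType. A minimal Python sketch:

# Sketch: capture the macOS hardware overview that the spec block above
# appears to come from. Provenance is an assumption; the command itself
# (system_profiler SPHardwareDataType) is standard on macOS.
import subprocess

report = subprocess.run(
    ["system_profiler", "SPHardwareDataType"],
    capture_output=True, text=True, check=True,
).stdout
print(report)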
CPU-only performance:
training loss at step 0: 5.18 (2017-03-26 23:00:12.958387)
training loss at step 10: 6.79 (2017-03-26 23:02:21.577157)
training loss at step 20: 3.92 (2017-03-26 23:04:36.139422)
training loss at step 30: 3.57 (2017-03-26 23:06:38.333611)
training loss at step 40: 5.11 (2017-03-26 23:08:37.515464)
training loss at step 50: 4.66 (2017-03-26 23:10:37.834948)
training loss at step 60: 3.52 (2017-03-26 23:12:37.144137)
training loss at step 70: 3.23 (2017-03-26 23:14:36.581871)
training loss at step 80: 3.22 (2017-03-26 23:16:36.372585)
training loss at step 90: 3.26 (2017-03-26 23:18:44.086892)
training loss at step 100: 3.62 (2017-03-26 23:20:36.604236)
training loss at step 110: 3.44 (2017-03-26 23:22:33.296071)
training loss at step 120: 3.13 (2017-03-26 23:24:29.657611)
training loss at step 130: 3.12 (2017-03-26 23:26:26.798654)
training loss at step 140: 3.05 (2017-03-26 23:28:30.723090)
training loss at step 150: 3.36 (2017-03-26 23:30:28.047430)
training loss at step 160: 3.21 (2017-03-26 23:32:26.001032)
training loss at step 170: 3.21 (2017-03-26 23:34:20.996725)
training loss at step 180: 3.01 (2017-03-26 23:36:10.306886)
training loss at step 190: 2.97 (2017-03-26 23:38:01.789716)
training loss at step 200: 3.04 (2017-03-26 23:39:53.835658)
training loss at step 210: 2.88 (2017-03-26 23:41:46.099598)
training loss at step 220: 3.06 (2017-03-26 23:43:37.745141)
training loss at step 230: 3.06 (2017-03-26 23:45:25.982317)
training loss at step 240: 2.90 (2017-03-26 23:47:19.015574)
training loss at step 250: 2.91 (2017-03-26 23:49:20.510422)
training loss at step 260: 3.01 (2017-03-26 23:51:21.075479)
training loss at step 270: 2.92 (2017-03-26 23:53:27.881684)
training loss at step 280: 2.91 (2017-03-26 23:55:28.596910)
training loss at step 290: 3.07 (2017-03-26 23:57:26.706560)
training loss at step 300: 3.06 (2017-03-26 23:59:23.123927)
training loss at step 310: 2.97 (2017-03-27 00:01:26.992637)
training loss at step 320: 3.08 (2017-03-27 00:03:36.334042)
training loss at step 330: 3.50 (2017-03-27 00:05:39.651977)
training loss at step 340: 3.07 (2017-03-27 00:07:46.852188)
training loss at step 350: 3.23 (2017-03-27 00:09:52.434154)
training loss at step 360: 3.27 (2017-03-27 00:12:00.449231)
training loss at step 370: 3.04 (2017-03-27 00:14:03.019577)
training loss at step 380: 3.03 (2017-03-27 00:16:13.271809)
training loss at step 390: 3.19 (2017-03-27 00:18:16.095882)
training loss at step 400: 3.01 (2017-03-27 00:20:20.200987)
training loss at step 410: 2.88 (2017-03-27 00:22:27.795162)
training loss at step 420: 2.99 (2017-03-27 00:24:33.276613)
training loss at step 430: 3.23 (2017-03-27 00:26:32.483821)
training loss at step 440: 3.13 (2017-03-27 00:28:36.130196)
training loss at step 450: 3.06 (2017-03-27 00:30:53.843883)
training loss at step 460: 2.99 (2017-03-27 00:33:22.773876)
training loss at step 470: 3.13 (2017-03-27 00:35:49.824295)
training loss at step 480: 3.20 (2017-03-27 00:38:00.928156)
training loss at step 490: 3.01 (2017-03-27 00:40:04.123722)
training loss at step 500: 3.05 (2017-03-27 00:42:26.506314)
training loss at step 510: 2.88 (2017-03-27 00:45:03.085606)
training loss at step 520: 3.02 (2017-03-27 00:47:04.002962)
training loss at step 530: 2.95 (2017-03-27 00:48:54.206379)
training loss at step 540: 2.95 (2017-03-27 00:50:42.648044)
training loss at step 550: 3.11 (2017-03-27 00:52:33.029794)
training loss at step 560: 2.92 (2017-03-27 00:54:21.791243)
training loss at step 570: 3.01 (2017-03-27 00:56:13.485987)
training loss at step 580: 2.98 (2017-03-27 00:58:02.104934)
training loss at step 590: 3.00 (2017-03-27 00:59:50.548175)
training loss at step 600: 3.45 (2017-03-27 01:01:38.170249)
training loss at step 610: 2.83 (2017-03-27 01:03:25.571513)
training loss at step 620: 2.90 (2017-03-27 01:05:12.741223)
training loss at step 630: 2.89 (2017-03-27 01:06:59.345041)
training loss at step 640: 3.06 (2017-03-27 01:08:46.314051)
training loss at step 650: 2.95 (2017-03-27 01:10:33.413516)
training loss at step 660: 2.76 (2017-03-27 01:12:19.992900)
training loss at step 670: 2.95 (2017-03-27 01:14:06.439530)
training loss at step 680: 2.90 (2017-03-27 01:15:53.167096)
training loss at step 690: 2.83 (2017-03-27 01:17:42.414286)
training loss at step 700: 2.98 (2017-03-27 01:19:30.182910)
training loss at step 710: 2.86 (2017-03-27 01:21:17.475843)
training loss at step 720: 2.76 (2017-03-27 01:23:04.350826)
training loss at step 730: 3.13 (2017-03-27 01:24:51.092899)
training loss at step 740: 3.22 (2017-03-27 01:26:38.086311)
training loss at step 750: 3.06 (2017-03-27 01:28:24.762899)
training loss at step 760: 3.03 (2017-03-27 01:30:11.445001)
training loss at step 770: 2.81 (2017-03-27 01:31:58.343038)
training loss at step 780: 2.88 (2017-03-27 01:33:45.298148)
training loss at step 790: 2.82 (2017-03-27 01:35:32.082900)
training loss at step 800: 2.90 (2017-03-27 01:37:18.907542)
training loss at step 810: 2.96 (2017-03-27 01:39:06.341501)
training loss at step 820: 2.79 (2017-03-27 01:40:52.784552)
training loss at step 830: 3.06 (2017-03-27 01:42:41.035431)
training loss at step 840: 2.80 (2017-03-27 01:44:28.809030)
training loss at step 850: 2.83 (2017-03-27 01:46:15.716904)
training loss at step 860: 2.87 (2017-03-27 01:48:03.954686)
training loss at step 870: 2.96 (2017-03-27 01:49:54.586654)
training loss at step 880: 2.85 (2017-03-27 01:51:43.613311)
training loss at step 890: 2.83 (2017-03-27 01:53:31.558020)
training loss at step 900: 2.78 (2017-03-27 01:55:18.242951)
training loss at step 910: 3.03 (2017-03-27 01:57:05.488109)
training loss at step 920: 3.15 (2017-03-27 01:58:52.159931)
training loss at step 930: 2.82 (2017-03-27 02:00:38.325383)
training loss at step 940: 2.79 (2017-03-27 02:02:24.913548)
training loss at step 950: 2.93 (2017-03-27 02:04:11.071810)
training loss at step 960: 2.86 (2017-03-27 02:05:58.866114)
training loss at step 970: 2.85 (2017-03-27 02:07:45.530941)
training loss at step 980: 2.82 (2017-03-27 02:09:32.896429)
training loss at step 990: 2.80 (2017-03-27 02:11:19.108711)
training loss at step 1000: 2.71 (2017-03-27 02:13:05.530056)
training loss at step 1010: 2.79 (2017-03-27 02:14:51.993555)
training loss at step 1020: 2.92 (2017-03-27 02:16:39.041466)
training loss at step 1030: 2.78 (2017-03-27 02:18:25.401441)
training loss at step 1040: 3.02 (2017-03-27 02:20:12.197695)
training loss at step 1050: 2.87 (2017-03-27 02:21:58.880277)
training loss at step 1060: 3.10 (2017-03-27 02:23:46.170606)
training loss at step 1070: 2.98 (2017-03-27 02:25:32.686839)
training loss at step 1080: 3.03 (2017-03-27 02:27:18.981336)
training loss at step 1090: 3.02 (2017-03-27 02:29:05.560732)
training loss at step 1100: 2.92 (2017-03-27 02:30:51.363267)
training loss at step 1110: 2.91 (2017-03-27 02:32:37.348821)
training loss at step 1120: 3.05 (2017-03-27 02:34:22.626845)
training loss at step 1130: 2.87 (2017-03-27 02:36:08.003938)
training loss at step 1140: 2.80 (2017-03-27 02:37:53.340010)
training loss at step 1150: 2.77 (2017-03-27 02:39:38.333332)
training loss at step 1160: 2.58 (2017-03-27 02:41:23.864214)
training loss at step 1170: 2.86 (2017-03-27 02:43:08.919280)
training loss at step 1180: 2.84 (2017-03-27 02:44:54.456078)
training loss at step 1190: 3.09 (2017-03-27 02:46:40.055097)
training loss at step 1200: 2.92 (2017-03-27 02:48:25.049377)
training loss at step 1210: 2.95 (2017-03-27 02:50:10.179861)
training loss at step 1220: 2.80 (2017-03-27 02:51:55.522796)
training loss at step 1230: 2.65 (2017-03-27 02:53:40.871631)
training loss at step 1240: 2.79 (2017-03-27 02:55:26.284904)
training loss at step 1250: 2.67 (2017-03-27 02:57:11.993991)
training loss at step 1260: 2.73 (2017-03-27 02:58:56.994082)
training loss at step 1270: 2.75 (2017-03-27 03:00:42.139375)
training loss at step 1280: 2.70 (2017-03-27 03:02:27.320370)
training loss at step 1290: 2.85 (2017-03-27 03:04:12.246664)
training loss at step 1300: 3.02 (2017-03-27 03:06:00.006665)
training loss at step 1310: 3.24 (2017-03-27 03:07:45.189443)
training loss at step 1320: 2.83 (2017-03-27 03:09:30.352980)
training loss at step 1330: 2.67 (2017-03-27 03:11:15.522992)
training loss at step 1340: 2.95 (2017-03-27 03:13:01.283899)
training loss at step 1350: 2.80 (2017-03-27 03:14:46.761936)
training loss at step 1360: 2.79 (2017-03-27 03:16:32.130298)
training loss at step 1370: 2.63 (2017-03-27 03:18:16.726364)
training loss at step 1380: 2.73 (2017-03-27 03:20:01.733026)
training loss at step 1390: 2.71 (2017-03-27 03:21:53.605505)
training loss at step 1400: 2.66 (2017-03-27 03:23:44.364496)
training loss at step 1410: 2.63 (2017-03-27 03:25:36.525591)
training loss at step 1420: 2.91 (2017-03-27 03:27:30.102984)
training loss at step 1430: 2.68 (2017-03-27 03:29:15.059918)
training loss at step 1440: 2.65 (2017-03-27 03:31:00.009383)
training loss at step 1450: 2.70 (2017-03-27 03:32:45.170261)
training loss at step 1460: 2.70 (2017-03-27 03:34:30.730817)
training loss at step 1470: 2.56 (2017-03-27 03:36:16.133329)
training loss at step 1480: 2.85 (2017-03-27 03:38:01.674267)
training loss at step 1490: 2.78 (2017-03-27 03:39:46.697212)
training loss at step 1500: 2.69 (2017-03-27 03:41:31.851806)
training loss at step 1510: 2.64 (2017-03-27 03:43:17.192647)
training loss at step 1520: 2.77 (2017-03-27 03:45:02.437226)
training loss at step 1530: 2.66 (2017-03-27 03:46:47.882874)
training loss at step 1540: 2.70 (2017-03-27 03:48:32.899915)
training loss at step 1550: 2.75 (2017-03-27 03:50:17.942379)
training loss at step 1560: 2.69 (2017-03-27 03:52:03.178622)
training loss at step 1570: 2.60 (2017-03-27 03:53:48.771488)
training loss at step 1580: 2.83 (2017-03-27 03:55:33.886529)
training loss at step 1590: 2.86 (2017-03-27 03:57:18.812834)
training loss at step 1600: 2.77 (2017-03-27 03:59:04.025535)
training loss at step 1610: 2.87 (2017-03-27 04:00:49.716604)
training loss at step 1620: 2.70 (2017-03-27 04:02:34.750375)
training loss at step 1630: 2.79 (2017-03-27 04:04:19.788436)
training loss at step 1640: 2.74 (2017-03-27 04:06:05.861172)
training loss at step 1650: 2.67 (2017-03-27 04:07:50.998522)
training loss at step 1660: 2.79 (2017-03-27 04:09:36.035754)
training loss at step 1670: 2.61 (2017-03-27 04:11:21.390136)
training loss at step 1680: 2.68 (2017-03-27 04:13:06.544606)
training loss at step 1690: 2.73 (2017-03-27 04:14:51.352969)
training loss at step 1700: 2.77 (2017-03-27 04:16:36.549700)
training loss at step 1710: 2.70 (2017-03-27 04:18:21.699248)
training loss at step 1720: 2.63 (2017-03-27 04:20:06.821865)
training loss at step 1730: 2.84 (2017-03-27 04:21:54.421391)
training loss at step 1740: 2.77 (2017-03-27 04:23:41.343629)
training loss at step 1750: 2.75 (2017-03-27 04:25:28.460561)
training loss at step 1760: 2.73 (2017-03-27 04:27:15.646891)
training loss at step 1770: 2.63 (2017-03-27 04:29:02.811422)
training loss at step 1780: 2.64 (2017-03-27 04:30:49.613501)
training loss at step 1790: 2.82 (2017-03-27 04:32:36.395927)
training loss at step 1800: 2.67 (2017-03-27 04:34:23.513669)
training loss at step 1810: 2.82 (2017-03-27 04:36:10.250132)
training loss at step 1820: 2.71 (2017-03-27 04:37:56.774013)
training loss at step 1830: 2.83 (2017-03-27 04:39:43.376280)
training loss at step 1840: 2.59 (2017-03-27 04:41:28.958657)
training loss at step 1850: 2.58 (2017-03-27 04:43:14.554911)
training loss at step 1860: 3.30 (2017-03-27 04:44:59.504137)
training loss at step 1870: 2.54 (2017-03-27 04:46:45.098533)
training loss at step 1880: 2.56 (2017-03-27 04:48:30.042369)
training loss at step 1890: 2.60 (2017-03-27 04:50:15.580969)
training loss at step 1900: 2.76 (2017-03-27 04:52:00.638268)
training loss at step 1910: 2.66 (2017-03-27 04:53:46.720994)
training loss at step 1920: 2.46 (2017-03-27 04:55:31.869177)
training loss at step 1930: 2.61 (2017-03-27 04:57:16.856359)
training loss at step 1940: 2.58 (2017-03-27 04:59:01.456410)
training loss at step 1950: 2.59 (2017-03-27 05:00:46.997536)
training loss at step 1960: 2.62 (2017-03-27 05:02:31.941293)
training loss at step 1970: 2.45 (2017-03-27 05:04:16.926911)
training loss at step 1980: 2.49 (2017-03-27 05:06:03.239170)
training loss at step 1990: 2.67 (2017-03-27 05:07:48.347559)
training loss at step 2000: 2.61 (2017-03-27 05:09:33.457811)
training loss at step 2010: 2.75 (2017-03-27 05:11:18.746481)
training loss at step 2020: 2.56 (2017-03-27 05:13:04.418341)
training loss at step 2030: 2.48 (2017-03-27 05:14:49.568304)
training loss at step 2040: 2.56 (2017-03-27 05:16:35.001301)
training loss at step 2050: 2.49 (2017-03-27 05:18:19.901698)
training loss at step 2060: 2.55 (2017-03-27 05:20:05.182235)
training loss at step 2070: 2.58 (2017-03-27 05:21:51.538782)
training loss at step 2080: 2.39 (2017-03-27 05:23:38.040665)
training loss at step 2090: 2.65 (2017-03-27 05:25:23.090852)
training loss at step 2100: 2.42 (2017-03-27 05:27:08.084578)
training loss at step 2110: 2.40 (2017-03-27 05:28:52.993403)
training loss at step 2120: 2.43 (2017-03-27 05:30:38.168957)
training loss at step 2130: 2.59 (2017-03-27 05:32:23.691698)
training loss at step 2140: 2.52 (2017-03-27 05:34:09.179331)
training loss at step 2150: 2.39 (2017-03-27 05:35:54.142848)
training loss at step 2160: 2.44 (2017-03-27 05:37:39.633440)
training loss at step 2170: 2.43 (2017-03-27 05:39:24.367269)
training loss at step 2180: 2.53 (2017-03-27 05:41:09.308931)
training loss at step 2190: 2.54 (2017-03-27 05:42:54.210707)
training loss at step 2200: 2.60 (2017-03-27 05:44:39.774867)
training loss at step 2210: 2.61 (2017-03-27 05:46:24.947697)
training loss at step 2220: 2.57 (2017-03-27 05:48:09.916746)
training loss at step 2230: 2.54 (2017-03-27 05:49:55.408659)
training loss at step 2240: 2.58 (2017-03-27 05:51:40.496452)
training loss at step 2250: 2.47 (2017-03-27 05:53:25.550303)
training loss at step 2260: 2.44 (2017-03-27 05:55:11.092302)
training loss at step 2270: 2.33 (2017-03-27 05:56:56.039033)
training loss at step 2280: 2.50 (2017-03-27 05:58:40.987098)
training loss at step 2290: 2.54 (2017-03-27 06:00:26.144644)
training loss at step 2300: 2.61 (2017-03-27 06:02:11.053743)
training loss at step 2310: 2.52 (2017-03-27 06:03:56.258034)
training loss at step 2320: 2.61 (2017-03-27 06:05:42.309105)
training loss at step 2330: 2.65 (2017-03-27 06:07:27.777254)
training loss at step 2340: 2.67 (2017-03-27 06:09:13.036906)
training loss at step 2350: 2.60 (2017-03-27 06:10:58.097571)
training loss at step 2360: 2.55 (2017-03-27 06:12:43.298921)
training loss at step 2370: 2.54 (2017-03-27 06:14:28.392174)
training loss at step 2380: 2.68 (2017-03-27 06:16:14.781107)
training loss at step 2390: 2.53 (2017-03-27 06:17:59.926041)
training loss at step 2400: 2.43 (2017-03-27 06:19:44.848929)
training loss at step 2410: 2.36 (2017-03-27 06:21:30.037449)
training loss at step 2420: 2.29 (2017-03-27 06:23:15.911915)
training loss at step 2430: 2.45 (2017-03-27 06:25:03.270282)
training loss at step 2440: 2.40 (2017-03-27 06:26:50.339893)
training loss at step 2450: 2.51 (2017-03-27 06:28:37.131742)
training loss at step 2460: 2.50 (2017-03-27 06:30:24.290839)
training loss at step 2470: 2.65 (2017-03-27 06:32:11.116147)
training loss at step 2480: 2.41 (2017-03-27 06:33:57.781538)
training loss at step 2490: 2.27 (2017-03-27 06:35:49.627067)
training loss at step 2500: 2.38 (2017-03-27 06:37:39.048915)
training loss at step 2510: 2.42 (2017-03-27 06:39:24.439704)
training loss at step 2520: 2.31 (2017-03-27 06:41:09.983447)
training loss at step 2530: 2.40 (2017-03-27 06:42:55.431848)
training loss at step 2540: 2.29 (2017-03-27 06:44:41.025557)
training loss at step 2550: 2.51 (2017-03-27 06:46:26.575910)
training loss at step 2560: 2.36 (2017-03-27 06:48:11.883663)
training loss at step 2570: 2.91 (2017-03-27 06:49:56.659032)
training loss at step 2580: 2.56 (2017-03-27 06:51:41.872070)
training loss at step 2590: 2.43 (2017-03-27 06:53:26.823341)
training loss at step 2600: 2.42 (2017-03-27 06:55:12.112964)
training loss at step 2610: 2.35 (2017-03-27 06:56:57.061621)
training loss at step 2620: 2.36 (2017-03-27 06:58:42.496361)
training loss at step 2630: 2.28 (2017-03-27 07:00:27.791020)
training loss at step 2640: 2.28 (2017-03-27 07:02:13.033049)
training loss at step 2650: 2.29 (2017-03-27 07:03:58.205292)
training loss at step 2660: 2.23 (2017-03-27 07:05:44.504804)
training loss at step 2670: 2.25 (2017-03-27 07:07:29.696737)
training loss at step 2680: 2.35 (2017-03-27 07:09:14.897682)
training loss at step 2690: 2.29 (2017-03-27 07:10:59.939896)
training loss at step 2700: 2.25 (2017-03-27 07:12:45.080509)
training loss at step 2710: 2.23 (2017-03-27 07:14:29.888801)
training loss at step 2720: 2.23 (2017-03-27 07:16:15.249125)
training loss at step 2730: 2.17 (2017-03-27 07:18:00.487498)
training loss at step 2740: 2.48 (2017-03-27 07:19:45.936503)
training loss at step 2750: 2.36 (2017-03-27 07:21:31.200992)
training loss at step 2760: 2.35 (2017-03-27 07:23:17.607493)
training loss at step 2770: 2.29 (2017-03-27 07:25:04.919894)
training loss at step 2780: 2.40 (2017-03-27 07:26:50.332494)
training loss at step 2790: 2.43 (2017-03-27 07:28:35.434992)
training loss at step 2800: 2.37 (2017-03-27 07:30:20.222241)
training loss at step 2810: 2.35 (2017-03-27 07:32:05.549582)
training loss at step 2820: 2.31 (2017-03-27 07:33:50.830394)
training loss at step 2830: 2.30 (2017-03-27 07:35:36.637767)
training loss at step 2840: 2.37 (2017-03-27 07:37:22.111679)
training loss at step 2850: 2.38 (2017-03-27 07:39:07.389751)
training loss at step 2860: 2.24 (2017-03-27 07:40:52.203400)
training loss at step 2870: 2.48 (2017-03-27 07:42:37.050773)
training loss at step 2880: 2.25 (2017-03-27 07:44:22.114354)
training loss at step 2890: 2.38 (2017-03-27 07:46:08.379371)
training loss at step 2900: 2.26 (2017-03-27 07:47:53.641059)
training loss at step 2910: 2.21 (2017-03-27 07:49:38.630053)
training loss at step 2920: 2.34 (2017-03-27 07:51:23.669337)
training loss at step 2930: 2.26 (2017-03-27 07:53:08.921424)
training loss at step 2940: 2.31 (2017-03-27 07:54:54.055248)
training loss at step 2950: 2.22 (2017-03-27 07:56:39.170163)
training loss at step 2960: 2.35 (2017-03-27 07:58:24.214048)
training loss at step 2970: 2.23 (2017-03-27 08:00:09.404486)
training loss at step 2980: 2.18 (2017-03-27 08:01:54.571721)
training loss at step 2990: 2.34 (2017-03-27 08:03:39.698702)
training loss at step 3000: 2.27 (2017-03-27 08:05:26.270058)
training loss at step 3010: 2.30 (2017-03-27 08:07:11.792791)
training loss at step 3020: 2.26 (2017-03-27 08:08:57.045746)
training loss at step 3030: 2.28 (2017-03-27 08:10:42.150372)
training loss at step 3040: 2.21 (2017-03-27 08:12:27.167334)
training loss at step 3050: 2.33 (2017-03-27 08:14:12.341152)
training loss at step 3060: 2.24 (2017-03-27 08:15:57.612631)
training loss at step 3070: 2.39 (2017-03-27 08:17:42.923820)
training loss at step 3080: 2.29 (2017-03-27 08:19:28.403838)
training loss at step 3090: 2.33 (2017-03-27 08:21:13.508391)
training loss at step 3100: 2.15 (2017-03-27 08:22:58.581455)
training loss at step 3110: 2.17 (2017-03-27 08:24:46.541041)
training loss at step 3120: 2.47 (2017-03-27 08:26:34.039905)
training loss at step 3130: 2.12 (2017-03-27 08:28:20.617229)
training loss at step 3140: 2.23 (2017-03-27 08:30:07.758393)
training loss at step 3150: 2.23 (2017-03-27 08:31:54.645611)
training loss at step 3160: 2.28 (2017-03-27 08:33:41.208056)
training loss at step 3170: 2.26 (2017-03-27 08:35:27.300702)
training loss at step 3180: 2.14 (2017-03-27 08:37:13.457127)
training loss at step 3190: 2.16 (2017-03-27 08:39:00.236454)
training loss at step 3200: 2.31 (2017-03-27 08:40:45.621466)
training loss at step 3210: 2.28 (2017-03-27 08:42:30.764625)
training loss at step 3220: 2.15 (2017-03-27 08:44:16.057538)
training loss at step 3230: 2.13 (2017-03-27 08:46:01.694118)
training loss at step 3240: 2.20 (2017-03-27 08:47:47.396705)
training loss at step 3250: 2.33 (2017-03-27 08:49:32.654713)
training loss at step 3260: 2.19 (2017-03-27 08:51:17.602587)
training loss at step 3270: 2.29 (2017-03-27 08:53:02.769141)
training loss at step 3280: 2.17 (2017-03-27 08:54:48.285887)
training loss at step 3290: 2.24 (2017-03-27 08:56:33.548040)
training loss at step 3300: 2.25 (2017-03-27 08:58:19.152803)
training loss at step 3310: 2.09 (2017-03-27 09:00:04.171064)
training loss at step 3320: 2.22 (2017-03-27 09:01:49.197783)
training loss at step 3330: 2.14 (2017-03-27 09:03:34.367464)
training loss at step 3340: 2.04 (2017-03-27 09:05:21.124526)
training loss at step 3350: 2.24 (2017-03-27 09:07:07.549578)
training loss at step 3360: 2.09 (2017-03-27 09:08:53.020075)
training loss at step 3370: 2.18 (2017-03-27 09:10:40.617460)
training loss at step 3380: 2.14 (2017-03-27 09:12:25.724419)
training loss at step 3390: 2.21 (2017-03-27 09:14:11.157556)
training loss at step 3400: 2.23 (2017-03-27 09:15:56.910841)
training loss at step 3410: 2.05 (2017-03-27 09:17:42.508829)
training loss at step 3420: 2.14 (2017-03-27 09:19:27.682868)
training loss at step 3430: 2.11 (2017-03-27 09:21:12.900409)
training loss at step 3440: 2.23 (2017-03-27 09:22:57.547769)
training loss at step 3450: 2.18 (2017-03-27 09:24:45.128839)
training loss at step 3460: 2.36 (2017-03-27 09:26:30.426327)
training loss at step 3470: 2.17 (2017-03-27 09:28:15.482721)
training loss at step 3480: 2.20 (2017-03-27 09:30:00.623983)
training loss at step 3490: 2.20 (2017-03-27 09:31:45.995292)
training loss at step 3500: 2.18 (2017-03-27 09:33:31.371807)
training loss at step 3510: 2.20 (2017-03-27 09:35:16.766333)
training loss at step 3520: 2.08 (2017-03-27 09:37:01.828110)
training loss at step 3530: 2.09 (2017-03-27 09:38:47.412195)
training loss at step 3540: 2.13 (2017-03-27 09:40:32.283008)
training loss at step 3550: 2.24 (2017-03-27 09:42:17.578671)
training loss at step 3560: 2.32 (2017-03-27 09:44:02.176948)
training loss at step 3570: 2.13 (2017-03-27 09:45:47.587066)
training loss at step 3580: 2.33 (2017-03-27 09:47:33.140111)
training loss at step 3590: 2.37 (2017-03-27 09:49:18.435213)
training loss at step 3600: 2.44 (2017-03-27 09:51:03.690538)
training loss at step 3610: 2.26 (2017-03-27 09:52:48.956638)
training loss at step 3620: 2.11 (2017-03-27 09:54:33.846491)
training loss at step 3630: 2.17 (2017-03-27 09:56:19.317572)
training loss at step 3640: 2.33 (2017-03-27 09:58:04.941840)
training loss at step 3650: 2.22 (2017-03-27 09:59:50.182936)
training loss at step 3660: 2.08 (2017-03-27 10:01:35.305834)
training loss at step 3670: 2.13 (2017-03-27 10:03:20.230858)
training loss at step 3680: 2.05 (2017-03-27 10:05:06.411305)
training loss at step 3690: 2.28 (2017-03-27 10:06:51.413781)
training loss at step 3700: 2.06 (2017-03-27 10:08:37.186254)
training loss at step 3710: 2.20 (2017-03-27 10:10:22.455720)
training loss at step 3720: 2.21 (2017-03-27 10:12:07.646834)
training loss at step 3730: 2.20 (2017-03-27 10:13:52.603329)
training loss at step 3740: 2.15 (2017-03-27 10:15:37.802880)
training loss at step 3750: 2.05 (2017-03-27 10:17:23.461953)
training loss at step 3760: 2.10 (2017-03-27 10:19:08.248311)
training loss at step 3770: 2.16 (2017-03-27 10:20:54.288569)
training loss at step 3780: 2.11 (2017-03-27 10:22:39.970617)
training loss at step 3790: 2.18 (2017-03-27 10:24:26.618720)
training loss at step 3800: 2.10 (2017-03-27 10:26:15.282614)
training loss at step 3810: 2.19 (2017-03-27 10:28:02.527574)
training loss at step 3820: 2.19 (2017-03-27 10:29:49.525751)
training loss at step 3830: 2.38 (2017-03-27 10:31:36.367193)
training loss at step 3840: 2.24 (2017-03-27 10:33:22.335968)
training loss at step 3850: 2.22 (2017-03-27 10:35:08.892455)
training loss at step 3860: 2.24 (2017-03-27 10:36:54.417080)
training loss at step 3870: 2.15 (2017-03-27 10:38:39.514858)
training loss at step 3880: 2.10 (2017-03-27 10:40:24.887161)
training loss at step 3890: 2.08 (2017-03-27 10:42:10.320653)
training loss at step 3900: 2.06 (2017-03-27 10:43:55.543423)
training loss at step 3910: 2.03 (2017-03-27 10:45:40.677869)
training loss at step 3920: 2.04 (2017-03-27 10:47:26.042272)
training loss at step 3930: 2.00 (2017-03-27 10:49:11.179832)
training loss at step 3940: 2.12 (2017-03-27 10:51:00.106776)
training loss at step 3950: 2.10 (2017-03-27 10:52:45.458647)
training loss at step 3960: 2.00 (2017-03-27 10:54:30.736257)
training loss at step 3970: 2.00 (2017-03-27 10:56:16.463711)
training loss at step 3980: 2.01 (2017-03-27 10:58:01.674769)
training loss at step 3990: 1.97 (2017-03-27 10:59:46.869506)
training loss at step 4000: 2.26 (2017-03-27 11:01:31.816457)
training loss at step 4010: 2.19 (2017-03-27 11:03:16.596131)
training loss at step 4020: 2.15 (2017-03-27 11:05:02.665613)
training loss at step 4030: 2.11 (2017-03-27 11:06:48.388305)
training loss at step 4040: 2.24 (2017-03-27 11:08:33.182197)
training loss at step 4050: 2.31 (2017-03-27 11:10:18.014248)
training loss at step 4060: 2.13 (2017-03-27 11:12:03.519591)
training loss at step 4070: 2.09 (2017-03-27 11:13:48.565451)
training loss at step 4080: 2.11 (2017-03-27 11:15:34.169408)
training loss at step 4090: 2.54 (2017-03-27 11:17:19.258562)
training loss at step 4100: 2.22 (2017-03-27 11:19:04.509346)
training loss at step 4110: 2.20 (2017-03-27 11:20:49.790546)
training loss at step 4120: 2.11 (2017-03-27 11:22:35.504419)
training loss at step 4130: 2.24 (2017-03-27 11:24:22.910741)
training loss at step 4140: 2.07 (2017-03-27 11:26:08.106232)
training loss at step 4150: 2.20 (2017-03-27 11:27:53.046589)
training loss at step 4160: 2.20 (2017-03-27 11:29:38.362214)
training loss at step 4170: 2.10 (2017-03-27 11:31:23.081657)
training loss at step 4180: 2.19 (2017-03-27 11:33:08.398461)
training loss at step 4190: 2.09 (2017-03-27 11:34:54.027547)
training loss at step 4200: 2.15 (2017-03-27 11:36:39.113994)
training loss at step 4210: 2.17 (2017-03-27 11:38:24.214298)
training loss at step 4220: 2.17 (2017-03-27 11:40:09.538424)
training loss at step 4230: 2.05 (2017-03-27 11:41:54.631740)
training loss at step 4240: 2.06 (2017-03-27 11:43:39.956889)
training loss at step 4250: 2.16 (2017-03-27 11:45:25.418726)
training loss at step 4260: 2.07 (2017-03-27 11:47:10.997757)
training loss at step 4270: 2.06 (2017-03-27 11:48:56.165153)
training loss at step 4280: 2.26 (2017-03-27 11:50:41.215484)
training loss at step 4290: 2.17 (2017-03-27 11:52:26.650731)
training loss at step 4300: 2.08 (2017-03-27 11:54:11.654566)
training loss at step 4310: 2.19 (2017-03-27 11:55:57.195618)
training loss at step 4320: 2.09 (2017-03-27 11:57:42.721482)
training loss at step 4330: 2.31 (2017-03-27 11:59:27.980519)
training loss at step 4340: 2.06 (2017-03-27 12:01:13.578542)
training loss at step 4350: 2.28 (2017-03-27 12:02:58.896090)
training loss at step 4360: 1.99 (2017-03-27 12:05:25.133590)
training loss at step 4370: 2.01 (2017-03-27 12:08:23.774490)
training loss at step 4380: 2.33 (2017-03-27 12:11:35.147815)
training loss at step 4390: 2.03 (2017-03-27 12:14:45.214811)
training loss at step 4400: 2.10 (2017-03-27 12:17:57.810380)
training loss at step 4410: 2.18 (2017-03-27 12:20:58.473693)
training loss at step 4420: 2.08 (2017-03-27 12:23:49.905336)
training loss at step 4430: 2.09 (2017-03-27 12:26:47.299410)
training loss at step 4440: 1.95 (2017-03-27 12:29:24.690263)
training loss at step 4450: 1.98 (2017-03-27 12:32:05.751392)
training loss at step 4460: 2.20 (2017-03-27 12:34:48.201543)
training loss at step 4470: 2.19 (2017-03-27 12:37:30.151241)
training loss at step 4480: 1.97 (2017-03-27 12:40:01.248816)
training loss at step 4490: 2.02 (2017-03-27 12:42:06.503412)
training loss at step 4500: 2.01 (2017-03-27 12:44:11.921621)
training loss at step 4510: 2.20 (2017-03-27 12:46:35.464603)
training loss at step 4520: 2.06 (2017-03-27 12:49:06.228306)
training loss at step 4530: 2.10 (2017-03-27 12:51:30.733010)
training loss at step 4540: 2.00 (2017-03-27 12:54:44.314862)
training loss at step 4550: 2.10 (2017-03-27 12:57:04.904041)
training loss at step 4560: 2.14 (2017-03-27 12:59:03.701119)
training loss at step 4570: 1.97 (2017-03-27 13:01:11.485017)
training loss at step 4580: 2.06 (2017-03-27 13:03:00.204492)
training loss at step 4590: 2.03 (2017-03-27 13:04:49.234072)
training loss at step 4600: 1.92 (2017-03-27 13:06:38.759460)
training loss at step 4610: 2.09 (2017-03-27 13:08:25.187549)
training loss at step 4620: 2.00 (2017-03-27 13:10:11.974429)
training loss at step 4630: 2.09 (2017-03-27 13:11:58.647692)
training loss at step 4640: 2.05 (2017-03-27 13:13:45.250907)
training loss at step 4650: 2.06 (2017-03-27 13:15:31.635094)
training loss at step 4660: 2.13 (2017-03-27 13:17:17.852789)
training loss at step 4670: 1.93 (2017-03-27 13:19:03.102371)
training loss at step 4680: 2.04 (2017-03-27 13:20:48.594200)
training loss at step 4690: 2.03 (2017-03-27 13:22:33.904645)
training loss at step 4700: 2.06 (2017-03-27 13:24:19.310604)
training loss at step 4710: 2.01 (2017-03-27 13:26:07.590058)
training loss at step 4720: 2.29 (2017-03-27 13:28:03.363723)
training loss at step 4730: 2.13 (2017-03-27 13:30:10.293571)
training loss at step 4740: 2.02 (2017-03-27 13:32:10.265692)
training loss at step 4750: 2.03 (2017-03-27 13:33:56.618436)
training loss at step 4760: 2.14 (2017-03-27 13:35:51.511367)
training loss at step 4770: 2.09 (2017-03-27 13:37:38.104756)
training loss at step 4780: 2.03 (2017-03-27 13:39:24.682212)
training loss at step 4790: 1.96 (2017-03-27 13:41:11.237103)
training loss at step 4800: 1.97 (2017-03-27 13:43:15.960240)
training loss at step 4810: 2.10 (2017-03-27 13:45:23.059376)
training loss at step 4820: 2.11 (2017-03-27 13:47:14.916653)
training loss at step 4830: 2.08 (2017-03-27 13:49:13.684312)
training loss at step 4840: 2.21 (2017-03-27 13:51:17.146492)
training loss at step 4850: 2.26 (2017-03-27 13:53:07.245796)
training loss at step 4860: 2.36 (2017-03-27 13:54:54.665370)
training loss at step 4870: 2.33 (2017-03-27 13:56:45.285139)
training loss at step 4880: 2.04 (2017-03-27 13:58:35.581912)
training loss at step 4890: 2.04 (2017-03-27 14:00:23.621491)
training loss at step 4900: 2.13 (2017-03-27 14:02:12.164111)
training loss at step 4910: 2.11 (2017-03-27 14:04:00.134601)
training loss at step 4920: 1.99 (2017-03-27 14:05:49.554570)
training loss at step 4930: 2.02 (2017-03-27 14:07:37.690357)
training loss at step 4940: 1.99 (2017-03-27 14:09:24.045897)
training loss at step 4950: 2.18 (2017-03-27 14:11:09.704199)
training loss at step 4960: 1.96 (2017-03-27 14:12:55.346532)
training loss at step 4970: 2.07 (2017-03-27 14:14:41.382687)
training loss at step 4980: 1.98 (2017-03-27 14:16:27.641697)
training loss at step 4990: 1.98 (2017-03-27 14:18:17.231211)
training loss at step 5000: 2.03 (2017-03-27 14:20:29.411199)
training loss at step 5010: 1.94 (2017-03-27 14:22:37.803121)
training loss at step 5020: 2.02 (2017-03-27 14:24:40.341256)
training loss at step 5030: 2.14 (2017-03-27 14:26:49.286476)
training loss at step 5040: 1.95 (2017-03-27 14:28:56.781627)
training loss at step 5050: 2.02 (2017-03-27 14:31:08.229625)
training loss at step 5060: 2.18 (2017-03-27 14:33:11.257771)
training loss at step 5070: 2.09 (2017-03-27 14:35:16.748978)
training loss at step 5080: 2.12 (2017-03-27 14:37:29.138117)
training loss at step 5090: 2.10 (2017-03-27 14:39:46.721157)
training loss at step 5100: 2.07 (2017-03-27 14:41:47.592741)
training loss at step 5110: 2.08 (2017-03-27 14:43:49.479642)
training loss at step 5120: 2.31 (2017-03-27 14:45:52.115836)
training loss at step 5130: 2.00 (2017-03-27 14:47:59.598074)
training loss at step 5140: 2.04 (2017-03-27 14:50:07.713513)
training loss at step 5150: 1.96 (2017-03-27 14:52:03.448839)
training loss at step 5160: 1.95 (2017-03-27 14:54:07.163130)
training loss at step 5170: 1.95 (2017-03-27 14:56:08.287019)
training loss at step 5180: 1.93 (2017-03-27 14:57:55.147623)
training loss at step 5190: 1.85 (2017-03-27 14:59:42.060235)
training loss at step 5200: 2.08 (2017-03-27 15:01:28.375356)
training loss at step 5210: 2.00 (2017-03-27 15:03:13.437790)
training loss at step 5220: 1.97 (2017-03-27 15:04:59.893012)
training loss at step 5230: 1.97 (2017-03-27 15:06:44.702324)
training loss at step 5240: 1.93 (2017-03-27 15:08:30.019801)
training loss at step 5250: 1.92 (2017-03-27 15:10:14.904533)
training loss at step 5260: 2.06 (2017-03-27 15:11:59.940010)
training loss at step 5270: 2.05 (2017-03-27 15:13:44.598040)
training loss at step 5280: 2.14 (2017-03-27 15:15:29.518200)
training loss at step 5290: 2.02 (2017-03-27 15:17:14.956296)
training loss at step 5300: 2.08 (2017-03-27 15:19:00.477024)
training loss at step 5310: 2.27 (2017-03-27 15:20:45.595989)
training loss at step 5320: 2.03 (2017-03-27 15:22:30.780764)
training loss at step 5330: 1.90 (2017-03-27 15:24:16.241038)
training loss at step 5340: 1.93 (2017-03-27 15:26:01.418975)
training loss at step 5350: 2.11 (2017-03-27 15:27:46.379567)
training loss at step 5360: 2.09 (2017-03-27 15:29:31.422373)
training loss at step 5370: 2.05 (2017-03-27 15:31:19.323426)
training loss at step 5380: 1.97 (2017-03-27 15:33:06.404418)
training loss at step 5390: 2.07 (2017-03-27 15:34:54.113870)
training loss at step 5400: 1.96 (2017-03-27 15:36:42.324265)
training loss at step 5410: 2.09 (2017-03-27 15:38:29.939726)
training loss at step 5420: 2.15 (2017-03-27 15:40:17.299752)
training loss at step 5430: 1.99 (2017-03-27 15:42:04.060330)
training loss at step 5440: 2.05 (2017-03-27 15:43:51.248256)
training loss at step 5450: 1.95 (2017-03-27 15:45:39.058765)
training loss at step 5460: 2.01 (2017-03-27 15:47:26.397603)
training loss at step 5470: 2.16 (2017-03-27 15:49:14.607329)
training loss at step 5480: 2.04 (2017-03-27 15:51:01.447558)
training loss at step 5490: 1.99 (2017-03-27 15:52:48.674558)
training loss at step 5500: 1.97 (2017-03-27 15:54:35.226989)
training loss at step 5510: 1.97 (2017-03-27 15:56:21.727565)
training loss at step 5520: 2.00 (2017-03-27 15:58:07.578644)
training loss at step 5530: 1.87 (2017-03-27 15:59:52.829663)
training loss at step 5540: 1.89 (2017-03-27 16:01:38.065639)
training loss at step 5550: 2.13 (2017-03-27 16:03:23.044886)
training loss at step 5560: 2.11 (2017-03-27 16:05:09.484197)
training loss at step 5570: 2.17 (2017-03-27 16:06:55.860550)
training loss at step 5580: 1.93 (2017-03-27 16:08:41.304794)
training loss at step 5590: 2.34 (2017-03-27 16:10:26.794760)
training loss at step 5600: 1.90 (2017-03-27 16:12:11.873252)
training loss at step 5610: 2.14 (2017-03-27 16:13:57.640949)
training loss at step 5620: 1.89 (2017-03-27 16:15:43.005016)
training loss at step 5630: 1.94 (2017-03-27 16:17:28.561470)
training loss at step 5640: 2.26 (2017-03-27 16:19:13.573122)
training loss at step 5650: 2.00 (2017-03-27 16:20:59.344166)
training loss at step 5660: 2.05 (2017-03-27 16:22:50.152280)
training loss at step 5670: 2.08 (2017-03-27 16:24:37.513419)
training loss at step 5680: 1.97 (2017-03-27 16:26:25.330358)
training loss at step 5690: 1.90 (2017-03-27 16:28:11.471012)
training loss at step 5700: 1.93 (2017-03-27 16:29:57.451421)
training loss at step 5710: 1.88 (2017-03-27 16:32:06.370765)
training loss at step 5720: 2.00 (2017-03-27 16:34:03.442640)
training loss at step 5730: 2.04 (2017-03-27 16:35:52.934717)
training loss at step 5740: 1.94 (2017-03-27 16:38:01.261960)
training loss at step 5750: 1.93 (2017-03-27 16:40:11.605484)
training loss at step 5760: 1.85 (2017-03-27 16:42:15.211027)
training loss at step 5770: 2.09 (2017-03-27 16:44:04.965322)
training loss at step 5780: 1.98 (2017-03-27 16:45:59.845445)
training loss at step 5790: 1.95 (2017-03-27 16:47:55.001740)
training loss at step 5800: 1.94 (2017-03-27 16:49:50.814293)
training loss at step 5810: 2.02 (2017-03-27 16:51:49.834123)
training loss at step 5820: 1.97 (2017-03-27 16:53:48.328778)
training loss at step 5830: 1.94 (2017-03-27 16:55:46.953633)
training loss at step 5840: 2.04 (2017-03-27 16:57:53.326818)
training loss at step 5850: 1.99 (2017-03-27 16:59:56.467006)
training loss at step 5860: 1.85 (2017-03-27 17:01:58.318487)
training loss at step 5870: 1.96 (2017-03-27 17:04:04.825371)
training loss at step 5880: 1.98 (2017-03-27 17:06:10.433862)
training loss at step 5890: 2.06 (2017-03-27 17:08:19.913533)
training loss at step 5900: 2.00 (2017-03-27 17:10:33.154769)
training loss at step 5910: 1.99 (2017-03-27 17:12:40.509283)
training loss at step 5920: 2.02 (2017-03-27 17:15:14.978210)
training loss at step 5930: 1.84 (2017-03-27 17:18:56.444335)
training loss at step 5940: 2.00 (2017-03-27 17:21:48.741976)
training loss at step 5950: 1.85 (2017-03-27 17:25:08.284339)
training loss at step 5960: 1.92 (2017-03-27 17:28:45.988623)
training loss at step 5970: 1.89 (2017-03-27 17:32:12.765588)
training loss at step 5980: 2.11 (2017-03-27 17:35:34.217410)
training loss at step 5990: 2.13 (2017-03-27 17:39:24.253450)
training loss at step 6000: 1.84 (2017-03-27 17:41:57.142346)
training loss at step 6010: 1.90 (2017-03-27 17:44:03.667259)
training loss at step 6020: 2.03 (2017-03-27 17:46:02.057838)
training loss at step 6030: 1.99 (2017-03-27 17:48:03.314431)
training loss at step 6040: 1.97 (2017-03-27 17:50:10.623234)
training loss at step 6050: 1.91 (2017-03-27 17:52:21.994529)
training loss at step 6060: 1.94 (2017-03-27 17:54:31.553620)
training loss at step 6070: 2.01 (2017-03-27 17:56:33.827108)
training loss at step 6080: 1.91 (2017-03-27 17:58:42.459116)
training loss at step 6090: 2.01 (2017-03-27 18:00:55.375479)
training loss at step 6100: 2.13 (2017-03-27 18:03:08.582938)
training loss at step 6110: 2.16 (2017-03-27 18:05:20.783921)
training loss at step 6120: 2.19 (2017-03-27 18:07:28.850933)
training loss at step 6130: 2.24 (2017-03-27 18:09:32.863539)
training loss at step 6140: 1.96 (2017-03-27 18:11:40.044111)
training loss at step 6150: 1.99 (2017-03-27 18:13:37.876081)
training loss at step 6160: 2.08 (2017-03-27 18:15:40.921017)
training loss at step 6170: 1.99 (2017-03-27 18:17:34.011826)
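
Across the full run, the timestamps work out to roughly 11 seconds per training step on this CPU (steps 0 through 6170 in about 19 hours 17 minutes), with the reported loss falling from 5.18 to around 2.0. The following Python sketch recovers those figures from the log; the regex simply mirrors the line format above, and the training.log filename is a stand-in for wherever the log is saved.

# Sketch: parse the training log above to recover throughput and loss trend.
# Assumes lines of the form:
#   training loss at step N: X.XX (YYYY-MM-DD HH:MM:SS.ffffff)
# The filename "training.log" is hypothetical.
import re
from datetime import datetime

PATTERN = re.compile(r"training loss at step (\d+): ([\d.]+) \(([^)]+)\)")

def parse(lines):
    """Yield (step, loss, timestamp) from lines matching the log format."""
    for line in lines:
        m = PATTERN.search(line)
        if m:
            yield (int(m.group(1)), float(m.group(2)),
                   datetime.strptime(m.group(3), "%Y-%m-%d %H:%M:%S.%f"))

def summarize(records):
    records = list(records)
    (s0, l0, t0), (s1, l1, t1) = records[0], records[-1]
    steps = s1 - s0
    secs = (t1 - t0).total_seconds()
    print(f"{steps} steps in {secs / 3600:.1f} h -> {secs / steps:.1f} s/step")
    print(f"loss {l0:.2f} -> {l1:.2f}")

with open("training.log") as f:
    summarize(parse(f))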