Trainer

This example shows how to use the Trainer class to train a potential.

import logging

import ase.io
from ase.visualize.plot import plot_atoms

from motep.io.mlip.mtp import read_mtp, write_mtp
from motep.train import Trainer

If you want log output during training, you can configure logging as follows.

logging.basicConfig(level=logging.INFO, format="%(message)s")

We first load the atomic configurations used for training.

images = ase.io.read("../0.0_train/ase.xyz", index=":")
ax = plot_atoms(images[0])
(figure: the first atomic configuration, rendered with plot_atoms)

We next load the initial (likely untrained) potential and set its species to the atomic numbers of the elements in the training data (here 6 for carbon and 1 for hydrogen).

mtp_data = read_mtp("initial.mtp")
mtp_data.species = [6, 1]
mtp_data
MTPData(version='1.1.0', potential_name='MTP1m', scaling=1.0, species_count=2, potential_tag='', radial_basis_type='RBChebyshev', min_dist=np.float64(0.5), max_dist=np.float64(5.0), radial_funcs_count=np.int32(1), radial_basis_size=np.int32(8), radial_coeffs=None, alpha_moments_count=np.int32(1), alpha_index_basic_count=np.int32(1), alpha_index_basic=array([[0, 0, 0, 0]], dtype=int32), alpha_index_times_count=np.int32(0), alpha_index_times=array([], shape=(0, 4), dtype=int32), alpha_scalar_moments=np.int32(1), alpha_moment_mapping=array([0], dtype=int32), species_coeffs=None, moment_coeffs=None, _species=array([6, 1], dtype=int32), optimized=['species_coeffs', 'moment_coeffs', 'radial_coeffs'])

We then create a Trainer instance and train the potential on the images.

trainer = Trainer(mtp_data, seed=42)
loss = trainer.train(images)
[random seed] = 42
========================================================================

{'method': 'minimize'}

loss 0: 1011.5020145280739
loss 1: 23.96611618212598
loss 2: 16.08095030044797
loss 3: 6.3119413710240755
loss 4: 3.1615359066847732
loss 5: 0.3927903836126457
loss 6: 0.12596640778521503
...
loss 420: 0.0004728224446224718
loss 421: 0.0004728207228815544

Optimization result:
  Message: CONVERGENCE: RELATIVE REDUCTION OF F <= FACTR*EPSMCH
  Success: True
  Status code: 0
  Number of function evaluations: 488
  Number of iterations: 421

Energy (eV):
    Errors checked for 101 configurations
    MAX error: 0.0523475965280511
    ABS error: 0.010964749299322668
    RMS error: 0.014379275635013062

Energy per atom (eV/atom):
    Errors checked for 101 configurations
    MAX error: 0.006543449566006387
    ABS error: 0.0013705936624153336
    RMS error: 0.0017974094543766327

Forces per component (eV/angstrom):
    Errors checked for 808 atoms
    MAX error: 0.65951447358816
    ABS error: 0.09411372756180986
    RMS error: 0.12511195548186327

Stress per component (GPa):
    Errors checked for 0 configurations
    MAX error: nan
    ABS error: nan
    RMS error: nan

Time (step 0: minimize): 20.262692146999996 (s)
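The MAX, ABS, and RMS values in the report above follow the usual error definitions, with ABS presumably denoting the mean absolute error. A minimal sketch of these metrics (the arrays below are made-up numbers, not values from this run):

```python
import numpy as np

# Hypothetical predicted vs. reference values (not from this training run).
predicted = np.array([1.02, 1.98, 3.05])
reference = np.array([1.00, 2.00, 3.00])

errors = predicted - reference
max_error = np.max(np.abs(errors))       # MAX error: largest absolute deviation
abs_error = np.mean(np.abs(errors))      # ABS error: mean absolute deviation
rms_error = np.sqrt(np.mean(errors**2))  # RMS error: root-mean-square deviation

print(max_error, abs_error, rms_error)
```

The stress section reports nan because no configurations in this dataset carry stress data (0 configurations checked).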

We can inspect the final value of the loss function.

loss(mtp_data.parameters)
np.float64(0.0004728207228815544)
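The per-iteration loss log and the convergence message printed earlier are characteristic of SciPy's L-BFGS-B minimizer, so the `{'method': 'minimize'}` stage presumably wraps `scipy.optimize.minimize` over the potential parameters (an assumption from the output, not a statement about motep's internals). A toy analogue on a simple least-squares problem that produces a similar decreasing loss trace:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)
X = rng.normal(size=(50, 4))                 # toy "descriptors"
theta_true = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ theta_true                           # toy "reference values"

def loss(theta):
    # Mean-squared residual, standing in for the trainer's loss function.
    residual = X @ theta - y
    return float(np.mean(residual**2))

history = []
def callback(theta):
    # Print one "loss i: ..." line per accepted iteration, as in the log above.
    history.append(loss(theta))
    print(f"loss {len(history) - 1}: {history[-1]}")

res = minimize(loss, x0=np.zeros(4), method="L-BFGS-B", callback=callback)
print(res.message)
```

On this convex problem the optimizer recovers `theta_true` and stops with the same kind of "relative reduction of F" convergence message seen above.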

After training, the given mtp_data has been updated in place. Finally, we store the trained potential in a new file.

write_mtp("final.mtp", mtp_data)

Total running time of the script: (0 minutes 20.761 seconds)

Gallery generated by Sphinx-Gallery