Comments (12)
For FMAES, I got 0.152 on my personal computer using the following settings:
options = {
'fitness_threshold': 1e-4,
'max_runtime': 30_000,
'seed_rng': 500,
'max_function_evaluations': 100_000_000,
'sigma': 1.3,
'verbose': 500,
'stagnation': 100
}
I only added the 'stagnation' setting and increased 'sigma' from 0.3 to 1.3 for better exploration.
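For context, these options map onto pypop7's documented `FMAES(problem, options).optimize()` calling convention. Below is a minimal sketch; the 60-dimensional sphere objective and the [-10, 10] boundaries are stand-in assumptions (the thread's actual fitness function and bounds are not shown), and the import is guarded so the sketch is harmless where pypop7 is not installed.

```python
import numpy as np

def sphere(x):  # hypothetical stand-in: the actual objective used in this thread is not shown
    return float(np.sum(x ** 2))

ndim = 60  # assumed dimensionality for illustration
problem = {'fitness_function': sphere,
           'ndim_problem': ndim,
           'lower_boundary': -10.0 * np.ones(ndim),   # assumed bounds
           'upper_boundary': 10.0 * np.ones(ndim)}
options = {'fitness_threshold': 1e-4,
           'max_runtime': 30_000,
           'seed_rng': 500,
           'max_function_evaluations': 100_000_000,
           'sigma': 1.3,
           'verbose': 500,
           'stagnation': 100}

try:
    # pypop7's documented calling convention is FMAES(problem, options).optimize()
    from pypop7.optimizers.es.fmaes import FMAES
    fmaes = FMAES(problem, options)
    # results = fmaes.optimize()  # very long run with this evaluation budget
except Exception:
    fmaes = None  # pypop7 not installed (or API differs in this environment)
```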
* Generation 0: best_so_far_y 3.45900e+01, min(y) 3.45900e+01 & Evaluations 16
* Generation 500: best_so_far_y 3.61442e-01, min(y) 3.66357e-01 & Evaluations 8016
* Generation 1000: best_so_far_y 2.96007e-01, min(y) 2.96007e-01 & Evaluations 16016
* Generation 1500: best_so_far_y 2.92347e-01, min(y) 2.92352e-01 & Evaluations 24016
* Generation 2000: best_so_far_y 2.75207e-01, min(y) 2.75319e-01 & Evaluations 32016
* Generation 2500: best_so_far_y 2.62970e-01, min(y) 2.62970e-01 & Evaluations 40016
* Generation 3000: best_so_far_y 2.62729e-01, min(y) 2.62729e-01 & Evaluations 48016
* Generation 3500: best_so_far_y 2.54470e-01, min(y) 2.54471e-01 & Evaluations 56016
* Generation 4000: best_so_far_y 2.54416e-01, min(y) 2.54416e-01 & Evaluations 64016
* Generation 4500: best_so_far_y 2.54416e-01, min(y) 2.54416e-01 & Evaluations 72016
* Generation 4654: best_so_far_y 2.54416e-01, min(y) 2.54416e-01 & Evaluations 74464
....... *** restart *** .......
* Generation 0: best_so_far_y 2.54416e-01, min(y) 1.89882e+01 & Evaluations 74496
* Generation 500: best_so_far_y 2.54416e-01, min(y) 3.33518e-01 & Evaluations 90496
* Generation 1000: best_so_far_y 2.41184e-01, min(y) 2.41184e-01 & Evaluations 106496
* Generation 1500: best_so_far_y 2.38546e-01, min(y) 2.38546e-01 & Evaluations 122496
* Generation 2000: best_so_far_y 2.38490e-01, min(y) 2.38490e-01 & Evaluations 138496
* Generation 2152: best_so_far_y 2.38490e-01, min(y) 2.38490e-01 & Evaluations 143328
....... *** restart *** .......
* Generation 0: best_so_far_y 2.38490e-01, min(y) 2.58909e+01 & Evaluations 143392
* Generation 500: best_so_far_y 2.38490e-01, min(y) 2.38844e-01 & Evaluations 175392
* Generation 1000: best_so_far_y 2.19327e-01, min(y) 2.19327e-01 & Evaluations 207392
* Generation 1500: best_so_far_y 2.17826e-01, min(y) 2.17886e-01 & Evaluations 239392
* Generation 2000: best_so_far_y 2.17233e-01, min(y) 2.17233e-01 & Evaluations 271392
* Generation 2500: best_so_far_y 2.17181e-01, min(y) 2.17181e-01 & Evaluations 303392
* Generation 3000: best_so_far_y 2.17172e-01, min(y) 2.17172e-01 & Evaluations 335392
* Generation 3500: best_so_far_y 2.17170e-01, min(y) 2.17170e-01 & Evaluations 367392
* Generation 4000: best_so_far_y 2.17170e-01, min(y) 2.17170e-01 & Evaluations 399392
* Generation 4500: best_so_far_y 2.17170e-01, min(y) 2.17170e-01 & Evaluations 431392
* Generation 5000: best_so_far_y 2.17170e-01, min(y) 2.17170e-01 & Evaluations 463392
* Generation 5500: best_so_far_y 2.17170e-01, min(y) 2.17170e-01 & Evaluations 495392
* Generation 6000: best_so_far_y 2.17170e-01, min(y) 2.17170e-01 & Evaluations 527392
* Generation 6500: best_so_far_y 2.17170e-01, min(y) 2.17170e-01 & Evaluations 559392
* Generation 7000: best_so_far_y 2.17170e-01, min(y) 2.17170e-01 & Evaluations 591392
* Generation 7500: best_so_far_y 2.17170e-01, min(y) 2.17170e-01 & Evaluations 623392
* Generation 8000: best_so_far_y 2.17170e-01, min(y) 2.17170e-01 & Evaluations 655392
* Generation 8500: best_so_far_y 2.17170e-01, min(y) 2.17170e-01 & Evaluations 687392
* Generation 9000: best_so_far_y 2.17170e-01, min(y) 2.17170e-01 & Evaluations 719392
* Generation 9500: best_so_far_y 2.17170e-01, min(y) 2.17170e-01 & Evaluations 751392
* Generation 9583: best_so_far_y 2.17170e-01, min(y) 2.17170e-01 & Evaluations 756640
....... *** restart *** .......
* Generation 0: best_so_far_y 2.17170e-01, min(y) 2.11403e+01 & Evaluations 756768
* Generation 500: best_so_far_y 2.17170e-01, min(y) 2.78004e-01 & Evaluations 820768
* Generation 1000: best_so_far_y 2.17170e-01, min(y) 2.25295e-01 & Evaluations 884768
* Generation 1297: best_so_far_y 2.17170e-01, min(y) 2.25235e-01 & Evaluations 922656
....... *** restart *** .......
* Generation 0: best_so_far_y 2.17170e-01, min(y) 2.41040e+01 & Evaluations 922912
* Generation 500: best_so_far_y 2.17170e-01, min(y) 2.27704e-01 & Evaluations 1050912
* Generation 968: best_so_far_y 1.97796e-01, min(y) 1.97796e-01 & Evaluations 1170464
....... *** restart *** .......
* Generation 0: best_so_far_y 1.97796e-01, min(y) 2.07147e+01 & Evaluations 1170976
* Generation 500: best_so_far_y 1.97796e-01, min(y) 2.16960e-01 & Evaluations 1426976
* Generation 1000: best_so_far_y 1.84787e-01, min(y) 1.85319e-01 & Evaluations 1682976
* Generation 1500: best_so_far_y 1.52076e-01, min(y) 1.52076e-01 & Evaluations 1938976
* Generation 1636: best_so_far_y 1.52076e-01, min(y) 1.52076e-01 & Evaluations 2008096
....... *** restart *** .......
* Generation 0: best_so_far_y 1.52076e-01, min(y) 1.50320e+01 & Evaluations 2009120
* Generation 500: best_so_far_y 1.52076e-01, min(y) 3.68050e-01 & Evaluations 2521120
* Generation 1000: best_so_far_y 1.52076e-01, min(y) 1.75294e-01 & Evaluations 3033120
* Generation 1483: best_so_far_y 1.52076e-01, min(y) 1.64093e-01 & Evaluations 3526688
....... *** restart *** .......
* Generation 0: best_so_far_y 1.52076e-01, min(y) 1.59880e+01 & Evaluations 3528736
* Generation 348: best_so_far_y 1.52076e-01, min(y) 6.50549e-01 & Evaluations 4239392
....... *** restart *** .......
* Generation 0: best_so_far_y 1.52076e-01, min(y) 1.74142e+01 & Evaluations 4243488
* Generation 500: best_so_far_y 1.52076e-01, min(y) 6.52559e-01 & Evaluations 6291488
* Generation 1000: best_so_far_y 1.52076e-01, min(y) 3.01358e-01 & Evaluations 8339488
* Generation 1016: best_so_far_y 1.52076e-01, min(y) 3.04507e-01 & Evaluations 8400928
....... *** restart *** .......
* Generation 0: best_so_far_y 1.52076e-01, min(y) 1.35338e+01 & Evaluations 8409120
* Generation 181: best_so_far_y 1.52076e-01, min(y) 6.78880e-01 & Evaluations 9883680
....... *** restart *** .......
* Generation 0: best_so_far_y 1.52076e-01, min(y) 8.43856e+00 & Evaluations 9900064
* Generation 375: best_so_far_y 1.52076e-01, min(y) 6.89216e-01 & Evaluations 16027680
....... *** restart *** .......
* Generation 0: best_so_far_y 1.52076e-01, min(y) 1.12617e+01 & Evaluations 16060448
* Generation 141: best_so_far_y 1.52076e-01, min(y) 6.80394e-01 & Evaluations 20647968
....... *** restart *** .......
* Generation 0: best_so_far_y 1.52076e-01, min(y) 8.35395e+00 & Evaluations 20713504
* Generation 195: best_so_far_y 1.52076e-01, min(y) 6.74632e-01 & Evaluations 33427488
....... *** restart *** .......
* Generation 0: best_so_far_y 1.52076e-01, min(y) 1.01387e+01 & Evaluations 33558560
* Generation 259: best_so_far_y 1.52076e-01, min(y) 6.71572e-01 & Evaluations 67375136
....... *** restart *** .......
* Generation 0: best_so_far_y 1.52076e-01, min(y) 1.17326e+01 & Evaluations 67637280
* Generation 124: best_so_far_y 1.52076e-01, min(y) 6.79190e-01 & Evaluations 100000000
0.15207564457445216
from pypop.
Thanks very much for your tips. We are checking your code in order to help improve it.
I just ran your code and got the following results:
100%|██████████| 300/300 [01:02<00:00, 4.77it/s, best_result_so_far=0.158]
message: Optimization terminated successfully.
success: True
status: 0
fun: 0.1583149714373654
x: [ 4.046e-01 1.729e+00 ... -5.250e-01 4.071e-01]
nit: 283
jac: [ 1.756e-05 -2.446e-05 ... -2.330e-06 9.231e-06]
hess_inv: [[ 1.666e+00 6.581e-02 ... 5.589e-01 -1.330e-01]
[ 6.581e-02 5.340e-01 ... 1.823e-01 -1.440e-01]
...
[ 5.589e-01 1.823e-01 ... 1.251e+00 -4.515e-01]
[-1.330e-01 -1.440e-01 ... -4.515e-01 3.689e+00]]
nfev: 18239
njev: 299
- Generation 0: best_so_far_y 1.65875e+01, min(y) 1.65875e+01 & Evaluations 16
- Generation 500: best_so_far_y 4.43191e-01, min(y) 4.44579e-01 & Evaluations 8016
- Generation 1000: best_so_far_y 3.63186e-01, min(y) 3.63186e-01 & Evaluations 16016
- Generation 1500: best_so_far_y 3.53268e-01, min(y) 3.53329e-01 & Evaluations 24016
- Generation 2000: best_so_far_y 3.05547e-01, min(y) 3.05707e-01 & Evaluations 32016
- Generation 2500: best_so_far_y 2.95432e-01, min(y) 2.95478e-01 & Evaluations 40016
- Generation 3000: best_so_far_y 2.79785e-01, min(y) 2.79901e-01 & Evaluations 48016
- Generation 3500: best_so_far_y 2.69350e-01, min(y) 2.69350e-01 & Evaluations 56016
- Generation 4000: best_so_far_y 2.65173e-01, min(y) 2.65265e-01 & Evaluations 64016
- Generation 4500: best_so_far_y 2.62260e-01, min(y) 2.62262e-01 & Evaluations 72016
- Generation 5000: best_so_far_y 2.61340e-01, min(y) 2.61340e-01 & Evaluations 80016
...............................
....... *** restart *** .......
- Generation 0: best_so_far_y 2.53208e-01, min(y) 1.40257e+01 & Evaluations 1939728
- Generation 500: best_so_far_y 2.47293e-01, min(y) 2.47293e-01 & Evaluations 1955728
- Generation 1000: best_so_far_y 2.16235e-01, min(y) 2.16235e-01 & Evaluations 1971728
- Generation 1500: best_so_far_y 2.15913e-01, min(y) 2.15913e-01 & Evaluations 1987728
- Generation 1637: best_so_far_y 2.15913e-01, min(y) 2.15913e-01 & Evaluations 1992080
....... *** restart *** .......
- Generation 0: best_so_far_y 2.15913e-01, min(y) 1.71281e+01 & Evaluations 1992144
- Generation 500: best_so_far_y 2.15913e-01, min(y) 3.12021e-01 & Evaluations 2024144
- Generation 1000: best_so_far_y 2.15913e-01, min(y) 2.74514e-01 & Evaluations 2056144
- Generation 1500: best_so_far_y 2.15913e-01, min(y) 2.35233e-01 & Evaluations 2088144
- Generation 2000: best_so_far_y 1.57412e-01, min(y) 1.57556e-01 & Evaluations 2120144
- Generation 2500: best_so_far_y 1.55672e-01, min(y) 1.55673e-01 & Evaluations 2152144
- Generation 3000: best_so_far_y 1.55385e-01, min(y) 1.55386e-01 & Evaluations 2184144
- Generation 3500: best_so_far_y 1.55307e-01, min(y) 1.55307e-01 & Evaluations 2216144
- Generation 4000: best_so_far_y 1.55283e-01, min(y) 1.55284e-01 & Evaluations 2248144
- Generation 4500: best_so_far_y 1.55274e-01, min(y) 1.55274e-01 & Evaluations 2280144
- Generation 5000: best_so_far_y 1.55269e-01, min(y) 1.55269e-01 & Evaluations 2312144
- Generation 5500: best_so_far_y 1.55267e-01, min(y) 1.55267e-01 & Evaluations 2344144
- Generation 6000: best_so_far_y 1.55266e-01, min(y) 1.55266e-01 & Evaluations 2376144
- Generation 6500: best_so_far_y 1.55265e-01, min(y) 1.55265e-01 & Evaluations 2408144
- Generation 7000: best_so_far_y 1.55265e-01, min(y) 1.55265e-01 & Evaluations 2440144
- Generation 7500: best_so_far_y 1.55264e-01, min(y) 1.55264e-01 & Evaluations 2472144
- Generation 8000: best_so_far_y 1.55264e-01, min(y) 1.55264e-01 & Evaluations 2504144
- Generation 8500: best_so_far_y 1.55264e-01, min(y) 1.55264e-01 & Evaluations 2536144
- Generation 9000: best_so_far_y 1.55264e-01, min(y) 1.55264e-01 & Evaluations 2568144
- Generation 9500: best_so_far_y 1.55264e-01, min(y) 1.55264e-01 & Evaluations 2600144
- Generation 10000: best_so_far_y 1.55264e-01, min(y) 1.55264e-01 & Evaluations 2632144
- Generation 10500: best_so_far_y 1.55264e-01, min(y) 1.55264e-01 & Evaluations 2664144
- Generation 11000: best_so_far_y 1.55264e-01, min(y) 1.55264e-01 & Evaluations 2696144
- Generation 11500: best_so_far_y 1.55264e-01, min(y) 1.55264e-01 & Evaluations 2728144
- Generation 12000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 2760144
- Generation 12500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 2792144
- Generation 13000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 2824144
- Generation 13500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 2856144
- Generation 14000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 2888144
- Generation 14500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 2920144
- Generation 15000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 2952144
- Generation 15500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 2984144
- Generation 16000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3016144
- Generation 16500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3048144
- Generation 17000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3080144
- Generation 17500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3112144
- Generation 18000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3144144
- Generation 18500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3176144
- Generation 19000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3208144
- Generation 19500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3240144
- Generation 20000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3272144
- Generation 20500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3304144
- Generation 21000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3336144
- Generation 21500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3368144
- Generation 22000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3400144
- Generation 22500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3432144
- Generation 23000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3464144
- Generation 23500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3496144
- Generation 24000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3528144
- Generation 24500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3560144
- Generation 25000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3592144
- Generation 25500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3624144
- Generation 26000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3656144
- Generation 26500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3688144
- Generation 27000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3720144
- Generation 27500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3752144
- Generation 28000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3784144
- Generation 28500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3816144
- Generation 29000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3848144
- Generation 29500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3880144
- Generation 30000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3912144
- Generation 30500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3944144
- Generation 31000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 3976144
- Generation 31500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 4008144
- Generation 32000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 4040144
- Generation 32500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 4072144
- Generation 33000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 4104144
- Generation 33500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 4136144
- Generation 34000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 4168144
- Generation 34500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 4200144
- Generation 35000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 4232144
- Generation 35500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 4264144
- Generation 36000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 4296144
- Generation 36500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 4328144
- Generation 37000: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 4360144
- Generation 37500: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 4392144
- Generation 37780: best_so_far_y 1.55263e-01, min(y) 1.55263e-01 & Evaluations 4410000
....... *** restart *** .......
- Generation 0: best_so_far_y 1.55263e-01, min(y) 1.66687e+01 & Evaluations 4410128
- Generation 500: best_so_far_y 1.55263e-01, min(y) 2.63335e-01 & Evaluations 4474128
- Generation 1000: best_so_far_y 1.55263e-01, min(y) 2.06230e-01 & Evaluations 4538128
- Generation 1345: best_so_far_y 1.55263e-01, min(y) 2.06176e-01 & Evaluations 4582160
....... *** restart *** .......
- Generation 0: best_so_far_y 1.55263e-01, min(y) 1.29086e+01 & Evaluations 4582416
- Generation 500: best_so_far_y 1.55263e-01, min(y) 1.86254e-01 & Evaluations 4710416
- Generation 998: best_so_far_y 1.55263e-01, min(y) 1.81902e-01 & Evaluations 4837648
....... *** restart *** .......
- Generation 0: best_so_far_y 1.55263e-01, min(y) 1.95505e+01 & Evaluations 4838160
- Generation 500: best_so_far_y 1.55263e-01, min(y) 2.00258e-01 & Evaluations 5094160
- Generation 1000: best_so_far_y 1.55263e-01, min(y) 1.77227e-01 & Evaluations 5350160
- Generation 1060: best_so_far_y 1.55263e-01, min(y) 1.77227e-01 & Evaluations 5380368
....... *** restart *** .......
- Generation 0: best_so_far_y 1.55263e-01, min(y) 1.04358e+01 & Evaluations 5381392
- Generation 500: best_so_far_y 1.55263e-01, min(y) 2.43145e-01 & Evaluations 5893392
- Generation 1000: best_so_far_y 1.55263e-01, min(y) 1.84885e-01 & Evaluations 6405392
- Generation 1010: best_so_far_y 1.55263e-01, min(y) 1.84885e-01 & Evaluations 6414608
....... *** restart *** .......
- Generation 0: best_so_far_y 1.55263e-01, min(y) 1.10192e+01 & Evaluations 6416656
- Generation 342: best_so_far_y 1.55263e-01, min(y) 6.37732e-01 & Evaluations 7115024
....... *** restart *** .......
- Generation 0: best_so_far_y 1.55263e-01, min(y) 1.21271e+01 & Evaluations 7119120
- Generation 500: best_so_far_y 1.55263e-01, min(y) 6.20709e-01 & Evaluations 9167120
- Generation 1000: best_so_far_y 1.55263e-01, min(y) 2.58374e-01 & Evaluations 11215120
- Generation 1228: best_so_far_y 1.55263e-01, min(y) 2.51251e-01 & Evaluations 12144912
....... *** restart *** .......
- Generation 0: best_so_far_y 1.55263e-01, min(y) 1.13383e+01 & Evaluations 12153104
- Generation 500: best_so_far_y 1.55263e-01, min(y) 3.86129e-01 & Evaluations 16249104
- Generation 861: best_so_far_y 1.55263e-01, min(y) 3.61371e-01 & Evaluations 19198224
....... *** restart *** .......
- Generation 0: best_so_far_y 1.55263e-01, min(y) 8.38923e+00 & Evaluations 19214608
- Generation 500: best_so_far_y 1.55263e-01, min(y) 4.27539e-01 & Evaluations 27406608
- Generation 708: best_so_far_y 1.55263e-01, min(y) 4.22711e-01 & Evaluations 30798096
....... *** restart *** .......
- Generation 0: best_so_far_y 1.55263e-01, min(y) 1.42631e+01 & Evaluations 30830864
- Generation 270: best_so_far_y 1.55263e-01, min(y) 6.60824e-01 & Evaluations 39645456
....... *** restart *** .......
- Generation 0: best_so_far_y 1.55263e-01, min(y) 9.53739e+00 & Evaluations 39710992
The results of multi-start optimization and FMAES are 0.1583149714373654 and 0.155263, respectively.
My system runs Python 3.11.
Hmm... strange. Could you send me your numpy, scipy, pypop7, and numba versions? @Chang-SHAO
numpy==1.25.2
scipy==1.11.3
pypop7==0.0.72
numba==0.58.0
python==3.11.4
@Chang-SHAO
This is super weird.
I installed Python 3.11.4 on Windows 10 x64.
celluloid==0.2.0
colorama==0.4.6
contourpy==1.1.1
cycler==0.12.1
filelock==3.12.4
fonttools==4.43.1
fsspec==2023.9.2
imageio==2.31.5
Jinja2==3.1.2
joblib==1.3.2
kiwisolver==1.4.5
llvmlite==0.41.0
MarkupSafe==2.1.3
matplotlib==3.8.0
mpmath==1.3.0
networkx==3.1
numba==0.58.0
numpy==1.25.2
packaging==23.2
pandas==2.1.1
Pillow==10.0.1
pyparsing==3.1.1
pypop7==0.0.72
python-dateutil==2.8.2
pytz==2023.3.post1
scikit-learn==1.3.1
scipy==1.11.3
seaborn==0.13.0
six==1.16.0
sympy==1.12
threadpoolctl==3.2.0
torch==2.1.0
tqdm==4.66.1
typing_extensions==4.8.0
tzdata==2023.3
And I have got the same results:
"multi-start optimization is 0.161725642456162, FMAES - 0.172185"
If I use the NumPy Generator, which might be more stable, I get 0.1534:
message: Optimization terminated successfully.
success: True
status: 0
fun: 0.15348349998395383
x: [ 1.810e+00 -1.906e-02 ... -1.579e-01 -9.263e-01]
nit: 488
jac: [ 3.576e-07 -1.401e-05 ... 1.729e-06 -2.235e-08]
hess_inv: [[ 8.521e-01 1.285e-02 ... 3.749e-02 5.405e-01]
[ 1.285e-02 1.087e-03 ... -2.505e-05 3.619e-02]
...
[ 3.749e-02 -2.505e-05 ... 6.151e-01 -5.427e-02]
[ 5.405e-01 3.619e-02 ... -5.427e-02 3.545e+00]]
nfev: 33062
njev: 542
import numpy
import scipy.optimize
from tqdm import tqdm


def multi_start_optimization(func, bounds, n_starts=10, method='BFGS',
                             callback=None, seed=None, verbose=True, **kwargs):
    best_result = None
    generator = numpy.random.default_rng(seed=seed)
    for _ in (pbar := tqdm(range(n_starts), disable=not verbose)):
        # sample a uniform starting point inside [bounds[0], bounds[1]]
        x0 = generator.random(*bounds[0].shape) * (bounds[1] - bounds[0]) + bounds[0]
        result = scipy.optimize.minimize(func, x0, method=method, callback=callback, **kwargs)
        if best_result is None or result.fun < best_result.fun:
            best_result = result
        pbar.set_postfix({'best_result_so_far': best_result.fun})
    return best_result
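For reference, a simplified, self-contained variant of this helper (progress bar dropped, starting points drawn with `Generator.uniform`) applied to a hypothetical 2-D quadratic; this is an illustration only, not the thread's actual objective:

```python
import numpy as np
import scipy.optimize

def sphere(x):  # hypothetical test objective; global minimum 0 at the origin
    return float(np.sum(x ** 2))

def multi_start(func, bounds, n_starts=5, method='BFGS', seed=0):
    # simplified multi-start loop: best result over several random restarts
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_starts):
        x0 = rng.uniform(bounds[0], bounds[1])  # uniform start inside the box
        res = scipy.optimize.minimize(func, x0, method=method)
        if best is None or res.fun < best.fun:
            best = res
    return best

bounds = (np.full(2, -5.0), np.full(2, 5.0))
result = multi_start(sphere, bounds)
```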
@FiksII According to NumPy's official recommendation, it is better to use its Generator: https://numpy.org/doc/stable/reference/random/generator.html
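As a small illustration of why the seeded Generator is preferred for reproducibility: each `default_rng` instance carries its own state, independent of the legacy global `np.random` state.

```python
import numpy as np

# Two Generator instances seeded identically yield identical streams,
# regardless of any calls made on the legacy global np.random state.
rng_a = np.random.default_rng(500)
rng_b = np.random.default_rng(500)
np.random.seed(123)       # perturbing the legacy global state...
_ = np.random.random(7)   # ...does not affect either Generator
draws_a = rng_a.random(10)
draws_b = rng_b.random(10)
assert np.array_equal(draws_a, draws_b)
```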
@Evolutionary-Intelligence The problem is that I can't reach the FMAES result of 0.155:
....... *** restart *** .......
* Generation 0: best_so_far_y 1.84766e-01, min(y) 1.28830e+01 & Evaluations 4225392
* Generation 500: best_so_far_y 1.84766e-01, min(y) 4.66329e-01 & Evaluations 5249392
* Generation 1000: best_so_far_y 1.72185e-01, min(y) 1.74094e-01 & Evaluations 6273392
* Generation 1087: best_so_far_y 1.72185e-01, min(y) 1.72500e-01 & Evaluations 6449520
....... *** restart *** .......
* Generation 0: best_so_far_y 1.72185e-01, min(y) 1.31143e+01 & Evaluations 6453616
* Generation 500: best_so_far_y 1.72185e-01, min(y) 3.64424e-01 & Evaluations 8501616
* Generation 780: best_so_far_y 1.72185e-01, min(y) 2.91337e-01 & Evaluations 9644400
....... *** restart *** .......
* Generation 0: best_so_far_y 1.72185e-01, min(y) 1.11503e+01 & Evaluations 9652592
* Generation 500: best_so_far_y 1.72185e-01, min(y) 3.37978e-01 & Evaluations 13748592
* Generation 685: best_so_far_y 1.72185e-01, min(y) 2.82461e-01 & Evaluations 15255920
....... *** restart *** .......
* Generation 0: best_so_far_y 1.72185e-01, min(y) 9.21559e+00 & Evaluations 15272304
* Generation 500: best_so_far_y 1.72185e-01, min(y) 4.49968e-01 & Evaluations 23464304
* Generation 717: best_so_far_y 1.72185e-01, min(y) 4.29491e-01 & Evaluations 27003248
....... *** restart *** .......
* Generation 0: best_so_far_y 1.72185e-01, min(y) 1.20239e+01 & Evaluations 27036016
* Generation 215: best_so_far_y 1.72185e-01, min(y) 6.52917e-01 & Evaluations 34048368
....... *** restart *** .......
* Generation 0: best_so_far_y 1.72185e-01, min(y) 8.29719e+00 & Evaluations 34113904
* Generation 500: best_so_far_y 1.72185e-01, min(y) 6.05617e-01 & Evaluations 66881904
* Generation 802: best_so_far_y 1.72185e-01, min(y) 5.39836e-01 & Evaluations 86608240
....... *** restart *** .......
* Generation 0: best_so_far_y 1.72185e-01, min(y) 1.50296e+01 & Evaluations 86739312
* Generation 102: best_so_far_y 1.72185e-01, min(y) 6.62570e-01 & Evaluations 100000000
For PyPop7, we always use this Generator approach to control random number generation whenever possible.
For BFGS, I obtained 0.148 on my personal computer:
100%|██████████████████████████████████████████████████████| 300/300 [04:24<00:00, 1.13it/s, best_result_so_far=0.148]
fun: 0.14828481985383515
hess_inv: array([[ 2.0961215 , 0.08509032, 0.18346933, ..., 0.16121667,
0.06686123, -0.43784459],
[ 0.08509032, 1.91309737, 0.75694706, ..., 0.05655528,
0.08857216, 0.31681217],
[ 0.18346933, 0.75694706, 0.69920324, ..., 0.05734176,
0.11169623, -0.15906565],
...,
[ 0.16121667, 0.05655528, 0.05734176, ..., 0.59763052,
-0.10901236, 0.11503432],
[ 0.06686123, 0.08857216, 0.11169623, ..., -0.10901236,
0.89061476, -0.24634906],
[-0.43784459, 0.31681217, -0.15906565, ..., 0.11503432,
-0.24634906, 4.17223822]])
jac: array([-8.75052065e-05, 3.37455422e-05, 2.41957605e-06, 2.34860927e-05,
-2.36947089e-05, 1.23474747e-05, 2.92062759e-06, -1.43740326e-05,
5.03454357e-05, 3.47476453e-05, -6.09885901e-05, 5.98505139e-05,
-1.31204724e-05, 3.48389149e-05, -1.76597387e-05, -2.08821148e-05,
-6.03646040e-05, 8.12429935e-05, 3.06293368e-05, -6.67702407e-05,
1.09653920e-05, 8.75815749e-06, -1.92467123e-05, 1.03972852e-05,
1.18333846e-05, -6.42407686e-05, 6.33951277e-05, 5.14611602e-05,
-8.22097063e-05, 4.25986946e-06, -2.04294920e-05, 3.18437815e-05,
-7.43083656e-05, -1.32974237e-05, -3.81935388e-05, -1.77603215e-05,
1.90306455e-05, 8.14720988e-06, 9.55574214e-05, 1.44578516e-05,
1.95633620e-05, 1.09635293e-05, -3.16649675e-07, -1.98744237e-05,
7.72066414e-06, -1.95614994e-05, -1.54785812e-05, 3.71038914e-06,
6.73905015e-06, -7.23451376e-06, 2.01575458e-05, 3.55895609e-05,
1.08387321e-05, -2.13719904e-05, 7.06315041e-05, -3.15345824e-06,
4.14438546e-06, 2.66432762e-05, 1.62962824e-05, 4.97139990e-06])
message: 'Optimization terminated successfully.'
nfev: 18361
nit: 282
njev: 301
status: 0
success: True
x: array([ 0.44604777, 1.57291812, 2.18368842, -0.21170684, 4.09407358,
0.95095942, 1.61002277, 0.93623952, 1.63431024, 4.04614979,
1.21756344, 0.70340479, 1.26349757, 0.65616069, 4.51423851,
0.90463141, 0.58072188, 1.0397659 , 1.81215946, 1.65111476,
1.66346183, -0.11780882, -0.03126143, 2.29387165, 3.90712599,
1.08005201, 0.92082673, 5.02446848, 5.12294812, 2.28257676,
0.52091617, 5.63332664, 5.58009323, 4.29602413, 3.57000464,
6.21840832, 6.77843958, 1.8425023 , 5.69346366, 5.03920492,
-0.33451392, -0.32414526, -0.40593616, -0.40431025, -0.3642978 ,
-0.439838 , -0.45072511, -0.40888369, -0.74354353, -0.29325016,
-0.48155137, -0.36315907, -0.50587948, -0.13964949, -0.51739519,
-0.2651726 , -0.4835695 , -0.12372969, -0.45856249, 0.38577874])
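The fields pasted above (`fun`, `jac`, `hess_inv`, `nfev`, `nit`, `njev`, `x`) are the standard attributes of SciPy's `OptimizeResult`. A minimal sketch on the 2-D Rosenbrock function (again, not the thread's objective) shows how to read them, e.g. checking the gradient norm at termination:

```python
import numpy as np
from scipy.optimize import minimize

def rosen2(x):  # standard 2-D Rosenbrock; minimum 0 at (1, 1)
    return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

res = minimize(rosen2, np.array([-1.2, 1.0]), method='BFGS')
# BFGS declares convergence once the gradient inf-norm falls below gtol (default 1e-5)
grad_inf_norm = float(np.max(np.abs(res.jac)))
# res.fun, res.x, res.nit, res.nfev, res.njev, res.hess_inv are also available
```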
@Evolutionary-Intelligence I am completely confused