Hello, is there any literature available on how HMMO works in HyperStudy, beyond the online documentation? I have a few problems understanding the exact concept, and it would be great to get more information about how the GA operators and the gradient search steps are used within the iterations to find the optimum.

The online documentation says "HMMO consumes only a few model evaluations (typically 5) for gradient estimation". If there are that few, which model evaluations are selected for the gradient estimation? Does the gradient estimation influence the offspring created by the genetic algorithm in any way? Or are the two independent of each other, in the sense that the GA is used to find a region near the optimum while the gradient search is used to reach convergence?

Thanks for your help!!
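To make my mental model of the question concrete: below is a toy sketch of the kind of hybrid I am imagining, written in plain Python. This is purely my own illustration, not HyperStudy's actual HMMO algorithm; the objective function, population size, step size, and the choice to polish only the best member with a finite-difference gradient are all assumptions on my part. Note that a forward-difference gradient of an n-variable function costs n + 1 evaluations, so for 4 variables that is 5 evaluations, which is where I imagine the "typically 5" figure could come from.

```python
import random

def sphere(x):
    # toy objective with its minimum at the origin (my stand-in
    # for a real model evaluation)
    return sum(v * v for v in x)

def fd_gradient(f, x, h=1e-6):
    # forward-difference gradient: n + 1 evaluations for n variables
    # (5 evaluations for the 4-variable case used below)
    f0 = f(x)
    grad = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += h
        grad.append((f(xp) - f0) / h)
    return grad

def hybrid_optimize(f, dim=4, pop_size=20, generations=30, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5.0, 5.0) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # GA step: select the better half, then create offspring
        # by crossover (averaging) plus Gaussian mutation
        pop.sort(key=f)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(ai + bi) / 2.0 + rng.gauss(0.0, 0.1)
                     for ai, bi in zip(a, b)]
            children.append(child)
        pop = parents + children
        # gradient step: polish only the current best point, so the
        # extra cost per iteration stays at a handful of evaluations
        best = min(pop, key=f)
        g = fd_gradient(f, best)
        polished = [bi - 0.1 * gi for bi, gi in zip(best, g)]
        if f(polished) < f(best):
            pop[pop.index(best)] = polished
    return min(pop, key=f)

best = hybrid_optimize(sphere)
print(sphere(best))  # a value close to 0
```

In this sketch the gradient result is written back into the population, so it *does* influence future offspring; the alternative I described in the question would be to run the GA to convergence first and only then start the gradient search from the GA's best point. I would like to know which of these (if either) HMMO actually does.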