ampair

Hybrid Multi-Objective Method for Optimization

Hello,


 


is there any literature available on how HMMO works in HyperStudy, besides the online documentation? I have a few problems understanding the exact concept.


 


It would be great to get more information about how the GA operators and the gradient search steps are used within the iterations to find the optimum.


 


In the online documentation it says: "HMMO consumes only a few model evaluations (typically 5) for gradient estimation."


 


If only a few evaluations are used, which model evaluations are selected for the gradient estimation?
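For context on where the "typically 5" might come from: a plain forward-difference estimate of one response's gradient with respect to n design variables needs n + 1 model evaluations (one baseline plus one perturbed run per variable), which would match 5 evaluations for 4 variables. A minimal sketch of that counting (my own illustration, not HyperStudy code):

```python
import numpy as np

def forward_diff_gradient(f, x, h=1e-6):
    """Forward-difference gradient estimate: one baseline evaluation
    plus one perturbed evaluation per design variable (n + 1 total)."""
    f0 = f(x)                          # baseline model evaluation
    grad = np.zeros_like(x)
    for i in range(len(x)):
        xp = x.copy()
        xp[i] += h                     # perturb a single design variable
        grad[i] = (f(xp) - f0) / h     # one extra evaluation per variable
    return grad

# Toy response: f(x) = x0^2 + 3*x1; exact gradient at (1, 2) is (2, 3).
g = forward_diff_gradient(lambda v: v[0] ** 2 + 3 * v[1],
                          np.array([1.0, 2.0]))
```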


 


Does the gradient estimation influence the offspring created by the genetic algorithm in any way? Or are the two independent, in the sense that the GA is used to find a region near the optimum and the gradient search is used to reach convergence?


 


Thanks for your help!!


 


 


Guest

Hi ampair,


 


In HMMO, the GA searches the whole design space and generates promising start points for the gradient search. The designs improved by the gradient search are then picked as part of the candidates used to generate offspring in the GA.
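As an illustration only (a toy single-objective sketch, not HyperStudy's actual implementation), that coupling can look like this: the GA population explores globally, the best few designs get a gradient step, and the refined designs re-enter the pool of parents:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    """Toy objective to minimize; optimum at x = (3, 3)."""
    return float(np.sum((x - 3.0) ** 2))

def fd_grad(x, h=1e-6):
    """Finite-difference gradient of f (stand-in for HMMO's estimate)."""
    f0 = f(x)
    return np.array([(f(x + h * e) - f0) / h for e in np.eye(len(x))])

def hybrid_optimize(dim=2, pop_size=20, generations=30):
    # GA part: a population that explores the whole design space.
    pop = rng.uniform(-10.0, 10.0, size=(pop_size, dim))
    for _ in range(generations):
        pop = pop[np.argsort([f(x) for x in pop])]   # rank by fitness
        # Gradient part: refine the most promising start points ...
        for i in range(3):
            pop[i] = pop[i] - 0.1 * fd_grad(pop[i])
        # ... and feed the improved designs back as GA parents.
        parents = pop[: pop_size // 2]
        children = parents + rng.normal(0.0, 0.5, size=parents.shape)
        pop = np.vstack([parents, children])
    return min(pop, key=f)

best = hybrid_optimize()
```

So the two searches are coupled rather than run one after the other: the gradient refinement happens inside the GA loop and its results become parents for the next generation.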
