RobertKarapet
Posts: 20
Posted 14:43 Feb 19, 2018 |

Topic: Genetic Algorithms in Optimization Problems 

In a genetic algorithm, a population of candidate solutions to an optimization problem is evolved toward better solutions.

The evolution usually starts from a population of randomly generated individuals. The more fit individuals are "stochastically" selected from the current population, and each individual's genome is modified to form a new generation. The new generation of candidate solutions is then used in the next iteration of the algorithm.
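For illustration only, here is a minimal sketch of that loop in Python; the fitness, random_individual, select, crossover and mutate functions are placeholders I am assuming for the sketch, not anything taken from the course code.

def evolve(fitness, random_individual, select, crossover, mutate,
           population_size=100, generations=50):
    # Start from a population of randomly generated individuals.
    population = [random_individual() for _ in range(population_size)]
    for _ in range(generations):
        # Score the current population once per generation.
        scored = [(fitness(ind), ind) for ind in population]
        next_generation = []
        while len(next_generation) < population_size:
            # Fitter individuals are stochastically selected as parents...
            parent_a = select(scored)
            parent_b = select(scored)
            # ...and their genomes are modified (crossover, then mutation)
            # to form the new generation.
            child = mutate(crossover(parent_a, parent_b))
            next_generation.append(child)
        population = next_generation
    # Return the best individual found in the final generation.
    return max(population, key=fitness)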

Question 1: How can selection be both stochastic and preferential (fitter individuals are preferred) at the same time?

Question 2: Will the population evolved by the algorithm end up with a few powerful individuals and many weak ones?

There are many selection algorithms that do not consider all individuals for selection, but only those with a fitness value that is higher than a given (arbitrary) constant. Other algorithms select from a restricted pool where only a certain percentage of the individuals are allowed, based on fitness value.

Answer 1: The randomness lies in gathering the sample: individuals are drawn at random, and the fittest of that sample are then chosen deterministically. The selection as a whole is still stochastic because of the random sampling step, yet it prefers fitter individuals.
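Tournament selection is one common way to get exactly this behaviour; the sketch below (assumed for illustration, and matching the select placeholder in the loop sketched earlier) draws a random sample and then keeps the fittest member of that sample.

import random

def tournament_select(scored_population, tournament_size=3):
    # scored_population is a list of (fitness, individual) pairs.
    # Step 1: gather a random sample -- this is the stochastic part.
    sample = random.sample(scored_population, tournament_size)
    # Step 2: keep the fittest of the sample -- this is the preferential part.
    return max(sample, key=lambda pair: pair[0])[1]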

Answer 2: No, that is not guaranteed to happen, because the selection is partly random; even so, the weakest individuals (solutions/heuristics) are unlikely to improve significantly through mutation or crossover.

Topic: Genetic Representation of the Solution Domain

In some problems, it is hard or even impossible to define the fitness expression; in these cases, a simulation may be used to determine the fitness function value of a phenotype. 

Question 3: In the Travelling Salesman Problem, would the total distance be a good genetic representation of the solution domain? 

The fitness function is defined over the genetic representation and measures the quality of the represented solution. The fitness function is always problem dependent.
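As a concrete illustration for the Travelling Salesman Problem (an assumed sketch, not the course implementation): a natural genetic representation is a permutation of city indices, and the fitness function defined over that representation is the total tour distance, with shorter tours being fitter.

import math

def tour_length(tour, coordinates):
    # tour is a permutation of city indices (the genetic representation);
    # coordinates maps each city index to an (x, y) pair.
    total = 0.0
    for i in range(len(tour)):
        x1, y1 = coordinates[tour[i]]
        x2, y2 = coordinates[tour[(i + 1) % len(tour)]]  # wrap back to the start
        total += math.hypot(x2 - x1, y2 - y1)
    return total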

Question 4: With respect to which optimization objective does the fitness function measure the quality of a given solution?

Topic: Crossover vs. Mutation

It is worth tuning parameters such as the mutation probability, crossover probability and population size to find reasonable settings for the problem class being worked on. A very small mutation rate may lead to genetic drift. A crossover rate that is too high may lead to premature convergence of the genetic algorithm. A mutation rate that is too high may lead to loss of good solutions, unless "elitist" selection is employed.

Elitism is a practical variant of the general process of constructing a new population which allows the best organisms from the current generation to carry over to the next, unaltered. This strategy is known as elitist selection and guarantees that the solution quality obtained by the GA will not decrease from one generation to the next.
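A minimal sketch of such an elitist step (assuming the generational loop sketched earlier and that higher fitness is better; the breed placeholder stands for selection plus crossover and mutation):

def next_generation_with_elitism(population, fitness, breed, elite_count=2):
    # Carry the best individuals over unaltered, so the best fitness in the
    # population cannot decrease from one generation to the next.
    ranked = sorted(population, key=fitness, reverse=True)
    elites = ranked[:elite_count]
    # Fill the rest of the new population with offspring as usual.
    offspring = [breed(population) for _ in range(len(population) - elite_count)]
    return elites + offspring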

Question 5: Would it not be in our best interest to confine ourselves to mutative elitism for the Travelling Salesman Problem?

Diversity is important in genetic algorithms (and genetic programming) because crossing over a homogeneous population does not yield new solutions. In evolution strategies and evolutionary programming, diversity is not essential because of a greater reliance on mutation.

The suitability of genetic algorithms is dependent on the amount of knowledge of the problem; well known problems often have better, more specialized approaches.

Answer 5: It would, but then again we would no longer engage in genetic programming. We would need to switch to evolutionary programming.  

Topic: Global Variables in Dr. Abbot's Code

Question 6: What does the "path-forward-stack" do?

Topic: The setup Function in Dr. Abbot's Code

Question 7: What does "set showing-all-edges show-all-edges" do? Why did you include this expression twice in the setup function?

Last edited by RobertKarapet at 23:40 Feb 19, 2018.