Genetic Algorithms – A Friendly Introduction

A few days back, while going through a research paper, I realised I had a new topic to cover and get my hands dirty with in the field of Machine Learning (ML).

Yeah, you guessed it: Genetic Algorithms (GAs). To my surprise, I found very little content on GAs being used in ML. If you find some, do link it in the comments; I'd love to see others' work on the topic.

Without further ado, let's just dig in and understand what GAs are all about.

In their documentation, MathWorks® defines the genetic algorithm as:

The genetic algorithm is a method for solving both constrained and unconstrained optimization problems that is based on natural selection, the process that drives biological evolution.

For the sake of simplicity, let's just say GAs are search methods (a set of algorithms), often applied to optimization and learning.

While GAs are a set of algorithms in their own right, the bigger picture is that they belong to a much larger family of algorithms known as Evolutionary Algorithms (EAs). So, what is so special about GAs? Well, their evolutionary analogy is what separates them from the others in the EA family: GAs follow the principle of Survival of the Fittest. And if you weren't sleeping in your Biology classes, you know that this is the evolutionary process we see in nature.

Now that we have a clear picture of what GAs are, let me introduce you to the concept of the Search Space (a term used constantly in GAs).

Search Space

While solving a given problem, we generally look for a solution that is the best among a set of possible solutions.
Going by the definition:

The space of all feasible solutions (that is, the set of objects among which the desired solution lies) is called the search space (also state space).

By the definition, it is clear that every point in the search space refers to a feasible solution. To visualize these solutions, we can mark each point by its value or fitness for the problem at hand. The solution to the problem will be a point or a set of points in the search space.

How do we look for a solution? Well, looking for a solution generally amounts to looking for an extreme (minimum or maximum) in the search space.

This might be a little confusing, so let's take an example. Consider the problem of finding a password that is 8 characters long and contains only lowercase alphabets. What do you think the plausible solutions to this problem are? Well, since I attended my Permutations and Combinations classes, I know that the search space contains 26^8 choices. If you are confused, just think of 8 blanks and the possible choices (26 lowercase alphabets) for each blank. The example is all good, but the thing to keep in mind is that, for a problem solved by a GA, we usually know only a few points from the search space and generate other points as the process of finding the solution continues.
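To make this concrete, here is a quick back-of-the-envelope check in Python (the variable names are just for illustration):

```python
# Size of the search space for an 8-character password built only
# from lowercase alphabets: 26 choices for each of the 8 positions.
ALPHABET_SIZE = 26
PASSWORD_LENGTH = 8

search_space_size = ALPHABET_SIZE ** PASSWORD_LENGTH
print(search_space_size)  # 208827064576 -- over 200 billion candidates
```

Checking each of those 200+ billion candidates one by one is exactly the kind of brute force a GA tries to avoid.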

Problem with search:
The problem is that the search can be very COMPLICATED!
One does not know where to look for the solution, or where to start.

To overcome this problem and find a suitable solution (which may not necessarily be the best solution), some nice guys came up with various methods, for example, hill climbing, simulated annealing, etc. Solutions found using these methods are generally considered good, because it is often not possible to prove what the real optimum is (who knows, maybe in the future we'll be able to – check out these Quora answers on NP problems).

Before going ahead, we have to understand the concept of Chromosomes (don't worry, I'm not gonna blabber about the whole concept and go deep into the world of those nuclei).

We all know that every living being, from the mosquito you killed with that spray to you, my friend, is made of cells. Now, these cells have a fascinating thing about them: every cell in an organism has the same set of chromosomes. For example, in humans there is a set of 23 pairs, while in gorillas there is a set of 24 pairs of chromosomes (1 pair can make a hell of a lot of difference, guys).

Chromosomes, in general terms, are strings of DNA (I don't want to go into the details of those ATGCs, though) and serve as a model for the whole organism. Now, these chromosomes consist of genes (blocks of DNA).

Each gene (in certain terms) encodes a trait of the organism, for example, the color of the eyes. Now, if you aren't living all by yourself, you know that individual organisms of the same species have many distinctions from one another. This is because of different settings of the traits encoded in the genes. The possible settings for a trait (for example, blue eyes, brown hair, etc.) are called alleles. Each gene has a special position in the chromosome, which is referred to as its locus. The complete set of genetic material (all chromosomes) is called the genome.
The last two terms of this biology lecture (bear with me for 2 more minutes, guys): we know every organism has this “internally coded, inheritable information” (its set of chromosomes), and we refer to it as the genotype. While these traits are passed on by the parents, we only see a few of them (the dominant ones). The actual observed properties of an organism are referred to as its phenotype.

That was a heck of a biology lecture, I know… you are a survivor, man.


This quick-reference table sums it all up, whenever you need to check what we are referring to in the Methodology below.

Chromosome – a string of DNA; serves as a model for the whole organism
Gene – a block of DNA that encodes a particular trait
Allele – a possible setting for a trait (e.g., blue eyes)
Locus – the position of a gene on the chromosome
Genome – the complete set of genetic material (all chromosomes)
Genotype – the internally coded, inheritable information
Phenotype – the actual observed properties of the organism

Methodology of A Genetic Algorithm

In one line, we can explain the methodology used as:

In a genetic algorithm, a population of candidate solutions (called individuals, creatures or phenotypes) to an optimization problem is evolved towards better solutions.

Going deeper, each of these candidate solutions has a set of properties (its genotype or chromosomes) which can be mutated and altered. The traditional way of representing a candidate solution is to use binary numbers to form strings of 0s and 1s (but other encodings are also possible).

Representation Example:
Bits or subsets of bits represent the choice of some features. For example, let us represent the choice of a smart TV using a 10-bit string with the following protocol:


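As a sketch, here is one hypothetical way such a 10-bit protocol could be laid out in Python. The bit positions and feature tables below are my own illustrative assumptions (any consistent mapping of bits to features would work just as well):

```python
# A hypothetical 10-bit encoding for a smart-TV choice:
#   bits 0-1: brand        (00=LG, 01=Samsung, 10=Sony, 11=TCL)
#   bits 2-3: screen size  (00=<40", 01=46-49", 10=50-55", 11=55"+)
#   bit  4:   3D support
#   bit  5:   HDR support
#   bit  6:   4K UHD support
#   bit  7:   curved display
#   bits 8-9: panel type   (00=LCD, 01=OLED, 10=QLED, 11=LED)

def decode(bits: str) -> dict:
    """Turn a 10-bit string into a readable feature dictionary."""
    brands = ["LG", "Samsung", "Sony", "TCL"]
    sizes = ['<40"', '46-49"', '50-55"', '55"+']
    panels = ["LCD", "OLED", "QLED", "LED"]
    return {
        "brand": brands[int(bits[0:2], 2)],
        "size": sizes[int(bits[2:4], 2)],
        "3d": bits[4] == "1",
        "hdr": bits[5] == "1",
        "4k": bits[6] == "1",
        "curved": bits[7] == "1",
        "panel": panels[int(bits[8:10], 2)],
    }

print(decode("0101110011"))
```

Under this made-up layout, the string 0101110011 decodes to a Samsung LED smart TV in the 46″–49″ range with 3D and HDR but no 4K UHD and no curved display — a string we will meet again when we talk about mutation.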
Continuing our discussion on Methodology…
Since we now understand how to represent the solutions, let us understand how we go about making changes to them so as to reach a good solution.

In general terms, the evolution of a population of randomly generated individuals is an iterative process, with the population present in each iteration called a generation. In every iteration, the fitness of every individual in the population is evaluated (fitness is usually the value of the objective function of the optimization problem being solved). To form a new generation with a bit of randomness, we stochastically select the fitter individuals from the current population and then modify the genome of each individual (recombine and randomly mutate it). This new generation is then used in the next iteration. Now, the question that might be popping up in your mind is: hey, you said it's an iterative process, so when should it be brought to a halt?
Commonly, the algorithm terminates when either a maximum number of generations has been produced, or a satisfactory fitness level has been reached for the population.
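Putting the whole loop together, here is a minimal sketch of that iterative process in Python. The population size, rates and the toy fitness function are illustrative choices on my part, not prescriptions:

```python
import random

def run_ga(fitness, genome_length, pop_size=20, generations=100,
           crossover_rate=0.9, mutation_rate=0.01):
    """Minimal GA loop: evaluate -> select -> crossover -> mutate -> repeat."""
    # Start from a population of randomly generated bit-string individuals.
    pop = [[random.randint(0, 1) for _ in range(genome_length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scores = [fitness(ind) for ind in pop]
        new_pop = []
        while len(new_pop) < pop_size:
            # Stochastic, fitness-proportional selection of two parents.
            p1, p2 = random.choices(pop, weights=scores, k=2)
            # One-point crossover with some probability.
            if random.random() < crossover_rate:
                point = random.randint(1, genome_length - 1)
                child = p1[:point] + p2[point:]
            else:
                child = p1[:]
            # Random bit-flip mutation, each bit with the same small chance.
            child = [b ^ 1 if random.random() < mutation_rate else b
                     for b in child]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

# Toy problem ("OneMax"): fitness is simply the number of 1-bits.
best = run_ga(fitness=sum, genome_length=20)
print(sum(best))  # typically at or near 20
```

Termination here is a fixed number of generations; swapping in a "stop once fitness is satisfactory" check is a one-line change.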

With power comes responsibility. GAs give you the power to get good, near-optimal solutions to problems, but for every new problem to be solved using GAs, a typical GA requires:
1. A genetic representation of the solution domain(i.e., the genotype will be different for a different problem domain).
2. A fitness function to evaluate the solution domain(i.e., the fitness function can be different for different problem domains).
These two items must be developed again whenever a new problem is specified.

General Criteria Followed By Genetic Algorithms

Every GA follows a basic set of steps for computing solutions. Let us have a look at these steps.


So, the questions that arise after giving the algorithm some thought are:
1. How do we select parents for crossover? Though this can be achieved in many ways, the main idea is to select the better parents (in the hope that better parents will produce better offspring).
2. If the new population is generated only from new offspring, won't this cause the loss of the best chromosomes from the last population? This is indeed true. To overcome this problem, a method named elitism is used: at least one best solution is copied without changes to the new population, so the best solution found can survive to the end of the run.
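Elitism itself is only a few lines of code. A sketch, assuming some make_offspring helper (a hypothetical stand-in for the selection–crossover–mutation pipeline) produces one child at a time:

```python
import random

def next_generation_with_elitism(population, fitness, make_offspring,
                                 elite_count=1):
    """Build a new generation, copying the best `elite_count`
    individuals over unchanged so the best solution found survives."""
    ranked = sorted(population, key=fitness, reverse=True)
    new_pop = [ind[:] for ind in ranked[:elite_count]]  # elites, untouched
    while len(new_pop) < len(population):
        new_pop.append(make_offspring(population))
    return new_pop

# Toy usage: fitness = number of 1-bits; offspring are just random here.
pop = [[random.randint(0, 1) for _ in range(8)] for _ in range(6)]
new_pop = next_generation_with_elitism(
    pop, fitness=sum,
    make_offspring=lambda p: [random.randint(0, 1) for _ in range(8)])
```

Whatever the offspring look like, the best chromosome of the old population is guaranteed a seat in the new one.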

Looking at the algorithm, it is clear that three operations other than fitness evaluation are being performed – Selection, Mutation and Crossover. These are done using three genetic operators. Let us visit the territory of these genetic operators and have a closer look.

Genetic Operators

Genetic operators, as mentioned above, are the methods that perform Selection, Mutation and Crossover. And, not so surprisingly, they are named Selection, Mutation and Crossover. We will now visit each of these one at a time.


To perform crossover and produce new offspring, we need to select parents (chromosomes). The problem in the selection process is how to select these chromosomes so that the best ones survive and create new offspring. The traditional way of doing so is Proportional Selection, i.e., parents are chosen to mate with probability proportional to their fitness. Also, in the traditional selection process, the children replace their parents. There are many methods for selecting the best chromosomes, for example, Roulette Wheel Selection, Boltzmann Selection, Tournament Selection, Rank Selection and Steady-State Selection (code for these, with explanations, will appear in a future post).
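Roulette Wheel Selection is the classic way to implement proportional selection. A sketch, with a toy population of bit-strings whose fitness is the number of 1-bits:

```python
import random

def roulette_wheel_select(population, fitnesses):
    """Pick one individual with probability proportional to its fitness."""
    total = sum(fitnesses)
    spin = random.uniform(0, total)  # where the "wheel" stops
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= spin:
            return individual
    return population[-1]  # guard against floating-point round-off

pop = ["0000", "0011", "1110", "1111"]
fits = [0, 2, 3, 4]  # number of 1-bits in each string
parent = roulette_wheel_select(pop, fits)
```

Each individual owns a slice of the wheel proportional to its fitness, so "1111" is picked most often and "0000" (fitness zero) practically never.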

Looking at the broader picture, one can clearly see the principle being used is Survival of the Fittest.


Mutation (reminds us of X-MEN, I know… oh, Logan) is one of the simplest processes to explain in a GA. This operator acts on a single parent at a time to produce an offspring. Think of it as a magic wand that can change one (or more) bits in a binary representation, where each bit has the same probability of mutating. Let us recall our previous example string.
Example String:  0101110011 (Represents a Samsung LED Smart TV in range 46″-49″ with 3D and HDR capabilities but no 4K UHD support and no curved display)
Could Mutate to: 0101111011 (Represents a Samsung LED Smart TV in range 46″-49″ with 3D and HDR capabilities along with 4K UHD support but no curved display)
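A bit-flip mutation operator takes only a few lines of Python. The forced flip below just reproduces the example above; in a real run, every bit flips independently with the same small probability:

```python
import random

def mutate(bits: str, rate: float = 0.1) -> str:
    """Flip each bit independently with probability `rate`."""
    return "".join(b if random.random() > rate else str(1 - int(b))
                   for b in bits)

# Forcing a flip at one position reproduces the example above:
original = "0101110011"
mutated = original[:6] + "1" + original[7:]  # flip the 4K-UHD bit
print(mutated)  # 0101111011
```

With a low rate, most offspring differ from their parent in at most a bit or two, which is exactly the gentle nudge the operator is meant to provide.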

Oh God… if only my algorithm could really mutate my smart TV!
It might or might not happen in the future, but we must keep going (with my broken heart, though).


This, my boy, is how you were born (not exactly, though). I guess I must put a halt to my lame jokes now. So, this operator, as you might have guessed, operates on two parents selected by the Selection operator to produce one or two offspring. Classical crossover occurs at one or two points in the selected parents.

Example of generating offspring using 1 point crossover:


Looking at the image, it is clear that we just swapped the sub-strings on either side of the point of crossover.

Example of generating offspring using 2 point crossover:


Just like in 1 point crossover, in 2 point crossover we swap the sub-strings enclosed between the two points of crossover.
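Both crossovers are simple string surgery. A sketch in Python, with the crossover points chosen at random as in a typical GA:

```python
import random

def one_point_crossover(p1: str, p2: str):
    """Swap the tails of two parents after a random crossover point."""
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def two_point_crossover(p1: str, p2: str):
    """Swap the middle segments between two random crossover points."""
    a, b = sorted(random.sample(range(1, len(p1)), 2))
    return (p1[:a] + p2[a:b] + p1[b:],
            p2[:a] + p1[a:b] + p2[b:])

c1, c2 = one_point_crossover("0000000000", "1111111111")
print(c1, c2)  # e.g. 0001111111 1110000000
```

Using all-0s and all-1s parents makes the swap easy to see: at every position, each child takes its bit from exactly one of the two parents.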

With this, I think you guys have a pretty good idea of how GAs work. Giving our minds some rest, I'll be back with a post on a very simple GA, along with an example problem to apply it to.
