Brain Crist


Paxos Algorithm

Since writing my previous article about the CAP theorem, I have come to realize that I may have misunderstood it, and that consistency and availability can, to a degree, be achieved together.

While the CAP theorem asserts that consistency and availability cannot be achieved simultaneously in a distributed system, it defines both properties in strict binary terms. In practice, availability is a continuum, and consistency can be divided into different levels, such as weak consistency, strong consistency, read and write consistency, and eventual consistency. In layman’s terms, the CAP theorem argues only that strong consistency and perfect availability cannot be achieved at the same time.

To address these limitations, the **Paxos protocol**, devised by Turing Award winner Leslie Lamport, was introduced to get the most out of availability and consistency in such systems. The Paxos algorithm keeps a system working in the presence of network errors and node failures (availability) while still ensuring consistency.

Paxos is a family of distributed algorithms for solving consensus in a network of unreliable or fallible processors.

The Paxos algorithm is based on a simple majority rule, which is enough to ensure that only consistent values can be chosen. The protocol guarantees that if a majority of the nodes in a system are available, the system as a whole is available and can still provide strong data consistency, which is a great improvement in availability.

>> How it works

Essentially, the Paxos protocol treats each write request as a proposal. The protocol involves the following entities:

  1. Proposers: Receive requests (values/proposals) from clients and try to convince acceptors to accept their proposed values
  2. Acceptors: Accept certain proposed values from proposers and let proposers know if something else was accepted. A response represents a vote for a particular proposal
  3. Learners: Announce the outcome

Each proposal can be broken down into two phases: _phase 1 (Prepare & Promise)_ and _phase 2 (Accept & Accepted)_. Proposers interact with the acceptors twice.

  1. **Phase 1:** A proposer selects a proposal number _n_ and sends a _prepare request_ with that number to all acceptors. If an acceptor receives a prepare request with a number n greater than that of every prepare request it has already responded to, it promises not to accept any proposal with a number less than n, and replies with the highest-numbered proposal (if any) that it has already accepted.
  2. **Phase 2:** If the proposer receives responses to its prepare request from a majority of acceptors, it sends an _accept request_ for a proposal with number n and value v to the acceptors, where v is the value of the highest-numbered proposal among the responses (or the proposer’s own value if none was reported). An acceptor that receives an accept request with number n accepts the proposal as long as it has not already responded to a prepare request with a number greater than n.

#algorithms #software-development #coding #programming #distributed-systems


A greedy algorithm is a simple

The Greedy Method is an approach for solving certain types of optimization problems. A greedy algorithm chooses the locally optimal result at each stage. While this works the majority of the time, there are numerous examples where the greedy approach is not the correct one. For example, let’s say that you’re taking the greedy approach to earning money at a certain point in your life. You graduate high school and have two options:

#computer-science #algorithms #developer #programming #greedy-algorithms

Tia Gottlieb


KMP — Pattern Matching Algorithm

Finding a certain piece of text inside a document is an important feature nowadays. It is widely used in many practical things we do in our everyday lives, such as searching for something on Google or detecting plagiarism. For small texts, the pattern-matching algorithm doesn’t need to be particularly efficient to behave well. However, big jobs like searching for the word ‘cake’ in a 300-page book can take a lot of time if a naive algorithm is used.

The naive algorithm

Before talking about KMP, we should analyze the inefficient approach to finding a sequence of characters in a text. This algorithm slides the pattern over the text one position at a time and checks for a match at each position. The complexity of this solution is O(m * (n - m + 1)), where m is the length of the pattern and n is the length of the text.

Find all the occurrences of string pat in string txt (naive algorithm).

#include <iostream>
#include <string>
using namespace std;

string pat = "ABA"; // the pattern
string txt = "CABBCABABAB"; // the text in which we are searching

bool checkForPattern(int index, int patLength) {
    // checks whether pat matches txt starting at position index
    for (int i = 0; i < patLength; i++) {
        if (txt[index + i] != pat[i]) {
            return false;
        }
    }
    return true;
}

void findPattern() {
    int patternLength = pat.size();
    int textLength = txt.size();

    for (int i = 0; i <= textLength - patternLength; i++) {
        // check for every index if there is a match
        if (checkForPattern(i, patternLength)) {
            cout << "Pattern at index " << i << "\n";
        }
    }
}

int main() {
    findPattern();
    return 0;
}

KMP approach

This algorithm is based on a degenerating property: the fact that our pattern has some sub-patterns appearing more than once. This approach improves the complexity significantly, to linear time. The idea is that when we find a mismatch, we already know some of the characters in the next search window, so we save time by skipping comparisons that we already know will match. To know how far to skip, we pre-process an auxiliary array prePos over the pattern. prePos holds, for each position, the length of the longest proper prefix of the pattern that is also a suffix; these integer values tell us how many characters can be carried over after a mismatch.

#programming #data-science #coding #kmp-algorithm #algorithms

Beth Nabimanya


Algorithm trading backtest and optimization examples

Algorithmic trading backtests



#algorithms #optimization-examples #algorithm-trading-backtest #algorithm #trading-backtest

Genetic Algorithm (GA): A Simple and Intuitive Guide

Learn what metaheuristics are and why we sometimes use them instead of traditional optimization algorithms. Then learn the Genetic Algorithm (GA) metaheuristic and how it works through a simple step-by-step guide.

#genetic-algorithm #algorithms #optimization #metaheuristics #data-science

Lina Biyinzika


Introduction to Genetic Algorithm

What is Optimization?

  • Making something better.
  • Increasing efficiency.

Optimization problem

  • A problem in which we have to find the values of inputs (also called solutions or decision variables) from all possible inputs in such a way that we get the “best” output values.
  • Definition of “best”: the values of inputs that result in a maximum or minimum of a function called the objective function.
  • There can be multiple objective functions as well (depending on the problem).

Optimization algorithm

An algorithm used to solve an optimization problem is called an optimization algorithm.

Evolutionary Algorithms

Algorithms that simulate physical and/or biological behavior in nature to solve optimization problems.

Genetic Algorithm (GA)

  • It is a subset of evolutionary algorithms that simulates/models genetics and evolution (biological behavior) to optimize a highly complex function.
  • A highly complex function can be:
  1. Very difficult to model mathematically.
  2. Computationally expensive to solve, e.g. NP-hard problems.
  3. Involving a large number of parameters.

Background of GA

  • Introduced by Prof. John Holland in 1965.
  • The first article on GA was published in 1975.
  • GA is based on two fundamental biological processes:
  1. Genetics (G.J. Mendel, 1865): the branch of biology that deals with the study of genes, gene variation, and heredity.
  2. Evolution (C. Darwin, 1859): the process by which a population of organisms changes over generations.

Natural selection in Evolution

  1. A population of individuals exists in an environment with limited resources.
  2. Competition for those resources causes the selection of the fitter individuals that are better adapted to the environment.
  3. These individuals act as seeds for the generation of new individuals through recombination and mutation.
  4. The evolved new individuals act as the initial population, and steps 1 to 3 are repeated.

Nature-GA Analogy

Structure of GA

GA vs Traditional Algorithms

Applications of GA

  1. Acoustics
  2. Aerospace Engineering
  3. Financial Markets
  4. Geophysics
  5. Materials Engineering
  6. Routing and Scheduling
  7. Systems Engineering

Problems with GA

  1. Population Selection Problem
  2. Defining Fitness Function
  3. Premature or rapid convergence of GA
  4. Convergence to Local Optima

#evolutionary-algorithms #data-science #genetic-algorithm #algorithm