Gordon Matlala


Merge All Your History

Sometimes you end up creating too many commits, and it is better to squash them into one.

Git does not have a dedicated squash command; instead, you use git rebase with the interactive flag.

To make a proper squash, you need to specify the commit you want to start from.
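As a minimal, self-contained sketch (the repo, file and commit names below are purely illustrative), here is what squashing the last three commits looks like. Interactively you would run git rebase -i HEAD~3 and change "pick" to "squash" on every line except the first; in this sketch the editor steps are scripted (using GNU sed) so the same rebase runs unattended:

```shell
set -e
# Throwaway repo with one base commit plus three small commits to squash.
git init -q squash-demo && cd squash-demo
git config user.email demo@example.com
git config user.name Demo
git commit -q --allow-empty -m "initial"
for i in 1 2 3; do echo "$i" > "file$i.txt"; git add .; git commit -q -m "commit $i"; done
git rev-list --count HEAD   # prints 4

# Squash the last three commits into one. The scripted sequence editor
# changes "pick" to "squash" on all lines but the first, and GIT_EDITOR=true
# accepts the default combined commit message.
GIT_SEQUENCE_EDITOR="sed -i '2,\$s/^pick/squash/'" GIT_EDITOR=true git rebase -i HEAD~3
git rev-list --count HEAD   # prints 2: "initial" plus the single squashed commit
```

Note that the files from all three commits survive; only the history is compressed into one commit whose message defaults to the combined messages of the squashed commits.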


Enos Prosacco


History MCQs to Test Your History Knowledge

The beginning of the medieval period is usually taken to be the gradual breakdown of the Gupta Empire from around 480 to 550, ending the “classical” period as well as “ancient India”, although both these terms may be used for periods with widely varying dates, especially in particular fields such as the history of art or religion. At least in northern India, there was no larger state until perhaps the Delhi Sultanate, or certainly the Mughal Empire. By 1413, the Tughlaq dynasty had completely declined, and a neighbouring governor captured Delhi, marking the start of the Sayyid Dynasty. In 1398, Timur invaded India and plundered Indian wealth. On his way back, he appointed Khizr Khan as the governor of Delhi.

#indian history tutorials #history mcqs #indian history mcqs #indian history quiz

Julie Donnelly


History of Computing PtI

We learn and (hopefully) know a basic history of the world, particularly major events like the French Revolution, the American Civil War, World War I, World War II (wow, lots of wars), the Space Age, etc. It is important to understand the concepts of these and many other historical events. Being able to recall the start year or the exact details of how such events unfolded is one thing, but on a human level, it is more important to understand the rationale, lessons and philosophy of major events. Ultimately, history teaches us what makes us innately human. Furthermore, understanding history helps us realise how and why we operate today. History provides the context for today. It makes today seem ‘obvious,’ ‘justifiable’ and ‘logical’ given the previous events that unfolded.

So, following this thread of logic, understanding the history of computers should help us understand how we have got to today. A today when computers mediate much of our communication with one another. A today where computers and screens are stared at for many (and often a majority) of our waking hours (especially during Covid). A today where the thought of working, socialising or learning without a computer would be an affront and a disadvantage. Just as major events like World War II and the Cold War have greatly contributed to today’s political and social climate, I would argue computers influence just as much (if not more) of our daily lives.

Therefore, it is important for us to understand the evolution of computers in order to see where we may be heading in our relationship with them.

I would like to preface the following articles outlining the history of computers by saying that this is in no way an exhaustive history of the origin of computers. Some major events have been glossed over, while other meaningful contributions have been omitted entirely.

Whilst the thought of history may make some eyes automatically glaze over, I will try to make the following series as painless and exciting as possible. While I paint a story of linear progress in computation, this is hindsight bias in action. We like to create a story of history, attributing importance to some events and not others, when in reality, as these events were unfolding (and continue to unfold), it was not always obvious what was a gigantic discovery. It is only now, with some distance, that we can appreciate past events. This means that perhaps in ten years this recount will emphasise other features and neglect some of the stories we today find so foundational to the computer’s creation.

With all this in mind, let’s begin!

The first computers

Since their inception, computers have taken over human work by performing tedious, complex and repetitive tasks. Interestingly, the word computer initially described humans! Computers were originally humans (often women) who were able to perform complex mathematical computations, usually with pen and paper. Often teams would work on the same calculation independently to confirm the end results. It is interesting to note that when electronic computers were first developed, they were referred to as such — electronic computers. With time, as electronic computers became more and more pervasive and powerful, it was the human computer that was deemed obsolete and inefficient. The “electronic” was dropped, and now when we discuss computers we think of nothing else besides our graceful and versatile electronic tools. It is important to keep the computer’s mathematical origins in mind, as we will see they only further emphasise the never-imagined pervasiveness and uses of computers today.

Our story begins with the humble abacus, generally considered the first computer. When researching, I was puzzled as to how an abacus could be considered a computer. Luckily my curiosity was settled by a quick Google search (thank you, Google). Google was even able to suggest my search before I finished typing ‘Why is the abacus considered the first computer’! I ended up on trusty Quora, where one user, Chrissie Nysen, put things simply: “Because it is used to compute things.” Though estimates vary, the abacus is thought to have originated in Babylon approximately 5,000 years ago. The role of the abacus was to ease simple mathematical calculations: addition, subtraction, division and multiplication. In this sense, we can consider the abacus a simple calculator. As farming, produce and populations increased in size, the abacus allowed the educated to more easily manage logistics. After the abacus, the computer’s evolution remained dormant for some time…

#history #computer-science #history-of-technology #computers #computer-history #data science

A History Of Artificial Intelligence — From the Beginning


In the seminal paper on AI, titled “Computing Machinery and Intelligence,” Alan Turing famously asked: “Can machines think?” — or, more accurately, can machines successfully imitate thought?

70 years later, the answer is still “no,” as no machine has passed the Turing test.

Turing clarifies that he’s interested in machines that “are intended to carry out any operations which could be done by a human computer.” In other words, he’s interested in complex digital machines.

Since the achievement of a thinking digital machine is a matter of the evolution of machines, it stands to reason that we should start at the beginning of machine history.

The History of Machines

A machine is a device that does work. In engineering terms, work means transferring energy from one object to another. Machines enable us to apply more force, and/or do it more efficiently, resulting in more work being done.

The evolution of Boston Dynamics’ robots from 2009 to 2019.

Modern machines — like the above Boston Dynamics robot, Atlas — use hundreds of parts, including hydraulic joints, pistons, gears, valves, and so on, to accomplish complex tasks such as self-correcting stabilization, or even backflips.

Simple Machines

However, “simple machines” fit our earlier definition as well, including wheels, levers, pulleys, inclined planes, wedges, and screws. In fact, all mechanical machines are made of some combination of those six simple machines.

Atlas is not just a mechanical machine, but also a digital one.

Simple mechanical machines are millions of years old. For instance, “stonecutting tools [a type of wedge] are as old as human society,” and archaeologists have found stone tools “from 1.5 to 2 million years ago.”

Complex Machines

Combinations of simple machines could be used to make everything from a wheelbarrow to a bicycle to a mechanical robot.

In fact, records of mechanical robots date back to over 3,000 years ago.

The Daoist text Lieh-tzu, written in the 5th century BCE, includes an account of a much earlier meeting between King Mu of the Zhou Dynasty (1023–957 BCE) and an engineer named Yen Shi. Yen Shi presented the king with a life-sized, human-shaped mechanical automaton:

“The king stared at the figure in astonishment. It walked with rapid strides, moving its head up and down, so that anyone would have taken it for a live human being. The artificer touched its chin, and it began singing, perfectly in tune. He touched its hand, and it began posturing, keeping perfect time… As the performance was drawing to an end, the robot winked its eye and made advances to the ladies in attendance, whereupon the king became incensed and would have had Yen Shi executed on the spot had not the latter, in mortal fear, instantly taken the robot to pieces to let him see what it really was. And, indeed, it turned out to be only a construction of leather, wood, glue and lacquer…”

Mechanical heart diagram. Date unknown.

The king asked: “Can it be that human skill [in creating a machine] is on a par with that of the great Author of Nature [God]?”

In other words, Turing’s question of whether machines can imitate humans is actually thousands of years old.

At the same time, Greek scientists were creating a wide range of automata. Archytas (c. 428–347 BC) created a mechanical bird that could fly some 200 meters, described as an artificial, steam-propelled flying device in the shape of a bird.

“Archytas made a wooden model of a dove with such mechanical ingenuity and art that it flew.”

Some modern historians believe it may have been aided by suspension from wires, but in any case, it was a clear attempt to create a machine.

Another figure from Greek legend, the craftsman Daedalus, was said to have created statues that moved:

“Daedalus was said to have created statues that were so lifelike that they could move by themselves.”

The “first cuckoo clock” was described in the book The Rise and Fall of Alexandria: Birthplace of the Modern World (page 132):

“Soon Ctesibius’s clocks were smothered in stopcocks and valves, controlling a host of devices from bells to puppets to mechanical doves that sang to mark the passing of each hour — the very first cuckoo clock!”

Over the centuries, more and more complex contraptions were used to create automata, such as wind-powered moving machines.

#history-of-technology #artificial-intelligence #history

Git Merge: A Git Workflow Explained

What is Git Merge?

Merge is a command used in Git to move the changes from one branch to another. Usually, new features are developed in the dev branch and merged into the master branch after development is finished. All the changes in the dev branch are added to the master branch on the merge, but the dev branch itself is unaffected.
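As a minimal, self-contained sketch of that idea (the repo, branch, file and commit names below are purely illustrative), here dev is merged into master while dev itself is left untouched:

```shell
set -e
# Throwaway repo with a single commit on master.
git init -q merge-demo && cd merge-demo
git config user.email demo@example.com
git config user.name Demo
echo "base" > app.txt
git add app.txt && git commit -q -m "initial commit"
git branch -M master          # ensure the branch is named master

git checkout -q -b dev        # develop the feature on dev
echo "feature" >> app.txt
git commit -q -am "add feature"

git checkout -q master        # switch back to master and merge
git merge dev                 # master now contains dev's commit; dev is unchanged
```

Because master had no new commits of its own here, Git simply fast-forwards; if the two branches had diverged, the merge would instead create a merge commit combining both histories.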


How to do a Git Merge

Let’s do a Git merge step by step to understand how it works. Apart from the merging part, most steps, from cloning the repo to publishing the changes, are the same as in the Git Rebase tutorial, because we are doing the same thing in a different way.

Step 1: Fork and clone the desired repo

Let’s reuse our rebase-demo repository for this. Go to https://github.com/kdanW/rebase-workflow-demo and click the ‘Fork’ button in the top right-hand corner. Now go to your forked repo, click the ‘Clone or Download’ button and copy the link shown.


Now go to a directory of your preference and type the following command in the terminal to clone the repo onto your local machine.

git clone https://github.com/<YOUR_USERNAME>/rebase-workflow-demo

#git-merge #git-workflow #github #merge #git

Lina Biyinzika


History of AI: Timeline, Advancement & Development

Artificial intelligence is a young domain of about 60 years, comprising a set of techniques, theories, sciences, and algorithms that emulate human intelligence. Artificial intelligence plays a very significant role in our lives. The revolution in industry has brought many developments to business through the implementation of artificial intelligence. In this blog, we will discuss an outline of the history of artificial intelligence.

What is Artificial Intelligence?

Artificial intelligence is defined as the ability of a machine to perform tasks and activities that are usually performed by humans. It gathers and organizes vast amounts of data to produce useful insights. It is also known as machine intelligence, and it is a domain of computer science.

#artificial intelligence #ai #history #history of ai