Algorithms and Greatest Common Divisors

An algorithm is a sequence of steps used to solve a problem. An algorithm does not need to involve a computer; it could be the sequence of loops and knots that you use to tie your shoes, or the steps that you follow in the kitchen when baking your favorite cake. An admittedly lame party trick of your author is an algorithm that allows him to quickly determine the day of the week that your birthday falls on this year.

STOP: How might one be able to quickly determine the day of the week of your birthday, without needing to memorize a lot of information?

For a more quantitative example of an algorithm, long ago you learned a method for dividing any two integers by hand, called “long division”. This algorithm is almost 500 years old and exemplifies how many algorithms predating the computational era were developed to help us perform computations by hand. In fact, long division is so ingrained in most of us that it may seem strange to think of this or any other arithmetical method as an algorithm. Here are some students who have made learning long division enjoyable.

For another example from arithmetic, say that we would like to compute the greatest common divisor (GCD) of two integers. This task is natural to assign to a computer because computers are much faster and more accurate at arithmetical computations than we are, especially as the two input numbers grow. However, computers can only follow very precise instructions — at least for the time being! For this reason, we will need to be exact when we convert our ideas into instructions that a computer can understand; this is what we mean by programming the computer.

Before we can program a computer to take the GCD of two integers, we must specify exactly what the computer will take as input and exactly what it should return as output. Doing so leads us to the notion of a computational problem, which more generally is a precisely defined specification of input data along with the output that solves the problem on that input. The following computational problem is well defined because any two input integers have only a single GCD.

GCD Problem

Input: Integers a and b.

Output: The greatest common divisor of a and b, denoted GCD(a, b).
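Although we have not yet said anything about how to compute a GCD, the problem statement already determines the shape of a solution: whatever function we write must take two integers as input and return a single integer. Here is one way that this skeleton might look in Go (the function name TrivialGCD and the use of Go’s int type are our own choices, not part of the problem statement):

    package gcd

    // TrivialGCD is a skeleton of a solution to the GCD problem: the problem's
    // input (integers a and b) becomes the function's parameter list, and the
    // problem's output (their greatest common divisor) becomes its return value.
    // We fill in the body once we have an algorithm in hand.
    func TrivialGCD(a, b int) int {
        panic("algorithm not yet written")
    }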

In the GCD problem, the input integers a and b are called variables, since they can change depending on what values we want them to have. (Input variables in a computational problem are also often called parameters.) Although we chose a and b to name the input variables, they can be named essentially anything that we like; we could just as well have named them x and y, or airplane and rocket, or foo and bar.

Note: An analogy for variables and their values is to think of a variable as a box, and the variable’s value as the contents of that box. For example, when a function “returns” a variable, the function is really returning the value of the variable at that point in time. This distinction will soon prove useful when we start working with functions.
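For example, here is a tiny sketch of the box analogy in Go (the variable names and the function Double are invented purely for illustration):

    package main

    import "fmt"

    // Double stores twice its input in a local variable and then returns
    // that variable's value at the moment the return statement runs.
    func Double(n int) int {
        result := 2 * n // the variable result is the box; the value 2*n is its contents
        return result   // what comes back is the value, not the box itself
    }

    func main() {
        x := 21                // the box named x now holds the value 21
        fmt.Println(Double(x)) // prints 42; x itself still holds 21
        x = 7                  // placing a new value into the same box
        fmt.Println(Double(x)) // prints 14
    }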

A simple algorithm for computing GCD(a,b), and one you may have learned as a student, proceeds according to the following steps; a short code sketch appears after the list.

  1. Set a variable d equal to 1. This variable will represent the largest divisor common to a and b that we have found thus far.
  2. For every integer n between 1 and the minimum of a and b, we ask: “Is n a divisor of both a and b?” If the answer to this question is “Yes”, then we update the largest identified common divisor d to be equal to the current value of n.
  3. After ranging through all these integers, the current value of d must be GCD(a,b).
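To make these steps concrete, here is one possible translation of the algorithm into Go (the helper function Min is our own, and the sketch assumes that a and b are positive integers):

    package main

    import "fmt"

    // Min returns the smaller of two integers.
    func Min(a, b int) int {
        if a < b {
            return a
        }
        return b
    }

    // TrivialGCD tests every integer n between 1 and the minimum of a and b,
    // remembering the largest n found that divides both a and b.
    // It assumes that a and b are positive.
    func TrivialGCD(a, b int) int {
        d := 1 // largest common divisor found so far
        smaller := Min(a, b)
        for n := 1; n <= smaller; n++ {
            if a%n == 0 && b%n == 0 {
                d = n // n divides both a and b, so update d
            }
        }
        return d
    }

    func main() {
        fmt.Println(TrivialGCD(378, 273)) // prints 21
    }

Running this sketch on a = 378 and b = 273 prints 21, matching the worked example that follows.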

The table below illustrates this algorithm for a = 378 and b = 273. As we increase n, we keep track of whether n is a divisor of a or b. Values of n that are divisors of both a and b are the common divisors, collected in the table’s bottom row. After considering all possible n up to 273, we take the largest common divisor, 21, as the GCD of a and b.

Divisors of 378: 1, 2, 3, 6, 7, 9, 14, 18, 21, 27, 42, 54, 63, 126, 189, 378
Divisors of 273: 1, 3, 7, 13, 21, 39, 91, 273
Common divisors: 1, 3, 7, 21

Divisors of 378 (top row) and 273 (middle row), along with their common divisors (bottom row). The largest common divisor is 21.

However, for very large integers, this process will quickly become impossible for a human — imagine having to test every possible divisor of two 100-digit numbers! — and time-intensive even for a powerful computer.

This point, that computers have limitations, is a critical one that we will discuss in greater detail soon. For now, suffice it to say that the better the algorithms we provide to computers, the faster they can solve our problems.

What does it mean for one algorithm to be “better” than another? Is it possible to find the GCD of two numbers without having to consider all of their divisors? And how do the ancient Greeks fit into all this? We hope that we have intrigued you sufficiently to read on and find out!
