The Roots of the Integers are Irrational

If the jth root of a positive integer n is not itself an integer, it is irrational.

Suppose the jth root of n is a rational number a/b, so that a^j = n*b^j. We may assume a and b are relatively prime; in other words, the fraction is reduced. Any prime p dividing b divides the right side of this equation, hence p divides a^j, and by unique factorization p divides a, contradicting the fact that a and b are relatively prime. So b has no prime divisors, which means b = 1, and the jth root of n is the integer a.
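
The theorem gives a mechanical test: the jth root of n is rational exactly when it is an integer, so an exact integer-root check decides the question. Here is a minimal Python sketch of that test (the function names integer_jth_root and jth_root_is_rational are my own, chosen for illustration), using binary search over exact integer arithmetic so no floating-point error can creep in.

def integer_jth_root(n: int, j: int) -> int | None:
    """Return a if a**j == n for some integer a >= 0, else None."""
    if n < 0 or j < 1:
        raise ValueError("n must be nonnegative and j positive")
    lo, hi = 0, max(n, 1)
    while lo <= hi:                 # binary search for an exact root
        mid = (lo + hi) // 2
        p = mid ** j
        if p == n:
            return mid
        if p < n:
            lo = mid + 1
        else:
            hi = mid - 1
    return None                     # n is not a perfect jth power

def jth_root_is_rational(n: int, j: int) -> bool:
    """By the theorem: the jth root of n is rational iff it is an integer."""
    return integer_jth_root(n, j) is not None

print(jth_root_is_rational(2, 2))    # False: the square root of 2 is irrational
print(jth_root_is_rational(27, 3))   # True: the cube root of 27 is 3

The binary search keeps everything in integers, so the verdict is exact for arbitrarily large n, unlike a test built on floating-point roots.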

This caused some concern in ancient Greece, when people realized that something perfectly ordinary, like the diagonal of a unit square, could have a length (the square root of 2) that was not expressible as a fraction. It was not a ratio (ratio being the Latin word for reason). Even today we call these numbers irrational, as though they made no sense. And when we start imagining the square root of -1, well, that's imaginary. And mixing real and imaginary numbers, that's pretty complex. The words actually reflect the evolution of mathematics.