Written: April 25, 2010

Seven years ago, I really enjoyed
Amir Alexander's *Geometrical Landscapes*, and
was eagerly awaiting his next book. Finally it appeared,
and I was not disappointed! It is even better than *Geometrical Landscapes*. Like its predecessor,
it can be read on at least two levels. On the "lower" level it tells gripping stories of the lives
and struggles of mathematical geniuses, while on the "higher" level it illustrates so well the human side
of the Platonic "Queen of Science", and how even mathematics is full of myths, personality cults, and
over-simplistic stereotypes, and how these perceptions, both from the outside and from within, shape the actual
*body* of mathematical knowledge.

But this implies an even deeper truth! *Our* mathematics is an accidental outcome of the
*random walk* of history, and would have been very different with a different
historical narrative. Even if, for the sake of argument, there is an "objective" mathematics out there,
independent of us (or of the creatures on the fifth planet of star number 130103 in Galaxy number 4132,
who are far smarter than us), whatever tiny fraction of it that *we* (or even our smarter
colleagues from that galaxy) could have discovered, is entirely a fluke of history.

Another fluke of history is that the computer didn't exist in Euclid's, Gauss's, and even Hilbert's time.
Lots of mathematics, for example the search for "closed-form", or "analytic"
expressions, and good,
simple approximations, and the obsession with *linearization*,
came from the fact that we only had paper-and-pencil.
It would be a great intellectual exercise to "rewind" mathematics, and start it all from scratch,
taking advantage of the great computing power that we now have, and see how different it would be.
Had the computer been invented 2500 years ago, mathematics would have been **so** different.
Conversely, had the computer been invented 100 years from now, our present mathematics would
also be quite different (but not as much).

And indeed, the computer did make some changes, but most mathematical papers, and mathematical talks, are still in the narrative, semi-formal, humanese style, what David Ruelle calls "a dance around a formal proof".

Take, for example, Galois theory. It was motivated by the stupid quest to solve the quintic in the highly inefficient and artificial format of "radicals", and by the dumb questions of whether you can trisect an angle, or double a cube, with ruler and compass; or the fact that Pi is transcendental, which came from the even dumber question of whether you can square the circle; or Non-Euclidean Geometry, which came from trying to (stupidly!) "prove" the parallel postulate. Of course, neither Galois theory, nor transcendence theory, nor Non-Euclidean Geometry, is stupid; they are beautiful theories, and it is possible that other, equally stupid, questions would have led to them as well, but then again it is very possible that even stupider questions would have led to even more beautiful and deep theories. Who knows?

The concluding chapter of Amir Alexander's book is particularly
intriguing.
He first summarizes the book, reviewing the
18th-century "Natural Man" persona (epitomized by d'Alembert),
followed by the 19th-century "tragic martyr" myth
(most notably Abel, Galois, and (surprisingly!) Cauchy),
that drifted into the 20th century
in the figures of Ramanujan and Nash, and
into the 21st century in the figure of
Grisha Perelman.
He then wonders what the image of the future mathematician will be.
Alexander pessimistically speculates that it will be the
unappealing image of the "computer-whiz-hacker",
modeled after the "revenge of the nerds".
I hope that he is wrong, and that the future prototypical
mathematician would *not* be a vengeful geek,
so let me propose two other personality types,
neither tragic nor malicious, and much nicer than a power-hungry
nerdy hacker.

One is that of the *mathematical software engineer*, who does not have the brilliant powers
and flashes of insights of a Galois, an Abel, a Ramanujan, or a Perelman, but has nevertheless
a deep understanding of what mathematics is all about, and how to get out of the computer
as much mathematical knowledge as possible, without being hung up on perfect rigor or, just
as badly, on its complete absence, but striking a middle ground of semi-rigor and diversity,
adjusting the level of rigor according to the importance of the results and the
available computer resources. That genial figure would still work for the love of math,
but experience and maturity would play a greater role in his or her success
than brilliant insights. In particular, mathematics would cease
to be a "young man's game".

However, not all mathematics would be computerish! Sure,
eventually all *serious* mathematics would be
computer-generated
(although for the next one hundred years, mostly computer-assisted),
but human mathematics
would become a *sport*, and there would be new kinds of mathematical heroes, and celebrities, just like
in football, basketball, and baseball, and they would make lots of money. These future Art Benjamins
would not only do four-digit-multiplications in their heads, but prove Fermat's Last Theorem and
the Poincaré conjecture by only using pencil-and-paper (and ultimately, completely mentally),
without resorting to computers (and hopefully without steroids either). Then mathematicians would really
become household names, and even if they would not be as popular as baseball players, at
least they would be on a par with chess grandmasters.

So let's hope!

P.S.: Here are some minor errata.

Opinions of Doron Zeilberger