Well, back in those days of childhood, in those cheap books bought from the wheeled carts at railway stations, and out of my curiosity about the inquisitiveness of men in general, I had read that name: Dr. Alan M. Turing. The mention came against the term “Computer”, in the adjacent column of a table under the section titled “Inventors and Inventions”. Another column alongside read 1944, with the label ‘Year’ at the top.
It’s not that I remember all of those today, but the ones that seem to be the marvels of human intellect, I do. Thomas Alva Edison and the Bulb, John Logie Baird and the Television, Alexander Graham Bell and the Telephone, I remember even today. The Computer was no less miraculous a human invention.
But there was always this question: what was supposed to be called a Computer? Was it the combination of the visual display unit, the cabinet with wires going in and out, and the peripherals like the keyboard and mouse that I saw in the so-called laboratory rooms at school? Was that the computer Dr. Turing had built? No, definitely not. That is supposed to be called a Personal Computer, and IBM, arguably, first released its designs; Microsoft built on it; and Apple, as you see it today, revolutionized it.
From my earliest lessons in Computers at school (Standard 5, lucky enough), I had learnt about the Generations of Computers, transcending from big, hefty mechanical boards to vacuum tubes and up to the advent of silicon chips, which made it possible for information to be processed and communicated at lightning speed and at microscopic levels of space. So I knew it was not the computers we see today that Dr. Turing had invented. It was something else.
So what is a Computer? Technically, a machine that could compute. So could the Abacus, a kids’ toy seen until the last few decades, with beads strung on wooden rods. It was used by the Egyptians in ancient times to solve problems of addition and multiplication. It is said that some Chinese even today make use of it and can perform calculations faster than most other humans would with the usual methods of long addition and multiplication taught in schools. Forget about the toys: Blaise Pascal’s Adding Machine could add integers, John Napier’s Bones could even multiply them, and Herman Hollerith’s Tabulating Machine did something more computational, I don’t remember what. I remember all these from the books at school. And the last thing I should not forget to mention I remember is the picture of that bald man, framed and hung close to the ceiling of the computer laboratory, put beside another of a Pokémon trying to eat up a computer, a mockery of a virus. That man was called Charles Babbage, and to his name was attached the designation of the Father of the Computer. All these great men and their miraculous inventions, each bearing the potential to be termed a computer, passed through the timeline much before Dr. Turing and his device did. So why did the cheap book from the railway-station wheeled carts mention him as the inventor of the Computer? Ah, why worry so much about cheap books. They are in no way reliable.
So, school was finally over and I was lucky enough to major in Computer Science at college. In my further lessons in the discipline of the “Theory of Automata”, I learnt about the first principles of machines that could compute, the first principles of computation in general, and the first principles of what qualifies to be called a computing machine, or optimistically, a Computer. All such principal machines conceptualised a physical prototype of a device that made mechanical changes in response to external input, processing it at the same time and halting to provide the solution (although not all such machines were guaranteed to halt for every set of problem inputs).
All such machines were restricted in the kind of computations they could perform, the kind of problems they could solve and the kind of input they could process. Each catered to a specific subset of the universe of computational problems. Together, they formed the basic fundamentals of formalising the computation of a problem, known in modern terms as an algorithm. But it was only his hypothetical device, the Turing Machine, which could conceptually claim to carry out any computational process (any algorithm) within practical limitations of time and space.
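The idea is simple enough to sketch in a few lines of Python: a finite set of states, a tape of symbols, a read/write head, and a transition table. This is a minimal, hypothetical illustration of my own (the function names and the little bit-flipping program below are not Turing’s notation), not a full formal construction:

```python
# A minimal sketch of a single-tape Turing machine: a finite control,
# an unbounded tape, and a transition table mapping
# (state, symbol) -> (symbol to write, head move, next state).

def run_turing_machine(tape, transitions, start_state, halt_state, blank="_"):
    """Simulate the machine until it enters halt_state, then return the tape."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head, state = 0, start_state
    while state != halt_state:
        symbol = cells.get(head, blank)
        write, move, state = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    # Read back the visited portion of the tape, trimming blanks.
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# A toy "program": in state "scan", flip 0 <-> 1 and move right; halt on blank.
flip_bits = {
    ("scan", "0"): ("1", "R", "scan"),
    ("scan", "1"): ("0", "R", "scan"),
    ("scan", "_"): ("_", "R", "done"),
}

print(run_turing_machine("1011", flip_bits, "scan", "done"))  # prints 0100
```

Swapping in a different transition table yields a different algorithm on the same machinery, which is exactly the sense in which the Turing Machine is universal over algorithms.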
So, it was a contribution worthy of earning him the title of the inventor of the Computer.
But until I watched the movie, ‘The Imitation Game’, I never knew of his actual contribution to the world of Computer Science, and to humanity itself. ‘Christopher’, the machine, the real physical one (known historically as the Bombe), capable of breaking the ciphers generated by the machine ‘Enigma’, which the Nazis used to encrypt their messages, is supposed to have shortened the Second World War by two years and saved millions of lives. His team, a bunch of cryptanalysts, had been tasked with breaking the codes by manually inspecting each one of them. It was his foresight to think of a machine which could, in real time, process and break each and every message without the intervention of humans and their limitations of computational speed and capacity.
Brilliant, I would say to end with, both the man and the movie. Hats off, Dr. Alan Turing.
No wonder he’s termed the Father of Theoretical Computer Science and Artificial Intelligence.