I finished John Naughton’s A Brief History of the Future: Origins of the Internet – an interesting diversion, to be sure, and a book worth reading (not least because it reminds you of how rapidly internet adoption has accelerated in the last decade).
By now I should be on to proper revision, but I indulged myself last week and bought a copy of P, NP, and NP-Completeness: The Basics of Computational Complexity, which I am now attempting to read between dozes in the garden (the weather in London is fantastic). The book is written in a sparse style but is even so rather more approachable than many texts (the author, Professor Oded Goldreich, rules out using “non-deterministic Turing machines”, for instance, saying they cloud the explanation).
But in his discussion of (deterministic) Turing machines he states:
Is that right? (Serious question, please answer if you can).
Surely the set of all strings has the cardinality of the reals?
If we had a set of strings like this (reconstructing the example – any listing of infinite 0/1 strings will do):

s_1 = 00000…
s_2 = 01010…
s_3 = 11011…

and so on…

Then surely the standard diagonalisation argument applies? (i.e. take the diagonal and switch the state of each member – set d_n = 1 − s_nn – and this string d, with elements d_1, d_2, …, cannot be in the original set, as it is guaranteed to differ from the n-th member of the set in its n-th position.) (See blog on diagonalisation.)
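The construction above can be sketched in a few lines of Python. This is only an illustration under my own assumptions: an "enumeration" is modelled as a hypothetical function `enumeration(i)` returning the i-th listed string, itself a function from position to bit, and we can only ever inspect a finite prefix of the diagonal string.

```python
def diagonal(enumeration, n):
    """Return the first n bits of the diagonal string for `enumeration`:
    flip the i-th bit of the i-th listed string (both 0-indexed)."""
    return [1 - enumeration(i)(i) for i in range(n)]

# Hypothetical enumeration for the demo: the i-th string is all zeros
# except for a 1 at position i.
listed = lambda i: (lambda j: 1 if j == i else 0)

d = diagonal(listed, 5)

# By construction d differs from the i-th listed string at position i,
# so no finite prefix check can ever place d in the enumeration.
for i in range(5):
    assert d[i] != listed(i)(i)

print(d)  # [0, 0, 0, 0, 0]
```

Note that the string `d` produced this way is itself infinite – which is, of course, exactly the kind of object the argument manufactures.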
In Naughton’s book he makes (the very valid) point that students of the sciences are generally taught that when their results disagree with the paradigm, then their results are wrong and not the paradigm: so what have I got wrong here?
Did you get any update on this? I read this post, and now I’m really interested in those questions.