Here's the third episode (slightly updated to take account of some initial comments). Not anywhere near so exciting as the first two -- but after all that arm-waving generality, we do need to get our hands dirty looking at some actual formal theories of arithmetic, mildly tedious though that is! And you really ought to know, e.g., what Robinson Arithmetic is.
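For those who want the statement in front of them: Robinson Arithmetic Q is standardly presented as a first-order theory in the language with 0, successor S, addition and multiplication, governed by seven axioms (this is the familiar textbook formulation, not anything special to the handout):

```latex
\begin{align*}
&\text{(Q1)} \quad \forall x\, (Sx \neq 0)\\
&\text{(Q2)} \quad \forall x \forall y\, (Sx = Sy \to x = y)\\
&\text{(Q3)} \quad \forall x\, (x \neq 0 \to \exists y\; x = Sy)\\
&\text{(Q4)} \quad \forall x\, (x + 0 = x)\\
&\text{(Q5)} \quad \forall x \forall y\, (x + Sy = S(x + y))\\
&\text{(Q6)} \quad \forall x\, (x \times 0 = 0)\\
&\text{(Q7)} \quad \forall x \forall y\, (x \times Sy = (x \times y) + x)
\end{align*}
```

Weak though it looks (it can't even prove that addition is commutative), Q is already "sufficiently strong" in the sense that matters: it can represent every decidable property of numbers, which is just what the undecidability and incompleteness arguments require.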
Tuesday, October 20, 2009
As I said, I'm planning to blog, chapter by chapter, about Curtis Franks’s new book on Hilbert, The Autonomy of Mathematical Knowledge (all page references are to this book). Any comments on my comments will of course be welcome!
Let's take ourselves back to the "foundational crisis" at the beginning of the last century. Mathematicians have, over the preceding decades, freed themselves from the insistence that mathematics is tied to the description of nature: as Morris Kline puts it, "after about 1850, the view that mathematics can introduce and deal with arbitrary concepts and theories that do not have any immediate physical interpretation ... gained acceptance" (p. 11). And Cantor could write "Mathematics is entirely free in its development and its concepts are restricted only by the necessity of being non-contradictory and coordinated to concepts ... introduced by previous definition" (p. 9). Very bad news, then, if all this play with freely created concepts in fact gets us embroiled in contradiction!
As Franks notes, there are two kinds of responses that we can have to the paradoxes that threaten Cantor's paradise.
- We can seek to "re-tether" mathematics. Could we confine ourselves again to applicable mathematics which has, as we'd anachronistically put it, a model in the natural world so must be consistent? The trouble is we're none too clear what this would involve (remember, we are back at the beginning of the twentieth century, as relativity and quantum mechanics are getting off the ground, and any Newtonian confidence that we had about the structure of the natural world is being shaken). So put that option aside. But perhaps (i) we could try to go back to find incontrovertible logical principles and definitions of mathematical notions in logical terms, and try to reconstruct mathematics on a firm logical footing. Or (ii) we could try to ensure that our mathematical constructions are grounded in mental constructions that we can perform and have secure epistemic access to. Or (iii) we could try to diagnose a theme common to the paradoxical cases -- e.g. impredicativity -- and secure mathematics by banning such constructions. Of course, the trouble is that the logicist response (i) is problematic, not least because (remember where we are in time!) logic itself isn't in as good a shape as most of the mathematics we are supposedly going to use it to ground, and what might count as logic is obscure. Indeed, as Peirce saw, "a mature science like mathematics, with a history of successful elucidation and problem solving, was needed in order to develop logic" (p. 20); and indeed "all formal logic is merely mathematics applied to logic" (p. 21). The intuitionistic line (ii) depends on an even more obscure notion of mental construction, and in any case -- in its most worked out form -- cripples mathematics. The predicativist option (iii) is perhaps better, but still implies that swathes of seemingly harmless classical mathematics will have to be abandoned. So what to do? What foundational programme will rescue us?
- Well, perhaps we shouldn't seek to give mathematics a philosophical "foundation" at all. After all, the paradoxes arise within mathematics, and to avoid them we just ... need to do mathematics better. As Peirce -- for example -- held, mathematics risks being radically distorted if we seek to make it answerable to some outside considerations (from philosophy or logic) rather than being developed "by the continuous confrontation with and the creative solution of ordinary mathematical problems" (p. 21). And we don't need to look outside for a prior justification that will guarantee consistency. Rather we need to improve our mathematical practice, in particular improve the explicitness of our regimentations of mathematical arguments, to reveal where the fatal mis-steps must be occurring, and expose the problematic assumptions.
But Franks is having none of this. His Hilbert is a sort-of-naturalist like Peirce (in sort-of-Maddy's sense of "naturalist"), and he is firmly situated in the second camp. "His philosophical strength was not in his ability to carve out a position among others about the nature of mathematics, but in his realization that the mathematical techniques already in place suffice to answer questions about those techniques -- questions that rival thinkers had assumed were the exclusive province of pure philosophy. ... One must see him deliberately offering mathematical explanations where philosophical ones were wanted. He did this, not to provide philosophical foundations, but to liberate mathematics from any apparent need for them." (p. 7).
So there, in outline -- and we don't get much more than outline in Chap. 1 -- is the shape of Franks's Hilbert. So, now let's read on to Chap. 2 to see how well Franks makes the case for his reading. To be continued.
Posted by Peter Smith at 3:10 PM
Monday, October 19, 2009
On Saturday, from the new books stand at the CUP bookshop, I picked up a copy of Curtis Franks's The Autonomy of Mathematical Knowledge: Hilbert's Program Revisited.
Two quick grumbles. First, the book is short: just a hundred and ninety very generously spaced pages, maybe 60,000 words in all? Well, I'm all for short books, and I'm trying myself to write one now. But £45/$75? Much though I love CUP, that really is more than a tad extortionate (and I probably wouldn't have coughed up but for a big discount as a press author). Secondly, I can't say that I particularly like Franks's prose style, which tends to the unnecessarily flowery and slightly contorted, making you occasionally too aware of the medium rather than the message.
But having got those grumbles off my chest, let me say that the book looks very interesting indeed -- a must read for anyone interested in matters round and about Hilbert's Programme, which means pretty much any philosopher of mathematics. So order for your library today. And I plan to blog about this book, chapter by chapter, starting here tomorrow ... (promises, promises!).
Posted by Peter Smith at 11:40 AM
Saturday, October 17, 2009
As promised, Episode 2 of Gödel Without Tears (in which we prove sufficiently strong theories are undecidable and incomplete -- just like that!)
As explained, I'm writing these notes as just-after-the-event handouts for weekly lectures. And each week I'll be checking through the previous handout (and no doubt finding small corrections to make) before I give the next lecture. So here's the latest version of Episode 1, dated 16 October.
Posted by Peter Smith at 3:21 PM
Wednesday, October 14, 2009
The logic crew were minded to do some more modal logic. And, casting around for a modern book that might link up with recent stuff on e.g. second-order modal logic, I suggested that in our reading group we try Nino Cocchiarella and Max Freund's Modal Logic (OUP, 2008). Mea culpa. I confess I didn't look at it closely enough in advance. Today was the first meeting, and it fell to me to introduce the first couple of chapters.
This really is a poorly written book, and it is pretty difficult to imagine for whom it is written. Although it is subtitled "An introduction to its syntax and semantics", no one who hasn't already done some modal logic is going to get anything much out of the opening chapters. For this is written in that style of hyper-formalization and over-abstraction that philosophers writing logic books still too often affect. Why? Who is it supposed to impress? (It is as if the authors are trying to prove that they aren't really weedy soft-minded philosophers, but can play tough with the big boys. The irony is that the big boys, the good mathematicians, don't play the game this way.)
Here's a trivial example. If you or I were introducing a suitable language for doing propositional modal logic, we might say: OK, we need an unlimited supply of propositional atoms, and here they are, P, P', P'', P''', etc.; we want a couple of propositional connectives, say → and ¬; and the Box as a necessity operator. Then we'd remark, parenthetically, that of course the precise choice of symbolism is neither here nor there. Job done. For of course, sufficient unto the day is the rigour thereof.
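To make the contrast vivid, here's the sort of thing you or I would write down, rendered as a toy recursive datatype (a quick Python sketch of my own, emphatically not anything from Cocchiarella and Freund):

```python
from dataclasses import dataclass

# Atoms are P, P', P'', ... -- Atom(n) carries the number of primes.
@dataclass(frozen=True)
class Atom:
    primes: int

# The conditional, negation, and the Box as a necessity operator.
@dataclass(frozen=True)
class Cond:
    antecedent: object
    consequent: object

@dataclass(frozen=True)
class Neg:
    body: object

@dataclass(frozen=True)
class Box:
    body: object

def show(f) -> str:
    """Render a formula in the usual infix notation."""
    if isinstance(f, Atom):
        return "P" + "'" * f.primes
    if isinstance(f, Neg):
        return "¬" + show(f.body)
    if isinstance(f, Cond):
        return f"({show(f.antecedent)} → {show(f.consequent)})"
    if isinstance(f, Box):
        return "□" + show(f.body)
    raise TypeError(f"not a formula: {f!r}")

# An actual object-language modal sentence, at last:
print(show(Box(Cond(Atom(0), Atom(1)))))  # □(P → P')
```

Half a page of this and the job really is done: the precise choice of symbolism is neither here nor there, and we can get on with the logic.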
But Cocchiarella and Freund are having none of this. In fact they don't tell us what any actual modal language looks like. Rather they introduce some metalinguistic names for the atoms, whatever they are; and then there are three other symbols named c, n and l, whatever they might be, to serve as a conditional, negation and necessity operator. And the rest of the discussion proceeds at one remove, without us ever actually meeting an object language modal sentence. (Well, actually there's another problem: for on their account it would be jolly hard to meet one, as for them a modal sentence is a set of sets of sets of numbers and symbols. Despite their extreme pernicketiness about formal matters, they are cheerfully casual about identifying set-theoretic proxies with the real thing -- but let that pass.)
OK, what does their formalistic fussing get us? Nothing that I can see. The surface appearance of extra generality is spurious. And in fact, Cocchiarella and Freund soon stop any pretence at generality. For example, when the wraps are off, they require any logistic system based on the conditional and negation to have a bracket-free Polish grammar, where logical operators are prefix. And they require any derivation in such a system to be in linear Hilbert style, without rules of proof or suppositional inferences. Those requirements combined make most modal logical systems you've ever seen not count as such according to them.
Consider your old friend, von Wright's M. As we all learnt it in the cradle from Hughes and Cresswell, and ignoring the fact that they go for particular modal axioms and a rule of substitution rather than using axiom schemata, their system has two rules of inference, modus ponens and a rule of necessitation that allows us to infer □A if we've proved A from no assumptions. But such a rule of course isn't allowed if derivations all have to be Hilbert style, with conclusions always being derived by the application of rules to previous wffs, not to previous (sub)proofs. This means that Hughes and Cresswell's M is not a modal system according to Cocchiarella and Freund. And when they talk about M, since they only have modus ponens as an inference rule, they have to complicate the axioms, by allowing us to take any of Hughes and Cresswell's axioms and precede it by as many necessity operators as we want. They then prove what they call the rule of necessitation, which tells us that if there is a proof of A from no assumptions in their system M, then there is also a proof of □A in their system. But note, the C&F "rule of necessitation" is quite different from H&C's rule. In fact the C&F rule stands to H&C's rule pretty much as the Deduction Theorem stands to Conditional Proof.
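Schematically, the difference between the two "rules of necessitation" can be put like this (writing ⊢ for provability from no assumptions in the relevant system):

```latex
\begin{align*}
&\textbf{H\&C (a primitive rule of the system):}\\
&\qquad \text{from a proof of } A \text{ from no assumptions, infer } \Box A\\[4pt]
&\textbf{C\&F (a metatheorem about their system):}\\
&\qquad \text{if } \vdash_{M} A \text{ then } \vdash_{M} \Box A
\end{align*}
```

The first is a rule applied inside derivations; the second is a fact proved about the system from outside -- just as Conditional Proof is an in-system rule while the Deduction Theorem is a metatheorem about Hilbert-style systems.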
Now, I don't particularly object to Cocchiarella and Freund doing things this way. But I do object to their doing it this way without bothering to tell us what they are doing, how it relates to the more familiar way, and why they've chosen to do things their way. Why is the reader left trying to figure out which deviations from the familiar might be significant, and which not?
Anyway, we certainly weren't impressed. The grad students -- a very bright and interested bunch -- uniformly found the style rebarbative and entirely off-putting. There was no general will to continue. And democracy rules in the reading group!
Posted by Peter Smith at 8:34 PM
Monday, October 12, 2009
Here, as promised, is the first of a series of lecture handouts (roughly weekly, and about twelve in all) encouragingly titled Gödel Without Tears -- 1. As is the way with lecture handouts, this was dashed off at great speed, and I don't promise that this is free of either typos or thinkos. So do please let me know of any needed corrections, or indeed of any passage which is too unclear/could do with just a little amplification. Enjoy!
Later: I've already replaced the first version with a slightly better one ...
Posted by Peter Smith at 10:40 AM
Sunday, October 11, 2009
I should have mentioned before that Tim Gowers's blog is running instalments of a "conversation" on complexity lower bounds. It's structured as a dialogue between three characters: a cheerful mathematical optimist who likes to suggest approaches to problems, a more sceptical mathematician who knows a bit of theoretical computer science (and is tagged with a "cool" smiley), and an occasionally puzzled onlooker who chips in asking for more details and gives a few comments from the sidelines. We're just on instalment IV, and there are oodles of comments on the previous instalments.
This is fascinating stuff for philosophers of maths, in both form and content -- though I don't begin to pretend to be following all the ins and outs. In form, because it's always intriguing to see mathematical work-in-progress, exploring ideas, guesses, dead-ends (live mathematics as an activity, if you like, as opposed to the polished product presented according to the norms for "proper" publication). And in content, because you begin to get a sense of why something that initially seems as though it ought to be easy to settle (P = NP?) is really hard.
Posted by Peter Smith at 7:10 AM
Saturday, October 10, 2009
The Bach recital that Viktoria Mullova gave at the Wigmore Hall last week was simply terrific. Up there with my all-time great concerts, including some Brendel, Holzmair singing Die Schöne Müllerin, and the Lindsays (often). Mullova finished up playing the great Chaconne from the second Partita. And she didn't attack it as some do. As a reviewer said, "... many violinists try to match its immensity with a heroic sound. But Mullova often went the other way, becoming light and dancing where most violinists would be losing bow-hairs in an effort to wring a bigger sound from the instrument ... totally convincing." Certainly, she stunned the audience who sat in silence for some moments after she finished.
But the revelation for me was the two sonatas she played with Ottavio Dantone. I didn't know their recording of the sonatas on Onyx (I like Rachel Podger's recording quite a bit, and hadn't sought out another). But their performances last week bowled me over too, and so I sent off for the discs. And yes, hugely recommended!
Posted by Peter Smith at 9:13 PM
Well, that's the beginning of term survived, and I hope to pick up the philosophical threads here next week.
It's been back to first year logic lectures, for what I guess -- with retirement looming -- will be the penultimate time. The opening two lectures went tolerably well. Drat. Just getting the hang of doing this and I'm having to stop! Lecture pacing is an odd thing, though: there are fewer lectures in the course this year, and I need to push things on. So I've put the admin stuff in a hand-out, cut out some other slides from the Beamer presentations, and felt I was cracking on faster. Yet I'm exactly where I got to last year after two lectures. Ah well: maybe it is good not to put the foot on the accelerator too hard too soon. But we must push on next week.
The other course I'm starting this term, which I'm planning to repeat when I get to NZ, is a dozen lectures on Gödel's (Incompleteness) Theorems for third year undergrads and postgrads. This is much more difficult to get right. Last year, I just did talk and chalk, introducing chunks of my book. But that didn't really work: there was too much gap between what I had time to do in relaxed chat, and what's in the book. So maybe use Beamer presentations for this course too? After one class I think this isn't going to work either -- or at least, the effort put into writing the presentation would be much better used writing a couple of pages of lecture handouts as a more careful/comprehensive intro that can be followed up in the book, better filling the gap between lecture chat and the book. OK, down to it then, and I'll write some weekly handouts, Gödel Without Tears. Watch this space ...
The logical highpoint of the week, though, was the first Logic Seminar, where Fraser MacBride was talking about neo-logicism. He gave a terrific impromptu intro for the surprising number of third-years who turned up, quite innocent of the debates, and then he had a persuasive bash at the latest Hale/Wright effort, ‘The Meta-Ontology of Abstraction’. Fraser set the bar pretty high for the rest of term. Excellent stuff.
Posted by Peter Smith at 5:20 PM