Re: Paper and slides on indefiniteness of CH

Dear Hugh,

I don’t think that we have a big disagreement here. The Friedman-Holy models are certainly not canonical and I agree that a key question is whether there are canonical, fine-structural inner models for large cardinals.

But no unverified hypotheses are needed to create the Friedman-Holy models, and they witness the compatibility of arbitrary large cardinals with a key component of fine-structure theory: acceptability. We also force Global Square, and I suspect that these models can be built as inner models using “internal consistency” arguments for reverse Easton forcings. (Of course the real motivation for the models was to use ideas of Neeman to show that a degree of supercompactness is a “quasi” lower bound on the consistency strength of the Proper Forcing Axiom.)

Best,
Sy

Re: Paper and slides on indefiniteness of CH

Dear Sy,

For the uninitiated I think we should be clear about the difference between what has been proved and what has only been conjectured. As I understand it:

1. The Friedman-Holy models actually exist (they can be forced) and fulfill John’s 3 conditions.

I do not agree here. I do not think that the Friedman-Holy models are a starting point for fine-structure. That was the point I was trying to make.

For example, (global) strong condensation does not even imply \square_{\omega_1} and \square is a central feature of fine-structure.

For me anyway, fine-structure is a feature of canonical models and it is the canonical models which are important here. Identified instances of fine-structure can be forced but this in general is a difficult problem.  Could one create a list of such features, force them all while retaining all large cardinals, and finally argue that the result is a canonical model?

This seems extremely unlikely to me.

2. The models you are discussing are only conjectured to exist. In the final paragraph above you hint at a way of actually producing them.

3. If your models do exist then they also fulfill John’s 3 conditions but have condensation properties which are rather different from those of the Friedman-Holy models.

You are giving “my” models far more relevance here than they deserve.

Very technical point here: The models only reach the finite levels of supercompact and so do not fulfill John’s conditions. In fact, fine-structural extender models can never work, since any such model is always a generic extension once one is past the level of one Woodin cardinal. One needs the hierarchy of fine-structural strategic-extender models. Though the latter occur naturally, they have been much more difficult to construct explicitly. For example, it is not yet known (as far as I know) whether there can exist such an inner model at the level of a Woodin limit of Woodin cardinals, no matter what iteration hypothesis and large cardinals one assumes in V.

If the Ultimate L Conjecture is true then V = Ultimate L meets John’s conditions. If this conjecture is false, or more generally if there is an anti-inner model theorem (say at supercompact), then the Friedman-Holy models and their generalizations may be the best one can do (and to me this is the essence of the inner model versus outer model debate).

But as you know the history of inner model theory has been full of surprises, and in particular we can’t just assume that the iterability hypotheses will be verified. For this reason, I do think it important to be clear about what has been proved and what has only been conjectured.

Of course I agree with this last point. The entire theory of iterable models at the level of measurable Woodin cardinals and beyond could be vacuous, and not because of an inconsistency. But it has not yet happened that a developed theory of canonical inner models has turned out to be vacuous. Will it happen? That is an absolutely key question right now.

Regards,
Hugh

Re: Paper and slides on indefiniteness of CH

Dear Hugh,

On Sun, 31 Aug 2014, W Hugh Woodin wrote:

The condensation principle that you force (local club condensation) actually does not hold in fine-structural models for the finite levels of supercompact which have been constructed (assuming the relevant iteration hypothesis). There are new fine-structural phenomena which happen in the long-extender fine structure models and which do not have precursors in the theory of short-extender models. (These models are generalizations of the short-extender models with Jensen indexing, the standard parameters are solid etc.)

When you say “Jensen indexing” do you mean the one that I proposed: index at the successor of the image of the critical point?

At the same time these models do satisfy other key condensation principles such as strong condensation at all small cardinals (and well past the least weakly compact). I believe that it is still open whether strong condensation can be forced even at all the \aleph_n’s by set forcing. V = Ultimate L implies strong condensation holds at small cardinals and well past the least inaccessible.

Very interesting! I guess we provably lose strong condensation at the level of \omega-Erdős, but it would of course be very nice to have it below that level of strength.

Finally the fine structure models also satisfy condensation principles at the least limit of Woodin cardinals which imply that the Unique Branch Hypothesis holds (for strongly closed iteration trees) below the least limit of Woodin cardinals. If this could be provably set forced (without appealing to the \Omega Conjecture) then that would be extremely interesting since it would probably yield a proof of a version of the Unique Branch Hypothesis which is sufficient for all of these inner model constructions.

For the uninitiated I think we should be clear about the difference between what has been proved and what has only been conjectured. As I understand it:

  1. The Friedman-Holy models actually exist (they can be forced) and fulfill John’s 3 conditions.
  2. The models you are discussing are only conjectured to exist. In the final paragraph above you hint at a way of actually producing them.
  3. If your models do exist then they also fulfill John’s 3 conditions but have condensation properties which are rather different from those of the Friedman-Holy models.

But don’t your models, if they exist, have some strong absoluteness properties that the Friedman-Holy models are not known to have? That’s why I suggested that John’s list of 3 conditions may have been incomplete.

Hugh, it is wonderful that you have the vision to see how inner model theory might go, and the picture you paint is fascinating. But as you know the history of inner model theory has been full of surprises, and in particular we can’t just assume that the iterability hypotheses will be verified. For this reason, I do think it important to be clear about what has been proved and what has only been conjectured.

Thanks,
Sy

Re: Paper and slides on indefiniteness of CH

Dear Sy,

I guess I need to weigh in here on your message to Steel.

You then advertise Hugh’s new axiom as having 3 properties:

  1. It implies core existence.
  2. It suggests a way of developing fine-structure theory for the core.
  3. It may be consistent with all large cardinals.

Surely there is something missing here! Look at my paper with Peter Holy: “A quasi-lower bound on the consistency strength of PFA”, to appear, Transactions of the American Mathematical Society. (I spoke about it at the 1st European Set Theory meeting in 2007.)

We use a “formidable” argument to show that condensation with acceptability is consistent with essentially all large cardinals. As we use a reverse Easton iteration the models we build are the “cores” in your sense of their own set-generic multiverses. And condensation plus acceptability is a big step towards a fine-structure theory. It isn’t hard to put all of this into an axiom so our work fulfills the description you have above of Hugh’s axiom.

The condensation principle that you force (local club condensation) actually does not hold in fine-structural models for the finite levels of supercompact which have been constructed (assuming the relevant iteration hypothesis). There are new fine-structural phenomena which happen in the long-extender fine structure models and which do not have precursors in the theory of short-extender models. (These models are generalizations of the short-extender models with Jensen indexing, the standard parameters are solid etc.)

At the same time these models do satisfy other key condensation principles such as strong condensation at all small cardinals (and well past the least weakly compact). I believe that it is still open whether strong condensation can be forced even at all the \aleph_n’s by set forcing. V = Ultimate L implies strong condensation holds at small cardinals and well past the least inaccessible.

Finally the fine structure models also satisfy condensation principles at the least limit of Woodin cardinals which imply that the Unique Branch Hypothesis holds (for strongly closed iteration trees) below the least limit of Woodin cardinals. If this could be provably set forced (without appealing to the \Omega Conjecture) then that would be extremely interesting since it would probably yield a proof of a version of the Unique Branch Hypothesis which is sufficient for all of these inner model constructions.

Regards,
Hugh

Re: Paper and slides on indefiniteness of CH

Dear John,

On Mon, 11 Aug 2014, John Steel wrote:

If certain technical conjectures are proved, then “V = Ultimate L” is an attractive axiom. The paper “Gödel’s program”, available on my webpage, gives an argument for that.

Many thanks for mentioning this paper of yours, which I read with great interest. It would be very helpful if you could clarify a few points made in that paper, especially with regard to interpretative power, practical completeness, your multiverse discussion and Woodin’s new axiom. I also have a few more minor questions which I’ll stick at the end.

Page 6

Footnote 10: I think it’s worth mentioning that for reflexive theories, S is interpretable in T iff T proves the consistency of S’s finite subtheories. So in fact for “natural” theories there really is no difference between consistency strength and interpretative power. This comes up again later in the paper.
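Schematically, writing S\upharpoonright n for (say) the first n axioms of S, Orey’s theorem reads, for reflexive T (i.e., T proves the consistency of each of its own finite subtheories):

\[
S \text{ is interpretable in } T \iff T \vdash \mathrm{Con}(S\upharpoonright n) \text{ for every } n.
\]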

You state:

If T, U are natural theories of consistency strength at least that of ‘there are infinitely many Woodin cardinals’ then the consequences of T in 2nd order arithmetic are contained in those of U, or vice-versa.

I don’t agree. First recall a milestone of inner model theory:

If the singular cardinal hypothesis fails then there is an inner model with a measurable cardinal.

From this it is clear that ZFC + “there is an inner model with a measurable cardinal” is a natural theory. Similarly for any large cardinal axiom LCA the theory ZFC + “There is an inner model satisfying LCA” is a natural theory. But this theory has the same consistency strength as ZFC + LCA yet fails to prove \Pi^1_3 sentences which are provable in ZFC + LCA. (By Shoenfield it does prove any \Sigma^1_3 sentence provable in ZFC + LCA.) Nor does it prove even \Pi^1_1 determinacy.
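To spell out the Shoenfield step (a sketch, with M the inner model satisfying LCA): a \Sigma^1_3 sentence has the form \exists x\, \varphi(x) with \varphi \in \Pi^1_2, and \Pi^1_2 statements about reals of M are upward absolute from M to V by Shoenfield, so

\[
\mathrm{ZFC} + \mathrm{LCA} \vdash \exists x\, \varphi(x) \;\Longrightarrow\; M \models \exists x\, \varphi(x) \;\Longrightarrow\; V \models \exists x\, \varphi(x).
\]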

So what definition of natural do you have in mind here? The fact is, and this comes up later in your paper, there are many theories equiconsistent with theories of the form ZFC + LCA which do not prove the existence of large cardinals.

Footnote 11: Another issue with natural. Consider the theories PA + “the least n at which the Paris-Harrington function is not defined is even”, and the same theory with “even” replaced by “odd”. These theories are equiconsistent with PA yet contradict each other.
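In symbols (a schematic rendering, writing \mathrm{PH}(n)\uparrow for “the Paris-Harrington function is undefined at n”):

\[
T_{\text{even}} = \mathrm{PA} + \text{“the least } n \text{ with } \mathrm{PH}(n)\uparrow \text{ is even”}, \qquad T_{\text{odd}} = \mathrm{PA} + \text{“the least } n \text{ with } \mathrm{PH}(n)\uparrow \text{ is odd”},
\]

each equiconsistent with PA as just claimed, while T_{\text{even}} together with T_{\text{odd}} is inconsistent.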

Page 8

Theorem 3.2: Given arbitrarily large Woodin cardinals, set-generic extensions satisfy the same sentences about L(\mathbb R).

It is easy to see that if one drops “set-generic” then the result is false, and even if one replaces “set-generic extensions” with “extensions with arbitrarily large Woodin cardinals” it is still false (in general). I mention this as you claim that “large cardinal axioms are complete for sentences about L(\mathbb R)”; what notion of “complete” do you have in mind? Surely the requirement of set-genericity is a very strong requirement!
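For definiteness, the theorem can be displayed as follows: given arbitrarily large Woodin cardinals, for all set-generic G, H and every sentence \varphi,

\[
L(\mathbb R)^{V[G]} \models \varphi \iff L(\mathbb R)^{V[H]} \models \varphi,
\]

and it is exactly this restriction to set-generic extensions that I am questioning.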

Page 10

There may be consistency strengths beyond those we have reached to date that cannot be reached in any natural way without deciding CH.

This is a fascinating comment! I never imagined this possibility. Do you have any suggestions as to what these could be?

Page 11

Here you talk about the set-generic multiverse. This is where you lose me almost completely; I don’t seem to grasp the point. Here are some comments and questions.

You have explained, and I agree, that natural theories under the equivalence relation of equiconsistency fall into a wellordered hierarchy with theories of the form ZFC + LCA as representatives of the equivalence classes. I would go further and say that equiconsistency is the same as mutual interpretability for natural theories. But as I said there are many natural theories not of the form ZFC + LCA; yet you claim the following:

The way we interpret set theories today is to think of them as theories of inner models of generic extensions of models satisfying some large cardinal hypothesis … We don’t seem to lose any meaning this way.

Of course by “generic” you mean “set-generic” as you are trying to motivate the set-generic multiverse. Now what happened to the theories (which may deny large cardinal existence) associated to Easton’s model where GCH fails at all regular cardinals, or more dramatically a model where \kappa^+ is greater than (\kappa^+)^\text{HOD} for every cardinal \kappa? These don’t seem to fit into your rather restricted framework. In other words, for reasons not provided, you choose to focus only on theories which describe models obtained by starting with a model of some large cardinal hypothesis and then massaging it slightly using set-generic extensions and ground models. Otherwise said, you have taken our nice equivalence relation under mutual interpretability and cut it down to a very restricted subdomain of theories which respect large cardinal existence. What happened to the other natural theories?

On Page 13 you give some explanation as to why you choose just the set-generic multiverse and claim:

“Our multiverse is an equivalence class of worlds under ‘has the same information’.”

But this can’t be right: Suppose that G_0, G_1 are mutually generic for an Easton class-product; then V[G_0] and V[G_1] lie in different set-generic multiverses but have the same information.

It is a bit cumbersome but surely one could build a suitable and larger multiverse using some iterated class-forcing instead of the Levy collapses you use. But as you know I’m not much of a fan of any use of forcing in foundational investigations and moreover I’d first like to better understand what it is you want a multiverse foundation to achieve.

Page 14

Here you introduce 3 theses about the set-generic multiverse. I don’t understand the first one (“Every proposition”? What about CH?) or the second one (What does “makes sense” mean?). The third one, asserting the existence of a “core”, is clear enough, but for the uninitiated it should be made clear that this is miles away from what a “core” in the sense of core model theory is like: Easton products kill “cores” and reverse Easton iterations produce them! So being a “core” of one’s set-generic multiverse is a rather flimsy notion. Can you explain why on Page 15 you say that the “role [of ‘core’ existence] in our search for a universal framework theory seems crucial”?

You then advertise Hugh’s new axiom as having 3 properties:

  1. It implies core existence.
  2. It suggests a way of developing fine-structure theory for the core.
  3. It may be consistent with all large cardinals.

Surely there is something missing here! Look at my paper with Peter Holy: “A quasi-lower bound on the consistency strength of PFA”, to appear, Transactions of the American Mathematical Society. (I spoke about it at the 1st European Set Theory meeting in 2007.)

We use a “formidable” argument to show that condensation with acceptability is consistent with essentially all large cardinals. As we use a reverse Easton iteration the models we build are the “cores” in your sense of their own set-generic multiverses. And condensation plus acceptability is a big step towards a fine-structure theory. It isn’t hard to put all of this into an axiom so our work fulfills the description you have above of Hugh’s axiom.

Is it possible that what is missing is some kind of absoluteness property that you left out, which Hugh’s axiom guarantees but my work with Holy does not?

OK, those are my main comments and questions. I’ll end with some lesser points.

Page 3

Defn 2.1: Probably you want to restrict to 1st order theories.

Adding Con(T) to T can destroy consistency unless you are talking about “true” theories.

“Reflection principles”: It is wrong to take these to go past axioms consistent with V = L. Reflection has a meaning which should be respected.

Page 5

Conjecture at the top about well-ordered consistency strengths: It is vague because of “natural extension of ZFC” and “large cardinal hypothesis”. But at least the first vagueness can be removed by just saying that the axioms Con(ZFC + LCA) are cofinal in all Con(T)’s, natural or otherwise. A dream result would be to show that the minimal ordinal heights of well-founded models of “LCA’s” are cofinal in those of arbitrary sentences with well-founded models. It is not totally out of the question that one could prove something like that but I don’t know how to do it (with some suitable definition of “LCA”).
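One schematic way to put the dream result, writing \mathrm{ht}(M) for the ordinal height of a well-founded model M (and with “LCA” still in need of a suitable definition):

\[
\{\min\{\mathrm{ht}(M) : M \models \mathrm{ZFC} + \mathrm{LCA}\} : \mathrm{LCA}\} \text{ is cofinal in } \{\min\{\mathrm{ht}(M) : M \models \sigma\} : \sigma \text{ has a well-founded model}\}.
\]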

In your discussion of consistency lower bounds it is very much worthwhile to consider “quasi” lower bounds. Indeed core model theory doesn’t get more than Woodin cardinals from PFA but the Viale-Weiß work and my work with Holy cited above show that there are quasi lower bounds in the supercompact range. For me this is important evidence for the consistency of supercompacts.

Page 12

The Laver-Woodin result: Are you sure you’ve got this right? I see that the ground model W is definable in the P-generic extension with parameter P(\kappa)^W, where \kappa = |P|^+ as computed in the extension, but not from P and the P-generic.

Best,
Sy

PS: How do you use the concept of “truth” in set theory? Is it governed by “good math” and “mathematical depth”, or do you see it as more than that? Your answer will help me understand your paper better.

Re: Paper and slides on indefiniteness of CH

On Thu, Aug 28, 2014 at 8:42 PM, Solomon Feferman wrote:

I provide one answer to these questions via a formal system W (‘W’ in honor of Hermann Weyl) that has variables ranging over a universe of individuals containing numbers, sets, functions, functionals, etc., and closed under pairing, together with variables ranging over classes of individuals. (Sets are those classes that have characteristic functions.) While thus conceptually rich, W is proof-theoretically weak. The main metatheorem, due to joint work with Gerhard Jäger, is that W is a conservative extension of Peano Arithmetic, PA. Nevertheless, a considerable part of modern analysis can be developed in W. In W we have the class (not a set) R of real numbers, the class of arbitrary functions from R to R, the class of functionals on such to R, and so on. I showed in detail in extensive unpublished notes from around 1980 how to develop all of 19th c. classical analysis and much of 20th c. functional analysis up to the spectral theorem for bounded self-adjoint operators.

I developed some alternative conservative extensions of PA and HA called ALPO and B. The former, “analysis with the limited principle of omniscience”, was based on classical logic low down and constructive logic higher up; the latter, for “Bishop”, was based on constructive logic. If I recall, both systems accommodated extensionality, which demanded additional subtleties in the conservation proofs. Both of these systems, if I recall, had substantially simpler axioms, and of course there is the obvious issue of how important an advantage it is to have extensionality and to have simple axiomatizations.

The real issue you want to deal with is the formalization of the actual mathematics. There is a “new” framework to, at least potentially, deal with formalizations of actual mathematics without prejudging the matter with a particular style of formalization. This is the so called SRM = Strict Reverse Mathematics program. I put “new” in quotes because my writings about this PREDATE my discovery and formulation of the Reverse Mathematics program. SRM at the time was highly premature. I’m just now trying to get SRM off the ground properly.

The basic idea is this. Suppose you want to show that a body of mathematics is “a conservative extension of PA”. Closely related formulations are that the body of mathematics is “interpretable in PA” or “interpretable in \textsf{ACA}_0” … You take the body of mathematics itself as a system of mathematical statements, including required definitions, and treat that as an actual formal system. This is quite different from the usual procedures for formalizing actual mathematics in a formal system, where one has, to various extents, to restate the mathematics using coding and various and sundry adjustments. SRM could well be adapted to constructive mathematics as well. Actually, it does appear that there is merit to treating actual abstract mathematics as partially constructive. When I published on ALPO and B, I did not relate it to SRM ideas.

Now it is a mistake in your appendix to Ch. 9 to say that I can’t quantify over all real numbers; given that we have the class R of “all” real numbers in W, we can express various propositions containing both universal and existential quantification over R. Of course, we do not have any physical language itself in W, so we can’t express directly that “there is a [physical] point corresponding to every triple of real numbers.” But we can formulate mathematical models of physical reality using triples of real numbers to represent the assumed continuum of physical space, and quadruples to represent that of physical space-time, and so on; moreover, we can quantify over “nice” kinds of regions of space and space-time as represented in these terms. So your criticism cannot be an objection to what is provided by the system W and the development of analysis in it.

There is a research program that I have been suggesting that I haven’t had the time to get into – namely, systematically introduce physical notions directly into formalisms from f.o.m. I conjecture that something striking will come out of pursuing this with systematic imagination.

As to the philosophical significance of this work, the conservation theorem shows that W is justified on predicative grounds, though it has a direct impredicative interpretation as well. When you say you disagree with my philosophical views, you seem to suggest that I am a predicativist; others also have mistakenly identified me in those terms. I am an avowed anti-platonist, but, as I wrote in the Preface to In the Light of Logic, p. ix, “[i]t should not be concluded from … the fact that I have spent many years working on different aspects of predicativity, that I consider it the be-all and end-all in nonplatonistic foundations. Rather, it should be looked upon as the philosophy of how we get off the ground and sustain flight mathematically without assuming more than the structure of natural numbers to begin with. There are less clear-cut conceptions that can lead us higher into the mathematical stratosphere, for example that of various kinds of sets generated by infinitary closure conditions. That such conceptions are less clear-cut than the natural number system is no reason not to use them, but one should look to see where it is necessary to use them and what we can say about what it is we know when we do use them.” As witness for these views, see my considerable work on theories of transfinitely iterated inductive definitions and systems of (what I call) explicit mathematics that have a constructive character in a generalized sense of the word. However, the philosophy of mathematics that I call “conceptual structuralism” and that has been referred to earlier in the discussion in this series is not to be identified with the acceptance or rejection of any one formal system, though I do reject full impredicative second-order arithmetic and its extensions in set theory on the grounds that only a platonistic philosophy of mathematics provides justification for it.

I am curious as to where your anti-Platonist view kicks in. I understand that you reject \textsf{Z}_2 per se on anti-Platonist grounds. Presumably, you do not expect to be able to interpret \textsf{Z}_2 in a system that you do accept? Perhaps the only candidate for this is Spector’s interpretation? Now what about \Pi^1_1\textsf{-CA}_0? This is almost interpretable in \textsf{ID}_{{<}\omega} and interpretable just beyond. So you reject \Pi^1_1\textsf{-CA}_0 but accept the target of an interpretation? What about \Pi^1_2\textsf{-CA}_0? How convincing are ordinal notation systems as targets of interpretations – or more traditionally, their use for consistency proofs?

Here is my view. There are philosophies of mathematics roughly corresponding to a lot of the levels of the interpretation hierarchy ranging from even well below EFA (exponential function arithmetic) to perhaps j:V_{\lambda+1}\to V_{\lambda+1} and j:V \to V without choice, and perhaps beyond. These include your philosophy. Most of these philosophies have their merits and demerits, their advantages and disadvantages, which are apparent according to the skill levels of the philosophers who advocate them. I regard the clarity of the associated notions as “continuously” degrading as you move up, starting with something like a 3 x 3 chessboard.

I decided long ago that the establishment of the following Thesis – which has certainly not yet been fully established – is going to be of essential importance in any dialog. Of course, exactly what its implications are for the dialog are unclear, and it may be used for unexpected or competing purposes in various ways by various scholars – just like Gödel’s first and second incompleteness theorems, and the Gödel/Cohen work on AxC and CH.

THESIS. Corresponding to every interesting level in the interpretation hierarchy referred to above, there is a \Pi^0_1 sentence of clear mathematical interest and simplicity – i.e., one which is demonstrably equivalent to the consistency of formal systems corresponding to that level, with the equivalence proved in EFA (or even less). There are corresponding formulations in terms of interpretations and conservative extensions.

Furthermore, the only way we can expect the wider mathematical community to become engaged in such issues (finitism, predicativity, realism, Platonism, etcetera) is through this Thesis.

Harvey

Re: Paper and slides on indefiniteness of CH

Dear Hilary,

Thank you for bringing my attention to Ch. 9 of your book, Philosophy in an Age of Science, and especially to its Appendix where you say something about my work on predicative foundations of applicable analysis. I appreciate your clarification in Ch. 9 of the relation of your arguments re indispensability to those of Quine; I’m afraid that I am one of those who has not carefully distinguished the two. In any case, what I addressed in my 1992 PSA article, “Why a little bit goes a long way. Logical foundations of scientifically applicable mathematics”, reprinted with some minor corrections and additions as Ch. 14 in my book, In the Light of Logic, was that if one accepts the indispensability arguments, there still remain two critical questions, namely:

Q1. Just which mathematical entities are indispensable to current scientific theories?, and

Q2. Just what principles concerning those entities are needed for the required mathematics?

I provide one answer to these questions via a formal system W (‘W’ in honor of Hermann Weyl) that has variables ranging over a universe of individuals containing numbers, sets, functions, functionals, etc., and closed under pairing, together with variables ranging over classes of individuals. (Sets are those classes that have characteristic functions.) While thus conceptually rich, W is proof-theoretically weak. The main metatheorem, due to joint work with Gerhard Jäger, is that W is a conservative extension of Peano Arithmetic, PA. Nevertheless, a considerable part of modern analysis can be developed in W. In W we have the class (not a set) R of real numbers, the class of arbitrary functions from R to R, the class of functionals on such to R, and so on. I showed in detail in extensive unpublished notes from around 1980 how to develop all of 19th c. classical analysis and much of 20th c. functional analysis up to the spectral theorem for bounded self-adjoint operators. These notes have now been scanned in full and are available with an up to date introduction on my home page under the title, “How a little bit goes a long way. Predicative foundations of analysis.” The same methodology used there can no doubt be pushed much farther into modern analysis. (I also discuss in the introduction to those notes the relationship of my work to that of work on analysis by Friedman, Simpson, and others in the Reverse Mathematics program.)
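In symbols, the conservation theorem says that for every sentence \varphi of the language of arithmetic (under the natural reading of arithmetic statements in W),

\[
W \vdash \varphi \iff \mathrm{PA} \vdash \varphi,
\]

the right-to-left direction being immediate since W contains PA.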

Now it is a mistake in your appendix to Ch. 9 to say that I can’t quantify over all real numbers; given that we have the class R of “all” real numbers in W, we can express various propositions containing both universal and existential quantification over R. Of course, we do not have any physical language itself in W, so we can’t express directly that “there is a [physical] point corresponding to every triple of real numbers.” But we can formulate mathematical models of physical reality using triples of real numbers to represent the assumed continuum of physical space, and quadruples to represent that of physical space-time, and so on; moreover, we can quantify over “nice” kinds of regions of space and space-time as represented in these terms. So your criticism cannot be an objection to what is provided by the system W and the development of analysis in it.

As to the philosophical significance of this work, the conservation theorem shows that W is justified on predicative grounds, though it has a direct impredicative interpretation as well. When you say you disagree with my philosophical views, you seem to suggest that I am a predicativist; others also have mistakenly identified me in those terms. I am an avowed anti-platonist, but, as I wrote in the Preface to In the Light of Logic, p. ix, “[i]t should not be concluded from … the fact that I have spent many years working on different aspects of predicativity, that I consider it the be-all and end-all in nonplatonistic foundations. Rather, it should be looked upon as the philosophy of how we get off the ground and sustain flight mathematically without assuming more than the structure of natural numbers to begin with. There are less clear-cut conceptions that can lead us higher into the mathematical stratosphere, for example that of various kinds of sets generated by infinitary closure conditions. That such conceptions are less clear-cut than the natural number system is no reason not to use them, but one should look to see where it is necessary to use them and what we can say about what it is we know when we do use them.” As witness for these views, see my considerable work on theories of transfinitely iterated inductive definitions and systems of (what I call) explicit mathematics that have a constructive character in a generalized sense of the word. However, the philosophy of mathematics that I call “conceptual structuralism” and that has been referred to earlier in the discussion in this series is not to be identified with the acceptance or rejection of any one formal system, though I do reject full impredicative second-order arithmetic and its extensions in set theory on the grounds that only a platonistic philosophy of mathematics provides justification for it.

Best,
Sol

Re: Paper and slides on indefiniteness of CH

Dear Harvey,

You are right, to get strength from reflection one needs not only higher-order logic (lengthenings) but also class (2nd-order) parameters. As I said in what I wrote, I chose to ignore parameters to simplify the discussion. In fact allowing more than 2nd-order parameters will lead to inconsistency unless one treats them carefully using embeddings, what I call “Magidor reflection”. But obviously I wanted to avoid this subtle discussion of parameters to bring out the main point: Lengthenings are required to derive an inaccessible from reflection.

Best,
Sy

Re: Paper and slides on indefiniteness of CH

From the mathematical point of view, the discussion of Reflection in what you wrote (by Sy) seems to be oversimplified (and seems to be incorrect). The reflection principles

  1. anything first order true in V is true in some V_\lambda
  2. anything second order true in V is true in some V_\lambda

are both fairly weak. (1) has models (V_\lambda,\in) where \lambda < \mathfrak c^+. So does (2). If you only want to consider such models where \lambda is strongly inaccessible, then (2) has models (V_\lambda,\in) where \lambda is among the first \mathfrak c^+ strongly inaccessible cardinals. So (2) only gives you a handful of strongly inaccessible cardinals in a context like MK or MK + global choice.

  3. anything first order true in V is true in some V_\lambda, with set parameters.
  4. anything second order true in V is true in some V_\lambda, with set parameters.

As is well known, the models (V_\lambda,\in) of (3) are exactly the models (V_\lambda,\in) of ZF. If (4) is formulated as a scheme over NBG then we get a system which is equiconsistent with the normal formulation of second order reflection, (6) below. However, it does not appear that (4), even over MK with global choice, will prove the existence of a Mahlo cardinal (I haven’t thought about showing this).

  5. anything first order true in (V,\in,A) is true in some (V_\lambda,\in,A \cap V_\lambda), A arbitrary.
  6. anything second order true in (V,\in,A) is true in some (V_\lambda,\in,A \cap V_\lambda), A arbitrary.

(5) holds in exactly the (V_\lambda,\in) for which \lambda is strongly inaccessible. (6) is the normal way of formulating second order reflection. As a scheme over NBG, it proves the existence of weakly compact cardinals. A subtle cardinal proves the existence of models (V_{\lambda+1},V_{\lambda},\in) of (6).

Harvey