[hist-analytic] Methods of Proof: Re: Clarity Is Not Enough

steve bayne baynesrb at yahoo.com
Mon Feb 23 10:27:26 EST 2009


I'm pretty sure I get what you are talking about w.r.t.
the metalanguage stuff. But take this fragment:

"represent the syntax of A in the ontology of B."

Could you give an example of syntax being represented
"in the ontology"?

I appreciate your comments on redundancy. Sometimes when
we are searching for a criterion for opting for one
system over another, simplicity is a consideration. I think
there is a connection between simplicity and redundancy.
It's worth mentioning that redundancy in nature is very
common. But if we think of metalanguages as constructed,
redundancy takes on a different significance. Or so
it seems.

Regards

Steve

--- On Mon, 2/23/09, Roger Bishop Jones <rbj at rbjones.com> wrote:

> From: Roger Bishop Jones <rbj at rbjones.com>
> Subject: Re: Methods of Proof: Re: Clarity Is Not Enough
> To: baynesrb at yahoo.com
> Cc: hist-analytic at simplelists.com
> Date: Monday, February 23, 2009, 10:10 AM
> On Monday 23 February 2009 13:46:00 steve bayne wrote:
> > "In general, to reason about language A in language B
> > one must be able to represent the syntax of A in the
> > ontology of B."
> >
> > I agree with most all of what you have said here, but
> > I'm not sure about the above. Some maintain no ontology
> > is involved in logic at all. Now I don't agree with
> > this, but we have to be clear before passing on to
> > other matters. Can you give me an example of how syntax
> > is represented in ontology? Now a point on structure.
> 
> It's convenient to assume the existence of the things you
> want to talk about; technically it might be possible to
> avoid this, but it would be cumbersome.
> 
> Typically we choose a metalanguage which does not assume
> the existence of syntactic objects, but presumes an
> ontology sufficient to represent them.
> The most common examples are arithmetic, in which the
> existence of the natural numbers is presumed, and set
> theory in which the existence of a rich variety of
> sets is presumed.  In the former case we speak about
> syntax via arithmetisation, in the latter using some
> way of coding up syntactic objects as sets (it would
> also be possible to do this via arithmetisation given
> that we have ways of representing numbers as sets).
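The arithmetisation Roger describes can be made concrete in a few lines. The sketch below is a minimal, purely illustrative Gödel-style coding in Python: the symbol-to-number assignment and the prime-power scheme are my own arbitrary choices, not any particular published coding, but they show syntax being represented in the ontology of arithmetic — a formula becomes a single natural number, recoverable by factoring.

```python
# Arithmetisation sketch: each symbol gets an (arbitrary) code number,
# and a formula -- a sequence of symbols -- is coded as a product of
# prime powers, one prime per position, symbol code as exponent.

SYMBOLS = {'(': 1, ')': 2, 'E': 3, 'x': 4, 'y': 5, 'F': 6}

def primes():
    """Generate primes 2, 3, 5, ... by trial division."""
    n, found = 2, []
    while True:
        if all(n % p for p in found):
            found.append(n)
            yield n
        n += 1

def godel_number(formula):
    """Code the symbol string as product of p_i ** code(symbol_i)."""
    g, ps = 1, primes()
    for ch in formula:
        g *= next(ps) ** SYMBOLS[ch]
    return g

def decode(g):
    """Recover the symbol string by factoring out each prime in turn."""
    inv = {v: k for k, v in SYMBOLS.items()}
    out, ps = [], primes()
    while g > 1:
        p, e = next(ps), 0
        while g % p == 0:
            g //= p
            e += 1
        out.append(inv[e])
    return ''.join(out)

f = '(Ex)Fxy'
n = godel_number(f)       # one natural number standing for the formula
assert decode(n) == f     # and the syntax is recoverable from it
```

Statements *about* the formula (e.g. "its first symbol is a parenthesis") then become arithmetical statements about n, which is the sense in which the metalanguage's ontology of numbers suffices to represent syntax.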
> 
> > My interest here is redundancy in a logical system.
> > When you speak of "languages" I take it you mean
> > canonical or formal languages; if so, then I think
> > there is a difference in how redundancy is to be
> > regarded. For example, in making the grammar of a
> > natural language explicit, lack of redundancy has
> > always been considered a virtue. So when you are
> > setting up principles or parameters that will
> > "generate" (as in "generative syntax") all and only
> > the sentences of a given language, alternatives must
> > be evaluated and redundancy in the *application* of a
> > rule becomes paramount. Now in logic there is a
> > difference. Let me give an example. In some proofs of
> > both up and down versions of Lowenheim/Skolem the
> > occurrence of vacuous quantifiers makes no difference,
> > so that there is nothing really wrong with
> > '(Ex)(Ey)(Ez)Fxy'; but in standard linguistic theory,
> > where natural languages are the "object language",
> > there is something wrong with this. So my point was
> > this: if you set up two axiomatic systems, and one
> > involves less redundancy, then that is the better one.
> > Redundancy would be determined by how many ways a
> > theorem can be proven; the more ways of doing it, the
> > more redundancy. The quantificational example I've
> > given above, if duplicated in standard linguistic
> > theory, would yield massive over-generation. In logic
> > this doesn't lead to inconsistency, but my point was,
> > in part, to raise the question whether the linguistic
> > case and the logic case are similar. One other thing:
> > if you take redundancy to mean repeated application of
> > a rule, then it might be that, in the sense I intend,
> > a system like Hilbert's would be less redundant than,
> > say, Frege's. I suppose the formalization of
> > arithmetic and the formalization of natural language
> > may differ here.
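Steve's point that the vacuous quantifier in '(Ex)(Ey)(Ez)Fxy' is harmless can be checked mechanically: over any non-empty domain, adding a quantifier over a variable the matrix never mentions cannot change truth. A small brute-force sketch (the function names are mine, purely illustrative):

```python
# Check that a vacuous existential quantifier is semantically idle
# over every binary relation F on a small non-empty domain.
from itertools import product

def ex_ey(F, dom):
    """(Ex)(Ey)Fxy over a finite domain."""
    return any(F[x][y] for x in dom for y in dom)

def ex_ey_ez(F, dom):
    """(Ex)(Ey)(Ez)Fxy -- z is vacuous, since Fxy never mentions it."""
    return any(F[x][y] for x in dom for y in dom for z in dom)

dom = range(2)
# Exhaust every binary relation F on a two-element domain:
for bits in product([False, True], repeat=4):
    F = [[bits[0], bits[1]], [bits[2], bits[3]]]
    assert ex_ey(F, dom) == ex_ey_ez(F, dom)
```

Each witness is merely counted |dom| times over, so satisfaction is unchanged — which is exactly why the redundancy is tolerable in logic while the analogous duplication of derivations would count as over-generation in a generative grammar.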
> 
> I can certainly see certain kinds of redundancy which it is
> usual to avoid if possible.
> The classic example in first order theories is the
> preference
> for axiomatisations of a theory in which none of the axioms
> can be proven from any combination of the others.
> Similar considerations would apply to rules.
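The non-redundancy Roger mentions is standardly established by an independence proof: exhibit an interpretation under which the remaining axioms (and the rules) hold while the candidate axiom fails. Below is a sketch in Python of the classic case — Peirce's law shown independent of the intuitionistically valid implication axioms K and S via a three-valued Gödel-style matrix; this is a textbook technique, though the encoding here is my own.

```python
# Independence of Peirce's law from K and S, via a three-valued
# matrix: values 0 < 1 < 2, with 2 the sole designated value.
# Implication: a -> b is 2 (true) when a <= b, and b otherwise.
from itertools import product

DESIGNATED = 2

def imp(a, b):
    return 2 if a <= b else b

def K(p, q):          # p -> (q -> p)
    return imp(p, imp(q, p))

def S(p, q, r):       # (p -> (q -> r)) -> ((p -> q) -> (p -> r))
    return imp(imp(p, imp(q, r)), imp(imp(p, q), imp(p, r)))

def peirce(p, q):     # ((p -> q) -> p) -> p
    return imp(imp(imp(p, q), p), p)

vals = (0, 1, 2)
# K and S take the designated value under every valuation ...
assert all(K(p, q) == DESIGNATED for p, q in product(vals, repeat=2))
assert all(S(p, q, r) == DESIGNATED
           for p, q, r in product(vals, repeat=3))
# ... and modus ponens preserves designation ...
assert all(b == DESIGNATED for a, b in product(vals, repeat=2)
           if a == DESIGNATED and imp(a, b) == DESIGNATED)
# ... yet Peirce's law fails at p=1, q=0, so it cannot be derived
# from K and S alone: as an added axiom it would not be redundant.
assert peirce(1, 0) != DESIGNATED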
> 
> As far as syntax is concerned some redundancy is common.
> For example, repetition is often achieved by recursion,
> and this will often lead to harmlessly ambiguous parse
> trees.  Similarly with proofs, where for example a linear
> conception of forward proof leaves open the order in
> which lemmas are proven.
> 
> > Finally, on topology. There is a connection here.
> > Tarski wrote on it briefly and there have been others.
> > It might be argued, and I think not without reason,
> > that topology is a branch of model theory; the
> > connection, notwithstanding the flawed suggestion of
> > Heine-Borel (perhaps), seems pretty clear.
> 
> I'm sure there are connections.  For example, one way of
> constructing models for positive set theory is
> topological.
> 
> > Anyway, I think we need to get clear on what,
> > precisely, you take to be ontology and its relation to
> > syntax.
> 
> I hope the above helps in that.
> 
> regards,
> Roger



