Eugene I wrote: ↑Thu Jul 22, 2021 2:41 pm
Well, no, because analytical math is based on the theory of real numbers. As an engineer I do use computed Fourier transforms obtained by simulators, but I also need Fourier and Laplace transforms in analytical form because that gives me much more insight into the system's behavior and properties. And analytical forms are obtainable only through calculus and algebra, which are built on real number theory.
This situation is similar to physics: everyone admits that QM is inconsistent and incompatible with GR/SR, and that this is a big problem from the point of view of the foundations of physics. But QM works very well in practice and physicists use it every day. If a unified and consistent physical theory becomes available one day, everyone will be happy to adopt it, yet the QM equations will still be widely used as a practical approximation. Mechanical engineers still use Newton's laws because they work well enough in practice, even though they are inconsistent from the point of view of the foundations of physics and QM/SR.
Similarly, real number theory is inconsistent, and I applaud your (and other mathematicians') search for a more consistent foundation of mathematics. However, that has not happened yet, so we engineers still need to use math tools based on real numbers in the meantime. And even when a more consistent foundation is developed, real number theory and its calculus will still be widely used, because they are simple and work very well in practice in spite of their foundational inconsistencies.
Many quantum physicists consider the current mathematical and logical foundation of the theory highly problematic, and are looking for a different approach, a better theory of math.
When I say coherent, I mean coherent, not consistent. Posing questions based on bivalent logic leads to foundational undecidability in many contexts, so the LNC (law of non-contradiction) can't be a global foundational axiom of coherent foundations.
I agree that the most practical applied math does not necessarily correlate with the most coherent foundations, but math is also an open and evolving system, so who knows what we manage to cook up. Fragmentation into computation theory, continuous geometry and set theory is not a healthy situation, and a coherent and communicative foundation could also prove very productive in terms of applied math.
People were doing calculus before analytic geometry, aka coordinate geometry. Coordinate geometry has its benefits, but also costs: e.g. an equilateral triangle does not exist in the rationally valued Cartesian plane. Einstein was aware of the philosophical problems of "coordinate ontology" when he spoke about the necessity of coordinate invariance, but with the standard formulations of standard theories being highly dependent on the real line and complex plane, the situation has not exactly improved since Einstein's philosophical doubts about the coordinate ontology of coordinate geometry.
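The equilateral-triangle claim has a short classical argument behind it: the shoelace formula gives a rational area for any triangle with rational vertices, while an equilateral triangle of side s has area (√3/4)·s², which is irrational whenever s² is a nonzero rational. A minimal Python sketch of the rational half of that argument, using exact arithmetic (the function name is mine):

```python
from fractions import Fraction

def shoelace_area(p, q, r):
    """Signed area of a triangle with rational vertices.

    Since only +, -, * and /2 are used, rational inputs can
    only ever produce a rational (Fraction) area.
    """
    (x1, y1), (x2, y2), (x3, y3) = p, q, r
    return ((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2

# Any triangle with rational vertices has rational area...
a = shoelace_area((Fraction(0), Fraction(0)),
                  (Fraction(1), Fraction(0)),
                  (Fraction(1, 2), Fraction(3, 4)))
print(a)  # 3/8

# ...but an equilateral triangle of side s has area (sqrt(3)/4) * s^2,
# irrational for any nonzero rational s^2. So no equilateral triangle
# has all vertices in the rational Cartesian plane.
```

The same contradiction works over the integer lattice via Pick's theorem; the rational case reduces to it by clearing denominators.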
Taking a holistic philosophical distance from analysis: the etymological meaning of the word is semantically related to partition. Partition in the mereological sense, not the modern definition based on natural numbers. Continuous geometry and mereological partition (and fractions in that sense!) predate the metaphysical theory of natural numbers. Egyptian fractions are so called because the Egyptians thought mereologically, with the "one" in the numerator as a continuum and the denominator values as mereological partitions of that continuum.
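As an aside, any fraction in (0, 1) really can be written as a sum of distinct unit fractions in the Egyptian style; the classical greedy (Fibonacci-Sylvester) algorithm does it. A small Python sketch (the function name is mine):

```python
from fractions import Fraction
from math import ceil

def egyptian(frac):
    """Greedy (Fibonacci-Sylvester) decomposition of a fraction in (0, 1)
    into a sum of distinct unit fractions 1/n. Returns the denominators."""
    parts = []
    while frac > 0:
        n = ceil(1 / frac)        # smallest n with 1/n <= frac
        parts.append(n)
        frac -= Fraction(1, n)    # exact rational arithmetic, so this terminates
    return parts

print(egyptian(Fraction(5, 6)))   # [2, 3]  ->  5/6 = 1/2 + 1/3
```

Each greedy step strictly decreases the numerator of the remainder, which is why the loop always terminates.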
In our era we are, however slowly, coming back to our senses from the modernist trip into reductionism: materialistic reductionism, which was based on reduction to metaphysical quantification ("There exists a unit of measurement!") and became absurd point-reductionism about a century ago when Cantor and Hilbert made their post-modern linguistic turn of formalism and logicism (the latter failing almost immediately thanks to Gödel and Turing).
Foundational thinking in mathematics has been dragging behind the variety of holistic philosophies that have emerged after the failure of the modernist metanarrative of materialistic reductionism. There is also a lot of cumulative evolution of mathematics that does not foundationally depend on the heuristic methods of formalism or of coordinate-geometry analysis. Wolfram's key finding, IMHO, is that only repetition and nesting are computationally reducible. Nesting is by definition a mereological part-whole relation and does not depend on the metaphysical postulation of existential quantification. Thus it seems very natural that a more coherent foundation could be built from the mereological perspective.
Maybe a mereological theory of Fourier-partition could be developed based on Stern-Brocot-type structures and Ford circles?
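For what it's worth, the two structures fit together neatly: consecutive fractions a/b and c/d produced by the Stern-Brocot mediant construction satisfy |ad - cb| = 1, which is exactly the condition for their Ford circles to be tangent. A small exploratory sketch in Python (function names are mine, not an established API), checking the tangency two independent ways in exact arithmetic:

```python
from fractions import Fraction

def mediant(a, b):
    """Stern-Brocot mediant of fractions a and b: (a.num + b.num)/(a.den + b.den)."""
    return Fraction(a.numerator + b.numerator, a.denominator + b.denominator)

def ford_tangent(p, q):
    """Number-theoretic test: Ford circles of p = a/b and q = c/d
    are tangent exactly when |a*d - c*b| == 1."""
    return abs(p.numerator * q.denominator - q.numerator * p.denominator) == 1

def geometric_tangent(p, q):
    """Same test done geometrically. The Ford circle of a/b has center
    (a/b, 1/(2b^2)) and radius 1/(2b^2); two circles are externally tangent
    when the squared center distance equals the squared sum of radii."""
    rp = Fraction(1, 2 * p.denominator ** 2)
    rq = Fraction(1, 2 * q.denominator ** 2)
    return (p - q) ** 2 + (rp - rq) ** 2 == (rp + rq) ** 2

# One step of the Stern-Brocot construction on (0, 1):
left, right = Fraction(0, 1), Fraction(1, 1)
mid = mediant(left, right)
print(mid, ford_tangent(left, mid), geometric_tangent(mid, right))
# 1/2 True True -- mediant neighbors always give tangent Ford circles
```

Whether a Fourier-like decomposition can actually be organized over such mediant/Farey levels is, of course, exactly the open question posed above.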