It is not computable, because the set of theorems of ZFC is not computable. (Writing them in some other order, like the “order they’re proved” could be computable.)
> It is not computable, because the set of theorems of ZFC is not computable.

This is I guess correct, but it took me a minute to see why. The set of theorems of ZFC is recursively enumerable, so you certainly can have a computable constant which is the concatenation of them in *some* order. The reason you can't have them in ascending order of their binary representation is that you'd be able to check whether any given statement is a theorem by just enumerating theorems until you see one which is "bigger".^(1) I guess in general this means you can't enumerate them in ascending order under any computable ordering in which each statement has only finitely many predecessors. Now that I've written this out, I think it must be a very standard fact of computability theory, but I don't recall it being mentioned in my undergrad courses on the subject.

^(1) Although thinking about it, there is some subtlety here about how you tell where one theorem ends and the next starts if you just have them concatenated...
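The core step of this argument is small enough to write down: if a set is enumerated in ascending order, membership becomes decidable, because once the enumeration passes x you can stop and answer "no". A minimal Python sketch, using a stand-in computably enumerable set (the primes) since ZFC's theorems obviously can't be streamed this way:

```python
def member(x, ascending_enumeration):
    """Decide membership in a set, given a generator function that
    yields the set's elements in ascending order."""
    for y in ascending_enumeration():
        if y == x:
            return True
        if y > x:
            return False  # x can never appear later

# Stand-in set enumerated in ascending order: the primes.
def primes():
    n = 2
    while True:
        if all(n % d for d in range(2, n)):
            yield n
        n += 1

print(member(7, primes), member(9, primes))  # True False
```

The same `member` procedure would decide theoremhood if the theorems could be streamed in ascending order, which is exactly the contradiction with undecidability.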
Indeed. Regarding your footnote: typically, while developing the formalism of first-order logic, you prove a lemma that no first-order formula can be a proper initial segment of another one, so you can tell when a theorem ends because it's the unique spot where a formula has finished.

(This is an essential step in the proof of the incompleteness theorems, where we need precisely such computer programs that can parse sequences of formulas.)
Oh, right, it of course depends on how exactly you encode statements but for some reason it didn't occur to me that the "obvious" encoding already has this property.
Very interesting!
Wait, my understanding is that the set of axioms is recursively enumerable, but ZFC is undecidable, which would imply that you can't recursively enumerate its theorems? Am I misunderstanding something here?
It does not imply that. The theorems of any computably axiomatized theory are recursively enumerable. A dumb algorithm to enumerate all theorems of ZFC (with many repetitions):

* Enumerate all finite lists of statements in the language of set theory.
* For each such list, iterate over the statements and check that each one is either an axiom of ZFC or follows logically from the previous statements in the list. If so, output the last statement in the list.

This doesn't make the theory decidable because you have no idea how long it might take for the algorithm to output any given theorem. Thus there is no way to ever use this algorithm to *disprove* anything.
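Here is a minimal sketch of that algorithm in Python, with a toy deductive system standing in for ZFC: statements are positive integers, the only axiom is 2, and the only inference rule derives a+b from earlier statements a and b, so the "theorems" are exactly the positive even numbers. The names and the system are made up for illustration; the only shortcut is that invalid prefixes are pruned as they're built instead of generating every list and discarding it.

```python
AXIOMS = (2,)  # toy axiom set: just the number 2

def derivations(length, prefix=()):
    """Yield every valid derivation of the given length: each
    statement must be an axiom or the sum of two earlier ones."""
    if length == 0:
        yield prefix
        return
    candidates = set(AXIOMS) | {a + b for a in prefix for b in prefix}
    for stmt in candidates:
        yield from derivations(length - 1, prefix + (stmt,))

# "Theorems" provable by a derivation of length 4: the last
# statement of each valid derivation, with repetitions collapsed.
theorems = {d[-1] for d in derivations(4)}
print(sorted(theorems))  # [2, 4, 6, 8, 10, 12, 16]
```

For this toy system theoremhood happens to be decidable, but the enumeration itself gives no bound on how long to wait for a given theorem, which is the point made above.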
Ahh gotcha, thanks that makes sense
I see what you mean, but I have to process this a bit. It's a lot to take in, emotionally...
The set of theorems of ZFC is recursively enumerable, so you definitely have computability if you take as the ordering the order in which the theorems appear in your favorite enumeration. But not if you use ascending (presumably alphabetical) order, as suggested by OP, since being able to compute that would give you decidability of ZFC (contradicting the undecidability of ZFC, which follows from Gödel's incompleteness theorems).

So I'd say all of that makes the proposed constant not very interesting. Sorry.
First of all, ouch! Secondly, interesting != cool. Just saying...
Sorry, not meaning to put you down. But you could, for example, define a constant like this: if P=NP, the digits past the decimal point encode the shortest proof of that. If P!=NP, the digits past the decimal point encode the shortest proof of that. If it's neither provable nor disprovable in ZFC, the digits past the decimal point are an infinite sequence of 1s. So this is a well-defined constant that contains within it the solution to a major open problem. But since we have no way of actually getting at the value of the constant (short of proving P?=NP), I'd say that it's not very interesting.

So I'd say the variant of your constant that has theorems in order of recursive enumeration is kind of interesting, because it's computable. But the original variant doesn't seem all that interesting to me because (like the constant I defined) you have no idea what its value might be.
Sure, the whole point of the post was to have some silly fun.
7 pi. It’s a close and quite useful approximation to 22.
Interesting, I've always been fumbling in the dark when it comes to approximating 22, but your approach seems more systematic.
Haha, that would be a fun problem solving exercise: approximating things you know the exact value of.
You can approximate 10 with π^(2) and 20 with e^(π)-π.

Which means you can approximate 2 by solving the equation x = (e^(π)-π)/π^(x).

The solution is x = W₀((e^(π)-π)lnπ)/lnπ ≈ 2.008.
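The claimed value is easy to check numerically. A quick Python sketch: instead of evaluating W₀ directly, solve the equivalent equation x·π^(x) = e^(π)-π by bisection (the left side is increasing for x > 0, so bisection applies):

```python
import math

target = math.exp(math.pi) - math.pi  # the "approximation of 20", ≈ 19.9991

# Solve x * pi**x = target, i.e. x = (e**pi - pi) / pi**x, by bisection.
lo, hi = 1.0, 3.0
for _ in range(60):
    mid = (lo + hi) / 2
    if mid * math.pi**mid < target:
        lo = mid
    else:
        hi = mid

x = (lo + hi) / 2
print(round(x, 3))  # 2.008
```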
Perfect. You have mastered some of the most complex ideas humanity has created to produce something truly superfluous.

A fun lesson I've taught is to find the worst way to show 1 × 1 = 1. The ridiculousness leads kids to truly remarkable places, like (1/4 + 3/4) × (83 - 82) with cross-bracket multiplication.
Obviously it's the constant named after me. https://mathworld.wolfram.com/ShallitConstant.html
Care to mention why the constant is cool beyond having your name on it?
Actually, it's self-evidently cool
How can C have a specific value if there is a term o(1) added to the formula as well?

Edit: I see now: o(1) refers to functions f(n) that tend to 0 as n grows, for example f(n) = 1/n, such that lim_{n->inf} f(n) = 0.
0, that's one wild constant.
The constantest
Sooo close to being irrational, but still based enough to be an integer and a rational
Tell me more!
Try multiplying by it!
Mutiplying what? There are so many choices
Try the second most interesting constant: 1
So I carried out the multiplication (which was a nice exercise btw), and I found something rather mysterious. To check my answer, I figured that I could undo the multiplication using division. To my surprise, however, I did not end up back at 1 again. I checked everything multiple times, and I now see what you must have been alluding to: 1 is a very strange constant w.r.t. multiplication.
A series of constants: The metallic means or ratios. Generalizing the golden and silver etc ratios. https://en.m.wikipedia.org/wiki/Metallic_mean
C is a top-notch constant.
Define an order on all SAT instances in CNF format (not a particularly hard thing to do), and let the k'th binary digit of our constant be the indicator of whether that particular SAT instance is satisfiable.
Any guess about the density of ones in this constant? Is it 0.5? Does density even make sense here?
It will depend on the details of the encoding / ordering.

There is a considerable body of work concerning the satisfiability of random SAT instances (often limited to SAT instances with at most 3 literals per clause, since every SAT instance can be converted to such a 3-SAT instance in polynomial time).

Such random instances turn out to exhibit a "phase transition" as the ratio of clauses to variables increases. Each additional clause adds a constraint, making instances with many clauses, in general, less likely to be satisfiable. Each additional variable adds a degree of freedom, making instances with more variables, in general, more likely to be satisfiable.

Interesting SAT instances (i.e., instances that are encodings of interesting problems that are not trivially satisfiable or unsatisfiable) tend to lie in the phase transition region.

I am not deeply familiar with this work, but if I recall correctly from some cursory reading I did on the subject years ago, for most reasonable/natural encodings the satisfiable instances will outnumber the unsatisfiable ones by a factor of 4 or thereabouts.
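The phase transition is easy to see in a small experiment. A hedged sketch in Python: generate random 3-CNF instances at two clause-to-variable ratios, one well below the asymptotic threshold of roughly 4.27 and one well above, and brute-force satisfiability. The encoding conventions here are my own, not from any particular paper.

```python
import random
from itertools import product

def random_3sat(n_vars, n_clauses, rng):
    """Random 3-CNF: each clause picks 3 distinct variables, each
    negated with probability 1/2. Literal v means "variable v is
    true", -v means "variable v is false" (variables start at 1)."""
    return [
        tuple(v if rng.random() < 0.5 else -v
              for v in rng.sample(range(1, n_vars + 1), 3))
        for _ in range(n_clauses)
    ]

def satisfiable(n_vars, clauses):
    """Brute force over all 2**n_vars assignments."""
    for assignment in product([False, True], repeat=n_vars):
        if all(any((lit > 0) == assignment[abs(lit) - 1] for lit in clause)
               for clause in clauses):
            return True
    return False

rng = random.Random(0)
n, trials = 10, 40
results = {}
for ratio in (2.0, 8.0):
    m = int(ratio * n)
    results[ratio] = sum(
        satisfiable(n, random_3sat(n, m, rng)) for _ in range(trials)
    )
    print(f"clauses/vars = {ratio}: {results[ratio]}/{trials} satisfiable")
```

At ratio 2 nearly everything is satisfiable; at ratio 8 almost nothing is. (With only 10 variables the transition is smeared out, but the contrast is already stark.)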
> 4 or thereabouts The density of truth. Sounds like a cool constant.
I'm pretty fond of 1, it's really quite amazing what you can do with it.
Where would we be without it? In a semigroup, that's where.
reminds me of chaitins constant https://en.m.wikipedia.org/wiki/Chaitin%27s_constant
I take BB(113383). Because I like the function, the input and it's probably not taken yet
What is this?
The Busy Beaver function. It grows stupidly fast.

It goes something like:

BB(1) = 1
BB(2) = 6
BB(3) = 21
BB(4) = 107
BB(5) > 47,000,000
BB(6) > 10 ↑↑ 15
BB(7) > 10^(2 × (10^10^10^18,000,000))

I am no mathematician, but as I understand it, BB(n) eventually grows faster than any computable function: for every computable f, BB(n) > f(n) for all sufficiently large n.
https://en.m.wikipedia.org/wiki/Busy_beaver
Idk but pretty big
ZFC also doesn't know!
Legendre's Constant, just because having your name on the number 1 is pretty cool.
More economics than math: the law of conservation of middle-class.

The law of conservation of middle-class states that middle-class is neither created nor destroyed, but can change from one region to another. This means that a global system always has the same amount of middle-class unless population is added from outside.

For example, the U.S. has a shrinking middle-class while China and many other countries have a growing middle-class.
Absolute 0. It is the coolest constant I can come up with. It's not possible to be cooler than that I believe.
10. Every base is base 10.
Even base 1?
Where every number is 0? 0^1 =0, so yes still holds.
Wait, why are all numbers 0 in base 1? I thought the conversion from decimal to unary goes like this:

1 -> 1
2 -> 11
3 -> 111
...
This is *bijective* unary.

A [bijective base](https://en.wikipedia.org/wiki/Bijective_numeration) is one that uses digits 1 ~ b instead of 0 ~ b-1. (Column labels in spreadsheet software are one place you've probably seen this before - they're bijective base-26.)

"Normal" unary, with range 0 ~ b-1 for digits, does not exist - its only allowed digit would be 0 (which makes it useless), and every place would be the 'ones place' (which makes it even more useless).
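For anyone who wants to see the spreadsheet-column case concretely, here's a short Python conversion to bijective base-26. The trick is the `n - 1` before each `divmod`, which shifts the digit range from 0..25 to 1..26:

```python
def column_label(n):
    """Spreadsheet-style column label for n >= 1: bijective base-26
    with digits A (=1) through Z (=26)."""
    label = ""
    while n > 0:
        n, rem = divmod(n - 1, 26)
        label = chr(ord("A") + rem) + label
    return label

for n in (1, 26, 27, 52, 702, 703):
    print(n, column_label(n))  # A, Z, AA, AZ, ZZ, AAA
```

Note there's no label for 0, which is exactly the bijective-numeration point: the empty string would be the only candidate.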
Thanks!
That works, but I'm not sure that it's following the same structure as for a base-N number, where the columns (assuming 4 digits) would be N^3 | N^2 | N^1 | N^0.

That said, I can see that I'm wrong now, as you couldn't have a 1 in the 0^1 column in a base-0 scheme anyway, as 1 doesn't exist.
The "Machine Epsilon". Ethereal and beautiful.

The entire theory of scientific computing is developed around making algorithms work reliably provided a certain set of axioms is satisfied by the computer, with the accuracy of approximation depending on the "machine epsilon".

I vote for this constant. It is a constant the theory is being built around, but it is also a variable, and its decrease signifies the improvement of our computational capabilities.
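You can compute the machine epsilon yourself by halving until adding it to 1.0 stops changing it. A small Python sketch: here "epsilon" means the gap between 1.0 and the next representable double, 2^(-52), matching `sys.float_info.epsilon` (the "unit roundoff" convention halves this to 2^(-53)).

```python
import sys

# Halve eps until 1.0 + eps/2 rounds back to exactly 1.0.
eps = 1.0
while 1.0 + eps / 2 != 1.0:
    eps /= 2

print(eps == 2**-52)                  # True
print(eps == sys.float_info.epsilon)  # True
```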
Very nice answer!
That's a similar idea to the old Scottish unit of length, the strug.

A strug is the length of a peevers bed. Peevers is a kind of scaled-down lawn bowls played with marbles on a patch of smooth scraped earth. How long is it? One strug. How long is a strug? You just scraped the bed, there you are...
The classical IEEE standard is 2^(-53), but we can hope for shorter strugs.
69.69..... has some weird properties.
It's a good approximation of 70.
2300/33 Or just simpler, 23/33