The equivalence classes of Cauchy sequences are vastly larger and more misleading than those of the integers and rational numbers. You can prepend any finite sequence to a Cauchy sequence and it will represent the same real number. For example, take a sequence of 0, 0, 0, ..., 0 where the number of zeros equals the count of all the atoms in the universe, followed by the decimal approximations of pi: 3, 3.1, 3.14, 3.141, ... The key component is the error clause of getting close, but when that kicks in can vary greatly from sequence to sequence. The cute idea of being able to look at a sequence and see roughly where it is converging just is not captured well in the reality of the equivalence classes.
More or less, one can think of a Cauchy sequence as generating intervals that contain the real number, but it can take arbitrarily long before the sequence gets to "small" intervals. So comparing two Cauchy sequences could be quite difficult. Contrast that with the rational numbers, where a/b ~ c/d if and only if ad = bc. This is a relatively simple thing to check if a, b, c, and d are comfortably within the realm of human computation.
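A small Python sketch of both points; the digit string, the padding length, and the helper names are my own toy choices, not anything canonical:

```python
from fractions import Fraction

# Decimal truncations of pi as a Cauchy sequence of rationals...
def pi_approx(n):
    digits = "31415926535"
    return Fraction(int(digits[: n + 1]), 10 ** n)

plain = [pi_approx(n) for n in range(8)]
# ...and the same real number hiding behind a long junk prefix of zeros.
padded = [Fraction(0)] * 10_000 + plain

# By contrast, rational equivalence is one finite cross-multiplication:
def rat_equiv(a, b, c, d):
    return a * d == b * c   # a/b ~ c/d  iff  ad = bc

assert rat_equiv(200, 300, 2, 3)
```

Both lists name pi, but nothing in the first ten thousand terms of `padded` gives that away.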
Dedekind cuts avoid this as there is just one object and it is assumed to be completed. This is unrealistic in general, though the nth roots are wonderful examples that make it all seem okay and explicit. But if one considers e, it becomes clear that one has to do an approximation to get bounds on what is in the lower cut. The (lower) Dedekind cut can be thought of as the set of lower endpoints of intervals that contain the real number.
My preference is to define real numbers as the set of inclusive rational intervals that contain the real number. That is a bit circular, of course, so one has to come up with properties that say when a set of intervals satisfies being a real number. The key property is based on the idea behind the intermediate value theorem, namely, given an interval containing the real number, any number in the interval divides the interval in two pieces, one which is in the set and the other is not (if the number chosen "is" the real number, then both pieces are in the set).
There is a version of this idea which is theoretically complete and uses Dedekind cuts to establish its correctness[1] and there is a version of this idea which uses what I call oracles that gets into the practical messiness of not being able to fully present a real number in practice[2].
> The equivalence class of Cauchy sequences is vastly larger and misleading compared to those of integers and rational numbers. You can take any finite sequence and prepend it to a Cauchy sequence and it will represent the same real number. ...
This can be addressed practically enough by introducing the notion of a 'modulus of convergence'.
> The equivalence class of Cauchy sequences is vastly larger and misleading compared to those of integers and rational numbers. You can take any finite sequence and prepend it to a Cauchy sequence and it will represent the same real number.
What's the misleading part of this supposed to be?
The equivalence classes of integers: pairs of naturals with (a, b) ~ (c, d) := (a + d) = (b + c).
The equivalence classes of rationals: pairs of integers with (a, b) ~ (c, d) := ad = bc.
It’s “easy” to tell whether two integers/rationals are equivalent, because the equivalence rule only requires you to determine whether one pair is a translation/multiple resp. of the other (proof is left to the reader).
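A literal Python transcription of those two rules (the sample pairs are my own illustrations):

```python
def int_equiv(p, q):
    # (a, b) ~ (c, d) for integers as pairs of naturals: a + d == b + c
    (a, b), (c, d) = p, q
    return a + d == b + c

def rat_equiv(p, q):
    # (a, b) ~ (c, d) for rationals as pairs of integers: a*d == b*c
    (a, b), (c, d) = p, q
    return a * d == b * c

assert int_equiv((5, 2), (7, 4))      # both pairs represent the integer 3
assert rat_equiv((1, 2), (3, 6))      # both pairs represent one half
assert not rat_equiv((1, 2), (2, 3))
```

Each check is a single comparison of finite data, which is the sense in which these equivalences are "easy".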
Cauchy sequences, on the other hand, require you to consider the limit of an infinite sequence; as the GP points out, two sequences with the same limit may differ by an arbitrarily large prefix, which makes them “hard” to compare.
We can formalise this notion by pointing out that equality of integers and rationals is decidable, whereas equality of Cauchy reals is not. On the other hand, equality of Dedekind reals isn’t decidable either, so it’s not that Cauchy reals are necessarily easier than Dedekind reals, but more that they might lull one into a false sense of security because one might naively believe that it’s easy to tell if two sequences have the same limit.
It is easy if you know the limits; if you don't, it's still true that two sequences {r_n}, {s_n} have the same limit if and only if the limit of the difference sequence {r_n - s_n} is zero, which conveniently enough is an integer and can't mess up our attempt to define the reals without invoking the reals.
That won't help you much if you don't know what you're working with, but the same is true of rationals.
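A numeric illustration with my own choice of target, sqrt(2) via Newton's iteration: two different-looking rational sequences whose difference sequence visibly shrinks toward zero.

```python
from fractions import Fraction

def newton_sqrt2(n):
    """n Newton steps toward sqrt(2), starting from 1; stays rational."""
    x = Fraction(1)
    for _ in range(n):
        x = (x + 2 / x) / 2
    return x

# Same limit, different prefix: the shifted sequence skips three terms.
diffs = [abs(newton_sqrt2(n) - newton_sqrt2(n + 3)) for n in range(1, 6)]
assert all(later < earlier for earlier, later in zip(diffs, diffs[1:]))
assert diffs[-1] < Fraction(1, 10**20)   # the difference sequence heads to 0
```

The comparison never needs sqrt(2) itself, only the rational differences, which is the point.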
I'm missing something as to this:
> equality of Dedekind reals isn’t decidable either
Two Dedekind reals (A, B) and (A', B') are equal if and only if they have identical representations. [Which is to say, A = A' and B = B'.] This is about as simple as equality gets, and is the normal rule of equality for ordered pairs. Can you elaborate on how you're thinking about decidability?
> Two Dedekind reals (A, B) and (A', B') are equal if and only if they have identical representations. […] Can you elaborate on how you're thinking about decidability?
Direct:
Make one of the sets uncomputable, at which point the equality of the sets cannot be decided. This happens when the real defined by the Dedekind cut is itself uncomputable. BB(764) is an integer (!) that I know is uncomputable off the top of my head. The same idea (defining an object in terms of some halting property) is used in the next proof.
Via undecidability of Cauchy reals:
Equality of Cauchy reals is also undecidable. The proof is by negation: suppose we had a procedure that decides whether a real is equal to zero, and consider a sequence (a_n) with a_n = 1 if Turing machine A halts within n steps on a given input, 0 otherwise. This is clearly Cauchy, but if we can decide whether it's equal to 0, then we can decide HALT.
Cauchy reals and Dedekind reals are isomorphic, so equality of Dedekind reals must also be undecidable.
Hopefully those two sketches show what I mean by decidable; caveat that I’m not infallible and haven’t been in academia for a while, so some/all of this may be wrong!
> BB(764) is an integer (!) that I know is uncomputable
I meant BB(748) apparently.
To elaborate on this point a bit, I specifically mean that its value cannot be determined in ZFC. There may be other foundations in which it can be, but we can just find another n for which BB(n) is independent of that framework, since BB is an uncomputable function.
Your method for deciding whether two rationals are or aren't equal relies on having representations of those rationals. If you don't have those, it doesn't matter that there's an efficient test of equality when you do.
But you're arguing that equality of Dedekind reals is undecidable based on a problem that occurs when you define a particular "Dedekind real" only by reference to some property that it has. If you had a representation of the values as Dedekind reals, it would be trivial to determine whether they were or weren't equal. You're holding them to a different standard than you're using for the integers and rationals. Why?
Let's decide a question about the integers. Is BB(800) equal to BB(801)?
The important point for me is that the equivalence relation for Cauchy sequences is part of the definition of real numbers as Cauchy sequences. This ought to imply that one has to be able to decide the equivalence of two sequences for the definition to make sense. For Dedekind cuts, the crucial aspect is being able to define the set, and that is something that can be called into question. But if that is done, comparing two Dedekind cuts is just a computational question, not a definitional one.
The intuition of a sequence is that the terms get closer to the convergence point. Looking at the first trillion elements of a sequence feels like it ought to give one some kind of information about the number. But without the convergence information, those first trillion elements of the sequence can be wholly useless and simply randomly chosen rational numbers. This is an "of course", but when talking about defining a real number with these sequences, as opposed to approximating them, this gives me a great deal of unease.
In particular, it is quite possible to prove a theorem that a sequence is Cauchy while there is no way to explicitly figure out N for a given epsilon. The sequence is effectively useless. This presumably is possible, and common, when using the Axiom of Choice. One can even imagine an algorithm for such a sequence that produces numbers and eventually converges, but the convergence is not knowable. Again, if this is just approximating something, then we can simply say it is a useless approximation scheme. But defining real numbers as equivalence classes of Cauchy sequences suggests taking such a sequence seriously, as if it is in some sense the answer.
In contrast, with the integer and rational number versions, it is quite immediate how to reduce them to their canonical forms, assuming unlimited finite arithmetic ability. For example, 200/300 ~ 2/3, and one recognizes that 200/300 and 2/3 are different forms of what we take to be the same object for most of our purposes. There is no canonical Cauchy sequence to reduce to, and concluding that two sequences are equivalent could take a potentially infinite number of computations/comparisons. While that is somewhat inherent to the complexity of real numbers, it feels particularly acute when it is something that must be done in defining the object.
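The canonical-form reduction can be sketched in a few lines of Python (the sign convention and function name are my choices):

```python
from math import gcd

def canonical(a, b):
    """Reduce a/b to lowest terms with a positive denominator."""
    g = gcd(a, b)          # math.gcd ignores signs and returns a non-negative gcd
    if b < 0:
        a, b = -a, -b      # push any sign onto the numerator
    return (a // g, b // g)

assert canonical(200, 300) == (2, 3)
# Equivalence by comparing canonical forms, a finite computation:
assert canonical(200, 300) == canonical(2, 3)
```

Nothing analogous exists for Cauchy sequences, which is the contrast being drawn.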
Dedekind cuts have the opposite problem. There is only one of them for an irrational number, but it is not entirely clear what we would be computing out as an approximation, particularly if the lower cut viewpoint is adopted.
Intervals, on the other hand, inherently contain the approximation information. By dividing them and picking out the next subinterval, one also has a method for computing a sequence of ever better approximations. I suppose one could prove the existence of the family of containment intervals without explicitly being able to produce them, but at least the emptiness of the statement would be quite clear (nothing is produced), in contrast to sequences that could produce essentially meaningless numbers for an arbitrarily large number of terms.
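A minimal sketch of that subdivision picture, with sqrt(2) as my example target: halve the interval, keep the piece that still contains the number.

```python
from fractions import Fraction

def bisect_sqrt2(steps):
    """Repeatedly halve [1, 2], keeping the half whose endpoints bracket sqrt(2)."""
    lo, hi = Fraction(1), Fraction(2)
    for _ in range(steps):
        mid = (lo + hi) / 2
        if mid * mid <= 2:   # sqrt(2) lies in the upper piece
            lo = mid
        else:                # sqrt(2) lies in the lower piece
            hi = mid
    return lo, hi

lo, hi = bisect_sqrt2(20)
assert lo * lo < 2 < hi * hi              # still an interval containing sqrt(2)
assert hi - lo == Fraction(1, 2 ** 20)    # width halves at every step
```

Every interval produced is a genuine approximation with a known error bound, which is exactly the information a bare Cauchy sequence can withhold.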
The claim would be that prices rise, offsetting the cost to consumers who are also workers. It does not do much if that is how it shakes out. The stated hope of those arguing for the raise is that the people at the top of the company make a lot of money and that money would be diverted to the workers. While I am sure that happens sometimes, I am not aware of evidence that this is often the case.
The $10-a-burger price increase would be a stupid prediction unless the worker sells just one burger an hour and nothing else. If an employee's share of sales amounted to, say, 10 burgers an hour on average, then a $1 price increase on burgers would counter the wage raise.
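The arithmetic, taking the $10/hour raise implied by the $10-a-burger figure as a stipulated example:

```python
wage_raise_per_hour = 10.0       # hypothetical $10/hour wage increase
burgers_per_worker_hour = 10     # stipulated average output per employee

price_increase_needed = wage_raise_per_hour / burgers_per_worker_hour
assert price_increase_needed == 1.0    # $1 per burger covers the raise
# The $10-per-burger prediction only holds at one burger per hour:
assert wage_raise_per_hour / 1 == 10.0
```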
I agree. Also, the cost of the burger includes the cost of labor that may be at minimum wage, but also the cost of the meat and the bread. That includes the salary of the farm workers who may be at minimum wage, but also the salary of the genetic engineer making transgenic wheat and the salaries of the oil field extraction employees, who usually have big salaries.
The people receiving $750 are neither all workers nor consumers of the businesses being taxed.
Sure, in aggregate I bet at least one Oregonian is a consumer/worker of any business with >$25M of income, but if you don't spend at least $25k ($750 * (1/0.03)) at businesses making >$25M, then you come out ahead even if prices go up by 3% at _just_ those specific businesses.
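The break-even figure checks out; a quick verification with exact arithmetic (`Fraction` avoids float rounding on the 3%):

```python
from fractions import Fraction

rebate = Fraction(750)          # the $750 payment
tax_rate = Fraction(3, 100)     # the 3% rate on gross receipts

breakeven_spend = rebate / tax_rate
# Spend less than $25k at the taxed businesses (assuming a full 3%
# price pass-through) and the rebate leaves you ahead.
assert breakeven_spend == 25_000
```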
The US government had about a trillion in deficits before the pandemic, then spiked up to about 3 trillion in each of 2020 and 2021 [1]. That is a sudden bump of 4 trillion dollars over those two years. The past two years seem to be about 1.5 trillion each, and the inflation rate has decreased.
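Spelling out where the 4 trillion figure comes from, using the comment's approximate numbers:

```python
baseline = 1.0                       # ~$1T/year pre-pandemic deficit
pandemic = {2020: 3.0, 2021: 3.0}    # ~$3T each, per the figures above

bump = sum(d - baseline for d in pandemic.values())
assert bump == 4.0                   # $4T of extra deficit over two years
```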
The theory is that printing money to spend leads to raising prices. Somebody gets that excess money so they can buy more of whatever they need, maybe start new projects or whatever and this bids up prices. Since this is not based on removing the ability for others to buy what they want (no tax increase) then the overall demand goes up. If you also have supply restrictions while spiking demand, it is natural for prices to rise dramatically.
This creates demands for rising wages which, after much agony, more or less get everything back to the same actual price levels, though usually with a range of uncompleted projects and a decent chunk of people impoverished while some got very rich. The particulars depend on whatever sector was inappropriately stimulated at the start of the process.
The Austrian school of economics is a good place to learn more.
Taxes are an explicit mechanism to say who the government is taking money from in order to give money to whomever they are giving it to. Deficit spending is an implicit way of taking money as determined by the market which means those with the least power are likely to lose the money.
The pandemic had about a trillion going directly to people. The rest went elsewhere.
There is also the issue of this is US spending versus global spending. Not sure what other countries did and also unsure how much the US dollar being the main global reserve currency factors into this.
This thread is a magnet for people who not only didn’t read the article but didn’t even read the headline, or are just continuing to generally repeat the same arguments despite the article contradicting them, without actually making an argument why the article is wrong.
Do you have any good recommendations for a serious dive into the trade-offs of a socialist/communist system that comes out pro-socialist, one that takes into account the analysis of Austrian economics as well as the historical tragedies of those (Lenin/Stalin, Mao, Pol Pot, ...) who have claimed to rule as socialists?
In ideal capitalism, the basic bargain is increased wealth for all at the cost of there being losers and extreme winners along with failures of endeavors being an essential ingredient to overall societal success. It is based on choices being made by individuals to maximize what they value given the options at hand.
What is the bargain in ideal socialism? Historically, it looks like equality for the non-elites is prioritized over wealth creation. Is there a different trade-off being proposed? There are, after all, no solutions, just trade-offs.
In the context of this thread, what would the options in the computer market be in an ideal socialist society? Would there be many choices? Who would make those decisions? Who would make the products and why them? How would one determine how to distribute the product? Why would innovation happen, particularly how would one idea triumph over a differing one, particularly over a status quo idea that pushes out entrenched interests?
Would dissent be allowed? How would it be addressed?
I would also be interested in an honest analysis of environmental stewardship questions of socialist versus capitalist. From what I have read, environmental destruction has been way worse under government management (Soviets were notorious, but also the US military and its private contractors) than private management unless that private management is somehow being directed and/or protected by the government. Is there a reason to believe socialists would be good stewards of the environment? What is the incentive structure of the people in the system to make that happen? Under a true capitalist system, it is not getting sued for destroying other people's properties as well as the ability to profit off of well-managed lands.
Also, under anarcho-capitalism, voluntary associations that can be categorized as socialism/communism can easily arise and would be perfectly acceptable, but the reverse is not true. Why would socialist voluntary associations not be sufficient, i.e., why would enforced collectivization be necessary and morally correct? In a certain sense, anarcho-capitalism is what emerges when the right of secession is taken to the individual level while global socialism is explicitly disallowing any secession. Why is uniform collectivization of all humans under a single controlling entity a good thing?
Again, I am hoping for a reference accepted by pro-socialists that thoughtfully addresses these questions.
Because there was a basic set of primitives that could give rise to an almost infinite number of possibilities. Multiple paths and iterative forces lead to a very diverse and thriving evolutionary landscape. It has become gloriously messy.
It would be interesting if those studying biological evolution could see how much of their techniques, theories, and predictive abilities could apply in this realm.
The medical system is not grossly unregulated. It is not centrally regulated in DC though there is a lot of regulation from there. It is states and localities that regulate it. For example, in some places, a medical professional can add an MRI without much more permission than safety concerns. But in most places, it needs to be negotiated with the various hospital systems, etc. This obviously drives costs up. Ambulance companies need to get permission from other ambulance companies to add an ambulance. These are just a couple of the examples I have come across, but I am sure there are many other examples.
Government in the US pays 60% of the medical costs in the US. State licensing applies to all levels of healthcare, not just doctors. Doing minor in home care requires some kind of nursing credential with a supervising doctor-type.
The common experience of patients is that they are interacting with private health systems and insurance companies. These systems are grotesquely obscure with prices and it feels like one is being fed to the wolves of capitalism unlike in almost any other market experience in the US. This makes it feel like it is unregulated. But, of course, this can only happen because the government has, by force of law and gun, created a medical cartel, limiting options, prices, and divergent practices.
It depends on whether one is referring to the date (see you July 4th) or the holiday (Can't wait for the 4th of July). Though I do think the date version might be becoming more common than the holiday version.
As for why the month comes first, I get the feeling we Americans care about precision up to a month and less so about the numerical day such as "When does school start?" with an answer of "School starts in September". Obviously people need the precise date if you actually have a student going, but in a lot of conversation, people are just looking for the month. I can imagine that being even more true in earlier times.
We also peg things to months, such as our elections being the first Tuesday after the first Monday of November or Labor Day being the first Monday of September. Note that in those phrases we put the day kind of first, but not in a particular numerical way. The month is what sticks out as a rough guesstimate of the time period.
For me, I would view UBI as a way to transition off of bureaucratic control of people's lives. The simplest in my mind would be a flat 10% income tax for it which would then be divided up equally across to all citizens, leading to about $6,000 (20 trillion economy, 10% is 2 trillion, divided across 330 million) a year and a break even at about $60,000 in income. While $6,000 may not sound like a lot, if one were to share living costs with three other people, then that is $24,000 in cash and there are plenty of places in this country (I live in a metro that is such a place) where one could live on that amount, potentially revitalizing less crowded places. When you factor in the fact that one can work without losing it, unlike welfare, then it becomes plausible to see it as a stepping stone to being more productive without being too much of a hindrance. If there is to be a social safety net, this seems the least damaging.
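The arithmetic behind those figures, using the comment's round numbers:

```python
gdp = 20e12           # ~$20T economy
tax_rate = 0.10       # flat 10% income tax
population = 330e6    # ~330M citizens

pool = gdp * tax_rate                 # ~$2T raised
per_person = pool / population        # annual dividend per citizen
breakeven = per_person / tax_rate     # income at which tax paid equals dividend

assert 6000 < per_person < 6100       # ~ $6,000/year each
assert 60_000 < breakeven < 61_000    # ~ $60,000 break-even income
assert 24_000 < 4 * per_person < 24_500   # ~ $24,000 for four people pooling
```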
The concept of UBI is not a clear proposal without the funding mechanism. Many proponents unfortunately talk about the paying out without figuring out how to pay in. The recent attempt with the child tax credit increase, for example, was part of a time period of massive deficit spending and led to inflation, undercutting much of the benefit. MMT + UBI would be a feel good path to economic devastation.
A related idea would be to also eliminate public schools and convert that already property taxed money to direct payments to families of children. In my city, we spend about 16k per student. A quarter of that is for special needs so setting that aside, we would have 12k for the typical child. Funding children instead of bureaucrats. Grouping 20 children would lead to 240k which could easily fund 4 full-time teachers / staff for 40k each with 80k leftover for space costs. Compare this to 30 students per classroom in a decrepit buildings as is the case in my school district with documented, horrible failed learning outcomes.
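Checking the school-funding arithmetic with the per-student figures quoted above:

```python
per_student = 16_000             # the city's per-student spending
special_needs_share = 0.25       # quarter set aside for special needs
general = per_student * (1 - special_needs_share)
assert general == 12_000.0       # left per typical child

group_funding = 20 * general     # pool for a group of 20 children
staff_cost = 4 * 40_000          # four full-time teachers/staff at $40k each
leftover = group_funding - staff_cost
assert leftover == 80_000.0      # remaining for space costs
```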
The ultimate goal for me is the elimination of the government, but getting from here to there in a stable fashion is difficult. My thought is that simple, direct cash payments is a way to eliminate the bureaucrats, try to heal the harms caused by a century of economic government meddling, and eventually lead to a populace that doesn't think it needs a nanny state.
At Sudbury schools, reading occurs naturally for students, but the age at which it happens ranges from 4 to 12, with 7-9 being typical. Anecdotally from these schools, there seems to be little correlation between early reading and a love of reading. In fact, the ones who learn to read later often seem to love reading more than the ones who read earlier.
The first books, of any kind, my daughter read were Harry Potter books. She went from not reading at age 8, to reading chats with her friends at age 9 (pandemic time), to having read the entire 7 book series by the end of age 10. No one taught her how to read. Writing, using devices, arithmetic, she largely acquired them all on her own as she matured.
While I am biased, she seems well-adjusted and quite bright. I have no doubt that she is capable of mastering whatever she would like to master.
Note that the article was looking at a cohort study. It was not based on randomly assigning "pleasure reading in early childhood" which means cause and effect cannot be ascertained. My guess is that if they were to measure lifelong Sudbury students they would find well-developed cortical areas regardless of early childhood reading or not. The development probably relates to constructing complex understanding of the world (or fictional worlds) which independent play as well as reading can do, but conventional teaching cannot.
1: https://github.com/jostylr/Reals-as-Oracles/blob/main/articl... 2: https://github.com/jostylr/Reals-as-Oracles/blob/main/articl...