
Fake economics to "stable money causes Hitler" in two sentences.

Countries used gold or metals either directly as currency or to back paper currency for hundreds of years. If their governments overspent, they collapsed.

Saying you need inflation or you get Hitler is pretty wild.


Those hundreds and thousands of years were basically before humanity learned that prosperity could exist.

No growth that whole time, other than when you went to war and expanded your gold supply by taking somebody else's.

The positive-sum present is much better than the zero-sum past.


> Those hundreds and thousands of years were basically before humanity learned that prosperity could exist.

No, it lasted pretty much up until the early 1900s.


> Saying you need inflation or you get hitler is pretty wild.

Not really. You need inflation, or you end up with stagnation.

https://en.wikipedia.org/wiki/W%C3%B6rgl#The_W%C3%B6rgl_Expe...


Your proof that the economic system that worked for hundreds of years doesn't actually work is a small town that issued its own currency, which allowed them to nominally pay people less, and there were more jobs?

That's what inflation does: everyone gets paid less, but they don't realize it because they just see the same number coming in and more going out.

And then because of that, somehow non-inflationary currency results in Hitler in 8 years (even though it was used for hundreds of years).

I think you should look at the other end, which is governments going broke and not being able to inflate their way out of it.


> Your proof that the economic system that worked for hundreds of years

It did not. The economic system that was in place for hundreds of years kept the population in a permanent state of depression. Most people had barely enough money to buy necessities.

Economic growth exploded, surpassing _millennia_ of innovation within mere _decades_, once we got rid of the golden chains.

But I guess you fancy yourself landed gentry and not a landless serf?


> kept the population in a permanent state of depression.

Says who?

> people had barely enough money to buy necessities.

Where are you getting that? I think you're confusing and conflating all sorts of things, like human advancements with inflationary money. I'm sure what you're saying makes sense if you think every invention, discovery, and gain of knowledge came from people having their money inflated, but there isn't any evidence of this; it's just you restating it.

> Economic growth exploded, surpassing _millennia_ of innovation within mere _decades_, once we got rid of the golden chains.

That really only happened in the 70s; it wasn't exactly the Dark Ages before that.


> Says who?

Err... Have you ever studied history?

> Where are you getting that?

From learning the basics of history? Before the 1900s, famines were common and most people were illiterate.

> That really only happened in the 70s, it wasn't exactly the dark ages before that.

Got it. You live in an imaginary world.

In reality, the first inflation happened when European civilization established contact with South America. It rapidly expanded the monetary base and provided an impetus to economic development.

Then paper money (and equivalent debt-based mechanisms) were invented and became popular, and this turbocharged everything.


> Err... Have you ever studied history?

I just explained history to you. You realize being patronizing and repeating yourself isn't evidence, right?

> From learning the basics of history? Before the 1900s, famines were common and most people were illiterate.

Do you think the printing press, the engine, and large-scale agriculture might have more to do with food and literacy than inflation?

All those problems were solved before the US went off the gold standard anyway.

> In reality, the first inflation happened when European civilization established contact with South America. It rapidly expanded the monetary base and provided an impetus to economic development.

You realize this doesn't make sense, right? This idea that printing money somehow leads to all human advancement is bizarre, even for someone who is ignoring all the things I said before. If two cultures crossed oceans, do you think the advancement might come from international trade, or do you think some tribe in the jungle could make up a currency, start printing it off, and then go straight on to skyscrapers and video games?

All this is just a Gish gallop anyway; you never bothered to explain how not having inflation "causes Hitler in 8 years" even though it's only been the last 50 years that we have been off backed currency.

You also didn't confront the fact that anyone can put their money in stocks, bonds, or gold and essentially have a deflationary currency, but they still spend money anyway.


That makes no sense. Do you think people are talking about using gold coins instead of paper money in their wallet?

In effect, some of them are. Look how many decry the death of Bretton Woods. They don't necessarily want to have gold coins on them, but they want them to be tethered as a proxy. Why?

> In effect, some of them are.

No, not in effect. No one is talking about physically exchanging gold coins, so it doesn't make sense to say "there isn't enough".

> but they want them to be tethered as a proxy. Why?

That's a completely separate issue. People don't want to deal with inflation, where they're on a treadmill of needing to get paid more because companies can raise their prices so easily.


> No, not in effect, no one is talking about physically exchanging gold coins, so it doesn't make sense to say "there isn't enough".

I think my point still stands. If you can't create more gold [1] and you say "I am pegging 35 dollars to an ounce of gold," then you are limiting the amount of dollars to the mass of gold [2]. The money works as an IOU for gold. You aren't exchanging pieces of gold, but having $35 is as if you have 1 ounce of gold, _and that is what people pine for_. It's effectively an IOU for 1 ounce of gold that you keep in a safe. Yes, I think the point still stands that, in effect, that's what people are asking for.

[1] - You can, but it's a much slower process, definitely rate-limited; it's actually what people really seem to like about gold.

[2] - Yes, I am aware of the concept of fractional reserve, but that's exactly the part that goldbugs want to avoid.
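To make [1] and [2] concrete, a quick back-of-the-envelope sketch (Python; the $35/oz figure is the historical Bretton Woods peg, while the reserve mass and cover ratio are purely illustrative assumptions):

  # A hard peg caps the money supply at peg rate x reserve mass.
  PEG_USD_PER_OZ = 35             # Bretton Woods-era peg
  gold_reserves_oz = 261_000_000  # illustrative; roughly US Treasury holdings

  fully_backed = PEG_USD_PER_OZ * gold_reserves_oz
  print(f"Dollars issuable at 100% backing: ${fully_backed:,}")
  # -> Dollars issuable at 100% backing: $9,135,000,000

  # Fractional reserve (what [2] alludes to) multiplies that by 1/cover:
  gold_cover = 0.40  # hypothetical 40% cover requirement
  print(f"At {gold_cover:.0%} cover: ${fully_backed / gold_cover:,.0f}")
  # -> At 40% cover: $22,837,500,000

Either way, the dollar count is a hard function of the gold in the vault, which is exactly the property being argued over.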


> I think my point still stands. [...] I think the point still stands

I'm not sure what point you are making; it seems like you're just describing a gold-backed currency and repeating the same things multiple times.

If you aren't using gold directly, how would there ever be "not enough" unless a penny became worth too much? People did this for hundreds of years; this isn't some theory or experiment, it's basically how the world worked for most of human history.


> If you aren't using gold directly how would there ever be "not enough" unless a penny became worth too much?

If you peg an amount of money to a mass of gold, would you or would you not limit the amount of money in existence?


It worked before, and you do know money is divisible, right?

There is already a quantifiable amount of money in existence; can we at least establish that?


Mild inflation is a good thing, at least according to modern economists. In a deflationary environment, money is worth more when you don’t spend it. And when people don’t spend it, there’s no economic growth. It’s just like having very high interest rates, except the central bank cannot act to lower them.

This thread was someone saying you can't peg a currency to something else because "there isn't enough of it," which is nonsense.

> there’s no economic growth

Printing money doesn't create economic growth, it just inflates assets and depresses nominal wages.

> money is worth more when you don’t spend it

People say this stuff like it's gospel, but anyone who understands currency dynamics in the first place would have their money in investments, which should already appreciate and act like a deflationary currency. People can already buy stocks and leave money there to get more valuable, so why does anyone spend money now?

You also have to figure out why it already worked for hundreds of years. People act like it would be an experiment. Floating currency is the experiment, and it has lasted 50 years so far. Currency has lost almost all of its value from before the 70s, and the minimum wage is worth a fraction of what it was while asset prices are sky high; then people wonder why people can't afford a house or beef or gas or just to live alone.


Economic growth comes from spending. Households must be incentivized to spend either through inflation or low interest rates.

Buying stocks hoping that it would appreciate doesn’t work when there is no economic growth. So we are back to square one.

And for hundreds of years we didn’t have the same kind of international trade, or the same financial markets. One must wonder whether a new kind of currency must accompany a new era of economy and trade.

Currency losing almost all its value is by design. Modern economists target a 2% inflation rate. This means currency is supposed to lose value. It’s another mechanism to encourage spending to increase economic growth.
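As a back-of-the-envelope illustration of that design (a sketch; 2% is the target mentioned above, the horizons are arbitrary):

  # Purchasing power of $1 after n years of steady inflation.
  def purchasing_power(rate, years):
      return 1 / (1 + rate) ** years

  for years in (10, 30, 50):
      print(f"{years} years at 2%: ${purchasing_power(0.02, years):.2f}")
  # -> 10 years: $0.82, 30 years: $0.55, 50 years: $0.37

Even at the "mild" target, cash loses roughly two-thirds of its value over a working lifetime, which is the spending incentive being described.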


> Economic growth comes from spending.

Who told you that?

> Households must be incentivized to spend

Says who? What about governments and companies? People are already incentivized to spend because they need things.

> Buying stocks hoping that it would appreciate doesn’t work when there is no economic growth.

You're contradicting yourself and going around in circles. If there is "economic growth" according to you, then stocks will go up, which means they end up being a deflationary currency, which means people will put their money there and not spend it.

They already go up due to inflation; people do buy stocks and other liquid assets, and people still spend money anyway.

Also, gold still exists. By your own logic, because anyone can still buy gold or gold futures they should park their money there and never spend it.

> Currency losing almost all its value is by design.

It is by design by governments and for governments. No person wants an inflationary currency; governments want it because they can borrow and print money they don't have and hand it out to people who in turn help with political power.

People don't want their currency to inflate away unless they own a business that can raise prices while their employees make less nominally.


There is already lots of popular software that violates any concept of good software: Facebook Messenger, Instagram, Twitter, Minecraft, balenaEtcher, the original Ethereum wallet, almost anything that uses Electron...

Are you sure that isn't wifi interference?

In most places in Asia, this is due to massive oversubscription. No relation at all to wireless spectrum.

That's easy to claim, but there are a lot of places where everyone is surrounded by everyone else's wifi routers. If you have 9 routers that you share walls with and even more that can reach you, wifi starts to break down, but people will blame their service provider.

I've been there a bunch; my colleague has lived there. We work in the telco area. My own experiences I would question; his I don't.

It's oversubscription.

Can I provide citations or proof? No. That's extremely hard to do with oversubscription in general; no telco will admit their exact ratio without being forced to. Sometimes you can reverse engineer it from peering relationships, but that doesn't allow identifying bandwidth constraints on medium haul.


> It's oversubscription.

You said that already, but repeating something isn't evidence. Oversubscription would be something that happens with cable internet on a node-by-node basis, so saying the problem is one thing and only that doesn't make any sense. Not only that, but people will sign up for hundred-megabit to gigabit internet when they only really need to watch a few streams that use 3 Mbit/s each.

You can actually figure out oversubscription if you ping nodes, especially over time.
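A minimal sketch of that approach (Python, assuming a Unix-style `ping` on the PATH; the target is a placeholder, e.g. your ISP's first hop from a traceroute). Run it from a wired machine so your own wifi can't skew the numbers; if the medians climb every evening, that points at congestion upstream of your LAN:

  # Log round-trip times around the clock; compare evening vs off-peak medians.
  import re, subprocess, time
  from datetime import datetime

  TARGET = "192.0.2.1"  # placeholder: your ISP's first hop

  def rtt_ms(host):
      """One ping; returns RTT in ms, or None if the packet was lost."""
      out = subprocess.run(["ping", "-c", "1", host],
                           capture_output=True, text=True).stdout
      m = re.search(r"time=([\d.]+)", out)
      return float(m.group(1)) if m else None

  while True:
      print(datetime.now().isoformat(timespec="seconds"), rtt_ms(TARGET))
      time.sleep(60)  # one sample per minute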

There are more factors, like international bandwidth, a lack of caching servers in smaller countries (so bandwidth has to be international), cable signal levels, etc.

None of these come close to wifi contamination. If you have two neighbors trying to watch TV over wifi and you're trying to watch TV over wifi, you're sunk. Now take that to being surrounded by a dozen people, all watching TV over wifi and all watching videos on phones and tablets.

Unless someone is in a house, more isolated from their neighbors, it is going to be a much bigger problem for almost everyone.

You can say 'oversubscription' because your buddy said that, but even that can have some truth while still being a marginal issue next to the real problems. Even in places with great internet, people get a single wifi router, put all their computers and TVs on it, then blame their ISP.


You seem to be making arguments with significantly less of a connection to the actual scenario being discussed. And you're explaining oversubscription to 2 network engineers, one of whom has presented at APRICOT and APNIC.

I guess I should've focused less on oversubscription and made clear that we know it's not spectrum utilization. For that, we have the equipment to measure, and we did, and it's not the problem.


> For that, we have the equipment to measure, and we did, and it's not the problem.

It has been a problem for basically everyone I've seen living in apartments with network problems. If you measure at the wrong time it's going to look fine. You have to be there when people are watching video over wifi.

Again, just because people can't get their full bandwidth, it doesn't mean oversubscription is the actual bottleneck.


Dude, how oblivious are you? You're {man,nerd}splaining. Hard. Did you miss the "network engineer" part? Do you think our first step in debugging performance issues seen on wifi would be anything other than grabbing a cable?

> Did you miss the "network engineer" part?

I get that you don't want to actually confront what I'm saying by pulling out a label and doing the appeal to authority routine, but that isn't real evidence or information.

You're still ignoring that two things can be true without being equal contributors when you take a bird's-eye view of entire countries.

You also aren't explaining why wifi wouldn't be the primary bottleneck when you're surrounded by dozens of routers with shared walls and everyone is watching multiple video streams.

If you go into an apartment 10 floors up anywhere with internet, you see dozens of wifi networks.


So sorry, you're right, wifi interference was slowing down the wired ethernet connection on the router that had no wifi.

/eot


It seems like you're taking a single instance and generalizing it to millions of people.

Not a network engineer by any measure - but I think if it was wifi contamination, it wouldn't get worse in the evenings. The routers are on 24/7. Thoughts?

That's why professionals don't call this "wifi contamination"; we call it spectrum utilization/congestion. The problem isn't the number of wifi APs; the impact of beacons (i.e. idle APs) is, while not zero, quite limited and only visible in extreme cases. The actual problem is traffic, which consumes available spectrum when being carried.

It's a function of RF bandwidth, time, and space, with some non-obvious parts:

- setting your TX power too high makes you consume spectrum in a larger area, harming your neighbors. Don't yank the TX power to maximum just because it "feels" like that should be better; there is no difference between MCS (= speed/rate) 11 with 10 dBm headroom and MCS 11 with no headroom, you get ≈120 Mbit either way.

- conversely, using old APs or devices, or stretching the wifi connection too far, consumes excessive spectrum, since you'll get a bad data rate and use much more time to convey the same data (rough airtime numbers in the sketch below this list). Due to this, a repeater can in fact improve performance for devices not even using it, by getting rid of low-MCS traffic.

- don't use wide channels when you don't need the performance. A 160MHz channel means 160MHz width of picking up interference. While chipsets are somewhat intelligent about this, if you're fine with ≈200 Mbit (single MIMO stream) there's no point in going wider than 40 MHz.

- multicast is death. It's a very common misconception that wifi requires you to send multicast traffic at the lowest possible rate. It doesn't, but almost all low-end implementations are lazy and do just that. "Lowest possible" in this case means the lowest rate the BSS is configured to support. If you have 802.11b enabled, that's generally 1 Mbit/s. Disable that, and you get the lowest 802.11n rate, which is ≈6.5 Mbit/s. If you need to deal with a lot of multicast, disabling some low MCSes might also be worth it to raise that even further, but then those MCSes are no longer available to cover far-away devices. But then again, you may not want that to begin with (see above).

- it's highly dependent on building characteristics; thick stone/steel walls block much more RF energy than drywall or wood.

- if you can, just use cables. If it doesn't help you, it might still help your neighbors.

So… yeah, a lot of people consume media in the evening, and that does make it much worse.
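To put rough numbers on the airtime point from the list above (a sketch; the rates used are the 802.11b floor, the 802.11n floor, and the ≈120 Mbit MCS 11 figure from the TX power bullet, ignoring preamble/ACK overhead):

  # Airtime needed to move one ~1500-byte frame at different PHY rates.
  # Slow senders hold the shared spectrum longer, starving everyone nearby.
  payload_bits = 1500 * 8

  rates_mbps = {
      "802.11b multicast floor": 1.0,
      "802.11n lowest rate": 6.5,
      "MCS 11 (~120 Mbit)": 120.0,
  }

  for name, mbps in rates_mbps.items():
      airtime_us = payload_bits / mbps  # Mbit/s == bits per microsecond
      print(f"{name}: {airtime_us:.0f} us per frame")
  # -> 12000 us, ~1846 us, and 100 us respectively

Same payload, two orders of magnitude difference in how long the channel stays busy; that's why one device stuck at a low rate can drag down an entire channel.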

P.S.: MCS indices: https://mcsindex.com/


Thanks for that. I thought the AP being on 24/7 was enough, but it makes sense that actual traffic is what makes the difference.

> it wouldn't get worse in the evenings

Why not? People come home and start using their internet. They watch TV over wifi, use their PCs, watch videos on their phones; everyone uses what they have at the same time.

Thoughts?


I was going to respond on topic, but you might be a bit too snarky for my taste

Was the "snark" using your exact words?

Love2D uses LuaJIT and directly calls established game libraries. The CPU usage should be far better for 2D games; LuaJIT is faster than a browser's JavaScript JIT. You can also create single-exe games that are a few megabytes and not a few hundred megabytes.

There is a lot more that goes into Love2D than just SDL. It uses many other libraries for sound, image loading, etc., as well as LuaJIT, so the Lua runs very fast and has a super easy C FFI.

But SDL already provides an API for all the things you listed. So I am assuming the libraries in Love2D still call those underlying SDL APIs, right?

Love2D uses OpenAL for audio, FreeType 2 for fonts, DevIL for image loading, and Box2D for physics. It can also use image fonts. It uses LuaSocket for networking and has a compression API built in.

On top of that there are love2d specific libraries people have written to deal with 2D games like GUIs and tile libraries.

Then there is the ease of debugging, where you can use Lua to get runtime access to the table of variables and print them on screen if you need to, not to mention dynamically loading new update, draw, and input functions.

This is all to say that just downloading SDL is not going to get anywhere close to what Love2D includes.


Cool stuff!

What other "datasets" are you talking about? How do you "solve a dataset" ?

You solve a dataset when you learn what there is to learn about the phenomenon of interest. The limit of such a phenomenon is "cure all disease", and clearly this is not solving that.

What are you talking about? "the phenomenon of interest"? There is nothing you wrote in either comment that makes sense.

What is a "dataset" that has been "solved" and what did the program do that 'solved' it?


MNIST (the handwritten digit classification task) has been “solved” a billion times, and it is hard to imagine any subsequent advances there, as scores using a variety of methods have hit the saturation point of accuracy. Any further improvements are likely overfitting to noise. Therefore, we know that it is easy to detect handwritten numbers. However, we may not know how to detect other things as well, like reading an MRI. Those datasets/tasks are clearly different and require different techniques. Training an LLM is likewise different.
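For a sense of how saturated digit classification is, even a boilerplate linear model nearly maxes out sklearn's small 8x8 digits set, used here as a stand-in for MNIST (a sketch; the score is approximate):

  # Off-the-shelf logistic regression on a small MNIST-like dataset.
  from sklearn.datasets import load_digits
  from sklearn.linear_model import LogisticRegression
  from sklearn.model_selection import train_test_split

  X, y = load_digits(return_X_y=True)
  X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
  clf = LogisticRegression(max_iter=2000).fit(X_train, y_train)
  print(f"test accuracy: {clf.score(X_test, y_test):.3f}")  # ~0.96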

> has been “solved” a billion times

If it was really solved, wouldn't it just need to happen once?

You think classifying handwriting of 10 digits is the same as this, which took someone 55 hours of GPU time to go through?

I have no idea what point you're trying to make, and I can't tell if you do either. You were talking about "solving" other "health datasets" but you can't even come up with one or explain what that means.


If you want to be literal with language, then do you ever really “solve” anything? Even tying your shoes is not solved. One day you may tie them better, but for practical purposes we can say it is solved.

Likewise, you can spend 55 hours of GPU time to produce very different things. Can those 55 hours cure cancer? Definitely not. Can they pick up correlations with a small subset of proteins that are perhaps not representative of practical problems? Probably. Can they learn a pattern to tie your shoes, given all your life experiences tying them? Sure.

I asked the question to determine the impact of the task and dataset. Curing cancer is huge; tying shoes is not. What are the strengths and limitations?


> If you want to be literal with language, then do you ever really “solve” anything?

You are the one who said it, and you can't even explain what you meant; you just get mad that anyone would ask.


Since I am hitting the reply depth: You “solve” a dataset or task when you translate some model into actual real-world problems by creating a model that actually “works” (not just high accuracy). Otherwise, what is the point of training the model, other than writing blog posts? Second to that, you can train a model that performs well on the dataset but is less useful in the real world.

This is a health dataset; there are many inputs and outputs to health (e.g., cell level, protein level, tumors, organs, etc.). In this case, it is mRNA-focused, which is a broad category that potentially translates to immune responses like vaccines (exactly what kind of therapy, I’m not sure, other than “25 species”). Once the model is trained, you can use it to solve real problems, perhaps to develop a therapy that makes its way to clinical trials and eventually actually treats some disease. The model by itself is useless without the ability to have that impact.

So for other examples, take any disease (e.g., COVID-19), create a dataset to mirror that problem using some technique (e.g., COVID-19 mRNA prediction of some sort), and solve it to create a treatment (e.g., get a safe and effective vaccine). Obviously, you can say the vaccine can be improved so it is not “solved”, but most people would be quite happy with an “almost cure for cancer” even if it wasn’t literally optimal (we don’t even know if a cure for cancer is possible).

My suggestion and question to the author is to outline the implications of the work rather than focusing on accuracy statistics, which are meaningless without such context.


yeah lol no shit. let's not get bothered by reactionaries...

> We just had a vendor uplift our quote 50% per unit

Good thing they didn't increase it.


Are you talking about Minecraft? Minecraft was known for working only because it is so simple graphically compared to other games. It was said to allocate and deallocate hundreds of megabytes of memory every frame.

Minecraft still runs, and it may look graphically simple, but it's actually pretty complex (it has millions of blocks in memory at any time and has to cull which blocks not to render, etc.).

Minecraft does do some horrible things to the JVM, but it's strong and can take it.


> it may look graphically simple

Because it is graphically simple. That's not even a CPU issue.

> millions of blocks in memory at any time and has to cull which blocks not to render, etc.

128x128x128 is already 2 million voxels. Minecraft and any other game like that can use an octree or some variation to avoid dealing with blocks individually. When things are in the distance, occluded, or empty space, you cull at a coarser level of the octree.
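A minimal sketch of that coarse-level culling idea (Python, with hypothetical types; a real voxel engine adds frustum math, meshing, caching, etc.):

  # Recurse into the octree only where a node is non-empty and visible;
  # empty or hidden subtrees are rejected with a single coarse check.
  from dataclasses import dataclass, field

  @dataclass
  class Node:
      empty: bool = True
      children: list = field(default_factory=list)  # 0 or 8 child Nodes
      blocks: list = field(default_factory=list)    # leaf payload

  def collect_visible(node, is_visible, out):
      """is_visible(node) stands in for a frustum/occlusion test."""
      if node.empty or not is_visible(node):
          return                    # cull the whole subtree at once
      if not node.children:         # leaf: emit its blocks
          out.extend(node.blocks)
          return
      for child in node.children:   # interior: descend only where needed
          collect_visible(child, is_visible, out)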

Java can be fast compared to scripting languages, but I don't know why Minecraft would be an example. It is a simple game that was poorly written and had to be rewritten in C++ for other platforms. It got by on being simple while also running on full PCs.


Do the investors know that "compute layer for code" doesn't mean anything and is total nonsense?

It does mean something to me, but perhaps not as profound as whoever coined the term was hoping!

A "compute layer for code" is called a microprocessor and a lot of companies already make them.
