Hacker News | BosStartup's comments

One of the major problems is the income inequality between the lower half and the upper 10%. Tariffs in theory will bring manufacturing back to the US, but only by increasing the cost of those goods, so the average person can consume fewer goods. Hopefully well-paying jobs balance that out a bit.

One measure is the ratio of CEO compensation to that of the average employee in the same firm. That ratio was 21-to-1 in 1965; today it is 290-to-1. Imagine the average worker making 13x what they make today. The late-stage-capitalism dynamic of capital accumulating at the top is accelerating.


Then again, those same people have much more purchasing power and better lives thanks to, checks notes, capitalism.


Are you sure about that?


Well, the falsifiable part of his statement is at least true: median real wages are up in the US, and Americans are wealthier than we have been since at least 1980. Especially women; as a group our real wages are basically a straight line up. Some confounding factors there obviously, but still. As for whether you can attribute the growth to capitalism specifically, *shrugs*.


It’s a bit more complicated:

https://www.consumeraffairs.com/finance/comparing-the-costs-...

Basically some things have gotten cheaper (TVs, gas, etc.), but things like inflation-adjusted housing or college prices have increased so much that people affected by them have a very different experience. This is a constant refrain among people I know who are under ~50 or so: older relatives simply don't understand that, say, they could pay for UCLA with a part-time summer job, because prior to Saint Ronnie that meant only book and lab fees, and even once tuition arrived it was an order of magnitude lower (adjusted for inflation).

That creates enormously different beliefs, because someone who bought a house in 1982 and has been rolling equity forward ever since has no idea what the subjective experience is like for their kids, who graduated with heavy debt service and face rents 50% higher.


It's not correct that an increase in home value increases your property tax. The town has a budget, which it divides by the sum of all property values; that gives the tax rate per dollar of property value.

In theory, if your property went up 40% and everyone else's did too, then your taxes remain the same. You can see this by viewing the tax rate over time and noting that it has declined (but again, that didn't lower the dollar amount you must pay, just the ratio of tax to property value).
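A minimal sketch of the levy-based model described above. All the numbers are made up for illustration:

```python
def tax_bill(your_value, town_budget, total_value):
    """Levy model: the town's fixed budget is apportioned by value share."""
    return town_budget * your_value / total_value

# Year 1: your home is assessed at 500k out of 500M total; budget is 10M.
before = tax_bill(500_000, 10_000_000, 500_000_000)

# Year 2: every property appreciates 40%; the budget is unchanged.
after = tax_bill(700_000, 10_000_000, 700_000_000)

print(before, after)  # 10000.0 10000.0 -- same bill, lower rate
```

The implied rate falls from 2% to about 1.43%, but the dollar amount owed is identical, which is the point the comment is making.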


Property tax calculations differ across jurisdictions; you are talking as though your rules apply to every system in the world.

I believe in the poster's jurisdiction, property taxes are based on a percentage of the property value and are independent of the city's actual budget.


That value is usually calculated from the last sale price plus at most some small percentage each year. I.e., if the home 'went up in value' but didn't sell, the tax doesn't change much.


That would be the first step of rollout. The next step would be to ask you in ways that make it hard for you to say no. An LLM agent isn't going to stop companies from bad behavior/dark patterns, it's going to make them more effective at deploying dark patterns to segments of the market.

I think only competition and/or regulation can push back against the allure dark patterns hold for companies.


This has the same issues as just using an array to store the data. If you remove a node from the list, you now have to rewrite all the values in the links array. If you don't do that, then you have to implement your own garbage collector and pay that penalty periodically.


> If you remove a node in the list now you have to rewrite all the values in the links array.

You actually don't. You can keep two lists within the same structure, one for occupied nodes and one for free nodes, and just move deleted nodes to the head of the free list.

Example List:

  Data: [ a, 0, c, d, e ]
  Links: [ 2, -1, 3, 4, -1 ]

  Head of occupied nodes: 0
  Head of free nodes: 1
Link shape:

  Occupied Nodes: 0 -> 2 -> 3 -> 4 -> []
  Free Nodes: 1 -> []
To Delete C:

  data[2] = empty // free data
  links[2] = 1 // repoint node 2's link at the old free head
  links[0] = 3 // Repoint 0 -> 3, splicing node 2 out
  firstFreeNode = 2 // node 2 is the new head of the free list
This changes the data such:

  Data: [ a, 0, 0, d, e ]
  Links: [ 3, -1, 1, 4, -1 ]

  Head of occupied nodes: 0
  Head of free nodes: 2
New shape:

  Occupied Nodes: 0 -> 3 -> 4 -> []
  Free Nodes: 2 -> 1 -> []
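The scheme above can be sketched as a small class. This is a minimal illustration, not code from the parent comment; names like `FreeListArray` are made up, and a real version would also grow the arrays when the free list empties:

```python
NIL = -1  # end-of-list marker, shown as -1 in the example above

class FreeListArray:
    """Linked list stored in two parallel arrays, with an intrusive free list."""

    def __init__(self, values):
        self.data = list(values)
        # Initially every slot is occupied and links to the next slot.
        self.links = [i + 1 for i in range(len(values) - 1)] + [NIL]
        self.head = 0 if values else NIL  # head of the occupied list
        self.free = NIL                   # head of the free list

    def delete_after(self, prev):
        """Unlink the node following `prev` (the list head if prev is NIL)."""
        idx = self.head if prev == NIL else self.links[prev]
        if prev == NIL:
            self.head = self.links[idx]
        else:
            self.links[prev] = self.links[idx]   # splice the node out
        self.data[idx] = None                    # free the payload
        self.links[idx] = self.free              # push slot onto free list
        self.free = idx

    def insert_after(self, prev, value):
        """Reuse a slot from the free list; returns the slot index used."""
        idx = self.free
        assert idx != NIL, "no free slots; a real version would grow the arrays"
        self.free = self.links[idx]              # pop slot off free list
        self.data[idx] = value
        if prev == NIL:
            self.links[idx] = self.head
            self.head = idx
        else:
            self.links[idx] = self.links[prev]
            self.links[prev] = idx
        return idx

    def items(self):
        out, i = [], self.head
        while i != NIL:
            out.append(self.data[i])
            i = self.links[i]
        return out
```

Deleting twice after the head reproduces the example's final shape (free list `2 -> 1 -> []`), and the next insertion reuses slot 2 with no array rewriting:

```python
lst = FreeListArray(["a", "b", "c", "d", "e"])
lst.delete_after(0)          # drop "b" (slot 1)
lst.delete_after(0)          # drop "c" (slot 2)
print(lst.items())           # ['a', 'd', 'e']
lst.insert_after(0, "x")     # reuses slot 2
print(lst.items())           # ['a', 'x', 'd', 'e']
```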


We migrated a Rails app that needed in-memory data structures from Rails to Java. We went from needing rolling restarts and 100 AWS servers, all CPU-bound, to running at 10% CPU on 2 Java servers with no restarts needed.

Ruby and Rails are great for a PoC or a quick CRUD app, but they are not close to being performant, which usually doesn't matter but sometimes does.


It's hard to know what solved the problem here. I wouldn't assume that it's the Java bit, although it could be.


Oh got it, I should outline more of the problem we faced.

We had a high throughput server that needed a fair amount of in memory data to service the requests. The in memory data would update, one set of the data would update very quickly while the larger data set would update slowly.

Two things on this:

1) Ruby (MRI) is multi-threaded but not parallel: the global VM lock means only one thread executes Ruby code at a time, so when Ruby starts up you generally create a new process per CPU to utilize all cores. This is generally fine for a CRUD app but does not work well if you really need to share in-memory data structures in a low-latency way: any change to the data has to be repeated in each process, so a 32-core machine is 1/32 as efficient with memory.

2) Ruby, at least at the time, had a non-compacting garbage collector. Over time this leads to memory fragmentation, which causes Ruby to run out of pages and therefore RAM. This means with Ruby you have to be careful as you get close to the machine's RAM limit while allocating and freeing large spans of memory: if freeing an object doesn't free its page, you can still run out of memory. This appears as a memory leak to the machine even though Ruby appears to still be using the same amount of memory, and the only fix is rolling restarts. Java uses a compacting garbage collector, which can alleviate this issue, though of course it does come with some pauses.
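The memory cost of the process-per-core model in point 1 can be sketched as back-of-the-envelope arithmetic. The core count and dataset size here are illustrative assumptions, not figures from the post:

```python
cores = 32
shared_data_gb = 4  # in-memory dataset every worker needs access to

# Thread-based runtime (e.g. the JVM): one copy shared by all threads.
jvm_footprint_gb = shared_data_gb

# Process-per-core runtime (e.g. MRI, ignoring copy-on-write savings):
# each process holds, and re-applies every update to, its own copy.
mri_footprint_gb = cores * shared_data_gb

print(jvm_footprint_gb, mri_footprint_gb)  # 4 vs 128
```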

I also found the Ruby scheduler is not pre-emptive. This might have changed, but at the time it meant that if you had a lot of threads waiting on external IO, the Ruby machine's performance deteriorated. Java's scheduler appeared to be more performant. Looking at this post makes me realize I should turn this into a blog post with links to the actual numbers to demonstrate each issue clearly.


