Hacker News | adam_arthur's comments

There's an easy legal argument to make that discrimination in hiring, or otherwise, on the basis of race is illegal.

So I'm not surprised that companies with policies promoting specific subgroups are worried about legal consequences.

e.g. from the article

[Amazon] "The deletion of previous statements pledging to commit to “equity for Black people”"


These policies are in compliance with 60 years of US federal law. These policies are in compliance with 60 years of US federal case law.

These policies are in compliance with 60 years of executive branch policy/guidance.

The executive branch cannot change established law or case law. The executive branch cannot change executive branch policies in an 'arbitrary and capricious' manner.

These are the guardrails to stability that allow our nation to function, our stock markets to attract overseas investment, and our currency to be the world's reserve currency.

That these 'executives' would rather jeopardize all of the above for expediency/deference from a newly elected President says quite a lot. There are no guardrails when everyone abandons them proactively. And when there are no guardrails there is no stability. And when there is no stability there is no business investment.


This internet comment has a lot of conviction, but my experience differs.

A company I once worked for "asked" me and other EMs to fast-track hiring African American engineers to meet an internal diversity quota. The company didn't write this policy on paper and kept it strictly verbal for fear of legal consequences.

In any case, Affirmative Action has recently been struck down by the Supreme Court so I doubt similar racial-discriminatory policies will hold up.


Everything I wrote is true and is how our nation works, both legally and as a successful country. It is what has led to us being the number one economy on earth.

But I see that instead of facts you went with the old 'in my experience...' internet comment. Did you challenge them in court? Or did you just give up in advance? Are you saying these written policies should be given up because of secret policies used nefariously (not the ones in question)? Shouldn't it be the secret policies that are challenged? Should we get rid of Trump because he said no Project 2025, yet in the first few days implemented a crap ton of Project 2025, appearing to have 'secret policies'?

There is nothing to hold up in court if every company just gives in and leaves 60 years of norms behind in one day. If you don't see that or don't care, you are putting politics over what has made the USA great. No tyrants jerking policy at their whims. Instead, established and protected norms providing a stable country. Look at successful software frameworks and how they are maintained. Much differently than how a company operates. Frameworks (including social frameworks) need stability to work.


Instead, those companies are going to promote specific subgroups and not admit it. After all, you can always claim that it's a coincidence that the only qualified people were straight white men. It's only discrimination if you write it down.


Assuming equal skill distribution among subgroups, an unbiased hiring process would end up with employee demographics matching 1:1 with the demographic makeup of the talent pool.

What do you think is the gender split in the tech hiring pool?

What do you think is the racial split?
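To make that claim concrete, here is a quick simulation. The pool shares below are made up purely for illustration, not real statistics for any talent pool:

```python
import random

def simulate_unbiased_hiring(pool, n_hires, seed=0):
    """Draw hires at random, weighted only by each group's share of the
    pool. With equal skill distributions across groups, the hired
    demographics converge to the pool demographics as n_hires grows."""
    rng = random.Random(seed)
    groups = list(pool)
    hires = rng.choices(groups, weights=[pool[g] for g in groups], k=n_hires)
    return {g: hires.count(g) / n_hires for g in groups}

# Hypothetical pool shares, purely for illustration
pool = {"group_a": 0.70, "group_b": 0.20, "group_c": 0.10}
result = simulate_unbiased_hiring(pool, n_hires=100_000)
# Each hired share lands within a fraction of a percent of its pool share
```

The point is only statistical: an unbiased process mirrors whatever the pool actually looks like, so the interesting question is the pool's makeup, not the hiring output in isolation.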


So that is what Amazon wants to do, but the only thing stopping them before was a diversity statement?


It's what individual people want to do. The diversity statement says "no, that's not what we want as a company."

It's not sufficient. It also requires someone looking at the hires and seeing if it looks like it's being carried out. Those were the people Trump just fired from the government.


I respect the area Bun is trying to carve out, but if you follow the creator on Twitter, you'll see the decision making process is very focused on short-term wins/convenience and very light on the deeper implications of the "magic"

I was initially excited about the project, but have no faith in the long term direction given the spurious (and often poor IMO) design decisions. At least as it's publicized on socials.

It would have been fine if they kept it v0.x, but releasing 1.0 should have significantly raised the bar for increasing API surface area


Finally!

First product that directly competes on price with Macs for local inferencing of large LLMs (higher RAM). And likely outperforms them substantially.

Definitely will upgrade my home LLM server if specs bear out.


Inferencing does not require Nvidia GPUs at all, and it's almost criminal to be recommending dedicated GPUs with only 12GB of RAM.

Buy a Mac Mini or MacBook Pro with RAM maxed out.

I just bought an M4 mac mini for exactly this use case that has 64GB for ~2k. You can get 128GB on the MBP for ~5k. These will run much larger (and more useful) models.

EDIT: Since the request was for < $1600, you can still get a 32GB mac mini for $1200 or 24GB for $800


> it's almost criminal to be recommending dedicated GPUs with only 12GB of RAM.

If you already own a PC, it makes a hell of a lot more sense to spend $900 on a 3090 than it does to spec out a Mac Mini with 24GB of RAM. Plus, the Nvidia setup can scale to as many GPUs as you own, which gives you options for upgrading that Apple wouldn't be caught dead offering.

Oh, and native Linux support that doesn't suck balls is a plus. I haven't benchmarked a Mac since the M2 generation, but the figures I can find put the M4 Max's compute somewhere near the desktop 3060 Ti: https://browser.geekbench.com/opencl-benchmarks


A Mac Mini with 24GB is ~$800 at the cheapest configuration. I can respect wanting to do a single part upgrade, but if you're using these LLMs for serious work, the price/perf for inferencing is far in favor of using Macs at the moment.

You can easily use the MacMini as a hub for running the LLM while you do work on your main computer (and it won't eat up your system resources or turn your primary computer into a heater)

I hope that more non-mac PCs come out optimized for high RAM SoC, I'm personally not a huge Apple fan but use them begrudgingly.

Also your $900 quote is a used/refurbished GPU. I've had plenty of GPUs burn out on me in the old days, not sure how it is nowadays, but that's a lot to pay for a used part IMO


If you're doing serious work, performance is more important than getting a good price/perf ratio, and a pair of 3090s is gonna be faster. It depends on your budget, though, as that configuration is a bit more expensive.


Whether performance or cost is more important depends on your use case. Some tasks that an LLM can do very well may not need to be done often, or even particularly quickly (as in my case).

e.g. LLM as one step of an ETL-style pipeline

Latency of the response really only matters if that response is user facing and is being actively awaited by the user
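A minimal sketch of what "LLM as an ETL step" can look like. Everything here is illustrative: `call_llm` is a placeholder for whatever local inference backend you run (the stub below stands in for it), and the prompt/field names are made up:

```python
import json

def extract_step(call_llm, raw_documents):
    """One transform step of an ETL pipeline: each loosely structured
    document becomes structured JSON. Nothing here is user-facing, so
    per-document latency is irrelevant; only total throughput matters."""
    out = []
    for doc in raw_documents:
        prompt = (
            'Return JSON with keys "vendor" and "total" extracted from:\n'
            + doc
        )
        out.append(json.loads(call_llm(prompt)))
    return out

# Stub standing in for a locally hosted model (e.g. served from a Mac mini)
fake_llm = lambda prompt: '{"vendor": "Acme", "total": 42.0}'
rows = extract_step(fake_llm, ["Invoice from Acme Corp, total due $42.00"])
```

Because no user is waiting on any individual response, a slower but higher-memory machine can be the better fit for this shape of workload.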


> M4 Max's compute somewhere near the desktop 3060 Ti

The only advantage is the M4 Max's ability to have way more VRAM than a 3060 Ti. You won't find many M4 Maxes with just 8 or 16 GB of RAM, and I don't think you can do much except use really small models with a 3060 Ti.


It's a bit of a moot point when CUDA will run 4 3060 Tis in parallel, with further options for paging out to system memory. Since most models (particularly bigger/MoE ones) are sparsely activated, you can get quite a lot of mileage out of multiple PCIe slots fed with enough bandwidth.

There's no doubt in my mind that the PC is the better performer if raw power is your concern. It's far-and-away the better value if you don't need to buy new hardware and only need a GPU. $2,000 of Nvidia GPUs will buy you halfway to an enterprise cluster, $2,000 of Apple hardware will get you a laptop chip with HBM.


You need a lot of space for that, cooling, and a good fuse that won't trip when you turn it on. I would totally just pay the money for an M4 Ultra Mac Studio with 128 GB of RAM (or an M4 Max with 64 GB). It is a much cleaner setup, especially if you aren't interested in image generation (which the Macs are not good at yet).

If I could spend $4k on a non-Apple turn key solution that I could reasonably manage in my house, I would totally consider it.


Well, that's your call. If you're the sort of person that's willing to spend $2,000 on a M4 Ultra (which doesn't quite exist yet but we can pretend it does), then I honest to god do not understand why you'd refuse to spend that same money on a Jetson Orin with the same amount of memory in a smaller footprint with better performance and lower power consumption.

Unless you're specifically speccing out a computer for mobile use, the price premium you spend on a Mac isn't for better software or faster hardware. If you can tolerate Linux or Windows, I don't see why you'd even consider Mac hardware for your desktop. In the OP's position, suggesting Apple hardware literally makes no sense. They're not asking for the best hardware that runs MacOS, they're asking for the best hardware for AI.

> If I could spend $4k on a non-Apple turn key solution that I could reasonably manage in my house, I would totally consider it.

You can't pay Apple $4k for a turnkey solution, either. MacOS is borderline useless for headless inference; Vulkan compute and OpenCL are both MIA, package managers break on regular system updates and don't support rollback, LTS support barely exists, most coreutils are outdated and unmaintained, Asahi features things that MacOS doesn't support and vice-versa... you can't fool me into thinking that's a "turn key solution" any day of the week. If your car requires you to pick a package manager after you turn the engine over, then I really feel sorry for you. The state of MacOS for AI inference is truly no better than what Microsoft did with DirectML. By some accounts it's quite a bit worse.


M4 Ultra with enough RAM will cost more than $2000. An M2 Ultra Mac Studio with 64GB is $3999, and you probably want more RAM than that to run the bigger models that the Ultra can handle (it is basically 2X as powerful as the Max, with more memory bandwidth). An M2 Max with 64GB of RAM, which is more reasonable, will run you $2,499. I have no idea if those prices will hold when the M4 Mac Studios finally come out (an M4 Max MBP with 64 GB of RAM starts at $3,900 ATM).

> You can't pay Apple $4k for a turnkey solution, either.

I've seen/read plenty of success stories of Metal ports of models being used via LM Studio without much configuration/setup/hardware scavenging, so we can just disagree there.


>You need a lot of space for that, cooling, and a good fuse

Or live in Europe, where any wall socket can give you closer to 3 kW. For crazier setups like charging your EV you can have three-phase plugs with ~22 kW to play with. 1 m² of floor space isn't that substantial either, unless you already live in a closet in the middle of the most crowded city.


3-phase 240 V at 16 A is just about 11 kW. You're not going to find anything above that residential unless it was purpose-built.

That's still a lot of power, though, and does not invalidate your point.
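For reference, the arithmetic behind those figures. I'm assuming the common European 400 V line-to-line / 230 V phase standard, which is an assumption on my part rather than anything stated above:

```python
import math

def three_phase_power_w(v_line_to_line, amps, power_factor=1.0):
    """Balanced three-phase power: P = sqrt(3) * V_LL * I * pf."""
    return math.sqrt(3) * v_line_to_line * amps * power_factor

# European residential: 400 V line-to-line (230 V per phase), 16 A breaker
p_three_phase = three_phase_power_w(400, 16)  # ~11.1 kW
p_single_socket = 230 * 16                    # ~3.7 kW from one wall socket
```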


Reasonable? $7,000 for a laptop is pretty up there.

[Edit: OK I see I am adding cost when checking due to choosing a larger SSD drive, so $5,000 is more of a fair bottom price, with 1TB of storage.]

Responding specifically to this very specific claim: "Can get 128GB of ram for a reasonable price."

I'm open to your explanation of how this is reasonable — I mean, you didn't say cheap, to be fair. Maybe 128GB of ram on GPUs would be way more (that's like 6 x 4090s), is what you're saying.

For anyone who wants to reply with other amounts of memory, that's not what I'm talking about here.

But on another point, do you think the ram really buys you the equivalent of GPU memory? Is Apple's melding of CPU/GPU really that good?

I'm not just coming from a point of skepticism, I'm actually kind of hoping to be convinced you're right, so wanting to hear the argument in more detail.


It's reasonable in a "working professional who gets substantial value from" or "building an LLM driven startup project" kind of way.

It's not for the casual user, but for somebody who derives significant value from running it locally.

Personally I use the MacMini as a hub for a project I'm working on as it gives me full control and is simply much cheaper operationally. A one time ~$2000 cost isn't so bad for replacing tasks that a human would have to do. e.g. In my case I'm parsing loosely organized financial documents where structured data isn't available.

I suspect the hardware costs will continue to decline rapidly as they have in the past though, so that $5k for 128GB will likely be $5k for 256GB in a year or two, and so on.

We're almost at the inflection point where really powerful models are able to be inferenced locally for cheap


For a coding setup, should I go with a Mac Mini M4 pro with 64GB of RAM? Or is it better to go with a M4 max (only available for the MBP right now, maybe in the Studio in a few months)? I'm not really interested in the 4090/3090 approach, but it is hard to make a decision on Apple hardware ATM.

I don't see prices falling much in the near term, a Mac Studio M2 Max or Ultra has been keeping its value surprisingly well as of late (mainly because of AI?). Just like 3090s/4090s are holding their value really well also.


It's reasonable when the alternative is 2-4x4090 at $2.2K each (or 2xA6000 at 4.5K each) + server grade hardware to host them. Realistically, the vast majority of people should just buy a subscription or API access if they need to run grotesquely large models. While large LLMs (up to about 200B params) work on an MBP, they aren't super fast, and you do have to be plugged in - they chew through your battery like it's nothing. I know this because I have a 128GB M3 MBP.


How large of a model can you use with your 128GB M3? Anything you can tell would be great to hear. Number of parameters, quantization, which model, etc.


I'm running 123B parameter Mistral Large with no issues. Larger models will run, too, but slowly. I wish Ollama had support for speculative decoding.


Thanks for the reply. Is that quantized? And what's the bit size of the floating point values in that model (apologies if I'm not asking the question correctly).


OP here, I almost got a decked out Mac Studio before I returned it for an Asus ROG, as the native Linux support, upgradability & CUDA support are much more important to me.

Meagre VRAM in these Nvidia consumer GPUs is indeed painful, but with the increasing performance of smaller and fine-tuned LLMs, I don't think 12GB/14GB/16GB Nvidia GPUs offering much better performance than a Mac can be easily dismissed.


How about heat dissipation, where I assume an MBP is at a disadvantage compared to a PC?


A MacBook Pro has lower peak thermal output but proportionally lower performance. For a given task you’d be dissipating somewhat similar heat, the MacBook Pro would just be spreading it out over a longer period of time.

Nvidia GPUs actually have similar efficiency, despite all of Apple's marketing. The difference is that the Nvidia GPUs have a much higher ceiling.


I think the future of pesticides and weed control in food production will be robots manually removing pests. No chemicals required.

The technology we have today isn't too far off from this... weeds are certainly easier than bugs, but both should be feasible.


The weeds will adapt to that, too, over time. Some of the weeds you get in pavement cracks have strong roots, and the top part breaks off easily to be quickly regenerated.


People should consider that campaigning, platforms and candidates would be entirely different in a world run by popular vote.

It may or may not ultimately be better, but it certainly wouldn't be a simple one variable change from the current political climate to the benefit of one party today.

I'm largely in favor of popular vote over the EC, but it would not usher in a huge change to the status quo, more likely a slightly different flavor of what we have today.

The stances of both parties have drastically changed over time under the EC model, and would continue to do so under a popular vote model.


This is correct and Trump said that himself in 2016. According to him, he would spend a lot more time and money campaigning in California and New York instead of battleground states if the system was different.

And we would then see the very issue the electoral college tries to fix: small states wouldn't matter. It would be a lot more efficient in terms of vote acquisition to put in place policies benefitting Los Angeles than Green Bay, WI. Los Angeles is already extremely powerful as it is; the electoral college shifts some power away from huge cities.


Small states already don’t matter, with EC only “battleground” states matter; which feels way way worse.

We also already have weird dynamics when it comes to policy. More people are likely employed by Wendy’s than all the coal workers, but something tells me that coal workers are given way more policy proposals to help them than fast food workers.


Any state with a near 50/50 vote can become a "battleground" state. Only states with one party autocratic rule lose out.


Not if you go by popular vote, not necessarily at least. There are millions of voters for the other political parties in say California, Texas, New York, Florida; none of these voters will get their voice heard because of how they typically vote (CA going blue, Texas going red). Popular vote ensures that all votes matter, not just whatever happens to be a battle ground state which change every 10 years or so.

BTW, saying that one party having a trifecta of governance is autocratic is fucking moronic and not accurate. Words have meaning, let's not diminish them because you're upset people don't vote how you want.


Do you truly believe it would be preferable, if one party consistently wins 51% of the vote, and the other wins 49%, to have one party permanently in power?

That (51% of votes translating into 100% of power) is certainly not fair either.

One thing the current system does surprisingly well is ensure that power remains divided over the long term.


Only if you define power as the length of a single human lifespan and not something that continues on when you exit our plane of existence.

The US has gone through dozens of major political and regional political parties over the last 200 years. Acting like this is ossified is just demonstrably false.

It sounds like you are more upset about the first-past-the-post voting system and that I agree with you.


First past the post creates the two-party system. Multi-party parliamentary systems give far more power to the party leader, to the extent that all tweets by Liberal Party members in Canada have to be approved by the leader's office. So instead of an MP who represents a constituency, it ends up with only ~5 leaders who are allowed independent opinions.


>Words have meaning, lets not diminish them because you're upset people don't vote how you want.

I don't know how you twisted my words in your mind to insinuate I just want 'them' to vote for 'my' team.

Governments that are dominated by one party, whether it be North Korea, Russia, or California, tend to ossify and are controlled by party apparatchiks instead of democratically elected politicians. When there are multiple parties in close competition, all parties are forced to be more responsive to the electorate.


We're talking about the US here, please tell me which states are autocratic because at a cursory glance that is a brain dead response.

Is Florida an autocratic state because the GOP control all three branches of government? Is Massachusetts an autocratic state because the Democratic party control all three branches of government? What about California, add in their jungle primaries as something that makes them extremely competitive when it comes to running elections, are they too autocratic because the Democratic politicians control all three branches?

Sorry but words have meanings, just because you want to misuse them doesn't mean we have to accept it. You conflating autocracy with competitive elections is just foolish.


I'm not. In China political struggle is hidden from the public and happens behind the scenes. Same thing happened in the Soviet Union. This made me interested in how autocratic political systems operate but there's not much scholarship on this. I found studying one party systems in America to be helpful in understanding the autocratic political system.

In a one party rule state the senatorial/gubernatorial race is not competitive and always won by the one party in power. To appear on the ballot as a representative of that party requires the party endorsement. In effect the party endorsement is the election. However the endorsement process is controlled by the party and not open to the public.

Endorsement selection is based primarily on how much money the prospective candidate can get their friends to donate. (From this money some percent is kicked back to the party and the rest is spent on services provided by party apparatchiks. It is a closed ecosystem). The other requirement is canonical adherence, though this is more required for national politics.

How the parties work behind closed doors I'm less familiar with because I'm not an insider, in the Chinese Communist Party case or the American case.


Only if we're stuck in the 18th century technology and a presidential candidate has to physically visit each small town to talk to its people. In that case, yes, small states wouldn't matter because no candidate will have time to visit Omaha or Boise.

But thankfully we're in the 21st century: a candidate can mail flyers and buy youtube ads with policies tailored for those perennial "red" states. Under the current system it makes zero difference, so nobody bothers, and they stay forgotten. Under popular vote, every citizen who votes for you in Omaha, NE is as good as another citizen in Philadelphia, PA.


It would absolutely be better. Think of how many Trump voters are disenfranchised in California. Think of how many Harris voters are disenfranchised in Alabama. The president represents everyone, and yet, seven random states will actually move the needle in the contest this year. This is only controversial because it benefits one party that would otherwise consistently lose in an election that uses sane math where "greater than" hasn't been redefined to let the loser win, so now everyone who's into that is going to come out of the woodwork to explain how a minority-rule system from back when we had slavery is actually super great, guys.


Alabama won't see another candidate if the US went by popular vote ...


Who cares about "Alabama"? Or "Texas"? Or "California"? The point being made is that the presidency shouldn't be weighted by states. It should be a selection by the people.

To your assumption that Alabama will always be lopsided conservative while the country will always be lopsided Republican: Why should the Democrats in Alabama never have their voice heard?

I support electoral reform for the legislature... but the EC is trash.


You would expect market cap of the largest players to decline if the market becomes more competitive... which was China's primary goal with their recent crackdown.

They proactively forced interconnectivity and limited the ability for companies to make "walled-garden pseudo-monopolies", as we have in the US with Apple and Google.

If the same happens here (through act of congress, or legal outcomes), you can expect their market caps to decline as well. A decline in market cap doesn't speak at all to whether it's beneficial to the industry or consumers


> They proactively forced interconnectivity and limited the ability for companies to make "walled-garden pseudo-monopolies", as we have in the US with Apple and Google.

WeChat is the inspiration for the idea of the "everything app" that so many US companies want to create but have always failed to. Has it somehow been newly limited in its ability to control an absurd percentage of all Chinese internet-connected activity?


This was and is part of their goal, yes.

https://asia.nikkei.com/Business/Technology/Beijing-asks-Ten...

"Beijing asks Tencent to lower WeChat's mobile payment market share"

https://www.forbes.com/sites/zennonkapron/2022/11/09/chinas-...

https://www.business-standard.com/world-news/china-may-impos...

"China is planning to introduce a new mobile payment regulation aimed at reducing the market share of Tencent Holdings' WeChat app, similar to efforts made by the National Payments Corporation of India to curb Google Pay and Phone's growing dominance in the market"

China's methods are more authoritarian than are viable in the west. But the general premise of a competitive market being better for society than dominance by a small number of firms is supported well by history


Competition usually leads to higher wealth creation and GDP, not less.

If you use real life examples and history as a benchmark


There's been a lot of comparisons between the gilded age and modern us economic structure with respect to individual wealth.

I contend the even more disturbing mirror is the trust/monopoly/cartel structure of virtually all industries and markets in the United States.

The ultra rich are a symptom, not necessarily the cause: we need a massive breakup of practically every sector in the United States, and it's not just for what the parent of this comment says about wealth creation: increased employment, more job mobility, more innovation, overall competitive advantage in the world, more resilience to global supply chain disruption, and innumerable other national security and economic concerns.

Monopolies are really bad for freedom. As we see with the closed no-appeal ban systems of internet companies and utter lack of customer service, your very day to day freedom can sharply be curtailed at a whim by the centralized power of monopolies.


I switched to Ubuntu recently on a Thinkpad Z16 and had some compatibility and battery life issues.

I think Linux is great for more mainstream models that have already been out for a few years, but unfortunately I can't recommend if you're buying a new or more niche machine.

It's really a shame, as I absolutely don't like the direction either Microsoft or Apple is going. I keep ending up back at MBP due to overall polish/hardware quality, but I prefer linux to MacOS. I hope Asahi Linux can get fully up-to-date with latest models and resolve the various QoL issues


This is where I'm at. I generally prefer Linux but I don't necessarily hate macOS, I've gotten by with a few annoyances.

It's really the hardware holding me back, still. I'm also watching Asahi closely. Any other laptop I've tried makes a compromise somewhere that I don't want: bad screen, trackpad sucks, hit-and-miss keyboards. Plus heat and fan noise.

I just want a MacBook Air, but Linux. The new snapdragon surface laptop 7 is close (except the keyboard) but it doesn't run Linux and I've given up on Windows a long time ago.


Cannot speak for all Thinkpads, in my case it was the Thinkpad P16s Gen 2 AMD (rolls right off the tongue). It was pretty painless to get everything set up for me.


Then buy laptops made with Linux in mind, like Slimbook or Tuxedo.

Slimbook Execute, Slimbook Excalibur, Tuxedo Infinity and Tuxedo Pulse are amazing machines.


They're still held back by the chips.

Raptor Lake gets close to (and beats) the M3 in performance in some areas, but not in heat or fan noise.

No doubt they are all great machines, but after using an air for the last few years I can't use anything else that's not as cool or as quiet.

I'm holding out for Asahi Linux, or some better (and Linux supported) snapdragon elite offerings. The new surface laptop 7 is pretty close hardware wise (except the keyboard) but it can't run Linux, sadly.


Raptor Lake is pretty ancient, though. Wasn't it basically the same as Alder Lake?

Lunar Lake is supposedly pretty close to the Snapdragon. A bit slower but no need to bother with ARM and a much better GPU (if that matters).


I think Slimbook is releasing a Snapdragon laptop next year. I'm not a big fan of ARM in the desktop but I'm genuinely curious about it.


They aren't direct competitors.

You have two walled gardens and two monopoly-esque distribution platforms within those walls.

Nobody with an iPhone can use Google Play, and nobody with an Android can use the App Store.

Which is why disallowing, or hindering, competing app stores within one walled garden is clearly anti-competitive.

It's not reasonable to expect consumers in one ecosystem to completely leave the ecosystem for one specific app, just like it's not reasonable to expect a homeowner to sell their house and move somewhere just so they can pay a lower utility bill.

Is the utility company serving your house a competitor with the utility company across the street if I have to move houses to switch between them?

Yes, if you look at the market as a whole. Clearly not if you use a reasonable interpretation and consider costs of switching.

If Apple and Google are truly providing unique value to developers and consumers, then they have nothing to fear from alternative app stores. Their profits won't be affected.

