Hacker News | maximamas's comments

Yeah, I try to keep it from overlapping its own work as much as possible. Using plan mode in Claude, or just telling Codex to build a plan that is structured in a parallelized way for multiple agents, usually helps delegate tasks to be handled at the same time. Typically app code, infra, and the data layer are the main three, but it obviously depends on the project.

If I ever find myself just waiting, it gives me an opportunity to respond to messages and emails or update tickets. Won't be long now until the agents are doing that as well...


Checks out, that’s definitely a CTO’s perspective.


Banning phones is a waste of time; the children will find a way. We need to embrace phones, teach responsible use, and hold content providers accountable. A war on phones will be as effective as the war on drugs.


Funny you should say that, when the article specifically mentions how people said the same thing about "attempting" to ban tobacco cigarettes in schools last century, and how-- lo and behold-- while cigarettes do still exist, they're now considered socially unacceptable for students to use by all involved parties, and the percentage of students using them is negligible.

I don't necessarily agree with the article's assertion that cigarettes and cell phones are the same kind of problem in need of the same kind of solution (since phones have legitimate use cases outside of the problematic ones, and cigarettes don't). Just wanted to point out that the article specifically called out your argument, and you seem to draw a different conclusion about reality (war on drugs failed) than the article (students no longer smoke).


> while cigarettes do still exist, they're now considered socially unacceptable for students to use by all involved parties, and the percentage of students using them is negligible.

Cigarettes have been almost entirely replaced by vapor products, which are arguably more popular than cigarettes have been in decades.

The bans weren’t effective; by the time they were in place, a new generation of products had already arisen and replaced the targets of those bans. Meanwhile, vapes are banned as well, but are ubiquitous.


Which was the argument against prohibiting smoking at school:

> John Tolton, then chairman of the Metro Toronto School Board, was having none of it. Tolton doubted whether schools even had the right to ban smoking, and if they did, “it will merely drive smoking underground.” (He was also unconvinced of the dangers of second-hand smoke.) There were other concerns, including the belief that bans would drive smokers off school property, inciting conflicts with neighbours and exacerbating absenteeism. Some parents gave their kids permission to smoke, so (the thinking went) better to offer them a regulated space in which to do so.

Why won't the war on phones be as effective as the war on tobacco?


> Why won't the war on phones be as effective as the war on tobacco?

I also just pointed the same thing out to the parent commenter, but to play devil's advocate: it won't be as effective because phones have a variety of legitimate use cases (communicating with family & coworkers being among them) that mean people will end up wanting/needing to use them later in life anyway, and will potentially want or need to use them (or learn how to use them) while they're still students. Tobacco/nicotine drugs serve no functional or productive purpose the way phones do.


That does not seem to be devil's advocacy but rather changing maximamas' position to one which is more defensible.

The argument is "the children will find a way", which was exactly the argument with smoking.

In practice, history tells us that fewer children found a way to get tobacco and smoke on school grounds.

Why should we think that a school ban would have no effect on smartphone use on school grounds?


I didn't say a ban would "have no effect." I said "it won't be as effective" (in response to a quote from you asking why it wouldn't be "as effective"). I gave my reasoning.

It was "devil's advocate" because I was agreeing with you but presenting an argument to the contrary, anyway.


Which is why I said you changed the topic to something more defensible than the original comment. That isn't "devil's advocacy" but arguing for the sake of arguing.

We see alcohol used for business and social purposes, including Friday beer and pizza paid for by the company, and conferences with beer and wine during receptions.

If we accept the premise that "legitimate use cases" that "people will end up wanting .. later in life" justifies it being taught or allowed at school then when will we teach high schoolers to drink alcohol?


> Which is why I said you changed the topic to something more defensible than the original comment.

Oh, I'm sorry, I thought this was a conversation. I didn't realize nobody was allowed to rebut you since you were just replying to one specific comment. /s

> If we accept the premise that "legitimate use cases" that "people will end up wanting .. later in life" justifies it being taught or allowed at school then when will we teach high schoolers to drink alcohol?

What are you even trying to say here? Drinking alcohol isn't a skill people need for anything productive. I'm almost 26 and I've never had an alcoholic drink.

> We see alcohol is used for business and social purposes,

Information technology is required for certain business processes to increase productivity. Alcohol is not. Your argument is either bad-faith or just poorly formed.

Drinking alcohol at work would more appropriately be compared to using social media at school (a social vice that doesn't help you practically), not to using phones in general.


There are certainly jobs in the alcohol industry which require drinking alcohol.

Try getting a job as a wine critic without ever drinking.

Some sales reps get an entertainment budget for, among other things, meeting a potential client over drinks. A teetotaler in the same job may have a harder time being productive.

People do actually drink alcohol while on the job, in situations condoned by their employer, and where they are being paid. I gave examples, and can think of others, like a company holiday party.

That is very unlike using social media at school during class.

Just like how you have not ingested any alcohol, I don't have a smartphone and don't see how having it will increase my business productivity. I actually think it will make my productivity worse as I already have to force myself to work where there is no ready internet connectivity, otherwise I am too easily distracted by checking to see what's new, rather than working.


Exactly. A “dumb” phone seems to be the answer.


If we're talking about the professional angle, businesses certainly use apps and the internet for productivity. There are database apps, point of sale apps, group communication software like Slack or Teams, etc.

I think you're better off attempting to define what you want to block (e.g. social media, academically dishonest websites, and maybe games) than whitelisting only calling and texting as if the smartphone cat's not out of the bag yet.


This is about banning phones during class or school hours. Even though the war on drugs may have failed the general populace, the prohibition of drugs at schools has been largely successful in keeping drug use out of classrooms and off school property. (The same goes for smoking bans at schools.)

This is the same.


Making things illegal and not accepted works decently well. There wasn't a drug vending machine at my high school, even though a few rogue students were probably buying and selling drugs. Thus I was less likely to do drugs.

Very few handguns are in students' backpacks, because we don't accept it. If we didn't accept cell phones, we'd have a lot fewer kids suffering from them.


Tough crowd in these comments. I think this is a beautiful project, and its applications to AI are very interesting.


Finetuning should never be the first step; it's slow, expensive, and unpredictable. Until you are maxing out that context window, you can just keep layering more information into the prompt.
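To illustrate, here is a minimal sketch of the "layer information into the prompt" approach, as opposed to finetuning. All function names, the character budget, and the 4-chars-per-token heuristic are illustrative, not any particular library's API:

```python
# Sketch of context-stuffing instead of fine-tuning: greedily pack
# reference snippets into a fixed character budget (a rough stand-in
# for the model's context window), then prepend them to the question.

def build_prompt(question: str, snippets: list[str], max_chars: int = 12000) -> str:
    """Pack snippets into the prompt until the budget is spent."""
    context_parts = []
    used = len(question)
    for snippet in snippets:
        if used + len(snippet) > max_chars:
            break  # context window is "maxed out"; stop layering
        context_parts.append(snippet)
        used += len(snippet)
    context = "\n---\n".join(context_parts)
    return f"Use the context below to answer.\n\n{context}\n\nQuestion: {question}"

prompt = build_prompt(
    "What is our refund policy?",
    ["Refunds are issued within 30 days.", "Shipping is free over $50."],
)
```

Once snippets regularly get dropped by the budget check, that's the point where retrieval (picking which snippets to include) or finetuning starts to earn its cost.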


Pinecone makes it super easy to get up and running with RAG asap. Those prices are ridiculous though and any project with legitimate scale will move on to a more affordable solution.


Hey, I'm from Pinecone. What scale are we talking about? Many of our customers come to us with 500M–10B embeddings precisely because other managed solutions either ground to a halt at that scale or cost even more.

Even so, driving the cost down for large workloads like that is a priority for us. We recognize the GenAI / RAG stack is a completely new line item in most companies' budgets so anything to keep that low can help these projects move forward.


I'm no die hard apple fan, but you would think I was if you heard me talking about my 16" m1 pro. It's an absolute beast, battery for days, and I've never heard the fan spin up once. It would take a lot for me to even give another machine a chance.


There's something about a machine being very well constructed, with high attention to detail and finishes. Growing up, my parents had a new Subaru and a much older Mercedes station wagon. As a teen driving both, you could feel the difference in finish and overall solidness of the Mercedes; it felt like driving an adequately powered slab of marble, whereas the much newer Subaru felt, well, plastic and fragile in comparison.


This was my experience moving from Subaru to Volvo as well. That “solidness” feeling is probably the best way to describe the difference between economy and luxury cars generally.


Oh, now you’re speaking my language. That’s exactly why I drive old Mercedes station wagons: everything else feels like it’s barely holding together.


The chipping sharp edges of the case of my work-issued MacBook aren't something I would consider good construction.


Good construction relative to what is available. I will take a MacBook with a sharp edge over a Dell laptop where the screen frame is falling apart, the hinge is nonexistent, and the trackpad is useless. If you go looking for faults in MacBooks, you'll find them, just not as many as in the alternatives.


If you pay Dell or Lenovo what you pay for a MacBook, you get decent hardware too.


I’ve been very happy with the 16” M1 Pros I’ve done work on. It’s probably the first laptop I’ve used where the load threshold at which its fans make noticeable noise feels somewhat appropriate (rather than spinning up for little to no reason), its power level feels more desktop-class than laptop-class, and I don’t have to keep my eye glued to the battery meter even when running heavy IDEs.

I’m even kinda happy about the notch, because it prompted Apple to add a strip of extra pixels for the menubar to live in, leaving the remaining 16:10 area fully open for use by apps.

The only downside is its weight, but given all of its other upsides I can live with that.


Funny how an IDE is considered heavyweight these days, rather than a VFX package like Blender.


I've been rocking this same device for a while now and it's revived my apple fanboyness just a little. The hardware itself gets an A+ from me.

What I really want from Apple at this point is better UX on MacOS. Stage Manager is an interesting idea but, to me, it's not really a fix for any of my problems so I've just disabled it. I've used two 4k external monitors for years on MacOS and the same little annoying bugs plague me. Specifically, I think how MacOS handles full-screen apps is just not quite right. I don't understand why things feel clunky in just this area of the experience. We need what happened in iOS a few years ago when they got rid of the home button and were forced to make opening/closing/switching between apps much more fluid. I need MacOS to feel fluid like that. Then, it'd really be "perfect" for me.


Not affiliated, just a happy customer: https://ubarapp.com/


I have the same complaint about UX. I sorely miss tiling window managers. I miss configurability of window management.

I have baked into my muscle memory the expectation that when I hit the keyboard shortcut to summon virtual desktop number 5, that desktop will show up on the monitor that currently has focus, no matter which monitor(s) it may have appeared on before. This setup is impossible in Mission Control or whatever the multiple-desktop thing MacOS is called. I can choose between:

"Displays have separate spaces" checked: left monitor has desktops 1,2,3, right monitor has desktop 4,5,6, and if I add a new one to left its number is 7. Want to put desktop #4 on the left? You can't except by dragging all the windows one by one, like a cave-man. What happens to the numbering when you add or remove another monitor? It's weird.

"Displays have separate spaces" unchecked: now I have numbered Left+RightMonitorMonstrosity desktops, but if I want to switch the left monitor between "documentation" and "email", while leaving the right monitor on code, I'm out of luck. This setup behaves a bit better about adding and removing monitors, I will admit.

My old Xmonad setup with numbered desktops (which I cloned from my ion3 setup) behaved beautifully when adding and removing monitors. This is to say nothing of having had a Mod4 key solely for my own use, which I ended up using almost exclusively to interact with the window manager.

I can't wrap my head around a film strip of horizontally situated desktops that I swipe through or page through. I can't fathom the idea of making "full screen" change an app from looking like a window to looking like a desktop, and, what's more, appending the new desktop to the end of the list. I know that MacOS already knows what the windows on that other desktop are going to look like before I switch, so why does it insist on showing some kind of animation when switching (even "reduce motion" only changes it from a wipe to a useless fade), like iOS does to hide load time?

I know about Amethyst and Rectangle and setting up a "hyper" key with karabiner-elements or qmk or whatever. No amount of it adds up to the same experience that I had with ion3 back in 2006, and I get worked up that I paid into the ecosystem and bought this otherwise-great laptop and I can't make it work the way I want it to.


I have "Displays have separate spaces" checked, and I can drag a desktop from one display to the other by grabbing the desktop from that "film strip" and dragging it over to the other display. I don't have to drag all the windows individually. They move together with desktop.

The limitation, which you might be bumping into, is you can't drag the current desktop that's visible on a display, which is sometimes annoying but makes sense. Switch to a different desktop first.

I agree with you about the confusing ever-changing number labels on the desktops. I would really like to assign names to desktops, like "Work" and "Project 1". The GUI has room for it, as the full-screen app desktops already have names.


Have you tried Magnet or Amethyst?


I haven’t! I love that there are 3rd party solutions for these things, but the base experience should be better.


It's less difficult than I thought to get an M1 Max chip hot enough to spin up the fans. Run Cities: Skylines on a 4K display with all of the graphics maxed out for a few hours. ;)

Or do 8 parallel runs of transforming and merging a massive pile of JPGs into a less massive pile of PDFs. That just about fully pegged all of the cores for hours.

What surprised me was how fast everything still was. Without the fan, I wouldn't have known the load the system was under.


This is a tangent but, I cannot for the life of me get Skylines to run without the cursor offset by approximately the height of the notch, even on an external display. Have you not had this problem?


Out of curiosity, why would you want to combine jpegs into PDFs? And why did that cause such a load? It wasn't just embedding them into PDFs, but somehow recompressing them too?


Documents scans. Lots of them. They all needed to get processed so the pages would be the same size.

I don't know what imagemagick does when you ask it to merge jpgs into a pdf. :)
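For what it's worth, a typical invocation for that workflow looks something like this (a sketch; the filenames and dimensions are hypothetical):

```shell
# Illustrative sketch: normalize a pile of scanned JPEGs to roughly
# A4 at 300 dpi and merge them into one PDF. ImageMagick decodes,
# resizes, and re-encodes every page rather than just embedding the
# JPEG data, which is why a large batch fully pegs the CPU.
magick scan-*.jpg -resize 2480x3508 -units PixelsPerInch -density 300 merged.pdf
```

So the load makes sense: any geometry change forces a full decode/re-encode per page.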


Ah, that makes sense


Ya, those workloads would do it. My M1 Pro 16" really made my intel mbp feel sluggish, especially if I'm running any containers. Not quite enough to replace it yet, but sometime in the next year if I can do something useful with it.


Personally, for me, the best laptop ever made is:

14 inch M2 Macbook Pro.

Combine it with the Anker 737 Power Bank, and it's a match made in heaven.


Yep, this is what I'm running. Honestly shocked how good it is. Trackpad and monitor are great (the things they have never screwed up, to their credit), keyboard is back to being great, magsafe charger is back, no dumb touchbar gimmick, all usb-c, headphone jack, and I honestly think it seems faster - even on battery - than my beefy workstation I used to have at my office (just with less ram). The performance especially just feels like magic, for a laptop.


I went with the 32GB RAM upgrade ($400), mostly because I work with Docker (it seems to eat up about 33% of the RAM).

Never heard the fan. Stays cool to the touch, which is quite unexpected and pleasant.


The thing that's holding me back from getting one is the memory markups. The base configuration is too low and I can't just change the memory myself because everything is soldered.


I used to give Apple the ol' eye roll for that as well. Then I realised, as I got a MacBook myself and dove into running machine-learning models on it, that the RAM setup is pretty unique.

Essentially, the RAM is so close to the CPU and GPU that it can effectively be used as VRAM, at least for the M1 and up chipsets as far as I'm aware. That means a 32GB RAM MacBook would be able to run incredibly large (e.g. LLM) networks on-device. Nvidia GPUs with that much VRAM (although they are clearly better at GPU tasks) can cost as much as an expensive MacBook already.


Yeah, the memory is overpriced. So are eyeglass frames (rimless are especially overpriced). But as long as you can afford it, a few hundred dollars isn't that much spread over a few years. Or, think about the other option. You can get a not-MBP and have a clunky experience [1] but save some money, or spend the extra few hundred for a MBP with enough memory to be a great experience. (Assuming you like macOS, of course.) I interact with my MBP all day, every day, and it's totally worth a few hundred dollars to get something I love using.

[1] In addition to the non-MBP hardware being clunky, your choice of OS is cutting your steak with a spoon (Windows), or a huge drawer full of tangs, handles, prongs, and spoon-bowl and you spend your time digging around to assemble a knife, fork, and spoon that are the same style and finally give up and settle for "well, it matches if you squint" (Linux). I love the idea of Linux, but I've never actually gotten my system to where I like it, just to where I can tolerate it.


Few hundred?

When I looked at adding my ideal RAM and storage onto the base MBP M2 Pro model, the RAM upgrade came to $1634.56 (£1250), and the storage upgrade came to $2876.83 (£2200).

That's enough to be a long term purchase, so I decided to wait for the M3 and see if 3nm makes a big difference.


>Yeah, the memory is overpriced. So are eyeglass frames (rimless are especially overpriced)

That's not a very good comparison. Glasses are important for helping the visually impaired, while Apple memory is just an add-on for a luxury computer.

>But as long as you can afford it, a few hundred dollars isn't that much spread over a few years.

It is, when that few hundred dollars is a 10x markup over market rate (I've checked, and Apple's markups are actually that bad).

>You can get a not-MBP and have a clunky experience [1] but save some money, or spend a the extra few hundred for a MBP with enough memory to be a great experience.

I don't think my experience with my current PC is clunky, and if I get an M1 macbook, I'll be using Asahi Linux instead of MacOS.


I don't mind the overpricing as much as the paltry maximum. Every other computer I have has at least 64GB of RAM, including all my laptops, but until recently you needed a Mac Studio to get that in Apple land. Among other things, you can't run large LLMs on only 32GB.


Yep, you're not wrong at all. It's a great computer, but this is where they get you.


Sure. Get over it. I did. You can too.


Same experience here. However, I regret getting the 512GB drive option. I'm constantly monitoring my disk space as I do work and personal stuff on the same machine. Like I build Docker images as part of work and have to regularly purge out old images. Good thing macOS intelligently makes space (I have about 250GB in the Photos library) so I also get a random free 10GB from time to time.


I was issued one by my new employer. I'd much rather have a 4th USB-C port than the less versatile HDMI. For myself I'd probably go for a 15" MacBook Air instead, even if maxed out on RAM it's not that much cheaper than a MBP.

Another factor: I live in the UK, where they have a particularly crackpot derivative of the ISO QWERTY layout that is well-nigh unusable for a programmer. Apple is the only laptop vendor that will let you choose your keyboard layout, so when I buy anything else, I order from the US, with all the customs hassles that implies.


The M1 finally convinced me to forgive Apple for canceling the Apple ][.


Meh. I have battery for 10 hours, better than the 7 on my Linux machine but not stellar. Graphics are mediocre. Lack of ports means I have to carry dongles around. And putting up with macOS (with no Linux available) is a complete dealbreaker.


Heavy though. Coming from an m1 air to a 16” pro is cartoon-eyes-out-on-stalks surprising when you first pick it up.

I can’t help but wonder if the 15” M2 Air is the best on the market currently.


The 15" Air is actually quite a bit heavier; it goes from 'easily hold it up with one hand with the lid open' to, well, barely possible.

I'm so surprised more people don't want brighter screens.

At night my screen looks gorgeous, though I could even go a touch brighter when doing detailed design stuff.

Daytime? I'd go for literally 2-3x brighter if I could. Let alone working outside! Screen brightness is the biggest QoL improvement I'd get vs any other spec bump.


Yeah, a brighter screen would be awesome. I mistakenly assumed the M1 Pro would be a huge step up in brightness from the M1 Air, but it's just as borderline in direct sunlight.


I use a 13” M2 and it’s by far the best laptop I’ve ever used.

The only things it doesn’t check off are the HDMI out and other port issues, IMO.


FWIW the 14" and 16" M2 pro/max Macbook Pros have HDMI, SDXC, three usb-c ports, and a headphone jack.


I strongly prefer portability to port availability.

I work with my laptop only. No peripherals and I move around a lot. Makes it easier for me to work this way.


15in M2 Air or 14in M1 Pro are both really great options, the optimal mix of screen size, weight, size and power


I know, the last couple macbook lineups are crazy good again. Just when I thought I was out!


Yet my 4-year-old gaming laptop, which cost half as much as an M1 MacBook Pro when it was new, renders any typical Blender scene 4 times faster (using the newest Blender version with Metal support).

I suspect the fan doesn't turn on because it is heavily throttled.


Same, same.


It’s ironic you say: “we are playing with fire.” Playing with fire is, in large part, literally how humans have come to dominate this planet. Why stop now?


To turn your metaphor on its head, we aren’t playing with fire when we use it constructively; rather we are very carefully and thoughtfully deploying it, no doubt due to our gradual and deadly lessons with it over time. When we “play” with it (a la fireworks or neglected campfires), it wreaks rampant destruction.

Seeing as we are basically toddlers with this new technology, I would argue the breathless speed at which it's finding its way into our lives tells me we are not being careful or thoughtful with it.


Counterpoint: "playing with it" is the only way we have to actually master something. "Carefully and thoughtfully deploying it" only comes way after many people first extensively played with it (for any specific "it"), first because of curiosity (i.e. for shits and giggles), then for a quick buck.


Maybe because we're on the verge of being able to create fires which can actually consume the only home we have?

Playing with fire is in large part an ego and greed issue. Yes, it allows us to dominate, but at what cost?

I'd rather live a more balanced life than a greedy and ego driven life. I may not own the world, but I can be happy and sleep sound at night, and that matters.


On the verge? We set that fire in motion a century ago. Our home is nearly consumed.

Today is the hottest day in recorded history.

Yesterday was the hottest day in recorded history.

The day before yesterday was the hottest day in recorded history.

The day before the day before yesterday was the hottest day in recorded history.

The day before the day before the day before yesterday was the hottest day in recorded history.


We've had nuclear weapons for almost 80 years and the world still hasn't ended. And I think nuclear weapons are way more dangerous than Markov chains on steroids.


I can't launch a tactical nuke because somebody wronged me, but I can create a disinformation campaign, for free, with the tools I have and optionally 2-3 smart, motivated individuals.

Both can be equally devastating.

Or, if I want to go the extra mile, I can use the latter to create motivation for the utilization of the former. e.g. I may say that a country has WMDs, and maybe try to manufacture consent for destruction of these...

Oh, wait a minute...


> can create a disinformation campaign with the tools I have and optionally 2-3 smart, motivated individuals, for free

You can, and it may cause unbelievable nuisance, but not to the devastating degree of a tactical nuke. Can you prove otherwise? Russian disinformation came close, as in the 2016 election, but that was state-sponsored.

> Or, if I want to go the extra mile, I can use the latter to create motivation for the utilization of the former. e.g. I may say that a country has WMDs, and maybe try to manufacture consent for destruction of these...

You cannot. It was still a state's action. Not to mention that many countries had their own intelligence, which no doubt had a different assessment. They weren't blind. They used the WMD argument as an excuse to join the US-led war.


>e.g. I may say that a country has WMDs, and maybe try to manufacture consent for destruction

too soon


What if Wargames had LLMs involved?


No, playing is how humans grow up to be adults that don't play, but think.


Are you serious? Us dominating the planet is NOT a good thing.

