Hacker News | thwarted's comments

Generally a good idea, but I'm not sure why you would even want to fork a git repo when a local clone should be sufficient. But this is probably a terminology mixup from the way GitHub presents forks and clones.

I believe the author's idea is to do dev work from a GitHub account that only has access to the fork, but not to the main repo. Then, as a contributor, you'd open PRs from your fork to the main repo. I think this would only work if your GitHub account doesn't have write access to the main repo, though. I know you can use deploy keys to give read access to a single repo using an SSH key, but I'm not sure if you can otherwise restrict access to a single repo with write access. Essentially, though, you'd want to find a way to give the remote host the most limited possible privileges to your GitHub account.
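For what it's worth, the usual way to pin an agent's SSH access to a single repo is a deploy key plus an ssh_config host alias; the alias, key path, and repo names below are hypothetical:

```text
# ~/.ssh/config on the machine the agent runs on (names/paths made up)
Host github-fork
    HostName github.com
    User git
    IdentityFile ~/.ssh/fork_deploy_key
    IdentitiesOnly yes
```

Cloning via `git clone github-fork:example-bot/fork.git` then uses only that key. Deploy keys can be registered as read-only or read/write, but either way they're attached to exactly one repository, so this does give you single-repo write access without exposing the rest of the account.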

You could also just set the development machine up as a remote on the repo on your local host and then pull, diff, and merge locally. Then the llm agent doesn’t have access to any github account at all.
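A minimal sketch of that flow, with a local directory standing in for the ssh-reachable dev machine (the "devbox" name and all paths are made up):

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"

# Your local repo.
git init -q -b main project
cd project
git -c user.email=me@example.com -c user.name=me \
    commit -q --allow-empty -m "initial"

# "devbox" stands in for the remote machine the agent works on;
# over ssh the URL would look like user@devbox:/home/agent/project.
git clone -q . "$tmp/devbox"
git -C "$tmp/devbox" -c user.email=agent@example.com -c user.name=agent \
    commit -q --allow-empty -m "agent change"

# Track the dev machine as an ordinary remote; review, then merge locally.
git remote add devbox "$tmp/devbox"
git fetch -q devbox
git log --oneline main..devbox/main   # only the agent's commits
git diff main devbox/main             # inspect the actual changes
git merge -q devbox/main              # accept them (a fast-forward here)
```

The agent never needs credentials for anything: you initiate the fetch from your side, and nothing lands in your branch until you've looked at it.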

I use an overlay copy of my workdir, then the sandboxed LLM doesn't get any of my secrets, can do its own commits, and I pull the ones back that I want.

Oh, a separate GitHub account that has its own forks of the repos the agent is working on. Yeah, that's probably the most secure and isolated approach. The merge to the canonical repo then needs to go through a human, or at least a separately controlled process, via a GitHub pull request.

Maybe this is doable with scoped API keys instead of SSH keys?

On a GitHub project, agents should simply be treated as untrusted external contributors.

They mention that as a mechanism for protecting the SSH keys for the repo.

Essentially using a repo that doesn’t matter with the coding agent and then creating a cross-repo PR to the real repo.


> It shows that you can build a crazy popular & successful product while violating all the traditional rules about “good” code.

We already knew that. This is a matter of people who didn't know that, or didn't want to acknowledge it, thinking they now have proof that it doesn't matter for creating a crazy popular & successful product, as if it's a gotcha on those who advocate for good practices. When your goal is to create something successful that you can cash out, good practices and quality are/were never a concern. This is the basis for YAGNI, move-fast-and-break-things, and worse-is-better. We've known this since at least Betamax-vs-VHS (although maybe the WiB VHS cultural knowledge is forgotten these days).


WiB is different from Move Fast and Break Things and again different from YAGNI though.

WiB doesn't mean the thing is worse, it means it does less. Claude Code interestingly does WAY more than something like Pi, which is genuinely WiB.

Move Fast and Break Things comes from the assumption that if you capture a market quickly enough, you will then have time to fix things.

YAGNI is simply a reminder that not preparing for contingencies can result in a simpler code base since you're unlikely to use the contingencies.

The spaghetti that people are making fun of in Claude Code is none of these things except maybe Move Fast and Break Things.


> WiB is different from Move Fast and Break Things and again different from YAGNI though.

Yes, which is why I listed all three.

It's not about whether the vibe coding strictly results in any of these; it's that the vibe coder can claim the low quality doesn't matter and cite any of these as support.


VHS was not worse is better. It’s better is better.

Specifically, VHS had both longer recording times and cheaper VCRs (due to Matsushita’s liberal licensing) than Betamax did. Beta only had slightly better picture quality if you were willing to sacrifice recording length per tape. Most Betamax users adopted the βII format which lowered picture quality to VHS levels in order to squeeze more recording time onto the tape. At that point Betamax’s only advantage was a slightly more compact cassette.

Also to correct another common myth, porn was widely available on both formats and was not the cause of VHS’s success over Betamax.



Not in ways that the market cared about.

Arguably better quality, but at the cost of being shorter. In the great trade off of time, size, and quality, I think VHS chose a better combination.

Importantly, Beta's recording time was so short that it was inadequate. Go beats no-go every time.

It depends which definition of "better" you use. VHS won the adoption race, so it was better there. While Betamax may have been technologically superior, in hindsight we can say it apparently failed to address other key aspects of the technology adoption lifecycle.

Who cares? It couldn’t hold a movie on one tape. That’s what the market ended up selecting for. As soon as renting movies took off, Beta lost.

When they compromised quality to get there, they were just more expensive.

And later S-VHS improved quality anyway.


> can we demon strait somehow a unpriv non root user

"demon strait". Was this speech to text? That might explain the punctuation and grammar.


The grammar doesn't matter; fixing it is a total waste of time. Obviously that's not the case when writing to another human, where it's a show of respect.

The upvotes ultimately train the bots, reinforcing the content posted. Even the most passive form of interaction has been co-opted for AI.

If you have ssh access to the remote machine to set up a git remote, you can login to the remote machine and commit the changes that you forgot to commit.


Charging for self-hosted runners is like a corkage fee but you still need to open the bottle yourself.


> it happens when they give Claude too much autonomy. It works better when you tell it what to do, rather than letting it decide. That can be at a pretty high level, though. Basically reduce the problem to a set of well-established subproblems that it’s familiar with. Same as you’d do with a junior developer, really.

Equating "junior developers" and "coding LLMs" is pretty lame. You handhold a junior developers so, eventually, you don't have to handhold anymore. The junior developer is expected to learn enough, and be trusted enough, to operate more autonomously. "Junior developers" don't exist solely to do your bidding. It may be valuable to recognize similarities between a first junior developer interaction and a first LLM interaction, but when every LLM interaction requires it to be handheld, the value of the iterative nature of having a junior developer work along side you is not at all equivalent.


I didn’t say they are equivalent, nor do I in any way consider them equivalent. One is a tool, the other is a person.

I simply said the description of the problem should be broken down similar to the way you’d do it for a junior developer. As opposed to the way you’d express the problem to a more senior developer who can be trusted to figure out the right way to do it at a higher level.


This feels like a rediscovering/rewording of Kernighan's Law:

"Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it." ~ Brian Kernighan


It's an old saying, I think Einstein is cited most often for it... something like this according to Google:

"We cannot solve our problems with the same thinking we used when we created them."


> I tried to teach her to code a few months back and it was hilarious. We started with "First download VS Code". We never made it to another step.

This has been a serious regression in the industry for a while: popular operating systems (I'm looking at you, Windows) don't encourage and are not set up for their users to program or even do the bare minimum of random automation unless it's embedded in an application and meant for automating just that application (excel macros).

You are encouraged and directed to install and use "apps" which are either a one-size-fits-all lowest common denominator or a tries-to-do-everything dog's breakfast of frustration.

The Commodore 64 turned on instantly, said "READY.", and effectively gave you a blank canvas to poke (no pun intended) at. It was BASIC, but it was a real (if simple and limited) programming language, and you got immediate feedback and satisfaction from playing with it to learn what it could do. The syntax of BASIC is simple, and the stdlib is comprehensive and unopinionated. There was nothing to download; you could get started immediately, get that initial dopamine hit, and begin to realize the true power of what computers can do and what you could make them do.

If you want a better chance at getting someone excited about programming, there are much better places to start than VS Code. PICO-8, Scratch, even the browser's developer tools are more accessible.


I read comments such as:

>> I don't get it. LLMs are supposed to have 100% bridged this gap from "normie" to "DIY website." What's missing?

as less sincere and more facetious, calling out that every single "AI" company is massively overhyping their capabilities and use-cases. You did the same thing in a more detailed fashion, enumerating all the constraints that AI can't address, and others that speak to the reasons small businesses don't have websites independently of the tooling/services that are ostensibly able to make it easier or remove barriers.

