Hacker News | joquarky's comments

Not reliably.

We should be applauding the promotion of science and useful arts that genAI is fueling.

But egos are involved.


At least it's more productive than AI Derangement Syndrome.

Every time Walmart does something evil, I cry myself to sleep when I realize I can never go to Walmart again.

Local models are a thing.


If you're worried about theft, then make backups.

Code isn't like a house; you can just copy it and put the copy somewhere safe.


What should be the maximum allowable cyclomatic complexity of license conditions?

How did they steal your code? Don't you have backups?


What do you think is the end state? What will society look like 5 or 15 years down the line if somebody creates actual AGI, according to you?

Let's not forget the basis here: To promote the progress of science and the useful arts.

Everything else is window dressing. The fact that licenses even exist to attach conditions to use cuts against this grain and creates far too much overreach, spoiling the spirit of copyright law's foundation.


The question "why" is always answered with post-hoc rationalizations. This applies to both LLMs and humans.

No, I think a lot of humans can explain why they're adding a new button to the checkout page, or why they're removing a line from the revenue reconciliation job. There's always a reason a change gets made, or else nobody would be working on it at all :)

As someone who often fails to read subtext, I would estimate that most people expect you to participate in mind reading as a natural part of conversation.

So it is no surprise that many people have difficulty switching gears to literal mode when interacting with these models.

