It was because IA-64 was a completely different, unrelated architecture that, until AMD succeeded with the K8, was "the plan" for both Intel's 64-bit roadmap and the roadmap to kill off compatible vendors (AMD, VIA)
I guess it's because 99% of generated code will likely need significant edits, so you'd never want to commit direct "AI contributions"; you don't commit every time you take something from StackOverflow. Likewise, I wonder if people might start adding credit comments to LLMs?
> I guess because 99% of generated code will likely need significant edits
What are you guessing / basing this on?
I have many commits with zero human editing. The relative split is definitely well away from 99% vs 1% at this point; for me, most remaining edits are only minor, not "significant"
OP here: I had the same thought, but noticed a very similar trend in both [0]. I think this graph is more interesting because you'd expect the number of new users to be growing [1], yet this seems to have very little effect on deleted questions or even answers
The second graph here ([1]) is especially interesting because the total monthly number of new users seems completely unrelated to the number of posts, until you filter for rep > 1, which follows a near-identical trend
It's great for a prototype that doesn't need to store a huge amount of data; you can run it on the same VM as a Node server behind Cloudflare and get a fairly reliable setup going
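To sketch why SQLite suits this kind of single-VM prototype (the whole database is one file next to your app, no separate server process), here's a minimal example using Python's stdlib sqlite3; the table name and workload are made up for illustration, and the same pattern applies from Node via a driver such as better-sqlite3:

```python
import sqlite3

def demo(path: str = ":memory:") -> list[str]:
    # One file (or an in-memory DB here) is the entire database;
    # nothing else to provision on the VM.
    conn = sqlite3.connect(path)
    # WAL mode lets readers proceed alongside a single writer,
    # which is plenty for prototype-scale traffic.
    conn.execute("PRAGMA journal_mode=WAL")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)"
    )
    conn.execute("INSERT INTO notes (body) VALUES (?)", ("hello",))
    conn.commit()
    return [row[0] for row in conn.execute("SELECT body FROM notes")]

if __name__ == "__main__":
    print(demo())
```

Backing the file up is just copying it (or using SQLite's online backup API), which is part of what makes the "one VM behind Cloudflare" setup workable.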
I really like this reactive guide-style interface, which could maybe make quite a good project: something like mdBook [1] that also lets you insert quizzes/examples alongside static notes
I had the same thought - I guess it's similar to the idea that if you had someone else's eyes, you might not perceive specific colours the same way?
But actually it sort of makes sense, since (from what I understand) it's stimulating an external interface (the receptors), so you're mimicking the effect a smell would have on you rather than the electrical signal created by the response to a stimulus?