I would probably just call it hand coding, as we say we use hand tools in woodworking. Many do this for fun, but knowing the hand tools also makes you a better woodworker.
It's an interesting question: will coding turn out to be more like landscaping, where (speaking specifically of mowing grass) no one uses hand tools, to a first approximation? Or will it be more like woodworking, where everyone at least knows where a Stanley hand plane is in their workshop?
Can't wait to sell my artisanal hand-crafted software at the farmer's market.
Humor aside, longhand programming is losing its ability to compete in an open market. Automate or be left behind. This will become increasingly true of many fields, not just software.
That's actually a great point: judging by the dev team's commits at work, there's an unprecedented amount of code being committed, but it's not actually making it into releases any faster. Maybe the same thing is happening at my various vendors, but then that kind of argues against the idea that Everything Has Just Changed.
Are there app stores on Linux? Yes; that's what Flathub and Snap are supposed to be.
So what, should Canonical just block Ubuntu downloads for anyone in the state of California? No security researcher is going to download an operating system that asks for their age, for example. I feel like that's a red line for me, too.
This law is so completely insane. It sounds like it was written by some Apple fanboy for whom no operating system other than Apple's exists. The very state that spawned GNU and BSD is the same state that is not only demanding your data but enshrining its collection by spyware in law.
NMP in particular readily biodegrades in aerobic environments, both in water treatment plants and just in water. Bacteria seem to crack it quickly. It's also not volatile. You have to protect yourself while working with it, but it's not comparable to really nasty stuff, like heavy metals.
I'm not aware of many (non-manmade) barren wastelands on Terra. Even the Empty Quarter has wildlife. About the only place I can think of would be something like the Dead Sea.
> What would fix that is enforcing the regulations nation wide, then applying tariffs on imported products that don't enforce the same regulations.
This is the biggest lie we are told, and the most heinous. The only thing that will fix it is when people like you (and me!) stop purchasing things which were made in those regulatory environments. If you continue to purchase them under the premise that "I have no choice, I have to participate in this fallen world," so does the state of California. Banning these activities when there are alternative regulatory environments just pushes the problem to someone else.
A great example of this is the Obama-era fuel efficiency laws. No one actually wanted a more efficient truck, so to get around the laws, the manufacturers just made larger trucks, which caused more problems than they solved.
Outlawing something, then doing nothing to stop demand for that thing, that's just irresponsible.
I don't think that will work. There's simply no viable path towards that much coordination; especially when late stage capitalism ensures that most people are living too hand to mouth to be able to worry about stuff like the environment.
You jest, but when I do interviews, I have prospective hires write out a Python program that ingests YAML ON THE WHITEBOARD. They don't have to be perfect. Their code doesn't have to compile. But how closely they can hit this mark tells me if they have even a sliver of an idea what's going on in code.
> If you connect to a Wi-Fi network that isn't your company's, Teams will simply display the name of that network. So if you decide to take a "working lunch" and connect to "Starbucks_Guest_WiFi", your boss sees it instantly.
Looks like I need to rename my home wifi to "Corporate Network."
It's not that hard to create your own search engine, office suite, and school ecosystem. I mean, none of Google's services is irreplaceable, especially if a country sets its mind to it. Just do that if you're worried about it.
> Instead, just look at go.mod. It lists the precise version at which all dependencies are built.
No, it does not. Minimum version selection means that the libraries will be at least that version, but a later version could be substituted if a transitive dependency asks for one.
That I'm reading this blog post at all suggests there is a "market" for a single checksum/version manifest, data which is currently housed in go.sum. This is sad, but, Hyrum's Law and all that.
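For intuition, here's a toy sketch of the MVS idea in Go: among all the versions the requirement graph asks for, each module resolves to the highest version requested, which is the *minimum* version satisfying every requirement. This is greatly simplified (the real algorithm lives in cmd/go and golang.org/x/mod/mvs), and the module names and versions are invented:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// graph maps "module@version" (or the root module name) to its requirements.
type graph map[string][]string

// newer reports whether version a is higher than version b.
// Toy comparison: plain "vX.Y.Z" only, compared numerically.
func newer(a, b string) bool {
	pa := strings.Split(strings.TrimPrefix(a, "v"), ".")
	pb := strings.Split(strings.TrimPrefix(b, "v"), ".")
	for i := 0; i < 3; i++ {
		na, _ := strconv.Atoi(pa[i])
		nb, _ := strconv.Atoi(pb[i])
		if na != nb {
			return na > nb
		}
	}
	return false
}

// mvs walks the requirement graph from root and, for each module, keeps
// the highest version that any reachable module asks for: that is the
// minimum version satisfying every declared requirement.
func mvs(g graph, root string) map[string]string {
	picked := map[string]string{}
	var visit func(node string)
	visit = func(node string) {
		for _, req := range g[node] {
			mod, ver, _ := strings.Cut(req, "@")
			if cur, ok := picked[mod]; !ok || newer(ver, cur) {
				picked[mod] = ver
				visit(req)
			}
		}
	}
	visit(root)
	return picked
}

func main() {
	g := graph{
		"main":     {"a@v1.1.0", "b@v1.2.0"},
		"a@v1.1.0": {"c@v1.3.0"},
		"b@v1.2.0": {"c@v1.4.0"},
	}
	// c resolves to v1.4.0: the lowest version satisfying both requests.
	fmt.Println(mvs(g, "main"))
	// map[a:v1.1.0 b:v1.2.0 c:v1.4.0]
}
```

Note the key property: the answer depends only on the declared requirements, so resolution is reproducible without a separate lockfile.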
> No, it does not. Minimum version selection means that the libraries will be at least that version, but a later version could be substituted if a transitive dependency asks for one.
No?
All dependencies - direct and indirect - are listed in your go.mod. Your module - as is - depends on nothing else. And those exact versions will be used to build it, if yours is the main module.
If your module is used as a dependency of another module, then yes, your module may be built with a newer version of those dependencies. But that version will be listed in that module's go.mod.
There's no way to use different versions without them being listed in some go.mod.
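For illustration, since Go 1.17 a go.mod records the indirect (transitive) requirements too; the module paths and versions here are made up:

```
module example.com/app

go 1.22

require (
	example.com/liba v1.1.0
	example.com/libb v1.2.0
)

require example.com/libc v1.4.0 // indirect
```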
go.sum only maps versions to hashes, and may contain hashes for multiple versions of a module.
If you wanted to verify the contents of a dependency, you would want to check go.sum. That's what it is there for, after all. So if you wanted to fetch the dependencies, then you would want to use it to verify hashes.
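For reference, each go.sum line pairs a module and version with an h1: hash, one entry for the module zip and one for its go.mod; the hashes below are placeholders, not real values, and note that multiple versions of the same module can appear:

```
example.com/libc v1.3.0 h1:PLACEHOLDERAAAA=
example.com/libc v1.3.0/go.mod h1:PLACEHOLDERBBBB=
example.com/libc v1.4.0 h1:PLACEHOLDERCCCC=
example.com/libc v1.4.0/go.mod h1:PLACEHOLDERDDDD=
```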
If all you care about is the versions of dependencies, you really can (and should) trust go.mod alone. You can do this because there are multiple overlapping mechanisms that all ensure a tag is immutable once it has been used:
- The Go CLI tools will of course use the go.sum file to validate that a given published version of a module never changes (at least since it was first depended on; it's also complementary with the mechanisms below, so coverage can be even better than that).
- Let's say you rm the go.sum file. That's OK. The tools also default to using the Go checksum database to verify that a given published version of a module can never change. So if a module has ever been `go get`'d by a client with the checksum database enabled and it's publicly accessible, then it should be in the database, and future changes to its tags will be rejected.
- And even then, the module proxy is used by default too, so as soon as a published version is used by anyone, it will wind up in the proxy, as long as it's under a suitable license. Which means that even if you go and overwrite a tag, almost nobody will ever actually see it.
The downside is obviously all of this centralized infrastructure that is depended on, but I think it winds up being the best tradeoff; none of it is a hard dependency, even for the "dependencies should be immutable" aspect thanks to go.sum files. Instead it mostly helps dependency resolution remain fast and reproducible. Most language ecosystems have a hard dependency on centralized infrastructure, whether it is a centralized package manager service like NPM or a centralized repository on GitHub, whereas the centralized infrastructure with Go is strictly complementary and you can even use alternative instances if you want.
But digression aside, because of that, you can trust the version numbers in go.mod.
> If you wanted to verify the contents of a dependency, you would want to check go.sum
You're right, but also TFA says "There is truly no use case for ever parsing it outside of cmd/go". Since cmd/go verifies the contents of your dependencies, the point generally stands. If you don't trust cmd/go to verify a dependency, then you have a valid exception to the rule.
Agreed. Arguably, though, it would be much more reasonable to trust cmd/go to verify a dependency than to trust your own code: a lot more effort is put into it, and it has a proper security process established. So I think the point is, if you find yourself needing to verify go.sum by means other than cmd/go, you are very likely doing something wrong.
A local cache of sums is also stored in the module cache (under $GOMODCACHE/cache/download, iirc), so even if you delete go.sum from the project, the local toolchain should still be able to verify module versions it has previously seen without needing to call out to the checksum database.
Probably unpopular, but I just use Bazel and pick the versions of software I use.
I know the current attitude is to just blindly trust 3rd party libraries (current and all future versions) and all of their dependencies, but I just can't accept that. This is just unsustainable.
Go MVS does not require you to blindly trust 3rd party libraries. Certainly not "current and all future versions". Go modules also offer hermetic and reproducible dependency resolution by default.
Conversely, I can say that a hash being in go.sum doesn't mean it will be used for anything.
Only that if the corresponding version does get used, and the hash doesn't match, you get an error. But you can have multiple versions of the same dep in your go.sum - or none at all - and this has no bearing on what version gets picked when you build your module.
The version that does get picked is the one in go.mod of the main module, period; go.sum, if it exists, assists hash verification.
Yes, if you want a lockfile in the npm sense, you need both.
But a Go module does not get built with new transitive dependencies (as was claimed) unless they're listed in some go.mod; go.sum is irrelevant for that.
Although he doesn't spell it out, I suspect this is the primary misunderstanding that drove Filo to open with "I need everyone to stop looking at go.sum, especially to analyze dependency graphs". I've had more than one code reviewer ding me on a module showing up in go.sum. Usually it's a situation where a dependency has tests for compatibility with some other module so that other module gets added to go.sum. Given Filo is a professional open source maintainer, any annoyance I've run into he's probably experienced 100x.
As explained in the post, if a transitive dependency asks for a later version than you have in go.mod, that’s an error if -mod is readonly (the default for non-get non-tidy commands).
I encourage you to experiment with it!
This is exactly how the “stricter” commands of other package managers work with lockfiles.
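For example, hand-editing go.mod down below what a transitive dependency requires fails at build time under the default -mod=readonly; a transcript might look like this (error text paraphrased, it varies by Go version):

```
$ go build ./...
go: updates to go.mod needed; to update it:
	go mod tidy
```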
I'm not deeply familiar with this, but from reading the `go mod tidy` manual[1], it seems that running `go mod tidy` loads all packages imported from the main module (including transitive dependencies) and records them with their precise versions back to `go.mod`, which should prevent them from being substituted with later versions. Am I understanding this correctly?
go.mod will always match whatever versions are being used directly, as far as I know. But it's not possible to lock them using go.mod. Like if you wanted to bump one version only in go.mod, you're then stumped for actually doing that. Because _probably_ the only reasonable way to get that to build is to do `go mod tidy` after doing that, which will modify go.mod itself. And you can't _really_ go back in and undo it unless you just manually do all of go.mod and go.sum yourself.
Running `go mod tidy` months apart with no other changes to your module will not change your go.mod. It certainly won't update dependencies.
You run that when you've made manual changes (to go.mod or to your Go code), or when you want to slim down your go.sum to the bare minimum needed for the current go.mod.
And that's one common way to update a dependency: you can edit your go.mod manually. But there are also commands to update dependencies one by one.
go always requires a dependency graph that is consistent with all the declared requirements.
Which means if you wanted to update one version, it might bump up the requirements on its dependencies, and that's all the changes you see from running go mod tidy afterwards.
Manually constructing an inconsistent dependency graph will not work.
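In practice, the one-at-a-time update flow looks like this (the module path is hypothetical):

```
$ go get example.com/lib@v1.5.0   # bump a single dependency in go.mod
$ go mod tidy                     # reconcile the requirement graph and go.sum
$ git diff go.mod                 # only lib, and whatever it forces, should move
```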
The MVS choices will be encoded into the go.mod; you may have been correct in the past, but as the post mentions transitive dependencies have been incorporated since Go 1.17. So yes, really: the only point of go.sum is to enable checking the integrity of dependencies, as a nice double-check against the sumdb itself.
Just to clarify, this will download the entirety of all of the dependencies in order to find out what versions are resolved? And then those versions aren't actually locked unless you keep the vendored dependencies around indefinitely?
My understanding is that the point of a lockfile is that you don't need to do that.