- "68.37% of CPU was spent computing these checksums. With a one-line code change to enable hardware acceleration on Graviton via the sha2 library, this went down to 31.82%. This improvement allows us to push at least 2x more throughput from these processes without increasing our compute spend." - Shikhar, CEO of S2
But later they say:
- Checksum Processing Efficiency: The AWS S3 Rust SDK was found to be unnecessarily recomputing CRC32C checksums. Identifying this issue led to the implementation of a simple workaround, further improving efficiency.
The eternal discussion isn't about old vs new or boring vs exciting. Mature is mature regardless of age.
A system that breaks when updating dependencies, introduces unexpected behaviour through obscure defaults, or forces you to navigate layers of abstraction isn't mature (looking at you, Spring and the Java ecosystem); it's old and unstable.
Stability, predictability, and well-designed simplicity define maturity, not age alone.
Is Python mature and boring? With toolchain issues and headaches of all kinds... Newer languages like Go and Rust solve all these toolchain issues and are truly boring in the best way possible.
Go and Rust are only "boring" if you vendor all your 0.x.y-versioned dependencies (or even worse, dependencies whose "version" is just the latest git commit to trunk) and carefully vet every single update for breakage.
The nice thing about Go (compared to NPM) is that a lot of those libraries are just nicer APIs over the standard library, not some core tech you need. You can go for the 0.x version with the absolute assurance that you can fork it or vendor it with minimal cost in support time.
(Genuine question.) I only occasionally write Python, but I just use venv and install from a requirements file. What toolchain challenges are out there for Python?
For a large enough project, the dependency conflicts can get extremely frustrating, especially when it's time to update them. You may need to upgrade a dependency for security reasons cough cough requests cough cough, but some other dependency that calls it has pinned another version (range).
Dependency conflicts become an issue for large projects in any language. It's less of a problem when the language's runtime is feature-rich, since libraries will be less likely to pull in a third-party HTTP client. You can choose libraries with fewer dependencies, but that only gets you so far. At some point you can put the libraries in your monorepo, but then upgrades come with a large cost.
Yeah, that is a nightmare. But isn't that a problem in all package systems except more dynamic runtimes like NPM, which can load many copies of the same library?
It's a problem all languages have, but some are better at sorting it out. The way NPM does it solves one issue, but causes others.
The big issue, IMHO, is that when you're dealing with interpreted languages it's very hard to lock down issues before runtime. With compiled or statically typed languages you tend to know a lot sooner where issues lie.
I've had to update requests to deal with certificate issues (to support more modern ciphers/hashes etc) but I won't know until runtime if it even works.
Everywhere I've worked, I've had a few cases where we updated some dependencies on machine A (e.g. a developer's MacBook), everything ran fine, then we did the same updates on machine B (e.g. an Ubuntu EC2 instance) and everything broke. This is especially the case with the numpy/scipy/pandas/etc. ecosystem. In one case this took days to fix, which is insane. I haven't had that experience with any other language.
It's worth noting that all of these involved anaconda, which was the recommended way to install numeric libraries at the time. Other package managers might be better.
Rust is the opposite of mature in practice. A language can be mature but if the style of devs that chose to write in it aren't, on average, then it doesn't matter.
Sometimes old versus new affects this. For example, Rust improves so fast that I've literally had a 3-month-old rustc be unable to compile a Rust program (an SDR FFT thing) because it used a new feature my 3-month-old rustc didn't support. As I continued to encounter Rust projects, this happened a few more times. Then I decided to stop trying to compile Rust projects.
Right now the dev culture is mostly bleeding-edge types who always use the latest version and target it. As Rust becomes more generally popular and the first-adopter bleeding-edge types make up proportionally less of the userbase I expect this will happen less. Bash still gets new features added all the time too; it's just that the type of developers who chose to write in Bash care about their code being able to work on most machines. Even ones years (gasp!) out of date.
Yes, I don't curl|sh like recommended for my rustc or otherwise install random arbitrary compilers from outside of my OS repositories. I have an OS install with system libraries and programs and I want to use that.
I don't have to set up a custom install of a language for every single application in any other language (although Python in the machine-learning domain is getting there). This is an abnormality which complicates software maintenance and leads to problems. It should be avoided if possible. And setting up a container for every application is not a solution either; it's a symptom. Like a fever is a symptom of infection, containers are a symptom of development future shock.
To be clear, I'm talking about in the context of a human person and a desktop computer. Not employed work at a business.
You can install `rustup` using your system package manager if you really want to. You could also `curl | manually-verify-script | sh`. But if you don't stick to recommended install procedure then of course you are stepping out of the "boring" path.
> I don't have to set up a custom install of a language for every single application for any other language
Which languages do you use? I find that using version manager saves a lot of headaches for every language that I use, and is very much a normality. Otherwise I run into issues if I need different versions for different projects. The fact that Rust has a first-party version manager is a blessing.
rustup still is outside of repos even if the download method isn't silly and insecure. For some random applications that's fine, but for a compiler and toolchain? No. If I wanted a rolling distro I'd use a rolling distro. Rust culture only being compatible with rolling is not a good thing for many use cases.
>Which languages do you use?
c, c++, perl, bash. A program written in perl+inline c today will compile and run on system perl+gcc from 2005. And a perl+inline c program written in 2005 will compile and run just fine on system perl+distro today. And pure Perl is completely time/version portable from the late 90s to now and back. No need for containerization or application specific installs of a language at all. System perl just works everywhere, every time.
There are c++xx-isms and non-c89 isms in some programs written in these languages. But those only appear a couple of times a decade, and because of the wide popularity and the dev culture, their adoption lags much further behind introduction than in Rust or Python.
Highly recommend "Stories of Your Life and Others".
I describe Ted Chiang as a very human sci-fi author, where humanity comes before technology in his stories. His work is incredibly versatile, and while I expected sci-fi, I'd actually place him closer to fantasy. Perfect for anyone who enjoys short stories with a scientific, social, or philosophical twist.
Another anthology I'd recommend with fresh ideas is Axiomatic by Greg Egan.
Exurb1a is also worth reading. He's better known for his YouTube video essays (which vary between bleak and profound, usually within the same video), but he has published several books. I got about halfway through Fifth Science before leaving it on a plane (yesterday); I plan to rebuy it so that I can finish it.
In the sci-fi space I'd argue that Ursula K. Le Guin is another must read. She was heavily influenced by taoism (and eastern philosophy). When you approach her work with that in mind, it adds a whole new layer of depth to everything.
I’ve never encountered anything like Egan before. I’ve heard Stanislaw Lem mentioned in conversations about him though. But I can’t vouch for the comparison myself as I’ve never read Lem.
Both are fresh voices and well worth reading, but I don't think Lem comes anywhere near Egan's diamond-hard sci-fi. Egan knows, and does, real math; you can sometimes find him at the n-category Café. My impression is that Lem's beautiful philosophical ideas were not accompanied by comparable math or physics knowledge.
Lem is a humanist. The sci-fi is only a vehicle to make you think (eh, if you want to), and while these things were written in 1950-80, they are not outdated, because humans have been essentially the same for millennia. Just read "Tales of Pirx the Pilot". Somewhere in the middle of them, you may notice something like the current frenzy around LLMs and ethics. But he goes further.
The last time I responded to a similar comment by suggesting asking an AI, I was downvoted to hell. I won't do it again. I will note, though, that the list generated was excellent and provided rewarding information.
I recommend the story Hell is the Absence of God in the book you mentioned; as someone non-religious, it was quite interesting to see how people generally feel about deities and their awesome power, from this short story [0].
I think of this as “humanist” sci-fi; which has heavy overlap with “golden era” SF.
Other authors I’d put in this category are Gene Roddenberry (TOS and TNG, particularly), Asimov, PKD, Vonnegut and Theodore Sturgeon.
Personally: fantasy stories are "and-then" stories, SF stories are "what-if" stories. Humanist sci-fi then asks "what-if" about very human things, as opposed to technological things, although the two are always related.
However, practically speaking, literature vs sci-fi vs fantasy (vs young adult!) are more marketing cohorts than anything else; what kind of people buy what kind of books?
Unfortunately, every software project will eventually reach a point of maturity where more and more features are added simply for the sake of adding them.
"The goal of this proposal is to introduce a new syntax that reduces the amount of code required to check errors in the normal case, without obscuring flow of control."
The key is "check errors in the normal case".
When the core principles of Go have always been simplicity, flexibility, and having one way of doing things, this feels like a complete step in the opposite direction. We will have syntactic sugar for "normal cases" while still relying on the `if err != nil` block for everything else. It's similar to how we now have both iterators and `for` loops as constructs for looping.
What is happening is that the Go designers are discovering that the "academic" features of the languages that predated Go by decades exist for a reason.
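For anyone who hasn't felt the repetition the proposal targets, here's a sketch in today's Go (function names are made up for illustration): every fallible call needs its own near-identical check, even when all you want is to propagate the error.

```go
package main

import (
	"fmt"
	"strconv"
)

// parseSum shows the pattern being discussed: two operations,
// two near-identical `if err != nil` blocks.
func parseSum(a, b string) (int, error) {
	x, err := strconv.Atoi(a)
	if err != nil {
		return 0, fmt.Errorf("parsing %q: %w", a, err)
	}
	y, err := strconv.Atoi(b)
	if err != nil {
		return 0, fmt.Errorf("parsing %q: %w", b, err)
	}
	return x + y, nil
}

func main() {
	sum, err := parseSum("2", "40")
	if err != nil {
		panic(err)
	}
	fmt.Println(sum) // 42
}
```

The proposed sugar would collapse the "normal case" checks, but anything beyond simple propagation would still need the explicit block, which is the two-ways-of-doing-it complaint above.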
GitHub cares. GitHub cares about active users on their platform, whether they're managing PRs, doing code reviews, or checking the logs of yet another failed action.
They don’t care about things that I care about, including everything the author talked about, and also things like allowing whitespace-ignore on diffs to be set on by default in a repo or per user - an issue that’s been open for half a decade now.
(Whitespace is just noise in a typescript repo with automatic formatting)
GitHub often actively fails to act in situations where acting would be prudent, which, from an outside perspective, signals disinterest in those who give their time to document shortcomings. Care to guess when the GitHub API was last updated? It's probably much longer ago than you'd think (2+ years at this point).
Bitcask is great; it's such an elegant idea. I've built a few different toy implementations in different languages and learned something new each time. YMMV based on how many deps you do or don't want to use and how complete you want it to be, but it's a totally doable small-ish project.
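For anyone curious what the elegant idea boils down to, here is a deliberately tiny in-memory sketch in Go (hypothetical names; no file I/O, hint files, or compaction): an append-only log plus an in-memory "keydir" hash map pointing each key at its latest record.

```go
package main

import (
	"encoding/binary"
	"fmt"
)

// loc records where an entry lives in the append-only log.
type loc struct{ off, size int }

// Store is a toy Bitcask: writes append to a single log; reads go
// through the keydir, which maps each key to its newest record.
// (A real Bitcask persists the log to disk, rebuilds the keydir on
// startup, and compacts old records; this sketch skips all of that.)
type Store struct {
	log    []byte
	keydir map[string]loc
}

func NewStore() *Store { return &Store{keydir: make(map[string]loc)} }

func (s *Store) Put(key, value string) {
	// Record format: [keyLen uint32][valLen uint32][key][value]
	var hdr [8]byte
	binary.BigEndian.PutUint32(hdr[0:4], uint32(len(key)))
	binary.BigEndian.PutUint32(hdr[4:8], uint32(len(value)))
	off := len(s.log)
	s.log = append(s.log, hdr[:]...)
	s.log = append(s.log, key...)
	s.log = append(s.log, value...)
	// The keydir always points at the newest record for the key,
	// so older records become garbage awaiting compaction.
	s.keydir[key] = loc{off: off, size: 8 + len(key) + len(value)}
}

func (s *Store) Get(key string) (string, bool) {
	l, ok := s.keydir[key]
	if !ok {
		return "", false
	}
	rec := s.log[l.off : l.off+l.size]
	keyLen := binary.BigEndian.Uint32(rec[0:4])
	return string(rec[8+keyLen:]), true
}

func main() {
	s := NewStore()
	s.Put("lang", "go")
	s.Put("lang", "rust") // overwrite: keydir now points at the new record
	v, _ := s.Get("lang")
	fmt.Println(v) // rust
}
```

Swapping the byte slice for an `os.File` and rebuilding the keydir by scanning the log on startup gets you most of the way to a real implementation, which is why it makes such a good learning project.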