After almost a decade of building Firefox from source, I finally bit the bullet and switched to a precompiled binary last month. It fixed all the issues that had started to crop up, but it leaves a bad taste in my mouth.
I had been holding off on their binaries, despite security updates meaning builds at inopportune times that took an hour or three (depending on the machine), but as I wrote in a sibling comment, enabling the cairo-gtk2 backend resulted in crashes when trying to select a file with the GTK file dialog. So I bit the bullet and am now struggling with GTK3's drawing regressions.
I think the third-party situation is unlikely, unless they're shipping a fork like IceWeasel or stripping out all the branding information. At the end of the compilation process on Gentoo, it helpfully tells you (if you enabled Firefox branding) that you cannot legally redistribute a branded build of Firefox.
I interpret a lot of the AGPL projects I see as saying "we're willing to negotiate a secondary license for your private business". If you're the best in class (or just the only available public option), it can make sense as a way to try to make money. If there's a more liberally licensed substitute, however, it's probably best read as a "don't use me for your business" flag...
Right, but from a business perspective the same argument goes for GPL vs. permissive licenses: why use GPL if there's an Apache or BSD alternative?
I'm OK with dual licensing; what annoys me is businesses that make nominally free software but put half the core functionality in proprietary extensions. It feels like they're saying, "We want the community to contribute to our core, but we're going to sell the profitable parts."
Even worse is a company like Odoo (not to name anyone...), which switched from an all-AGPL-plus-paid-support model to a free-core-plus-proprietary-modules model. They relicensed their codebase, claiming they own the copyright for all of it since they rewrote all the community-contributed parts, but that's really questionable. You can't erase ten years of community contributions.
I don't really have a problem with this sort of practice, and I think it reflects the model most companies should use: keep the software that's core to your business proprietary, and open-source everything that isn't. It's the only model that will keep open source alive and well funded.
It is unreasonable and unrealistic to expect a company to open source everything it creates; there are legitimate competitive advantages to keeping that software to yourself. It is much easier to convince companies to share their non-core software, and companies will actually gain from doing this.
I like to tell this to people when they start gushing about Japan because of their cleanliness or law and order or whatever, but they didn't make private possession illegal until a couple years ago. Before then it was just production or possession with commercial intent that was illegal. And the new law still doesn't apply to manga/anime. Is Japan a hell-hole? Are Japan's children all psychologically ruined or at a much higher threat of abuse because of the prevalence of anime/manga porn? (Some quick googling on opened investigations suggests a mere 0.0075% of Japan's children are affected per year.) To me Japan seems fine. (Another odd behavior is that daughters bathe with their fathers, sometimes (like 6-10%) all the way up to high school age.)
Isn't it that the whole family bathes together? The way you put it makes it sound like it's just the two of them, which doesn't fit the impression I have, though I don't really know.
Curious what they weren't so happy about with Python? Was it purely performance? If so, did they consider PyPy, or at least profile what the slowest bits are so they can evaluate whether to throw everything out or just rewrite the slow bits? Was it the language itself? Not everyone likes dynamic languages, though it's odd they started with it. Did you consider Node at all?
From my time doing server backend Python dev: problems are only ever caught at runtime, everything from missing arguments to a typo in a variable name that accidentally matches another variable, silently turning your int into a string. Having a compiler catch these saves a lot of time and hair-pulling. And having unit tests as a final line of defense, rather than the only defense, does wonders for my peace of mind.
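A toy sketch of that failure mode (names invented): the typo below sails through import and even through happy-path calls, and only surfaces when the buggy branch finally executes.

```python
def describe(user_name):
    # The typo below ('user_nmae') is a NameError waiting to happen,
    # but Python only notices when this branch actually executes --
    # a compiler would have flagged it before the code ever ran.
    if not user_name:
        return "anonymous: " + user_nmae  # deliberate typo
    return "user: " + user_name

print(describe("alice"))  # fine: the buggy branch never ran
try:
    describe("")          # now the typo finally surfaces
except NameError as e:
    print("caught only at runtime:", e)
```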
As a Python user and fan I hear this complaint a lot. I understand it, but I can't really agree, since things like typos and type fails pretty much never happen to me, at least in production. My secret? I use the REPL, heavily. (And not even in the grand Lisp fashion, because Python's REPL isn't very advanced; mostly I use it off to the side, sometimes with an instance of the full program, or parts of it, running.) Using the REPL catches most of those things just as quickly as a compiler does, plus it can catch things compilers don't, such as null pointer exceptions.
Two lesser secrets: first, using a linter, which also catches all sorts of issues; and second, actually getting the full program locally into a state where I can execute (most of) the new code I just wrote that I didn't verify in the REPL, or run it against data sources I didn't just define temporarily in the REPL, so I can make sure it does what I intended. A lot of devs don't seem to do that second bit... Checked-in Java code compiles, passes the existing tests, and gets through a basic code review, but inevitably bugs get filed because it doesn't actually do everything the story said. It's like they never tried out their own code; it just looked correct, and the compiler and tests agreed.
I think when you're working with the REPL interactively, instead of relying on the common "edit -> save -> compile -> ship it|start over" cycle, you don't miss those details as much, because you're constantly trying out your own code. Maybe my experience differs because I don't typically use dynamic languages as scripting languages, at least in the sense of quickly hacking up a script, saving, skipping the compile step (look how much faster it is to develop in dynamic languages!!!), and running it until it works. I have done that, but even then I'm usually writing the bulk of the script in the REPL -- or rather in my editor, which can send text to the REPL. It's quite different from what seems to have made these languages popular in the first place: not having to explicitly type everything and getting to skip a (potentially long) compile step (which also encourages more source sharing).
> I understand but I can't really agree since things like typos and type fails pretty much never happen to me, at least in production.
The typos / type fail comments are shorthand for the real complaint, which is that a long-maintained large dynamic language codebase requires continuous vigilance. I've worked in dynamic languages for most of my career (Python, JS, Clojure) and typos/type-fails are pretty rare but if you haven't spent a half day tracking down a bug that turned out to be one of these at some point in your career, I'd be quite surprised. The breakdown comes when someone not familiar with the code makes a change without fully considering the consequences and it goes through something that nil-puns so the error isn't detected immediately.
My experience with people who are really against type systems is that they haven't run into a language with a good one. I'm a fan of the gradually typed JS dialects (Flow more than TypeScript), since you get quick hacking at the beginning combined with compiler-enforced vigilance once you switch over to maintenance mode. Type-inferred languages are also nice, particularly fully type-inferred ones; I think F# [0] is both terse and accessible, for example.
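Python's optional annotations (PEP 484 and the typing module) offer a similar gradual path, for what it's worth: annotations are ignored at runtime, so you can hack first and let a checker such as mypy enforce vigilance later. A minimal sketch, with invented function names:

```python
from typing import Optional

# Phase 1: quick hacking -- no annotations, anything goes.
def parse_port(raw):
    return int(raw) if raw else None

# Phase 2: maintenance mode. Same body, but a checker like mypy will
# now flag callers that pass a non-string or forget the None case.
def parse_port_typed(raw: str) -> Optional[int]:
    return int(raw) if raw else None

print(parse_port_typed("8080"))  # 8080
print(parse_port_typed(""))      # None
```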
I see nothing false in rewording your statement to "a long-maintained large codebase requires continuous vigilance". Typing doesn't seem to matter with this. At least to the extent that we believe typing doesn't have a meaningful impact on the expected size, lifetime, and complexity of a codebase to solve an arbitrary problem. (With better type systems around it is neat to see statically typed languages quite significantly narrow the gaps in expressiveness though, and in some cases beat out 'trivially dynamic' languages.)
I know I've wasted time tracking down simple issues a static type system would have caught (or just due diligence by the coder would have -- and some of these issues I've caused myself! Though I really can't remember any insidious-to-find but quick-to-fix typo or type fail that I caused, I'm willing to admit a possible selective memory bias). I've also wasted time tracking down simple issues a type system wouldn't catch -- even one like Rust's, Haskell's, and dare I say maybe even Shen's. And I spend/waste a lot of time, probably the most in total, tracking down complex issues that got past the type system, the existing tests, code reviews, and personal or team diligence -- these days most often in either Java or JavaScript, neither of which is a particularly great poster child for its respective type system. (I don't want to get into the strong/weak axis.)
Issues from NPEs, divide-by-zeros, or undefined function calls, or stuff that goes through the type system's escape hatches -- reflection, casts to Object, void*, unsafe, serializing class names instead of values, etc. -- are annoying; so is a sudden power outage. Some of that can be caught and prevented by more powerful languages, but the time to fix those issues is still nothing compared to the more complex ones that result in all sorts of incorrect behavior. There are so many more causes than type mismatches. It seems the trivial bugs from typos are rare in your career too. I'm not convinced the possibility of slight inconvenience from those rare issues is worth the certain tradeoff in lost expressive power (especially if I can't use the most expressive static languages for whatever non-tech reasons), and possibly more. Nor am I convinced a static approach is even the best one when you have languages like Lisp, which support types well enough to error out when you call COMPILE on a bad form but still retain huge flexibility.
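To make that concrete, here's the shape of bug I mean (an invented example): fully annotated, passes any type checker, and still wrong.

```python
def moving_average(values: list[float], window: int) -> list[float]:
    # Type-checks cleanly, but the off-by-one below silently drops the
    # final window -- no type system catches this kind of logic error.
    return [
        sum(values[i:i + window]) / window
        for i in range(len(values) - window)  # should be len(values) - window + 1
    ]

print(moving_average([1.0, 2.0, 3.0, 4.0], 2))  # [1.5, 2.5] -- the 3.5 is missing
```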
I wonder if all this sounds like I'm a diehard C++ fan and don't need no stinking safe memory management tools because I never get segfaults or security problems. If it does I don't think it should, but it's really hard to explain why my perceived utility of static type systems is low without just appealing to preference, firsthand, or wise authority's experiences. The argument has been going on for decades by smarter people than me on both sides. In the end maybe it's just preference as arbitrary as a favorite color but rationalized to kingdom come. I at least don't draw my preference line so narrowly at static vs dynamic, there are plenty of static languages I'd use over JavaScript, and plenty of dynamic languages I'd use over C++.
I will ask about your experience, though: how does it square with people like the author of Clojure? Is he just a god-tier outlier? I don't think one could argue he hasn't done his homework or doesn't have enough real-world experience. It reminds me of a quip graphic I saw once: a Venn diagram showing an empty intersection between "dynamic typing enthusiasts" and "people who have read Pierce's Types/Advanced Types books".
> I also spend/waste a lot of time, probably the most time in total, tracking down complex issues that got past the type system and existing tests and code reviews
Nobody will argue against you on this. The static/dynamic debate, as I understand it, is about whether typing, and the design constraints it imposes, are worth the reduction in simple issues. Reasonable people can choose either, and my personal take is that types are worth it, given a good type system, for a long-maintained project. You get more people coming on board over time, and I think that's the situation where types are most useful.
generics and sum types
> how does it square with people like the author of Clojure? Is he just a god-tier outlier?
I happened to be in a group Rich joined for lunch at the last Clojure Conj, and we talked about typed functional languages. The short story is that he doesn't think types are worth the tradeoffs. At the time Clojure had just introduced transients, and he mentioned the numerous typed-functional-language blog posts about the feature, most of which got the types wrong. He gives specific examples at the end of Simple Made Easy and reiterated many of them during the discussion.
I think we'd agree that libraries and architecture matter more than language, but I do think language influences the abstractions a library can provide. My post wasn't really meant to argue the superiority of types in all situations; it was a specific response to the sentiment of "why do people always bring up typos when I never run into them?"
Type safety, mainly, I think. Performance is a definite concern, but they have a lot of internal applications and their stability varies. I offered up that less dynamic languages would provide more reliability, with more speed to boot.
I know Python got type annotations in 3.5, though I'm not sure if it has Go-like interfaces (traits in Rust). If not, I think it really should.
I do firmly believe they'll be quite happy with Go, though. Rust, not so much.
It seems a lot of the Go fans I read are former Python users burned by dynamic typing, so I agree they'll end up happy with Go (or at least happier than with Rust). Though one more option you might want to consider is Nim: http://nim-lang.org/ (It's pretty easy to get up to speed in, especially for a Python user, as long as they're not expecting to use fancy OO features.)
Python has always had Go-like interfaces in practice. The problem was that they were not reified into the code, so you had no easy way to know when calling a function and passing it a "file" exactly what file-like things the function was going to do with that "file" without reading the source code. You had to extract the interface yourself.
Not only that, but there are simply no guarantees. You can abuse a function in any way you see fit in Python, and the only one that suffers is your server at runtime :(
Optional types in 3.5 look awesome -- but I don't want to lose duck typing. I want Go-style interfaces in Python.
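Worth noting: Python 3.8 later added much of this via typing.Protocol (PEP 544) -- structural interfaces that keep duck typing while letting a checker verify it. A minimal sketch, with invented names:

```python
from typing import Protocol

class Reader(Protocol):
    # Anything with a matching read() method satisfies this --
    # no explicit inheritance required, much like a Go interface.
    def read(self) -> str: ...

class StringSource:
    def __init__(self, text: str) -> None:
        self.text = text

    def read(self) -> str:  # structurally satisfies Reader
        return self.text

def shout(source: Reader) -> str:
    return source.read().upper()

print(shout(StringSource("hello")))  # HELLO
```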
For me, it's the extremely straightforward conventions of Go, with its straightforward tooling and its strong typing.
I "grew up" on Python, wrote a lot of code in it, and love it. But it doesn't feel as cohesive as Go.
As an example of cohesive tool design, let's look at Go package management. In Go, if I want to install a package, I install it with:
$ go get github.com/pkg/term
Having installed this package, I import it in my code with:
import "github.com/pkg/term"
Having imported this package, I'd like to read the documentation for it. To do that I use the command `go doc` with the package name:
$ go doc github.com/pkg/term
Now that I've read the docs, I've got a question about how some particular functionality is implemented. With Go, I happen to know exactly where I can read that code, on my own hard drive:
$ cd $GOPATH/src/github.com/pkg/term
With Python, I find that I don't have this absolute guarantee of consistency. Usually, packages will follow a similar convention, but some require installing under one name and importing under another, and the local documentation viewer (pydoc) isn't installed by default, so I didn't even know about it until relatively late in my use of Python. I've had a similar experience with the rest of Python's tooling: it's as feature-complete as Go's or better, but it's not quite as consistent.
It was pretty bad that easy_install came out with no easy_uninstall. Plus, some packages are in your system's package manager (which I think is great, because I'm sick of every language having its own package manager when my system's (Gentoo's) is better) and some aren't, or the latest versions aren't. Then there's the virtualenv stuff, or the general problem of your dev environment not matching the deploy environment. Needing both Python 2 and Python 3 on your system in some cases. Some packages have C/C++ code, so you need a compiler and all the dependencies that implies. On Windows, I think Python development is a joke; the last time I did anything extensive there, I ended up installing Enthought's distribution and picking things off from http://www.lfd.uci.edu/~gohlke/pythonlibs/ as needed. I don't see how the Go situation on Windows could be worse than that.
I'm not a huge stickler for non-local consistency -- one of the things I like about Nim is its apathy about naming conventions (foobar is the same symbol as foo_bar or fooBar, func(arg) is the same as arg.func()...) -- so that's probably why I don't find the consistency factor a huge issue. When a language and its ecosystem has it, it's nice, but when it doesn't, it's not really a thing that annoys me.
Can't speak for the OP, but once you go full type checking it's hard to go back. Our infrastructure has many pieces in Python (right now I'm rewriting some of them), but all new APIs are in Go. The amount of trouble you simply never have to fight, thanks to type checking, is huge. The performance gains are also good in many cases. Slightly more verbosity is a minor price to pay.
It remains true. 4TB now is like $100 too, and SSDs that size are around even if they're an order of magnitude more expensive. (And then there's Seagate's 60 TB SSD prototype, clearly the future is with SSDs.)
This thing has better hardware specs than the 3DS and the Wii. It'd be nice if this sold well and they then offered a SNES controller + games, then N64, and so on, so the only games it can't run are on the Wii U or Switch. (Edit: beaten to it.. ah well.)
If genetics determines all, then we should sort people to tasks best suited for their genetics. Ashkenazi Jew? Work on advancing gene therapy and retroviruses so humans now and in the future can be freed from genetic determinism. European? Build civilization. African? Physical labor/sports. East Asian? Also build civilization. To each is given the minimum necessary to do their genetically determined purpose, ignoring any abstract "human" needs for all. Surplus is captured by the State and hoarded, some used to incentivize the Jews to have more children since their problem is the most important. Eventually the Jews will succeed and no one will have any excuses anymore.
That's not a world I'd like to live in. But it would solve your problem with inequality. Or maybe you just need a different perspective, so as not to put so much into genetic determinism in the first place.
OP said "a lot" of the haves, so I get the impression they think it's actually a lot and justifiable to start imposing corrections to the problem on the lot of haves. My real issue is just the idea that more than a tiny fraction of the "haves" are in that group by genetic determinism, either from the lottery of being born to one's parents (and inheriting their genes, and their money, which they got somehow) or the other lottery of being born to a particular (and perhaps in today's economy, privileged) race. We all have to play the hand we're dealt, but the world and the brain are dynamic enough that there are lots of ways to play even when you have poor initial conditions. Having a problem with seemingly some large amount of people not having to play as hard is odd, especially when it's likely a much smaller set of people than imagined.
> OP said "a lot" of the haves, so I get the impression they think it's actually a lot and justifiable to start imposing corrections to the problem on the lot of haves.
That's a hell of a leap of logic. If you want to attempt to make the argument that being born into money gives you no intrinsic advantages in life, that's your argument to make. Access to resources (in the modern world, money) is a HUGE advantage. I'm not jaded, I don't hate the rich nor do I want to forcibly remove their wealth, I recognize that the motive to attain wealth is the single greatest motivator ever applied to people and the primary reason capitalist societies are (by most measures) the most successful.
It's worth noting, however, that the much-derided and recently scaled-back estate tax was originally intended to prevent the creation of a permanent landed gentry in the United States -- or, if you've never heard that term, a group of wealthy estates that own so much they can effectively live from birth to death on the income of their properties. I would argue we have one, albeit not as bad as the one that inspired the law, but one nonetheless.
The failing of this counterargument is that I'm not advocating for equal results of all people, I'm advocating for more or less equal opportunity. There are plenty of factors at play that we cannot control; genetics are indeed one, as are number of involved parents, upbringing, what the child is exposed to, etc. but there are at least a few that we can help control and access to capital is one of them.
It's really technology that is always getting better. It has been so successful that it's hard to see the decline in so many other things. Factor out technological advances when answering these questions. Are our universities really better than they were 100 years ago? Even just 50? Are our political systems any better than 100 years ago? 2,000 years ago? Are our stories any better? Are our people any better? Is our art better? Are humanity's cultures better? I can find points where I could argue some aspect is better now than before (but that's usually due to technology -- e.g., the whole genre of sci-fi first demands, well, science), and some where I could argue things are about equal, but for any of these being overall better, I find that a hard claim to make.