I remember before Little Snitch there was ZoneAlarm for Windows[0] (here is a good screenshot[1]). No clue if the current version of ZoneAlarm does anything like that (have not used it in 2 decades). I always found it weird that Linux never really had anything like it.
If I remember correctly, it ran as a Commodity and patched the socket library. Interestingly, the socket library was not re-entrant (unusual for Amiga libraries), so I had to patch the Exec OpenLibrary() function to monitor the loading of new copies of the socket library. But it's been a long time, so memories are hazy.
It'll be interesting to see if it still compiles and runs on modern AmigaOS, if any active Amiga programmers are around to try it.
What I really liked about ZoneAlarm wasn't just that it was a very nice technology - and it was; but also that it got the user expectations and training right from a very early stage.
It was quite insistent on the fact that it would be "noisy" at first as it queried all the programs you ran, but would then quieten down once it had been "trained". It got that across in clear, simple language.
I think it was so successful because it got the soft side of its security job right as well as the hard part. It's certainly why I recommended it to anyone at the time...
I was working as an IT consultant. We got a call from an international manufacturer in the area for support. The local lead IT manager had taken down the firewall, which let an infection spread across their computer network around the world. All they wanted were bodies to help clean systems and apply OS updates.
My personal computer had ZoneAlarm on it. It became ground zero for reporting on infected systems. They ignored systems they thought were safe: the Cisco phone system running on a Windows server and other backend devices. The company then bought a few licenses to run on their own laptops.
It is such a shame that Microsoft destroyed _ERD Commander_ and other quality tools which assisted in the cleanup.
I helped administer the Check Point commercial version of this before 2010 in a large enterprise (it was badged as Check Point Integrity). Really good product, though we did have some bugs with it - I do remember the developers from Israel got involved and were very capable.
It mostly worked exactly as you would want a desktop firewall to, and integrated nicely with Cisco VPN tech, so you could ensure Integrity was operating correctly before fully opening up the tunnel for access to corporate assets.
It's not, though. There simply wasn't enough malware to worry about. Why would I run a firewall when I was unlikely to ever encounter a malicious program?
I mean, supply chain attacks are a thing that could have happened even in the earlier days. Linux almost got backdoored in 2003.
Also, with the number of remote code execution exploits that have occurred in Web browsers over the years, it's hard to know for sure that what you installed hasn't been hijacked, unless you spent all your time on gnu.org.
Run OpenSnitch for a while and you'll quickly realize how much of your system does phone home. Off the top of my head:
- GNOME Shell (extension updates without a way to disable this, weather),
- GNOME Calculator (currency exchange rates),
- NetworkManager (periodic captive-portal connectivity checks in most configurations),
- GDB (debuginfod enabled by default),
- Firefox (extension updates, push notifications, feature flags, telemetry, ..., some parts cannot be disabled),
- VSCodium (Open VSX callbacks even when installing extensions from disk with updates disabled, JSON schema auto-downloads, extensions making their own unsolicited requests, ...),
- Electron (dictionary updates from Google servers, no way of disabling; includes any application running on top of upstream Electron, such as Signal, Discord, etc.),
- GoldenDict (audio samples fetched from the Internet on word look-up, no way to disable)
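For what it's worth, blocking one of these permanently doesn't require answering the GUI prompt every time: OpenSnitch stores its rules as small JSON files (on my system under /etc/opensnitchd/rules/). A sketch of a standing deny rule for the GNOME Calculator case above - the field names are from memory, so check them against a rule file your own install has written:

```json
{
  "name": "deny-gnome-calculator",
  "enabled": true,
  "action": "deny",
  "duration": "always",
  "operator": {
    "type": "simple",
    "operand": "process.path",
    "data": "/usr/bin/gnome-calculator"
  }
}
```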
Of course, this is nothing compared to Windows [0] and macOS [1], but the malpractice of making Internet connections without asking, by default, has unfortunately been finding its way everywhere since modems stopped making audible sounds.
Having read about PRISM and seen the leaked dashboards of Paragon Graphite (said to be used by ICE), and with LLMs bridging the gap between mass and targeted surveillance, I don't want any of this.
Approximately 10-15 years ago I used an early Android app that synced contacts across multiple (local) accounts and deduplicated and merged them. It had Internet permission for some reason; on asking the developer why a dedicated contact management app would need to go online (at a time when I was using XPrivacy to prevent other apps from seeing my contacts), they said there was no real reason for it, and it was removed in an update two days later. This is the only time I've ever seen an app remove the ability to access the internet, and I really wish it were more common.
Of course, about 5-6(?) years ago Google removed it from both the play store and my devices (I allowed it because silly me assumed I could still get it again) because it requested a sensitive permission and didn't support runtime permissions.
Per se? No - maybe with the exception of GNOME Shell, which literally runs code from the Internet unsandboxed. Can the traffic they silently generate be used for malicious purposes? Absolutely.
Wasn’t it KDE that had malware in its theme store not too long ago? Let that sink in for a bit. You changed around some icon themes and it executed arbitrary code.
And let’s not pretend that KDE wouldn’t have an extension system if it could - but it’ll never have one, because implanting one in that C++ spaghetti nightmare will never happen.
To be clear, I'm not criticizing GNOME in isolation here. It's just what I use and what I'm most familiar with. KDE has the same issues, and it does have an extension system too. It's called KNewStuff.
The problem with updates is that without automatic ones, users could stay on outdated systems and possibly get hacked through some vulnerability (of which there are many). On the other hand, having explicit confirmations for each network request would be crazy annoying.
Maybe some middle ground of having the tool OP sent built in would be a good option.
I run all my systems with all outgoing connections blocked by default, and yes, it is annoying.
But it wasn't always this way, and so, I don't think it has to be. People just need to start paying attention to this.
The impact of a lot of those vulnerabilities would be mitigated if the affected programs didn't connect to the network in the first place.
As for updates in general, I really like the model adopted by Linux update managers and BSD port systems. The entire repository metadata is downloaded from a mirror and cached locally, so the search terms never leave your machine. Downloads happen from the nearest mirrors, there's no "standard" mirror software (unless rsync and Apache count?) so they don't report what was downloaded by whom back to any central system and you can always host your own. Everything is verified via GPG. And most importantly, nothing happens on its own; you're expected to run `apt/dnf update` yourself. It won't randomly eat your bandwidth on a metered connection or reveal your OS details to a public hotspot.
Simple, non-invasive, transparent, (almost) all-encompassing, and centrally configurable.
It contains Firefox and Chromium. You are right that they may call home, but at least it's very limited and easily configurable. Could be too much for you but fine with me. Also Debian does change their config by default to minimize privacy issues: https://news.ycombinator.com/item?id=32582260
It's far from easy in the case of Firefox [0], and the last time I tried, some .mozilla.com domains would still get pinged. Chromium doesn't even have an official guide. The only options I found to be reliable are source-level patches, i.e. ungoogled-chromium and LibreWolf.
Note that LibreWolf still leaves some of the stuff on for you to manually disable (dom.push.connection.enabled, extension updates).
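For anyone wanting to finish the job, those leftovers live in about:config and can be pinned with a user.js in the profile directory. dom.push.connection.enabled is the pref mentioned above; the other names here are from memory, so verify them in about:config before relying on this:

```js
// user.js -- dropped into the Firefox/LibreWolf profile directory
user_pref("dom.push.connection.enabled", false); // the push socket mentioned above
user_pref("extensions.update.enabled", false);   // automatic extension update checks
user_pref("app.normandy.enabled", false);        // remote feature flags / "studies"
user_pref("toolkit.telemetry.enabled", false);   // telemetry submission
```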
I agree that push connections should be disabled. Maybe it can prompt you the first time you try to subscribe to one, asking whether you'd like to turn them on; this would annoy me personally, but also not break features by default. The annoyance hardly matters, as websites already put an in-page prompt up before using the API, IIRC because of Apple restrictions.
Enabling extension updates by default seems like a smart thing, though, as long as you can turn them off easily (there should really be a setting for this), possibly with a 6-month reminder to update them (similar to the "refresh your profile" reminder when you haven't used the browser in a while). Extension updates happen, and many of the most widely used extensions (e.g. uBlock Origin) really should be updated whenever an update is available. Better that than having the extensions go online to fetch and run arbitrary payloads, because you know they will if disabling updates gets popular enough.
This reminded me of running Kerio Personal Firewall. When Kerio ended, I switched to either ZA or Comodo firewall; one of them introduced a neat feature of running executables in containers. Made clicking random things so much easier. But the best part with all of these was restricting Windows to where it could barely do anything. "RandomXYZ.DLL wants to execute random what and connect to random where? I don't think so, MS." lol
I remember switching from Win95 to NT4.0 just to be able to use SoftICE properly under Windows without all the stability problems, it was an incredible time! SoftICE felt like absolute wizardry at the time.
Wow. Insane throwback. I think I first learned about ZoneAlarm from some PC magazine my parents bought for me. Completely forgot about this great piece of freemium software!
If anyone else suddenly started wondering: PC magazines still exist in physical form. There are even still Linux magazines that come with installer CDs for distros. And all kinds of other magazines as well, like for Mac computers, for photo editors, for Raspberry Pi, etc.
I ran ntop on a router in 2001. It had a highly insightful overview of traffic, with nice-looking diagrams and everything. There hasn't been anything like that since, as far as I'm aware.
ZoneAlarm otoh, was snakeoil. Programs that ran at the same privilege level (typically everything) could bypass it in various ways.
There was also Tiny Firewall, which got bought by Computer Associates around 2005. It probably offered the most complicated, fine-grained control I had at that time on Windows XP.
This is what I used! At some point I managed to block DHCP lease renewals on my computer, and Internet would always stop working after a given timespan. Took a good while to figure out I caused the problem myself.
It is probably easier these days, when you have a phone to fall back on if you break the internet on your computer.
Playing with your router is still a pain though, especially if you don't have a device with an Ethernet port. You learn all sorts of fun things like "If you change your router's IP address you get logged out of its management at the old IP address" and "Oh, that's what subnet mask means, weird."
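The subnet mask lesson clicks quickly once you see it as arithmetic: the mask is just bitwise-ANDed with the address to pick out the network part. A quick sketch (plain IPv4, no validation):

```javascript
// Convert dotted-quad IPv4 strings to 32-bit ints and back.
const toInt = ip => ip.split(".").reduce((n, o) => (n << 8) + Number(o), 0) >>> 0;
const toIp = n => [24, 16, 8, 0].map(s => (n >>> s) & 255).join(".");

// The subnet mask selects the "network" bits of an address:
const network = toIp(toInt("192.168.1.42") & toInt("255.255.255.0"));
console.log(network); // "192.168.1.0"
```

Two machines talk directly only when this AND comes out the same for both; everything else goes via the gateway, which is why changing the router's IP (or the mask) silently strands the old management address.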
> It is probably easier these days when you have a phone to fall back on
Most definitely. The old lessons were hard learned, and they stayed with you. Going through everything, trying all the combinations, and reading obscure materials for any hints.
I don't want to glorify the old hard way of spending perhaps days on problems that ended up being trivial, but it's obviously different now when one can get all the answers and helpful scripts directly from LLMs. Much less is retained.
Back when people would try to winnuke others on IRC, the Linux guys would know who sent them the packet and call them out in the channel (and then usually ban them).
I tried out Portmaster recently. Coming from Rethink DNS on Android, I was far from impressed; it looks full-featured, but it's much harder to use. OpenSnitch looks better, but doesn't have the nice features to drill down into connections (going from an app requesting a domain, to it being resolved to an IP, to connecting on a port - and filtering at any of those levels, including globally; if the request was already filtered, you can see why and get to that filter to either remove it or add an exception).
> I tried out portmaster recently. Coming from rethinkdns on Android, I was far from impressed; it looks featured, but it's much harder to use. Opensnitch looks better but doesn't have the nice features
If 'far from impressed ... much harder to use' is about Rethink DNS + Firewall... Over the years, we've got numerous complaints about the UI over emails and on GitHub Issues, so we're acutely aware of the fact. In our defense, we have had no help from a designer, and couldn't come up with a good UX even if our life depended on it. We'll keep trying though.
No, the Windows firewall in its default configuration does not restrict outbound connections in any way; any application can make any outbound connection it wants. If an application attempts to listen for incoming connections from external sources and there is no existing policy, Windows will pop up a dialog asking the user whether to allow this and, if so, whether it should be allowed to listen on all networks, only networks marked as "private", or (for domain-bound corporate computers) only networks where the domain controller is reachable.
It can be manually configured with very detailed policies, but you have to know where to go to find those controls.
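For completeness, those detailed policies are also scriptable; e.g. the built-in netsh interface can add a per-program outbound block (the program path here is just an example):

```
netsh advfirewall firewall add rule name="Block Foo outbound" dir=out action=block program="C:\Apps\foo.exe"
```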
It's been a while since I used ZoneAlarm or Little Snitch, but the last time I used either one, the default behavior was different: any connection attempt, or attempt to listen, for which there was no policy would result in a dialog showing all the details about what application was looking to connect to (or receive connections from) what, along with a variety of options for creating a policy - or not creating one and just deciding whether that single connection would be allowed.
Also back when I used ZoneAlarm I had dialup so the taskbar addon they had which showed realtime bandwidth usage and what applications had active connections was really useful. It also had a big red "Stop" button that would immediately disable all connections, which thinking about it in retrospect really makes me miss the more innocent days of the internet.
I wish Adobe had open sourced Flash - it really was a pretty amazing tool. They could have owned the proprietary developer tool market to support themselves...
If it had been possible, they would have loved to - certainly by 2012 or so, and more likely by 2008-9. The reason I heard they couldn't is that by that time Flash Player was a massive 10+ year old codebase with lots of parts that were licensed or external, and nobody had ever tracked which parts would need to be relicensed or rewritten.
Source: I worked there at the time and knew the relevant PMs.
A lot of people - including studios who use it for projects that can take years to complete - were very unhappy at the prospect of having the only tool that can read their mountains of FLA files (the file format the Flash/Animate editor uses and compiles into a SWF) stop working because Adobe turned off the auth servers. Adobe has pulled back to "okay, we're, uh, putting it in maintenance mode, expect no new features, ever, just security patches".
If you follow their mea culpa link, it says they're keeping (a type of) support.
> Adobe Animate is in maintenance mode for all customers...
> Maintenance mode means we will continue to support the application and provide ongoing security and bug fixes, but we are no longer adding new features.
Of course, in my experience, such a lifeline never lasts much longer than the furor that earned it...
Yeah, if I were in an Animate studio, I sure would be spending some energy for the entire last month on finding a good crack for it so we could deal with our old files, and talking about our plans for how to deal with the major hit the production pipeline would take when we picked a new animation program and started retraining everyone on it.
A lot of people made the choice to use proprietary tools for their creative work flow, rather than making do with and pushing for better open source equivalents.
I have some sympathy for them - I am sure they felt it was the only real choice at the time - but not a whole lot.
There were zero open-source options at the time. Flash/Animate was the only digital ink-n-paint solution that was even vaguely affordable to the hobbyist or small studio for many years. Most studio-quality 2D programs were proprietary solutions developed in big studios like Disney.
People started using Flash for professional work around 1995. "Open source" barely existed as a concept then, Wikipedia tells me the name "open source" was coined in 1998 and it took a while before anyone but programmers gave even half a damn about it.
The first open-source studio-quality 2D animation package I know of was OpenToonz from 2016, which was a relicensing of a commercial package that dates back to the late eighties or the early nineties - Wikipedia just mentions v3 from 1993.
But anyway now there is a dude working on an open-source Flash clone that can read the editor source files, so all these people you have next to no sympathy for have something to celebrate.
I was introduced to "free software" and the GPL in 1986, as a PhD student at the European Molecular Biology Lab (Heidelberg).
Your historical revisionism doesn't sit well. Yes, "open source" came later, because some people didn't like the specifics of the GPL and wanted a term that could describe "source available" software under a variety of licenses. But by 1998, I'd already been contributing to GPL'ed projects for more than a decade.
I'm well aware of the lack of free/libre alternatives to Flash. But that wasn't my point at all. I'm not saying that people failed by choosing Flash over some (mythical) free/libre alternative. I'm saying they failed by choosing Flash, period.
Before software, almost no creative tools were proprietary. Nobody bought proprietary paint, or proprietary paint brushes, or proprietary table saws, or proprietary anything. The software showed up, and everyone was so gaga about what you could do with it that they just forgot about the fact that XYZ Corp. controlled the tools 100%, and dived in. There were people warning them, but those people were ignored.
I wish he had gone into more detail around 'A common critique of HTMX is that users lose the ability to use the “Back” button or share specific filtered views. In many frameworks, this requires complex series of state hooks to keep the URL in sync.'
He makes it sound like he did something special, but this is just something that htmx offers out of the box. In fact if he had used something like:
<a href="/?page=2" hx-target="#dashboard-content" hx-boost="true">
  Next Page
</a>
Then he would have gotten the functionality out of the box without even using hx-push-url explicitly. And he would have gotten graceful degradation with a link that worked without JS and Ctrl/Cmd-click to open in a background tab.
Also, the article seems to be full of errors. E.g.:
> In HTMX, if the server returns a 500 error, the browser might swap the entire stack trace or the generic error page into the middle of a table by default. This is a poor user experience.
This is simply incorrect. By default htmx does not swap 4xx/5xx responses. Instead it triggers an error event in the DOM. Developers can choose to handle that event or they can choose to override the default behaviour and do a swap.
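Handling that event is a one-liner, too. A browser-side sketch (the #error-banner element is made up for the example):

```js
// htmx fires htmx:responseError for 4xx/5xx responses; nothing is swapped by default.
document.body.addEventListener("htmx:responseError", (evt) => {
  const status = evt.detail.xhr.status;
  // Surface a friendly message instead of injecting the server's error page.
  document.getElementById("error-banner").textContent =
    "Request failed (" + status + ") - please try again.";
});
```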