I still don't really understand why we can't just run any files in cache (from other sites) with the same hash.
If I have the sha-256 of an exe file, I'm perfectly happy to run any exe file with the same sha256 simply because collisions don't happen. Why is this different for JavaScript?
If an attacker can inject HTML script tags into your website haven't you already lost?
Content Security Policy is a defensive technology that makes the answer to your last question "no." An attacker's script still has to appear to execute from a whitelisted domain, which would only become possible under the system you propose: host your own random webpage that loads a script with hash X, then redirect to an XSS hole that appears to load a script with the same hash on the victim site, referenced from one of its whitelisted domains. Since the script is already cached from the first site, it would be loaded as if it were hosted on the second site, and thus bypass CSP.
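For comparison, the hash-matching idea being discussed is close to what Subresource Integrity already does, except the browser still fetches the bytes itself and checks them, rather than trusting a cross-site cache entry keyed by hash. A minimal sketch of that check (SRI commonly uses sha384; sha256 shown here, and the helper name is my own):

```python
import base64
import hashlib

def sri_value(script_bytes: bytes) -> str:
    """Return an SRI-style token, e.g. for <script integrity="sha256-...">."""
    digest = hashlib.sha256(script_bytes).digest()
    return "sha256-" + base64.b64encode(digest).decode("ascii")

fetched = b"console.log('hello');"
# The browser recomputes the digest of the bytes it actually fetched and
# executes the script only if the token matches the integrity attribute.
print(sri_value(fetched))
```

The crucial difference from the proposal above is *who* computes the hash: with SRI the browser hashes the bytes it downloaded from the declared origin, so a cache entry from an attacker's site never enters the picture.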
I use Firefox purely for a large number of quite complex extensions (which have no Chrome equivalent). If these no longer work, or every extension gains a similar version in Chrome, I guess I'll switch to Chromium.
Then again, perhaps there will be some kind of long-term fork of Firefox before the switch?
If my FF extensions break I'll probably switch to Edge; it's faster and will be using the same extension API. The only reason I use FF is the extensions that can't be found anywhere else.
Ah, I suppose. I like the performance aspect of it though. It feels like something my computer is doing live, as I watch it, instead of a recording of something done in the past.
I'm not sure what the point of the comparison is. Yes, some of our encoding technology beats some algorithmic generation of content in performance. If the goal is size of encoding, though, it doesn't come close.
Tricks like this can help show how less than a gig of data is enough to encode an operating system. Or a person.
I really hope you are right about encoding a person in less than a gig, but what about the associated genome (3000 megabases), episodic memories and/or neuron connectivity?! Surely this is data that would be essential, yet very hard to algorithmically encode?
I'm aiming for something backend(ish). Either C(++) or JavaScript/Node.js/PHP on the backend (I feel there's not much more I want to learn with CSS).
All I do at the moment is rearrange the order of skills for the job I'm applying to. Would your advice be that I should have an independent CV for each type of job I apply for?
Honestly I'd just plug it in and look through the files.
Meh, seems quite unlikely it'll break the computer or infect it (I wouldn't expect Linux to be a big target for something left lying around anyway).
Besides, you can probably find the owner from the files on the USB stick and return it. If I couldn't find the owner this way I'd probably hand it in somewhere nearby.
Surprisingly, I get this fairly often on my mobile. I'm not quite sure what's changed with the last version (or couple) of Android to trigger this. Over-secure, perhaps?
Android's mobile browser requires the full chain of intermediate certificates up to the authority that signed the site certificate[0]. This is done to avoid calls to CA servers and reduce network use on mobile devices.
If a site operator does not concatenate the intermediate certificates to their site certificate, Firefox and mobile Chrome will warn, while desktop Chrome and Safari won't.
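A self-contained way to see this failure mode is to build a throwaway root CA, intermediate, and leaf certificate, then run `openssl verify` with and without the intermediate (all filenames and subject names here are made up for the demo):

```shell
# Toy root CA, intermediate CA, and leaf cert (throwaway keys, 1-day validity).
openssl req -x509 -newkey rsa:2048 -nodes -keyout root.key -out root.pem \
  -days 1 -subj "/CN=ToyRoot"
echo "basicConstraints=CA:TRUE" > ca_ext.cnf
openssl req -newkey rsa:2048 -nodes -keyout inter.key -out inter.csr \
  -subj "/CN=ToyIntermediate"
openssl x509 -req -in inter.csr -CA root.pem -CAkey root.key -CAcreateserial \
  -out inter.pem -days 1 -extfile ca_ext.cnf
openssl req -newkey rsa:2048 -nodes -keyout site.key -out site.csr \
  -subj "/CN=example.test"
openssl x509 -req -in site.csr -CA inter.pem -CAkey inter.key -CAcreateserial \
  -out site.crt -days 1

# With the intermediate supplied, verification succeeds...
openssl verify -CAfile root.pem -untrusted inter.pem site.crt
# ...without it, it fails -- the same failure a browser hits when the server
# omits the intermediate and the client won't fetch it over the network.
openssl verify -CAfile root.pem site.crt || true
```

Against a live server, `openssl s_client -connect host:443 -showcerts` shows exactly which certificates the server sends in the handshake, which is the quickest way to spot a missing intermediate.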