In many southern tech cities, cornering someone and yelling at/assaulting them with unknown fluids is a great way to get shot. We recently hired a CA transplant to the dev team and he was shocked how many of us carried.
If I'm cornered by a crazy man who's shouting and dousing me - if it hits me in the face, quite literally inoculating me - with God knows what? Even here in duty-to-retreat Baltimore, you'd have a hard time calling that anything other than self-defense.
"on mobile" these days can be anything from a 3'' device with 512MB RAM to the device you mention.
It's very easy to assume that everyone has the latest device in an affluent society that doesn't have much of a problem buying the latest and greatest, but I can assure you this isn't the case in most of the world.
You're failing to see that Google is used everywhere in the world and not all countries (in fact, very few) have markets where the standard device is what you describe or where everyone has blazing fast LTE available everywhere.
For example, there are places here in Colombia where 2G is the norm. That's the kind of people AMP is helping, not the Bay Area kid who has the latest iPhone and 100 Mbps Wi-Fi.
Such luxuries. I spent several years reading web pages on a 33MHz 386DX with 8MB of RAM. Yes, Netscape took a while to start (had to wait for the rest of Win 3.1 to swap out), and downloading images was always somewhere between "slow" and "don't bother" even with the glorious 14.4kbps of a fancy V.32bis modem[1]. However, it still only took a few (15-20) seconds to fetch an article and render it.
The slowness started when websites decided it was fashionable to add a few dozen unnecessary HTTP requests to fetch megabytes of Javascript. The bloat is self-inflicted, and websites do not need Google's help to make their pages small and fast. Unfortunately, many pages value the bloated ad loaders and trackers, several types of spyware ("analytics"), and their favorite "framework" more than they value the actual content of the page or the reader's experience. Google is happy to pretend the problem isn't self-inflicted when it gives them more tracking data.
Yes, it's important to remember that there will always be a wide variation in the User Agent. That's one of the reasons well-designed websites progressively enhance the heavier features. Websites can do this on their own - just like they did 10/15/20 years ago. An over-engineered caching system isn't necessary. Do you want a future where the internet retains some of its interactive, decentralized qualities? Or do you want a fancier version of Cable TV, mostly controlled by Google et al?
[1] On weekends I was stuck with the old 2400bps V.22bis hand-me-down.
> You're failing to see that Google is used everywhere in the world and not all countries (in fact, very few) have markets where the standard device is what you describe or where everyone has blazing fast LTE available everywhere.
So here's an idea: Seeing as some folks _do_ have these powerful devices and fast connections, Google could make AMP _optional_.
> You're failing to see that Google is used everywhere in the world and not all countries (in fact, very few) have markets where the standard device is what you describe or where everyone has blazing fast LTE available everywhere.
So Google's solution is to pull everyone down to the lowest common denominator, which is shit. Google basically re-invented WAP and is trying to set the web back 15 years.
Sure, not everyone has a fast connection and a high-end device, but they could have easily limited AMP to just those devices instead of forcing it down everyone's throat.
I feel the same way. 99% of what I do requires the non-mobile site. Most links I open are GitHub pull requests, and the mobile site doesn't give the option to approve with a message; second most common is CircleCI, which again is worthless in mobile form; third most common is JIRA, where again I have to click "request desktop site" in the menu to be able to search and do everything I expect.
So in all the biggest apps like Gmail, Slack, etc., I now have to click a link and it opens in-process, then I have to pick the menu option to view it in real Chrome, then I have to pick the menu option to request the desktop site. So they've added two clicks and two page loads to almost every site I visit.
> "on mobile" these days is a device with a quad core, 3GB of RAM, and a 1080p + screen.
You left out the most important reason for AMP, which is the network. Even in the US, cellular networks are almost invariably slower/higher latency/less reliable than wired (or even wifi) networks. In other areas of the world where most users are not on 4G it's a huge difference.
There are relatively cheap, unlocked Chinese phones available -- e.g., ZTE's Axon 7 -- with the aforementioned specs. One doesn't have to spend 600 USD+.
Restricting what you use to the AMP subset would make your page quite fast, with a dependency on a JavaScript file that most mobile browsers should already have cached, and would get your page into the top carousel, loading in smoothly like all the others. This can all be detected from your main page when it's crawled (as it links to the AMP version), so one version or the other can be presented as a search result depending on the device.
Caching is part of the platform and entirely coherent with its goals, and if you want analytics or ads, you approach those in a different way. AMP caching isn't limited to Google - Bing, for instance, caches and serves AMP as well. Ultimately anyone could, and HN could cache and quick-serve AMP content for supporting sources. AMP is open source and anyone can take part.
I should add a side note that many of the comments on here have taken the predictable turn of claiming that people who defend AMP are "over-invested" or must work at Google. I have nothing to do with Google, and have a reasoned, fact-based opinion on AMP. I think it's a last-ditch salvation for a web where sites are demonstrating a tragedy of the commons. Nor do I think everyone denouncing AMP works for Apple or some competitor.
Coherent with the goals, sure, but not required for any benefit. [edit for clarity: while caching may improve things, it is not required to get at least some benefit]
What I dislike is not being able to choose simply to make and host AMP content myself without Google taking it and hosting it on their own servers. I cannot opt in to this, nor can I opt out. My only choice is to not do anything with AMP.
What is the way to host your own AMP cache? The AMP Project's documentation on caching just shows the Google cache and links to a Google page.
As a side note, what if what I publish is not acceptable to Google? Will they remove my content, despite having already taken it and served it to people under that URL? If AMP content must be loaded from the caches, is my content only valid AMP if Google and the jurisdictions they operate under approve?
"Open source" in that context is a meaningless buzzword. It is not "open" when giant companies appify your content in their walled gardens, even if you can read part of the code.
Custom Elements are a web standard. Anyone can just make up new tags, if they include a hyphen in the name. And with customElements.define() you can attach an ES6 class to that element and boom, DOM as an open component model.
AMP is a subset because in addition to the elements, they provide a set of restrictions that would make your page "valid AMP" and cached by Google's creepy CDN thingy.
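As a sketch of that component model (with `hello-card` and `HelloCard` as invented example names, and a small stub so the snippet also runs outside a browser, where `HTMLElement` and `customElements` are built in):

```javascript
// Minimal sketch of the Custom Elements pattern described above.
// "hello-card" and HelloCard are made-up names for illustration.
// In a real page the browser provides HTMLElement and customElements;
// here we stub them so the sketch also runs under Node.
if (typeof customElements === "undefined") {
  globalThis.HTMLElement = class {};
  const registry = new Map();
  globalThis.customElements = {
    define(name, cls) {
      // The spec requires custom tag names to contain a hyphen.
      if (!name.includes("-")) throw new Error("tag name needs a hyphen");
      registry.set(name, cls);
    },
    get(name) {
      return registry.get(name);
    },
  };
}

class HelloCard extends HTMLElement {
  // In a browser, connectedCallback fires when the element is
  // attached to the DOM.
  connectedCallback() {
    this.textContent = "hello from a custom element";
  }
}

customElements.define("hello-card", HelloCard);
console.log(customElements.get("hello-card") === HelloCard); // true
```

In a real page, any `<hello-card>` element in the HTML is upgraded to an instance of `HelloCard` once the definition is registered, which is exactly the "DOM as an open component model" idea.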
You realize a lot of the people pushing the supposedly nefarious agenda you're describing own a lot of bitcoin, and some even belong to companies where time-locked bitcoins act as an incentive program, right? The conspiracy theories surrounding Core's agenda never have sensible incentive structures. This is why nobody listens to the conspiracy theorists when it comes to designing a protocol that relies entirely on incentives adding up.
It is nothing sinister, though. When Mr. Maxwell and Dr. Back started Blockstream, they expected the 1 MB block size to stay at 1 MB. On this assumption they built their company and sold their services as "Core" maintainers.
The fork to a chain that scales on its own seems inevitable and was fully expected by the original developer. It seems like an uphill battle for Core to convince people NOT to upgrade the network.
The "sketchy" off-chain stuff is not sketchy either; it is just off chain. If you mean that they may be able to profit from it, I don't see why that is evil or even wrong. It is just their business plan.
That is incorrect. Literally nobody expects the block size to stay at 1 MB permanently. If they did, SegWit wouldn't exist, since that is effectively a block size increase to about 2 MB.