Yeah, that's the thing making my head spin: tack a 30% profit margin on that and it's $550 per day?
Probably going to be more than that for rocketship growth and investor expectations.
Is that the game? Lock companies into this "new reality" with cheap tokens, then once they fire all their devs, bait-and-switch to 2x the cost.
If you read history widely (across millennia and geographies), you'll note that most power contests follow this pattern[0]. In the modern industrial world, the pattern becomes exponential rather than incremental. What I'm saying is that this is not unique to AI labs[1]. This is caused by the deeply flawed and unbalanced system that we have constructed for ourselves.
[0]: The pattern, or, as gamers would call it, the "meta", is that every ambitious person/entity wants to control as much of the economic/material surplus as possible. The most effective and efficient (effort per unit of control) way of doing this is to make yourself into as much of a bottleneck as humanly possible. In graph theory this corresponds to betweenness centrality, and you want to maximize that value. To put it in mundane terms, you want to be as much of a monopoly as you can be (Thiel is infamous for saying this, but it does check out, historically). To maximize betweenness, or to maximize monopoly, is to maximize how much society/the economy depends on you.

This is such a dominant strategy (a game-theory term, though modern gamers might call it a "cheesy strat" -- which just means the game lacks strategic variety, forcing players to hone that one strategy) that we even have some old laws (antitrust, etc.) designed to prevent it. And it makes a lot of sense: Standard Oil was reviled because everything in the economy either required oil or required something that did.

20th-century USA did a lot to mitigate this. It forced monopolies like AT&T to fund general research like Bell Labs (still legendary) towards a public good (a kind of tax, but probably much more socially beneficial). It also broke up the monopolies and passed anti-profit laws (e.g. hospitals were not allowed to make a profit until 1978; in the last 10 years I have watched a tiny cancer clinic grow into a massive gleaming hospital -- a machine that transforms sickness and grief into Scrooge McDuck vaults of cash). This monopolistic tendency of the commercial sector is a tendency towards centralization, which yields efficiency, sure, but also creates the conditions for control, rent-seeking, and exploitation.
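For the curious, here's a minimal sketch of what betweenness centrality measures, using the networkx library; the star graph is just my toy example of a "bottleneck", not anything from the thread:

```python
import networkx as nx

# Toy example: a star graph, where node 0 is the hub that every
# other node must route through -- the "bottleneck".
G = nx.star_graph(4)  # node 0 connected to nodes 1..4

# Betweenness centrality: the fraction of all shortest paths
# that pass through a given node (normalized to [0, 1]).
print(nx.betweenness_centrality(G))
# {0: 1.0, 1: 0.0, 2: 0.0, 3: 0.0, 4: 0.0}
```

The hub scores the maximum possible value because every pairwise path depends on it, which is exactly the monopolist's position in the analogy.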
[1]: Much of the cloud-computing craze was similar in character (and it also failed to deliver on some of its promises, such as reducing/replacing IT overhead; they just renamed IT to DevOps). And Web2 itself was about creating and monopolizing a new kind of ad channel and lead-generation machine. There is a funny twist here: a capitalist society like the USA has much more deeply rooted incentives to create a panopticon than the communist states of the past ever did. Neither is pretty, of course. The communists demanded conformity and loyalty, while the capitalists demand consumption and rent.
Alpha Centauri yes, the edge of the universe no :D
The edge of the observable universe is something like 46 billion light-years away; even at 0.9c that's about 50 billion years of travel (around 22 billion years experienced by the traveller).
But yes, you can reach places by constant acceleration; unfortunately those trips are still dwarfed by the distances to the places out of our reach.
Unfortunately the universe is also expanding, with the most distant regions receding faster than the speed of light, so you actually can't ever reach the edge.
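Quick sanity check on those 0.9c numbers (my own back-of-the-envelope arithmetic, ignoring expansion):

```latex
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}} = \frac{1}{\sqrt{1 - 0.9^2}} \approx 2.29,
\qquad
\tau \approx \frac{46\,\mathrm{Gly} / 0.9c}{\gamma}
     \approx \frac{51\,\mathrm{Gyr}}{2.29}
     \approx 22\,\mathrm{Gyr}.
```

So ~51 billion years of coordinate time, and ~22 billion years on the ship's clock, matching the figures above.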
If the craft could maintain a constant 1g acceleration (or more) the entire time, it is feasible for the traveler to get near the known edge, assuming we could make and utilize enough antimatter to do it, and assuming that what we see as the edge from here is actually a recognizable edge once you are out there.
0.9c would be reached in only a year and a half for the traveler under constant 1g acceleration. After 2.5 years you would be at 0.99c, and at a bit over 3.5 years you would hit 0.999c, with roughly 6x as much time having passed on Earth as on the ship. After 6 years of acceleration you would be at 0.99999c and about 200 years would have passed on Earth. As you approach 12 years you would be going 0.9999999999c and Earth would have experienced almost 70,000 years. Past 16 years you would be in the millions of years, and past 20 years in the billions.
Of course, doing that may only be feasible with antimatter energy storage. The next best energy source is fusion, but it is two orders of magnitude less energy-dense. Perhaps some kind of ram scoop would make that route possible, but that goes beyond even speculation: we don't know if you can feasibly capture random particles at that speed, even assuming you didn't explode from simply hitting them in the first place.
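If anyone wants to play with those numbers, here's a small sketch using the standard constant-acceleration (relativistic rocket) formulas; the 1g value in ly/yr^2 and the sample times are my own choices, but the equations are the textbook ones:

```python
import math

# Relativistic rocket: constant proper acceleration A, as a
# function of shipboard (proper) time tau.
A = 1.03   # 1 g expressed in lightyears/year^2 (~9.81 m/s^2)
C = 1.0    # speed of light in lightyears/year

def rocket(tau_years: float):
    """Return (speed as a fraction of c, Earth years elapsed)."""
    x = A * tau_years / C
    v = math.tanh(x)            # v/c = tanh(a*tau/c)
    t = (C / A) * math.sinh(x)  # Earth time = (c/a) * sinh(a*tau/c)
    return v, t

for tau in (1.5, 2.5, 3.5, 6.0, 12.0, 21.0):
    v, t = rocket(tau)
    print(f"tau = {tau:>4} ship-years: v = {v:.10f}c, Earth time ~ {t:,.0f} years")
```

Running it reproduces the progression above: ~0.9c at 1.5 ship-years, hundreds of Earth years by 6, tens of thousands around 12, and billions past 20.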
That would be an accurate summary of almost all software.
Either it's quickly produced and thrown out the door because it's a startup trying to iterate and find market fit ASAP, or because it's a bigcorp whose metrics are all unrelated to software.
No, the majority of people use something a lot of Americans struggle with: "public transport".
The MRT and bus system in Singapore is great for getting around, to the point that you don't need a car. But if you want one it must be new, and you have to pay for a license, as road space and parking space are physically limited.
Singapore is a small and dense island, poor people fare better without cars there. Cars are very expensive, even old, beat up cars. They're either expensive for the owner or for society or both.
Qantas offers premium economy: about 39” of legroom and a few extra inches of width.
If I travel long-haul I will personally always go business, booked well in advance. It’s rare enough that the extra cost is worthwhile. Others spend the money on fancy cars instead.
There used to be a joint online project to compute these tables in a SETI-like distributed system. Everyone who contributed their CPU cycles could use the tables. And yeah, around 2005-2008.
Net-NTLMv1 rainbow tables have been around forever too, though; the same attack documented in this blog post has been hosted as a web service at https://crack.sh/netntlm/ for 10+ years.
A few years ago I was doing some VM things in Azure. I hadn't touched Azure before, and spent 10+ minutes of frustration trying to figure out how to get amd64/x86_64 things started, as the only thing I could find was "Azure ARM" -- and on googling, "ARM" here means Azure Resource Manager. ARGH, why does Microsoft insist on reusing existing names and acronyms?!
I was part of a user study on Azure back when it first rolled out -- they were looking for seniors with an AWS background to participate in UX research, and I remember walking out of that study with imposter syndrome for the very first time. I spent 60 minutes totally unable to do the thing I wanted to do when I was introduced to Azure for the first time, and I remember thinking... am I a fraud?
No! Not this time, at least. In hindsight everything was named and organized terribly and it hasn't improved much since.
This is the Charlie Kirk argument against gun control, "I'm ok with a small number of gun deaths, it's a small price to pay for freedom". All well and good until you become one of those gun deaths.
I agree with him, by the way. But this kind of maximalist thought-ending cliché is weird and anti-intellectual.
One death of an Amazon employee means we should change the whole system? A huge number of people are employed by them, enjoy their lives, and some became multi-millionaires.
Why am I flagged for a fairly normal opinion? A few deaths are okay if the vast majority are satisfied?
In theory I can pull a new cable through, but in practice it might be tough due to the number of bends (shelter -> wall -> vent -> ceiling -> wall -> floor -> room). In the worst-case scenario I can give it a try, but the new fibre cable will probably get destroyed as I pull it through. For now the connection still works, so I am hoping it doesn't get to the point where I have to try.
you can always try the plastic bag + vacuum cleaner trick - take a thin flexible rope, tie it to a small plastic bag, stuff the small plastic bag into the conduit, use a vacuum cleaner at the other end to suck the plastic bag & rope through. You can then use the rope to pull through new cable. If you make the rope twice the length of the conduit, you can keep it in there indefinitely to pull through new cable whenever you want.
> you can always try the plastic bag + vacuum cleaner trick - take a thin flexible rope, tie it to a small plastic bag, stuff the small plastic bag into the conduit, use a vacuum cleaner at the other end to suck the plastic bag & rope through.
That's absolutely great! It worked like a charm two days ago, and everybody who saw it cheered and laughed :-D
I've seen dummy wires being put in when the conduit goes in.
Say initially you need two wires from A to B. That probably means there's plenty of room left, so you just put four more wires in there. When the time comes that you need to pull a new one, you pull the new one in by pulling an old one out.
To anyone reading this and assuming it applies equally to electrical conduit: it does not, which is why the NEC specifies a maximum of four 90-degree bends between pull points. You could probably manage five, as was described, but it is technically disallowed (again, for electrical wiring; the NEC doesn’t care about networking).
Bends ideally need pull boxes, but given the lack of them, you might be able to use fish tape where fish rods / glow rods don't work, if you cannot get a pull string / pull cable going.
Question from a casual bystander: why not have a virtual/staging mini node that receives these feature file changes first and catches errors, to veto a full production push?
Or maybe you do have something like this, but the specific DB permission change in this context only failed in production.
I think the reasoning behind this comes down to the nature of the file being pushed. From the post-mortem:
"This feature file is refreshed every few minutes and published to our entire network and allows us to react to variations in traffic flows across the Internet. It allows us to react to new types of bots and new bot attacks. So it’s critical that it is rolled out frequently and rapidly as bad actors change their tactics quickly."
In this case, loading the file fails quickly. A pretest consisting of just attempting to load the file would have caught it, and minutes is more than enough time to perform such a check.
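Something as simple as the sketch below would gate the push on the file actually loading. This is entirely hypothetical: the real file format, size bound, and rollout pipeline aren't public in this detail, so I'm assuming a JSON feature list and a made-up MAX_FEATURES limit just to illustrate the idea.

```python
import json
import sys

MAX_FEATURES = 200  # hypothetical sanity bound on feature entries

def preflight(path: str) -> bool:
    """Try to load the feature file the same way production would,
    and refuse to publish if parsing fails or it looks malformed."""
    try:
        with open(path) as f:
            features = json.load(f)
    except (OSError, json.JSONDecodeError) as e:
        print(f"refusing to publish: {e}", file=sys.stderr)
        return False
    if len(features) > MAX_FEATURES:
        print("refusing to publish: feature list too large", file=sys.stderr)
        return False
    return True

if __name__ == "__main__":
    # Exit nonzero so the rollout pipeline can block the push.
    sys.exit(0 if preflight(sys.argv[1]) else 1)
```

Even a check this crude runs in milliseconds, which fits comfortably inside a rollout cadence of "every few minutes".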
That's 2x the salary of a lot of the world's software developers.