Re: auto-generated code: it saves CI time, which is a pretty huge benefit. You still need to regenerate the code to make sure it's up to date, but you can do that in parallel with the tests that rely on that code.
High school is a bit under a third of school years in the US, so that's a high school of about 3000 (which is how big my high school was, for example).
You need a heat difference, not just heat, to generate electricity.
Now my sibling comment links to a paper where they say they can find heat differences in the body that are sufficient for their needs, so this is still a possibility! But it does mean you need to be somewhere with a heat gradient: the paper mentions just under the skin.
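For rough scale (my own back-of-envelope, not from the paper): any heat engine is bounded by the Carnot efficiency, and a gradient of, say, 1 K just under the skin at body temperature gives

$$\eta_{\max} = 1 - \frac{T_c}{T_h} \approx 1 - \frac{309\,\mathrm{K}}{310\,\mathrm{K}} \approx 0.3\%$$

so only a tiny fraction of the heat flow is harvestable, which is presumably why "sufficient for their needs" is the bar.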
The tax income in this scenario is already devoted to paying for road improvements. There’s nothing left to be redistributed back. You can raise another tax for redistribution if you’re so inclined, but you can’t spend the money for road repair on redistribution (or else the road doesn’t get repaired, defeating the purpose).
The logistics and shipping are all part of the price of the food and whatever other items you buy. If it were solely up to the trucking companies to pay for the road wear, the increase in costs would of course be added to the price of the food and everything else that gets shipped. So people end up paying for it either way, just through food (etc.) costs.
I suppose it might be cheaper for drivers that way, but groceries and other items will all be more expensive, even for those who do not drive.
I think odds ratio ( p/(1-p) ) is the thing I'd use here. It gives the right limiting behavior (at p ~= 0, doubling p is twice as good, and at p~=1, halving 1-p is twice as good) and it's the natural way to express Bayes rule, meaning you can say "I'm twice as sure (in odds ratio terms) based on this evidence" and have that be solely a property of the update, not the prior.
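A quick sketch of both properties (my own illustration; the numbers are arbitrary):

fn odds(p: f64) -> f64 {
    p / (1.0 - p)
}

fn main() {
    // Near p = 0, doubling p roughly doubles the odds:
    println!("{:.4} vs {:.4}", odds(0.001), odds(0.002)); // ~0.0010 vs ~0.0020
    // Near p = 1, halving 1 - p roughly doubles the odds:
    println!("{:.0} vs {:.0}", odds(0.998), odds(0.999)); // ~499 vs ~999
    // Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio,
    // so "twice as sure" is a property of the evidence alone, not the prior:
    let posterior = odds(0.3) * 2.0; // prior of 0.3, evidence with ratio 2
    println!("{:.3}", posterior); // ~0.857
}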
Excellent comment. I think the issue is that "better" is underspecified and needs to be made precise to be useful. The metric you are using here answers the question "how many times more surprising is it when method A fails than when method B fails?" That is in many cases what we care about, and probably what we care about here. The odds ratio seems to do a good job of capturing the scale of the achievement.
On the other hand, it's not necessarily the only thing we might care about under that description. If I have a manufacturing process that is 99.99% successful (the remaining 0.01% has to be thrown out), it probably does not strike me as a 10x improvement if the process is improved to 99.999% success. What I care about is the cost to produce the average product that can be sent to market, and this "10x improvement" changes that only a very small amount.
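Spelling that out (illustrative numbers): with a per-unit production cost c, the cost per sellable unit is c divided by the yield, so

$$\frac{c}{0.9999} \approx 1.00010\,c \quad\longrightarrow\quad \frac{c}{0.99999} \approx 1.00001\,c$$

a cost reduction of about 0.009%, even though the odds of failure improved by ~10x.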
I came to rust from the other direction, as it were: from Haskell.
There are a lot of similarities! The type systems are similar, with some name changes (sum types -> enums, typeclasses -> traits). Pattern matching is basically the same. Haskell uses (lazy) lists mostly the same way rust uses iterators. Ownership is new, but imposes some of the same requirements that immutability does (no cycles without shenanigans[0]). Rust requires that values are aliasable XOR mutable: Haskell does too (they're always aliasable and never mutable).
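For example, a sum type plus a typeclass-style interface translates almost mechanically (a sketch with made-up names):

// Haskell: data Shape = Circle Double | Rect Double Double
enum Shape {
    Circle(f64),
    Rect(f64, f64),
}

// Haskell: class HasArea a where area :: a -> Double
trait HasArea {
    fn area(&self) -> f64;
}

// Haskell: instance HasArea Shape where ...
impl HasArea for Shape {
    fn area(&self) -> f64 {
        match self {
            Shape::Circle(r) => std::f64::consts::PI * r * r,
            Shape::Rect(w, h) => w * h,
        }
    }
}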
The thing is that mutability is a useful and common tool that most programmers rely on. It takes a bit of buy-in to be convinced that immutability is a good thing that prevents bugs. For example:
Here's a reasonable program to write in Python (wave hands here, my python is rusty)
queue = [root]
for node in queue:
    if not node.visited:
        # ... visit the node ...
        node.visited = True
        for child in node.children:
            queue.append(child)
Here's that in Rust.
let mut queue = VecDeque::new();
queue.push_back(root);
for node in &mut queue {
    if !node.visited {
        // ... visit the node ...
        node.visited = true;
        for child in &mut node.children {
            queue.push_back(child);
        }
    }
}
This will not compile, at all. You need to do a fair amount of work to restructure the data structures to get that reasonable program to run soundly (one possible restructuring is sketched below). Now that's a good thing, because this code would be invalid in C/C++ too, due to iterator invalidation and lifetime issues, but it's the kind of thing that would confuse people who haven't seen a block of code segfault because of a `push_back` before.
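One way to restructure it (a sketch of one possible workaround, mine not the comment's: an index-based arena, so the queue never holds borrows into the node storage):

use std::collections::VecDeque;

struct Node {
    visited: bool,
    children: Vec<usize>, // indices into the arena instead of pointers
}

fn visit_all(nodes: &mut [Node], root: usize) {
    let mut queue = VecDeque::new();
    queue.push_back(root);
    // Popping owned indices means `nodes` and `queue` are never
    // mutably borrowed at the same time.
    while let Some(i) = queue.pop_front() {
        if !nodes[i].visited {
            // ... visit the node ...
            nodes[i].visited = true;
            // Clone the child list so the borrow of `nodes` ends
            // before we push onto `queue`.
            for child in nodes[i].children.clone() {
                queue.push_back(child);
            }
        }
    }
}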
It's rather rare for a programmer to need to learn about memory safety: managed languages just solve it for you, and other systems languages assume you already know about it (or don't care that you can write unsafe programs).
So there's an additional conceptual burden when learning Rust: understanding why programs that seem syntactically correct and safe are forbidden.
TBH I'm surprised that even works in Python – modifying a collection you're in the middle of iterating seems like a bad habit to get into in any language.
edit: in fact, if you specifically use a deque in Python the way you are in Rust, Python will throw a "RuntimeError: deque mutated during iteration". This is just a bad approach in any language, honestly.
Agreed, Python may be a contrived example here. It is possible to pull this off in a few languages where object relocation isn't a problem and mutating a collection during iteration is natural.
Other features include the prominent arcs in this field. The powerful gravitational field of a galaxy cluster can bend the light rays from more distant galaxies behind it, just as a magnifying glass bends and warps images. Stars are also captured with prominent diffraction spikes, as they appear brighter at shorter wavelengths.
So, would that mean that the gravitational lensing over however many light-years is ALSO coupled with the convex/concave aspect of the pico-adjusting of the JWST 'lens', such that even JWST's pico-adjustments affect the NORMAL of the photons relative to the image?
Can this be adjusted for?
Wouldn't the pico-arc of the overall array affect the image output, due to the distances involved, such that we receive "false gravitational lensing, simply based on distance from the sensor"?
I wonder if a more precise version of the hex lenses could be made, such that they can 'normal-ize' on a much more refined basis.
I know that JWST is already capable of micro-flexes of each cell... but if we can develop an even further refinement (Moore's law applied to the resolution of JWST's hex lenses), we will be able to make thousands of images, varying the normalization of each receiving area and comparing image quality.
Also, I am sure there are folks who know the reflective characteristics of photons at each wavelength, which would allow for orientations tuned to each wavelength.
--
Do ALL 'light' wavelengths/particles bounce off the reflector materials in the same way? Meaning: do infrared waves/photons bounce in the exact same way as some other wavelength, given the exact same orientation of the sensor?
---
Do they do any 'anti-gravitational-lensing' correction calcs to 'anti-bend' a photon's path to us, to 're-normalize' the path that we should have seen?
The gravitational lensing matches exactly how it looked in Hubble's deep field overlay, so I would guess no, the JWST optics are not causing any "false" gravitational lensing, if that's what you are asking.
Wouldn't one be able to adjust the perceived path of the photon after the fact, re-normalizing it based on our understanding of the gravitational arc imposed on it -- meaning the astro equivalent of "ZOOM. ENHANCE!" :-)
Let's assume you have a 'straight' line-of-sight vector pointing your earthbound lens [Hubble/JWST/whatever] at the object of interest.
You also have an idea, from previous observations, of the galaxies along that line of sight, which will have a gravitational impact on the trajectory of the photons of interest...
The arriving photon's wiggle represents a wobble in its travel time to Earth, meaning it changed phase multiple times between its origin and our sensor receiving it.
If one could look at the path and the grav-lenses it went through, one might be able to extrapolate a clearer picture at various distances (times)...??? /r/NoStupidQuestions
(I am picturing a straight shot, but the photon traveled between many other celestial bodies, and those...)
Meaning that no matter what, when we speak of gravitational lenses, we could, using JWST, account for the "wobble" of a photon, based on accurate knowledge of where a body was, measured across multiple JWST observations... (ideally through actually multiple JWSTs, in different locations).
The idea being that if we can triangulate a more precise location between Earth [A] and galaxy [N] (over the set of all galaxies/bodies/whatever),
we may be able to calculate the influence of a gravity lens on photon differentials based on when they came from and how far...
Ultimately making adjustments to the output of an image based on super deep-field focus, which is effectively selecting for the photons of interest... and we can basically "carbon date" the accuracy of an image at a higher resolution?
What I think is pretty cool is that the gravity lens actually allowed Hubble to see galaxies it might never have seen had there not been a gravity lens, and now that we have JWST we see many more distant galaxies (and more of the same galaxy appearing in multiple positions).
Producing an "image" of a black hole requires astronomical, ahem, resolution, because they're so far away (thankfully). To achieve this kind of resolution you need an aperture of thousands of kilometers.
The EHT images are created using synthetic aperture techniques to create an effective aperture with a diameter of earth's orbit around the sun. But this is only currently possible at radio frequencies due to our ability to capture, store, and coherently combine the phase information. It's essentially SDR beam forming across space and time.
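For intuition, "beam forming" at its simplest is delay-and-sum: shift each station's recording by its geometric delay and add coherently. A toy sketch (my own illustration, with integer-sample delays and real-valued samples; actual VLBI correlates complex voltages against precise clock and geometry models):

// Toy delay-and-sum beamformer. Aligning each antenna's samples by its
// delay makes signals from the target direction add in phase.
fn delay_and_sum(signals: &[Vec<f64>], delays: &[usize]) -> Vec<f64> {
    let max_delay = *delays.iter().max().unwrap();
    let len = signals[0].len() - max_delay;
    (0..len)
        .map(|t| {
            signals
                .iter()
                .zip(delays)
                .map(|(s, &d)| s[t + d])
                .sum::<f64>()
                / signals.len() as f64
        })
        .collect()
}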
We can also study black holes through visible and IR observations, via their effects on the things around them -- lensing from their mass, and matter heated up by falling in. Here is an image I took of the relativistic-speed matter jet believed to originate from the black hole in M87: https://nt4tn.net/astro/#M87jet ... and Webb can do a lot better than I can with a camera lens in my back yard. :)
Aside, there is some controversy about the EHT black hole images. A recent paper claims to be able to reproduce the ring-like images using the EHT's imaging process and a simulated point source, raising the question of whether the entire image is just a processing artifact: https://telescoper.wordpress.com/2022/05/13/m87-ring-or-arte... Though it's not surprising to see concerns raised around cutting-edge signal processing; LIGO suffered from a bit of that, for example, but confidence there has been improved by a significant number of confirming observations (including optical confirmations of LIGO events).
> The EHT images are created using synthetic aperture techniques to create an effective aperture with a diameter of earth's orbit around the sun.
Small correction: The EHT is a synthetic aperture telescope the size of the Earth, not the size of the Earth’s orbit around the Sun.
Synthetic aperture telescopes need both amplitude & phase information from each observing station & have to combine the phase of simultaneous observations in order to create the final image. We can’t do this on the scale of the earth’s orbit, because we don’t have a radio telescope on the far side of the sun!
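For scale (my back-of-envelope, not from the comment): the diffraction limit is θ ≈ λ/D, and the EHT observes at about 1.3 mm, so an Earth-diameter baseline gives

$$\theta \approx \frac{1.3\times10^{-3}\,\mathrm{m}}{1.27\times10^{7}\,\mathrm{m}} \approx 1\times10^{-10}\,\mathrm{rad} \approx 21\,\mu\mathrm{as}$$

which is roughly the ~20 microarcsecond resolution scale the EHT reports for the M87 image.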
> "Here is an image I took of the relativistic speed matter jet believed to originate from black hole in M87: https://nt4tn.net/astro/#M87jet ... and Webb can do a lot better than I can with a camera lens in my back yard. :)"
You, sir, have just contributed a prime example of HN comments at their best. Your astrophotography is outstanding. Thank you for sharing! :)
Another question: are they already planning a successor to JWST? Is something better even possible? Given that this one took more than 30 years, we should start sooner rather than later :)
https://caseyhandmer.wordpress.com/2021/10/28/starship-is-st... is correct. No NASA planning, including for space telescopes, shows any understanding of how much Starship changes the game. Instead of one, we can put up a network of telescopes. And try out crazy ideas.
Here is a concrete example. https://www.researchgate.net/publication/231032662_A_Cryogen... lays out how a 100 meter telescope could be erected on the Moon to study the early universe with several orders of magnitude better resolution than the JWST. The total weight of their design is around 8 tons. With traditional NASA technologies, transport of the material alone is over $30 billion and it had better work. With Starship, transportation is in the neighborhood of $10 million. Suppose that precision equipment added $40 million to the cost. Using Starship, for the cost of the JWST, we can put 200 missions of this complexity in space. Using a variety of different experimental ideas. And if only half of them worked, we'd still be 99 telescopes ahead of the JWST.
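Spelling out the arithmetic behind that claim:

$$200 \times (\$10\mathrm{M} + \$40\mathrm{M}) = \$10\mathrm{B} \approx \text{JWST's total cost}$$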
So where is Starship? It is on the pad, undergoing testing. They have a list of 75 environmental things to take care of before launch. Which means that they likely launch this month or next. At the planned construction cadence, even if the first 3 blow up, by Christmas it should be a proven technology.
I realize this is meant as a joke, but it isn't just a joke! Play a video of a ball flying up and then back down again and it'll look the same forward or backwards (up to air friction, anyway).
If it wasn't a joke, then that was simply a misleading false statement.
Let's take the simple example of the earth orbiting the sun. Playing time backwards gets you an orbit in the opposite direction, while gravity becoming antigravity would mean that the earth gets repelled by the sun and thus goes off to infinity.
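To spell that out (standard Newtonian mechanics; my own elaboration): under t -> -t, velocity flips sign but acceleration does not, so gravity's equation of motion is unchanged and the reversed orbit is still a valid attractive orbit, whereas antigravity flips the force itself:

$$\ddot{\mathbf{x}} = -\frac{GM}{|\mathbf{x}|^{3}}\,\mathbf{x}\ \ \text{(gravity; invariant under } t \to -t\text{)} \qquad \ddot{\mathbf{x}} = +\frac{GM}{|\mathbf{x}|^{3}}\,\mathbf{x}\ \ \text{(antigravity; repulsion and escape)}$$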
That's interesting. Playing time backwards long enough would see the earth disassembled into rocks, dust and gas, repelling each other and indeed flying off into <far away>. Same with the sun. But the short term orbit example challenges the intuition. Perhaps the answer is that the time-forward orbit is (conventional) downhill in spacetime, and the time-backward orbit is uphill in spacetime, but both trajectories are seen in conventional space as a curved path around the center of gravity.
> Playing time backwards long enough would see the earth disassembled into rocks, dust and gas, repelling each other and indeed flying off into <far away>.
No, playing time backwards long enough would see a hot earth exploding into rocks, dust and gas that are attracting each other; it's just that the initial velocity is so large, and the attraction so weak, that they fly out into <far away> anyway. They would be slowing down as they fly off, not accelerating as if they were repelling each other.
They would then be joined by the dissolving sun and form a cloud of dust which some time later (i.e. earlier) would converge (because the dust is attracting itself) into some earlier massive star(s) out of whose remains our solar system was formed.
If an asteroid hits the earth, the gravitational potential energy (of an attractive gravity) gets turned into kinetic energy as it accelerates when approaching the earth and afterwards into heat as it impacts it; playing time backwards, the heat gets turned into kinetic energy, which then gets turned into gravitational potential as it distances itself from earth.