XOR A absolutely works on Z80 and it's of course faster and shorter than loading a zero value with LD A,0.
LD A,0 is encoded in 2 bytes, while XOR A is a single-byte opcode.
XOR A has the additional benefit of leaving the flags in a well-defined state: it clears the carry and N flags (and sets Z, since the result is zero). SUB A also clears the accumulator, but it always sets the N flag on the Z80.
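For reference, here is a sketch of the encodings and timings involved (byte values and T-state counts taken from the standard Z80 opcode tables):

```asm
; Three ways to zero the accumulator on the Z80:
    LD  A,0     ; 3E 00 - 2 bytes, 7 T-states, flags untouched
    XOR A       ; AF    - 1 byte,  4 T-states, clears C and N, sets Z
    SUB A       ; 97    - 1 byte,  4 T-states, also zeroes A, but sets N
```

In a tight loop the byte and the 3 T-states saved by XOR A over LD A,0 add up quickly, which is why it became the idiom.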
Yeah, the article seems to have missed the likely biggest reason that this is the popular x86 idiom: it was already the popular 8080/Z80 idiom from the CP/M era, and there's a direct line. A bunch of early 8086 DOS applications were mechanically translated 8080 assembly code, so while they are "different" architectures, they're still solidly related.
I think the answer is in the article text "all meticulously sectioned by publisher".
Book sizes differ somewhat between publishers, but each publisher tends to print books in only a few standard sizes. For paperback editions the variety is even smaller; it usually looks to me like all paperbacks from one publisher are the same size.
Because they are an industry, aren’t they?
Printing, binding, cover production, transportation, and storage are all much easier and much cheaper with a few standardized sizes.
I always thought it to be the other way around. If you assume that we are living in a simulation, then gravity might well be an artifact of a simulation that runs on localized, loosely coupled nodes.
Because when you have mostly empty space with little interaction, the simulation can run at full speed, but the more particles you have interacting at close range, the slower the simulation gets as the local workload increases.
In this model, gravity would not be the source with time dilation as one of its effects; instead, time dilation caused by interactions is the cause, and the gravitational force is what we experience as the result.
The same thing happened a week ago in Hamburg, Germany, when the fire department wanted to inform the local population about necessary evacuations for the removal of World War II unexploded ordnance that was found during construction work.
They had hit their Twitter limit and were unable to post the notice.
I just ran into the same issue yesterday when I was updating one of my machines. Fortunately I caught the DKMS error when mkinitcpio ran after the upgrade:
==> dkms install --no-depmod zfs/2.1.9 -k 6.2.8-arch1-1
Error! Bad return status for module build on kernel: 6.2.8-arch1-1 (x86_64)
Consult /var/lib/dkms/zfs/2.1.9/build/make.log for more information.
==> WARNING: `dkms install --no-depmod zfs/2.1.9 -k 6.2.8-arch1-1' exited 10
Issues like these are where a rolling-release distro might bite you.
To mitigate the risk a bit, I always have the linux-lts kernel package installed on all of my Arch Linux machines, which provides a fallback option when the mainline kernel fails, or some major kernel issue surfaces.
In my case, with the LTS kernel at 6.1.21-1-lts, the DKMS build of ZFS on LTS succeeded, so it would have been a bootable fallback even if I had missed the error for the mainline kernel DKMS.
I don't have my root on ZFS, I only use it for data storage, but if I ever did, I would make damn sure to build myself a live ISO with ZFS support and all drivers necessary for network access, ahead of time, in case I need to do an emergency repair. It doesn't need to always be on the latest kernel, just recent enough that you are able to mount all partitions and chroot into the system.
For this issue a workaround with Kernel 6.2.8 is discussed in the AUR comments for zfs-dkms: https://aur.archlinux.org/packages/zfs-dkms
This would involve editing the PKGBUILD for ZFS to patch the license for the symbols to GPL, though I'm not sure about the legality of this approach.
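As I understand it, the hack discussed there amounts to relabeling the module's license metadata so the kernel's GPL-only symbol check passes. A minimal sketch of that kind of sed patch, demonstrated on a throwaway stand-in file (the exact file name and string in the real ZFS tree are assumptions here, and this is not legal advice):

```shell
# Demonstrated on a stand-in copy, NOT the real ZFS source tree.
workdir=$(mktemp -d)
cd "$workdir"
# Stand-in for the license metadata file a PKGBUILD prepare() would patch:
printf 'Name: zfs\nLicense: CDDL\n' > META
# The kind of one-line relabel discussed in the AUR comments:
sed -i 's/^License: CDDL/License: GPL/' META
cat META
```

In a PKGBUILD this would typically go into the prepare() function, before the build runs.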
> For this issue a workaround with Kernel 6.2.8 is discussed in the AUR comments for zfs-dkms: https://aur.archlinux.org/packages/zfs-dkms This would involve editing the PKGBUILD for ZFS to patch the license for the symbols to GPL, though I'm not sure about the legality of this approach
In the US there is 17 USC 117(a) which might apply:
> Making of Additional Copy or Adaptation by Owner of Copy.—Notwithstanding the provisions of section 106, it is not an infringement for the owner of a copy of a computer program to make or authorize the making of another copy or adaptation of that computer program provided:
> (1) that such a new copy or adaptation is created as an essential step in the utilization of the computer program in conjunction with a machine and that it is used in no other manner, or
(Section 106 is the section that says you need permission from the copyright owner to make copies or adaptations).
This whole GPL-only-symbols thing seems to happen to ZFS every kernel release. They will find a way around it eventually, even if that means implementing stuff in ZFS or SPL.
I would argue that the credit for the first 3D open-world game belongs firmly to Mercenary from 1985, and not to Hunter with its 1991 release date.
Mercenary presents a 3D vector environment that one can navigate freely, on foot or using various vehicles and aircraft, entering buildings and exploring their interiors.
The graphics are much more limited than Hunter's: the first Mercenary game from 1985 uses only simple wire-frame graphics, which you can, most of the time, just walk through.
Even with the technical limitations of its time, this game proved really captivating, once you made it past the initial hurdle of figuring out what the hell you're supposed to do after crash-landing on an alien planet in the introduction.
It was one of the few games that really fascinated me back then, and I spent many hours playing it.
The best exploration of this theme that I have read was an SF novel from the former GDR (East Germany).
Unfortunately, as far as I know, it has never been translated into English.
The last German government before Merkel came to power was a coalition between the Green party and the Social Democrats (SPD). This government decided to phase out nuclear power in Germany in favor of renewable energy, by limiting the running time of existing nuclear power plants.
The next German government was a coalition between Merkel's conservative CDU and the SPD, and the nuclear phase-out remained untouched.
But when Merkel won the federal elections a second time and was able to form a coalition with the liberal FDP, replacing the SPD, which had voted in favor of the nuclear phase-out, as coalition partner, the new government decided to cancel the nuclear phase-out.
This happened in the autumn of 2010 and was a controversial decision with the public. When Fukushima happened half a year later, it was widely seen as a confirmation of the inherent risks of nuclear energy, and thus of the Green party's energy politics: not because Germany is especially prone to tsunamis, but because it showed that it's impossible to rule out and plan for every possible dangerous situation in advance, and that even a high-tech country like Japan, on a technical level comparable to Germany's, was not able to stop the meltdowns.
Elections in federal states shortly after Fukushima resulted in a landslide loss for Merkel's CDU and big wins for the Green party, with the CDU losing one of its stronghold states [1] that it had held continuously for 58 years.
As far as I remember, Merkel foresaw the public reaction and almost immediately after Fukushima spoke out in favor of reversing the cancellation of the nuclear phase-out, but she could only get solid backing in her party after the losses in the federal state elections.