That's only using the party-trick HAM mode though. 32 colours (plus another 32 from the half-brite bitplane) is what most software uses.
Of course in 1987 a Macintosh II with a fully expanded "Toby" framebuffer could not only do 256 colours, it could do it in 640x480 mode, whereas a PS/2's VGA could only do 16 colours at that resolution. And an Amiga could only do flickervision at that res.
Of course, with technology improving all the time, not having an updated chipset circa 1987 that at least had a progressive-scan 640x480(ish) mode is one of those things that really killed the chances of the Amiga as a serious computer. They only got that circa 1990, when "Super VGA" was already just about becoming a thing in the PC world (and Microsoft had kinda got round to making a version of Windows that didn't suck by then). I'm not sure if the mythical Ranger had a progressive mode, but it does show how Commodore's inability to keep the custom chips updated in a timely manner slowly sunk the system...
> Of course in 1987 a Macintosh II with a fully expanded "Toby" framebuffer could not only do 256 colours, it could do it in 640x480 mode, whereas a PS/2's VGA could only do 16 colours at that resolution.
If cost is no issue, the PS/2 also had the 8514/A card that could do 256 colours at 1024x768. And there was also the PGC from 1984 that could do 256 colours at 640x480.
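A back-of-envelope sketch makes it clear why those modes needed so much (then-expensive) VRAM, and why "fully expanded" mattered. The mode labels below are just for illustration; real boards rounded up to stock VRAM sizes:

```python
def vram_bytes(width, height, bpp):
    """Framebuffer size in bytes for a packed-pixel mode."""
    return width * height * bpp // 8

# Rough framebuffer needs per mode (sketch, not exact card specs):
for name, w, h, bpp in [
    ("VGA 16-colour",       640, 480, 4),
    ("Mac II 256-colour",   640, 480, 8),
    ("8514/A 256-colour",  1024, 768, 8),
]:
    print(f"{name}: {vram_bytes(w, h, bpp) / 1024:.0f} KiB")
# VGA 16-colour: 150 KiB
# Mac II 256-colour: 300 KiB
# 8514/A 256-colour: 768 KiB
```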
Indeed. As per this timing diagram, Denise accesses each 16-bit word of each bitplane sequentially. The more bitplanes you turn off, the more cycles are available for the blitter... or the CPU!
Fun fact! The Amiga Workbench is 4 colour hires by default, because hires is impressively businessy... but 8 or 16 colour hires would lock out the CPU most of the time, as the chipset would have to dip into the 68000's even cycle RAM accesses and stall it. 4 colour hires lets the CPU (on a chipmem-only system) run at full speed!
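As a back-of-envelope sketch of why the plane count matters: each extra bitplane adds another full set of word fetches per scanline, and in hires those extra fetches land on the cycles the 68000 would otherwise use. (The exact DMA slot allocation is in the Amiga Hardware Reference Manual; this only shows how fetch volume scales with plane count.)

```python
PIXELS_HIRES = 640   # hires scanline width in pixels
BITS_PER_WORD = 16   # Denise fetches 16-bit words

def fetches_per_line(bitplanes, pixels=PIXELS_HIRES):
    """16-bit chip-RAM fetches per scanline for a given bitplane count."""
    return (pixels // BITS_PER_WORD) * bitplanes

for planes in (2, 3, 4):  # 4, 8 and 16 colours respectively
    print(f"{2**planes:>2} colours: {fetches_per_line(planes)} word fetches/line")
#  4 colours: 80 word fetches/line
#  8 colours: 120 word fetches/line
# 16 colours: 160 word fetches/line
```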
Games having a title screen where you “Press Start Button” is a slightly odd convention going back to the arcades, even on games where there’s only one set of controls.
It does now have a limited set of themes based on previous consoles; that started as a 40th-anniversary special feature and ended up sticking. But yeah, it's a bit odd we've gone from Sony providing the tools to make your own theme on the PS3 to basically nothing on the PS5.
oh yeah, it turns out I even had one of these themes enabled (PS3 one). I got so used to it I forgot it's not how the console usually looks. Still a far cry from custom icons and animated backgrounds
I've got a spec for IPv2. Because of advances in carrier grade NAT, we can reduce the address field from 32 bits to 16, making amazing savings somehow.
I hereby propose an IPv6.1. The only change is the written form goes from:
2001:db8::ff00:42:8329
to
32.1.13.184..255.0.0.66.131.41
By doing this, I have changed IPv6 from the strange unwanted alien thing everyone hates, to the new wonder protocol that "just adds more dots" that everyone wants.
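For fun, here's a sketch of the "IPv6.1" rendering in Python, assuming the standard longest-zero-run rule decides where the '..' goes (the function name is made up, obviously, and RFC 5952's finer compression niceties are skipped):

```python
import ipaddress

def ipv6_point_one(addr: str) -> str:
    """Render an IPv6 address as dotted-decimal bytes, with the
    longest run of zero groups collapsed into '..' (like '::')."""
    groups = [int(g, 16) for g in ipaddress.IPv6Address(addr).exploded.split(":")]
    # Find the longest run of all-zero 16-bit groups.
    best_start, best_len, i = 0, 0, 0
    while i < 8:
        if groups[i] == 0:
            j = i
            while j < 8 and groups[j] == 0:
                j += 1
            if j - i > best_len:
                best_start, best_len = i, j - i
            i = j
        else:
            i += 1
    def dots(gs):
        # Each 16-bit group becomes two dotted-decimal bytes.
        return ".".join(str(b) for g in gs for b in (g >> 8, g & 0xFF))
    if best_len == 0:
        return dots(groups)
    return dots(groups[:best_start]) + ".." + dots(groups[best_start + best_len:])

print(ipv6_point_one("2001:db8::ff00:42:8329"))
# 32.1.13.184..255.0.0.66.131.41
```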
The argument is that the 68k is "CISCier" than x86, the addressing modes in particular, so making a performant modern out-of-order superscalar core for it would be harder than for x86.
I believe that. But Commodore could have plunked a cheap 68020 into their machines for backwards compatibility (like how the MSX2 had an MSX1 SoC inside, the PS2 had a PS1 SoC, the PS3 had a PS2 SoC, and so on) and put another "real" socketed CPU in as a co-processor. Or made big-box machines with CPUs on PCI cards, for infinite expansion options. "True" multitasking, perfect for CAD, 3D rendering and non-linear video editing. It would have been very cool with an architecture where the UI could be rendered at almost hard real-time and the heavy processing happened elsewhere.
How much of Hombre is myth and legend? Given how little progress was made from OCS to ECS to AGA, it seems unlikely they could even have built an Amiga SoC, never mind designed a new 64-bit chipset.
Don't agree there, considering x86 has ModRM, size prefixes (16/32 and later 64-bit operand sizes), SIB (with a prefix for 32-bit), segment/selector prefixes, etc.
The biggest area where the 68000 is perhaps more complicated is postincrement. But considering all the cruft 32-bit x86 inherited from the 8086, compared to the relatively "clean" 32-bit variations of the 68000, I'd call it a toss-up at best, leaning towards the 68000 being easier (stuff like PC-relative addressing also exists on the RISC-y ARM arch).
Addressing modes aside, the sheer number of weird x86 instructions and prefixes has always been the bane of low-power x86.
It's a sequential colour camera, each field is red, green or blue filtered (using a spinning colour wheel), and they're processed back on earth to recombine them into a colour TV picture. Doesn't work that well with fast motion, as there's too much movement between the red, green, and blue images, hence the rainbowing. They were of course bandwidth limited so conventional NTSC might be an issue. Also a normal colour TV camera at the time used three (or four) image tubes, rather than the one in the Apollo cameras, which would have added size and weight (this is before things like CCDs were practical).
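A toy sketch of that recombination, and of why motion causes the rainbowing (pure illustration, not the actual NASA processing): if the subject moves between the red, green, and blue field captures, the recombined frame smears it into colour fringes.

```python
def recombine(r_field, g_field, b_field):
    """Merge three sequential monochrome fields into one RGB frame."""
    return [[(r, g, b) for r, g, b in zip(rr, gg, bb)]
            for rr, gg, bb in zip(r_field, g_field, b_field)]

def field_with_object(x):
    """5x5 monochrome field with a bright 1-pixel 'object' at column x."""
    return [[255 if (row == 2 and col == x) else 0 for col in range(5)]
            for row in range(5)]

# The object moves one pixel between each field capture...
r, g, b = field_with_object(1), field_with_object(2), field_with_object(3)
frame = recombine(r, g, b)
print(frame[2][1], frame[2][2], frame[2][3])
# (255, 0, 0) (0, 255, 0) (0, 0, 255)  <- a red/green/blue smear: the rainbow
```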
Bitsavers have some documents about the Jaguar RISC project[1] that do indicate Apple's feedback went into the 88110, for example in the System ERS it states "The main processor for the Jaguar is a new version of the Motorola 88000 family which has been enhanced (with input from Jaguar's team) in several areas over the existing implementation. This processor (which will be the MC88110) will be referred to as XJS in the ERS.". There's also an architecture document describing changes Apple wants to make to the 88000 ISA, although I'm not sure how much of this actually got through into the final 88110 (Apple wanted to break binary compatibility, not sure if that happened).
[1] The high end RISC machine project that went nowhere, which AFAIK became known as Tesseract when switched to PPC before it fizzled out.
There isn’t a Glide version of Quake. John Carmack didn’t want to do endless vendor-specific API ports of Quake after an early Rendition Verite port burned him, so he just released GLQuake and said vendors should support standard APIs.
3DFX had a mantra of “no CAD”, so they didn’t support OpenGL, which they saw as primarily aimed at running CAD software and the like. They therefore had to come up with the somewhat hacky MiniGL, implementing just enough of OpenGL to get Quake to actually run.
That's because CAD graphics and game graphics were very different beasts. NV Quadro cards could sustain far more objects being rendered, but at lower frame rates, as CAD rendering doesn't require real-time effects with constant changes everywhere. If you have a look at Glide games, the effects somehow look 'prettier' and the lights more 'alive'.
/u/sdz-mods - the person doing the 3DFX port to Irix - reported just now that they got MesaFX up and running and that'll serve as the basis of a "MiniGL" to get the port done. ;) So, there is hope!