If failure really was a foundation for success, we'd take all these technologies we now emulate in software and whose patents have expired, and reimplement them in hardware, firmware included, by hard-baking it into an ASIC (ASIC memory is certainly denser than flash, and obviously denser still than SRAM).
For example, putting a pre-Y2K Pentium on a chip with a Sound Blaster, using diodes for power, and adding wireless communication. It'd make the RISC-V ESP32 look wimpy.
I don't know if they put a Sound Blaster in, but Intel's first generation of Edison boards is basically a 32nm Pentium (complete with LOCK prefix bugs) combined with wireless stuff in the footprint of an SD card.
I wonder where the cutoff is: the point at which the value of the existing (legacy) x86 ecosystem beats the performance offered by a new platform like RISC-V.
The software stacks available are probably going to be more battle-tested. I'm also curious if there's more potential in resuscitating legacy code at the "application" level, if you knew you could run it in an embedded box which drew a fraction of a watt. Maybe all you really need is something that already existed as a DOS application.
However, it's comparing apples and freight locomotives. Most of the historic x86 story revolved around desktop computing and keyboard-mouse-screen interactions[1], while the ESP32's is about internet-of-things and GPIOs, so even if the x86 module is beefier, its tooling and connectivity may not be what you want out of the box if you were considering an ESP32.
On the other hand, we do see places where people are clearly trying to hammer low-end ARM and ESP32 parts into vaguely screen-and-keyboard roles. I'm thinking of dedicated word-processor appliances (AlphaSmart, etc.) or 300-in-1 knockoff consoles, which you could reasonably implement as a 586-class CPU running off-the-shelf software.
[1] Yes, I know full well there are loads of embedded x86 environments that look nothing like a PC. In university, our assembly language course involved setting up an 80186 developer board tethered to an ADM-3 terminal to be the world's most expensive digital clock.
Feels more polished? I absolutely do not understand that; some of the more social-media-friendly Pi projects involve building clustered file systems with them. Seems very nitty-gritty to me.
What counts as deviating from stock GNU/Linux? Adding a third-party repo?
Setting aside how contradictory that is, are you saying the vendor provides a non-mainline version of Linux that can't be updated because the driver API breaks on update?
Hardware-level stuff/embedded is outside of my area of expertise, but as I understand it you are pretty much correct.
You see this a lot with Android devices and custom images. There will be drivers that are only provided in the vendor-blessed image, patches that are difficult or impossible to port to new versions of the kernel, etc.
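To make that concrete, here's a minimal sketch, entirely my own illustration (the vendor_widget names are made up), of why those out-of-tree drivers rot: the in-kernel API has no stability guarantee, so the driver accumulates version guards for every internal change, and once the vendor stops maintaining it, it simply stops building against newer kernels. The proc_create() switch from file_operations to proc_ops in Linux 5.6 is one real example of such a change:

    /* Hypothetical vendor module; the #if guard handles a real internal API
     * change: in Linux 5.6 proc_create() switched its last argument from
     * struct file_operations to struct proc_ops. */
    #include <linux/module.h>
    #include <linux/proc_fs.h>
    #include <linux/version.h>

    #if LINUX_VERSION_CODE >= KERNEL_VERSION(5, 6, 0)
    static const struct proc_ops vendor_widget_ops = { };        /* stub */
    #else
    static const struct file_operations vendor_widget_ops = { }; /* stub */
    #endif

    static int __init vendor_widget_init(void)
    {
        /* expose a (do-nothing) /proc/vendor_widget entry */
        proc_create("vendor_widget", 0444, NULL, &vendor_widget_ops);
        return 0;
    }

    static void __exit vendor_widget_exit(void)
    {
        remove_proc_entry("vendor_widget", NULL);
    }

    module_init(vendor_widget_init);
    module_exit(vendor_widget_exit);
    MODULE_LICENSE("GPL");

Multiply that by every SoC-specific driver in a vendor BSP and it's easy to see why a device can end up stuck on the kernel it shipped with.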
Again, this is me looking in from the outside. Most of my information has come from reading about other people's experiences with hardware, especially Android devices but also other embedded chips.
How are you incapable of understanding sarcasm? In fact, you're arguing from appeal to authority now. I just, I don't understand. The weird thing is that your argument is extremely convincing at first glance, even though the entirety of it is a link to Wikipedia?
Just what do you expect my response to that to be? I certainly don't understand anything more now. You bring up someone obscure and claim he is representative of, as far as I can tell, all teachers urging reform of our education system? Maybe after reading the previous arguments by COMPLETELY SEPARATE PEOPLE I have been misled.
Not to be personal or anything, but I went to public school.
That's in Disney's "The Magic Highway" (1958), at 02:16, which is apparently still shown at Disney parks in the waiting area for some auto-related attraction.
I guess that's the true Turing test: can an artificial general intelligence determine the complexity level and relative risk of infinite loops for an algorithm written for a Turing machine?
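In full generality that's the halting problem, so no algorithm, AGI or otherwise, can answer it correctly for every program; at best it can do well on the programs people actually write. Here's a minimal sketch of the classic diagonal argument, with a hypothetical halts() oracle whose name is mine, not anything standard:

    /* Hypothetical oracle: returns 1 if f() eventually halts, 0 if it loops
     * forever. The stub below is obviously wrong for some inputs; the
     * diagonal argument shows every possible implementation must be. */
    #include <stdio.h>

    static int halts(void (*f)(void))
    {
        (void)f;
        return 1;   /* placeholder answer */
    }

    static void contrary(void)
    {
        if (halts(contrary))
            for (;;)    /* oracle says "halts", so loop forever */
                ;
        /* oracle says "loops forever", so halt immediately */
    }

    int main(void)
    {
        printf("halts(contrary) = %d\n", halts(contrary));
        /* whichever answer halts() gives, contrary() does the opposite,
         * so no correct halts() can exist for all programs */
        return 0;
    }

Rice's theorem extends the same trick to essentially any non-trivial question about a program's behavior, including the complexity and risk estimates above, so the honest answer is heuristics and partial analyses rather than a true decider.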
I'm curious whether a major distribution will attempt to support macOS itself. I've heard of Gentoo and Debian supporting the FreeBSD kernel for a brief period, but I'm surprised that the only package-manager support for macOS seems to consist of a one-man project.
Still waiting for Windows Platform to become Universal.