AmigaOS still has a special place in my heart; it's probably the most elegantly designed piece of software I've ever seen (apart from that ugly DOS part, of course, which was shoehorned in because of deadline pressure).
There was a single fixed location in the entire system (address 0x4 aka ExecBase), and everything an AmigaOS application required was 'bootstrapped' from that one fixed address.
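Roughly, from memory, that bootstrap looks like this in C (on a real Amiga the compiler's startup code normally sets SysBase up for you; this is just to show the idea):

    #include <exec/execbase.h>
    #include <proto/exec.h>

    struct ExecBase *SysBase;

    int main(void)
    {
        struct Library *DOSBase;

        /* The one fixed address in the whole system. */
        SysBase = *(struct ExecBase **)4L;

        /* Every other library and device is reached through Exec. */
        DOSBase = OpenLibrary("dos.library", 0);
        if (DOSBase) {
            /* ... call dos.library functions here ... */
            CloseLibrary(DOSBase);
        }
        return 0;
    }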
All OS data structures were held together by linked lists, everything was open and could be inspected (and messed up - of course terrible for security, but great for learning, exploring and extending).
Amiga Exec completely spoiled me when it came to elegance of operating systems.
Everything I learned about after it was a huge disappointment, including Mach, particularly because Exec had demystified the OS for me: just a bunch of lists, and thanks to the OO nature, they were all the same kind of list.
Here's what a node looks like: next, previous, a type, a priority, a name.
A task? A node. With a bunch of extra state.
An interrupt? A node. With a lot less extra state.
A message? A node. With an optional reply port if the message requires a reply.
Reply port? Oh, that's just a port.
A port? Yeah, a node, a pointer to a task that gets signaled and a list of messages.
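From memory the core structures look roughly like this (paraphrased from the exec/nodes.h and exec/ports.h headers, so field names may be slightly off):

    struct Node {
        struct Node *ln_Succ;     /* next */
        struct Node *ln_Pred;     /* previous */
        UBYTE        ln_Type;     /* NT_TASK, NT_INTERRUPT, NT_MESSAGE, ... */
        BYTE         ln_Pri;      /* priority, used to keep some lists sorted */
        char        *ln_Name;
    };

    struct Message {
        struct Node     mn_Node;
        struct MsgPort *mn_ReplyPort;    /* where the reply goes, if one is wanted */
        UWORD           mn_Length;
    };

    struct MsgPort {
        struct Node  mp_Node;
        UBYTE        mp_Flags;
        UBYTE        mp_SigBit;      /* signal bit used to wake the owning task */
        void        *mp_SigTask;     /* the task that gets signaled */
        struct List  mp_MsgList;     /* the queue of pending messages */
    };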
How do you do I/O? Send special messages to device ports.
No "write() system call", it's queues at the lowest levels and at the API layer.
To me, a few things stand out that I'm increasingly looking to emulate, and they're mostly not about the low-level API/ABI:
* Assigns. Basically aliases for paths, but ephemeral and they can be joined together. E.g. the search path for executables is the assign C:; the search path for dynamic libraries is libs:. I've added basic, superficial support for assigns to my shell (see the sketch after this list for what an assign amounts to on AmigaOS). It's a hack, but being able to just randomly add mnemonics for projects is nice, and not having to put them in the filesystem as symlinks somehow also feels nicer, even if it only saves a few characters.
* Datatypes. AmigaOS apps can open modern formats even if the app hasn't been updated for 30 years as long as they use datatypes: Just drop a library and descriptor file into the system.
* Screens. I'm increasingly realising I want my wm to let apps open their own virtual desktops, and close them again, as a nice way of grouping windows without having to manually manage it, and might add that - it'd be fairly easy, and on systems that don't support it the atoms added would just be a no-op. The dragging was nice to show off at the time, but less important. Ironically, given the Amiga was one of a few systems offering overlapping windows when it launched, screens often served as a way for apps themselves to tile their workspaces on a separate screen/desktop, and my own wm setup increasingly feels Amiga-ish - I have a single desktop with floating windows and a file manager, just like the Amiga Workbench screen, and a bunch of virtual desktops with tiling windows.
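The sketch mentioned above: roughly what an assign amounts to programmatically, i.e. what the Assign command does under the hood, if I remember dos.library's AssignLock() right. The WORK: name and the paths here are made up:

    #include <proto/dos.h>
    #include <dos/dos.h>

    int main(void)
    {
        BPTR lock, fh;

        /* Make "WORK:" an alias for an existing directory... */
        lock = Lock("DH1:Projects/MyApp", ACCESS_READ);
        if (lock && !AssignLock("WORK", lock))
            UnLock(lock);    /* on success the system takes over the lock */

        /* AssignAdd() can attach further directories to the same name,
           which is how a single assign becomes a search path. */

        /* ...and from then on any path can be spelled through the assign. */
        fh = Open("WORK:ReadMe", MODE_OLDFILE);
        if (fh)
            Close(fh);
        return 0;
    }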
In terms of the API, one of the things I loved was more something that evolved: the use of "dispatch/forwarding" libraries, like e.g. XPK, that would provide a uniform API to do something (like compression) plus an API for people to implement plugins. So much of the capability of Amiga apps comes down to that culture (Datatypes was an evolution of it), and it means the capabilities of old applications keep evolving.
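The pattern itself is simple. Purely as an illustration (this is not XPK's actual API, just the general shape of such a forwarding library): a stable front-end call that forwards to whichever plugin claims the data.

    /* Hypothetical plugin interface; real libraries like XPK differ in the details. */
    struct Packer {
        const char *name;
        int  (*can_handle)(const unsigned char *data, unsigned long len);
        long (*unpack)(const unsigned char *in, unsigned long inlen,
                       unsigned char *out, unsigned long outlen);
        struct Packer *next;    /* the plugins themselves live on a linked list, of course */
    };

    static struct Packer *packers;    /* filled in as plugin libraries are loaded */

    /* The one call applications use; new formats appear without touching the apps. */
    long UniversalUnpack(const unsigned char *in, unsigned long inlen,
                         unsigned char *out, unsigned long outlen)
    {
        struct Packer *p;
        for (p = packers; p; p = p->next)
            if (p->can_handle(in, inlen))
                return p->unpack(in, inlen, out, outlen);
        return -1;    /* no plugin claimed the data */
    }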
You can still have that Amiga feeling on old PCs by using AROS: https://aros.sourceforge.io/
I think that having preemptive multitasking also made a huge difference in making the system responsive and feel fast.
The GUI, with its windows and gadgets (= "widgets"/"controls") ran in Intuition's task. The mouse pointer was moved by a vblank interrupt and could thus never lag.
On Macintosh, the whole GUI ran practically in the active app's event loop.
The whole system could be held up by an app waiting for something.
Microsoft made the mistake of copying Apple when they designed MS-Windows.
Even today, on the latest Windows, which has had preemptive multitasking since 1995, a slow app can still effectively hold up the user interface, preventing you from doing anything but wait for it.
When Apple in the late '80s wanted to make their OS have preemptive multitasking, they hired the guy who had written Amiga's "exec" kernel: Carl Sassenrath.
https://en.wikipedia.org/wiki/Carl_Sassenrath
> Even today, on the latest Windows, which has had preemptive multitasking since 1995, a slow app can still effectively hold up the user interface, preventing you from doing anything but wait for it.
Could you explain what you mean here? If you were to make your event loop or wndprocs hang indefinitely it would not hang the Windows interface for the rest of the machine, it would just cause ANR behavior and prompt you to kill the program. As far as I can remember it's been that way since at least Windows 2000.
An example I run into almost every day: run a non-trivial build in Visual Studio which consumes all available CPU cores. Now try to use the trackpad to scroll some random window content: it just doesn't work; at best you get a UI view that jumps around instead of scrolling smoothly.
AFAIK Windows is supposed to boost the CPU priority of the UI during user input, but apparently that doesn't work.
AmigaOS also boosted the CPU priority of the UI during mouse movement, except it actually worked.
PS: instead of fixing the issue from the ground up in the OS (which admittedly is probably impossible), the VS team added a feature called 'Low Priority Builds':
[1] https://developercommunity.visualstudio.com/t/Limit-CPU-usag...)
[2] https://devblogs.microsoft.com/cppblog/msbuild-low-priority-...
I think what OP's saying is that on the Amiga it was idiomatic to let the UI be handled by a dedicated thread/task. That was also the norm on other notable systems such as BeOS. It's still a good guideline today but not so easy to apply.
I mean the windows. An app can open a window and hold it there, and meanwhile you can't move the window, you can't move it to the back and you can't minimise it.
Oh I see. I'm not familiar with the model used by Workbench and how it differs exactly.
(edit: and to be clear, I did read the article and see what it said, but without more detail I'm not 100% sure what it really looks like in practice, and why it would be less likely for applications to have situations where they become unresponsive.)
Amiga computers are expensive these days; you are better off with an Amiga emulator like Amiga Forever.
I remember the Amiga had the checkered bouncing beach ball demo, and others copied it; then on the Amiga they opened up several copies of the demo, all multitasking and bouncing at the same time.
The only downside of the Amiga was the dreaded Guru Meditation Errors when memory went corrupt or something. IIRC AmigaDOS/Workbench had no protected memory features.
> IIRC AmigaDOS/Workbench had no protected memory features.
This was a limitation of the original MC68000 CPU, which had no MMU. The Amiga operating system was also designed around a single shared address space throughout, which made it significantly harder to retrofit memory protection after the fact.
The original Macintosh had a resolution of 512 x 342 pixels. The Amiga 1000 had several resolutions up to 640 x 400 @ 16 colors, and could utilize 4096 colors in lower resolutions. The reality distortion field seems to still be working, I guess. The Amiga was better in every practical way.
The differences between the Amiga and Mac are a fascinating study in priorities. The Amiga was orientated around home use so its graphics hardware was designed to be attached to a TV. That design goal affected the entire system, to the point that the system clock changes frequency depending on whether the machine is outputting PAL or NTSC.
Technically the Amiga could display a rock solid hires picture but only on a special monitor that I personally never saw.
The priority on the Mac was to have a high quality monitor for black and white graphics. They put a lot of effort into drawing libraries to make the most of the built-in display.
The result was that the Amiga was perfectly fine for playing games or light word processing but if you actually needed to stare at a word processor or spreadsheet for 8 hours a day you really wanted a Mac.
Nope. The Amiga did just fine 8 hours a day, even 16 hours a day, coding and word processing and everything else. Interlace wasn't a huge issue, and the price difference between Macintosh and Amiga made it easy to justify. A simple "flicker fixer" could be added for far less than the difference in price between an Amiga and a Mac. The Mac cost $1000 more than an Amiga, which is a lot to pay to get a far less capable platform. A "flicker fixer" went for $100.
Apple in the '90s was circling the drain; nobody wanted an overpriced black-and-white computer except die-hard Apple fans, and Apple only exists today because Microsoft bailed them out. Too bad Microsoft didn't invest in Amiga instead.
There were even C64, Apple II, IBM PC-DOS, and Atari ST emulators for the Amiga.
The top resolution was 640 x 400, not lower than the Mac, which had one single 2-color resolution. Interlace didn't bother me, but a "flicker fixer" was only $100. The Amiga had a ton of cheap and easy mods depending on what your interests were.
I think you are missing the point. I was an actual user of both and preferred the Macintosh. The Amiga graphics were color but underwhelming in resolution.
In what universe was 512x342 better than 640x400?
640×400 was an interlaced mode. Every other scanline was shown every other frame.
This was an NTSC TV signal, remember?
There was an add-on called a "Flicker Fixer" that cached the video signal and emitted VGA-signals at twice the pixel clock and horizontal refresh rate. The Amiga 3000 had one built in.
The ECS and AGA chipsets supported a "Productivity mode" that emitted VGA signals, but ECS supported only four colours in this mode. All games used the TV modes. "Multisync" monitors that could switch between VGA and television refresh rates were expensive, so few people had them.
As I mentioned in a previous post, it cost a LOT more than $100. $500 for a Microway flicker fixer + another $400 for the monitor. They may cost $100 "today", but at the time, it was very expensive.
Also remember the Amiga was competing with the Mac II line for most of its life. Yes, it was much more expensive... but we are comparing specs, and you could get Mac II displays that supported 256 colors out of 16 million (24-bit.) The Amiga didn't have 24-bit color until 1992.
>As I mentioned in a previous post, it cost a LOT more than $100.
Nope, at the time the flicker fixer was about $100. The monitor didn't cost that much either. I had both, I was not rich, I was a kid who saved up some money.
>Also remember the Amiga was competing with the Mac II line for most of its life.
We're talking about the original Macintosh computer released in Jan 1984, and the original Amiga 1000 released in July 1985. Don't move the goalposts.
The Mac II was not released until 1987, and it had a 68020 CPU. The system you should be comparing it to is the Amiga 2000 with 68020 card. The Mac II was released at a ridiculous price of $3700. It really makes the Amiga look like a bargain - the price for an Amiga 2000 with 68020 was $1495, you could easily buy two of them for the price of one Mac II.
>The Amiga didn't have 24-bit color until 1992.
I don't care? The topic was the original Mac vs the original Amiga 1000. Not wherever you want to take this conversation.
Fair enough on the Mac II. They were very expensive and a rip off! Also, early Mac OS sucked.
>The Amiga graphics were color but underwhelming in resolution.
You cannot be serious. I provided the ACTUAL specifications of the screen resolution of both platforms, and somehow you still say the Amiga was "underwhelming in resolution" when it actually had MORE pixels in both horizontal and vertical than the Macintosh? How can you actually say this? The Amiga had 256,000 pixels, the Macintosh had only 175,104 pixels. The numbers do not lie. The Amiga had 80,896 MORE pixels than the Macintosh. PAL mode offered even more pixels on the Amiga. You're just plain wrong.
FWIW, I also had both platforms, and vastly preferred the Amiga, not just for the higher screen resolution, but also the 4096 colors it provided vs. the 2 colors of the Macintosh. And the far better multitasking, stereo 14-bit sound, amazing games, AREXX, and a lot more. Mac was always way behind the Amiga, in every single way including resolution.
Stereo 14-bit sound didn't really happen until the 1990s and needs quite a lot of CPU time and careful hacking, as Amiga sound chips are 8-bit, 4-channel. 4096 colors via weird hold-and-modify modes was never useful outside of demos and vanishingly few games. Nobody used 640x400 because interlacing was far too flickery, and such resolutions certainly didn't support 4096 colors.
The Amiga was ahead of its time in many ways, and the pre-emptive multitasking was fantastic, but claiming it was some paragon doesn't help anyone. If you wanted a fun home machine attached to a TV, it was great. Even a fun home machine attached to a monitor. If you wanted a business machine with a monitor, it wasn't the safest or best choice, if only due to a lack of software.
The Amiga 1000 can do 14 bit stereo sound. What did the Mac have? Oh, single-channel 8-bit. It doesn't matter if it took all the CPU to do it, the mac simply was not capable of coming anywhere close to Amiga in terms of graphics or sound. The standard Amiga sound capabilities were 4-channel stereo 12-bit, which was still light years ahead of Apple.
Try to notice how you're avoiding answering the points I raised and jumping at any chance to defend the Amiga (it was only 4-channel 8-bit built in and nobody achieved 14-bit sound before 1991). The Amiga vs X wars were decades ago at this point - you could just let it go.
>4096 colors via weird hold-and-modify modes was never useful outside of demos and vanishingly few games.
16 colors is still way better for games than 2 color black-and-white. The fact that the Amiga could achieve 4096 colors in a world where 16 colors was the norm, was astonishing. The resolution did not matter. The capabilities mattered, and the Amiga was far more capable in every way than the Mac. The mac had 1-channel 8-bit sound, the Amiga had 4-channel stereo 8-bit sound, and was capable of 14-bit sound. So go nitpick some more if you want to, I don't care, but I won't be responding to you anymore.
The Mac’s screen was flicker free. Almost nobody used any Amiga any higher than 640x200 (640x256 PAL) unless they had special hardware (“flicker fixer”.)
That being said I preferred the Amiga.
A "flicker fixer" cost $100. A macintosh with 2-bit lower resolution graphics cost $1000 more than the Amiga. The Amiga was a bargain. And I don't remember interlace modes being that bad, honestly.
Early Macs weren't even 2-bits, they were 1-bit: black or white!
The Amiga was a bargain in comparison, but it was not without its flaws, like all early machines. I had an A500 with a 1084 monitor, and the flicker at high res was bothersome to me. I later upgraded to an A3000 w/VGA monitor, and it was a vast improvement. I ran at 640x400 for everything at that point.
I think you are underestimating the price of "flicker fixers" at the time. I looked up the price of a Microway flicker fixer in an old Amiga World from 1988: over $500. You also had to add a VGA monitor: another $400.
>I think you are underestimating the price of "flicker fixers" at the time. I looked up the price of a Microway flicker fixer in an old Amiga World from 1988: over $500. You also had to add a VGA monitor: another $400.
Still way, way better than a 2-color mac at about the same price. And it wasn't even necessary, the Amiga 1000 was amazing without it.
> My Amiga 1000 ran rings around the Macintosh with the same CPU, because it had custom co-processors
And the earliest ARM machines ran rings around the Amiga because they had a custom-designed RISC CPU, so they could dispense with the custom co-processors. (They still cost a lot more than the Amiga, since they targeted the expensive education sector. Later on ARM also got used for videogame arcades with the 3DO.)
Want to give an Amiga user an orgasm? Fuck them gently and at the right moment, nibble their ear and whisper the words "custom chips" into it.
By contrast, there's a story about some Microsoft engineers taking a look at the Macintosh and asking the Apple engineers what kind of custom hardware they needed to pull off that kind of interface. The Apple guys responded, "What are you talking about? We did this all with just the CPU." Minds blown.
The designers of the Mac (and the Atari ST) deserve mad credit for achieving a lot with very little. Even though, yes, the Amiga was way cooler in the late 80s.
And the Amiga was faster than the Mac when emulating a Mac.
I know this first hand, because I got my first email address with CompuServe, running their software under emulation, while using my Amiga's dial-up modem. (I had to sneak the Mac ROM images from the computers at school...)
In one amusing juncture of a really bad port and emulation/hardware speed, Sim City 2000 was reported to run better as the ShapeShifter-emulated Mac version than as the native Amiga version, as long as you could meet the hardware requirements.
This was due to several factors, chief of which was that the SC2000 Amiga port was made under extreme time pressure and, probably, very low budget. Later patches alleviated that to some degree, but patching your game in 1993? Who did that? What you got on your floppies was usually what you were stuck with barring some extreme cases of negligence.
"no dynamic linking" (by implementing dynamic linking)
"no zombies" (as long as your programs aren't buggy)
I fail to see any meaningful distinction from what we have today. If it was more reliable, it was because it was smaller in scope and had a barrier to entry.
Not to mention any errant program could easily take down the OS because there was no memory protection. Amiga users quickly became familiar with the “guru meditation” screen, which was the system’s BSOD.
There is an important distinction between "one address space" and "one set of memory permissions". The former is usually a good idea (for debuggability if nothing else!) if you don't need to support `fork`; the latter is the problem.
On modern Linux systems you can even do separate sets of memory permissions within a single process (and single address space), with system calls needed only at startup; see `pkeys(7)`.
https://www.man7.org/linux/man-pages/man7/pkeys.7.html
(note however that there aren't enough pkeys available to avoid the performance problem every microkernel has run into)
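A minimal sketch of that on Linux, assuming x86-64 hardware with protection keys and the glibc 2.27+ wrappers (error handling omitted):

    #define _GNU_SOURCE
    #include <sys/mman.h>

    int main(void)
    {
        size_t len = 4096;
        void *buf = mmap(NULL, len, PROT_READ | PROT_WRITE,
                         MAP_ANONYMOUS | MAP_PRIVATE, -1, 0);

        /* System calls only at startup: allocate a key and tag the region with it. */
        int key = pkey_alloc(0, 0);
        pkey_mprotect(buf, len, PROT_READ | PROT_WRITE, key);

        /* Later, flipping access is a userspace register write, not a syscall. */
        pkey_set(key, PKEY_DISABLE_WRITE);    /* region becomes read-only for this thread */
        pkey_set(key, 0);                     /* ...and writable again */

        pkey_free(key);
        munmap(buf, len);
        return 0;
    }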