For game consoles, we've had emulators like Nestopia and bsnes and Dolphin and DuckStation for years.
For PCs, virtualisation systems like VMware and VirtualBox have covered most people's needs, and recently there have been high-fidelity emulators like 86Box and MartyPC.
The C64 has VICE, the Amiga has WinUAE, even the Apple II has had high-quality emulators like KEGS and AppleWin, but the Mac has mostly been limited to high-level and approximate emulators like Basilisk II.
In addition to Executor/DOS, a non-released version ran on Sun 3 workstations (they too had 680x0 processors), and Executor/NEXTSTEP ran on NeXT machines, both the 680x0-based ones and the x86-powered PCs that could run NEXTSTEP.
Executor was the least compatible because it used no intellectual property from Apple. The ROM and system software substitutes were all written in a clean room: no disassembly of the Apple ROMs or System file.
Although Executor ostensibly has a Linux port, it's probably hard to build (I haven't tried in a couple of decades), in part because, to squeeze the maximum performance out of an 80386 processor, the synthetic CPU relied on gcc-specific extensions.
I know a fair amount about Executor, because I wrote the initial version of it, although all the super impressive parts (e.g., the synthetic 68k emulator and the color subsystem) were written by better programmers than I am.
> I feel like that’s a bit harsh, but I’ll admit that it is needlessly inflammatory.
You're asking for a courtesy here that you failed to extend to others.
When you write a hit piece on someone's hobby volunteer code, and then you get called out for being unduly mean, I don't think you get to complain that people are being harsh to you. You chose to devote hours of your time to dismantling something someone put years of effort into, entirely as a fun hobby. (Antique Mac emulation is certainly not the highway to riches.) You say 'inflammatory', like the issue here is that you're slightly heated and passionate. No, the issue here is that the piece boils down to bullying other people because their fun hobby projects don't meet your esoteric standards ('no GitHub releases!').
> I wasn’t in the best state mentally when I wrote that. (I do sometimes worry that I’m responsible for the disappearance of Paul C. Pratt…)
Nothing about your mental state gives you licence to bully others. Their emotional states are no less important than yours.
vmware and virtualbox were backed by billion dollar corps
the 16 bit machines are much simpler than macs
game consoles had highly homogeneous, well-documented hardware, and sold in much greater numbers (the snes alone sold more than all macs from 1987 to 1995), so there's a larger community to draw devs and users from. writing a nes emulator is almost a weekend project now, it's so well documented.
Connectix got bought by Microsoft, and InnoTek got bought by Sun, which is now Oracle. Connectix themselves started as a scrappy outfit making it possible to run DOS/Win95 on a Mac.
The core emulation was pretty much done and stable and optimised before the billion-dollar corps bought them out.
even a "tiny, scrappy company" has massive manpower compared to 99.999% of open source projects
It's a shame that Basilisk - possibly owing to its inaccurate but killer features - is as janky as it is, because it's really remarkably pleasant to use when it works.
For earlier models, there are unused apertures in the memory maps that are at least 2 KB in size, but they do differ between models.
These seem to work:
https://archive.org/details/mac_rom_archive_-_as_of_8-19-201...
I tried to run some of them, e.g. the Macintosh Plus. It accepts the ROM, but then just shows a flashing floppy disk icon and doesn't do anything else. How can this be fixed?
https://www.gryphel.com/c/minivmac/start.html has some links.
Apps on the Macintosh were supplied as disk images, often compressed or encoded in formats like .sit and .hqx. You'll need PC Exchange and StuffIt Expander; think of them as the equivalents of a file-extension renaming tool and 7-Zip.
Installation of apps was often accomplished by dragging and dropping the application icon from the installer image to the Applications folder, although some did come with Windows-like installer apps.
Unmounting disks and installer images was done by moving the disk icons to the Trash (recycle bin). It's the most intuitive and straightforward feature of classic Mac OS, followed by the "Shut Down" command living under the "Special" menu.
My father used an external hard drive with his Mac Plus, back in the day.
It would probably be easier to crack the software!
[1]https://en.wikipedia.org/wiki/File:Apple_Keyboard_II.jpg
[2]https://www.cnet.com/tech/computing/hack-your-old-macs-adb-k...
https://github.com/gblargg/adb-usb
Unfortunately it's US-ANSI only, so my pile of four French Canadian AEK2s don't work very well with it.
It's one of the most convenient ways to get arbitrary-sized disk images into emulators, both Mac emulators and physical floppy emulators.
It's why the cursor sporadically stops moving when writing to a disk: the Mac has a 60 Hz timer interrupt that, among other things, moves the cursor, and it has to be switched off during disk writes.
There's a story on Folklore.org by Andy Hertzfeld that mentions it in passing:
> Woz's disk technology required that the software feed it new data every 32 microseconds exactly. If we were even a single microsecond early or late, it would cause a glitch in the data and ruin it. In order to write the routines, I needed to know how fast the Macintosh executed each instruction. The manual gave the number of clocks for each instruction, but I wasn't sure how long it took to fetch from memory. So of course, I asked Burrell what the timings were, but I was surprised at his response.
> "I don't know. The Mac is synchronous, just like the Apple II, so each instruction has the same timing, every time you execute it, so you will be able to write disk routines that have exact timing. I don't know what it is, so we'll just measure it. Why don't you write your routine and we'll measure it with the logic analyzer."
-- https://www.folklore.org/Nybbles.html
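To put that 32-microsecond deadline in perspective, here's a back-of-the-envelope sketch. The 32 µs figure is from the quote above; the 7.8336 MHz clock is the commonly cited Macintosh 128K CPU clock and is my assumption, not something stated in the thread:

```python
# Rough cycle budget for the Mac's software-driven disk interface.
# BYTE_WINDOW_US comes from Hertzfeld's quote; CPU_HZ is the commonly
# cited Macintosh 128K clock (an assumption, not from the thread).
CPU_HZ = 7_833_600
BYTE_WINDOW_US = 32

cycles_per_byte = CPU_HZ * BYTE_WINDOW_US / 1_000_000
print(f"~{cycles_per_byte:.0f} CPU cycles per disk byte")  # ~251 cycles
```

Which is why even uncertain memory-fetch timing mattered so much: at roughly 8 cycles per microsecond, being "a single microsecond early or late" is only a handful of instructions' worth of slack.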
This reminds me that all of the unusual Apple II disk stuff like spiral tracks and different-sized sectors and different nibblization schemes were also, at least theoretically, possible on the Mac. I wonder if they were ever used for copy protection?
http://www.mac.linux-m68k.org/devel/plushw.php:
“The Macintosh disk interface uses a design similar to that used on the Apple II and Apple III computers, employing the Apple custom IWM chip. Another custom chip called the Analog Signal Generator (ASG) reads the disk speed buffer in RAM and generates voltages that control the disk speed. Together with the VIA, the IWM and the ASG generate all the signals necessary to read, write, format, and eject the 3 1/2-inch disks used by the Macintosh.”
Any chance this could be made to emulate an Atari ST?
And there's also Clock Signal (CLK) "A latency-hating emulator of: the Acorn Electron and Archimedes, Amstrad CPC, Apple II/II+/IIe and early Macintosh, Atari 2600 and ST, ColecoVision, Enterprise 64/128, Commodore Vic-20 and Amiga, MSX 1/2, Oric 1/Atmos, early PC compatibles, Sega Master System, Sinclair ZX80/81 and ZX Spectrum." https://github.com/TomHarte/CLK
Inspired is a strong word. I didn't invent the concept of an accurate emulator, although I'm certainly a fan of his approach.
A brand new 68k Mac emulator quietly dropped last night!!
“Snow” can emulate the Mac 128K, 512K, Plus, SE, Classic, and II. It supports reading disks from bitstream and flux-floppy images, and offers full execution control and debugging features for the emulated CPU. Written in Rust, it doesn't do any ROM patching or system call interception, instead aiming for accurate hardware-level emulation.
* Download link (Mac, Windows, Linux): https://snowemu.com
* Documentation link: https://docs.snowemu.com
* Source link: https://github.com/twvd/snow
* Release announcement: https://www.emaculation.com/forum/viewtopic.php?t=12509
-- https://oldbytes.space/@smallsco/114747196289375530
I understand why links get re-written, but I think the context is relevant and can help the random reader who is unfamiliar with the project.
I wish Apple would bring back the white menubar background and the coloured logo.
The white menubar makes the whole computer easier to use in a small but constant way. The coloured apple icon would suggest they no longer have their heads stuck up their asses and might bring back "fun" rather than "showing off" to their design process. And then maybe, maybe... with that "suggestion" symbolised in the UI, we can hope they might bring back the more rigorous user-centric design process they used to be famous for.
I suppose a built in volume mixer is still too much to ask for though.
It’s not churn, it’s change, and it’s inevitable. No sense getting worked up over it.
The only improvement I've seen is that the Mac has the Command+Space launcher, which is functionally like pressing Win and typing the app you want. Graphical file browsers haven't changed since the original Mac and/or Win 3.1. The Mac has never had a good tree view IMO, but it does have a version of one.
The only reason UIs would change at this point is to keep UI/UX folks employed and busy, and give the marketing department something new to talk about.
But I'm not going to upgrade whilst the back/next buttons are floating 3m above the window, as suggested in that screenshot.
I go through phases with transparency off or on.
Sometimes I enjoy the translucent menus. They make the machine look "glossy" and expensive. But they're definitely harder to read than opaque flat ones.
With "reduce transparency" on, it's better, but the menubar still isn't white. It's a textured light grey that's closer to the look of an unfocused app window than the solid, dependable, flat thing I wish it still was.
A color logo might be added with an overlay app, or you could reminisce over a black-and-white screen.
Links to the actual project are in the submitted post, so you can get an overview before then being directed to the project itself.
As always YMMV, indeed, YMWV, but I like seeing the announcement giving the context rather than a bare pointer to the project.
But as the Man in Black says in The Princess Bride: "Get used to disappointment".
The guidelines are clear that the original/canonical source is what we want on HN:
> Please submit the original source. If a post reports on something found on another site, submit the latter.
But you're welcome to post a comment with links to other sources that give the extra information and context, and we can pin it to the top of the thread, or do what I've done here and put them in the top text.
I understand the rationale, and as someone who moderates other communities I can totally understand why this is administered as a blanket policy. Having said that, it does sometimes result in what I think of as sub-optimal situations where information is unnecessarily lost or obscured.
In particular, adding a link to the original post, as you have done here, is likely to be of minimal value. People will click on the headline link, wonder what it's about or why it's "news", and close the window. On the other hand, clicking through first to the post means people will see the context, then those who are interested will click through to the project site(s). I've done this analysis in other contexts and found that the decision tree for engagement and user-information is in favour of linking to the post, not the project.
But as I say, I understand your position, and in the end, it's not my forum, not my community, and not my choice.
We always want the source that contains the greatest amount of information about the topic. As I wrote in the other reply in this subthread, the heuristic is whether a source contains "significant new information" vs an alternative.
That means, as explained in that reply, an article about the findings of an academic study is better than the academic paper, if it contains significant new information that isn't easily found from the paper itself (particularly if the article contains quotes from interviews with the researchers). A project creator’s blog post about a new project or release is better than a link to the project's GitHub page.
We generally prefer not to link to a third party's social media post about a project, on the basis that it's light on significant new information and takes traffic/attention away from the primary source or another in-depth article about it. (It's different if it's a third party's detailed blog post about a project, which includes their own experiences using the project and comparing it with other projects in the same category. But then it's more of a review than a report about the project itself.) Another problem with submitting a third-party post about a project is that why one source was chosen over another then becomes a topic of debate in the comments, which happened here.
In a case like this, the information that was in that social media post could easily have been quoted in a comment in the thread, which we could have pinned.
Given that the author of the project posted an announcement in a discussion forum, there could be a case for making that the HN source, given that it contains the other relevant links and some additional commentary, though in this case it's a bit light on detail. But it makes all the difference that the source we link to is by the author of the project.
In the case of this submission, the story has been on the front page for 12 hours already, including some time at #1, and is still going strong, so I don't think anything has been lost.
You're always welcome to make a case for why a particular source is the one that contains the most "significant new information" and is thus the one that should be the HN source.
[1]: https://news.ycombinator.com/item?id=44381297
[2]: https://arxiv.org/pdf/2506.19244
[3]: https://www.quantamagazine.org/a-new-pyramid-like-shape-always-lands-the-same-side-up-20250625/
In the case you cited, the Quanta Magazine article is a report about the study’s findings that is readable and understandable to lay people, and includes backstory and quotes from interviews with the researchers and also images.
That is, there’s plenty of information in the article that isn’t in the paper. So we’ll always go with that kind of article over the paper itself, particularly in the case of Quanta Magazine, which is a high-quality publication.
In other cases an article is “blog spam”, i.e. it just rewords a study without adding any new information, and in those cases we’ll link directly to the study, or to a better article if someone suggests it.
Anyone is always welcome to suggest a source that is the most informative about a topic and we’ll happily update the link to that.