I'm one of those "Macs since the beginning" folks. The first computer I truly used on my own was the family Mac Classic. I've watched Apple nearly die and come back from the brink. I've used Macs daily since my school days; the school lab where I fell in love with digital media was full of Power Mac G4s.
Before my 2018 Mini I had a heavily upgraded Mac Pro. I absolutely loved how I could squeeze all the drives and cards for extra ports into it. My chief complaint, of course, was heat. I already live in a somewhat warm part of California, and having a heat lamp keeping my pet turtle alive AND a Mac Pro pumping out heat in the same room was just a no-go. It didn't help that my older monitors were putting out heat too.
Essentially during the winter, I never ran a heater.
Fast forward to today. I like my 2018 Mini. It's about the same multicore as my Mac Pro and substantially better at single core. It runs cooler, the fans aren't as loud, and the new UltraSharps put out way less heat. Hypothetically, a win-win.
Hypothetically.
I used to have a single power strip running everything; now I have three. Hubs, docks, and enclosures all take power, and not everything plays nice when plugged into one another. After much experimentation and many forced hard shutdowns I've finally found a good balance, but not everything is rosy. For one, my Mini can't sleep with its RX 580 eGPU attached, and roughly 1 in 10 startups results in a random crash. The internal HDMI port causes crashes, and one of the two monitors has to get video from the Mini semi-directly to work. The workaround is a Dell WD15 USB-C dock -> Mini DisplayPort -> secondary monitor.
This brings me to my future with Apple hardware.
One of my biggest complaints about Apple is the deprecation and eventual dropping of things they don't like anymore. Right now I'm still on Mojave because I reference 32-bit software. I've read people grumbling about losing FireWire support for random devices on newer macOS. I keep a Sony DSR-45 DV deck around for conversion projects; I'd hate to lose that.
The iOS bloat isn't great either, nor is the fact that I just can't plug in the Samsung S22 Ultra I have to use for work and transfer files to and from it. My Windows laptop? No problem there.
Another point of contention is the lack of ports on all the currently available Macs. The Studio has exactly as many ports as my current Mini, and the M1 Mini lost ports. For reference, these are my current Thunderbolt and USB trees.
[Attached: screenshots of my Thunderbolt and USB device trees]
That's a lot of hubs and docks just to connect the stuff I use. On my Mac Pro I had two USB 3 cards to attach everything, and it worked without a hitch.
As a hobbyist, power user, and artist, there doesn't really seem to be a future for me and Apple. I can't stay on an obsolete OS forever, and all the bugs associated with my current Mini have really turned me off from any future hardware. Heck, I can't even use Thunderbolt 1 and 2 devices on a Thunderbolt 4-enabled computer.
Windows isn't as elegant as a Mac but at least it seems like I can actually do what I want to do with it.
From the standpoint of ports, the M2 Pro version of the Mac mini (which directly replaces the higher-end 2018 Mac minis) has the exact same I/O arrangement. No, you won't have eGPU support. No, you won't be able to run Windows via Boot Camp, let alone run any version of Windows other than Windows 11 Pro or Enterprise for ARM64 in a VM. And no, you won't be able to run macOS Mojave or any other Intel release of macOS, let alone one that supports 32-bit Intel apps.
The Mac runs cooler and will be way faster. But those are about the only problems of yours that it solves.
Certainly, as someone who uses both Windows and Mac, as someone who has a couple of 2015 era Macs specifically for Mojave, and as someone who also despises Apple's penchant for deprecating software support far more quickly than any other desktop platform out there and for not all that much end benefit, I greatly empathize. Windows will be better.
I hadn't really been all that involved with Windows until Windows 10 and I will say that Windows 10 is pretty great. Windows 11, with a couple minor annoyances that I've yet to get over, is similar. Provided the hardware you run it on is good (and shopping for that can be a bit of a challenge in and of itself), you will be left with something no less stable than the best of Intel Macs.
Apple Silicon DOES result in faster and more battery-efficient Macs, but that's not to say that the performance on a good Intel Mac or good Intel/AMD PC is bad. Again, provided you buy the right machine, it's generally no worse.
This is a Mac forum and most on here are diehards that will talk trash on Windows. But the fact of the matter is that it's a perfectly viable alternative.
Frankly, I'm jealous that I can't join you in eschewing Apple altogether. (Once my Intel Macs no longer serve much purpose, my Mac usage will pretty much scale down to only really needing a specced-out 13-inch MacBook Pro or a binned Mx Pro-based 14-inch MacBook Pro, and that's it. Windows covers all other needs and, for that, it's great!)
As you stated, it sounds like you really need/enjoy legacy support, and if that's the case then Windows is probably your best bet, especially since you're not fully ingrained in the Apple ecosystem (Samsung phone); once you are, it's hard to get out.
I don't disagree that it's much easier to stop using macOS or iPadOS if you don't have an iPhone, but not having an iPhone doesn't necessarily mean you're not still somewhat locked into an ecosystem. iCloud is probably the biggest element when it comes to this. But, considering that things like the Apple Card, the Apple Watch, and AirTags (at least the ability to update their firmware) all require an iPhone, you're not far off.
I don't entirely disagree, but, on the other hand, an obsession with backwards compatibility is part of what is holding Windows back - and why Apple has successfully switched processor architectures three times while Windows is still joined at the hip to x86 (despite several efforts over the years to support other processors).
In what practical ways is Windows actually held back by backwards compatibility? If you're saying that it takes them 6 years to sunset Internet Explorer and even longer to sunset Control Panel in favor of Settings, sure, I'll grant you that. But as both a software developer and a user who depends on software, I'd much rather my OS vendor phase things out slowly and gracefully than take Apple's change-for-change's-sake attitude. Plus, it's not like Apple HAS to leave certain apps, users, and developers in the lurch just to update, say, Metal. Maybe the hardware.
Toward that end, I think it's fine that macOS dropped PowerPC hardware support in Snow Leopard. But I don't know what it did for developers, consumers, or the platform itself by dropping that first Rosetta version. Similarly, I understand that the days of supporting Intel Macs have to be numbered at this point. While sad, that's also okay and SHOULD happen. But I don't know what a possible dropping of Rosetta 2 does for the platform in terms of its advancement other than force people like the OP to hold onto older hardware that Apple would prefer not be held onto.
Also, Microsoft IS branching Windows out into ARM. They're not forcing a transition because that's not how they roll; ideally, people make their apps compatible with both architectures so that users buy the computer they want. I don't see how that's holding anyone back. Even still, that goes back to my first question: what does the Windows platform need to advance to that it doesn't already have?
Any Mac software that is still 32-bit is now well and truly "abandonware" - and while there is some totally self-contained software that can potentially keep doing its job forever, most things will eventually become just too out of date as tech changes.
I think the practical question you have to ask yourself is: if you switch to Windows, will you still need to keep the Mac Mini around to "reference" Mac software? Otherwise, those old 32-bit apps will be toast along with every other Mac app that you use - and one way or another you'll still need to keep that 2018 Mojave Mac around (at least it's a Mini!). So it comes down to weighing what hypothetical things Apple might drop in the future against the future of Windows. Already, with Windows 11, MS has been more aggressive in dropping support for older hardware ("because security") and finally killed off 16-bit software - and the Windows software market is heading down the same slippery slope towards subscription-based and/or server-dependent software as the Mac.
The key difference is that Microsoft drops support for technologies that, through the power of telemetry, they know people aren't using anymore and/or shouldn't still be using if they are. They sunset 32-bit OSes on the server side back with Windows Server 2008 R2 (the server analogue to Windows 7); they stopped shipping 32-bit to OEMs with Windows 7 and 8, and strongly encouraged that only business-class hardware be given 32-bit Windows 10 drivers at all. There were companies with old legacy software that still needed it, even if the average Windows-wielding consumer had long since comfortably transitioned to 64-bit Windows.
Yes, Windows 11 ditched a lot of hardware that was much more recent than the hardware capable of running Windows 10 with full driver support. That doesn't mean that 99% of software that worked in 64-bit Windows 7, let alone 8, let alone 8.1, let alone 10, won't continue to run unmodified in Windows 11. That's the key difference. Since Apple first moved to Intel, I've had to stop using Mac OS 9 apps, then later PowerPC Mac OS X apps, then later 32-bit Intel apps; and you know it's only a matter of time before Apple concocts some nonsense reason to drop Rosetta 2, leaving 64-bit Intel Apps in the cold.
Your post makes some interesting points, but I don't understand these.
Apple dropped 32-bit support because 64-bit is an improvement without compromise
First off, I'm pretty sure that's NOT the reason they dropped 32-bit support. To my knowledge, they never said why. They merely warned developers at the Platforms State of the Union one year at WWDC.
Secondly, every Intel Mac in existence has, in its Intel processor, the 32-bit instruction sets. You throw 32-bit x86 instructions at any Intel Mac, and it will know what to do with them at a hardware level. (Whether a given app itself is 32-bit or 64-bit is a property of its binary - see the sketch after these points.)
Thirdly, unlike the massive de-bloat that occurred in Mac OS X Snow Leopard when the PowerPC code was removed, macOS Catalina didn't slim down by removing the ability to run 32-bit software. If anything, the OS got bigger.
For those on an Intel Mac, losing the ability to run 32-bit Mac apps brought absolutely zero benefit to the Intel version of macOS. None. It didn't streamline anything. It didn't make the codebase better. It did nothing.
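For the curious, the 32-bit/64-bit distinction lives in the first four bytes of an app executable's Mach-O header. Here's a minimal Python sketch - my own illustration, not an Apple tool - that reads that magic number; the constant values come from Apple's <mach-o/loader.h> and <mach-o/fat.h> headers:

```python
import struct
import sys

# Magic numbers from <mach-o/loader.h> and <mach-o/fat.h>.
MH_MAGIC    = 0xFEEDFACE  # 32-bit Mach-O -- won't launch on Catalina or later
MH_MAGIC_64 = 0xFEEDFACF  # 64-bit Mach-O
FAT_MAGIC   = 0xCAFEBABE  # universal binary (Java .class files share this value)

def describe(path: str) -> str:
    with open(path, "rb") as f:
        raw = f.read(4)
    if len(raw) < 4:
        return "too short to be a Mach-O file"
    # The magic can be stored in either byte order, so check both.
    for fmt in (">I", "<I"):
        magic = struct.unpack(fmt, raw)[0]
        if magic == MH_MAGIC:
            return "32-bit Mach-O"
        if magic == MH_MAGIC_64:
            return "64-bit Mach-O"
        if magic == FAT_MAGIC:
            return "universal binary (inspect the slices with `lipo -archs`)"
    return "not a Mach-O file"

if __name__ == "__main__":
    # Point it at the executable inside an app bundle, e.g. (hypothetical path)
    # SomeApp.app/Contents/MacOS/SomeApp
    print(describe(sys.argv[1]))
```

The `file` and `lipo -archs` commands will tell you the same thing; this just shows how little is actually involved.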
So, why did they do it? My money is on the following:
1. Apple produced their first 64-bit Apple Silicon SoC, the A7, which, incidentally, was the first ever 64-bit ARM processor on the market. Apple has stayed relatively ahead of even ARM's own reference designs ever since.
2. Apple dropped 32-bit iPhone and iPad app support with the 2017 release of iOS 11. Releasing at the same time, incidentally, was the A11 Bionic, the first Apple SoC (and first known ARM CPU) to drop the 32-bit instruction sets. And to clarify that bit: unlike every single Intel processor put in every single Intel Mac ever produced, devices with the A11 Bionic - namely the iPhone 8, iPhone 8 Plus, and iPhone X, the devices that shipped with iOS 11 initially - could not run a 32-bit iOS app if they tried. The instruction sets were simply not there. ARM and, by extension, Qualcomm, Samsung, and all of the other ARM CPU/SoC vendors are going to follow suit and drop 32-bit ARM instruction sets. Microsoft is currently trying to make all of its leftover 32-bit ARM code in Windows 11 for ARM64 exclusively 64-bit in anticipation of this.
Everything I've said so far is factual, not speculative. Here's my speculation, given all of that.
3. Every Apple SoC since the A11 Bionic has been 64-bit only - no 32-bit ARM instruction sets whatsoever. Apple started planning the switch from Intel to Apple Silicon roughly around the time they were removing 32-bit from their A-series SoCs, so their roadmap from that point to the M1, if not beyond, was clear to them. They knew that, by the time the Apple Silicon version of macOS and the first M1 Macs launched, there would be no 32-bit support. They could build Rosetta 2 to translate 64-bit Intel to 64-bit ARM. They could not build it to translate 32-bit Intel to 64-bit ARM. And 32-bit Intel to 32-bit ARM was not an option because, again, no 32-bit ARM instruction sets are present.
4. If you're Apple, are you going to keep letting Intel Macs run 32-bit apps while positioning Rosetta 2 as only being able to translate 64-bit apps, which, at that point, wouldn't have been that many? No. That'd be a PR nightmare for your Apple Silicon transition. The original Rosetta wasn't able to translate things specifically optimized for the PowerPC G5 and, to a lesser extent, the G4, but that didn't eliminate support for anywhere near as many apps as would've suddenly stopped working had Apple picked the Apple Silicon transition as the point in time to cut off 32-bit Intel app support. That would've been disastrous. So they cut it off one year earlier, with macOS Catalina. They warned developers for at least two years prior to Catalina that Mojave would be the last release to support 32-bit Intel apps, knowing full well that two releases later would be the first Apple Silicon macOS release. By that point in time, the number of 64-bit Intel apps was healthy, and no one would blame the lack of 32-bit Intel app support on Apple Silicon; that was Catalina's fault one year prior. Certainly, those who use commonly updated apps weren't affected at all. It's only those of us who use apps with limited or no support that got left out in the cold anyway.
That's why we lost 32-bit Intel support. It has nothing to do with "improvement without compromise". Yes, there's no compromise in using 64-bit x86 or 64-bit ARM over 32-bit. But clearly there are casualties of this. (And if you're ever unsure whether something on an Apple Silicon Mac is running natively or through Rosetta 2, there's a quick check, sketched below.)
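As an aside, Apple does document a way for a process to ask whether it's currently being run through Rosetta 2: the sysctl.proc_translated flag. A small Python sketch of that check, calling libc through ctypes - illustrative, macOS-only, and not anything Apple ships:

```python
import ctypes
import ctypes.util

# Load the C library; on macOS this resolves to libSystem.
libc = ctypes.CDLL(ctypes.util.find_library("c"))

def rosetta_status() -> str:
    """Query sysctl.proc_translated for this very process (macOS only)."""
    value = ctypes.c_int(0)
    size = ctypes.c_size_t(ctypes.sizeof(value))
    rc = libc.sysctlbyname(
        b"sysctl.proc_translated",
        ctypes.byref(value), ctypes.byref(size),
        None, 0,
    )
    if rc != 0:
        # The key doesn't exist on Intel Macs (or on older macOS releases).
        return "no translation layer in play"
    return "translated by Rosetta 2" if value.value == 1 else "running natively"

print(rosetta_status())
```

Run an x86_64 build of Python under Rosetta 2 on an Apple Silicon Mac and it reports "translated"; an arm64 build reports "running natively".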
Yet unless Apple dropped support for the former technology, many developers likely wouldn't update their software. Similarly, Apple made OS X Lion download- or USB drive-only, which helped prepare users for a future where DVD drives would no longer exist on their devices.
Lion being download- or USB-only didn't hasten the death of the optical drive. What killed it was that displays had moved well beyond standard definition (for DVD movies) and that software, particularly operating systems, was starting to bloat well beyond what you'd want to throw onto a DVD. Plus, you could still easily burn DVDs of both Lion and Mountain Lion; it wasn't until Mavericks that a bootable USB was your only option. And even now, macOS is too large to fit on a DVD - a single-layer DVD holds about 4.7 GB, while recent macOS installers run north of 12 GB. Apps are insane too. DVD drives stopped being practical. This wasn't something Apple hastened; it was simply the reality we were all headed towards.
Having 64-bit on all their platforms also ensures compatibility and improved security, and it helped with the adaptation of iOS from Mac OS.
The ability to run a 32-bit app has no bearing on whether the OS itself is 64-bit. macOS has been a full, top-to-bottom 64-bit OS since Mountain Lion. Similarly, Windows 11 is a 64-bit-only OS all the same. The ability to run 32-bit apps does not detract from this, nor does it carry any significant overhead - at least not as far as the software is concerned. Clearly, Apple dropped 32-bit instruction sets from their SoCs ahead of the rest of the ARM chip makers for a reason, and that might've resulted in more efficient hardware. But 32-bit x86 instruction sets were never getting dropped from Intel CPUs, so this is moot as far as Intel Macs are concerned.
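To make that concrete: bitness is a per-process property, not just an OS-wide one. A trivial check with Python's standard library (purely illustrative):

```python
import platform
import sys

# A 64-bit kernel can happily host 32-bit processes -- exactly what
# pre-Catalina macOS and today's 64-bit Windows both do. Bitness is a
# property of each process, not only of the OS underneath it.
arch = platform.machine()  # hardware/OS architecture, e.g. 'x86_64' or 'arm64'
bits = 64 if sys.maxsize > 2**32 else 32
print(f"OS reports {arch}; this interpreter is a {bits}-bit process")
```

A 32-bit build of Python on a 64-bit OS would report the 64-bit machine type alongside "32-bit process", which is the whole point.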
FireWire is a similar story: a technology that was once relevant and then no longer. I'm also in audio production and, honestly, when I come across 32-bit plugins I always look for an alternative, and more often than not something better crops up (and the developer keeps it up to date).
If you have money, or your needs aren't particularly high-end, you can get away with that. For higher-end stuff, replacements are often extremely expensive, and pivoting to something else isn't just something you're going to be willing to do. That's why a lot of people hold onto old Macs and old OSes, or just switch to Windows, where such deprecations won't totally screw them over.
I've learned my lesson about being an unpaid system beta-tester. (When Apple stops "updating" an OS is when I start considering using it.)
The problem with that approach is that Apple only puts out security updates for the most recent three versions of macOS (with the current one always getting the most attention). Running a version of macOS that isn't getting updates at all is about as bad of an idea as running a version of Windows that isn't getting Windows Updates anymore. Security vulnerabilities go unpatched, then get exploited, and then you become a target for hackers. Yes, it happens to Macs too.
That being said, the updates a macOS release gets when it's one version behind the current one are almost always just security-related. There's definitely merit to being consistently one release behind. But waiting until the OS is more than three years old? Bad idea, unless your Internet usage is sparing.