macOS 10.15 & 32-bit Apps

Thanks, but still no joy. I'm completely new to Fusion and VMs and can't seem to find complete answers on the Fusion website or via web search. I keep getting an “internal error” message. I'm trying to create a virtual machine by dropping a .dmg of my Rosetta system onto the Fusion screen… it accepts the image, lets me choose the OS (10.6 Server), and opens a new window with a start arrow, but then pops up “internal error”. I must be missing something. Running the latest version of Fusion on High Sierra, on a new iMac (2018). Any other suggestions?

romad
Dennis Swaney

    May 29

Apple never should have gone with Intel in the first place. Dumbest thing Jobs did. The best iMac I have ever owned (and still do) is my iMac G5 ALS.

It wasn’t exactly Jobs’ decision. Motorola had already stopped manufacturing RISC chips. Apple was the only remaining customer of the only remaining PowerPC manufacturer, IBM, but IBM wanted out of the chip business altogether and refused to develop the chips further. And at the time, Intel chips were already light years ahead of IBM’s, and Windows was close to obliterating the Mac. Without OS X, it’s extremely likely it would have.

Personally, I was totally furious when Apple announced the switch to IBM chips. I had just bought my 9600 about 2-3 months before Jobs announced OS X would not run on anything with a PowerPC chip. Though I did, and still do, love the 9600, I would probably have equally loved a Mac that could run the latest and greatest upgrades and new apps.

Currently, Intel’s chips are reportedly holding Mac development back. Some big-name PC makers, including HP, LG, Samsung, Lenovo, Sony, etc., have been using ARM processors successfully. And Microsoft has been promoting Windows 10 on ARM to developers:

https://docs.microsoft.com/en-us/windows/arm/

If Apple can build a better processor than Intel, then IMHO, more power to them. Apple’s homegrown ARM chips have been performing beautifully in iPhones and iPads, and I’m sure the transition to Macs will go as successfully as the switch from IBM to Intel.

Not the first stupid thing Apple has done in the name of ‘progress’; lest we forget its dropping all support for fax modems, resulting in third-party efax services cleaning up on Mac users who still need to fax. Let us also not forget that the current Mac Pro desktop, being a closed system with no internal expansion capability, is best served as a piece of art in a museum.

By the way, based on what I have read and been told, if you are planning to install macOS 10.15 or higher on a ‘cheesegrater’ Mac Pro (Mac Pro 5,1), don’t bother; from my understanding it is a total waste of money, unless you have the expertise and willingness to hack it if someone figures out how. If the information I have received proves correct, Apple is configuring 10.15 in a way that will not permit installing it on any ‘cheesegrater’, even if you install a Metal-capable graphics card, SSDs, and upgraded processors. I suspect they are trying to create a market for the new machines and overcome user resistance to handing over the big bucks to replace the old ones. The new Mac Pro, supposedly coming out this year, has a reported base price of $6,000.00 and is also reported to be a closed system.

Funny, my iMac G3 ran up to OS X.3.9 (Panther), my G4 iMac ran up to OS X.4.11 (Tiger), my G5 iMac ALS ran up to OS X.5.8 (Leopard), and my G4 PowerBook also ran up to Leopard. OS X.6 (Snow Leopard) was the first version that would not run on any PowerPC chip. What chip was in your 9600, a G2 or earlier?


It was a PowerPC 604e “Mach 5,” and the last Mac built with a PowerPC chip. It’s got a place of honor near my desk.

https://everymac.com/systems/apple/powermac/specs/powermac_9600_350.html

No. PowerPCs went on as the G3, G4, and G5 in Mac land. You might have had the last manufactured by Moto, but it was in no way the end of the PPC era. Just as @romad pointed out, OS X ran just fine on all those PowerPCs that came after yours.

First was the transition from 68k to PowerPC processors, handled by compiling fat binaries.


Jeez… you’re right! I forgot that my beloved Cheese Grater was a PowerPC. And it was the one I’m still p.o.’d about: plunking down a big chunk of change and learning a few weeks later that it wouldn’t run OS X.

If past system upgrades are any indication, no applications will actually be deleted. They may be moved to a special area or simply left as-is with some giant X over the app icon showing you that it will no longer run.

As a long-time Apple developer with lots of in-house software that is still 32-bit, I too dread the next macOS. Of course, this is far from the first time Apple has rocked the boat in its history. (We’ve had 68K to PPC, the iMac hardware shake-up, Mac OS X, and the Intel move to name a few.)

What is different this time is that Apple has not provided a software migration path for legacy apps to go 64-bit. Most 32-bit apps still in circulation were probably written using the Carbon framework, which is now a dead end. So the bad news is that I seriously doubt you will see many 32-bit apps go 64-bit, because it can’t be done without a total rewrite.

If you have such apps that are critical to your daily routine, maybe virtualization can save the day? I really don’t know. For my part, I will probably invest some effort first in rewriting the apps that talk to older hardware, since those are the least likely to work in a sandboxed OS. (We develop custom instrumentation that syncs with the Mac and have to support multiple generations going back several decades.) Anyway, ugh!


I’m not a Mac programmer but my impression was Apple made it clear for a long time that Carbon had no future; it updated other frameworks to support 64-bit in 2007, it deprecated Carbon in 2012, and almost two years ago it started warning about running 32-bit applications. I don’t know what opportunities they missed to make it easier to switch from Carbon to Cocoa but they’ve made it clear that one should do so through their actions; I think at WWDCs and elsewhere they were explicit about the need to switch.

Right you are! I had that “fat binary” term running through my head as I was writing, but couldn’t pull it out enough to remember the whole switch from 68k to PowerPC. (The “fat binary” apps included both 68k and PowerPC code in the same file and could thus run on Macs using either chip.)
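(One nitpick, if I remember right: the 68k/PowerPC fat binaries were CFM/resource-fork based, while the later PowerPC/Intel “universal binaries” used the Mach-O fat format. That fat header is simple enough to parse by hand; a rough Python sketch, with cpu_type_t values taken from Apple’s <mach/machine.h> and everything else my own naming:)

```python
import struct

# A few common cpu_type_t values from Apple's <mach/machine.h>
CPU_NAMES = {7: 'i386', 0x01000007: 'x86_64', 18: 'ppc', 0x01000012: 'ppc64'}

def fat_archs(data):
    """List the architecture slices in a fat (universal) Mach-O header.

    `data` is the raw bytes of the file. The header is big-endian:
    a magic word, a slice count, then one 20-byte fat_arch per slice."""
    magic, nfat = struct.unpack_from('>II', data, 0)
    if magic != 0xCAFEBABE:                      # FAT_MAGIC
        raise ValueError('not a fat binary')
    archs = []
    for i in range(nfat):
        cputype, _subtype, _offset, _size, _align = \
            struct.unpack_from('>5I', data, 8 + 20 * i)
        archs.append(CPU_NAMES.get(cputype, hex(cputype)))
    return archs
```

A PowerPC/Intel universal app from that era would typically report ['ppc', 'i386'].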

True enough. I’m just saying it’s unlikely many Carbon apps will get ported to Cocoa because of the effort that entails. It’s comparable to porting your app to a different operating system. In some ways, it’s even worse. I think most will be abandoned, so I wouldn’t hold my breath waiting for 64-bit versions of your beloved 32-bit apps to materialize.

To the best of my knowledge, both Fusion and Parallels have specific code that blocks non-Server versions of SL; for a short time you could run plain SL under 4.1, then using the fake .plist trick under 4.2, but even that got coded out as of (approximately) v4.3 of Fusion and the comparable version of Parallels.

Supposedly there was still a hack after that, but all forum how-to articles, especially those on the vendors’ own forums, were quickly scrubbed; even threads on places like MacRumors were scrubbed because it was considered highly illegal hacking, and at that time Apple was really going after any license violations.

Today’s Apple seems less concerned: it hasn’t made any serious crackdowns on Hackintosh users, or on the very public efforts to keep everything up to Mojave running on “unsupported” 2007-2008 and newer Macs.

There are still how-to articles and forum threads for open-source VMs like Virtual Disk; they aren’t super user-friendly to set up, but you seem sharp enough that you could get plain SL running in one of those. AFAIK, the plain trick still works in VirtualBox — but after being acquired by Oracle, they, too, may have bowed to Apple’s original license restrictions.

Here’s a method with Windows as host, but it should still work:

This video has specific instructions for VMWare in the description, along with VirtualBox:

Let us know, please, if you have success.

Otherwise, I spotted an SL Server disk set on eBay for ~$60; you can install it fresh, then just use Migration Assistant (or a manual drag and drop) to absorb your SL image, or you can first do a dirty SLS system reinstallation on top of your SL, then import directly to Fusion.

HTH

F


I’m running Mojave with zero issues on my Mid-2010 iMac and a few other “unsupported” Macs using DOSDude’s Installer; it’s super easy to use, and now includes an auto-update for revised patches so you don’t have to do a boot dance every other time macOS updates and breaks something (which is also fairly rare).

http://dosdude1.com/software.html

My beloved 2008 Mac Pro can also run Mojave, but the Metal-supported Radeon cards (that are affordable) do not support my full array of 4K displays, thus I stalled at High Sierra so I can keep my nVidia card which will drive four 4K displays at 60Hz.

Back on topic, I have Mac OS and macOS versions in VMs back to SLS for older software and testing our own development builds.

Cheers

F

The number of warnings I get that some developer needs to update their code is really frightening… But it is also true that many items in my Applications folder have gotten red X’s over the years already. Most of the time this poses no problem, but still, quite often I want to open some file only to discover that the application that created it no longer functions. What I’m doing can’t be compared to NASA, but like them, I now have quite a bit of inaccessible data. The 32-to-64-bit transition will only aggravate that.

For what it’s worth, I have a VMware Fusion VM I made from my old Snow Leopard (client) startup disk several years ago, and ‘converted’ to Snow Leopard Server using the plist trick, and it still runs just as well as ever in VMware Fusion 7.1.3 on High Sierra (I haven’t yet gone to the point of upgrading to a newer version of Fusion than that).

I don’t remember doing anything other than the .plist trick to get it working in the first place. So either that trick still works, or newer versions of VMware Fusion are happy to accept VMs using that trick that have been ‘set up’ using earlier versions?

Does anyone still have the details of said plist trick? I still have a copy of 4.1 and SLC; and though I personally have SLS, I’m curious whether I can solve this for others in this dilemma who need to cling to, e.g., Adobe CS3-4.

The steps I have noted down are:

  1. Go to /System/Library/CoreServices.
  2. Make a copy of the file SystemVersion.plist and call the copy ServerVersion.plist.
  3. Open the new ServerVersion.plist file in your property list editor of choice (e.g. I use BBEdit for this) and change the one reference to Mac OS X so it instead says Mac OS X Server.

But I don’t have a source recorded for these steps. A quick Internet search turns up a few links from a similar time period to my notes, including:

…though these mostly focus on installing a fresh copy of Snow Leopard, not tweaking an image you already have of a normal Snow Leopard installation so it will work in a VM.

Especially if the developer has passed away (R.I.P, Hardy) or gone out of business.

Or just lost interest, anywhere from a personal level all the way up to a boardroom decision.

TidBITS is where I see a lot of people getting upset about rental software. But in today’s world, how can any developer without an ongoing revenue stream keep rewriting apps for changing OS requirements?

I think the complaining has decreased over the last few years, but all kinds of people have seemed really upset that they don’t get updates and support forever. And there’s a big pile of software where that was promised, and at some point the economics of it ran out the clock.