As Hardware Becomes Ever More Impressive, Software Suffers Rough Edges

Interesting article. I agree with many points. What I’d love to see is more research into WHY software quality has declined. For my comments, I’d restrict the discussion to Macintosh, iOS or Web-based software. Some guesses include:

  • Poorly defined requirements - no one took the time to design the UI/UX
  • Inadequate testing - no one tested what happens when people do obviously incorrect actions
  • “Agile” development, which has given developers an excuse to build without quality – “we’ll just fix it in the next sprint…”
  • Software development no longer is seen as a “craft.” Fewer developers truly understand WHY documents like the original Human Interface Guidelines (HIG) existed.

My favorite example is software (usually web sites) that asks for a phone number and can’t automatically process dashes or parentheses, stripping them out as needed before storage – instead there’s a text box telling the user not to enter dashes, etc. It used to be that no software developer worth their salt would release code that didn’t sanitize input before storage. Now it seems common. I’d guess that if you asked the individual “coder,” they’d say, “no one told me to do it…”
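The normalization being described is genuinely a one-liner. As a minimal sketch (the function name and pattern are illustrative, not taken from any particular site):

```python
import re

def normalize_phone(raw: str) -> str:
    """Strip parentheses, dashes, dots, and spaces before storage,
    instead of telling the user not to type them."""
    return re.sub(r"[^\d+]", "", raw)  # keep digits and any leading +

normalize_phone("(212) 555-0100")    # "2125550100"
normalize_phone("+1 212-555-0100")   # "+12125550100"
```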

2 Likes

That is a management failure, not the coder. The coder works to the standards set by management.

motd:
Those who ignore history are condemned to build bad software.

I really miss Steve.

1 Like

That’s a great question. Some off-the-top-of-my-head possibilities along with those you listed:

  • Time to market. Everything moves much, much faster than it used to, which compresses the dev cycle such that there may not be enough time to apply the polish.

  • Lack of strong leadership from the top. Apart from a few outlying examples, none of which were in core apps, Apple generally abided by its own guidelines, which encouraged the majority of developers to do so as well. When developers deviated, as in the canonical example of Kai’s Power Tools, it was a notable move, but not one that caught on at the time.

  • For many years, there was an established playground for those who didn’t want to make a “real” Mac app: HyperCard. HyperCard stack interfaces were often weird and wacky, but they seldom splashed over into the world of commercial productivity software.

  • Short attention span for users. With most apps (particularly iOS) being free or cheap, users don’t need to invest money, and if they’re not investing money, they’re less likely to invest time. If you’re only using an app every so often for a short time, you don’t care as much if it’s weird or poorly coded.

  • Short attention span for developers. Developers have a tough time because if they don’t invest quite a lot of time and effort into making their app compelling at first glance, it’s unlikely to stand a chance with the short attention span of users. But if they don’t have the resources to make that investment, the app will be more likely to be weird or bad. And if the app isn’t an immediate success, developers may not be able to afford the time and money to polish for a 2.0 or even a 3.0. Lots of apps used to require multiple major revisions before they became classics.

  • Less public beta testing. It’s just harder to do these days, what with profiles and App Store codes and whatnot, and I see less of it.

  • Less connection between developers and users. In the past, it was more likely that users would be able to meet a developer at Macworld or would exchange email with the person who did the work. As the markets have grown, that connection has become weaker, such that developers have less of an idea of what users want, and users have less of a sense that they could influence the direction of development.

7 Likes

Short attention span for users. With most apps (particularly iOS) being free or cheap, users don’t need to invest money, and if they’re not investing money, they’re less likely to invest time. If you’re only using an app every so often for a short time, you don’t care as much if it’s weird or poorly coded.

This resonates. Fewer people care, so fewer people complain.

2 Likes

All of what @ace said… The nature of attention has definitely shifted.

I saw a piece by a French artist in a gallery in Lisbon last December. Part of her work resonated in particular: she was visualising the early Internet. She said that the slope of the Web had changed in the recent past. She posited that there was a gentler slope to the Web earlier, that you meandered around the Web at a slower pace, visited more places, came across things that were unexpected. Now that slope is much steeper; people are propelled, ever faster, funnelled into fewer and fewer spots.

This crunching of time seems to have become a dominant tendency. Space was overcome long ago; now time is being vanquished, and everything is instant. It has a huge impact not just on the user but also on the developer. Attention certainly is one casualty, and with it perhaps due care or dreaming is set aside for quick hits and fixes.

4 Likes

I can add a few more to what Adam said:

  • The advent of the web. Web interfaces are notoriously awful, there are few standards, and the underlying tech keeps changing. The result is a whole generation of users and developers who don’t know good interface design, and that translates into poor apps.

  • Quantity of users. It used to be there was a small pool of users (a few hundred million worldwide) so software cost a lot and users were picky. Now if your product only has a few million users it could be considered a flop. This has created a strange market where giant companies chase billions of users and give away stuff for “free” hoping to monetize some other way; smaller developers don’t have the resources to compete, especially when it comes to marketing their app.

  • The “default” standards of an app are very high now. I don’t mean quality, but just the basics of what an app needs to do. In ancient days, like under a command-line OS like DOS, your app needed to do its thing and be able to save and open files. That meant the developer could concentrate on the program’s functionality. Nowadays apps have to support a million technologies, from drag-and-drop, copy/paste, undo/redo, internet/cloud, etc. Much of that is provided by the OS, but there’s still overhead to support those things (even just in testing), and while the tools to build apps have gotten better, the complexity underneath is horrendous and making even a simple app requires a huge amount of work – especially if the developer is concerned about quality.

  • The vast number of apps. With so many millions of crap apps out there, average users can’t tell the difference. The problem with that is there is no incentive for the good developers to make their apps better as the marketplace doesn’t reward them. People won’t pay extra for an app with undo support, for instance. They download apps based on sketchy marketing schemes and never see the good apps. Eventually those good app developers either stop making apps or start making crap apps.

  • Larger development teams. Software used to be created by individuals or tiny teams of a few people. They could communicate with each other and make things better. These days software is created by committee with dozens or hundreds of developers. Small “insignificant” interface issues, visual bugs, rare bugs, etc. are low-priority.

  • Education. Programming used to be an esoteric craft, and those involved were geeks who were really very good at it. Now it’s mainstream, but the vast majority aren’t any good. They don’t have the passion for it. It’s just a job, a career they got into for the money or notoriety. The advent of website programming has also impacted this, as so many who “code” in HTML or JavaScript think they can also make apps.

  • Speed of change. I personally hate web development more than anything, because the tech changes every five minutes. By the time I learn HTML, there’s CSS. I learn that and there’s Javascript, then JQuery, then PHP, and then whatever the new flavor of the month is for development now. The whole programming world is like that, and while usually the new languages and tools have definite benefits, there is no “deep knowledge” or “20 years of experience” that a traditional developer had. Even a current top programmer who has been doing stuff for 20-30 years can’t use many of the skills from 20 or 10 years ago because everything is new. The tools are all different, the languages are different, the APIs have all changed. This means the developer is spending half their time learning new stuff and that’s time they used to spend polishing their app. This trend is slowing slightly, but not enough, and the modern generation is going to be addicted to the high you get from “new stuff” and is always looking for the next big thing instead of just using the tools they have. Apple and other hardware companies have certainly contributed to this attitude with annual release schedules and an emphasis on “new.”

I’m sure there’s a lot more, but these are a few of the “big picture” aspects of software development that come to mind. I’m not really sure I see a fix, either. Bad software is the new normal. I’m not sure it’s “bad” in the sense of losing data or not functioning as intended (i.e. doing its job), but it is bad in the sense of being harder to use, buggy, and annoying. The vast majority of users don’t use the app enough to notice (or don’t have the taste to notice), so it’s only power-users and experienced software users who complain, and they’re a minority. So nothing improves.

5 Likes

I’d add to the list outsourcing.

In my career, the US developers (especially those who grew up in high-tech regions like Silicon Valley) were very very good and paid great attention to detail.

But there aren’t enough of these people to go around and they command very high salaries. So companies keep their best people to work on architecture and high-level design, giving all of the actual code-writing to outsourced teams in low-cost countries. These teams may have good programmers, but they didn’t design the software, they have limited contact with the people who did design it, and they are often paid with fixed-price contracts.

So it’s not in their best interest to ship a fully-polished application. They make the most profit by shipping as soon as possible. Once the app passes its acceptance criteria (based on the letter of the contract, not based on if anybody on the team is satisfied with the results), it is delivered and everybody moves on to a new project.

You can get quality from an outsourced team, but it requires very good management that can be directly involved in the team’s daily work. This is very uncommon: it’s a lot of work most managers are not trained to do, it involves working odd hours (to sync with foreign time zones), and manager salaries are high, cutting into the cost savings used to justify the outsourcing.

5 Likes

I’ve not ever published on the Apple App Stores, but my understanding is that they intentionally make it nigh impossible for users to communicate with developers and vice versa.

I find communicating with users to be a great source of satisfaction when doing development. I think my experience as a developer would be impoverished in the world of app-store publishing, and likely the quality of my work would suffer.

2 Likes

I’m fortunate enough to make a living in academia, so I can’t offer first-hand experience. But I have a wife who works for a medium-sized software engineering company in Silicon Valley that makes devices in a highly regulated, safety-critical market segment. One thing she keeps telling me interferes with getting good software to market is the combination of Agile workflow and MVP (minimum viable product).

Basically, very early on in the planning phase of a new product the higher-ups decide what the MVP is. It is always low-balled so as to make sure the product managers don’t have to ask for many resources. It makes the product look less expensive and risky, and thereby promises much more revenue. That gets the top brass and board sold on the idea. But the MVP itself is a crappy product because a once great idea has now been stripped of all that makes it sexy and interesting. It’s really just a skeleton of a great idea. But anyway, the board was sold on that so that’s now what’s gonna happen.

The teams get to work on the MVP and start developing a very rough and crude product. It’s constantly reviewed and demoed (Agile), which creates this new normal of “that’s broken, but we’ll fix it later.” People can always ‘demo’ something, but they never ‘demo’ the real functionality, because some parts are always sufficiently broken that you can never really demo the whole workflow. The engineering teams are working beyond full capacity and pulling all-nighters to meet deadlines set by people who haven’t written a line of code in 20 years but are very afraid of a boardroom full of angry old dudes with short tempers.

At some point the MVP is ‘ready’ and testing starts, along with all the certification for the highly regulated market these products are used in. During this time the coders wind down and start to have time again to polish the product and turn the bare MVP into something you might enjoy using. By the time regulators have approved the product and the company is ready to go to market, the engineers have meanwhile managed to fix a lot of shortcomings in the MVP, and suddenly there’s a large backlog of useful additions and nice-to-have enhancements, along with a lot of bug fixes, just waiting to be added.

Sales people have already told their clients about all the great new stuff, and the pressure is on not to bring the MVP to market as planned (and approved by the federal agency), but to incorporate as many of the latest new features and bug fixes as possible. Management then needs to convince themselves that the changes don’t warrant opening an entire new regulation cycle, but that they’re nevertheless crucial product improvements and therefore important enough to warrant a delay. With the delay now settled, the pressure is again on the coders to work overtime, because they’re now the ones delaying the launch. Rinse, wash, repeat.

From hearing all these stories, what seems to be missing is leadership that a) understands what it takes to make good code and how much effort is realistically required, b) has enough chutzpah to require certain usability/features right from the start rather than pushing some half-baked MVP, and c) has the stomach to put its foot down when it counts and not rush a product to market just because that’s what a completely unrealistic plan somebody came up with three years ago says.

I would really like to learn what it is we users and consumers can do to reward good coding and software, and create incentives for companies and software devs to put in the extra effort to actually make good software that’s polished and fun to use. Paying for good software and avoiding crap is obvious. But what else?

5 Likes

I feel a bit silly, after all the preceding bullet points, posting a whine. But here goes.

Yeah, except when they don’t even have the text box. There is just an error message if the user doesn’t enter information (phone number, date, nine-digit ZIP code) in the format the programmer expected: “Please enter a proper…” In another thread, someone mentioned perfectly valid email addresses being rejected because the programmer didn’t anticipate something properly.
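The email case is a classic instance of the same failure: an overly strict pattern rejects addresses that are perfectly legal under RFC 5321/5322. A hedged sketch (both patterns are illustrative, not drawn from any actual site):

```python
import re

# Overly strict: rejects legal local parts like o'brien+news,
# multi-level domains, and any TLD it didn't anticipate.
STRICT = re.compile(r"^[a-z0-9]+@[a-z0-9]+\.(com|org|net)$")

# Permissive structural check: one @, no whitespace, a dot in the
# domain. Real validation is better deferred to a confirmation email
# than attempted by guessing at the RFC grammar.
PERMISSIVE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

addr = "o'brien+news@example.co.uk"  # valid, but STRICT rejects it
STRICT.match(addr)       # None
PERMISSIVE.match(addr)   # matches
```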

Alright, back to useful comments.

I’m not sure that I buy all of this. Is software really that bad? It’s easy to pick out anecdotes of bad design and failure, but perhaps it’s also easy to forget how limited software/apps were in the past compared with now. I almost feel that this is a little like Louis CK’s “Everything is amazing and nobody is happy.” I realize that things can get better, but, really, so can hardware. Intel’s chips, we learn, are plagued with speculative-execution vulnerabilities (albeit without real-world exploits, but is that really as great as it can get?). Apple just spent a couple of years in which every notebook it sold had a flaky keyboard. It is 2020 and Apple still sells no cellular-connected Macs, notebooks especially. Are we sure that hardware is as good as it can be? Are we sure that software is really worse than it was before – or is it possible that some things are worse but some (many?) things are better?

How do we measure those things objectively?

2 Likes

I think your wife has it pegged. Not all companies are like this, but it does seem to be a common factor of the biggest (and therefore most bureaucratic) ones.

There’s nothing inherently wrong with Agile and MVP as such, but these concepts need to be supported by a management team that understands them and their ramifications. Managers who see them as an excuse to sacrifice quality in order to meet impossible schedules are the real problem. And those managers run the biggest and most bureaucratic corporations.

3 Likes

I really miss Steve.

3 Likes

Regarding software, I would be happy with two fundamental changes, both back to basics:

  1. Software developers (especially within Apple) should follow Apple’s original human interface guidelines; and

  2. Remove the Apple security that shifts control of computers from the owner/purchaser/buyer/user to corporate Apple. Minimally, let us install anything that we want on OUR computers! Remember when we would simply download or copy software and then run it?

Er, I do that regularly today.

2 Likes

Why do we have to type in our password to install applications in our Utilities folder but not our Applications folder?

Why do we need to approve running an application we’ve installed the first time we run it (yes, I know it’s intended to address viruses)?

Why are we prohibited from installing hacks in our system such as SIMBL without increasingly sophisticated deep hacking?

Why can’t we change default icons that are now under lock and guard in the System folder? Remember when we could actually open and change items in the System file with ResEdit?

Why the insistence on sandboxing applications with heavy pressure (to the unknowing) to purchase applications only on the App Store instead of the wide open internet?

In other words, why doesn’t Apple trust us anymore? Perhaps during a system installation, we should be asked just once if we are expert users and want to strip away all of these “protections”?

1 Like

I think in part it’s because Apple wants to be a consumer electronics company, IOW any idiot should be able to use a Mac without ever getting into trouble despite never having read a single readme or caring and taking responsibility for his/her security. It feels a bit to me lately like macOS has become the Simple Finder version of the former OS X. I realize a lot of this was done because of security concerns, but it would be nice if there were some more granularity for those of us who know what they’re doing. I’m no big fan of the locked down sandbox. I want to be able to continue using my Mac for work.

1 Like

Precisely. Apple wants iPhone users to purchase Macs, so it dumbed down and locked up the Mac system. In the process, they’re making Macs more like Windows machines. It’s so sad.

If Apple didn’t trust you, you wouldn’t be able to do most of the things you listed.

3 Likes

I’m not sure if these are real questions or rhetorical. Just in case they are real…

Sounds like a matter of permissions. What do you see on your system? Here’s what I see:

$ ls -ld /Applications /Applications/Utilities
drwxrwxr-x+ 92 root  admin  3128 Jun 11 12:43 /Applications/
drwxr-xr-x+ 28 root  admin   952 Jun  9 09:47 /Applications/Utilities/

Both folders are owned by root:admin, but note the permissions. /Applications is group-writable, so anyone in the admin group (that is, any account you configured to be an administrator) can copy files there. /Applications/Utilities is not, so everybody must authenticate before copying files there.

I don’t know why Apple did that. Probably to make it easier for administrators to install software.

If you don’t normally log in to an admin account, which is a very good practice, then it doesn’t matter - you’ll be asked for an admin user name and password before copying files to either location.

Because the system doesn’t know if you deliberately installed it or if it got installed through some other means (e.g. malware). By making you confirm your intent, you have the chance to abort launching something that you didn’t install.

In the Classic MacOS world, some of the worst instability problems were the result of people installing “haxies”, including APE and SIMBL. Either because the extension was buggy, or because it interacted badly with other haxies that were all installed and running together.

Even if you know exactly what all the risks are, most people installing these did not (and still don’t). They’ll just read random forum posts by people advocating various add-ons and install stuff they don’t understand. And when the system becomes flaky as a result, they blame Apple, who has to spend a lot of tech support money to figure out that the user’s problem was self-inflicted.

Yeah, I’d also like to customize these things. But Apple has always been this way. They don’t want users changing the look and feel of the system, and they definitely go overboard with things like this. But this is nothing new.

Using new system permissions to block access might be recent, but as far back as System 7, Apple was known for shipping system updates that would revert any such changes, forcing you to redo it all after every update.

I’m not sure I know what you’re talking about. You can still download and install apps from any source. Apps need to be signed and notarized in order to launch the first time with a simple double-click, but that’s hardly a heavy pressure thing.

The App Store is so popular because it is easier to distribute software that way than through other means. But there are plenty of publishers who distribute their software in other ways as well.

And how many non-experts will say “yes” to that question and then go crying to Apple’s tech support when they make a complete mess of their computer? Probably everybody.

This is as much an issue of reducing support costs as it is anything else.

3 Likes