Indeed. I’ve moved away from some senior applications (as in age…) given feature bloat and a general desire to have nimble, clear-purpose applications ready to hand. For me that’s the real appeal of iOS; those ‘pro’ Mac apps that migrate all their features to their iPad versions tend not to excite me as much.
That does sum it up! I was in the industry from the ’80s until 2005 or so. I do still try to keep tabs on things, but it’s nowhere near as easy as it used to be. I feel like back then, when there was a software upgrade, it was truly an upgrade with good, important features. And not on a yearly cycle! They came out when they came out.
Now things are “upgraded” yearly, and it feels like a struggle to see which features I’d find useful. Every now and then I stumble on something in my phone and think, “How long has this been here? It’s great! I could have used this years ago!” Yet I remember an upgrade that touted new emoticons and think, “Seriously, that’s the most important thing in here? Not interested!”
In other news, the Fitness app, which I thought was a great idea, is still nearly useless for me. And Photo Shuffle on the lock screen still doesn’t give me any option other than Nature. It does pick up random new pix now and then, but I don’t have the settings that others do. Another thing I think is cool but is really more like “meh.”
And yes, I feel like a complete idiot when I can’t figure these out, considering my background. I have a 13 mini, bought new, and still have no idea how to do a hard reset on it. It usually takes me two or three tries to get the right buttons to turn it off (the buttons on that thing are REALLY awkward as well!).
As you suggest, I think we’re already there. From the perspective of an individual, the Internet is already infinite. As I’ve said elsewhere, I believe the only response is to focus on trusted sources, either publications with a sufficient track record or knowledgeable individuals.
For instance, the only movie reviews I regularly read are those from @khoi, who watches a LOT of movies.
Focusing on trusted sources is the only intelligent response, but most internet users gravitate toward convenient or familiar sources of information. The notion of “trusted” doesn’t even enter their heads, except for the very questionable belief that the familiar can be trusted. This is surely a recipe for the continued deterioration of the reliability of internet-based information. Alas! It’s hard to imagine where corrective measures could even come from. Surely not from AI, which is one of the culprits!
I’m reminded of a story I read (perhaps 40 years ago).
Some researcher decided to investigate whether the newfangled computers actually did save time. The end result was something like this (and I’m working from a decades-old memory).
An office worker now accomplishes in 20 hours what used to take the entire 40-hour week. But in order to accomplish those tasks so quickly, he (or she) spends 25 to 30 hours per week learning new software, installing new software, backing up work, recovering from crashes, and generally doing tasks that weren’t included in the original work week.
I have no idea what the methodology was or whether the writer’s tongue was in his or her cheek. In any event, it struck a chord with me.
As opposed to previous generations, where those extra 25-30 hours were spent changing the ribbon on the typewriter, getting a bottle of white-out that hadn’t solidified, calling someone and leaving a message, finding stamps for a letter, posting the letter, trying to track down various people to get answers to questions, etc., etc.
The most relevant comparison I have is that artists are to reviewers as birds are to ornithologists, and why should the birds give a rat’s ass about the ornithologists…
I saw an article some years ago about changes in the preparation of what I believe were architectural proposals. Going from paper and paste-up to computer printing and layout made things much more efficient. However, those efficiency gains were offset by increases in expectations: proposals were expected to be longer, in greater detail, with significantly improved color graphic elements. If you just kept the proposals the same as they had been before, you’d be rejected in favor of competitors who had made improvements.
The conclusion, as I remember it, was that the preparation of the proposal involved an expected commitment of time and energy, and computerization didn’t change the amount of time and energy that the client expected the firm to put into preparing the proposal.
I would argue there’s a far greater cost to increased complexity: reduced security and increased fragility. A maxim in software engineering is that the more complex a system is, the more likely it is to have security vulnerabilities. In addition, the occurrence of bugs often increases nearly exponentially.
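A back-of-the-envelope illustration (my own, not from any formal study): if every pair of components in a system can potentially interact, the number of interactions to reason about grows roughly with the square of the component count. Ten components give 45 possible pairings; a hundred give 4,950. Every one of those pairings is another place for a bug or a vulnerability to hide.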
The article “Taking Computers to Task” in the July 1997 issue of Scientific American has some good advice for corporate managers. Some interesting quotes: “… much of the time saved by automation is frittered away by software that is unnecessarily difficult, unpredictable and inefficient”. “What puzzles economists is that productivity growth … has fallen precipitously in the past 30 years from an average of 4.5% in the 1960s to a rate of 1.5% in recent years”. “A typical desktop PC carries a price tag of about $3,000 in the US or about $1,000 per year … but the average annual bill is more like $13,000”. (This involved $1,730 for software, $3,510 for support for each user, $1,170 for network support and $5,590 for “futzing” by workers, i.e. non-productive time such as waiting for software to load, fixing problems, waiting for help, rearranging files and playing games: an average of 5.1 hours per week! Add the roughly $1,000 per year for the machine itself, and those components do sum to about $13,000.) “[Workers typically] take 4 to 10% of their time to help co-workers solve computer problems … this hidden support lofts the total annual cost to about $23,000”.
Maybe. But an artist who isn’t (yet?) famous needs good reviews in order to sell enough product to make a living. Those who don’t pander to the critics often never get discovered, even if audiences like the work.
Yes. Which is why there is continuous R&D on programming languages and standard libraries to simplify development (by replacing commonly duplicated functions with built-in library code that is thoroughly trusted and generally reliable).
I do a lot of development in C++ using quite a lot of standard and add-on libraries. The systems I work on are pretty complex. I would not want to attempt to develop the same code in an older non-object-oriented language like C (or, in older days, Pascal, FORTRAN, or assembly language), because I’d need to reinvent, develop, and test workalike libraries just to keep the rest of the app manageable.
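To give a deliberately contrived sketch of what I mean (this isn’t from any real project, just an illustration): counting and ranking word frequencies in C++ leans on a hash map, a growable array, string handling, and a generic sort, each of which would be a hand-written, hand-tested library in plain C.

```cpp
// Contrived example: count words on stdin, print them by frequency.
// Every building block used here (hash map, dynamic array, string,
// sort) would have to be hand-rolled and tested in plain C.
#include <algorithm>
#include <iostream>
#include <string>
#include <unordered_map>
#include <utility>
#include <vector>

int main() {
    std::unordered_map<std::string, int> counts;
    std::string word;
    while (std::cin >> word)  // whitespace tokenizing comes for free
        ++counts[word];

    // Move the tallies into a vector so they can be sorted.
    std::vector<std::pair<std::string, int>> ranked(counts.begin(), counts.end());
    std::sort(ranked.begin(), ranked.end(),
              [](const auto& a, const auto& b) { return a.second > b.second; });

    for (const auto& [w, n] : ranked)
        std::cout << n << '\t' << w << '\n';
    return 0;
}
```

None of that is exotic; the point is just how much trusted machinery the language brings along before you write a single line of application logic.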
This is one of the reasons why Apple pushes languages like Swift for their development. It’s not just because it’s the shiny new object (although I’m sure there is some of that), but because it includes a very robust set of language features and standard library functions that reduce the amount of complexity a developer needs to deal with.
Bugs and security holes can still happen, and they occasionally happen in those standard libraries, but they’re easier to fix if the overall system is modular enough to (for example) replace one shared library instead of forcing developers to rebuild and rerelease all their products, which was the case in the not-too-distant past.
It’s unclear if the critics serve anyone other than their publishers. But an artist needs to serve the critics as well as the audience, because bad press or simply the absence of any press can be a career ender.
Let me be clearer. All artists who have moved beyond their parents’ garage will have some kind of an audience. But without good press, it will be very hard to grow that audience large enough to be financially successful, no matter what the existing (limited) audience thinks of the work.
Again, this should not be surprising. We’ve seen all kinds of examples of “garage bands” who release a few albums (possibly self-published) to a small but highly appreciative audience. Until one song happens to become popular and get radio play, and then everybody wants to buy those back-catalog titles.
Ignoring critics is a double-edged sword. Yes, it allows you to remain independent, but it also may result in quite a lot of time elapsing before you can develop a large audience, because of bad (or no) press.
No, it’s clear that the critics serve the consumers. Publishers wouldn’t publish something that doesn’t draw an audience.
Well, yes. Artists can always produce their art. If they want to do anything more with it they have to engage with the larger world. I’m not sure why you’re blaming the audience or critics for that.
Schmoozers and ass-kissers tend to be the most successful people in almost every field, not only in the arts. We are taught to believe that talent and hard work are what it takes to succeed; but more often than not, this is not the way the world works at all.
I think we’re saying the same thing in different ways as security and robustness are simply aspects of a system’s overall integrity. The point here is that increasing complexity is actually the enemy of progress, not its facilitator. That’s how it looks to me anyway.
I guess there are a few things about critics I’ve observed over the years in the arts. They’re probably pretty obvious to most:

- A review is one person’s opinion.
- Reviewers sit on an ability/interest spectrum of their own. Often a bad review is actually bad reviewing: it has failed to grasp the piece at all.
So I don’t read them. If I want one person’s opinion on a work, I’ll tend to take mine. I don’t read marketing guff either; I like to encounter work as unencumbered as possible. I don’t read labels in art shows until after I’ve looked at the work, and I don’t read interviews with artists; I like to let the work speak to me. The rest I can do without.
One issue (attempting to get back on topic…) with the rise of the Web, and a change that it brought, is that reviews are now forever. The monthly magazine or the ‘review of the year’ specials in the newspapers used to be about as long as an opinion would survive; now, reviews live forever. It’s one reason I wish reviewers were more aware of their responsibility to be good at their job.
The way I relate to this is in my field of graphic design. I asked a friend who worked at a print shop what it was like before desktop publishing, and she told me there was a company in town that only set type. She’d send them a piece of paper with the text she wanted, along with exact specifications of the type: font, size, column width, leading, etc. She had to indicate if certain words were to be bold or italic. A day or two later she’d get the typeset text back, and if she’d calculated wrong and it didn’t fit, she had to send it back and start over. Just getting a few lines of text typeset could take a week with all the back and forth! Once she had the final type ready, she had to cut it and paste it onto a layout, get the client to approve it, and then make printing plates.
At the time, I was using PageMaker to do all that myself in minutes and bringing her laser printed artwork she could make printing plates from. The next year she bought a Mac and within a year after that the company that only did type was gone.
While vastly faster and better in so many ways, this change had the disadvantage of pushing the typographers’ work onto graphic designers. Type used to be a specialty, and an expert would know which typeface was best; it’s a vast field. Suddenly non-experts were just selecting fonts from menus, and a lot of that knowledge was lost (or had to be relearned by designers). Today we take that for granted, but it was a huge change, and I still see a lot of horrible typography that makes me cringe.