I totally agree. The pace of change in today’s computer industry is over-the-top and out of control, imho. I’m at the age where, despite being a devoted Apple customer since the very beginning, it’s starting to feel like way too much work to keep up the way I used to.
I don’t think the pace is much faster than it was in the 80’s and 90’s. But my life has changed quite a lot.
In the 80’s and 90’s, I only had to worry about my classes (when in school), my job (after graduation), my apartment and my car.
Today, I’ve got a house, a family, two cats, and tons of other things that demand my time. And at the end of the day, I’m more tired and just want to watch TV after dinner. So there’s not nearly as much time to devote to following tech news as there was 40 years ago.
It’s an interesting question—what would be a relevant metric for the pace of an industry?
Online distribution has certainly made it faster and easier to release software, so we see many more updates than we did back then. When you had to ship floppy disks or CDs, you didn’t release an update without a really good reason.
I also pondered the idea of counting the average number of articles in an issue of MacWEEK and then comparing that against the number of articles produced by an outlet like MacRumors or AppleInsider.
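If anyone wanted to actually try that comparison, the modern half could be roughed out with a short script that counts the items in each outlet’s RSS feed. This is only a sketch: the feed URLs below are placeholders (check each site for its real feed address), a feed only covers the most recent items so a real per-day count would need date filtering, and the MacWEEK half of the exercise would still mean counting articles in back issues by hand.

```python
# Rough sketch: count articles in the current RSS feed of a few outlets.
# Standard library only. Feed URLs are hypothetical placeholders.

from urllib.request import urlopen
from xml.etree import ElementTree

FEEDS = {
    # Placeholder URLs -- substitute each outlet's actual RSS feed.
    "MacRumors": "https://example.com/macrumors.rss",
    "AppleInsider": "https://example.com/appleinsider.rss",
}

def count_items(feed_url: str) -> int:
    """Count <item> elements in an RSS feed (roughly one per article)."""
    with urlopen(feed_url) as response:
        tree = ElementTree.parse(response)
    return len(tree.findall(".//item"))

if __name__ == "__main__":
    for outlet, url in FEEDS.items():
        print(f"{outlet}: {count_items(url)} articles in the current feed")
```

Crude, but it would at least put a number on “articles per day” at each outlet for comparison against a hand count from an old MacWEEK issue.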
There’s no doubt that there has been incredible progress. And the things that are invented today are far more advanced than the things that were invented 40 years ago. But that’s because last year’s inventions become the baseline for the new stuff.
But I don’t think the rate of innovation - the number of truly new things - is that much different. There are lots and lots of new products, but most of them are incremental changes to (or clones of) prior products.
I think it’s far from obvious whether the rate of groundbreaking innovations is rising, falling or remaining steady. A (far from exhaustive) list of the kind of things I’m thinking of includes:
The vacuum tube - made the concept of computing practical, to a degree that would have been impossible with prior tech (e.g. electromechanical computing devices)
The transistor (and by extension, the integrated circuit) - made computing small enough, fast enough and cheap enough to become mass-market products, first for large businesses, then smaller businesses, then for everybody.
Digital circuit switching - made the global telephone network (and by extension, the system of leased lines that nearly all data traverses) possible.
Packet-switched networking (especially TCP/IP) - made computer networks (and eventually, the Internet) possible.
Packet-switched radio communication - made possible all of our wireless communication, including microwave, cellular, Wi-Fi, Bluetooth and many others.
The GUI - invented by SRI and Xerox, made mainstream by Apple and Microsoft.
World Wide Web - the killer idea (and countless apps in support of the idea) that made global networking practical for the masses.
Neural networks - initially a software concept, later implemented in various hardware forms. The basis for all current AI products and the focus of most current AI research.
Mobile and wearable computing. In many respects, this is just personal computers shrunk down to very small sizes, but the ability to have on-hand an Internet-connected computer at all times had a huge effect on society. And the tech got started many years before the smartphone.
Note that these are not individual products, but are concepts that have been realized through many many products, and not necessarily by the companies that became famous selling those products.
Of course, the decision about what tech is groundbreaking vs. just an incremental update to prior tech is open to debate. And the timeline is necessarily fuzzy, because multiple groups often develop the same tech in parallel, and work begins long before the first products/patents are released. And it is sometimes not obvious until many years later which inventions were truly groundbreaking.
Ground-breaking innovation is not what I was referring to. What really makes it hard to keep up now (as compared to back then) is the constant tweaking and updating of features (and even the interface itself) to give the appearance of “progress” for the purpose of selling more widgets to consumers who have become hooked on novelty for its own sake by the industry’s own marketing tactics. As one example, have you noticed that the same day Apple’s latest iPhone model is revealed, the internet will be flooded with articles speculating about what the next iPhone will be like? That’s just plain nuts imho. It’s like nothing is ever good enough!
For what it’s worth, I think both you, Alan, and David C. are correct, but just viewing the situation from slightly different angles.
As I see it, things seem to be changing (or progressing) in tech (and computing) at the same rate or even faster, but the substance of what is changing is less a matter of major leaps than of incremental changes.
Each new Mac or iPhone is better or faster based on an incremental change planned out in advance by Apple. We’re moving from Intel chips to Apple’s M1 chips to M2 chips to M3 to M4, with Pro versions of those chips inserted as further incremental advances.
My hunch is that if Apple wanted to, they could leap from M1 to M4 chips, but that would mess up the timed roll-out of incremental improvements. They have to accommodate the pokey users who want to use their old Macs or iPhones as long as possible, but still serve their enthusiastic customers who are eager to upgrade their phones or Macs every two years or so.
It seems to me that what is most different between the old days of MacWEEK and the present is that the “news” being fed to us by MacRumors and other news sites is not so much about major changes coming out of left field from new hardware or software players, as it is leaks (or rumors) from the supply chain about the carefully timed release of changes.
But that’s nothing new, and it’s not unique to the computer industry.
The automotive industry puts out new models of every car every year, but the rate that new features/options are introduced is very low. Most changes are purely cosmetic. It doesn’t stop the press from reviewing each and every “change”, but it’s all cruft that most people just ignore.
The computer industry is the same. Sure, every company puts out new models at a high frequency, and the manufacturers hype every change as the dawning of a new age, even if it’s something as trivial as making the case thinner by a quarter of a millimeter. But it’s all cruft, and most of us learned to ignore it a long time ago.
The pace of innovation hasn’t changed that much in 50 years. The pace of marketing and advertising may have increased a lot, but does anyone (outside of the press and marketing departments) actually care about any of it?
Most news has always been fake. The difference between now and 50 years ago is that today there are alternative sources, so those who care can figure out what’s going on. Whereas 50 years ago, the cost of publishing was so high that only a competing news outlet could even try to fact-check the press releases.
If you say so. There’s no question that advertising and hype have been with us for our entire lives, but I don’t see things staying the same at all. We may be frogs who have been swimming in the same pot all these years, but the temperature is definitely rising!
Fake news may have always been part of the media mix, but I would not go so far as to agree that most news has always been fake. Nowadays, most news is certainly biased in the extreme, but I do not believe that fake news on the scale we’re seeing on the internet today existed in “the good old days”. Not even close imho.
That’s an interesting way of looking at it, and certainly, the ease of online distribution makes such incremental change easier to get to customers.
But it’s also true that the tech industry is vastly larger than it was in the past. So we’re getting more products and more releases from more companies than ever before. @agen is on vacation, so I’ve been keeping an eye out for Watchlist updates, and I’m impressed at the number of things that appear on MacUpdate each day.
Someone had this slide at a conference I attended; this is my rough recall of it… I think the broad user base is also a factor to consider regarding change rollout.
Adding to @shamino’s list of “advances”, I would say moving from the single-line (i.e. calculator) display to multi-line displays was important. My work acquired an HP 98 series desktop computer in the late 1970s. We did brilliant things with its graphics-oriented language, but it was a relief to get PCs about 5 years later.
One thing about tech I find fascinating is how it only temporarily saves you time. Initially when tech comes out it’s amazing and it replaces a huge chore that used to take a long time (think washing machine versus doing laundry by hand, doing graphic design on computer versus manual paste-up, etc.).
But a while later, that tech becomes the norm and the time savings is used up doing other things. Sometimes it is competition, as in time saved on a job (e.g. one newspaper uses computers to publish faster), but soon all the competitors have upgraded and everyone is back at the same level, just faster.
This creates a world where we are always running and feeling like there’s not enough time. The explosion of content in the internet era makes that worse. Magazines used to have deadlines expressed in months. These days with Twitter (or whatever it’s called this week) it could be minutes or even seconds. Even if innovations aren’t more frequent, they feel like they are.
I am curious how this will play out with AI. Right now it’s still new enough it’s a competitive advantage in some fields, but soon everyone will have it and have the expertise to use it, and then what? Whole articles written by AI in seconds so someone can claim “first” post? “Reviews” written before the product is even released or tried?
What if AI gets advanced enough it can be used by non-techies to do tasks that used to require decades of education? For example, say you’ve got an idea for making a widget. But you know nothing about engineering, manufacturing, distribution, marketing, selling, etc. You have to raise capital, start a business, hire engineers and experts, and it’s two years and millions of dollars before the widget is for sale.
Couldn’t AI simplify that so that the non-tech person could have the AI do all the engineering, calculate the stress on the materials to figure out how the widget needs to be built, generate 3D-printed mockups, build a marketing campaign, etc.? Suddenly a kid in a garage could go from an idea to a commercial product for sale in weeks. Wouldn’t that change the speed of innovation if hundreds of thousands of new inventions are released?
Possibly, but if the quality isn’t there to match the speed, the result will be to kill the entire industry as customers lose faith in the whole category of product.
Take, for example, movie reviews. I’ve seen tons of reviews that come out within hours of a movie’s release, and sometimes days before. And after having seen the movie, it becomes clear that the reviewer didn’t actually see the film, but wrote the review based on the trailer, his own preconceived expectations, and what the Internet rumors had been saying about it - producing a “review” that is usually wrong and sometimes catastrophically wrong.
And what is the result? For me and my family, we’ve stopped reading reviews altogether because they’ve proven themselves to be useless. We used to use reviews to help decide what to see and either ended up seeing a bad film based on a glowing review or (more often) avoided a film because of a bad review, only to watch it on a streaming service a year later and regret not having seen it in theaters.
The news business is equally bad. And in both cases, the entire industry is dying. Not simply because Internet sources are replacing print sources, but because people no longer trust the business at all.
If AI does anything, it will accelerate this very bad trend. So we’ll have billions of articles every month, which nobody will read at all.
Maybe, if AI tech can develop to the point of being like a human engineer - able to think outside the box to come up with innovative new solutions. But we’re light-years away from that kind of tech.
With what we know how to do today, the best AI can hope to do is what’s already been done, but faster. So we’ll have endless new combinations of existing tech, and an infinite number of knock-offs. But I doubt anything truly new is going to be invented this way.
And this ignores one key thing. If one person has an idea, and a cloud server running various AIs does all the work, who actually owns the product? That one person who simply had an idea, or the company running all the servers and providing the AI, which did all of the hard work and may even be manufacturing and selling it? Who gets the patent? Who holds the copyrights? Who is going to get paid for all the sales?
This could easily be just another version of the very old-fashioned idea of an inventor getting squeezed out of any profit from his invention. Only this time with more justification because the “inventor” did almost no work beyond the initial idea.
On the weekend I went to a panel talk, “Your brain on AI”, that included physicist Paul Davies.
It was a fascinating and somewhat scary discussion. The panelists are certainly concerned about AI (that we know of!) being developed by just a few highly-commercial entities. Also the enormous energy consumption of AI computers was criticised - particularly in view of the mostly trivial purposes that they are used for.
A developing area of research is synthetic brains, that are grown from animal cells and interact with sensors. The expectation is that they will become powerful, specialised computers that use much less energy than “conventional” supercomputers.
“Eventually these companies have no choice but to add features that nobody asked for. Meanwhile bloated, overwhelming technology has a very real emotional effect on us; we feel like idiots when we can’t master it. Then, as the software becomes increasingly weighed down with features, the interface must be redesigned to accommodate them all. (Such a redesign is then, of course, marketed as a new feature.) And each time you lose a few days of productivity as you learn the new layout.”
All true, but it’s not just productivity that is lost in this never-ending cycle. As this often needless complexity increases, the inherent integrity of the whole system is compromised. It is often assumed that complexity equates to progress; but there is a flip side to this equation, and the law of diminishing returns must also be taken into consideration. As the wise man said, “What goes up must come down.”