Blockchain...
The first time I heard of it was in relation to cryptocurrency. The second time, another cryptocurrency. And then, several times later, I heard of it in the context of a "decentralized internet" project called LBRY, of which I am an early user.
I've lost count of the number of times I've heard of it since.
Now, I recognize that this word represents some hugely vital concept in programming, which I won't even begin to pretend I understand. I'm not commenting on the viability or usefulness of blockchain for any given project. But eight years studying communication in this industry has taught me how to spot fads...and this one just hatched.
"Fad" is a pretty loaded word, so let me slam on the brakes and explain myself. There are several brilliant innovations I'd classify as technology fads, but never on the basis of their technical details. An innovation only becomes a technology fad when we start mass-adopting it on any basis other than its own merit!
I can point to three other distinct technology fads in fashion today.
For example, look at JavaScript, a language whose design is criticized in many circles, not entirely without merit. As much as I personally dislike JavaScript, I'll grant that it presently has a solid use case in web development, specifically in manipulating HTML and CSS to create smooth, interactive web experiences.
Yet it seems that a major sector of the industry is trying to replace the entire technology stack with JavaScript. Applications! Data processing! GUIs! There's nothing it can't do!...or so this crazed bunch tells us. But if we're honest, JavaScript is not well suited to these use cases, because no matter how many magical frameworks we pile on it, JavaScript was designed for web interactivity. We're now sitting on a volatile pile of fragile spaghetti code, just waiting for someone in IT to sneeze before it all falls apart.
This isn't JavaScript's fault. We're just trying to use it to replace C++, Python, Rust, Ruby, Perl, R, Haskell, Go, and the rest of the gang. We've lost sight of what JavaScript is supposed to be. By the same token, we can't really use any of those other languages to truly replace JavaScript, because they weren't built for the same purpose!
Languages need to know why they exist, and stick with those reasons. Writing a statistical computing program in C++ is (usually) as silly as building a game engine in R. You could probably manage to pull off both, but you'd be wasting a lot of energy and time for a suboptimal result.
In other words, JavaScript's problem can be summarized in the words of poet John Lydgate: "You can please some of the people all of the time, and all of the people some of the time, but you can't please all of the people all of the time."
If we move away from language debates, we won't go far before we come across some mention of the cloud. It wasn't long after this mystical land of water vapor and data was first mentioned that some astute IT professional pointed out...
The cloud is just someone else's computer.
The cloud offers some very neat innovations. My company's website migrated from traditional hosting to a Linode VPS, and we couldn't be happier. Some software and services are empowered by an intelligent use of cloud computing and VPS technology (two different concepts that we keep lumping under the same catch-all name). However, the cloud doesn't always have a silver lining.
One online friend described how his company's CTO decided to migrate their entire infrastructure to "The Cloud," with little more explanation than some empty tech babble he no doubt picked up by skimming an article on Slashdot. His plan held less water than the actual clouds over the tech office, but he plowed obliviously forward, deaf to the warning cries from half his IT staff. He insisted on migrating an entire, humming, live infrastructure, databases and all, from their in-house servers to AWS, because he had heard the siren song of The Cloud. The functional infrastructure was to be dismantled and rebuilt in AWS, with no gain in functionality, a significantly higher operating cost, and years of fixing the problems that come from hammering a square peg into a round hole. And all the IT department wept.
I've also watched programmers "solve" problems in relatively minor app projects with "The Cloud," when local solutions were faster, cheaper, and more efficient. In one case, the technical details of which escape me, a college senior project set up no fewer than three cloud-based microservices for a smartphone app that, in retrospect, didn't really need any.
I can still hear the marketing exec who dreamt up the term "the cloud" cackling on his way to the bank.
Finally, we have neural networks. Again, I'm not going to claim expertise here, although I'll admit I get a little excited seeing some of what people manage with them! AI can pull off some amazing feats with neural networks and machine learning.
However, I'd also classify neural networks as a fad. I can be found lurking on Freenode IRC nearly ten hours a day, six days a week, and I can't tell you how many times I've heard people propose neural networks for purposes to which they are entirely unsuited. It's as if someone mentioned them on Reddit, and now every basement programmer wants to implement their own.
Of the three, this one seems to be losing steam the fastest. Perhaps that's because you can't get very deep into implementing machine learning before you realize you're in over your head, and that you could achieve just as good a result in your Yet Another Number Guessing Game with a simpler algorithm.
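To make that concrete, here's a minimal sketch (my own hypothetical example, in Python, not drawn from any particular project) of how little machinery a guess-the-number game actually needs. Plain binary search finds any number from 1 to 100 in at most seven guesses, with no training data and no neural network in sight:

    # Guess a secret number with plain binary search -- no neural network needed.
    # Finds any number in [low, high] in at most log2(high - low + 1) guesses.
    def guess_number(secret, low=1, high=100):
        tries = 0
        while low <= high:
            tries += 1
            guess = (low + high) // 2  # always guess the midpoint
            if guess == secret:
                return guess, tries
            if guess < secret:
                low = guess + 1   # secret is higher; discard the lower half
            else:
                high = guess - 1  # secret is lower; discard the upper half
        raise ValueError("secret was outside the given range")

    number, attempts = guess_number(secret=42)
    print(f"Guessed {number} in {attempts} tries")  # prints: Guessed 42 in 7 tries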
I think blockchain may be poised to replace neural networks at the top of the technology fad rankings, and it makes me sad. There are a lot of good ideas that we need to be pursuing with this technology, but after the hordes of fair-weather fans finish with it, the tech world will be left with ringing eardrums and a bad taste in its mouth. When that day comes, it will take a brave soul to even suggest blockchain, until the technology fades into a sunset of obscurity.
You may be shaking your head at me. "These aren't fads," you might say, "and even if they are, the damage won't be that bad."
Allow me to remind you of a few fads of yesterday.
Java. Hadoop. WordPress. Python. Joomla. Shockwave Flash. Extreme programming. Singletons. HTML iframes. Macros. Goto statements.
All of these have uses. There are, or were, proper applications for each. But the frenzied application of these technologies to every problem imaginable wore off the shine and finish, leaving them with a haze of knee-jerk repulsion. Python had to gain a whole new version to regain some of its lost relevance.
Now we have a whole new batch of ideas and technologies, some of which are already suffering the damage of fad status.
Cloud Computing. Neural Networks. IoT. Go. SaaS. Rust. Docker. Haskell. JavaScript.
And blockchain.
Our innovations deserve better treatment than this. Please, for the love of all the shiny new technologies, as well as the old reliable ones, follow these three simple rules:
1. Know why the technology exists, and generally use it only for that purpose. We can certainly explore innovative uses of technologies, but be careful not to force a square peg into a round hole, nor to make one technology do everything.

2. Choose technology solely on its merit in the context of your project, and never on the basis of its trendiness, popularity, or lack thereof. FORTRAN may well bear consideration in the same breath as Clojure.

3. Combat fad-based technology decisions. Have a broad understanding of technologies old and new, and be generous with this knowledge. When you spot someone building a desktop solitaire game in Node.js using a neural network, remind them that Python and basic conditional statements are things.
Working together, we might just prevent the day when some technical whitepaper bears the ominous title "Blockchain Considered Harmful".
What are some fad horror stories you've encountered?