Interoperability: Can’t We All Just Get Along?
By Russ Levanway, President
I’ve been thinking a lot lately about the Samsung Galaxy Fold, the bleeding-edge new mobile phone that claims to defy categorization. If you haven’t seen it, the Galaxy Fold is a foldable smartphone with a barely discernible seam down the middle. Unfolded, it offers one huge display; folded, it’s about the size of a large iPhone. It’s a crazy concept. It also appears to have been a flawed one.
Back to the drawing board
The first Galaxy Folds released to reviewers failed at alarmingly high rates. Samsung was set to release the device at the end of April but, given those failures, abruptly delayed the launch and then cancelled orders for the phone. It looked pretty bad, but I do appreciate that Samsung didn’t stick to a release date knowing its product was flawed. Smart companies know that once the technological envelope is pushed, the floodgates open for other companies to copy the idea, evaluate it, improve it and perfect it. Samsung clearly knows what it’s doing in that regard.
The whole Galaxy Fold debacle goes to show that, when it comes to consumer devices, consumers ultimately benefit from being able to test early-release technology. What’s the worst-case scenario if, say, your test of the Galaxy Fold goes south? At worst, it means you go back to your old phone. The stakes, relatively speaking, are pretty low.
Good for consumers, not for business
Businesses, however, should generally refrain from adopting bleeding-edge technology in their environments. Why? Because the cost of adopting very new tech is usually extremely high. If something is brand new, untested and unproven, like a new software package or a new hardware system, there may be neither good documentation nor a robust history of known bugs, glitches and issues. If you adopt very new technology in your business, you are the one who learns all of its pitfalls. Better to wait until the kinks are worked out, or, at the very least, documented.
That’s why, in general, businesses shy away from adopting the latest technology in their environments. It’s also why, oftentimes, consumer products and business products don’t interoperate very well: consumer products may make use of very new hardware and software features, but those features don’t necessarily work with an older or more mature business application. People sometimes bring their own devices (we call it BYOD) to work – personal laptops, phones, etc. – and expect them to play nicely with business devices and environments. We’re always game to try and make it work! But the devices don’t always interoperate the way we’d expect.
Don’t miss the boat
Waiting to adopt new technology in a business environment makes a lot of sense, as we’ve seen. But waiting too long has its consequences, too. Recently, we’ve seen several clients suffer from long-term chronic underinvestment in their technology.
Sometimes, businesses get so far behind the technological curve that the cost to modernize becomes exponentially higher. For instance, let’s say your company adopted a really key software application 15 years ago. Over that time, the vendor has released updates every three years or so, but you’ve never upgraded. Maybe it’s because updates cost money. Maybe it’s because there’s a learning curve with updated software. Whatever the reason, you’re now five versions behind, and your 15-year-old software can no longer interoperate with modern technology.
Here’s a riff on that example: your 15-year-old server dies, and a new server needs to be brought in to get you up and running again. The problem is your old software can’t interoperate with your new server. Maybe the new server runs Windows Server 2016, but your old one ran Windows Server 2003, which Microsoft no longer supports. This is a scenario we see a lot: one piece of software is so old, so thoroughly legacy, that it can no longer operate with any recent technology, and the parts and software needed to cobble together something it can coexist with are no longer available.
When this happens, the cost to upgrade becomes exponentially higher for two reasons. First, if you’re upgrading from something that old, there’s usually a big learning curve, unlike when you make incremental upgrades. Second, there’s the cost of equipment. Sometimes the upgrade has to be staged: we may have to scrounge up an old server and old software to upgrade halfway, and then upgrade again. That’s not an inexpensive process to begin with, but on top of it, if you waited 15 years and your server failed, you’re likely in the midst of a catastrophe. You can no longer plan the process. Now it’s a five-alarm fire, an emergency, and you have no choice but to act.
Stay in the middle
The point of all this is to encourage businesses to put real thought into interoperability and into where on the adoption curve they should sit. As a general rule of thumb, if you have key hardware or software that is multiple versions behind, there’s a very good chance it won’t coexist with modern software and hardware. It’s a prudent move to keep those systems updated so that you maintain interoperability. Don’t chase technology’s bleeding edge, but don’t fall four or five versions behind either. That middle ground is, in general, the most cost-effective place for a business to be.
// Russ is a sought-after public speaker, technology expert, and community leader. As the president of an ever-growing managed services provider with offices in San Luis Obispo, Santa Barbara, and Fresno, Russ’s goal is to sustain and grow an IT company that provides incredible value for clients, and a great workplace for his team. When he’s not collaborating to chart out the future of CIO Solutions, Russ serves on several non-profit boards, volunteers at the People’s Kitchen and travels the world with his wife and two daughters.