Mike Helfrich from Blueforce Development was correct in his comments on the previous post – the issue of standards did come up and was left unresolved and disputed by both the panel and the audience. I guess that means I’ll give it a go here as well. The bulk of my technical background is in data networking, a market greatly influenced (maybe even defined) by standards. That is not to say that proprietary solutions haven’t been successfully deployed and even become a dominant force (in some cases ultimately becoming the standard themselves). It’s also important to note that standards-based solutions can form the very framework on top of which innovation flourishes.
Standardization is often opposed by companies participating in a market for two public reasons and one private one.
First, creating standards is typically a political (in the corporate sense) and time-consuming process. This creates drag in the market and increases time to market for all players involved. For new entrants, this can be particularly challenging, as they often differentiate themselves from the rest of the market through their development pace and agility. Further, the political process of creating the standard consumes precious resources that ultimately only level the playing field (or part of it) for all participants.
Of course, by definition, the newly minted standard represents a series of compromises reached by many participants with many different agendas. As such, it is often the lowest-common-denominator solution (or at least approximates it). This is the second public reason companies leverage to defend forging their own way with a vertically oriented solution.
The private motivation for opposing standardization and championing a proprietary, vertically integrated approach is to create vendor lock-in. Standardization is about the clean definition of interfaces between technological domains. These interfaces may be complex, but they must be well defined and must specify the required responses of entities communicating across the interface. This necessarily requires documentation sufficient to describe these interfaces and interactions, which ultimately enables third parties to implement against either side of the interface.
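To make that concrete, here is a minimal sketch (in Go, and entirely hypothetical – the `TelemetryTransport` interface, its methods, and its error values are my own inventions, not any actual standard) of what a cleanly defined interface looks like: the method set, the required responses, and the error contract are all spelled out, so a third party could implement either side without ever seeing a vendor’s internals.

```go
package transport

import (
	"context"
	"errors"
	"time"
)

// Hypothetical illustration only: a cleanly specified interface between
// two technological domains (a data source and a transport layer). A real
// standard would pin down every behavior described in these comments.

// Reading is the unit of data crossing the interface boundary.
type Reading struct {
	SensorID  string
	Timestamp time.Time
	Value     float64
}

// Errors an implementation is REQUIRED to return in the named situations,
// so parties on the other side of the interface can rely on them.
var (
	ErrNotConnected = errors.New("transport: not connected")
	ErrRejected     = errors.New("transport: reading rejected by remote end")
)

// TelemetryTransport is the interface a compliant transport must satisfy.
// The doc comments play the role of the standard's "required responses".
type TelemetryTransport interface {
	// Connect must be idempotent: calling it on an open transport is a no-op.
	Connect(ctx context.Context) error

	// Publish must return ErrNotConnected if Connect has not succeeded,
	// and ErrRejected if the remote end refuses the reading.
	Publish(ctx context.Context, r Reading) error

	// Close must release resources; subsequent Publish calls must
	// return ErrNotConnected.
	Close() error
}
```

The specifics don’t matter; the point is that the comments carry the standard’s normative text, so every behavior a peer may depend on is written down rather than buried in one vendor’s implementation.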
New players in a market can often benefit most quickly when well-developed standards exist, as standards allow them to focus their innovation – and the value they ultimately bring to the customer – on a specific niche of the solution value chain. This focus is possible only because the new player (and the customer) can trust that the other links in the solution chain will be well behaved and compliant with the standard. Ultimately this leads to more choice for the customer and more focused competition in the market. These forces would seem to be fundamentally important (I hesitate to say “good”) in a market economy.
There is (at least) one additional reason that standards are “good” for the customer: independent validation. Without a public standard, the customer is unable to verify the internal behavior of a vendor’s solution. There will always be a part of the solution that remains a “black box” to the external observer; however, transparency and testability are critical to establishing “trust” in a solution. In the case of IoT, I think the most obvious application of this methodology would be to establish trust in the security of the solution (from data transport to data storage to data mining). That topic will have to wait until next time…
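To make the validation point concrete: given a public contract like the sketch above, anyone – a customer, a third-party lab – can write a conformance check that treats a vendor’s implementation as a black box and exercises only the documented behavior. Again purely hypothetical, building on the same assumed package and types as above (`VerifyTransport` is my invention):

```go
package transport

import (
	"context"
	"errors"
	"testing"
	"time"
)

// VerifyTransport is a black-box conformance check: it knows nothing about
// the implementation's internals, only the documented contract, so it can
// be run against any vendor's implementation by an independent party.
func VerifyTransport(t *testing.T, tr TelemetryTransport) {
	ctx := context.Background()

	// The contract requires Publish before Connect to fail with ErrNotConnected.
	err := tr.Publish(ctx, Reading{SensorID: "s1", Timestamp: time.Now(), Value: 1.0})
	if !errors.Is(err, ErrNotConnected) {
		t.Errorf("Publish before Connect: got %v, want ErrNotConnected", err)
	}

	// Connect must succeed, and a second Connect must be a no-op.
	if err := tr.Connect(ctx); err != nil {
		t.Fatalf("Connect: %v", err)
	}
	if err := tr.Connect(ctx); err != nil {
		t.Errorf("second Connect should be a no-op, got %v", err)
	}

	// After Close, Publish must again report ErrNotConnected.
	if err := tr.Close(); err != nil {
		t.Fatalf("Close: %v", err)
	}
	err = tr.Publish(ctx, Reading{SensorID: "s1", Timestamp: time.Now(), Value: 2.0})
	if !errors.Is(err, ErrNotConnected) {
		t.Errorf("Publish after Close: got %v, want ErrNotConnected", err)
	}
}
```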
So we have forces arrayed in opposition to each other when it comes to standards in the marketplace. Weighing against standardization are time-to-market (from technology onset) and tactical technological excellence (no compromises – in theory). Weighing in favor of standardization are greater efficiency and larger-scope innovation in the market – and thus available to customers – as well as a more transparent and trustworthy solution.
What did I miss? And what side of the debate are you on?