Iotivity: open source framework for IoT

swaggy

Active Member
Just came across this:
www.iotivity.org

It looks like they just released the preview for Linux after CES. It's a framework for devices to communicate with each other via agreed upon APIs, they list support for BT/BLE/WiFi/Zwave among others. Still very early to judge whether this will go anywhere but it's driven by the Open Interconnect consortium and they do have some big names in the industry as members:

http://openinterconnect.org/about/members/
 
Also, we have three other major players in the home automation/smart home field, Google, Apple and Samsung (GAS???), competing to set new open industry standards: Apple with the Homekit framework and Google (Nest) with the Thread framework. Samsung has aligned itself with Google/Nest.
 
I firmly believe that if a clear winner emerges from this unified open framework “battle”, it will radically change the HA/smart home market from a niche fragmented by many diverse protocols and standards, both open and proprietary, into a mainstream market with one unifying open standard, making HA simple, reliable and easy to use and thus winning over non-technical consumers. The big guys know the key to making beaucoup bucks is to make home automation/smart homes as easy and painless as possible for the average non-technical consumer.
 
The big guys also know they can't make any money in the HA market in its current condition. They have to mold it into a market that fits their consumer electronics business model, and I believe that's what they're trying to do.
 
In the end, it's a high stakes money game. Winner take all.
 
 
http://www.wsj.com/articles/the-race-to-build-command-centers-for-smart-homes-1420399511
 
Having a common protocol will only be a limited improvement. This is something most folks misunderstand: it's not really the lack of a standard protocol that holds back home automation today. There are plenty of automation systems out there that support large numbers of devices, and that's all you need to implement automation successfully.
 
Ultimately, having a common protocol doesn't make it any easier to set up all of the gear and get it working together the way you want it. It will make it easier to get to a 'remote control on the tablet' type of level, but that's not really automation. It's not going to make it any easier for the customer to figure out how to make sure that X happens any time motion is sensed in these three rooms and the security system is in this state, except on Tuesdays between this and that hour. That's automation, and that's what makes it really useful. But that's also why automation is limited now, not because we can't control the devices easily enough.
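A rule like that is easy to state but someone still has to author it. Here's a minimal sketch in Python, where all the room names, security states and the blackout window are hypothetical:

```python
from datetime import datetime

# Hypothetical rule: fire when motion is sensed in any of three rooms and
# the security system is in a given state, EXCEPT on Tuesdays between two
# configured hours. Every name here is illustrative, not any real system's.
MOTION_ROOMS = {"kitchen", "hallway", "den"}
REQUIRED_SEC_STATE = "armed_home"
BLACKOUT_HOURS = (22, 23)  # suppress on Tuesdays from 22:00 to 23:00

def should_fire(motion_room, sec_state, now):
    if motion_room not in MOTION_ROOMS:
        return False
    if sec_state != REQUIRED_SEC_STATE:
        return False
    # datetime.weekday(): Monday == 0, so Tuesday == 1
    if now.weekday() == 1 and BLACKOUT_HOURS[0] <= now.hour < BLACKOUT_HOURS[1]:
        return False
    return True
```

The point isn't the code; it's that no common protocol can write that condition for you.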
 
It's also not going to magically create really useful, activity-oriented interfaces for you. That's also what makes automation really useful. Having a remote control on a tablet is a marginal improvement, but it's not the kind of thing that gives you a touch screen interface based on how you want to do things, one that works in terms of abstract activities that you define. What does 'sit down and watch TV' mean to you? Having a common protocol isn't going to figure out for you that it means turning on these four devices, waiting for that one to warm up, only then selecting these inputs on those two devices, turning those lights off, muting the multi-zone audio output in this zone, etc... You are still going to have to set that up yourself.
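As a rough illustration of what a 'sit down and watch TV' activity actually encodes (all device names are made up, and the command log just stands in for whatever transport a real system would use):

```python
import time

LOG = []  # command log standing in for a real transport layer

def send(device, command, **args):
    """Record a command; a real system would transmit it to the device."""
    LOG.append((device, command, args))

def watch_tv():
    # The ordering, the wait, and the input selections all have to be
    # authored by someone; a common protocol only standardizes how each
    # individual command is sent, not what the sequence should be.
    for device in ("tv", "receiver", "cable_box", "projector"):
        send(device, "power_on")
    time.sleep(0.1)  # stand-in for waiting on the projector to warm up
    send("receiver", "select_input", input="hdmi1")
    send("tv", "select_input", input="hdmi2")
    send("living_room_lights", "power_off")
    send("audio_matrix", "mute_zone", zone="living_room")
```

None of that knowledge lives in the protocol; it lives in whoever set up the activity.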
 
So, anyhoo, I just think folks need to be realistic about these things. Nothing is going to magically turn the automation world around, as though no one in the automation world understood these things a long time ago, and realized that it will be almost impossible to herd all of those cats and that the results will be limited even if you manage to do it. And of course if any of those devices in my mini-rant above don't support this protocol, none of it works. And, even if this happened, it would be a decade before all of those non-compliant devices were well gone, and a long time before you could even start from scratch with fully compliant devices.
 
The key here is not so much a common protocol but an open common protocol, one free from any type of litigation. Patent trolling is severely stifling innovation in the industry. Essentially, patent trolling is the purchase of patents by individuals or groups whose only interest is making a quick buck. They file a patent infringement suit against any company they believe will buckle and pay them off. HAI was on the receiving end of two of these suits in the three years prior to being acquired by Leviton. Both times they had to settle out of court and pay the trolls extortion money or close their doors. Taking the trolls to court is a much more expensive proposition, and you really don't know if you're going to win. The trolls are banking on the target company taking the easiest and least expensive way to settle the suit.
 
Jay McLellan sat on a 2013 CES conference panel addressing patent trolling. Several U.S. congressmen were in attendance at the conference. And trust me, Jay and the others on the panel railed against patent trolling. One concern was that trolling is a major roadblock to the creation and growth of small start-up companies, thus severely stifling innovation.
 
Yes, there are many automation systems on the market today, but if you buy into one, you're pretty much locked into that system. For example, I own the Omni Pro 2 system. I can't use a Control4 touch screen with my OP2 panel: although Control4 uses Zigbee, it uses a proprietary version of the Zigbee protocol.
 
End-user programming of the newer products hitting the market has gotten a lot easier. Wink, SmartThings and WeMo now work with a web service called IFTTT. It is so simple to use that even I can do product programming. The service lets a user create a “recipe” for a product. For instance, a user can create a “recipe” to monitor Facebook for new posts. When a new post arrives, the recipe has the product make an audio announcement that there is a new Facebook post, turn off the Samsung TV, light the way to the second-floor office, turn on your computer and secure the first floor while you are checking out your new Facebook post. I can really envision end users getting carried away with new web services like IFTTT.
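For illustration only, the trigger/action shape of such a recipe can be sketched like this; the channel and device names are hypothetical and this is not IFTTT's actual API:

```python
# A minimal sketch of the IFTTT-style pairing: one trigger event fans out
# to a fixed list of actions. All names below are invented for illustration.
RECIPE = {
    "trigger": "facebook.new_post",
    "actions": [
        ("speaker", "announce", "New Facebook post"),
        ("samsung_tv", "power_off", None),
        ("stair_lights", "power_on", None),
        ("office_pc", "wake", None),
        ("security", "arm_first_floor", None),
    ],
}

def handle_event(event, recipe, dispatch):
    """When the trigger fires, run every action through the dispatcher."""
    if event == recipe["trigger"]:
        for device, command, arg in recipe["actions"]:
            dispatch(device, command, arg)
```

Simple to author, which is exactly the appeal, and exactly why users may get carried away.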
 
I honestly can't make the connection between an open common protocol and the end-user experience. I believe an open common protocol serves two very important functions. One is to reduce or eliminate the patent troll issue that is plaguing the industry today. The second is to increase interoperability between vendor products. For example, I am putting together a business plan for a home health care product in anticipation of the future smart home market. Now let's assume Control4 starts to market their own vital signs monitor and I need a vital signs monitor to enhance my product. If Control4 is using their proprietary Zigbee protocol, I either have to license that protocol from them or build the vital signs monitor from scratch. If it is using an open protocol, their monitor would be a “plug and play” option for my product. In the former case it's a loss for me and a win for them, simply because my product will cost more to build. The latter is a win for both of us: I get additional value for my product, and Control4 benefits from the additional sales of their monitor. The latter also lets the consumer decide what products to use with my product, as opposed to me trying to determine their needs for them.
 
Maybe we're both just on the wrong page. It appears that you are referring to home automation and I'm referring to smart (device) homes. Home automation is here today, but smart homes aren't here yet. That's the reason all us little guys are keeping a keen eye on what the big guys do. We want to ride their shirttails to profitability. We're all waiting on the sidelines with our business plans in hand for the big guys to make the smart home market a mainstream market. Agreement on a common protocol is just the first step in a series of steps toward that final goal.
 
 
Control4 purposefully doesn't want to use a common protocol for their own gear. We don't need another protocol for them to make their stuff available, they could have already done it. But they see zero point in that, and of course I have to agree with them, from their point of view. Why would they work hard to build an automation system, but then let you get all the benefits of their other toys but not buy the automation system? A company that makes its money selling vital signs monitors will be happy to let you access it because they just want to sell monitors. C4 wants to sell automation systems, so they have completely different business interests. And, there again, any company that sells monitors is perfectly free now and always has been to make that device available via an openly defined serial or IP based protocol, so a new common protocol isn't really preventing anything already.
 
And a common protocol in and of itself doesn't make it plug and play, which is my whole point above. Just because you have a monitor with a protocol that lets you read/write state information, that isn't anything like plug and play. For that to happen, you have to have much higher level semantics defined: what is a vital signs monitor, what type of information does it provide, how does that information need to be presented to the user, what can the user ask the device to do when it is in particular states, and how does the user interface learn about those states and know what to allow when.
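To make that concrete, here is a hypothetical descriptor for a vital signs monitor showing the kind of higher level semantics a standard would have to define before 'plug and play' means anything. None of these field names come from any real standard:

```python
# Invented device description: beyond the wire protocol, a system needs to
# know what readings exist, their units and valid ranges, and which values
# a user is allowed to change. All field names are illustrative.
VITAL_SIGNS_MONITOR = {
    "device_class": "vital_signs_monitor",
    "readings": {
        "heart_rate":  {"unit": "bpm", "range": (30, 220), "writable": False},
        "spo2":        {"unit": "%",   "range": (0, 100),  "writable": False},
        "alarm_limit": {"unit": "bpm", "range": (40, 180), "writable": True},
    },
}

def writable_fields(descriptor):
    """A UI generator would use metadata like this to decide what to expose."""
    return sorted(name for name, meta in descriptor["readings"].items()
                  if meta["writable"])
```

Without agreement on this layer, a common transport protocol gets you raw numbers and nothing more.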
 
A common protocol generally doesn't define that stuff. It just means you can use the same protocol to talk to devices, and generally it means you can get basic information about devices you don't know about up front and provide basic access to the device's functions. But it doesn't really 'integrate' that device into a system. Even if they attempt to define basic device semantics, it's more likely to be simple stuff: this is a lock, this is a light, etc... Z-Wave provides that level of definition currently, for instance, and an automation system that accesses that info can provide very basic 'integration' of the device into a system, but it's really only very basic. Nothing there tells you how a given light or lock relates to a particular requirement, say, away mode on the security system. You still have to set up what any lights and locks should do when the security system is armed, even if you have a common protocol that allows a lock or light to directly talk to the security system to find out what its state is.
 
And, honestly, I think the whole 'no central controller' thing just makes it harder to create real automation, because instead of a central place where you create all the rules, it's now all spread out all over the place. Folks who use Z-Wave see this phenomenon as well, for instance. Each module can be set up to signal other modules in some way. But instead of a nice, easy to manage system that provides all that interconnection, it's all spread out and much harder to manage. Ultimately the central controller provides a coherence and security that a bunch of things talking directly to each other cannot, probably with fairly limited control over what gets to join the party and start telling other things what to do.
 
Anyway, I'm not trying to be a spoilsport or anything. But I think people have a mistaken view of why the automation world is the way it is. It's not because of the lack of a common protocol for all devices to speak. It's because no matter what they speak, something has to turn them into a system, not just a bunch of devices, and that doesn't get much easier.
 
I think you can make an analogy to the days before the World Wide Web.  We had proprietary networks (Compuserve, AOL, etc) that were easy to use but generally expensive and none would interoperate with the others.  The early internet was there with email, ftp, irc, gopher, etc--and while free--was too complicated for most of the world.
 
Then came http.  Initially, it was much less functional than the proprietary networks but it was free and pretty good.  (No security initially which we're still paying for today.)  Multiple browsers emerged.  Multiple servers.  They worked together using a common protocol.  Content exploded.  Soon, web browsing killed the proprietary networks.
 
Home automation right now is either expensive (Crestron, Control4) or for DIY fanatics (too complicated for most people).  Dean is right that initial systems for the mass market won't be as richly customized as some people have right now.  That's irrelevant.  If one of the current pushes breaks through, the cost of smart devices (light bulbs, etc) will become much more affordable due to economies of scale.  Which puts it in reach of more people, and so on.  There are some automation benefits that are obvious to most people, such as energy savings, convenience and safety.  These get the masses initiated into home automation.  In the fullness of time, they may come to understand the benefits of a richly customized system.  Just as http evolved over several versions, I suspect a home automation protocol will do the same.
 
Now, open standard or not?  Two issues:  security and mass awareness.
 
None of the open standards I've seen seem to have security baked in at a fundamental level.  The consequences of someone hijacking your automated home are a lot more serious than a browser hijacking.  
 
The next killer issue is breaking through to mass consumer awareness.  Nest did that and raised the bar on smart thermostats--Proliphix and others had been around for years and never managed to gain significant traction.  So Google/Nest may be able to leverage and build on that start.
 
I think Apple is going to be first out of the gate, though, with Homekit.  Right now, hundreds of millions of iOS devices are running iOS 8 and therefore ready to control Homekit devices.  The Homekit software framework was announced at last year's Developer's conference.  The first wave of Homekit devices were announced at CES 2015.  Each device must incorporate an Apple-designed chip that, I believe, incorporates hardware-based security.   Homekit seems to address the two killer issues and is ready to go first.
 
Will only Homekit succeed?  I expect there is room for more than one standard.  Maybe it will be like mobile phones and tablets where Apple rakes in  a disproportionate share of the profit in the sector compared to those who chase market share at all cost. ;)
 
Craig
 
pvrfan said:
I think you can make an analogy to the days before the World Wide Web.  We had proprietary networks (Compuserve, AOL, etc) that were easy to use but generally expensive and none would interoperate with the others.  The early internet was there with email, ftp, irc, gopher, etc--and while free--was too complicated for most of the world.
 
Then came http.  Initially, it was much less functional than the proprietary networks but it was free and pretty good.  (No security initially which we're still paying for today.)  Multiple browsers emerged.  Multiple servers.  They worked together using a common protocol.  Content exploded.  Soon, web browsing killed the proprietary networks.
 
But, there again, though I hate to keep whining, HTTP isn't why that happened. It happened because a huge amount of work was done to define higher level constructs that sit on top of HTTP, and that's what really does the work. HTTP doesn't tell your browser how to play a video, or play an audio stream, make a phone call, or provide the means by which all of that visual formatting of content is done, or how handling user interaction logic is done. Without all that other stuff, HTTP would be fairly useless.
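A quick illustration of how thin HTTP itself is: a complete, well-formed HTTP/1.0 exchange is just a few lines of text, and everything interesting (rendering, scripting, media) happens in how the body is interpreted by the standards layered above it.

```python
# A minimal HTTP/1.0 request and response, built and parsed by hand.
# The protocol is only a small text envelope: a request line, headers,
# a blank line, and an opaque body whose meaning HTTP does not define.
def build_request(host, path):
    return f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n"

def parse_response(raw):
    head, _, body = raw.partition("\r\n\r\n")
    status_line = head.split("\r\n")[0]   # e.g. "HTTP/1.0 200 OK"
    status = int(status_line.split()[1])
    return status, body

status, body = parse_response(
    "HTTP/1.0 200 OK\r\nContent-Type: text/html\r\n\r\n<p>hi</p>")
```

What to do with `<p>hi</p>` is HTML's business, not HTTP's, which is exactly the point.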
 
It wouldn't fundamentally change the internet if there were multiple commonly used protocols, since the browsers (and other clients) can easily support that. And of course that is in fact the case already. Not everything happens over HTTP, and clients must handle many protocols and technologies (SIP, RTP, RTSP, HAL, SDP, SOAP, XML, HTML, Java, Javascript, Silverlight, PHP, and on and on) in order to get the internet even up to the level it is today (which is still woeful compared to a high quality, dedicated automation system's tightness and security.)
 
I mean, your analogy is right, in that HTTP is sort of equivalent to a common protocol. But, until all those other, higher level protocols and standards were created (which deal with specific types of functionality, in the same way that such constructs are needed in automation to deal with different types of devices and functionality), and all those many ways of providing user interaction and display capabilities, were added on top of HTTP, the web was very, very primitive.
 
And the same thing will apply to any HTTP equivalent in automation. It's only the simplest level of functionality, and it's not really required to make automation 'happen'. Automation is already at the point where it's easy enough to support any device that provides a published, reasonable quality protocol. The same issues that limit automation now will continue to do so even in the presence of such a common protocol. And its being available, in the same way that HTTP was, will only allow for a very primitive sort of 'integration', nothing like what folks expect out of a real automation solution.
 
And, if you look at how long it took for all those other constructs to be defined and refined, and the fact that that work involved not THAT many companies, all of whom had a vested interest in having these technologies in place, and you compare that to the situation with all of the MANY companies that would have to get on board to achieve any sort of ubiquity, and the fact that for most of those companies integration into automation systems is at best a very marginal issue, that doesn't bode too well for standards nirvana any time soon. Was that a run-on sentence or what?
 
In terms of multiple standards, if those standards really defined all of the high level constructs required for real integration, having more than one would be bad, because supporting even one such fully fledged standard would be a lot of work. Having to support multiples of them would be far worse. It's actually better for devices to just provide low level access to their functionality (in a quality way that takes into account the needs of automation systems), and let the automation systems provide the higher level constructs once for all devices it controls. Clearly all those fairly inexpensive devices out there aren't going to implement reams of functionality to support or interact with all kinds of other devices in anything beyond the most simplistic way, and in the face of multiple standards then obviously you are right back to needing a centralized automation system that can mediate between them anyway.
 
What would best serve the automation system vendors would be for manufacturers to just get serious about providing high quality protocols that aren't afterthoughts full of ifs, ands, buts and gotchas. It's fine if they want to use their own message format.
 
OK, anyway, that's my last bummer-rama post on the subject. I've said what I need to say. I don't want it to turn into a food fight or anything.
 
P.S. Just for funzies, here is what HTTP got us, and even then only with the addition of the (even then) much larger HTML standard on top of it. We've come a long way, baby.
 
http://www.microsoft.com/en-us/discover/1994/
 
I was sort of shocked that HTML5, which I'd not tracked that much during incubation, moved back away from XML instead of further towards it. Without the clean extensibility of a language like XML, I don't see how a Semantic Web is ever going to happen, and apparently now it won't, since HTML5 pretty much threw away all the work towards XML compliance done in the 4.x iteration.
 
And of course it means that you now have to write two complex parsers and validators, when one could have been used. I never read up on the rationalization, but whatever it was I think it was wrong. If it was that XML is too complex and formal, well hardly anyone writes web content by hand these days and those of us who do can handle XML syntax well enough. And the formality is why XML is the better choice in the first place.
 