Fastest rule-processing for automation?

Dean Roddey said:
Of course another thing that many folks don't consider these days is that, if you want really fast back-end automation response times, you probably don't want to be running Plex on the same machine, doing on-the-fly transcoding of 3 streams of HD content. There's a trend these days to buy one big machine and run a bunch of virtual machines on it, some of which may be doing pretty heavy-duty stuff. Even if they aren't completely chewing up all of the CPU, there are other system capacity issues involved, such as I/O, that might start slowing things down.
 
For me personally, I'd prefer to have the automation system itself on its own dedicated machine. It doesn't have to be a particularly powerful one if it's dedicated, and keep media processing on its own.
 
Agreed.  Too many eggs in one basket, and all that.  
 
I've had great results splitting Plex onto its own box and serving media from a QNAP 671 NAS (not an entry-level model by any stretch).  I've got a VM running on it, mainly to handle Crashplan backup configuration and Eye-Fi camera card syncing.  Works really well.  The NAS is capable of running Plex, but it's been good not to have the transcoding bogging it down.  Truth be told, though, we still do a lot of viewing with TiVo, Amazon and Netflix.  The ease of use just isn't there with Plex, nice as it is.   It's a lot "less worse" than the others, but it still annoys my wife enough that it doesn't get as much use.
 
Likewise, running automation services on something separate has also been on my list.  I'm very much interested in it being able to run unattended 24x7 without supervision, but without having to fall back to old-school clunky programming and configuration tools.   I've yet to find something that best suits my interests.  They've all got a lot of good things to be said for them, but nothing I've been willing to commit to.
 
What I'm most critical of is performance for daily use.  I'd like my lighting to be a little smarter (no conditionals with RA2), but its responsiveness has been great.  It's been fun to fiddle around with Z-Wave sensors and some ZigBee lighting, but their responsiveness pales in comparison (cloud dependence is clearly a performance problem for many of the things that use it).
 
What is nice to see from them is a fresher approach to a broader range of lifestyle interactions.  Those benefit from cloud and networked sources.  Sure, there are a lot of security and privacy concerns.  But there is a lot being done with them that helps avoid reinventing wheels.
 
Dean Roddey said:
4. There are way too many factional 'standards' that folks sometimes throw together, and then if you want to support the device you figure out you have to somehow support an alphabet soup of standards, all of which have thick RFCs you have to read to understand them.
 
What's irritated me on more than one occasion is an unwillingness on some of the standards committees to draw the line at minimum acceptable practices and refusal to 'call out' bad practices.  Next thing you know some half-assed contractor working for a big company does a phenomenally shitty job of implementing what "looks like" it conforms, but really doesn't.  Then a horde of nitwits follows along, chanting "but 'Big Company We Love' does it that way" and then the whole process breaks down.  I've referred to this in the past with the statement "the tyranny of bad tools".  As in, the old saying "when all you have is a hammer, everything looks like a nail".   Doesn't matter that you've crafted the next best screw, the idiots are running around with hammers whining that screws are too complicated.
 
Yeah, there's a lot that gets thrown into some "standards" that's unnecessary, or beyond a realistic basic scope.  Tragically, though, there are usually a lot more implementations that fail to rise even to the basic scope, and interop really suffers.
 
What things like the Echo and IFTTT demonstrate is that there are a lot of ways people want to 'stitch together' many different things.
 
Yes.
 
I like it so much that I am just running the Homeseer Amazon Alexa application (sans Echo device) and the Kinect application these days (and I can choose whether or not I want to use IFTTT).
 
With the Homeseer 3 to OPII panel plugin, I will be able to manage the OPII panel with either Kinect or Alexa if I want.  I will probably get to that point, play with it for a bit, then shut it off out of boredom.
 
I have shut off the Amazon Echo cylinder device as it isn't needed anymore.  (I moved it behind furniture and put a virtual on/off switch on it; the cylinder appearance was a bit low on the WAF.)
 
It took some 10 minutes to set this up (the Alexa and Kinect applications); easy-button stuff, and way faster than earlier attempts at Amazon Echo device connectivity.
 
My personal use of the Alexa application and Kinect today works great for my automation.
 
I have separate boxes for MythTV, ZM and 3 NAS boxes running embedded BSD or Linux these days.  I did shut down the embedded Windows server doing NAS a few years back; the embedded NAS OS was still a bit too gluttonous for me.  I utilize AOpen Core Duo mini PCs running a base of Ubuntu 14.04 64-bit for Kodi, and it works just fine for me.  (There is cable, OTA and satellite television in the house still today, and my wife prefers her television, recorded or live, to come from satellite.)  Today I am using one RPi2 to run Homeseer lite, another RPi2 to run my CumulusMX software, and an old Seagate Dockstar to do my irrigation, with my favorite irrigation software running in Linux inside of the old Rainbird box.  I am also trying to see how much automation I can do with a little microrouter that runs at 400 MHz with little RAM and no play space, and to make my automation non-internet-dependent.
 
Over the years (15 years) I have suggested to the Homeseer folks that they would have difficulty running a NAS, automation, live TV capture and a CCTV NVR on the same box as Homeseer.
 
You can virtualize, and it does work if you have many CPUs and a lot of memory.  Today I am running a Wintel virtual machine inside of my Ubuntu 14.04 box and take only what I need from the OS; nothing extra these days.
 
Concurrently here, I'm using a 4 GB embedded Wintel image (my preference is Wintel over Android) for my Wintel Atom-based touchscreens, which have MS TTS/VR.  I can stream live TV on them these days, and they are connected via PoE.
 
I have been able to do this stuff now on a mini PC with a Bay Trail Intel quad-core Atom using less than 20 GB of MMC space, and it works fine.  I cannot do this on an RPi2 yet; maybe some day.
 
I do prefer the home-tethered Amazon Echo over the tethered smartphone (iPhone, Android or W10) for automation.
 
wkearney99 said:
What's irritated me on more than one occasion is an unwillingness on some of the standards committees to draw the line at minimum acceptable practices and refusal to 'call out' bad practices.  Next thing you know some half-assed contractor working for a big company does a phenomenally shitty job of implementing what "looks like" it conforms, but really doesn't. 
 
The entire position of the early internet worked against standards. The prevailing attitude was (and in many cases still is): Be strict about what you send and be very forgiving of what you receive. Of course that can make it possible for your product to work with other things. But it creates a situation in which it's almost impossible to really validate your implementation of the standard, because it's impossible to know if anything will actually reject you if you do the wrong thing. 
 
But, if you don't do that, and your product (rightly) rejects other products because they are wrong, they won't get blamed by the customer, you will. So, ultimately, it's the customer that drives this problem, because they don't know from standards. They just want stuff to work. So, if there are a number of bad implementations out there that are fairly widely used, you have to support them or you are the bad guy in the customer's eyes. All these other things work, so why doesn't yours?
 
Oh, I was present when that notion started.  It's one thing to be tolerant for the purpose of avoiding casual errors; it's another to do so silently or to code around known mistakes.  It isn't all that often that a developer won't address their mistakes if you reach out to them, especially when a validator can be used for free and anonymously.  But the spinelessness on the part of some of my fellow committee members was, at times, nothing short of infuriating.  The customers, meanwhile, just want things to work and will dutifully mimic whatever they see the most.  I'm no perfectionist, not by a long shot, but some of the half-assed implementations out there... ugh...
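The validation problem being described can be sketched in a few lines.  This is a hypothetical key/value line protocol, not any real standard: a strict parser rejects malformed input, while a Postel-style lenient parser silently normalizes it, so a sloppy sender that only ever talks to lenient receivers never finds out it's non-conforming.

```python
def parse_strict(line: str) -> tuple[str, str]:
    """Reject anything that is not exactly 'KEY: value' (illustrative protocol)."""
    if ": " not in line:
        raise ValueError(f"malformed line: {line!r}")
    key, value = line.split(": ", 1)
    if not key.isupper():
        raise ValueError(f"key must be upper-case: {key!r}")
    return key, value

def parse_lenient(line: str) -> tuple[str, str]:
    """Postel-style receiver: accept sloppy input and normalize it silently."""
    key, _, value = line.partition(":")
    return key.strip().upper(), value.strip()

# A conforming sender's output parses either way...
assert parse_strict("POWER: on") == ("POWER", "on")
assert parse_lenient("POWER: on") == ("POWER", "on")

# ...but a sloppy sender's output only "works" against lenient receivers,
# so in a world of lenient receivers nothing ever flags it as broken.
assert parse_lenient("power:on") == ("POWER", "on")
```

That last line is the whole problem in miniature: once everyone parses leniently, `"power:on"` becomes the de facto standard whether the spec allows it or not.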
 
This is one of the reasons I'm greatly skeptical of anything Apple creates for 'standards'.  Homekit?  No fucking way, not until ALL of it is an open standard.
 