Technology evolves so fast today that it can change your perspective. For example, is it better to run Ethernet to every room, or is it actually better to run some type of cableway so that future cables can be pulled later? I'd rather have upgradability than a fixed technology.

Dean Roddey said:
I would argue that, in general, Ethernet/Wi-Fi is appropriate for the large bits and that something else should be used for the devices themselves. All of the major lighting system vendors obviously agree, since none of them use Ethernet/Wi-Fi, nor are they likely to. And it's gigantic overkill for something like a light bulb anyway.
A proprietary wired or wireless protocol is best, IMO. It's the right size hammer for the nails being nailed. The network is the right sized hammer for automation servers, media servers, touch screen clients, media streaming clients, etc...
Wireless technology really has evolved over the last 15 years, though. Dean says Wi-Fi is overkill for a light bulb, and that is true from many perspectives, but why? Newer Wi-Fi radios consume much less power today, and they have much less overhead than they did, say, 15 years ago. So today a Wi-Fi light bulb is not that crazy. Wemo, LIFX, Cree, Flux, and TP-Link all make Wi-Fi bulbs. Why? Because it means the Amazon Echo, Google Home, and your smartphone can all control them directly, with no hub in between. Wi-Fi, Bluetooth, Zigbee, Z-Wave, ANT, and a few others have all evolved over the years. When the first 802.11 standard was published in 1997, it actually included an infrared physical layer, like your remote control uses. I'm waiting for UPB light bulbs. Maybe soon.
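To make "control them directly" concrete: TP-Link's Kasa bulbs and plugs, for example, speak a simple local protocol that the community has reverse engineered (a small JSON command, XOR-scrambled with a rolling "autokey", sent to the device on the LAN). This is just a sketch of that scrambling step based on the published reverse engineering, not official TP-Link documentation, and the function names are my own:

```python
def kasa_encrypt(plaintext: bytes, key: int = 171) -> bytes:
    """XOR 'autokey' scramble reported for TP-Link Kasa LAN commands.

    Each plaintext byte is XORed with the key, and the resulting
    ciphertext byte becomes the key for the next byte.
    """
    out = bytearray()
    for b in plaintext:
        c = key ^ b
        key = c          # ciphertext byte feeds the keystream
        out.append(c)
    return bytes(out)


def kasa_decrypt(ciphertext: bytes, key: int = 171) -> bytes:
    """Inverse of kasa_encrypt: here the *incoming* byte feeds the keystream."""
    out = bytearray()
    for c in ciphertext:
        out.append(key ^ c)
        key = c
    return bytes(out)


# A typical command is a tiny JSON blob, e.g. switching a relay on:
cmd = b'{"system":{"set_relay_state":{"state":1}}}'
assert kasa_decrypt(kasa_encrypt(cmd)) == cmd
```

The point isn't the specific cipher (it's obfuscation, not security); it's that a bulb on your Wi-Fi network is just another IP endpoint, so anything on the LAN, from an Echo to a ten-line script, can talk to it without a proprietary bridge.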