API documentation - Pre-Alpha - For Ezlo controllers

Although I wouldn’t say no to native Vera integration with Harmony hub and its “Home Control” functions.

Philips Hue Bridge and Samsung SmartThings, for example, are natively supported by the MyHarmony software.

You can add those systems (and I think some others) into Harmony and then control devices and scenes using the dedicated "Home Control" buttons on the Harmony remote control handsets, and also on the LCD screen of the Harmony Elite remote.

The only way I could achieve this with Vera was to use the Java HA Bridge software running on my file server, which emulates a Philips Hue Bridge; the Harmony software is connected to that instead.

We then use HTTP commands in the Java HA Bridge to control devices and scenes within Vera via the Harmony remote etc.

Thanks! I hope the same payloads could be sent via HTTP and the same responses received back, in order to simplify things.

All that said, a bridge from WebSockets to/from HTTP or MQTT should be easily doable.

Not much to add, but looking at this doc, it sounds promising and genuinely multi-threaded :grin:

Thank you for the API specification. This is good; here is some quick feedback:

Great points:

  • an API is published
  • JSON payloads
  • an Event based model is defined

Axes of improvements:

  • it seems a bit partial; we need different kinds of API: a) a server-side API through which plugins can operate with the platform (a server-to-server API), b) a client-to-server API for alternative UIs or plugins’ UIs, c) a 3rd-party integration API, which could be server-to-server

  • the API is very structuring: it will drive the presentation layers, the features, even the UI. I would spend a lot more time on getting the API right before jumping into UI development

  • the API technology should be chosen so that it enables multiple client device types; it should be pervasive enough to be the same API available across multiple kinds of device (desktop, phone, tablet), so HTTP and cross-browser-supported APIs are typically a good choice.

  • from a quick initial read, I find the API a little bit complex. A nice, elegant HTTP API could be RESTful with JSON payloads, and adopting a strong, consistent RESTful approach would help developers assimilate the complexity thanks to standardization. A good RESTful design would mean having a data model with objects and collections and using object-driven semantics (as opposed to action-oriented ones like create_device or get_list_of_rooms), with a unique id per object and a normalized URL scheme like:

    • https://fqdn/path/collection/(id) to work with an object of that collection, depending on the HTTP method verb (GET for read, PUT/POST for create/update, DELETE for delete), with the JSON payload carrying the actual parameters
  • collections could be: root, controllers, networks, rooms, devices, services, plugins, users, authorized external integrations (OAuth client ids), etc.
    External integrations (voice assistants, IFTTT) could be exposed as another type of controller

  • the API needs to offer the ability to query metadata about device capabilities: for instance, the ability to determine dynamically, based on the device type, what actions it supports, with what parameters and types. Vera was indirectly doing this with its UPnP standard using XML files; it was cumbersome, but the functionality was somehow there: it is possible to enumerate devices, determine what actions are supported, then dynamically construct a UI to trigger these actions

  • we need the HTTP API to be CORS-enabled, and the API should be designed from day one to be multi-controller, so we can control several controllers on the network and use controller_id + device_id as a globally unique id (same for rooms etc.)

  • we need a much easier remote access method, with either proper OAuth2 or JWT tokens for security

  • WebSocket events are good for a push model from server to client, but for some restricted devices an alternate way (based on a pull model) could be offered in HTTP, to read from a message queue

  • I would not call “id” the parameter used for matching a request with a response; “id” typically identifies an object. The request/response match could be named something like “context” or “callback data”

  • I would, from day 1, state that any timestamp/date is in ISO format (yyyy-mm-ddThh:mm:ss.uuuuZ, UTC); the UI would be in charge of translating to the user’s local timezone

  • very short action responses should be synchronous, but medium-to-long action responses could be asynchronous, with the event-based model delivering the actual result

  • events: it is very important to allow easy filtering by the reader of events, so events must have mandatory fields (type for the event type, source for the event source/sender); additional data fields can then be event-type specific
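To make the normalized-URL idea above concrete, here is a minimal sketch in Python; the base URL, collection names, and the dot-joined global id format are hypothetical illustrations of the proposal, not anything from a published Ezlo API:

```python
# Sketch of the object-driven REST layout proposed above.
# BASE, the collection names, and the id format are hypothetical.
from urllib.parse import urljoin

BASE = "https://fqdn/api/v1/"

# HTTP verb -> CRUD meaning, per the proposal.
VERB_SEMANTICS = {"GET": "read", "POST": "create", "PUT": "update", "DELETE": "delete"}

def resource_url(collection, obj_id=None):
    """Build /collection or /collection/<id>, the normalized URL shape."""
    path = collection if obj_id is None else f"{collection}/{obj_id}"
    return urljoin(BASE, path)

def global_id(controller_id, device_id):
    """controller_id + device_id as a globally unique id, as suggested."""
    return f"{controller_id}.{device_id}"

url = resource_url("devices", global_id("ctrl01", "42"))
```

A GET on that URL would read the device, a PUT would update it, and a DELETE would remove it, so the verb, not the path, carries the action.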

Hope that helps. If you go in that direction, I am happy to participate in a beta program (provided you consider Europe for radio protocols).

7 Likes

I find it somewhat reassuring that there are a full set of boolean operators (and/or/not), some timing functionality and a modicum of validation with some informative error messages.

Compared to luup’s error messages of “no message, but all scenes stored after this one are also corrupted”, this is quite an improvement.

I am not a web API person or automation developer, so I don’t have any other opinions worth sharing at this point.

1 Like

amg0,
thanks for your feedback; it would be great to receive more feedback from you.

the API technology should be chosen so that it enables multiple client device types; it should be pervasive enough to be the same API available across multiple kinds of device (desktop, phone, tablet), so HTTP and cross-browser-supported APIs are typically a good choice.

HTTP support is already on our roadmap.

the API needs to offer the ability to query metadata about device capabilities: for instance, the ability to determine dynamically, based on the device type, what actions it supports, with what parameters and types. Vera was indirectly doing this with its UPnP standard using XML files; it was cumbersome, but the functionality was somehow there: it is possible to enumerate devices, determine what actions are supported, then dynamically construct a UI to trigger these actions

In our API, items are abilities; this will be covered in the next version of the API documentation.

WebSocket events are good for a push model from server to client, but for some restricted devices an alternate way (based on a pull model) could be offered in HTTP, to read from a message queue

In order to have great performance, it’s very important to have good point-to-point communication and to overcome the hurdles HTTP puts in the way. We also like full-duplex communication and the possibility of real-time responses from the hub. All the WebSocket handshakes can be inspected with the browsers’ embedded developer tools, so you can easily use it.
Great performance, speed and stability of communication are the main goals.
But again, HTTP is already on our roadmap.
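For restricted clients that cannot hold a WebSocket open, the pull model mentioned in the quoted feedback could look roughly like the sketch below; the class, method, and event names are illustrative only, not part of any Ezlo API:

```python
from collections import deque

class EventQueue:
    """Illustrative server-side buffer for an HTTP pull model: a client
    polls with the last sequence number it has seen and receives only
    newer events, so no persistent push channel is required."""

    def __init__(self, maxlen=1000):
        self._events = deque(maxlen=maxlen)  # bounded: oldest events drop off
        self._seq = 0

    def publish(self, event):
        self._seq += 1
        self._events.append((self._seq, event))

    def poll(self, after_seq=0):
        """Return (new_cursor, list of events newer than after_seq)."""
        fresh = [(s, e) for s, e in self._events if s > after_seq]
        cursor = fresh[-1][0] if fresh else after_seq
        return cursor, [e for _, e in fresh]

q = EventQueue()
q.publish({"type": "hub.item.updated", "value": True})
cursor, events = q.poll(0)   # a client's first poll drains everything so far
```

Each HTTP poll would simply pass the cursor from the previous response, trading latency for compatibility with clients that cannot keep sockets open.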

I would not call “id” the parameter used for matching a request with a response; “id” typically identifies an object. The request/response match could be named something like “context” or “callback data”

I think it mostly depends on preference.
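Whatever the field is finally called, the mechanics are the same: the client attaches an opaque token to each request and matches it against the echo in the response. A sketch, with hypothetical method and field names:

```python
import uuid

pending = {}  # correlation token -> method name of the in-flight request

def make_request(method, params):
    """Attach an opaque correlation token; 'context' is a placeholder name."""
    ctx = str(uuid.uuid4())
    pending[ctx] = method
    return {"method": method, "params": params, "context": ctx}

def handle_response(response):
    """Match a response back to its request via the echoed token."""
    return pending.pop(response["context"], None)

req = make_request("hub.devices.list", {})          # hypothetical method name
resp = {"context": req["context"], "result": []}    # the hub echoes the token
```

Because the token is opaque, it never collides with object ids, which is the distinction the quoted feedback is asking for.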

I would, from day 1, state that any timestamp/date is in ISO format (yyyy-mm-ddThh:mm:ss.uuuuZ, UTC); the UI would be in charge of translating to the user’s local timezone

We are running scenes locally on the hub, including sunrise and sunset scenes.
It’s mandatory for the hub to know the region.
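Those two concerns are compatible: the hub can keep its region for sunrise/sunset computation while the API still serializes timestamps in UTC ISO 8601, leaving local rendering to the UI. A sketch using Python's standard library (note that isoformat() emits a +00:00 suffix rather than a literal Z):

```python
from datetime import datetime, timezone, timedelta

def api_timestamp(dt):
    """Serialize as ISO 8601 in UTC, as proposed for the wire format."""
    return dt.astimezone(timezone.utc).isoformat()

def to_local(iso_string, offset_hours):
    """UI-side conversion to the user's timezone (fixed offset for brevity)."""
    dt = datetime.fromisoformat(iso_string)
    return dt.astimezone(timezone(timedelta(hours=offset_hours)))

utc = datetime(2020, 6, 1, 12, 0, tzinfo=timezone.utc)
wire = api_timestamp(utc)   # what the hub would emit
local = to_local(wire, 2)   # what a UI at UTC+2 would display
```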

Very short action responses should be synchronous, but medium-to-long action responses could be asynchronous, with the event-based model delivering the actual result

Most of our APIs are synchronous.

events: it is very important to allow easy filtering by the reader of events, so events must have mandatory fields (type for the event type, source for the event source/sender); additional data fields can then be event-type specific

This is implemented: we have events and mandatory fields; a detailed description will be available in the next version of the API documentation.
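With mandatory type and source fields, reader-side filtering reduces to a simple predicate; the event names below are invented for illustration:

```python
# Hypothetical event stream: only the type/source/data shape mirrors
# the proposal above; the names themselves are made up.
events = [
    {"type": "device.added",  "source": "zwave",  "data": {"id": "12"}},
    {"type": "item.updated",  "source": "zwave",  "data": {"value": 21.5}},
    {"type": "device.added",  "source": "zigbee", "data": {"id": "13"}},
]

def filter_events(stream, type_=None, source=None):
    """Keep events whose mandatory fields match the requested criteria."""
    return [e for e in stream
            if (type_ is None or e["type"] == type_)
            and (source is None or e["source"] == source)]

added = filter_events(events, type_="device.added")
zwave_only = filter_events(events, source="zwave")
```

The event-specific payload stays inside data, so readers can route on the mandatory fields without parsing the rest.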

Waiting for more feedback from you.

3 Likes

Hi,

We just released the pre-alpha PDF for the Ezlo LUA API. We’re waiting for your feedback in the comments over there. :slight_smile:

Ioana

1 Like

Thank you @Ioana, I was reading through it. I honestly had drafted a survey for the developers, asking whether they preferred to maintain the old API/luup syntax or whether it had too many problems and they would want to start fresh, but then recanted… I thought the answer would be obvious from the community, but I will just give you my personal input and honest impressions.
We have a reasonably large and active community here. However, you can see from the list of members that many (most) of the developers of old times and most active members have left.

To me, what worked on the Vera was:

  1. The local GUI device representation
  2. Lua/plugin both flexible and powerful
  3. Community forum
  4. The initial ability to run completely local

What did not work:

  1. The Z-Wave stack and command queue
  2. The engine reload at every turn, and to a lesser extent, device reboot abuse
  3. The excessive desire to automate configuration and network management
  4. The incorrect use of the hardware resources (I have storage in mind)
  5. The lacking ZigBee stack support

So my point is… it seems to me that you are very focused on changing what already worked, yet leaving alone what was deficient, some of which is foundational.

  1. The local GUI device representation: gone to cloud and mobile (for now)
  2. Lua/plugins, both flexible and powerful: radically changed
  3. Community forum: increased censorship without courtesy notifications
  4. The initial ability to run completely local: this has degraded over time with increasing dependence on the cloud, and is now where you started your development from

For what did not work and could use some changes:

  1. The Z-Wave stack and command queue: no idea where this is going, and I am scared to death that you will be using the same old stack handling and command queue. The inability to fix this after all these years does not give me a lot of confidence. The absurdity of some of the designs still leaves me in awe.
  2. The engine reload at every turn, and to a lesser extent, device reboot abuse: this seems to be much better from what I have seen of the new data structure of the firmware.
  3. The excessive desire to automate configuration and network management: TBD
  4. The incorrect use of the hardware resources (I have flash storage in mind): TBD
  5. The lacking ZigBee stack support: TBD

It would really be helpful if you had designed a device-level API and Lua API to be compatible, or near compatible, with the old one, at least for now, so you could evolve it over time and benefit from all the integrations and plugins already written, with maybe only minor changes.

You will understand from this long post that it all makes the decision to stay a little difficult. I am glad to see progress, and this all has a lot of potential if I were starting from scratch… but we are not.

3 Likes

Hello @rafale77,

Thank you for your feedback.

The Luup engine has a monolithic architecture. It’s difficult to take only the positive functionality without also taking the problems related to it.
Part of the described issues are already fixed, and another part is on our roadmap.

I am scared to death that you will be using the same old stack

It’s a completely new Z-Wave library.

Excessive desire to automate configuration and network management

Do you mean the Lua API for working with controller settings?

The incorrect use of the hardware resources

Could you add details or examples of this issue?

The lacking zigbee stack support: TBD

It’s a very important direction for us. But the VeraEdge doesn’t have a ZigBee module. Could you share a list of the ZigBee profiles you use? Or a list of the ZigBee devices you want to work with?

Sorry for hijacking this thread, but @Sorin should have a pretty big list of ZigBee devices from the voted ”feature request - integrations” tag, including Trådfri, Xiaomi and Hue etc. Sharing is caring :wink:

2 Likes

Thank you Andrey,

It is somewhat reassuring that you are using a new Z-Wave library. How about the command queue?

“Do you mean Lua API for working with controller settings?”

No, I mean the inclusion process often failing because of excessive automation of the configuration. I hope this is not news to you. Also the excessive intervention of the software in proactively changing data and user configurations: deleting virtual devices, child devices, automations, even when they are orphaned… all well intentioned, but there should be a user input to decide whether or not to actually do them. The fact that they all rely on a reload of the engine to execute led to an abuse of luup reloads everywhere in the code. They should be independent functions manually triggered by the users, not automated. User data should be treated as sacred and never be automatically modified.

For ZigBee, I don’t recommend going after a device library but rather full stack support as the base. I would really recommend supporting both the HA and the LL stacks with all endpoints. For reference, given the hardware and firmware you use on the current Plus and Secure, this library:

runs circles around the Vera. I actually made it run using the Vera’s EM357 radio (EZSP).

The incorrect use of hardware resources points to an entire thread I started, since deleted by the moderators here, discussing the storage partitioning of the Veras… There is also the lack of use of the multi-threading capability of the CPU…

1 Like

No, I mean the inclusion process often failing because of excessive automation of the configuration. I hope this is not news to you. Also the excessive intervention of the software in proactively changing data and user configurations: deleting virtual devices, child devices, automations, even when they are orphaned… all well intentioned, but there should be a user input to decide whether or not to actually do them. The fact that they all rely on a reload of the engine to execute led to an abuse of luup reloads everywhere in the code. They should be independent functions manually triggered by the users, not automated. User data should be treated as sacred and never be automatically modified.

Now I understand. We are spending a lot of time now on the correct migration of user devices/items/rooms/scenes between updates. The Z-Wave device migration logic lives in the Z-Wave plugin and can be fully replaced via a custom plugin, if you are ready to write your own Z-Wave integration plugin.

We try to avoid reloading the firmware without a very good reason. Right now we have one known issue with the glib library we use: we need to reload the firmware after changing the timezone on the controller. It will be fixed in the next releases.

For Zigbee

Thank you. It’s an interesting project. But this library is not certified and has a GPLv3 license.

On ZigBee, I understand: it is an open-source, non-certified library. The purpose was to give you an idea of what could be done.

For ZigBee, I don’t recommend going after a device library but rather full stack support as the base. I would really recommend supporting both the HA and the LL stacks with all endpoints. For reference, given the hardware and firmware you use on the current Plus and Secure, this library

ZigBee 3.0 would be the best thing to add, and I say this as a Z-Wave user. It opens up a ton of cheap bulbs and sensors. And, if the whole “CHIP” thing ever gets off the ground, there is an 80% chance it will be some variant of ZigBee-over-WiFi (more correctly called Dotdot over WiFi), so having ZigBee 3.0 support would be halfway to CHIP support.

Hi - we’ve published updated documentation for this API.

We’re waiting for your feedback in the post.

2 Likes

Hi,

Looking through the API, I completely miss a way to get any true Z-Wave device-level data. The UPnP definitions as used by Vera are something you could complain about, but they are a well-defined standard and, as far as I know, widely used in HA (https://openconnectivity.org/). Why reinvent the wheel? But more serious is that there are hardly any details to obtain about devices. Look at this wiki page http://wiki.micasaverde.com/index.php/Luup_UPnP_Variables_and_Actions and you will see what I mean and what we can see on a Vera.

This is all I see on the new platform:

{
  "_id": "5e88a318f4947f07db6e7e13",
  "batteryPowered": false,
  "category": "switch",
  "deviceTypeId": "600_3_4231",
  "gatewayId": "5e6a7dacb7c77f07569bf50f",
  "info": {
    "manufacturer": "NEO Coolcam",
    "model": "NAS-WR01Z"
  },
  "name": "SK Neo Power",
  "parentDeviceId": "",
  "reachable": true,
  "ready": true,
  "roomId": "",
  "security": "no",
  "status": "idle",
  "subcategory": "interior_plugin",
  "type": "switch.outlet"
}
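To illustrate the gap, a small check against that payload shows the Z-Wave-level fields a diagnostics tool would typically look for; the names in ZWAVE_FIELDS are examples of common Z-Wave metadata, not fields from any published Ezlo schema:

```python
import json

# Trimmed copy of the device payload shown above.
device_json = """{
  "_id": "5e88a318f4947f07db6e7e13",
  "category": "switch",
  "info": {"manufacturer": "NEO Coolcam", "model": "NAS-WR01Z"},
  "name": "SK Neo Power",
  "type": "switch.outlet"
}"""

# Hypothetical examples of Z-Wave-level metadata a diagnostics tool wants.
ZWAVE_FIELDS = ("manufacturerId", "productId", "commandClasses", "zwaveVersion")

def missing_zwave_details(payload):
    """List the Z-Wave-level fields not present anywhere in the payload."""
    device = json.loads(payload)
    present = set(device) | set(device.get("info", {}))
    return [f for f in ZWAVE_FIELDS if f not in present]
```

Running missing_zwave_details on the payload reports every one of those fields as absent, which is the core of the complaint above.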

Much of this maps to the lack of core functionality in the new controller needed to make it a serious HA platform, or even get it up to par with the Vera functionality.

Cheers Rene

4 Likes

You are spot on and, from my Linux FW testing experience, this will become an issue during the test/debug phase of the new Linux firmware/HW. I reported a bug with a Zooz smart plug on an earlier Linux firmware release, and the development team requested additional Z-Wave device details. I asked if there was an API or some other way to get the details (logs?) on the Linux firmware, but there wasn’t a way at that time. I actually had to exclude the device from the Edge running the Linux firmware and include it on my Vera Plus running UI7 just to provide the relevant Z-Wave details so they could address a device-integration issue in the Linux firmware. For this class of development bugs alone, they need to expose device details, and I’m assuming they just haven’t gotten to it yet. While we may not see it in a simple mobile app UI, it should be available via the more advanced WebUI, and definitely via an API or logging, to enable efficient development/test/debug.

2 Likes

You are 100% spot on.
We are in the process of exposing more and more of the functionality via APIs…
Could you help us by telling us which functionality is a priority, so that we can start delivering that functionality in the APIs first? (Every 2 weeks, which is our sprint cadence, we hope to release new capabilities in the API.)

1 Like

Melih, back in February I noted that there was apparently no way in the Lua API for a plugin/device to watch a value on another device and receive an event when it changes. At that time, Andrey said they would add it. This is a pretty big miss in that API. Is it now done?

I’ll check this, and you will get the status on that here, @rigpapa.

Hello @rigpapa,
Implementation will be available in the next Linux firmware release (next Thursday or Friday).
Here is a draft of the API: Core Events - eZLO API.pdf (124.4 KB)

2 Likes