openLuup: Asynchronous I/O

Undoubtedly true, and these plotted values are already averages, so the peaks are likely to hit 100% at times.

This is when you actually see the CPU become constrained, and potentially a crash.

However, a well-written system should degrade gracefully, and certainly not crash. Clearly you could bring it to its knees with multiple video streams, or such like, but I think that’s just inappropriate use of an HA controller.

Completely agree. I was thinking more of the Vera UI7 rather than openLuup here… I have yet to get openLuup to crash due to CPU usage.

@akbooer Is there anything I need to check/validate/modify if I want to enable it, or can I just do it and it will decrease the latency at some point? Right now openLuup is handling everything and Vera only the “zwave” communication.

If you use the latest development version, it has some rather conservative, but user-adjustable, parameters (hidden, but reachable.) The only thing you need to do to enable it is set the AsyncPoll variable to true and then restart.

Would be very interested to know how you get on, or, indeed, whether you notice any difference at all!

AK

Will enable it for sure back @ home later today!

The latest openLuup development release has a new file in the openLuup/ folder: http_async.lua.

This now implements asynchronous HTTP and HTTPS requests (depending on which scheme is provided in the requested URL.) If you want to use it, you simply require it in your code like this:

local async = require "http_async"

It exports a single function, request, which works exactly as described earlier in this thread: openLuup: Asynchronous I/O - #6 by akbooer - openLuup - Ezlo Community (except for a change of name.)

For example:

local ok, err = async.request ("simple URL", "body of POST", myCallback)

I will be withdrawing the original openLuup syntax for calling this function (luup.openLuup.async_request) in order to make any plugin which uses it portable to Vera, since this new module can be copied to Vera and used in exactly the same way there.
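For anyone migrating, here’s a hedged before/after sketch (the request table and callback are placeholders; it assumes http_async.lua is already in the openLuup/ folder, or copied to Vera):

```lua
-- Old openLuup-only syntax (being withdrawn):
-- local ok, err = luup.openLuup.async_request (request, myCallback)

-- New portable syntax, the same on openLuup and Vera:
local async = require "http_async"
local ok, err = async.request (request, myCallback)
```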

I have done limited testing in Vera (quite amusing in the test window, because the function returns before the response arrives, so your callback routine needs to write to the log if you want to see the result!)

Example test code:


local async = require "http_async"
local ltn12 = require "ltn12"
local response_table = {}

local function request_callback (response, code, headers, statusline)
  luup.log ("CALLBACK status code: " .. (code or '?'))
  luup.log ("CALLBACK output length: " .. #table.concat (response_table))
end

local ok, err = async.request ({
      url = "https://api.github.com/repos/akbooer/openLuup/contents",
      sink = ltn12.sink.table (response_table),
      protocol = "tlsv1_2",
    }, request_callback)
  
print (ok, err or "all OK")

Giving this log output (you have to rush to the AltUI Misc > OsCommand page and tail the log)

50 05/11/19 12:56:53.317 luup_log:0: ALTUI: Evaluation of lua code returned: nil <0x7445d520>
50 05/11/19 12:56:54.107 luup_log:0: CALLBACK status code: 200 <0x7565d520>
50 05/11/19 12:56:54.107 luup_log:0: CALLBACK output length: 7421 <0x7565d520>

Huge thanks to @amg0 for suggesting that this module could be adapted to be used on Vera.


Really cool, although I would call this “asynchronous HTTP requests”, as “I/O” is a bit broad. When I think of async I/O, I’m usually thinking of lower-level stuff (like async sockets, which Vera desperately needs).

One interesting thing to note about your example… the callback has to use response_table as an upvalue to get the response text. That means a lot of people will be tempted to use a global (the ultimate upvalue), and that will be fine in a lot of cases, but if this approach is used in a scenario where multiple simultaneous async requests are made/processed, the sinks will collide. If you wrap the async.request calls in a function or closure, however, and make response_table local to that closure, it solves that issue.

Certainly true. All the code is there for asynchronous sockets, it’s just that it’s wrapped in an HTTP request function at the moment. I could unwrap it … ?

Quite true. This was only a test example. The whole thing should be part of a closure. As Lua Test code, it already is.

Just have a conversation with @amg0 on exactly this topic.

Here’s the function wrapped for your own use, with a separate response_table for each invoked request…


local ltn12 = require "ltn12"
local async = require "http_async"

local function async_request (request_table)
  local response_table = {}                               -- private to this request
  request_table.sink = ltn12.sink.table (response_table)
  local function callback (response, code, headers, statusline)
    -- do whatever you need with the response here;
    -- response_table is an upvalue unique to this invocation
  end
  return async.request (request_table, callback)
end
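A possible usage sketch of the wrapper above (the URL is just an example, reused from earlier in the thread):

```lua
local ok, err = async_request {
  url = "https://api.github.com/repos/akbooer/openLuup/contents",
  protocol = "tlsv1_2",
}
print (ok, err or "request launched")
```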

It is great, however I am not sure the function closure trick can be used every time.

It would still be useful to be able to pass in a context parameter that could be anything (scalar, table) and is just returned back, as is, as part of the callback. In it we could store some info, data, indexes, anything.

For instance, if we have a loop iterating over a table and calling async_request() for each entry in that table, it would be useful for the callback to know for which entry it is being called back, so it could, for instance, find in the context parameter the index of that entry.

Wrap it in a closure rather than a named function; the loop wraps the closure.

Like this…

for i = 1,100 do
  local response_table = {}
  local ok, err = async.request (
    {sink = ltn12.sink.table (response_table), ...},    -- plus other request stuff
    function (response, code, headers, statusline)
      -- do whatever you need with the response here
    end)
end
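To make the “context” idea concrete: each iteration’s closure captures its own copy of the loop variables, so the callback already knows which entry it belongs to, with no extra parameter needed. A hedged sketch (entries and its fields are hypothetical):

```lua
local ltn12 = require "ltn12"
local async = require "http_async"

for i, entry in ipairs (entries) do              -- entries is a hypothetical table
  local response_table = {}
  async.request (
    {url = entry.url, sink = ltn12.sink.table (response_table)},
    function (response, code, headers, statusline)
      -- i and entry act as the "context" here
      luup.log (("entry %d (%s) returned status %s"): format (i, entry.url, tostring (code)))
    end)
end
```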

Sure, but suffice it to say, this is going to focus people’s attention on scope. There’s a lot of lazy scoping out there, and that won’t pass muster in this context.

caveat implementor !


Hello ak

Been having a read of all this. Have we got any access to async sockets amongst all this of late? Or are we still stuck with having to use luup.io.intercept()? I’m thinking both Vera & openLuup. I wrote a Paradox alarm web page scraper years ago and it works fine. The plugin works using polling and logs into the alarm and then extracts the sensor status from the web page it reports.

I would really like to just use some async socket and get the alarm sensor data via a callback, as soon as it occurs, rather than after some fairly long poll delay, like I have now.

Any ideas on this one?

Hi,

The Vera built-in is still luup.io indeed. The way I worked around it for the Harmony plugin is to schedule a task each second and use socket.select to see if there is any data: if yes, process it; if no, do nothing. This is working nicely on openLuup and Vera. You can look at the Harmony plugin source for details: vera-Harmony-Hub/L_Harmony.lua at master · reneboer/vera-Harmony-Hub · GitHub
look at the wsAPI bit starting at line 818
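The pattern can be sketched roughly like this (hypothetical names, not the actual plugin code): a repeating timer checks the socket with a zero timeout, so the check itself never blocks.

```lua
local socket = require "socket"

-- sock is an already-connected TCP socket; handleMessage is hypothetical
function pollSocket ()                               -- global, so luup.call_delay can find it
  local readable = socket.select ({sock}, nil, 0)    -- zero timeout: never blocks
  if readable[1] then
    local line = sock: receive ()                    -- data is waiting, so read it
    if line then handleMessage (line) end
  end
  luup.call_delay ("pollSocket", 1)                  -- check again in one second
end

luup.call_delay ("pollSocket", 1)                    -- start the polling loop
```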

Cheers Rene

The http_async module essentially works the same way, using socket.select(), as, indeed, does the openLuup scheduler.

So, in fact, you’ve always been able to use async sockets.

I wrote a proxy that runs directly on the Vera, or on openLuup, to provide async sockets with minimal changes on both platforms. It allows you to structure the plugin so it works both with and without the proxy, so if a user doesn’t have it installed, your plugin continues to work the “old” way (if you let it), but will be much more responsive if you use the notifications. Every one of my plugins that uses TCP sockets for communication now uses it, with and without SSL.

SockProxy

I also wrote a full WebSocket implementation (except 64-bit length frames, which I’ve not yet seen used in practice) on top of it, which, again, also works without it.


@rigpapa Hey Patrick, I’m taking a crack at this with the EVL plugin. I’m not good enough with Linux to understand the pros and cons of using a systemd service file in place of the init.d file you suggest (I’m using Ubuntu). Are there any gotchas with the plugin if I just use the systemd “Before=” directive to start things prior to openLuup?

I think any way you get it started is fine.