Modbus logging to a database and array handling strategies

I have found that reading 64 contiguous registers (elements) at a time is the most efficient way to pull data over Modbus with the rmodbus library.

My goal is to log the values that are read into a simple database, from which they can be pulled to generate graphs and data tables on a web page, and to keep the most recent value of each element in an instance variable.

The rmodbus library reads the data into an array, where each element's index corresponds to its address. However, I would like to convert the index to octal, since that matches the element addressing scheme users are already familiar with and will be easier to reference in the interface.
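For illustration, the read itself looks roughly like this (host, port and unit id are placeholders, and the exact calls depend on the rmodbus version; this follows the block-style TCP client API):

    require 'rmodbus'

    values = nil
    ModBus::TCPClient.new('192.168.1.10', 502) do |client|   # placeholder host/port
      client.with_slave(1) do |slave|                        # placeholder unit id
        # One request for 64 contiguous holding registers; the result is a
        # plain Array whose index corresponds to the register offset.
        values = slave.read_holding_registers(0, 64)
      end
    end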

Edit, adding detail and clarification: at the moment I am working with the following schema:

create_table "elements", :force => true do |t| t.string "name" t.integer "modbus_connection_id" t.string "address" t.string "eng_unit" t.integer "base" t.string "wiring" t.text "note" t.boolean "log" t.datetime "created_at" t.datetime "updated_at" end create_table "events", :force => true do |t| t.integer "element_id" t.string "value" t.datetime "created_at" t.datetime "updated_at" end create_table "modbus_connections", :force => true do |t| t.string "name" t.string "ip_address" t.integer "port" t.integer "client" t.text "note" t.datetime "created_at" t.datetime "updated_at" end 

The idea is that a background process will poll Modbus, compare each reading against the previous one, and log only those elements that have changed and are flagged for logging. Elements will probably be stored both in the db and in instance variables that are readily accessible, so the interface shouldn't have to worry about it. Elements that are not logged are still kept in instance variables so the display can show their state in semi-real time. The logged elements in the events table are then only parsed into graphs and tables when the user interface asks for them.
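Roughly, the loop I am picturing looks something like this (model names follow the schema above; it assumes ModbusConnection has_many :elements, that @latest is an in-memory Hash initialised elsewhere, and that the per-register Element lookup could obviously be cached):

    # Keep the newest value of every element in memory for the semi-real-time
    # display, and create an Event only for elements that both changed since
    # the last read and are flagged for logging.
    def process_reading(connection, values)
      values.each_with_index do |value, index|
        element = connection.elements.find_by_address(index.to_s(8))  # octal address
        next unless element

        changed = @latest[element.id] != value
        @latest[element.id] = value

        if element.log? && changed
          Event.create(:element_id => element.id, :value => value.to_s)
        end
      end
    end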

First question (finally!): does it make sense to leave the data in the array and apply a layer that handles the index conversion (and, as it happens, the conversion of the corresponding element value, since I use v.collect{|i| i.to_s(16)} for that), or is it better to move everything into a hash, where index and value can live happily ever after in whatever form is most convenient?
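For example, the hash version of that layer could be as simple as this (octal strings for the addresses, hex strings for the values, matching the address column in my schema):

    # values is the Array returned by the Modbus read; build a Hash keyed by
    # the octal address string, with each value converted to a hex string.
    readings = {}
    values.each_with_index do |v, i|
      readings[i.to_s(8)] = v.to_s(16)
    end
    # e.g. index 63 becomes key "77", and a raw value of 4095 becomes "fff"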

First question, edited: given how my question has evolved toward logging only changed data to a plain SQLite db, and that I will need to track element changes to determine which ones differ between Modbus readings, does an array or a hash make that comparison more efficient? Should I even care?
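To make the comparison concrete, either form diffs in a line or two, so at 64 values per read the difference is probably negligible (this assumes the previous reading is kept in the same form as the current one):

    # Array form: indexes whose value changed since the last read
    changed_indexes = (0...current.size).select { |i| current[i] != previous[i] }

    # Hash form: address => value pairs that changed since the last read
    changed_pairs = current.reject { |addr, value| previous[addr] == value }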

Second question: in Rails, assuming a one-minute logging interval, would it be better to store roughly a thousand data points in individual fields, or should I leave them in 64-element chunks and parse the data on its way out to the interface?

Second question, edited: dumping a large amount of unchanging data into a database row every minute seems wasteful. It also makes it hard to dynamically select which elements get logged. It seems much more appropriate to make the "logger" event-based rather than interval-based, which pretty much means the first question is the important one here, since it will likely also become the state-checking mechanism.

I suspect I am needlessly reinventing the wheel with this realization, since it starts to look a lot like existing loggers. Reading around SO shows this is the old question of logging to a database versus the filesystem. Since the log itself is the core of the application, I lean toward logging to a database, most likely SQLite, given what I have read.

Second question, edited again: now a normalization question. Everything I read suggests that "scalability" tends to demand denormalization. My logged events table will be fairly simple: timestamp, value, and element id. Should I also denormalize the most commonly used attributes from the elements table into it, or is a join fine at this relatively small scale?
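For scale, the join in question is something like the following (assuming Rails 3-style ActiveRecord, Event belongs_to :element, and element being the Element currently graphed); SQLite should cope with this comfortably at these row counts:

    # One element's logged values for the last day, joining to elements for
    # the display attributes instead of denormalizing them into events.
    Event.joins(:element).
          where(:element_id => element.id).
          where("events.created_at > ?", 1.day.ago).
          select("events.created_at, events.value, elements.name, elements.eng_unit")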

Does anyone have any favorite Ruby gems / plugins / frameworks for this kind of thing?

2 answers

I don't think this question has been answered elsewhere :) You may be the only person on SO using Rails and Modbus. I use Rails and Modbus too, except my Modbus experience is with nmodbus on the .NET Compact Framework. I'm not sure I have specific answers for you, but I can share what has worked for me.

When we poll a device, we immediately apply any parsing, scaling, or conversion to the data (though we don't have 1000 values). Only then is the data written to the database. Any clients that consume the database know nothing about Modbus and don't care; the data goes from Modbus-level register values to something the application actually cares about (volts!). In your scenario I would try to completely separate that polling application from your Rails application.
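As a hypothetical illustration (the register addresses and scale factors below are made up), the poller converts raw register counts into engineering units before anything is written, so database consumers never see Modbus at all:

    # Made-up scaling table: raw register counts in, engineering units out.
    SCALE_FACTORS = { 40001 => 0.1, 40002 => 0.01 }

    def to_engineering_units(address, raw_value)
      raw_value * SCALE_FACTORS.fetch(address, 1.0)   # e.g. 2375 => 237.5 volts
    end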

Now, why that may not be feasible: 1000 data points. That is a problem. For argument's sake, even if you could normalize that data into 50 tables, that's 50 tables with 20 columns each... yuck. I don't know how easy logging 1000 data points will be, but storing what is essentially one big hash table may not work.

The downside: by requiring Rails to know how to parse the register values, your Rails application now has Modbus knowledge (not at the register-reading level, but at the parsing level). And if you ever want a client other than Rails, that application needs the parsing knowledge as well. Then again, maybe that is the whole point of your Rails application: it knows how to slice up and decode Modbus readings and gives users/clients nice UIs/web services to work with.

These are good questions, but it's hard to give specific advice without knowing more about exactly what you're building. As for normalization: trial and error. Do it both ways. I can't say things like "if you have more than 40 columns in your sqlite table it's going to fall over"; those are things you just have to run and see...


In response to the OP's question #2: instead of using fixed 64-word blocks, I used a simple algorithm to work out the optimal set of read requests.

The algorithm is pretty simple: start at address 1 (assuming you are polling a PLC) and check whether that register needs to be polled. If it does, use it as the starting address and extend the length up to 125 (the Modbus maximum per read). If not, move on to the next item, check whether it needs polling, and repeat the process.
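One way to express the idea in Ruby (assuming you already have a sorted array of the register addresses that need polling, and that reading a few unwanted registers inside a block is cheaper than issuing extra requests):

    MAX_BLOCK = 125  # Modbus maximum number of registers per read request

    # Collapse a sorted list of wanted addresses into [start, count] requests,
    # stretching each block as long as its span stays within MAX_BLOCK.
    def build_read_requests(addresses_to_poll)
      requests = []
      addresses_to_poll.each do |addr|
        last = requests.last
        if last && addr - last[0] < MAX_BLOCK
          last[1] = addr - last[0] + 1   # extend the current block to cover addr
        else
          requests << [addr, 1]          # start a new block at addr
        end
      end
      requests
    end

    # build_read_requests([1, 3, 7, 200]) => [[1, 7], [200, 1]]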

