I need to store records in a central "DB" from several embedded boards running Linux.
Currently, I write short local files of records, which I close every 15 minutes and send to a server via FTP. The server parses the received files and inserts the data into a DB. The server may be unreachable, so my boards keep the files on disk until they can be sent.
The approach works, but it depends on the FTP connection, and failures can leave a file deleted without ever being sent, or sent twice. There is some encoding in the file name that should avoid the duplicate problem, but I'd rather move to a more modern approach and use some kind of DB connection, re-established every time it goes down. In the meantime, the records should be cached locally.
I was thinking of something along the lines of isql on the embedded boards, with one thread that inserts records and another that selects from the same table and deletes the rows that were successfully sent. For the upstream connection I was thinking of MySQL or PostgreSQL with a library like OpenDBX to start a transaction and insert into the main DB. This way, the server can just keep a connection open on its own database to work on the data.
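To make the idea concrete, here is a minimal sketch of that local "outbox" queue, written in Python with the stdlib `sqlite3` module just for illustration (the table name, columns, and the `send` callback are all made up; on the real boards this could equally be C against a local DB, and `send` would be the upstream INSERT via the DB library):

```python
import sqlite3


class Outbox:
    """Local store-and-forward queue backed by SQLite (a sketch;
    schema and names are hypothetical)."""

    def __init__(self, path):
        # On a real board `path` would be a file on flash/disk so the
        # queue survives reboots; ":memory:" is only for demonstration.
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS outbox ("
            " id INTEGER PRIMARY KEY AUTOINCREMENT,"
            " payload TEXT NOT NULL)"
        )
        self.conn.commit()

    def enqueue(self, payload):
        # Called by the acquisition thread for every new record.
        # (Each thread should use its own sqlite3 connection in practice.)
        self.conn.execute("INSERT INTO outbox(payload) VALUES (?)", (payload,))
        self.conn.commit()

    def drain(self, send, batch=100):
        # Called periodically by the uploader thread: read a batch, try
        # to push it upstream, and delete the rows only after `send`
        # reports success, so a dropped connection never loses records.
        rows = self.conn.execute(
            "SELECT id, payload FROM outbox ORDER BY id LIMIT ?", (batch,)
        ).fetchall()
        if not rows:
            return 0
        if send([p for _, p in rows]):  # e.g. a transaction on the central DB
            self.conn.executemany(
                "DELETE FROM outbox WHERE id = ?", [(i,) for i, _ in rows]
            )
            self.conn.commit()
            return len(rows)
        return 0  # server down: keep the rows and retry later


# Demonstration with a fake upstream that always succeeds:
box = Outbox(":memory:")
box.enqueue("t=21.5")
box.enqueue("t=21.7")
sent = box.drain(send=lambda batch: True)
```

Note this gives at-least-once delivery: if the process dies between a successful `send` and the DELETE, the batch is re-sent on the next pass, so the server side still needs to tolerate duplicates (e.g. via a unique key per record).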
I could instead work the other way round, having the server download the data from the boards...
Any advice? Thank you