Connecting To The Outside World with Perl and Database Events
Written by Nikos Vaggalis   
Monday, 19 November 2018

There are times when I want my database to be able to talk to the outside world, rather than the other way around. Let's explore some scenarios where this ability would come in handy before implementing a solution using the Ingres RDBMS and Perl. You can see the result in action, and the code is available so you can replicate it with any RDBMS.

For example, the database ought to alert an external entity:

A. in a Drugstore, whenever an item's stock drops below a fail-safe limit

B. in a Post Office, whenever an item arrives and its barcode is scanned

C. in a Hospital, whenever a patient is diagnosed with a condition

D. in a Hospital, whenever a patient is admitted into the Hospital or is prescribed medicine

All these business logic requirements are mapped into equivalent operations inside the database:

for A - an item's quantity is decreased

for B - an item is inserted into the inventory

for C - a patient is given a new diagnosis (thus inserting a new record into the database), or the status of an existing diagnosis is updated (thus updating an existing record)

for D - the details of the patient's visit end up as records in the hospital's database

The Business Logic response to these operations could be:

for A - to make an automated order

for B - to notify the customer awaiting an order

for C - to notify the patient's insurance

for D - to update the Nationwide statistics on patient admissions (or even drug consumption)

These responses could be realized in Application Logic in any of the following ways (a minimal sketch of the first two follows the list):

send an Email

call a Webhook

call a Web service

add a message to a broker queue like RabbitMQ 
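
To make the last point concrete, here is a minimal Perl sketch of the first two options, sending an email and calling a webhook; the addresses, URL and payload fields are hypothetical placeholders rather than anything used in the system described later.

use strict;
use warnings;
use MIME::Lite;    # compose and send the email alert
use HTTP::Tiny;    # perform the webhook HTTP call
use JSON::PP;      # encode the webhook payload

# Email alert: an item's stock has dropped below its fail-safe limit
# (addresses and wording are placeholders).
MIME::Lite->new(
    From    => 'inventory@drugstore.example',
    To      => 'purchasing@drugstore.example',
    Subject => 'Stock below fail-safe limit',
    Data    => "Item 12345 has dropped below its fail-safe quantity.\n",
)->send;

# Webhook call carrying the same information as JSON
# (URL and field names are placeholders).
my $response = HTTP::Tiny->new->post(
    'https://hooks.example.com/stock-alert',
    {
        headers => { 'Content-Type' => 'application/json' },
        content => encode_json( { item_id => 12345, quantity => 3 } ),
    },
);
warn "Webhook call failed: $response->{status}\n" unless $response->{success};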

In simple terms, the data flow can be described as:

Physical Event (an item's stock drops below a fail-safe limit) --> Database Event (raised when the item's quantity is decreased) --> Business Logic response (make an automated order) --> Application Logic action (send an email)

 

The Use Case

The use case acting as the proof of concept is based on scenario D where, at different points in time, hospital clerks collect the details of the patient's visit and register them in the system through a GUI application/data entry form. This data then ends up as rows in several tables in the database. The Ingres RDBMS handles the database part, while Perl handles the application part, in charge of gluing the database to the outside world by extracting this data, transforming it into an HL7 message and sending it over to a Web service endpoint.

That would be the Health Service's Business Analytics endpoint, which would consume the HL7 message in order to update its cumulative Nationwide stats on patient admissions (or even drug consumption). For reasons of brevity, though, I'm going to encapsulate that HL7 message into an XML packet and forward it to Gazelle's SOAP XML HL7 Validation service instead.

Thus, broken down into distinct steps, the scenario looks like this:

A(1) the patient is admitted into the Hospital
A(2) the patient is prescribed medicine
A(3) the patient is diagnosed
A(4) the patient undergoes a surgical procedure
....
A(5) the patient completes the visit

 

B When the patient's visit comes to an end, the data from point A(1) to point A(5) is assembled from the database, ending up as an HL7 message 

C The message is sent to the BI Web Service 

The problem lies in how to bridge the gap between point A, saving the data through the GUI into the database, and point C, calling the Web Service. The usual way would be to run a batch job at some point, probably at the end of the day when there's little load on the Hospital's database, which would run the appropriate SQL queries to fetch the day's data, transform it into HL7 messages, encapsulate them into XML packets and finally send them over the wire to the API endpoint.
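
A batch job of that kind could look roughly like the Perl sketch below, run from cron at the end of the day; the DSN, credentials and the visit/visit_id/completed_at schema are hypothetical stand-ins for the real tables.

use strict;
use warnings;
use DBI;

# Connect to the hospital database (DSN and credentials are placeholders).
my $dbh = DBI->connect( 'dbi:Ingres:hospitaldb', 'batch', 'secret',
    { RaiseError => 1 } );

# Collect the primary keys of every visit completed today
# (table and column names are hypothetical).
my $visit_ids = $dbh->selectcol_arrayref(
    q{SELECT visit_id FROM visit WHERE completed_at >= DATE('today')}
);

for my $visit_id (@$visit_ids) {
    # Assemble the visit's data, build the HL7 message and send it,
    # just as the real-time path does (see the later sketches).
    process_visit( $dbh, $visit_id );
}

$dbh->disconnect;

sub process_visit {
    my ( $dbh, $visit_id ) = @_;
    # Placeholder for the fetch/transform/send steps shown further down.
}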

The other way, which we are going to implement, does the same but in real time. Using the DBMS's Database Events component, a chain of rules/triggers fires whenever new records appear in the database. In turn, a Perl daemon watching in the background is notified about the presence of fresh data and receives a small piece of it in the form of a table's Primary Key (PK).
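
What such a daemon might look like is sketched below, assuming DBD::Ingres and its non-DBI get_dbevent method; the event name, the hash-key layout of the returned event and the convention that the event text carries the PK are all assumptions made for illustration, and the hand-off script send_hl7.pl is hypothetical.

use strict;
use warnings;
use DBI;

# Connect with DBD::Ingres (DSN and credentials are placeholders).
my $dbh = DBI->connect( 'dbi:Ingres:hospitaldb', 'daemon', 'secret',
    { RaiseError => 1 } );

# Tell this session which database event it is interested in.
$dbh->do('REGISTER DBEVENT patient_visit_done');

while (1) {
    # Wait up to 30 seconds for an event (signature assumed from the
    # DBD::Ingres docs; returns undef when nothing has arrived).
    my $event = $dbh->func( 30, 'get_dbevent' );
    next unless $event;

    # The rule/procedure that raised the event put the visit's PK in the
    # event text (the key name is an assumption - check the driver's docs).
    my $visit_pk = $event->{evt_text};

    # Hand the PK over to the application-logic program (next sketch).
    system( 'perl', 'send_hl7.pl', $visit_pk );
}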

The Perl daemon's sole responsibility is to forward that PK to another Perl program that handles the application logic. This program uses the PK to re-query the database and fetch the full range of data, which is spread across several tables. It then shapes that aggregated data into an HL7 message, encapsulates it into another XML packet and finally calls the endpoint with that XML payload.  
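
A rough sketch of that program follows; the schema behind the query, the deliberately minimal HL7 segments, the SOAP envelope and the endpoint URL are all simplified placeholders, with the real message building and the Gazelle call covered on the next pages.

use strict;
use warnings;
use DBI;
use HTTP::Tiny;
use POSIX qw(strftime);

my $visit_pk = shift @ARGV or die "usage: $0 visit_pk\n";

# Re-query the database using the PK handed over by the daemon
# (DSN, table and column names are placeholders).
my $dbh = DBI->connect( 'dbi:Ingres:hospitaldb', 'app', 'secret',
    { RaiseError => 1 } );
my $visit = $dbh->selectrow_hashref(
    'SELECT patient_id, surname, firstname FROM visit WHERE visit_id = ?',
    undef, $visit_pk );

# Shape the data into a minimal HL7 v2 ADT message; a real message
# carries many more segments and fields.
my $ts  = strftime( '%Y%m%d%H%M%S', localtime );
my $hl7 = join "\r",
    "MSH|^~\\&|HIS|HOSPITAL|BI|HEALTHSVC|$ts||ADT^A01|$visit_pk|P|2.5",
    "PID|||$visit->{patient_id}||$visit->{surname}^$visit->{firstname}";

# Encapsulate the HL7 message in a SOAP/XML packet and post it
# (envelope and URL are simplified placeholders).
my $soap = <<"XML";
<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <validateDocument><document><![CDATA[$hl7]]></document></validateDocument>
  </soap:Body>
</soap:Envelope>
XML

my $res = HTTP::Tiny->new->post(
    'https://gazelle.example.org/validation',
    { headers => { 'Content-Type' => 'text/xml' }, content => $soap },
);
die "SOAP call failed: $res->{status} $res->{reason}\n" unless $res->{success};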

This way, albeit more difficult, is much cleaner, but at the same time it stresses the load capabilities of the endpoint. Netflix's and Amazon's handling of millions of concurrent requests proves that the technology exists, suggesting that a fraction of that load hitting the Health Service's servers should be no reason for concern.

If you want to see the result before going through the details, there's a video of the system in action on the final page of this article. All the code is available on GitHub - follow the links given on the final page - and can be used to replicate the system with any RDBMS, not just Ingres.


