Exchange Rate End-to-End Data Flow Project
In this project, we stream live exchange rates from an API through Apache NiFi to a Slack application and to a SQL table viewed in DBeaver.
The project starts by bringing up Apache NiFi with Docker Compose, which creates the container the flow runs in. After checking that the NiFi web UI is reachable on port 8080, the data flow can be built.
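A minimal sketch of the docker-compose.yml used to bring NiFi up; the image tag and service name are assumptions, and the project's actual file may differ:

```yaml
version: "3"
services:
  nifi:
    image: apache/nifi:1.19.1    # image tag is an assumption
    container_name: nifi
    ports:
      - "8080:8080"              # expose the NiFi web UI on port 8080
    environment:
      - NIFI_WEB_HTTP_PORT=8080  # serve the UI over plain HTTP on 8080
```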
The data comes in over HTTP, so I use InvokeHTTP to start pulling the live-stream data. Since I want more than one currency, I use a Funnel to merge the USD-TRY, EUR-TRY, and GBP-TRY exchange rates into a single flow.
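A sketch of one InvokeHTTP processor (one per currency pair, all routed into the same Funnel); the exchange-rate provider is not named above, so the Remote URL and polling interval below are placeholders:

```
HTTP Method  : GET
Remote URL   : https://api.example-rates.com/latest?base=USD&symbols=TRY   # placeholder URL
Run Schedule : 30 sec                                                      # assumed polling interval
```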
In the next step, I added JoltTransformJSON to simplify the data, which arrives as a JSON list: I removed unused keys and renamed the remaining ones.
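For illustration, a JOLT shift spec of this kind, assuming each element of the list has `base`, `rates.TRY`, and `timestamp` fields (the real API fields are not shown above); it keeps only those values and renames them to `currency`, `rate`, and `fetched_at`:

```json
[
  {
    "operation": "shift",
    "spec": {
      "*": {
        "base": "[&1].currency",
        "rates": {
          "TRY": "[&2].rate"
        },
        "timestamp": "[&1].fetched_at"
      }
    }
  }
]
```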
After that, I extracted the rate into a flowfile attribute with EvaluateJsonPath so it can be passed on to the Slack app. Then I added a RouteOnAttribute step to filter the results: the TRY rate against all three currencies has been above 20 since 2022, so I only route records whose rate is greater than 20.
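A sketch of those two processors, assuming the simplified JSON uses the `rate` and `currency` fields from the JOLT example (the attribute names are assumptions):

```
# EvaluateJsonPath (Destination: flowfile-attribute)
rate     : $.rate
currency : $.currency

# RouteOnAttribute (dynamic property defining the matching route)
high_rate : ${rate:gt(20)}
```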
Create a Slack Channel:
Slack is a free communication application that I use in this project to send instant currency notifications to users. I created a Slack channel and send the exchange rates to it as they arrive. Depending on the project, the notification frequency can be adjusted.
A webhook URL is generated for the Slack channel, and I add it to the PutSlack processor to connect the two sides. The Webhook Text property can be configured to control the naming and ordering of the message contents.
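A sketch of the PutSlack configuration; the webhook URL is a placeholder, and the message template uses the `currency` and `rate` attributes assumed earlier:

```
Webhook URL  : https://hooks.slack.com/services/T000/B000/XXXX   # placeholder webhook
Webhook Text : ${currency} is now ${rate}
```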
Google Cloud Process:
After checking that every step works, the next stage is to create a PostgreSQL database on Google Cloud, transfer the data into it, and view it with DBeaver. I create the SQL table, assign the connection and user, and move the data into it through NiFi's ConvertJSONToSQL processor.
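As an illustration, a hypothetical layout for that table (the actual column names and types are not given here):

```sql
CREATE TABLE exchange_rates (
    id         SERIAL PRIMARY KEY,
    currency   VARCHAR(16),            -- e.g. 'USD-TRY'
    rate       NUMERIC(12, 4),
    fetched_at TIMESTAMP DEFAULT NOW()
);
```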
The next step is to convert the JSON records properly into SQL statements. The JDBC Connection Pool is key to making this conversion work correctly, so make sure the Database Connection URL, Database Driver Class Name (org.postgresql.Driver), and Database Driver Location are all correct.
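A sketch of the DBCPConnectionPool controller service settings, with a placeholder Cloud SQL host, database name, and driver path:

```
Database Connection URL    : jdbc:postgresql://<cloud-sql-host>:5432/exchange_rates   # placeholder host/db
Database Driver Class Name : org.postgresql.Driver
Database Driver Location   : /opt/nifi/drivers/postgresql-42.6.0.jar                  # assumed jar path
Database User              : <user>
Password                   : <password>
```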
The last part is executing the generated statements with PutSQL and viewing the results in DBeaver. In DBeaver, I create a PostgreSQL project and match it to the Cloud database using the localhost address, user, and password.
For ConvertJSONToSQL's JDBC Connection Pool, I created a second PostgreSQL database and set the Database Connection URL to the same URL configured in the DBCPConnectionPool service.
I set the Statement Type to INSERT so that new rows are added to the table.
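With Statement Type set to INSERT, the flow ends up executing statements roughly like the one below against the hypothetical exchange_rates table sketched above (the values are illustrative):

```sql
INSERT INTO exchange_rates (currency, rate) VALUES ('USD-TRY', 27.3451);
```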
As the final step, I start the flow in NiFi, and the data streams to the Slack application and to the PostgreSQL database in DBeaver at the same time.
Thank you for your interest!