# The Easy World of Data Movement and Transformation with NiFi
Data is any fact, figure, or piece of information that we use, store, or communicate. In essence, data is everywhere: your click on your mobile screen, your request to Alexa to set an alarm, the cashbacks/rewards you have availed, and so on. While working with data we essentially have to do three steps:

- Extract data from the source __(E)__
- Transform it to the desired format __(T)__
- Load it to the desired destination __(L)__

![Image](//images.ctfassets.net/urdv1oztwvyo/2RFbqBCSbXplB8MyeahpF/93767529629bc3d349d28308e65c4ba4/etl.png)

([Source](https://docs.microsoft.com/en-us/azure/architecture/data-guide/relational-data/etl))

First, let us understand a few problems that one might encounter while working with data.

__1. The amount of data being collected__

With the abundance of data-intensive applications, the amount of data produced is huge. Applications collect data for every interaction and event. Manually working with these large datasets becomes tiresome, and in some cases impossible.

__2. Collecting data from multiple sources__

Different pieces of the required information are often present in different sources, and one has to combine data coming from multiple sources to get useful information. A lot of resources and infrastructure go into providing secure and reliable connections to these sources. Entire codebases and packages/libraries have to be written just to write to (produce) and read from (consume) another service in a reliable and resilient way.

__3. Different formats of data from different sources__

Each application stores and uses its data in its own customized way, so sharing data between different sources leads to discrepancies in data formats. Data has to be transformed on the route between source and destination.

__4. Lots of noise in the data__

Data that is not relevant to the use case, or that does not carry any significant information, is noise. Large datasets contain huge amounts of noise, and not all of the data is valuable to every application. Noise has to be removed to provide visibility into the data that matters.

__5. Writing complex and long code for data transformation__

The sheer volume and variety of data formats overwhelm the person working on them. One has to write a large amount of code to:

- Build a resilient and reliable data system
- Handle and transform different formats

Now that we have understood how painful it is to work with large amounts of non-uniform data, let us see what NiFi has in store to solve these problems. NiFi is built to automate the flow of data from source to destination easily and efficiently.

__NiFi is a powerful tool that can automate, schedule, and transform data, send alerts, and much more. It lets you create extensive data flows in an intuitive, easy-to-use format.__

![NIFI](//images.ctfassets.net/urdv1oztwvyo/44H1VTeqOuC7yWAN2JSjDq/2c986343fbc4c5ba7c96342e8b718e62/NIFI.png)

__1. Built to handle large amounts of data__

There is no restriction on data size when working with NiFi (only our infrastructure needs to be configured to handle large volumes of data). NiFi is designed to move large amounts of data, and it also provides features like load balancing and prioritized queuing while moving data from one processor to another.

__2. Provides a large number of processors__

NiFi ships with 288 processors (as of version 1.12.1) that perform different sets of transformations, making it a powerful tool that can handle almost all data transformation use cases. From picking up a file from a local system to dumping data into Azure buckets, it is all covered by NiFi.

__3. Easy to build, and easy to debug__

NiFi provides an easy-to-use interface with simple drag-and-drop features to build data flows. A data transformation that could take hundreds of lines of code can be built with a couple of processors, without writing any code. For example, a simple validation of each row of a CSV file can be done with 3 processors (the hand-written equivalent is sketched after the flow below):

- Picking up the data from the source
- Validating the content of the CSV file using the CSVValidator processor
- Dumping it into the destination

![csv-validate-flow](//images.ctfassets.net/urdv1oztwvyo/4zriOqouoIWujrqVdsr5X2/b65612acd3f4ac432ff24e041eb7704a/csv-validate-flow.png)
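For contrast, here is a rough sketch of what the same pick-up, validate, and deliver logic looks like when written by hand in Python. The directory paths, expected columns, and validation rules are illustrative assumptions, not part of the actual flow; in NiFi all of this is replaced by three configured processors.

```python
# A minimal, hand-rolled equivalent of the three-processor NiFi flow above.
# Paths, column names, and validation rules are illustrative assumptions.
import csv
from pathlib import Path

SOURCE_DIR = Path("/data/incoming")   # stand-in for the "pick up from source" step
VALID_DIR = Path("/data/valid")       # stand-in for the "dump into destination" step
INVALID_DIR = Path("/data/invalid")   # rows that fail validation are routed here
EXPECTED_COLUMNS = ["txn_id", "amount", "status"]  # hypothetical schema


def is_valid(row: dict) -> bool:
    """Validate one CSV row: no missing fields and a numeric amount."""
    if any(row.get(col) in (None, "") for col in EXPECTED_COLUMNS):
        return False
    try:
        float(row["amount"])
    except ValueError:
        return False
    return True


def process_file(path: Path) -> None:
    valid_rows, invalid_rows = [], []
    with path.open(newline="") as f:
        for row in csv.DictReader(f):
            (valid_rows if is_valid(row) else invalid_rows).append(row)

    # Write each bucket out, mirroring NiFi's "valid"/"invalid" relationships.
    for rows, target in ((valid_rows, VALID_DIR), (invalid_rows, INVALID_DIR)):
        if rows:
            target.mkdir(parents=True, exist_ok=True)
            with (target / path.name).open("w", newline="") as out:
                writer = csv.DictWriter(out, fieldnames=EXPECTED_COLUMNS,
                                        extrasaction="ignore")
                writer.writeheader()
                writer.writerows(rows)


if __name__ == "__main__":
    for csv_file in SOURCE_DIR.glob("*.csv"):
        process_file(csv_file)
```

Even this simplified version says nothing about retries, scheduling, back-pressure, or monitoring, all of which a production pipeline needs and which NiFi's framework provides out of the box. That is where the "hundreds of lines" quickly come from.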
Back in NiFi, every transformation is isolated in its own processor, which makes it possible to debug one part of the system without restarting or stopping the complete data flow. The content handed from one processor to another can be inspected in the connecting queue.

![queue-nifi (1)](//images.ctfassets.net/urdv1oztwvyo/6JzF5YErFcCF3QGrQSmrXa/72d94e3208b067d86ed71911e436bbab/queue-nifi__1_.png)

__4. Easy to connect with multiple sources__

NiFi provides a number of processors and controller services to connect with a large variety of data sources. Instead of writing code and pulling in libraries to build secure and robust connections to multiple third-party systems, NiFi gives us ready-to-use processors: we just enter the basic connection configuration and, voilà, the connection is created securely and successfully.

__5. Provides a secure system__

NiFi secures the flow of data at every step. It uses 2-way SSL for all machine-to-machine communication, hides the sensitive properties of any processor on its own, and also provides features for multi-user access.

### How we are using NiFi at Nuclei

__1. Data in Nuclei__

Nuclei deals heavily with transaction data coming from the merchant SDK. Reconciliation data between multiple parties (banks, merchants, Nuclei) is produced every day and has to be managed and sent to different destinations in different formats. For some banks, payments are supported by a file-based system, leading to the generation of payment files on a daily basis. There is also user interaction data extracted from user activity on the Nuclei SDK inside the bank's application, and more. All of these are large datasets that have to be stored, managed, transformed, and run through ETL on Nuclei's premises.

__2. NiFi Deployment__

NiFi is deployed on Nuclei's premises on Kubernetes with the help of the [cetic helm chart](https://github.com/cetic/helm-nifi). Deploying on Kubernetes provides the advantages of:

- Reduced deployment complexity and time, enabling streamlined delivery of updates
- Increased scalability and accessibility

__3. NiFi Registry__

NiFi Registry is an add-on application that provides a central location for the storage and management of NiFi's shared resources. Nuclei uses NiFi Registry to store all the data flows developed on the NiFi canvas. It acts like a Git repository for all pipelines and helps with version control of the data flows. Separate registries are maintained for different development environments.
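As a small illustration of how these versioned flows can be inspected programmatically, the sketch below lists the buckets and flows stored in a Registry instance over its REST API. The Registry URL is a placeholder for an internal deployment, and the snippet assumes anonymous access; a secured setup would additionally need client certificates or tokens.

```python
# Minimal sketch: list buckets and their versioned flows from a NiFi Registry
# instance via its REST API. REGISTRY_URL is a placeholder, not a real endpoint.
import requests

REGISTRY_URL = "http://nifi-registry.internal:18080/nifi-registry-api"


def list_flows() -> None:
    # Each bucket groups related flows (e.g. one bucket per environment or team).
    buckets = requests.get(f"{REGISTRY_URL}/buckets", timeout=10).json()
    for bucket in buckets:
        print(f"Bucket: {bucket['name']}")
        flows = requests.get(
            f"{REGISTRY_URL}/buckets/{bucket['identifier']}/flows", timeout=10
        ).json()
        for flow in flows:
            # versionCount reports how many committed versions exist for the flow.
            print(f"  Flow: {flow['name']} (versions: {flow.get('versionCount')})")


if __name__ == "__main__":
    list_flows()
```

In day-to-day use the same information is visible directly in the Registry UI; scripting against the API mainly helps with automation, such as promoting flows between the environment-specific registries mentioned above.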