Big Data has become a popular phenomenon and has gained momentum over the last decade. The reason behind this revolutionary concept is its capacity to deliver valuable insights for many real-world applications. With the emergence of this paradigm arrives not only an enormous amount of available data, but also the question of its speed: real-world applications generate data in real time, faster than traditional systems can process it.
This constant flow of data is known as a data stream; examples include network traffic, sensor data, and call-centre records. The sheer volume and high speed of such streams pose a great challenge to the mining community. Data streams exhibit several characteristic properties: enormous length, concept drift, concept evolution, and feature evolution. Concept drift occurs when the data distribution changes over time; concept evolution occurs when new classes emerge in the stream; feature evolution occurs when the feature set itself changes over time. Each of these properties adds a challenge to data stream mining.
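To make concept drift concrete, the following illustrative sketch (a hypothetical synthetic stream written with only the Python standard library, not any particular benchmark) simulates a stream in which the rule mapping features to labels flips abruptly halfway through:

```python
import random

def synthetic_stream(n=2000, drift_at=1000, seed=0):
    """Yield (t, x, y) triples; the labelling concept changes at `drift_at`."""
    rng = random.Random(seed)
    for t in range(n):
        x = rng.uniform(0.0, 1.0)
        # Concept 1: label is 1 when x > 0.5; after the drift, the rule flips.
        y = int(x > 0.5) if t < drift_at else int(x <= 0.5)
        yield t, x, y

# A model trained only on the first concept would be consistently wrong after
# the drift, which is why stream learners must detect and adapt to such changes.
before = [y for t, x, y in synthetic_stream() if t < 1000 and x > 0.5]
after = [y for t, x, y in synthetic_stream() if t >= 1000 and x > 0.5]
print(sum(before) / len(before))  # fraction labelled 1 before the drift: 1.0
print(sum(after) / len(after))    # fraction labelled 1 after the drift: 0.0
```

This is an abrupt drift; real streams may also drift gradually or recurringly, which makes detection harder.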
Another significant facet of the Big Data paradigm is machine learning for data streams, also called real-time data mining, real-time analytics, or stream learning. Here, items arrive one after another, each with a timestamp and hence a temporal order, and the goal is to build and maintain models or predictors over these items in real time. Given current industry needs, several issues demand attention before existing methods can be applied efficiently to real-world problems. We must deal with very large and continuously growing datasets that may arrive persistently in batches, in contrast to traditional batch learning, where all past data are readily accessible. Traditional processing systems assume that the data are at rest and can be accessed repeatedly; database systems, for example, store huge amounts of data and let users run transactional queries. Batch processing models are routinely rebuilt from scratch. Incremental learning, as performed in stream learning, suits stream processing by continuously integrating new information into its models while keeping processing time and memory to a minimum, and it has gained wide recognition for enabling large-scale, real-time processing. Stream learning nevertheless imposes strict constraints: only a small batch of instances is provided to the learning algorithm at a time, processing time is very restricted, the amount of memory is tightly bounded, and a trained model must be available at every point in the stream.
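As a minimal sketch of the incremental-learning loop described above (a hand-rolled online perceptron on hypothetical data, not the API of any specific stream-mining library), the learner below sees each instance exactly once, updates its model in constant memory, and can serve predictions at any point in the stream:

```python
class OnlinePerceptron:
    """Toy incremental learner: one pass over the data, constant memory,
    and a usable model at every point in the stream."""

    def __init__(self, n_features, lr=0.1):
        self.w = [0.0] * n_features
        self.b = 0.0
        self.lr = lr

    def predict(self, x):
        s = self.b + sum(wi * xi for wi, xi in zip(self.w, x))
        return 1 if s > 0 else 0

    def learn_one(self, x, y):
        """Update on a single (x, y) instance, then discard it."""
        err = y - self.predict(x)
        if err:
            self.w = [wi + self.lr * err * xi for wi, xi in zip(self.w, x)]
            self.b += self.lr * err

# Hypothetical stream for the concept y = 1 iff x0 + x1 > 1.
model = OnlinePerceptron(n_features=2)
stream = [((0.9, 0.8), 1), ((0.1, 0.2), 0), ((0.7, 0.6), 1), ((0.2, 0.1), 0)] * 50
for x, y in stream:
    model.learn_one(x, y)  # the model stays usable before, during, and after this loop

print(model.predict((0.8, 0.9)))  # prints 1
print(model.predict((0.1, 0.1)))  # prints 0
```

A batch learner would instead store the whole stream and refit from scratch; the incremental learner touches each instance once and keeps only its weight vector, which is what makes the bounded-memory and bounded-time constraints attainable.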
On top of that, these data streams are sometimes affected by changes in their data distribution, forcing the system to learn under non-stationary conditions that cannot be forecast. Intrusion detection, intelligent user interfaces, loan recommendation, mobile phones, and traffic monitoring and management are examples of real-world stream learning applications. The Internet of Things (IoT) has become one of the most significant of these, as it constantly processes huge amounts of data in real time: since the IoT connects sensors and actuators to computing systems over networks, those systems can track and manage the health and actions of connected objects or machines in real time. Stream data analysis helps people and organizations react instantly when something unexpected happens, a disruption arises, or a new trend suddenly emerges, by extracting useful knowledge from what is happening at each moment.
Data stream mining is currently a topic of broad interest in machine learning. The field gains momentum every day, as real-world applications increasingly demand real-time action. It is full of difficult challenges and calls for fresh ideas and developments. It is not only one of the fastest-growing research lines in the technological sphere, but also an essential skill for machine learning professionals. Learning to adapt ensembles to the challenges of stream data is crucial for organizations to make productive use of Big Data.