Terminology of Data Analytics


Here, we will take a look at the terminology of Data Analytics: the popular terms often used while discussing Data Analytics, and what they actually mean. 

Each term serves a distinctive purpose, and understanding each of them properly, along with their differences, plays an integral part in properly understanding the process of data analysis.

Business Intelligence (BI) 

Business Intelligence refers to the process of extracting insights and valuable information from past business data without using machine learning techniques.

Business Intelligence is typically performed by examining data retroactively and drawing conclusions from past events recorded in that data.

Many consider Business Intelligence the lesser, more obsolete sibling of data analytics (it is indeed an older term, practiced long before modern analytics emerged), but a more accurate description is that Business Intelligence is a component of the broader field of analytics.

Data Science

The term “Data Science” is often used interchangeably with both “Data Analytics” and “Business Intelligence”.

As opposed to Business Intelligence, Data Science only became popular much more recently, around 2015-2016. It refers to the extraction of information from data using statistical techniques, data analysis methods, and Machine Learning.

Big Data

Big Data has certainly been a popular buzzword for the past half decade or so, but it can be quite confusing for many people because it is often used interchangeably with analytics and other terms.

Big Data, simply put, refers to any data that is “big” based on three main factors known as the 3Vs: 

  • Volume: the sheer amount of data, for example a million images, each 1 MB in size.
  • Variety: different types of data arriving at the same time in different formats, sizes, and so on.
  • Velocity: how fast the data is arriving from many different sources.

Big Data can come in the form of structured data (e.g. a very large database of customer data, properly structured into different fields), unstructured data (data coming from social media in the form of photos, videos, texts, and so on), or a combination of both (semi-structured data).

Big Data Analytics

Big Data Analytics is any analytics dealing with Big Data, since traditional analytics tools and methods are not designed to handle:

  • Massive, unstructured data sets
  • Data arriving at very high velocity
  • Data from many different sources in many different formats (high variety)

Nowadays, when we use the term “Data Analytics”, it usually refers to “Big Data Analytics”. Data analytics can certainly deal with data sets that don’t qualify as Big Data, and in that case it is improper to call it Big Data Analytics.

Descriptive Analytics

Descriptive Analytics is often deemed the simplest form of data analytics, and can only be used to analyze data from the past. 

In a nutshell, Descriptive Analytics is the process of interpreting historical data to extract information, understand patterns, and find changes that have occurred in the past. 

Descriptive Analytics often uses historical data to make comparisons, for example month-over-month sales growth, total revenue per subscriber year over year, and so on.
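As a toy illustration, a month-over-month comparison like the one above can be sketched in a few lines of Python (the revenue figures are hypothetical):

```python
# A minimal sketch of a Descriptive Analytics comparison: month-over-month
# sales growth computed from a list of hypothetical monthly revenue figures.

monthly_sales = [12000, 12600, 11900, 13500]  # hypothetical revenue per month

def month_over_month_growth(sales):
    """Return each month's percentage growth relative to the previous month."""
    return [
        round((curr - prev) / prev * 100, 1)
        for prev, curr in zip(sales, sales[1:])
    ]

print(month_over_month_growth(monthly_sales))  # prints [5.0, -5.6, 13.4]
```

Note that this only describes what already happened; it makes no claim about future months.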

In short, Descriptive Analytics measures and describes what happened during a set period of time. While limited in scope and relatively easy to implement (and therefore sometimes dismissed as trivial), Descriptive Analytics is still a very powerful tool for businesses.

Predictive Analytics

As opposed to Descriptive Analytics that ‘only’ describes information from the past, Predictive Analytics refers to the process of using advanced modeling and statistical methods to predict future values and behavior based on past and present data. 

In its most basic form, Predictive Analytics finds patterns in historical data and correlations between variables to determine whether those patterns are likely to recur in the future.
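This core idea can be sketched as fitting a simple trend to past values and extrapolating it one step ahead. This is a deliberately minimal illustration with hypothetical numbers; real predictive models are far more sophisticated:

```python
# A minimal sketch of Predictive Analytics: fit a straight-line trend to
# historical values with ordinary least squares, then extrapolate it one
# step into the future.

def fit_line(ys):
    """Least-squares fit of y = a + b*x for x = 0, 1, 2, ..."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

history = [100, 110, 118, 131]   # hypothetical past observations
a, b = fit_line(history)
forecast = a + b * len(history)  # predicted next value
print(round(forecast, 1))        # prints 140.0
```

The prediction is only as good as the assumption that the past pattern continues, which is exactly why such forecasts come with a likelihood, not a guarantee.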

Predictive Analytics is mainly implemented in businesses as a decision-making tool. By predicting future events and the likelihood of them happening, business stakeholders can make better-informed decisions.

Prescriptive Analytics

Prescriptive Analytics is a fairly new concept, only invented in the early 2000s. However, it has gained significant popularity in this past decade, and has grown into a very powerful tool for many enterprises all around the globe.

Prescriptive Analytics is, in essence, a more advanced, faster form of Predictive Analytics that can predict outcomes in the very near future, often just a few minutes or seconds ahead, so it can suggest a course of action in real time.

Prescriptive Analytics analyzes information from historical data, current performance, and available resources, among other inputs, to factor in possible situations that might occur in real time.

With this, Prescriptive Analytics can be used to assist immediate decision making, as well as long-term decision making. 

Prescriptive Analytics is only possible with the assistance of artificial intelligence technologies and techniques, such as machine learning and deep learning, which can compute and analyze Big Data much faster than humans and traditional computers.

Artificial Intelligence

Artificial Intelligence, or AI, is a blanket term to describe machines and computers that are able to do tasks that would normally require human control or supervision.

It’s important to understand that there are two different types of AI:

  • Artificial Narrow Intelligence (ANI): AI that is focused on one single task, and can only perform that task.
  • Artificial General Intelligence (AGI): AI that can perform several different tasks just as a human could, and can decide on its own which task is most effective.

Today, all AI are ANI, and we still need many breakthroughs in technology before we can truly implement AGI.

An ANI designed for Prescriptive Analytics, for example, cannot use its own predictions to replace humans, make its own decisions, and take action. It can only perform its single, narrow task: analyzing data and providing suggestions.

Machine Learning

Machine Learning has been a major buzzword for the past couple of years or so, but it is actually a very old concept that originated alongside the invention of the computer itself.

Machine Learning is the concept that a computer or a program can ‘learn’ from historical, labeled data and adapt by reprogramming itself without any interference from human programmers. The idea is that the algorithms of the computer and the AI stay up to date with changes in the data, trends, and even the worldwide economy.

Machine Learning is especially useful in performing Prescriptive Analytics—as discussed above. For example, an AI in an automated vehicle can teach itself about new hazards and obstacles, simulate future possibilities, learn from actual implications, and reprogram itself.
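The “learning from labeled data” idea can be sketched as a toy model that adjusts itself to fit example input/label pairs, with no human reprogramming. Everything below (the data, the learning rate, the single-weight model) is illustrative, not a real ML system:

```python
# A toy sketch of machine learning: a single-weight model repeatedly
# adjusts itself (via gradient descent) to fit labeled examples.

examples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # hypothetical (input, label) pairs

w = 0.0                       # the model's single learnable weight
lr = 0.05                     # learning rate: how big each adjustment is
for _ in range(200):          # repeated passes over the labeled data
    for x, y in examples:
        error = w * x - y     # how wrong the current model is on this example
        w -= lr * error * x   # nudge the weight to reduce that error

print(round(w, 2))            # prints 2.0
```

The model effectively “teaches itself” that the labels follow y ≈ 2·x; if the data later changed, rerunning the same loop would adapt the weight accordingly, which is the essence of the idea described above.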

Deep Learning

Deep Learning can be considered a subfield of Machine Learning. The invention and implementation of Deep Learning is what actually triggered much of the recent hype surrounding AI, Big Data, and Data Analytics.

The main concept of Deep Learning is still similar to Machine Learning: the system teaches itself using historical data. The main difference is that Deep Learning uses Artificial Neural Networks to learn more “naturally”, in a way loosely resembling the human nervous system.

This allows the AI to outperform humans in certain tasks (although it can still only perform a single specialized task).

Functions like voice recognition (like Google Assistant and Alexa), image recognition (like your iPhone’s FaceID), and video recognition in automated vehicles, are all implementations of Deep Learning and advanced Data Analytics.

Artificial Neural Network

An Artificial Neural Network (ANN) is a technology initially inspired by the human brain and nervous system, but it’s important to note that its actual mechanisms are very different from those of our neural system.

ANN is often used interchangeably with the terms Deep Learning and sometimes, Machine Learning. 

In a nutshell, artificial neurons can transmit signals to each other and thereby scale up the computation, so that many different neurons work together to achieve a single task.
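A single artificial neuron can be sketched as a weighted sum of incoming signals passed through an activation function, and wiring a few of them together gives a tiny network. The weights and inputs below are purely illustrative:

```python
# A minimal sketch of artificial neurons: each one weighs its incoming
# signals and "fires" through a sigmoid activation function.

import math

def neuron(inputs, weights, bias):
    """Weighted sum of incoming signals, squashed to (0, 1) by a sigmoid."""
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))

# Two hidden neurons feeding one output neuron: a tiny two-layer network.
x = [0.5, -0.2]                          # illustrative input signals
h1 = neuron(x, [0.8, 0.1], 0.0)          # hidden neuron 1
h2 = neuron(x, [-0.3, 0.9], 0.1)         # hidden neuron 2
out = neuron([h1, h2], [1.0, -1.0], 0.0) # output neuron combines both
print(out)
```

In Deep Learning, networks like this have many layers and millions of weights, and the weights themselves are learned from data rather than set by hand as here.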

End Words

While there are certainly other terms that might coincide with the term “Data Analytics” and its implementations, the ones we have discussed above are certainly the most important ones.

Understanding them is very important in properly grasping the concept of data analytics, and especially how to implement it.
