Whether it’s a hurricane that creates a need to reroute products to stores where they are most needed or it’s someone missing a flight and getting rerouted by the airline’s reservation system, there are major opportunities for companies to achieve better business results by using data analytics in their daily operations. The world is unpredictable and for a company to thrive in its own ever-changing market, it is important for the company’s operations to have agility and resilience – two characteristics that can be strongly enhanced through the use of analytics.
When looking at analytics, it’s useful to consider the important role that relational database management systems play in the process, including how the database provides a foundation for the full spectrum of analytics, from traditional analytics to the more modern variants making use of machine learning (ML) and artificial intelligence (AI). Understanding the proper application of the different types of analytics in your business, from descriptive through prescriptive, can help you to maximize business performance and capitalize on new business opportunities. It can also guide you in choosing a proper database for your business analytical needs. I’ll touch on that more later.
Categories of data analytics
Over the years, the analytics world has created useful (and logical) descriptions for the different types of analytics that are used in typical business scenarios. Four major types of analytical systems play different roles, whether separately or in combination, in a business’s overall IT infrastructure:
- Descriptive analytics — what has happened or what is happening?
- Diagnostic analytics — why did it happen?
- Predictive analytics — what will happen?
- Prescriptive analytics — what should you do about it?
This last category is what is driving analytics to become truly operational. Prescriptive analytics can tell you what you can or should do in reaction to the insights you have gained from the other types of analytics, and can be used to drive the execution of the recommended action or decision automatically.
A company might begin its analytics journey in a traditional way, using descriptive analytics to learn what has happened historically, whether over the last hour or the last 10 years, or, more ambitiously, what is happening right now, by looking at current data in the context of historical data.
Analytical maturity continues into diagnostic analytics: trying to figure out why something happened, based on all the historical data. Once the company has established these two pieces, it can develop predictions, asking, “if that has happened in the past, and these are my current conditions, what is likely to happen now or in the near future?” Then comes prescriptive analytics, which is where things get interesting: taking all the data, information, and insights into consideration to determine the best course of action to maximize or minimize some result.
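To make the first two categories concrete, here is a minimal PostgreSQL sketch. The sales table, its columns, and the time windows are hypothetical, but the pattern is typical: a descriptive query summarizes what happened, while a diagnostic query slices the same data to suggest why.

```sql
-- Hypothetical sales table used only for illustration.
CREATE TABLE sales (
    sale_id      bigserial PRIMARY KEY,
    region       text,
    product_line text,
    amount       numeric,
    sold_at      timestamptz
);

-- Descriptive: what has happened? Monthly revenue by region.
SELECT region,
       date_trunc('month', sold_at) AS month,
       sum(amount)                  AS revenue
FROM   sales
WHERE  sold_at >= now() - interval '3 months'
GROUP  BY region, month
ORDER  BY region, month;

-- Diagnostic: why did it happen? Compare this quarter with the same
-- quarter last year, by product line, to see what drove the change.
SELECT product_line,
       sum(amount) FILTER (WHERE sold_at >= now() - interval '3 months')
           AS this_quarter,
       sum(amount) FILTER (WHERE sold_at >= now() - interval '15 months'
                             AND sold_at <  now() - interval '12 months')
           AS same_quarter_last_year
FROM   sales
GROUP  BY product_line;
```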
The mechanics of analytics
The flow of data is important in this process and this evolution. Data is often extracted from online transaction processing (OLTP) systems and put into online analytical processing (OLAP) systems, whether a data warehouse or a data lake or, even more recently, a data lakehouse. It is in those analytics systems that models are developed from historical data and new insights are generated.
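As a rough sketch of that extraction step, PostgreSQL’s postgres_fdw extension can pull rows from a remote OLTP database directly into an analytics database. The server address, credentials, and table layout below are hypothetical:

```sql
-- On the analytics database: reach into the OLTP system via postgres_fdw.
CREATE EXTENSION IF NOT EXISTS postgres_fdw;

CREATE SERVER oltp_server
    FOREIGN DATA WRAPPER postgres_fdw
    OPTIONS (host 'oltp.example.com', dbname 'orders_db');  -- hypothetical host

CREATE USER MAPPING FOR CURRENT_USER
    SERVER oltp_server
    OPTIONS (user 'etl_user', password 'secret');           -- hypothetical credentials

CREATE FOREIGN TABLE oltp_orders (
    order_id   bigint,
    amount     numeric,
    ordered_at timestamptz
) SERVER oltp_server OPTIONS (schema_name 'public', table_name 'orders');

-- Local warehouse table receiving the extract.
CREATE TABLE IF NOT EXISTS warehouse_orders (
    order_id   bigint,
    amount     numeric,
    ordered_at timestamptz
);

-- One step of the extract: copy yesterday's transactions into the warehouse.
INSERT INTO warehouse_orders
SELECT order_id, amount, ordered_at
FROM   oltp_orders
WHERE  ordered_at >= current_date - 1
AND    ordered_at <  current_date;
```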
Companies that are sufficiently adept at using analytics to their best effect may put those models into decision-making systems, which are sometimes the original OLTP systems, in order to drive action based on the insights. There may be multiple steps and stages of data movement, extraction, and analysis, all of which add delay and introduce chances for errors and missed opportunities.
Organizations should strive to have their analytics done as quickly as possible, with minimal movement of data. Those analytics can be done in the context of both current and historical data, using a lambda architectural approach if necessary, to make the insights more accurate and timely, and therefore more useful, allowing them to drive the company’s actions to best effect.
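One simple way to approximate that combined current-plus-historical view in PostgreSQL, with minimal data movement, is a view that unions a live table with an archive table. The table and column names here are hypothetical:

```sql
-- Hypothetical live OLTP table and an archive table of the same shape.
CREATE TABLE orders_current (
    order_id    bigint,
    customer_id bigint,
    amount      numeric,
    ordered_at  timestamptz
);
CREATE TABLE orders_history (LIKE orders_current);

-- A single view lets analytics see current and historical rows together,
-- with no extract step in between.
CREATE VIEW orders_all AS
SELECT * FROM orders_current
UNION ALL
SELECT * FROM orders_history;

-- Example: lifetime value per customer across both data sets.
SELECT customer_id,
       count(*)    AS order_count,
       sum(amount) AS lifetime_value
FROM   orders_all
GROUP  BY customer_id;
```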
Operational analytics across industries
Companies in all industries are being driven to use their data more effectively and efficiently, and many of them turn to analytics to improve their use of data. AI and ML are rising in popularity because of the tremendous increase in speed they bring, which allows predictive and prescriptive analytics to be operationalized.
As Henry Bell shows in his excellent article The Critical Role of Predictive Analytics in Insurance, predictive analytics is extremely important in the insurance field. Bell describes the value of predictive analytics, deriving insights from data that help inform decision making for improving risk assessment, providing better rates, and detecting fraud. It is easy to see how the kind of value gained through analytics in insurance would apply in almost every industry and market.
But these don’t have to be real-time decisions. In the insurance space, such decisions are expected to take some time: a few seconds, a few hours, or even a few days. Yet businesses in all spaces are being driven to do everything faster, including making critical decisions, like whether to insure someone based on the data available about them, while the person is waiting on the insurance company’s website or mobile application. It then becomes important for a good user experience (and thus for customer satisfaction and retention) that these decisions, and the analytics that drive them, be operationalized and made to happen in real time.
Use cases such as recommendation engines in retail (made popular by Netflix, for example), image recognition (like facial recognition in public surveillance applications), and routing (whether in transportation or network operations) all benefit from applying prescriptive analytics in real-time operations.
Fraud detection is one area where fast operational analytics can be critical. While a fraud detection system for an insurance claim might have hours or days to work, a credit card fraud detection system might have a time window of only a hundred milliseconds in order to decide whether a transaction is potentially fraudulent. The speed at which those transactions need to occur requires a database that can process those types of queries very quickly, with very low latency and very high accuracy.
If we examine the mechanics of such a fraud detection system, we’ll see the components described earlier. Transactional systems control and record current transactions, applying in real time the fraud detection rules that were developed in analytical systems. Historical data containing past transactions is fed into those analytical systems, which look for patterns of fraud and create models of those patterns in the form of rules.
Once a new fraudulent behavior pattern is encapsulated as a rule, you add that rule back into the transactional system so that it can be applied quickly and proactively to new authorizations, to try to stop any fraud matching that rule from occurring. The combined data from both current and historical systems is necessary for the entire fraud detection and prevention process to work: instantly deciding whether a current transaction is safe or fraudulent.
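A minimal sketch of that feedback loop in PostgreSQL might look like the following. The schema, the thresholds, and the :card_id and :tx_amount bind parameters (supplied by the application) are all hypothetical; the point is that a rule mined offline becomes a cheap, indexed check at authorization time:

```sql
-- Hypothetical transaction log kept by the transactional system.
CREATE TABLE card_transactions (
    tx_id      bigserial   PRIMARY KEY,
    card_id    bigint      NOT NULL,
    amount     numeric     NOT NULL,
    created_at timestamptz NOT NULL DEFAULT now()
);
-- Index so the velocity check below stays within a tight latency budget.
CREATE INDEX ON card_transactions (card_id, created_at);

-- Rules mined by the analytical system, stored as simple thresholds.
CREATE TABLE fraud_rules (
    rule_id     serial PRIMARY KEY,
    max_amount  numeric,   -- flag any single transaction above this amount
    max_tx_5min int        -- flag cards exceeding this many transactions in 5 minutes
);

-- Authorization-time check: which rules does the incoming transaction break?
SELECT r.rule_id
FROM   fraud_rules r
WHERE  :tx_amount > r.max_amount
   OR  (SELECT count(*)
        FROM   card_transactions t
        WHERE  t.card_id    = :card_id
        AND    t.created_at > now() - interval '5 minutes') >= r.max_tx_5min;
```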
What the choice of database may mean for businesses
Businesses want to be able to react quickly to external or internal events so they can manage risk and protect customer trust, increase revenue, reduce operational costs, improve employee productivity, and ultimately improve customer experience. All of these goals point to using data to improve top-line or bottom-line results.
Having the right type of database is critical to building the kind of operational, prescriptive analytics needed to react accurately and quickly to changing circumstances, and to drive the actions those circumstances demand.
The majority of analytics performed today is still descriptive/diagnostic analytics, which is reactive because the analysis is focused on historical data. Operational analytics, however, is critical because these systems are responsible for low latency and timeliness, accuracy of results, durability of data, and data quality. A key consideration is choosing a proper database, like PostgreSQL, that offers these capabilities and can support wider business goals.
The role of PostgreSQL in analytics
I am relatively new to the world of PostgreSQL. I have worked with databases for many years: I have spent time at a data warehouse appliance company, which used a database derived from PostgreSQL, and I have experience with other database products from Dr. Michael Stonebraker (whose POSTGRES project at Berkeley was the ancestor of PostgreSQL). But it is only recently that I have started to explore the full potential of PostgreSQL for solving the data management challenges of today’s businesses.
PostgreSQL is a general-purpose database management system, so it can be used for both transactions and analytics. It is also continuously evolving and improving, thanks to both the extensibility that was baked into it during its formative years and its active developer community.
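As a small illustration of that dual nature, the same PostgreSQL instance that accepts transactional writes can answer analytical questions over them directly. The orders table below is hypothetical:

```sql
-- Hypothetical orders table serving the transactional workload.
CREATE TABLE orders (
    order_id    bigserial   PRIMARY KEY,
    customer_id bigint      NOT NULL,
    amount      numeric     NOT NULL,
    ordered_at  timestamptz NOT NULL DEFAULT now()
);

-- Transactional side: record a new order.
INSERT INTO orders (customer_id, amount) VALUES (42, 99.95);

-- Analytical side, same database: a 7-day moving average of daily revenue,
-- computed with a window function.
SELECT day,
       daily_revenue,
       avg(daily_revenue) OVER (ORDER BY day
                                ROWS BETWEEN 6 PRECEDING AND CURRENT ROW)
           AS seven_day_avg
FROM  (SELECT date_trunc('day', ordered_at) AS day,
              sum(amount)                   AS daily_revenue
       FROM   orders
       GROUP  BY 1) AS daily
ORDER BY day;
```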
In my recent research, I have found some very interesting use cases and articles about using PostgreSQL for analytical systems, such as Using PostgreSQL for Real-Time IoT Stream Processing Applications, as well as great presentations on the use of PostgreSQL for AI and ML (read Bruce Momjian’s presentation, Postgres and the Artificial Intelligence Landscape, or watch it on YouTube).
There are certainly many other transactional DBMS offerings and many other analytical ones as well. But there are few that can handle both types of workload with aplomb, and even fewer that are open source and that are supported by a growing, vibrant user community. If you are looking to add a new database to be the foundation for your company’s business or perhaps you need to cull the long list of single-purpose databases that you are now supporting down to a more manageable set of multipurpose ones, PostgreSQL is worth considering.
Conclusion and suggested next steps
Operational analytics is just one of the areas where a good database management system can provide tremendous business value to the user. Moreover, the right database can create new opportunities for businesses by enabling:
- Faster time to market and increased competitive advantage through new services and innovation
- New or increased revenue streams from efficiencies and new and enhanced business models
- Increased customer satisfaction and loyalty
- Protection against brand-damaging data compromise and assurance of regulatory compliance
As mentioned in the introduction of this article, in the retail and supply chain industries, events such as a hurricane or a pandemic can lead to unpredictable spikes in demand. Operational analytics can be used to automatically reroute deliveries to where products are needed most.
To reach this level of analytics value, you need the right database systems in place, ones that can react with the speed and accuracy required. This is where the right robust, reliable database isn’t just a business benefit – it can mean the difference between happy customers and empty shelves.
Dennis Duckworth is the Senior Director of Product Marketing at EDB.