As the modern cliché goes, data is the oil of today's world. But the comparison is not just about earning potential. Above all, it means understanding that just as unprocessed crude oil is of little use, so is unanalysed data.
Yet many SMEs have huge amounts of data sitting unanalysed in various systems and outdated data warehouses.
This is understandable, since huge amounts of data go hand in hand with doing business – they accumulate from communication with clients, marketing campaigns, sales figures, carrying-cost reports and countless other daily activities. On top of that, data has a habit of generating more data. Used skilfully, all of it can help optimise work and improve business results. Sadly, that rarely happens, because handling ever-increasing amounts of data is a huge task. Most often the biggest obstacle is that the data is fragmented across different warehouses and systems, and those systems cannot always convert it into a usable format.
In-house analytics and reporting
When new data is created within an organisation, usually the first thing done with it is an internal report – for example, one that tracks the business's various KPIs. To get there, the existing data warehouses, databases and other data stores first need to be connected. The most common approach is to hire a service provider or partner who can handle at least the initial setup, i.e. connecting the different data sources. To do this, they create a data lake: a common repository that aggregates data from all of the company's different sources, which makes it easier to cross-analyse. Data lakes offer practically unlimited storage, are cost-effective, support fine-grained security controls (for example, a system's access can be limited to specific pieces of data only) and can be connected to any kind of existing data warehouse or service. Creating a data lake alone is a huge step forward for many organisations, but it still does not analyse the data by itself.
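As an illustration, here is a minimal Python sketch of what landing data from two sources into a common repository can look like. Everything in it is hypothetical – the file and table names are made up, and a local folder stands in for what would in practice be cloud object storage:

    # Minimal sketch: land two hypothetical sources in one "lake".
    # A local folder stands in for cloud object storage (e.g. S3).
    import sqlite3
    from pathlib import Path

    import pandas as pd

    lake = Path("lake")  # stand-in for an object-storage bucket
    for source in ("marketing", "sales"):
        (lake / source).mkdir(parents=True, exist_ok=True)

    # Source 1: a CSV export from a marketing tool (hypothetical file).
    marketing = pd.read_csv("campaign_results.csv")

    # Source 2: an orders table in an operational database
    # (SQLite here only to keep the sketch self-contained).
    with sqlite3.connect("webstore.db") as conn:
        sales = pd.read_sql("SELECT * FROM orders", conn)

    # Store both datasets in a common columnar format, partitioned
    # by source, so they can be cross-analysed later.
    marketing.to_parquet(lake / "marketing" / "campaigns.parquet")
    sales.to_parquet(lake / "sales" / "orders.parquet")

The same pattern scales out: each new source gets its own landing area in the lake, and downstream reports read from one place instead of from every system separately.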
For that, a data specialist creates preset reports within these systems that the right people can then follow in real time. A simplified example: optimising a popular web store with an Excel spreadsheet or ad-hoc SQL queries to make it more user-friendly is a real challenge – the data has to be submitted in a specific format, an overview can only be generated after the fact, and only static files can be shared with colleagues. Connected data sources and preset reports, on the other hand, make it possible to track changes in customer behaviour in real time, allocate resources as optimally as possible, spot hidden trends, replenish warehouse stock according to sales figures and so on.
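To make the contrast concrete, the sketch below shows the kind of preset report a specialist might define over the lake from the previous example: a daily KPI summary that a dashboard could refresh automatically instead of someone rebuilding it by hand in Excel. All column names are assumptions:

    # Hypothetical preset report: daily revenue, order and customer
    # counts computed from the lake. A BI dashboard would rerun this
    # on a schedule and display the result in real time.
    import pandas as pd

    orders = pd.read_parquet("lake/sales/orders.parquet")
    orders["order_date"] = pd.to_datetime(orders["created_at"]).dt.date

    daily_kpis = (
        orders.groupby("order_date")
        .agg(
            revenue=("total", "sum"),
            orders=("order_id", "count"),
            customers=("customer_id", "nunique"),
        )
        .reset_index()
    )
    print(daily_kpis.tail())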
Predicting the future with data
Once this first step has been taken, it is time for the next one: combining data analysis with artificial intelligence. This helps to discover anomalies in real time, create and analyse possible future scenarios, predict trends, and extract the kind of knowledge from data that would otherwise take hours, days or even months to compile.
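As a small taste of what this looks like in practice, here is a hedged sketch of anomaly detection on a revenue series using scikit-learn's IsolationForest. The data is synthetic and the contamination parameter is illustrative; a production system would tune it on historical data:

    # Flag anomalous days in a revenue series with an isolation forest.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)
    daily_revenue = rng.normal(10_000, 800, size=90)  # 90 ordinary days
    daily_revenue[-1] = 4_000                         # a sudden drop

    model = IsolationForest(contamination=0.02, random_state=0)
    flags = model.fit_predict(daily_revenue.reshape(-1, 1))  # -1 = anomaly

    for day in np.where(flags == -1)[0]:
        print(f"Day {day}: revenue {daily_revenue[day]:.0f} looks anomalous")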
Implementing artificial intelligence has become so simple nowadays that even someone with no specialised training in the field can do it.
It is enough to know that most cloud service providers offer pre-made services, exposed as application programming interfaces (APIs), that can draw conclusions and make predictions from data: producing data-based forecasts, recognising patterns, identifying objects and faces in images and video, recognising speech and converting it to text and vice versa, following the context of a conversation and so on. It is of course possible to build one's own AI services or models instead, but that is usually harder and requires a larger investment. This does not mean that more sophisticated AI is out of reach for a business, however: the larger cloud providers offer services for quickly spinning up AI clusters and often maintain large catalogues of pre-trained models that can easily be adapted for use.
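For example, identifying objects in an image can be a single call to a pre-trained service. The sketch below uses AWS Rekognition's label-detection API via boto3, one of several equivalent offerings; it assumes AWS credentials are already configured, and the bucket and file names are made up:

    # Ask a pre-trained cloud model what it sees in an image.
    import boto3

    rekognition = boto3.client("rekognition")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "my-data-lake", "Name": "photos/shelf.jpg"}},
        MaxLabels=10,
        MinConfidence=80.0,
    )

    # The service returns labels with confidence scores; no model
    # training or ML expertise was needed on our side.
    for label in response["Labels"]:
        print(f'{label["Name"]}: {label["Confidence"]:.1f}%')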
Today, most Estonian companies are still struggling with the first step: making sense of their data and finding ways to turn it into business value. Often this is attempted with old solutions – creating more data warehouses and adding more local analytics applications – but in practice these attempts tend to fail, because that approach cannot deliver the results people are looking for.
Cloud services, data lakes, artificial intelligence and machine learning are all far more accessible and more useful than most people think.