A growing number of companies are pouring investment into Big Data analytics. According to a Gartner forecast, the business intelligence and analytics software market is expected to hit a staggering $22.8 billion by the end of 2020.
It is well known that Big Data analytics can be used to make better decisions faster and increase profitability. However, many IT leaders do not realise that real results depend on what happens after the software solution has been implemented. Continuous monitoring and tuning are crucial if analytic models are to consistently deliver the expected business value.
Without this next step, it can be difficult to achieve the desired results.
What can companies do to get the most out of their analytics investment? Here is a checklist of six best practices:
1) Start with the decision, not the data
Exploring different advanced analytics tools without a clear idea about how they will solve current business problems is a waste of time and resources. Start by focusing on the business problems that you want Big Data analytics to help you solve.
Once these problems have been identified and prioritised, the next step is to determine the type of data needed to inform a specific decision, and to ensure that this is up-to-date and accurate.
For example, there’s been a lot of recent interest in analysing unstructured data such as video and speech. However, if your business deals mainly in consumer transactions, which tend to yield structured data, your analytic focus should remain on structured data.
2) Ensure data analytic models can be operationalised
Although this seems obvious, far too many projects are left gathering dust or encounter delays because it is simply too challenging to deploy the models into the production environment, or to use the model findings within the decision-making process. The opportunity cost to a business from all the suboptimal decisions made in the interim can be immense.
To avoid this situation, the astute selection of data is critical. What looks wonderful in the lab may not be available or may be too expensive to obtain for use in day-to-day business operations. For example, models that rely on manually intensive data processing steps can cause problems at the implementation stage, or when they are used by less data-experienced teams in the business. Industry regulations may also affect where and how data can be used.
Today’s decision management platforms (available for on-premises installation or as cloud services) have built-in tools to expedite model deployment. Through the use of application development tools and rules engines, they also provide everything needed to create complete applications, including user forms and workflows powered by the analytic models.
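As a rough illustration of the pattern (not any particular vendor’s API), the Python sketch below shows how such a platform layers business rules on top of an analytic model’s score to produce a decision. The weights, thresholds and field names are invented for the example.

```python
# Illustrative sketch only: a decision service combines a model score
# with business rules. All weights and cut-offs here are hypothetical.

def risk_score(applicant: dict) -> float:
    """Stand-in for a deployed analytic model (invented weights)."""
    score = 600.0
    score += 0.5 * applicant.get("income", 0) / 1000   # income lifts score
    score -= 30.0 * applicant.get("missed_payments", 0)  # arrears lower it
    return score

def decide(applicant: dict) -> str:
    """Business rules applied on top of the model output."""
    if applicant.get("missed_payments", 0) > 3:  # hard policy rule first
        return "refer"
    if risk_score(applicant) >= 620:             # score-driven approval
        return "approve"
    return "decline"
```

Keeping the policy rules separate from the model is the point: the model can be retrained or swapped without rewriting the rules, and the rules can be changed by the business without touching the model.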
3) Keep up with analytic innovations
R, Python, Hive, Groovy, Scala, MATLAB, SQL, SAS. One of the side effects of the exploding world of analytic innovation is that taking advantage of the latest techniques often requires learning a new set of tools. Analytic teams will inevitably need to use multiple development methods to deliver the insights the business needs. It’s also clear that combining different types of analytic techniques can deliver superior results.
To get multiple types of analytic models to work together in an efficient development environment and a robust production environment, you need a flexible infrastructure that embraces diversity. Fundamental requirements include the ability to operationalise models authored in a wide range of tools, by supporting extensible libraries, web services and standards such as the Predictive Model Markup Language (PMML) and Decision Model and Notation (DMN). Centralised lifecycle management should extend across models, business rules and analytic assets from any source.
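To make the role of a standard such as PMML concrete, here is a minimal Python sketch. The fragment below is a heavily simplified, illustrative PMML document (real PMML files carry a namespace, a data dictionary and much more metadata), but it shows the idea: because the model is described in a tool-neutral XML format, any runtime can score it, regardless of which tool trained it.

```python
import xml.etree.ElementTree as ET

# Simplified, illustrative PMML fragment (coefficients invented).
PMML = """
<PMML version="4.4">
  <RegressionModel functionName="regression">
    <RegressionTable intercept="1.5">
      <NumericPredictor name="income" coefficient="0.002"/>
      <NumericPredictor name="age" coefficient="0.01"/>
    </RegressionTable>
  </RegressionModel>
</PMML>
"""

def score(pmml_text: str, row: dict) -> float:
    """Evaluate a linear RegressionTable: intercept + sum(coef * value)."""
    table = ET.fromstring(pmml_text).find(".//RegressionTable")
    value = float(table.get("intercept"))
    for pred in table.findall("NumericPredictor"):
        value += float(pred.get("coefficient")) * row[pred.get("name")]
    return value
```

A model trained in R, Python or SAS and exported to PMML can then be scored by the same production runtime, which is exactly the interoperability the standard exists to provide.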
It’s also very important to build a culture of documentation and control while leveraging all of these tools. Production use of analytics requires discipline and control whilst finding ways to embrace creativity. That’s the balance that successful organisations manage to find.
4) Enable analytic diversity
Building on the previous point, the ability to introduce and combine different types of analytic techniques can deliver superior results. However, getting multiple types of analytic models to work well together has two prerequisites.
First, your business needs an efficient development environment and a robust production environment. Analytic teams will inevitably need to use multiple development methods to deliver business insights – which often involves continuous reskilling and upskilling. Staying up to date with the latest development languages and techniques is essential if your business is to gain maximum advantage from innovation.
Second, it helps to set and enforce best practices for model management and governance. This will reduce development and implementation delays and enable analytics to be deployed more quickly to keep up your Big Data momentum. Model management tools can help you keep model performance high, check for model performance degradation, and answer questions from regulators about which model, and even which attributes, drove a particular decision.
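One widely used check for model performance degradation is the Population Stability Index (PSI), which compares the score distribution the model saw at development time with what it now sees in production. Below is a minimal pure-Python sketch; the equal-width bucketing and the rule-of-thumb thresholds in the comment are common conventions, not a formal standard.

```python
import math

def psi(expected: list, actual: list, bins: int = 10) -> float:
    """Population Stability Index between a development-time score
    distribution (expected) and a production distribution (actual).
    Common rule of thumb (a convention, not a standard):
    < 0.1 stable, 0.1-0.25 worth investigating, > 0.25 significant shift."""
    lo, hi = min(expected), max(expected)
    width = (hi - lo) / bins or 1.0  # guard against a degenerate range

    def frac(data, b):
        # Share of observations falling in bin b (last bin includes hi).
        count = sum(1 for x in data
                    if lo + b * width <= x < lo + (b + 1) * width
                    or (b == bins - 1 and x == hi))
        return max(count / len(data), 1e-6)  # avoid log(0)

    return sum((frac(actual, b) - frac(expected, b))
               * math.log(frac(actual, b) / frac(expected, b))
               for b in range(bins))
```

Run on a schedule against fresh production scores, a check like this flags drift long before it shows up in business results, and gives a concrete number to report to regulators and model governance teams.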
5) Leverage third-party cloud services
Creating Big Data analytics no longer requires a large upfront investment in expensive infrastructure and specialised skills. Companies can now use analytic services provided in the cloud to handle the underlying systems and services, paying only for the capacity and services they need.
An open, hub-based architecture is a quicker, less costly way to improve cross-functional visibility and coordination than traditional one-to-one systems integration.
6) Combine automation with human expertise
Big data tools and infrastructure are making it easier to apply machine learning techniques to trawl through vast datasets that include both structured and unstructured data. The right balance of analytic techniques with human analytic and domain expertise not only lifts business performance but also improves the ability of companies to learn at a fast pace from data-driven experiments.
Keep in mind that when analytics deliver disappointing results, it is often because there is not enough analytic experience at hand. Thus, ensure the people or businesses with whom you partner for your Big Data projects really understand the data that drives both the decisions and the building of the analytic models.
With the expanding space of open source and commercial data science tools, newer “data scientists” often use these tools without a true understanding of how they work, what the parameters mean, and the impact they can have on your business decisions.
Big data has the potential to add great value to business decision-making. However, it can be difficult to extract analytic insights from immense stores and incoming streams of data in an actionable form – and in time to make a difference to your business. To extract maximum value from your analytic investment, the first step is to choose the right data and models, then follow through with continual monitoring, tuning and upgrading. This will highlight areas for improvement and, ultimately, keep you on track to achieving your business goals.
Manish Gandhi, Senior Director for Analytics in EMEA, FICO