COMPAREXpress

10/11/2016

…The harder it is to analyse? The more expensive it is? The more complicated it gets? If any of those were the one true answer, then it’d make the title of this blog a whole lot catchier.

So, do you want to press the back button? The story ends here: you carry on browsing and believe whatever you want to believe. Or do you want to scroll down? You read on, and we’ll show you how far down this rabbit hole goes…

 

Conceptually, it’s been around for decades, even during the monochrome-filtered days of basic analytics. Let’s be honest: whilst you’d struggle to find a data analyst or scientist who’d look back on the ‘good old days’ with a pair of rose-tinted shades, basic analytics helped them get the job done. But by bringing technologies such as data mining, predictive analytics and data management to the table, big data analytics isn’t so much a tool as a whole army of analysts on their best day, every day.

Despite the smorgasbord of industry headlines ‘data’ enjoys, it’s almost impossible to make data seem cool. Even when you stick the word ‘big’ in front of it, data carries connotations of being complicated, slow and expensive that just won’t go away. We all know it’s important, and that it can deliver a fundamental competitive advantage in speed and efficiency, but it’s ‘the how’ that seems to be perpetually up for debate.

 

Horses, courses and other clichés

Interestingly, real-time vs. batch processing continues to divide opinion. Both have their benefits and drawbacks, yet the question of which is better persists when it really doesn’t need to. Casinos, for example, may favour real-time processing since they only have a short window in which to act on a customer interaction, whereas a retailer such as Argos may prefer a batch approach. It comes down entirely to what works best for the business.
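To make the distinction concrete, here’s a minimal Python sketch. It’s illustrative only: the event feed, spend threshold and function names are our own placeholders, not any particular vendor’s API. The real-time path reacts to each event as it arrives; the batch path buffers events and processes them together on a schedule.

```python
def handle_event_realtime(event):
    """Real-time: act on each event within a short window, e.g. a casino
    reacting while the customer is still at the table."""
    if event["spend"] > 500:
        print(f"Offer a perk to customer {event['customer_id']} right now")

def process_batch(events):
    """Batch: aggregate a whole period's events in one pass, e.g. a retailer
    recalculating demand figures overnight."""
    total = sum(e["spend"] for e in events)
    print(f"Processed {len(events)} events, total spend £{total}")

# Hypothetical event feed standing in for a real source (till data, message queue, etc.)
events = [{"customer_id": i, "spend": 150 * i} for i in range(1, 6)]

# Real-time path: handle each event as it arrives.
for event in events:
    handle_event_realtime(event)

# Batch path: collect everything and process it in one scheduled run.
process_batch(events)
```

The trade-off is exactly the one described above: the first path gives you an answer while the customer is still in front of you, the second gives you a cheaper, simpler job that runs when the pressure is off.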

Some say it all goes back to building the right infrastructure. Cloud vs on-premise is a debate that continues to permeate the industry, and it’s never been more relevant than when discussing how to process big data.

On-premise systems allow organisations to get the best out of their analytics clusters, whereas the cloud works well for processing data that originates outside the enterprise.

 

It’s an ‘all hands’ thing

Given that big data analytics has the potential to reduce costs, time and resources, as well as inform new product development and decision making, everyone needs to take their seat at the table and give their two cents’ worth.

And that means everyone. The insights from big data analysis, for example, are central to the customer experience, and have been for some time. Since almost every role eventually leads back to the customer enjoying the product or service, it’s incumbent upon all areas of the business to consider how data affects their job, as well as how it can help them improve.

SMBs need to be getting stuck in as well. It’s a common misconception in the market that you have to be big to implement and gain from big data, but this couldn’t be further from the truth. It offers many of the same benefits as it does to larger organisations; in fact, as long as they have the key ingredients for leveraging data, such as a single source of truth and dedicated task clusters, small businesses are able to fine-tune processes under a microscope and produce results far faster than their larger counterparts.

 

Whatever you do, do something

Whilst the market is saturated with opinion, your chances of finding out specifically what needs to be done are very slim.

Let’s face it: the ever-increasing intake of information shows no signs of slowing down, and as a result the need to process it is growing at a parallel rate. Companies are usually faced with a choice between three options:

  1. Do nothing

  2. Add more hardware to their Enterprise Data Warehouse and operational systems

  3. Reconsider how they manage their data

There’s simply no way a business could get away with the first option in today’s highly informed world, and whilst the second option would get the job done, it would be far too costly. The third option, however, is the most viable, and this is where Microsoft Azure steps in.

Given how complex network environments are becoming, implementing Azure across all of your applications will allow you to efficiently dig through your data and discover hidden gems of insight that will give you a serious edge over your competitors.
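As a simplified starting point, the sketch below uses the azure-storage-blob Python SDK to push a day’s data extract into Azure Blob Storage, where Azure’s analytics services can then pick it up. The connection string, container and file names are placeholders of our own, and this is a minimal illustration under those assumptions rather than a prescribed COMPAREX or Microsoft deployment pattern.

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string; in practice this comes from the Azure portal
# or a secrets store, never from source code.
CONNECTION_STRING = (
    "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;"
    "EndpointSuffix=core.windows.net"
)

def upload_daily_extract(local_path: str, container_name: str = "sales-data") -> None:
    """Upload one day's data extract to Azure Blob Storage for downstream analysis."""
    service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
    container = service.get_container_client(container_name)
    with open(local_path, "rb") as data:
        # overwrite=True so a re-run of the same day's extract replaces the old copy.
        container.upload_blob(name=f"extracts/{local_path}", data=data, overwrite=True)

if __name__ == "__main__":
    upload_daily_extract("daily_sales.csv")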

When you think about it, perhaps big data is only slow, complicated and expensive because it’s been allowed to get that way. If all of its potential and tools were utilised correctly, then who’s to say it couldn’t be the polar opposite in practice?

 

To find out more about how we can help, contact our Cloud Solutions Specialist!

Kaz Traverso | Cloud Solutions Specialist | kaz.traverso@comparex.co.uk | +44(0)7917 641 431

 

