Why do 85% of Big Data projects end up in failure?


In 2016, Gartner estimated that about 60 percent of all big data projects would fail. A year later, Gartner analyst Nick Heudecker said that figure was "too conservative" and that the failure rate was closer to 85 percent. He still thinks so. In fact, he estimates the number of customers successfully applying Hadoop is likely fewer than 20, maybe even fewer than 10. That is a shocking result considering how long the technology has been around and how much the industry has invested in it.

Anyone familiar with big data knows the problem is real and serious, and not entirely technical. In fact, the root causes of failure are mostly non-technical; technology is secondary. Here are four major reasons why big data projects fail, and four ways they can succeed.

     Problem #1: Poor Integration

Heudecker says one major technical problem does underlie big data failures: integrating siloed data from multiple sources to achieve the data processing power enterprises need. Building connections to siloed legacy systems isn't easy; integration costs five to ten times as much as the software itself, he said. The biggest problem is the simplest-sounding one: how do you link multiple data sources together? Many people choose the data lake route, thinking it will be easy, but that's not the case.

Siloed data is part of the problem. Customers tell Heudecker that when they pull data from source systems into a common environment such as a data lake, they can't figure out what the values mean.
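A minimal sketch of the problem the paragraph describes (all system names, fields, and values here are hypothetical, not from the article): two systems export "the same" field, but until someone maps both onto an agreed schema, the values in the lake can't be compared or joined.

```python
# Billing system: amounts in cents, customer IDs like "C-00042"
billing = [{"cust": "C-00042", "amount": 1999}]

# CRM export: amounts in dollars, customer IDs as bare integers
crm = [{"cust": 42, "amount": 19.99}]

def normalize_billing(row):
    """Map a billing record onto a shared schema (dollars, integer IDs)."""
    return {"cust_id": int(row["cust"].split("-")[1]),
            "amount_usd": row["amount"] / 100}

def normalize_crm(row):
    """Map a CRM record onto the same shared schema."""
    return {"cust_id": int(row["cust"]),
            "amount_usd": float(row["amount"])}

# Only after explicit normalization do the two sources agree.
unified = [normalize_billing(r) for r in billing] + \
          [normalize_crm(r) for r in crm]
```

Dumping both exports into a lake without this mapping step is exactly the "can't figure out what the values mean" situation Heudecker's customers report.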

     Problem #2: Unclear Goals

Most people assume companies have clear goals when they undertake big data projects, but that's not really the case. Many companies start the project first and only then think about the goals. "You have to take a hard look at this," says Ray Christopher, product marketing manager at Talend, a data integration software company. "People think they can connect structured and unstructured data to get the information they need. But that has to be targeted in advance: what kind of information do you want?"

     Problem #3: Skills Gap

Too often, companies assume the internal skills they built for data warehousing will carry over to big data, and that's not the case. For starters, data warehouses and big data platforms process data in opposite ways: a data warehouse applies schema-on-write, meaning data is processed and organized before it enters the warehouse.

Big data platforms take the opposite approach, schema-on-read: raw data is accumulated first, and a schema is applied as the data is read. If data processing moves from one model to the other, the skills and tools must follow.
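The contrast above can be sketched in a few lines (the record layout and function names are illustrative assumptions, not from the article): schema-on-write validates and shapes a record before storing it, while schema-on-read stores the raw record and applies the schema only at query time.

```python
import json

# Raw records landed as-is in a lake; the second one is "dirty".
raw_records = ['{"user": "a", "clicks": "3"}', '{"user": "b"}']

def write_path(line):
    """Schema-on-write (warehouse style): validate BEFORE storing."""
    rec = json.loads(line)
    if "clicks" not in rec:
        raise ValueError("rejected at load time: missing 'clicks'")
    return {"user": rec["user"], "clicks": int(rec["clicks"])}

def read_path(line):
    """Schema-on-read (big data style): store raw, shape at read time."""
    rec = json.loads(line)
    return {"user": rec.get("user"), "clicks": int(rec.get("clicks", 0))}

# Schema-on-read accepts every raw record, deferring the cleanup decision.
parsed = [read_path(line) for line in raw_records]
```

The warehouse path would have rejected the second record at load time; the big data path keeps it and decides how to interpret the gap when the data is read, which is why the two models demand different skills and tooling.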

     Problem #4: Technology Generation Gap

Big data projects often take data from legacy data silos and try to merge it with newer data sources such as sensors, web traffic, or social media. That isn't entirely the fault of enterprises, which collected this data before big data analytics came along, but it's a problem all the same.

According to Greenbaum, the biggest skill enterprises are missing is how to merge these two kinds of data sources so they can work together to solve complex problems. Data silos are a barrier to big data projects because they follow no common standards. As a result, when companies start planning, they discover these systems were never implemented in any consistent way, so the data cannot easily be reused.

     Solution 1: Plan Ahead

It's a cliché, but it applies to big data projects. Successful companies are invariably the ones that get results by choosing something small, achievable, and new to plan and implement. "They need to think about the data first and model the enterprise in a machine-readable way so that the data serves that enterprise," says Morrison.

     Solution 2: Work Together

Stakeholders are often left out of big data projects. Heudecker says that if all stakeholders work together, they can overcome many obstacles. Having technical staff collaborate closely with the business units to deliver actionable results also helps.

Christopher believes big data projects should be a team sport, with everyone helping to collect, curate, and process the data to improve its integrity.

     Solution 3: Narrow the Focus

People seem to have the mindset that big data projects require very big moves. But as with learning anything for the first time, the best way to succeed is to start small and scale up over time.

"They should carefully define what they're doing," Heudecker says, "and should pick a problem domain and look at solving it, such as fraud detection, segmenting customers, or figuring out what new products are being introduced in the millennial market."

     Solution 4: Leave Tradition Behind

It's important to avoid reaching for existing infrastructure just because the enterprise already holds a license for it. New, complex problems often require new solutions; falling back on an enterprise's old tools is not the right approach and can doom a big data project.

Morrison believes enterprises should stop clinging to their old ways. He also said enterprises can no longer rely on vendors to solve complex system problems for them. "For decades, many people seem to have assumed that any big data problem was a systems problem. But when faced with complex system changes, companies must build their own solutions."