Big Data Use Cases Explained

Big data isn’t just a catchphrase. Using powerful computers to crunch enormous volumes of data in pursuit of insights that translate into greater profits has matured to the point where it is hard to imagine an industry that wouldn’t benefit from the process. And if “greater profits” is too general a term, let’s get more specific. Big data applications can reduce processing flaws, increase efficiency, improve production quality, and save time and money. To get a sense of the possibilities, let’s take a look at a few specific big data use cases.

Before We Start

The ways in which big data can be useful to an industry are essentially unlimited, but unless you approach the process with a specific business challenge in mind, there’s a good chance you will end up wasting time and money. Data acquisition and analysis are mighty tools, yet without a tightly focused query, useful insight will be hard to find.

Use cases, like those we are about to walk through, provide real-world scenarios that illustrate the value of spending time up front to come up with a targeted question. You need to learn to think tightly and specifically when formulating a question for big data to chew on.

For example, asking where the next big market for your product will be is unlikely to yield as useful a response as asking who in the US is most likely to buy more of that product. Ultimately, big data excels at finding patterns, so coming up with questions that play to this strength is critical.

Product Design Customization

Without naming names, one example of using big data comes from a $2 billion company that engages in product manufacturing. With big data analysis in place, this company decided to focus on the behavior of repeat customers. At the heart of this effort lies the long-held 80/20 rule of business, which holds that approximately 80% of a company’s sales come from 20% of its customers. By focusing on discovering the habits of this critical minority, it stands to reason that more profits could be generated overall.
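To make the 80/20 idea concrete, here is a minimal sketch, in Python, of how you might rank customers by revenue and flag the top 20 percent. The toy order records and column names are illustrative assumptions, not the company’s actual data or tooling.

```python
# Minimal sketch: rank customers by revenue and flag the top 20% (Pareto analysis).
# The order records and column names (customer_id, order_total) are illustrative.
import pandas as pd

orders = pd.DataFrame({
    "customer_id": ["A", "A", "B", "C", "C", "C", "D", "E"],
    "order_total": [1200, 800, 150, 2400, 1900, 700, 90, 60],
})

# Total revenue per customer, highest first.
revenue = (orders.groupby("customer_id")["order_total"]
                 .sum()
                 .sort_values(ascending=False))

# Flag the top 20% of customers and see what share of revenue they drive.
top_n = max(1, int(len(revenue) * 0.2))
top_customers = revenue.head(top_n)
share = top_customers.sum() / revenue.sum()

print(f"Top {top_n} customer(s) generate {share:.0%} of revenue")
```

In a real pipeline the same grouping would run over millions of order records, but the logic is the same: find the critical minority, then study its habits.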

One of the gold nuggets uncovered was that the company lost productive manufacturing time while waiting for contracts to be signed. By ensuring that the necessary paperwork was always complete ahead of time, the result should be more uptime, productivity, and profits. Another result of the big data process in this example was a shift to lean manufacturing: stop producing what the customer doesn’t want and concentrate on what they do want.

Improved Manufacturing Process

The following is an example of successful big data use from the pharmaceutical industry. This company manufactures vaccines and various blood components. In order to ensure purity in the end result, it tracked 200 different variables. On the surface, that sounds like a pretty impressive effort to maintain high quality in the final product. The problem, as big data analysis pointed out, was that this intensive quality assurance process still allowed for a yield variation from 50 to 100 percent. This level of inconsistency is enough to draw attention from federal regulators, which is bad news.

The analysis was able to separate and identify nine parameters that directly impacted the quality of the final vaccine yield. The rest of the 200 variables were essentially eliminated. The end result was that the company was able to save time in the testing process, increase production by 50 percent, end up with a higher-quality yield, and save $5 to $10 million per year.
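The article doesn’t say which analytical method the company used, but a common way to narrow a large set of process variables down to the handful that actually drive yield is to fit a model on historical batch records and rank the variables by how much they explain the outcome. The sketch below is purely an illustration of that approach, using synthetic data and scikit-learn’s random forest feature importances; the variable names and the nine-variable cutoff are assumptions borrowed from the story, not the company’s actual workflow.

```python
# Minimal sketch of one common approach: fit a tree-based model on historical
# batch records and rank process variables by how much they explain yield.
# The data here is synthetic; column names and the 9-variable cutoff are
# illustrative, not the company's actual method.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_batches, n_vars = 500, 200
X = pd.DataFrame(rng.normal(size=(n_batches, n_vars)),
                 columns=[f"var_{i}" for i in range(n_vars)])
# Pretend a handful of variables actually drive yield.
yield_pct = 75 + 5 * X["var_3"] - 4 * X["var_17"] + rng.normal(scale=2, size=n_batches)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, yield_pct)

# Rank variables by importance and keep the top candidates for closer study.
importance = pd.Series(model.feature_importances_, index=X.columns)
print(importance.sort_values(ascending=False).head(9))
```

The point isn’t the specific model; it’s that a ranking like this lets engineers concentrate monitoring effort on the parameters that matter and stop chasing the rest.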

Fortifying the Supply Chain

Here’s a nifty way one manufacturer used big data to make sure its raw materials kept arriving no matter what, even in the face of tornadoes, hurricanes, and earthquakes. Through various predictive applications, the company was able to calculate the probabilities of delays at various points in the supply chain and identify backup suppliers ahead of time. This is how you use big data to guard against downtime from unexpected natural disasters.
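As a rough illustration of the idea, the sketch below estimates a per-supplier delay probability from historical shipment logs and flags the riskier ones for backup sourcing. The supplier names, records, and 15 percent threshold are all hypothetical; the manufacturer’s actual predictive applications would be far more sophisticated.

```python
# Minimal sketch: estimate per-supplier delay probability from past shipments
# and flag suppliers risky enough to warrant lining up a backup.
# The shipment records and the 15% threshold are illustrative assumptions.
from collections import defaultdict

# (supplier, arrived_late) pairs pulled from historical shipment logs.
shipments = [
    ("alpha_metals", False), ("alpha_metals", True), ("alpha_metals", False),
    ("gulf_resins", True), ("gulf_resins", True), ("gulf_resins", False),
    ("north_polymers", False), ("north_polymers", False),
]

totals, late = defaultdict(int), defaultdict(int)
for supplier, was_late in shipments:
    totals[supplier] += 1
    late[supplier] += was_late

THRESHOLD = 0.15  # flag suppliers with more than a 15% historical delay rate
for supplier in totals:
    p_delay = late[supplier] / totals[supplier]
    action = "arrange backup" if p_delay > THRESHOLD else "ok"
    print(f"{supplier}: P(delay) = {p_delay:.0%} -> {action}")
```

Layer in weather and geographic risk data and the same basic logic becomes a way to decide, before the storm hits, which links in the chain need a second source.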

Better Testing = Higher Quality

A normal Intel computer chip used to go through 19,000 tests before it was cleared for sale. As you might imagine, this was a huge but necessary chunk of time and money dedicated to maintaining high quality standards. But you can bet that any company CEO would give his or her firstborn child (an exaggeration!) to be able to cut down on testing time without quality taking a nosedive. Big data to the rescue.

Through an exhaustive analysis that began at the wafer level, Intel was able to throw out a large number of its standard tests and focus on those that yielded the most value. The savings on a single line of processors were $3 million in the first year and are expected to grow to $30 million once the approach is fully implemented.
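Intel hasn’t published the details of its method here, but one way to think about trimming a test suite is to rank tests by how many previously uncaught failures each one adds, then drop the tests that contribute little. The greedy sketch below uses made-up test names and failure records purely to illustrate that idea.

```python
# Minimal sketch: greedily keep the tests that catch the most previously
# uncaught failing units, then drop tests that add little coverage.
# Test names and failure records are synthetic illustrations, not Intel data.

# test -> set of unit IDs that failed that test in historical data
failures = {
    "leakage_check":   {1, 2, 3, 4, 5},
    "timing_margin":   {3, 4, 5, 6},
    "burn_in_stress":  {5, 6},
    "legacy_probe_7":  {2},
}

remaining = set().union(*failures.values())  # failing units not yet covered
selected = []
while remaining:
    # Pick the test that covers the most still-uncaught failing units.
    best = max(failures, key=lambda t: len(failures[t] & remaining))
    gained = failures[best] & remaining
    if not gained:
        break
    selected.append((best, len(gained)))
    remaining -= gained

print("Tests kept (in order of added coverage):", selected)
```

Run on real wafer-level data, a ranking like this shows which tests actually catch defects and which ones can be retired without letting quality slip.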

The Bottom Line

The preceding examples are just a few of the dozens of use cases that could be cited. Once you narrowly define the problem and turn powerful computers loose on a big pile of data, you might be surprised at the creative solutions uncovered, sometimes to problems you didn’t even know you had.