4 Use Cases of Big Data and High Performance Computing

👋 Hi, I am Mark. I am a strategic futurist and innovation keynote speaker. I advise governments and enterprises on emerging technologies such as AI or the metaverse. My subscribers receive a free weekly newsletter on cutting-edge technology.

With the amount of data that organizations have to deal with expected to grow into the exabytes in the coming years, we will need better technology. Enter High-Performance Computing, or HPC for short, and a completely new world opens up for you.
HPC is dramatically changing the playing field. In the past years, it has grown from an innovative technology used only by the most advanced scientific research centres to a tool that enables organisations across industries to gain advantages from Big Data. Presidents from the USA to Russia have stressed the importance of HPC. Today, more and more organisations are recognising the qualities and possibilities of High-Performance Computing. But what is HPC and what can you do with it? Let’s take a deep dive into High-Performance Computing.

What is HPC and Why Do We Need It?

High-Performance Computing refers to technology that is capable of storing, processing and analysing massive amounts of data in mere milliseconds. An HPC infrastructure is essentially a large number of clustered storage and compute servers, interconnected by extremely fast and efficient networking.

In itself, it is not that interesting, but when you use HPC to analyse massive amounts of data, it becomes a game-changing technology. Using High-Performance Computing to analyse your data in real-time could enable you to solve a completely new class of computational and data-intensive problems. When used correctly, organizations can significantly improve their productivity, reduce their costs and create a long-term competitive advantage.
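At its core, an HPC cluster splits a large analysis job across many interconnected nodes and then combines the partial results. Below is a minimal sketch of that divide-and-conquer pattern, using Python’s multiprocessing module as a stand-in for a real cluster scheduler; the function names and data are purely illustrative:

```python
from multiprocessing import Pool

def analyse_partition(rows):
    # Each "node" computes a partial aggregate over its slice of the data.
    return sum(rows), len(rows)

def cluster_mean(data, workers=4):
    # Split the data into roughly equal partitions, one per worker.
    chunk = (len(data) + workers - 1) // workers
    parts = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    # Fan the partitions out to the worker pool and gather partial results.
    with Pool(len(parts)) as pool:
        partials = pool.map(analyse_partition, parts)
    # Combine the partial aggregates into the final answer.
    total = sum(s for s, _ in partials)
    count = sum(n for _, n in partials)
    return total / count

if __name__ == "__main__":
    print(cluster_mean(list(range(1_000_000))))
```

On a real cluster, each partition would live on a different node and a framework such as MPI or Spark would handle the distribution, but the shape of the computation is the same.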

More and more we will see a merger between HPC and Big Data Analytics, as the two are so beneficial to each other. To stress this, IDC has created the term High-Performance Data Analytics. HPDA means, as stated by Steve Conway, Research Vice-President HPC at IDC, “the ability of increasingly powerful HPC systems to run data-intensive modelling and simulation problems at larger scale, at higher resolution, and with more elements”.

HPDA enables organisations to mix a wide variety of large data sets, integrate different information sources and perform analysis across enterprise, partners, suppliers and customers. This real-time analysis of what’s going on around your business, at every possible angle, will drive innovation and economic growth.

A great example of what is possible with HPC, though still at least a decade away, is personalised healthcare. Imagine the amount of computing power required to identify highly effective, personalised treatments in real time by comparing a person’s unique genetic code, symptomology, health history and social environment against tens of millions of archived patient records. That is just one example of the future that awaits us thanks to High-Performance Computing power in combination with Big Data.

With the complex world we live in and the data generated at every possible angle, we will need High-Performance Computing, and in particular, HPDA, to understand the world around us, create personalised products and services and remain competitive.

4 Ground-breaking Use Cases of Big Data and HPC

As mentioned, High-Performance Computing was traditionally applied within scientific research and the financial world, and relatively recently also to forecasting the weather. The financial services industry was the first commercial industry to adopt HPC. Nowadays you will find the most advanced HPC systems trading stocks in nanoseconds and generating massive amounts of money for their owners. Banks have used HPC for pricing exotic financial instruments, optimising their portfolios of mortgage-backed securities and managing firm-wide global credit and debt risks. But there are more use cases for HPC in combination with Big Data:

Fraud Detection

Fraud detection is an extremely important area for the financial industry. Millions of dollars could be saved if suspicious transactions are stopped before they occur. To do this, millions of transactions across the globe have to be analysed in real time. The challenge is that algorithms need to find previously unknown patterns within that data that could indicate a fraudulent transaction.

It requires analysing not only millions of transactions but also many other data sources that provide the right context for a given transaction. Whether a certain credit card transaction is valid, for example, depends not only on the transaction itself but also on factors such as the location or the time of the transaction.

One company that deals with these volumes is PayPal. On a daily basis, they deal with 10+ million logins, 13 million transactions and 300 variables that are calculated per event to find a potentially fraudulent transaction. Thanks to High-Performance Data Analytics, PayPal saved over $700 million in its first year of production by detecting fraudulent transactions it would previously not have caught.
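To make the idea of context-aware fraud scoring concrete, here is a deliberately simplified sketch. The features (amount, country, hour) and the thresholds are invented for illustration; a production system would score hundreds of variables with trained models rather than hand-written rules:

```python
def fraud_score(txn, history):
    """Score a transaction against the cardholder's history;
    higher scores mean more suspicious."""
    score = 0.0
    # An amount far above the customer's usual spend is suspicious.
    average = sum(t["amount"] for t in history) / len(history)
    if txn["amount"] > 5 * average:
        score += 0.5
    # A country never seen in the customer's history adds risk.
    if txn["country"] not in {t["country"] for t in history}:
        score += 0.3
    # Transactions in the middle of the night add a little more risk.
    if txn["hour"] < 5:
        score += 0.2
    return score

history = [{"amount": 40, "country": "NL", "hour": 14},
           {"amount": 55, "country": "NL", "hour": 10}]
suspicious = {"amount": 900, "country": "BR", "hour": 3}
print(fraud_score(suspicious, history))  # a high score flags the transaction
```

The point of HPC here is scale: running even simple scoring like this over millions of concurrent transactions, with fresh context for each cardholder, only works on a high-performance infrastructure.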

Personalized Medicines and Drug Discovery

I already mentioned the possibilities of personalised medicine and how it requires HPC. Although personalised medicine might still be quite far away, the pharmaceutical industry is already using High-Performance Computing for drug design and discovery.

Drug design and discovery is a tedious process. It can easily require several years or longer before a drug hits the market. This is due to rigorous testing in labs on animals and later on humans before it is made available to the masses.

High-Performance Computing in combination with Big Data enables the pharmaceutical industry to find, for example, the right proteins for a certain drug among millions of compounds. This can be done through simulation analyses that test a plethora of varieties on thousands of different virtual patients. As a result, drug discovery can be shortened by multiple years, eventually saving many lives.
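A toy sketch of the virtual-screening idea described above: score every compound in a library against a target and keep the promising hits. The `binding_score` function here is a made-up stand-in for a real docking simulation, which is exactly the expensive step an HPC cluster would run in parallel across millions of compounds:

```python
def binding_score(compound, target):
    # Stand-in for an expensive docking simulation: reward overlap
    # between the compound's chemical features and the target's pocket.
    return len(compound["features"] & target["pocket"]) / len(target["pocket"])

def screen(library, target, threshold=0.5):
    # Score every candidate and keep only those that clear the
    # threshold, best hits first.
    hits = [(c["name"], binding_score(c, target)) for c in library]
    return sorted((h for h in hits if h[1] >= threshold),
                  key=lambda h: h[1], reverse=True)

target = {"pocket": {"donor", "acceptor", "aromatic", "hydrophobic"}}
library = [
    {"name": "cmpd-001", "features": {"donor", "acceptor", "aromatic"}},
    {"name": "cmpd-002", "features": {"hydrophobic"}},
    {"name": "cmpd-003", "features": {"donor", "acceptor", "aromatic", "hydrophobic"}},
]
print(screen(library, target))
```

Only the shortlist of top-scoring candidates then moves on to expensive lab and clinical testing, which is where the years of savings come from.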

Smart Energy Grids

Although smart energy grids might still be a long way off, multiple energy companies are already experimenting with the possibilities of a smart energy grid. The potential of a truly smart energy grid is tremendous: reduced energy consumption, better and safer electrical grids, and a safer and cleaner environment. Thanks to HPC, smart energy grids are becoming a reality.

A smart energy grid will have to deal with massive amounts of data from a wide variety of sources, all located in a highly dispersed network. Imagine if every household had a truly smart energy meter that measured, at every single moment in time, how much energy it uses, from which devices, for how long and at what price. Multiplied by millions of households, we are easily talking exabytes of data that need to be analysed in real time, for instance to determine whether someone is allowed to charge their electric car at that moment. That can only be done using an HPC infrastructure and a High-Performance Data Analytics environment.
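The charging decision described above can be sketched as a simple headroom check against live meter readings. The 10% safety margin and the kilowatt figures are invented for this sketch; a real grid operator would run far richer models over millions of live meter streams:

```python
def allow_charging(grid_capacity_kw, meter_readings_kw, request_kw):
    # Total load currently reported by all smart meters on the grid.
    current_load = sum(meter_readings_kw)
    # Keep a 10% safety margin below the grid's rated capacity
    # (the margin is an invented figure for this sketch).
    headroom = grid_capacity_kw * 0.9 - current_load
    return request_kw <= headroom

readings = [3.2, 1.1, 7.5, 2.0]  # live smart-meter readings in kW
print(allow_charging(100, readings, 11))  # plenty of headroom
print(allow_charging(20, readings, 11))   # request would overload the grid
```

The hard part is not this check itself but keeping `meter_readings_kw` fresh for millions of households at once, which is what pushes the problem into HPDA territory.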

Manufacturing Simulation Analysis

From the beginning, HPC has been involved in the modelling and simulation of complex physical and quasi-physical systems. Modelling and simulation analysis enables an organization to gain a better understanding of a certain product without the need to actually test it in real life. Thanks to this approach, Tesla, for example, was able to have the early edition of the Tesla Roadster pass dozens of tests without the need for dozens of cars to be crashed (as is traditionally done in the automotive industry).

What Tesla did in developing the Tesla Roadster has become common practice for the company. In other industries too, simulation analysis can save companies a lot of money when developing new products. Thousands of iterations can quickly be tested, taking into account hundreds of different variables. Based on the outcome of each simulation, the eventual product is improved. High-Performance Computing is required to perform these millions of simulations, at least if you want to get it done relatively quickly.
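The iterate-and-improve loop above can be sketched as a parameter sweep: evaluate every combination of design variables in a simulated model and keep the best-scoring design. The `simulate_crash` response surface below is entirely made up; in practice each evaluation would be a full finite-element simulation running on the cluster:

```python
import itertools

def simulate_crash(thickness_mm, alloy_strength, angle_deg):
    # Stand-in for a full crash simulation: a made-up response surface
    # that returns a safety score (higher is better).
    return thickness_mm * alloy_strength - 0.1 * abs(angle_deg - 30) ** 2

def sweep():
    # Evaluate every combination of design variables and keep the
    # best-scoring design, exactly like a brute-force simulation sweep.
    thicknesses = [1.0, 1.5, 2.0]  # panel thickness in mm
    strengths = [200, 250, 300]    # alloy strength (arbitrary units)
    angles = [20, 30, 40]          # impact-beam angle in degrees
    return max(itertools.product(thicknesses, strengths, angles),
               key=lambda design: simulate_crash(*design))

print(sweep())  # the design variant with the highest safety score
```

With hundreds of variables instead of three, the number of combinations explodes, which is why these sweeps are an HPC workload rather than a desktop one.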

Other Industries

Of course, these four industries are just the beginning of what is possible. Basically, any organization that wants to make a serious business from Big Data could use HPC and HPDA.

Other industries that are already using High-Performance Computing include, among others, the space industry, engineering (oil and gas), defence, publishing, governments (think of the NSA) and, of course, academia. In the years to come, this list will probably become a lot longer.

The Future of HPC and Big Data

High-Performance Computing in combination with Big Data Analytics is rapidly taking over industries around the world. The insights that organizations derive from High-Performance Data Analytics are very real and very valuable to them. As organizations are faced with ever more data, we will probably see new forms of HPC configurations deployed, unlocking even more insights from massive amounts of data. These insights, it seems, are only limited by the creativity of the data scientists writing the algorithms that analyse all that data on a High-Performance Computing infrastructure. The future of HPC and Big Data is therefore a very bright one, bringing fantastic, smart innovations to organizations that dare to take the step into High-Performance Data Analytics.

Image: Sashkin/Shutterstock

Dr Mark van Rijmenam

Dr Mark van Rijmenam is The Digital Speaker. He is a leading strategic futurist who thinks about how technology changes organisations, society and the metaverse. Dr Van Rijmenam is an international innovation keynote speaker, 5x author and entrepreneur. He is the founder of Datafloq and the author of the book on the metaverse: Step into the Metaverse: How the Immersive Internet Will Unlock a Trillion-Dollar Social Economy, detailing what the metaverse is and how organizations and consumers can benefit from the immersive internet. His latest book is Future Visions, which was written in five days in collaboration with AI. Recently, he founded the Futurwise Institute, which focuses on elevating the world’s digital awareness.
