Every day, 2.5 billion gigabytes (2.5 quintillion bytes) of data are created – with 80-90% of that data classified as “unstructured.” This article looks at how graph technology can help the data analysis industry solve this growing “data deluge” problem and extract efficient, reliable insights.
Tech titans such as Google, Facebook, and LinkedIn have long capitalized on the power of graph data models to understand patterns and connections in their data. These insights have been used to improve web searches and better understand user behavior.
Today, graphs and graph computing have become ubiquitous in many more industry verticals, and they are being applied to find innovative solutions to new problems.
One example is the financial services industry, which is a huge area of growth for graph technologies. Analyst firm Gartner predicts that banks and investment firms will spend $623 billion on technology products and services in 2022. Fraud is a prime target for this spending: digital fraud attacks against financial services companies rose by 149% in the first four months of 2021 compared to the previous four months, and have become one of the most lucrative scams for fraudsters. To combat this issue, interaction graphs are used to drill down into complex interrelationships among customers, accounts, and transactions, improving fraud detection. Similarly, interaction graphs can be built and analyzed to prevent money laundering by looking for anomalous patterns of transactions.
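To make the idea concrete, here is a minimal sketch of an interaction graph built from transaction records, with a search for one classic laundering pattern: funds that cycle back to their originating account. The account names, amounts, and the depth-limited cycle search are all illustrative assumptions, not any vendor's actual detection logic.

```python
from collections import defaultdict

# Hypothetical transaction records: (sender, receiver, amount).
# All accounts and amounts are made up for illustration.
transactions = [
    ("acct_A", "acct_B", 9500),
    ("acct_B", "acct_C", 9400),
    ("acct_C", "acct_A", 9300),  # funds return to acct_A
    ("acct_D", "acct_E", 120),
]

# Build a directed interaction graph: node = account, edge = money flow.
graph = defaultdict(set)
for sender, receiver, _amount in transactions:
    graph[sender].add(receiver)

def find_cycle_through(start, graph, max_depth=5):
    """Depth-first search for a path that returns to `start` --
    a simple proxy for the 'round-tripping' laundering pattern."""
    stack = [(start, [start])]
    while stack:
        node, path = stack.pop()
        for nbr in graph.get(node, ()):
            if nbr == start and len(path) > 1:
                return path + [start]
            if nbr not in path and len(path) < max_depth:
                stack.append((nbr, path + [nbr]))
    return None

print(find_cycle_through("acct_A", graph))
# ['acct_A', 'acct_B', 'acct_C', 'acct_A']
print(find_cycle_through("acct_D", graph))
# None -- acct_D's money does not come back
```

Production systems score many such graph patterns (cycles, fan-in/fan-out, dense subgraphs) at scale rather than checking one account at a time, but the underlying representation is the same.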
Another example is the pharmaceutical industry’s drug discovery space, which has received additional attention in light of the global COVID-19 pandemic. Graph technology can analyze a variety of medical knowledge data about medicines, treatments, results, and patients, and perform “hypothesis generation” to determine promising treatments for particular diseases. Equally important, the technology can be used to rule out proposed treatments. This allows scientists to reduce the number of costly and time-consuming wet-lab experiments they need to perform to discover treatments for diseases.
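One common form of hypothesis generation is link prediction over a biomedical knowledge graph: a drug and a disease that share many intermediate biological entities (proteins, pathways) are a promising pair to test. The sketch below uses a simple common-neighbor score; the entity names are invented, and real pipelines use far richer graphs and models.

```python
from collections import defaultdict

# Hypothetical knowledge-graph edges linking drugs and a disease to
# biological entities (proteins, pathways); all names are made up.
edges = [
    ("drug_X", "protein_P1"), ("drug_X", "pathway_W1"),
    ("drug_Y", "protein_P2"),
    ("disease_D", "protein_P1"), ("disease_D", "pathway_W1"),
]

# Undirected adjacency: each entity maps to its set of neighbors.
neighbors = defaultdict(set)
for a, b in edges:
    neighbors[a].add(b)
    neighbors[b].add(a)

def shared_mechanism_score(drug, disease):
    """Common-neighbor link-prediction score: the number of biological
    entities that connect this drug and disease in the graph."""
    return len(neighbors[drug] & neighbors[disease])

# Rank candidate treatments for disease_D.
for drug in ("drug_X", "drug_Y"):
    print(drug, shared_mechanism_score(drug, "disease_D"))
# drug_X 2  -- shares protein_P1 and pathway_W1 with disease_D
# drug_Y 0  -- no shared mechanism; a candidate to deprioritize
```

A low score does not prove a drug is useless, but ranking candidates this way helps focus wet-lab budgets on the hypotheses the data best supports.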
On top of accelerating the drug discovery process (which can cost over $1 billion on average and span 12 years or more), graph technology is also integral to the emerging field of precision medicine: turning away from a “one-size-fits-all” approach to medical treatment toward a custom approach in which data about an individual patient is used to find treatments targeted for that patient. This enables a more personalized approach to medicine.
Graph technology is still in its infancy in some industries, so its applications in areas such as financial fraud, precision medicine, and information security are only scratching the surface of the technology’s potential. The technology can be applied to fringe areas, such as space exploration, oncology, and even decrypting ancient languages!
In spite of graph computing’s ability to deliver data intelligence at speed and at scale, there are two obstacles that have limited its widespread adoption – a lack of understanding about its capabilities, and the difficulty that many graph platforms have had in interoperating with third-party libraries and other systems in data processing pipelines. These hurdles are now being addressed by graph vendors.
As the amount of data continues to increase – and as organizations continue to struggle with managing unstructured data – organizations must find new and innovative approaches to using this information to extract timely insights. Graph technology is one key part of the overall solution, and graph systems, in conjunction with other analytics technologies, will permit organizations to unlock deep insights from the enormous amount of data they already own.
This is the first of a two-part series. In the next article, Keshav Pingali will explain the best practices systems developers should follow to leverage graph technology.
Keshav Pingali is the CEO and co-founder of Katana Graph, an AI-powered graph intelligence platform providing insights on massive and complex data. Keshav holds the W.A. “Tex” Moncrief chair of computing at the University of Texas at Austin, and is a Fellow of the ACM, IEEE, and AAAS.