02.2018

The Drilling Rigs of the Big Data Age

2017 was the year the phrase "data is the new oil" went mainstream. It is true that we are awash in all kinds of data, and the value chain is clear: large amounts of data, machines/software used to analyze that data, insights derived from it, and money generated from those insights.
I’ve met with thousands of companies in this space. Here are some trends that are creating the need for a new class of companies that will serve as the drilling rigs of the big data age:

  • The ratio of communication to compute has risen dramatically over the last 10 years. Engines built for data centers need to adapt to this.
  • The amount of data we need to process can no longer be stored on a single server. It must be stored on hundreds or thousands of servers, which further increases the amount of communication required. The use of solid-state storage has led to a 100-fold increase in performance, putting more demands on network infrastructure.
  • Microservices are driving new ways for applications to be built and managed.
  • AI/ML also needs very large amounts of data in order to function.
  • Moore’s Law (the observation that the number of transistors in a dense integrated circuit doubles approximately every two years) is coming to an end.
  • Edge data centers are on the rise for security as well as to deliver real-time, low-latency application performance.

As a result of these trends, we see five big areas of opportunity:

1. There will be a new Intel.

Over the next decade, we will see a flowering of silicon architectures engineered for specific domains, as we are already seeing with GPUs applied to machine learning. One example from our portfolio is Fungible, led by Dr. Pradeep Sindhu, founder of Juniper Networks, which is building a specialized silicon architecture called a data processing unit (DPU) that can move data rapidly and connect different application engines in the modern data center.

2. There will be a new Akamai.

We have moved beyond the era of CDNs into one where edge data centers will require full-stack application platforms that guarantee reliability and security while delivering real-time processing for data-first apps with low latency. Entrepreneurs with software, systems, and networking experience will be ideally suited to create these companies, forming the new category of the data delivery network (DDN).

3. There will be a new Oracle.

The era of big data and multi-cloud favors NoSQL over traditional SQL relational databases and requires converged platforms that can support all data, on one platform, and on every cloud.

4. There will be a new Business Objects/Tableau.

Data visualization takes on a whole new dimension when the underlying datasets are multi-modal and huge. Analytics software needs to run directly within modern data platforms such as Apache Hadoop and the cloud, analyzing large volumes of data without moving it and filling the gap between self-service BI and advanced analytics for use cases like cybersecurity, IoT, and customer intelligence.

5. There will be other industry pioneers in management and automation.

As microservices emerge as the metaphor for DevOps, automation becomes the lifeblood. Tools like Kubernetes, Mesos, and Swarm automate container clusters. There is a need for a platform that automates the deployment and operation of data services at scale.

At the risk of pushing the oil analogy too far, the age of big data presents many wildcat opportunities to build the new supermajors.

This article was originally published on insideBIGDATA.
