This article was contributed by Oliver Scheibenberger, Singlestore’s Chief Innovation Officer.
Satya Nadella, CEO of Microsoft, coined the term tech intensity to capture the combination of technology adoption and technology creation. Companies can accelerate their growth by first adopting best-in-class technology and then building their own unique digital capabilities.
Over the past decades, technological innovation has followed a familiar pattern toward digital transformation in almost every industry and application area. Innovation shifts from industrial technology (machines, manufacturing) to computing technology (hardware) to data technology (software). Connecting things has evolved from building roads and railroad tracks, to wiring computers together, to software-defined networking. Automating intelligence has evolved from industrial machines that replace muscle power, to translating well-understood logic into machine instructions (e.g., tax preparation software), to modern AI systems that derive their own logic from data (e.g., natural language interaction).
Computer science also experienced this change as a discipline when it shifted its focus from computing to data almost 20 years ago. This data-centric approach to computer science brought us the modern incarnations of machine learning and data science.
The shift to data technology does not eliminate the earlier phases of technology innovation. We still use roads. Below the software-defined network, there are wired computers somewhere. Codified knowledge systems still have their place – who would want their taxes computed by a neural network trained on last year’s returns?
But as the world digitally transforms, the shift toward data-driven technologies is the logical consequence. The increase in tech intensity that we experience today is an increase in data intensity.
Data intensity
In physics, intensity is the magnitude of a quantity per unit of area. For example, the intensity of sound is the power carried by sound waves per unit area (I = P/A). Colloquially, intensity is understood as a high degree of strength or force. Both the colloquial and the physical definitions of intensity are useful in our context, although we will not attempt a mathematical formula for data intensity.
Data intensity is about the characteristics and properties of data – such as volume, velocity, variety, and structure – and the effort it takes to convert data into value.
In his book Designing Data-Intensive Applications, Martin Kleppmann distinguishes data-intensive applications from compute-intensive applications based on the nature of their primary constraints. In compute-intensive applications, you worry about CPU, memory, storage, networking, and the computing infrastructure. In data-intensive applications, the data itself becomes the primary challenge and concern.
This shift follows the familiar pattern toward data technology. The underlying computing infrastructure is still essential, but automated provisioning and deployment, auto-scaling, and infrastructure-as-code ease the computing concerns. When you worry about scaling the application database, adding real-time text search to a mobile app, adding recommendations based on click-stream data, or managing data privacy across cloud regions, your application has become data-intensive.
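To make one of these data-intensive concerns concrete, here is a minimal sketch – all session data and function names are hypothetical, not from any particular product – of how recommendations might be derived from click-stream data by counting which products are viewed together in the same session:

```python
from collections import Counter, defaultdict
from itertools import combinations

# Hypothetical click-stream data: one list of viewed products per user session.
sessions = [
    ["laptop", "mouse", "keyboard"],
    ["laptop", "mouse"],
    ["monitor", "keyboard", "mouse"],
]

# Count how often each pair of products is viewed in the same session.
co_views = defaultdict(Counter)
for session in sessions:
    for a, b in combinations(set(session), 2):
        co_views[a][b] += 1
        co_views[b][a] += 1

def recommend(product, k=2):
    """Return the k products most often co-viewed with `product`."""
    return [p for p, _ in co_views[product].most_common(k)]

# "mouse" ranks first for "laptop": they appear together in two sessions.
print(recommend("laptop"))
```

At real click-stream volumes, this naive in-memory counting is exactly what stops working – which is the point of the paragraph above: the data, not the logic, becomes the constraint.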
The concept of data intensity extends from applications to organizations. An organization’s data intensity increases as it manages a greater variety of data (e.g., by volume, type, velocity), becomes more data literate, adopts more data-based technologies (e.g., data integration, data flow, no-code ELT), and builds more of its own unique data-based assets (e.g., inference models).
Increasing data intensity should be a good thing. As the focus shifts from data-center operations to data-centric operations, the rate of innovation should increase. As data literacy rises, better decisions should be made. Software-defined technologies should make processes programmable, reduce risk, and make the organization more adaptable. Building your own predictive models should increase differentiation and enable better customer experiences through personalization.
Alas, many organizations do not have that experience. Instead of focusing on how to get the most out of their data, they face massive barriers created by the challenges surrounding data. Instead of fueling the digital transformation, data intensity seems to stifle it.
- Building an application that combines structured operational data, unstructured data in document stores, and real-time geospatial analytics can become very complex if it requires stitching together ten different technologies.
- Data stored in separate systems needs to be integrated for reporting, which causes time-consuming and costly data movement and data duplication.
- A lack of skills and scale makes it difficult to create unique data assets from your own data.
- Data systems that approach their scale limits often do not slow down gracefully. When they hit the wall, they hit it hard.
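The first two points above can be sketched in a few lines. Assume – all names and records here are hypothetical – that order records live in a relational table while customer profiles live in a document store as JSON. Even this trivial case forces the application to write glue code to parse, index, and join the two shapes of data by hand, before any geospatial or real-time layer is added:

```python
import json

# Structured operational data, e.g., rows from a relational orders table.
orders = [
    {"order_id": 1, "customer_id": "c1", "total": 99.0},
    {"order_id": 2, "customer_id": "c2", "total": 45.5},
]

# Unstructured data, e.g., customer profiles from a document store.
profile_docs = [
    '{"customer_id": "c1", "name": "Ada", "city": "Berlin"}',
    '{"customer_id": "c2", "name": "Sam", "city": "Lisbon"}',
]

# Hand-written integration: parse the documents and index them by key.
profiles = {doc["customer_id"]: doc for doc in map(json.loads, profile_docs)}

# Join the two shapes of data in application code to build a report.
report = [
    {
        "order_id": order["order_id"],
        "customer": profiles[order["customer_id"]]["name"],
        "city": profiles[order["customer_id"]]["city"],
        "total": order["total"],
    }
    for order in orders
]

print(report)
```

Every additional data technology in the stack multiplies this kind of movement, duplication, and reconciliation logic – which is how data intensity turns into friction.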
When data intensity leads to complexity and friction, the results are negative. People, processes, and technologies adapted to one level of data intensity cannot cope with the next level: when the number of users grows tenfold, the data volume triples, or predictions are required where only descriptive statistics are computed today.
Data intensity becomes a surrogate measure for digital transformation; in combination with complexity, it is a measure of digital maturity and resilience. In the coming years, the objectives, key results, and KPIs of many organizations will be tied to data intensity in order to reach that maturity level.
Oliver Scheibenberger is the Chief Innovation Officer at Singlestore.
Welcome to the VentureBeat community!
DataDecisionMakers is where experts, including tech people working on data, can share data-related insights and innovations.
If you want to read about cutting-edge ideas, up-to-date information, best practices, and the future of data and data tech, join us at DataDecisionMakers.
You might even consider contributing an article of your own!
Read more from DataDecisionMakers