New enterprise applications are hungry for data, and the industry’s current approach to providing that data has created security and management problems. The traditional way that companies feed these data-hungry applications involves making a lot of data copies. For example, when an enterprise deploys a new application, the IT team must identify the relevant data, make a copy of it, and store that copy in an environment that supports the application. That environment may be on-premises or in the cloud.

This seems like a simple and straightforward solution, but it’s contributing to an underlying problem that is growing worse as the number of applications and data copies increases. The greatest challenge is protecting and managing all of those copies and the compute and storage resources that each application requires.

The Hidden Costs of Mass Data Fragmentation

Rawlinson Rivera
Rawlinson Rivera is the Global Field Chief Technology Officer at Cohesity, Inc., and is well known as an industry thought leader on cloud enterprise architectures and hyper-converged infrastructures. Rawlinson joined Cohesity’s leadership team following 10 years at VMware, where he was most recently the Principal Architect in the Office of the CTO for the Storage and Availability Business Unit. Rawlinson has also authored numerous books on VMware and Microsoft technologies, and is the author of the popular blog PunchingClouds.

Provisioning additional compute and storage resources for an application is not a one-time expense. For example, if a company runs an application in a public cloud environment, it pays a monthly fee to the cloud provider for the compute, network and storage capacity required to support the workload. And even if the application runs on-premises, it raises the demand for data center resources, which include electricity and real estate costs along with the physical data center hardware.
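To put rough numbers on that recurring cost, here is a purely illustrative back-of-the-envelope calculation; the dataset size, number of copies and per-gigabyte price are assumptions made for the sake of the example, not vendor quotes.

```python
# Illustrative only: every figure below is an assumption, not real pricing.
dataset_tb = 50                    # size of the source dataset, in terabytes (assumed)
copies = 4                         # one copy per downstream application (assumed)
price_per_gb_month = 0.02          # assumed object-storage price, $/GB-month

dataset_gb = dataset_tb * 1024
monthly_storage_cost = dataset_gb * copies * price_per_gb_month
print(f"~${monthly_storage_cost:,.0f} per month for storage alone")  # ≈ $4,096/month
```

Compute, network egress and backup charges for each copy’s environment come on top of that, every month, for as long as the copies exist.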


Perhaps more important than the monetary cost is how adding more storage and compute resources increases the burden on the IT team. Managing and maintaining new data copies for different applications and use cases adds to the workload of IT staff, who are usually full-time employees with a solid understanding of data management policies and regulatory compliance. Given today’s demand for engineers, it is already a challenge to maintain adequate staffing to manage the organization’s existing infrastructure, let alone plan for future growth and new data use cases. If new applications are brought on without expanding IT support, or without deploying modern data management solutions, the team can quickly become overwhelmed by the growing administrative demands and costs of managing those applications.

Increasing Risk of Data Security and Compliance Violations

Creating data copies for new applications also increases the risk of data breaches and compliance violations, particularly for companies in industries such as healthcare or financial services that work with sensitive information. Every time enterprise data is copied, the risk of a data leak is multiplied. The problem becomes even more difficult when data copies are moved between different environments, which often have different levels of protection and control.


For example, using a cloud-based application means that some company data will be stored in the cloud where the application is running. Depending on the function of the application, this data may include sensitive information that would normally be stored in an environment with strict access controls and security mechanisms. Making sure these access controls and security mechanisms are replicated in the cloud becomes an additional task passed on to the IT team.


The risk associated with data copies has grown as regulations governing data security and privacy have multiplied. These regulations often require companies to delete sensitive data after a certain amount of time or once a user discontinues service. As data is copied across multiple locations for different uses, it becomes much more difficult for IT admins to ensure that data is removed or deleted at the appropriate time. Introducing processes for removing data to keep pace with the changing regulatory environment becomes harder and harder as data is copied across more applications and infrastructures.
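To make the operational difference concrete, here is a minimal sketch of a retention sweep, assuming a hypothetical catalog interface and record fields rather than any specific product’s API. With a single authoritative repository, one pass over one catalog enforces the policy; under a copy-heavy model, the same logic has to run, and stay consistent, in every environment that ever received a copy.

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=365)  # assumed policy: purge records older than one year

def retention_sweep(catalog, now=None):
    """Delete records past their retention period or owned by departed users.

    `catalog` is a hypothetical interface over one authoritative data store.
    If the data has been copied into other environments, this sweep does nothing
    about those copies; each of them needs its own, equally consistent sweep.
    """
    now = now or datetime.now(timezone.utc)
    for record in catalog.list_records():
        expired = now - record.created_at > RETENTION
        if expired or record.owner_discontinued:
            catalog.delete(record.id)
```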

Reversing the Traditional Relationship Between Apps and Data

Given these problems, enterprises should consider a modern approach that doesn’t require additional compute and storage resources for most applications. What does that modern approach look like? It starts by rethinking the way things are done within most organizations today. Why not bring apps to the data instead of bringing data to the apps, as has traditionally been done? In other words, this approach brings the application to a single data repository rather than making copies to bring data to the application. Instead of making data copies and distributing them across different locations, enterprises could allow multiple applications to tap into a single source of data.
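As a minimal sketch of what “bringing apps to the data” could look like in code, the example below assumes a hypothetical SharedRepository class, not any particular vendor’s product or API: each application receives a scoped, governed view of the same underlying dataset instead of its own physical copy.

```python
class SharedRepository:
    """One governed collection of data that many applications read through scoped views."""

    def __init__(self, records):
        self._records = records   # the single physical copy of the data
        self.audit_log = []       # access control and auditing live here, once

    def view(self, app_name, allowed_fields):
        # Each application gets a read-only, field-filtered view of the same data
        # rather than a separate copy in a separate environment.
        def reader():
            self.audit_log.append(app_name)
            for rec in self._records:
                yield {k: v for k, v in rec.items() if k in allowed_fields}
        return reader


repo = SharedRepository([{"customer": "a-123", "ssn": "***", "balance": 1000}])
analytics_view = repo.view("analytics", allowed_fields={"customer", "balance"})
billing_view = repo.view("billing", allowed_fields={"customer", "balance"})
# Neither view ever sees the sensitive "ssn" field, and no copy of it was made.
```

The design point is that policy lives next to the one copy of the data, so adding an application means adding a view, not standing up and securing another environment.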

By reversing the traditional relationship between data and applications, enterprises can manage and protect just one collection of data no matter how many applications are introduced. This would dramatically lower costs by eliminating the need to create separate data copies and provision additional resources for each new application. It would also simplify data management, enabling IT staff to spend less time setting up and maintaining environments for different applications and more time focusing on other critical parts of the business. And with simpler management comes stronger data security: IT teams can focus on protecting a single data repository rather than recreating the same level of monitoring and protection across the multitude of environments where copies of sensitive data are stored under the current approach.


Another benefit of creating a single data repository that can support multiple applications is that it provides immediate, global visibility into enterprise data. Under the traditional approach, enterprises have no way to glean insight from the data until it is copied and moved to a separate environment where they can run an analytics application. With a system that allows an analytics application to be brought to the data, enterprises can eliminate the lag time that results from copying and moving fragments of data to new environments. It can also yield better insights, because the analytics application runs across the complete collection of enterprise data rather than a copied fragment.
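Continuing the hypothetical sketch above (reusing its analytics_view), an analytics job can then query the complete, current dataset in place, with no export step and no lag between copy and analysis:

```python
# Runs against the single authoritative dataset, not a previously exported fragment.
total_balance = sum(rec["balance"] for rec in analytics_view())
print(f"Total balance across all customers: {total_balance}")
```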

Ultimately, the way to deal with data-hungry applications is not to create more data copies, and build out more resources to support them, but to find a better way to share data between multiple applications. Establishing a single source of data that can serve many applications will allow enterprises to maximize the value of their data without increasing the size of their IT staff or their exposure to a data breach.

Feature image via Pixabay.
