Why enterprise security must apply zero trust to data management challenges

Managing complex and sensitive data has never been easy. In this era of expanding workplace connectivity, however, data management tools and methods are clearly under the microscope.

Conventional data management approaches typically grant implicit trust to users who successfully sign in: once a user is authenticated, any data they subsequently connect to is assumed to be authorized for access.

With external and internal cybersecurity threats increasing in both number and intensity, there is heightened pressure to keep every adversary at bay, from lone wolves and insider threats to cybercriminals and nation-state actors.

The new imperative is to evaluate every single connection to the data — not just the first time a network is accessed, a device is connected or a user signs in. It’s not personal; even the most credible and devoted employees could be impersonated, have their credentials stolen or inadvertently encounter data that should be off-limits to them. As such, implicit trust should be a thing of the past.
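To make the contrast concrete, here is a minimal sketch in Python of per-request evaluation. The policy store, device-posture check and all names below are hypothetical, not drawn from any particular product:

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    device_id: str
    dataset: str

def is_device_healthy(device_id: str) -> bool:
    # Stand-in for a real posture check (patch level, attestation, etc.)
    return device_id in {"laptop-ok"}

def current_grants(user: str) -> set:
    # Re-read from the policy store on every call, so revocations and
    # stolen credentials are caught immediately, not at the next login.
    return {"analyst1": {"sales_db"}}.get(user, set())

def authorize(req: AccessRequest) -> bool:
    # Evaluated on every query; there is no "already signed in" shortcut.
    return is_device_healthy(req.device_id) and req.dataset in current_grants(req.user)

assert authorize(AccessRequest("analyst1", "laptop-ok", "sales_db"))
assert not authorize(AccessRequest("analyst1", "laptop-ok", "hr_db"))
```

The point is structural: the allow decision is computed fresh for each connection rather than cached at sign-in.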

“Freeing” the Data Creates New Challenges

At the same time, commercial and government organizations are eager to improve operational performance and agility through digital transformation, which often includes connecting systems to facilitate information sharing and optimize data utilization. But this means freeing historically siloed, sheltered data and using it to derive actionable business intelligence. As a result, security leaders must continually adopt new practices to lock down critical, sensitive data. Many of them are turning to the Zero Trust Architecture (ZTA) framework, published by the National Institute of Standards and Technology (NIST) as Special Publication 800-207 in August 2020.

For instance, the U.S. federal government’s ZTA strategy, announced by the White House in January 2022, requires agencies to meet specific cybersecurity standards and objectives by the end of fiscal year 2024.

Among the targets are encrypting and authenticating all traffic — even internal traffic — and “categorizing data based on protection needs, ultimately building a foundation to automate security access rules.” Reconciling these seemingly contradictory data storage and management demands is enough to keep any enterprise risk and security leader up at night.

Leveraging a Zero Trust Approach for Data Management

Adopting a zero trust approach can help organizations address these challenges while safely making complex and sensitive data accessible for the most demanding analytics, data science or AI tasks. At its core, a zero trust approach to data management assumes that everyone — and every transaction — is a potential threat.

But with so many potential threats and myriad uses for an ever-increasing amount of data, what steps can organizations take to secure their data?

Here are four key data management capabilities that enterprise security and risk leaders should consider to bring a zero trust mindset to protecting organizational data:

1. Fine-grained access control

With the increasing diversity and volume of data comes the need to manage data of many different sensitivities. Organizations should prioritize the ability to co-locate data of mixed sensitivities to enable a “connecting of the dots” across these datasets. Fine-grained access control allows administrators to grant users the access they need without handing over the “keys to the kingdom” to data they do not need.

As a result, each team member can access only the data they need. Even system administrators would not know a dataset or record exists unless they are specifically granted access to it, further guarding against potential internal bad actors.
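As a rough sketch of how record-level control can work, the following Python assumes a hypothetical scheme in which every record carries security labels and a query returns only records fully covered by the caller’s grants:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Record:
    doc_id: str
    payload: dict
    labels: frozenset  # e.g. {"finance", "pii"}; hypothetical label names

def visible(records, grants: frozenset):
    # Keep only records whose labels are fully covered by the caller's
    # grants. Unauthorized records are omitted entirely, so their very
    # existence is never revealed to the caller.
    return [r for r in records if r.labels <= grants]

store = [
    Record("r1", {"amount": 10_000}, frozenset({"finance"})),
    Record("r2", {"salary": 95_000}, frozenset({"finance", "pii"})),
]
print(visible(store, frozenset({"finance"})))          # r1 only
print(visible(store, frozenset({"finance", "pii"})))   # r1 and r2
```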

2. Ingesting and storing any data

This capability should extend to “non-traditional” data sources (everything from Internet of Things telemetry and documents to video and audio) without requiring ETL (extract, transform, load) or pre-processing. Especially for business analytics, these non-traditional sources are becoming just as important as structured data, but they present new challenges by dramatically increasing the overall volume of data available.

Data management technology should be able to ingest new types and sources of data quickly, without building individual “stove-pipe” integrations, by leveraging a range of reusable connectors and parsers and by making it easy to add new ones.
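One way to approach this, sketched below in Python with assumed names, is a registry of reusable parsers keyed by content type: every payload is stored raw alongside provenance metadata, and parsing happens only where a connector already exists:

```python
import json
from datetime import datetime, timezone

# Hypothetical connector registry: each parser normalizes one content type.
PARSERS = {}

def parser(content_type):
    def register(fn):
        PARSERS[content_type] = fn
        return fn
    return register

@parser("application/json")
def parse_json(raw: bytes) -> dict:
    return json.loads(raw)

def ingest(raw: bytes, content_type: str, source: str) -> dict:
    # The raw bytes are always kept; "parsed" stays None for content
    # types with no connector yet, so nothing is ever turned away.
    body = PARSERS.get(content_type, lambda _: None)(raw)
    return {
        "source": source,
        "content_type": content_type,
        "received_at": datetime.now(timezone.utc).isoformat(),
        "raw": raw,
        "parsed": body,
    }

# Registering a new parser later requires no re-ingestion and no
# one-off "stove-pipe" pipeline for the new source.
doc = ingest(b'{"temp": 21.5}', "application/json", "iot/thermostat-7")
```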

3. Late binding of data

Simply put, organizations need the ability to assign context to data when it is used, not when it is ingested. Presuming at the outset how different teams might use the data inherently limits how it can actually be used in the future. Storing the data in its raw form and assigning context at the time of use greatly increases the value of that data through expanded use cases.

To maximize its value, the same data should be usable in many different ways and by many different data consumers.
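A simple Python illustration of the idea, with hypothetical field names: the record is stored once in raw form, and each consumer binds its own context at read time:

```python
# One raw event, ingested as-is with no schema imposed up front.
raw_event = {
    "ts": "2024-03-01T12:00:00Z",
    "lat": 38.8977, "lon": -77.0365,
    "device": "sensor-42",
    "reading": 97.1,
}

def as_timeseries(event):
    # An analytics team's view: just timestamp and value.
    return (event["ts"], event["reading"])

def as_geopoint(event):
    # A mapping application's view of the very same record.
    return {"coordinates": [event["lon"], event["lat"]], "label": event["device"]}

print(as_timeseries(raw_event))
print(as_geopoint(raw_event))
```

Because no view was baked in at ingest, a third consumer can later bind yet another context to the same stored record.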

4. Speed of delivery

While organizations are being asked to support more use cases for more data, they are also being asked to deliver these capabilities in shorter time windows. The days of long development cycles are over; they are simply unacceptable in today’s quickly changing data landscape. Users demand the ability to quickly test new ideas and develop new applications.

What’s more, advances in artificial intelligence and machine learning add further complexity to data management. Organizations must find ways to enable DevSecOps activities on top of all their data, resulting in faster time to market and continually improving production capabilities.

With these approaches in mind, any security-conscious organization can benefit from applying a zero trust mindset to its data management practices, particularly organizations with data of mixed sensitivities, because the most valuable data is often the most sensitive.

Applying this flexible, unified security model across all data — at both the dataset and record level — ensures safe access to complex and sensitive data, thus increasing the value and utilization of all data within the organization.

*This article first appeared in Security Magazine.