
Establishing Robust Data Management to Capitalize on the Digital Revolution


Data management is key for today’s automated and AI-powered businesses, where employees, suppliers, operational devices, and processes are dispersed across multiple sites, regions and, increasingly, the world.

Now, more than ever, it is essential for businesses to put robust data management policies in place as they cope with a flood of data. As investments in the latest technologies accelerate, their impact on data volumes may not be fully appreciated.

Consider data fidelity: video from CCTV or collaboration tools such as Teams or Zoom upgrading from HD to 4K drives exponential growth in both data volumes and processing requirements. Or consider the number of data points a business generates, which is growing at an unprecedented rate as better data and analytics themselves prompt more questions and more requests for data.

The number of data sources is growing too, from mushrooming deployments of connected IoT devices to personal tech such as wearables. All this data needs to be organized, normalized, indexed, and aggregated with some kind of directory structure just so that it can be consumed.
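To illustrate that normalize-and-index step, here is a minimal Python sketch, assuming two hypothetical source schemas (field names such as dev, deviceId and timestamp are illustrative, not taken from any specific product):

```python
from datetime import datetime, timezone
from collections import defaultdict

# Hypothetical raw records from two different sources with inconsistent schemas.
iot_reading = {"dev": "sensor-17", "ts": "2024-05-01T09:30:00Z", "temp_c": 21.4}
wearable_reading = {"deviceId": "watch-02", "timestamp": 1714555800, "heartRate": 72}

def normalize(record: dict) -> dict:
    """Map differing source fields onto one common schema."""
    device = record.get("dev") or record.get("deviceId")
    raw_ts = record.get("ts") or record.get("timestamp")
    ts = (datetime.fromtimestamp(raw_ts, tz=timezone.utc)
          if isinstance(raw_ts, (int, float))
          else datetime.fromisoformat(raw_ts.replace("Z", "+00:00")))
    metrics = {k: v for k, v in record.items()
               if k not in ("dev", "deviceId", "ts", "timestamp")}
    return {"device": device, "timestamp": ts, "metrics": metrics}

# Aggregate normalized records into a simple index keyed by device.
index = defaultdict(list)
for raw in (iot_reading, wearable_reading):
    rec = normalize(raw)
    index[rec["device"]].append(rec)

print({device: len(records) for device, records in index.items()})
```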

Inadequate data management, further strained by surging volumes and an ever-increasing number of data sources, can also heighten cyber security threats. Securing data access, governance, relevance, and integrity needs to be prioritized. Data cleansing is vital too, and with more data it will become more costly.

Are organizations considering all of this as part of the business case for new technology investments?

For example, a bank may wish to offer customers personalized insights on their finances based on their account activity, both to improve the service experience and to help with fraud detection. This also offers the bank marketing insights on individual customers, such as their major life events. When aggregated across the customer base, it may help to identify demographic or regional trends. It’s a win-win for the customer and the bank, which benefits from insight-driven up-sell opportunities.

To do this effectively, inbound data needs to be cleaned and the models that interpret it trained. This involves intensive processing, but increasingly Machine Learning can be used to read data at volume and Large Language Models (AI) to interpret it, bringing the time to value right down. The challenge is that this comes at a cost, and it is why data management is key.
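A minimal Python sketch of the kind of interpretation step described above, assuming illustrative transaction data and a crude keyword categorizer standing in for the ML and LLM components (all names and thresholds are hypothetical):

```python
from statistics import mean, pstdev

# Hypothetical cleaned transactions for one customer (merchant text, amount).
transactions = [
    ("TESCO STORES 3411", 54.20),
    ("SPOTIFY", 9.99),
    ("NURSERY FEES MAY", 850.00),
    ("TESCO STORES 3411", 61.75),
    ("CAR DEALER DEPOSIT", 2500.00),
]

# Crude keyword categorization standing in for an ML/LLM interpretation step.
CATEGORIES = {"TESCO": "groceries", "SPOTIFY": "subscriptions",
              "NURSERY": "childcare", "CAR": "major purchase"}

def categorize(description: str) -> str:
    for keyword, label in CATEGORIES.items():
        if keyword in description.upper():
            return label
    return "other"

amounts = [amt for _, amt in transactions]
avg, spread = mean(amounts), pstdev(amounts)

for description, amount in transactions:
    label = categorize(description)
    # Flag amounts far above the customer's norm as candidate "life events"
    # or fraud-review items; a production system would use trained models.
    flag = " <-- review" if spread and amount > avg + 1.5 * spread else ""
    print(f"{description:22} {label:15} {amount:8.2f}{flag}")
```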

While the definition of data management is straightforward - collecting, organizing, and maintaining data to ensure accuracy, security, and accessibility - how it is done is constantly evolving. For all organizations - but especially those operating internationally - the trend towards more stringent regulation must also be considered.

National and regional regulators are quickly adapting to and developing rules around how organizations must manage the data they oversee. Data sovereignty has gained prominence in recent years. This is the extent to which data is subject to the laws and regulations of a particular jurisdiction, regardless of where it is stored.

As organizations increasingly digitalize, they are managing more data, in more places, under increasingly stringent regulations. As they do this, there are three key areas to focus on: managing data volume, increasing data literacy among employees, and navigating data sovereignty.

Managing data volume

Data influx is not limited to customer-generated content; rather, it encompasses a wide array of sources, including data from a booming number of operational devices (IoT), social media interactions, and transaction records.

Building a secure and scalable data infrastructure is the cornerstone of managing surging data volumes. Scalable storage solutions, robust data centers, and efficient data pipelines are essential components of a well-architected strategy. Utilizing public cloud services can offer flexibility and scalability, allowing businesses to adapt to changing data needs without the up-front capital cost, time, and hassle of building their own data center infrastructure.

Amidst the flood of data, maintaining data quality becomes paramount. This requires strong frameworks, automating data cleansing processes, and regular, predictable data audits. Ownership and accountability should be front and center: encouraging employees to take responsibility for the quality of data they interact with enhances the all-important team-player approach.
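As a concrete illustration of what an automated cleansing step can look like, here is a minimal Python sketch, assuming hypothetical customer records and validation rules; the audit counts mimic the kind of evidence a regular data audit would collect:

```python
# Hypothetical customer records arriving from multiple upstream systems.
raw_records = [
    {"id": 1, "email": "ana@example.com", "age": 34},
    {"id": 1, "email": "ana@example.com", "age": 34},   # duplicate
    {"id": 2, "email": None, "age": 29},                 # missing field
    {"id": 3, "email": "leo@example.com", "age": 230},   # out-of-range value
    {"id": 4, "email": "kim@example.com", "age": 41},
]

def cleanse(records: list[dict]) -> tuple[list[dict], dict]:
    """Deduplicate, reject incomplete or implausible rows, and return audit counts."""
    seen, clean = set(), []
    audit = {"input": len(records), "duplicates": 0, "rejected": 0}
    for rec in records:
        if rec["id"] in seen:
            audit["duplicates"] += 1
            continue
        if not rec.get("email") or not (0 < rec["age"] < 120):
            audit["rejected"] += 1
            continue
        seen.add(rec["id"])
        clean.append(rec)
    audit["output"] = len(clean)
    return clean, audit

clean, audit = cleanse(raw_records)
print(audit)  # {'input': 5, 'duplicates': 1, 'rejected': 2, 'output': 2}
```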

Increasing data literacy

Data literacy is often overlooked. Employees must be able to read, analyze and interpret data effectively. In an era where data-driven decision-making is the norm, businesses should prioritize data literacy to ensure their people can harness data to drive innovation and productivity.

Effective data literacy begins with proper data governance. This means clear data policies, standards, and procedures to ensure data is properly governed - wherever and whenever people log in to work. Major steps include defining data ownership, access controls, and data classification. Data governance frameworks provide a structured approach to managing data assets and ensuring data quality and security.
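A minimal Python sketch of how those governance building blocks fit together, assuming a hypothetical dataset catalogue, role clearances, and classification levels (none of which correspond to a specific framework):

```python
from enum import IntEnum

class Classification(IntEnum):
    """Example data classification levels, ordered by sensitivity."""
    PUBLIC = 1
    INTERNAL = 2
    CONFIDENTIAL = 3
    RESTRICTED = 4

# Hypothetical dataset catalogue entries: owner plus classification.
CATALOGUE = {
    "marketing_web_stats": {"owner": "marketing", "level": Classification.PUBLIC},
    "customer_accounts":   {"owner": "retail_banking", "level": Classification.RESTRICTED},
}

# Hypothetical clearance levels granted to roles by the governance framework.
ROLE_CLEARANCE = {
    "analyst": Classification.INTERNAL,
    "fraud_investigator": Classification.RESTRICTED,
}

def can_access(role: str, dataset: str) -> bool:
    """Allow access only when the role's clearance meets the dataset's classification."""
    entry = CATALOGUE[dataset]
    return ROLE_CLEARANCE.get(role, Classification.PUBLIC) >= entry["level"]

print(can_access("analyst", "customer_accounts"))             # False
print(can_access("fraud_investigator", "customer_accounts"))  # True
```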

Investing in training and education strengthens data literacy among employees. Programs can take various forms, ranging from workshops and seminars to online courses. When hiring, it is important to integrate data literacy into the onboarding process.

Navigating data sovereignty

Data sovereignty is becoming increasingly complex, especially for organizations operating across multiple jurisdictions, with differing rules and guidance on data processing and storage.

To establish robust data management, organizations must comply with local regulations wherever they gather, store and process data. Failure to do so can result in legal repercussions and reputational damage. They must stay updated on evolving data privacy laws, such as the General Data Protection Regulation (GDPR) in Europe and the California Consumer Privacy Act (CCPA) in the United States.

The physical location of the data centers that deliver an organization’s cloud services can have significant implications for data sovereignty. Some jurisdictions have strict data localization requirements, mandating that certain data types must be processed within their borders. Organizations must carefully select data center locations to align with these requirements.
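To make that concrete, here is a minimal Python sketch of a residency check performed before data is placed, assuming hypothetical jurisdictions, data types, and region names (the rules themselves are purely illustrative and would in practice come from legal and compliance teams):

```python
# Hypothetical data-residency rules: which regions may store each data type,
# per jurisdiction. Illustrative only.
RESIDENCY_RULES = {
    ("EU", "personal_data"): {"eu-west-1", "eu-central-1"},
    ("EU", "telemetry"):     {"eu-west-1", "eu-central-1", "us-east-1"},
    ("US", "personal_data"): {"us-east-1", "us-west-2"},
}

def allowed_regions(jurisdiction: str, data_type: str) -> set[str]:
    """Return the cloud regions permitted for this data type in this jurisdiction."""
    return RESIDENCY_RULES.get((jurisdiction, data_type), set())

def select_region(jurisdiction: str, data_type: str, preferred: str) -> str:
    """Pick the preferred region if permitted, otherwise fall back to any allowed one."""
    permitted = allowed_regions(jurisdiction, data_type)
    if not permitted:
        raise ValueError(f"No permitted region for {data_type} in {jurisdiction}")
    return preferred if preferred in permitted else sorted(permitted)[0]

print(select_region("EU", "personal_data", preferred="us-east-1"))  # falls back to an EU region
print(select_region("EU", "telemetry", preferred="us-east-1"))      # us-east-1 is permitted here
```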

To comply with this, some organizations deploy a fully sovereign cloud that keeps all data and metadata on sovereign soil and prevents foreign access to it. This creates a trusted and secure environment but gets expensive when multiplied by the number of jurisdictions the organization operates in. Furthermore, it is unlikely to deliver the economies of scale a large organization expects or the speed of innovation of public cloud, hampering competitiveness.

In this case, a distributed cloud should be considered - one that runs in multiple locations, including the cloud provider’s infrastructure; the organization’s own data center or sites such as offices or factories; another cloud provider’s data center; or third-party or colocated hardware.

To power this solution, organizations require a “cloud-centric” network - one that provides compatibility across all these clouds, offers interoperability, and smooths data exchange in a way that doesn’t lead to excessive egress and bandwidth charges when data is moved from one public cloud to another.

In effect, this type of managed distributed cloud moves to the data, complying with governance and regulations across multiple jurisdictions.

By investing in a cloud-centric infrastructure, data quality, analytics, data governance, training and compliance, multinational organizations can build a strong foundation for effective data management, ensuring that data remains a strategic asset rather than an overwhelming liability.

Author

Colin Bannon is the CTO for BT Business and is responsible for defining the technology vision for BT customers. He was instrumental in the launch of Global Fabric, an end-to-end programmable platform that connects customers seamlessly to the cloud and bridges the gap between public and private clouds.
