Remote work models, accelerated by the Covid-19 pandemic, have pushed firms to adopt new cloud services as they look to maintain communications and collaboration. Hackers and cybercriminals have recognized the opportunity this expanded attack surface represents. Maximizing data protection, even for organizations that are not held to the strictest data protection, retention, and security regulations, is now front-and-center in the “new normal”.
Multi-cloud services platforms, for example global file systems, allow users to collaborate on projects with ease, no matter where they are located inside or outside network boundaries. Handled right, they can help organizations manage the data explosion, and with it the shift to remote workforces, by moving the complex and costly traditional storage model to the cloud.
Most of these services are promoted as secure, but as ever, the weakest link in the fabric that transforms cloud storage into a global file system is a solution that was not purpose-built for the cloud in the first place. Sharing large volumes of data in an untrusted cloud remains a challenge, and when catastrophic security failures occur, such as ransomware schemes and other clear and present dangers, there is notoriously little a CISO can do, no matter how robust their infrastructure is.
As a troubling second wave of the pandemic comes into view, the need is urgent and uncontested: cloud-native file solutions that can prevent external attacks and data loss from the cloud and unmanaged devices, without adding friction for employees working from home, in support of the health and safety of the enterprise.
SMB exploits and dark site security
SMB is a network file sharing protocol that requires an open port on a computer or server, generally port number 445 on recent versions of Windows, to communicate with other applications and systems over the internet. Traffic to SMB port 445 has been reported to account for 556 million attacks, perhaps best exemplified by the WannaCry exploits.
Port 445 is of special concern to certain entities because it represents a potential area that a hacker would focus on to gather information. A private secure site, also known as a dark site, is an installation with security restrictions that requires absolutely no external communications. These installations can include government and military sites and contractors, and highly regulated industries such as healthcare and finance.
The U.S. Computer Emergency Readiness Team (US-CERT) recommends stopping “all versions of Server Message Block (SMB) at the network boundary by blocking TCP port 445 with related protocols on UDP ports 137-138 and TCP port 139, for all boundary devices.”
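Auditing boundary exposure along these lines can be as simple as probing whether the SMB-related TCP ports answer from outside the network. The sketch below is a minimal, hypothetical audit script (not any vendor's tool); it checks TCP ports 139 and 445 only, since the UDP ports 137-138 that US-CERT also names require a different kind of probe.

```python
import socket

# TCP ports US-CERT recommends blocking at the network boundary.
SMB_TCP_PORTS = (139, 445)

def smb_port_open(host: str, port: int = 445, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

def audit_host(host: str) -> dict:
    """Map each SMB-related TCP port to whether it is reachable."""
    return {port: smb_port_open(host, port) for port in SMB_TCP_PORTS}
```

Run from outside the boundary, a result of `{139: False, 445: False}` indicates the recommended blocking is in effect for that host; any `True` flags a port that merits investigation.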
Certainly, as a matter of IT policy, dark sites universally ban open SMB ports in the first instance. Even so, insider threats from credentialed users carry the potential to introduce SMB exploits, because this weakness is a relatively easy target.
Cloud-based file storage systems that use SMB ports 445, 135 and 139 to enable large-scale collaboration across network boundaries unnecessarily expose organizations. In fact, most platforms are designed to use these ports for outbound data traffic, leaving them open to any number of attack possibilities.
In particular, banking and financial services firms with private connections to cloud providers do not want to allow port 445 outbound from their own networks, even via a private link to the cloud. This often leads to an impasse when architecting hybrid cloud NAS deployments.
The only effective measure is to remove SMB ports from a network, denying hackers a tool and eliminating the ports as a threat vector. By default, for example, Advanced Server for UNIX (ASU) and Azure P2S VPN do not use port 445.
In terms of file sharing systems, one way to deal with the problem is an approach that supports private secure site operations, disabling all external public communications in compliance with US-CERT requirements. Moreover, strategies that keep port 445 open only internally, within the organization and its boundary network, or that process SMB-related data inside the network boundary and then send it de-duped, compressed and encrypted to the cloud on another port, as with Panzura, sidestep the issue.
In light of this, decision makers may want to consider solutions that are certified by the U.S. Department of Commerce National Institute of Standards and Technology (NIST) as compliant with the Federal Information Processing Standards (FIPS) 140-2 validation criteria. Especially in the case of dark sites, compliance with the U.S. Government Configuration Baseline (USGCB) for IT products sets baseline IT product configurations that can be deployed across federal agencies.
Moreover, organizations subject to HIPAA regulations require further assurance that these solutions meet strict mandates around the protection of electronic protected health information (ePHI) and other personally identifiable data.
Shared responsibility security
The enterprise is accountable for its portion of the shared responsibility security model that undergirds so many clouds today. But engineering cloud file systems that provide data protection and high availability with no single point of failure has been challenging. The wave of attacks unfolding in the face of Covid-19 is unprecedented, putting new and existing file protections top of mind.
From the standpoint of encryption, FIPS 140-2 compliance provides assurance, as does at-rest data protection with AES-256 encryption, while TLS 1.2 secures data in flight. With these methods, data is compressed and then encrypted (in that order) before it is sent over the wire, rendering it unintelligible even if intercepted. The most advanced cloud-based file systems use military-grade AES-256-CBC encryption for data stored both at the edge and in the cloud, with transport layer security (TLS/SSL) protecting data in transit.
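The ordering matters because ciphertext is statistically random and will not compress, so compression must come first. The stdlib-only sketch below illustrates the principle; the SHA-256-based XOR keystream is a deliberately simplified stand-in for a real cipher such as AES-256-CBC, which in production would come from a FIPS-validated cryptographic library.

```python
import hashlib
import zlib

def _keystream(key: bytes, n: int) -> bytes:
    """Toy SHA-256 counter-mode keystream; an illustrative stand-in, NOT AES."""
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

def xor_encrypt(key: bytes, data: bytes) -> bytes:
    """XOR data with the keystream; applying it twice decrypts."""
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))

key = b"demo-key"
plaintext = b"quarterly report " * 1000  # highly redundant file data

# Correct order: compress, then encrypt -- the redundancy is squeezed out first.
good = xor_encrypt(key, zlib.compress(plaintext))

# Wrong order: encrypt, then compress -- ciphertext looks random and won't shrink.
bad = zlib.compress(xor_encrypt(key, plaintext))

print(len(plaintext), len(good), len(bad))
```

Running this shows the compress-then-encrypt payload is a small fraction of the original, while the encrypt-then-compress payload is slightly larger than the plaintext itself.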
Data masking, or obfuscation, makes data unreadable by replacing its characters with randomly chosen data. Cloud file systems that use encryption technology to mask data stored in the cloud, including directory names, file names, and file data, provide an important means of data protection.
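As a toy illustration of the substitution idea (not any vendor's actual implementation), a file name can be masked by replacing every character with a randomly chosen one:

```python
import secrets
import string

ALPHABET = string.ascii_letters + string.digits

def mask(name: str) -> str:
    """Replace every character with random data, preserving only the length."""
    return "".join(secrets.choice(ALPHABET) for _ in name)

print(mask("Q3-payroll.xlsx"))  # prints a random same-length string
```

In practice, file systems mask names with encryption rather than pure randomization, so that authorized clients holding the key can reverse the mapping and recover the original names.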
When connected to the cloud, enterprises should not rely on the cloud provider's encryption and security measures alone. At the same time, poorly applied encryption commonly causes data bloat, and with it storage costs that quickly get out of control. Companies can eliminate cumbersome, duplicative backup and DR processes with dedicated and global HA options, immediate data consistency, granular snapshot restoration, and a near-zero RPO.
Chunking files, de-duplicating them, and compressing the results before sending the data to the cloud is particularly effective, and blending all of these technologies effectively masks the data itself. Even if someone were able to access production data in the cloud, they would see only incoherent strings of characters.
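Conceptually, the pipeline looks like the simplified sketch below (a minimal illustration, not Panzura's actual implementation): files are split into fixed-size chunks, each unique chunk is identified by its SHA-256 digest and stored once in compressed form, and an ordered list of digests is all that is needed to rebuild the file.

```python
import hashlib
import zlib

CHUNK_SIZE = 4096

def store(data: bytes):
    """Chunk, de-duplicate by SHA-256 digest, and compress each unique chunk."""
    recipe = []   # ordered chunk digests needed to rebuild the file
    blobs = {}    # digest -> compressed chunk (what actually goes to the cloud)
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in blobs:          # duplicate chunks are stored only once
            blobs[digest] = zlib.compress(chunk)
        recipe.append(digest)
    return recipe, blobs

def restore(recipe, blobs) -> bytes:
    """Reassemble the original file from its recipe and chunk store."""
    return b"".join(zlib.decompress(blobs[h]) for h in recipe)
```

A file containing many repeated blocks yields far fewer stored blobs than chunks, and the blobs themselves bear no visible resemblance to the original content.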
The combination of public cloud provider data availability and durability, along with enterprise-grade data protection technologies, should be part of any global file system. As we look to protect the wellness and safety of remote workforces, while preserving the integrity, security and privacy of data across organizational boundaries, it is crucial that we consider the most innovative approaches.
The big picture objective, while today we are focused on the pandemic, is to build a secure future for cloud-based collaboration and file transfer as a foundation for real competitive advantage. Locking up this advantage will be the unanticipated dividend of securing the health of our colleagues.
Edward M.L. Peters, Ph.D., Chief Innovation Officer, Panzura