Best Backup and Disaster Recovery Tools, Software, Solutions & Vendors | Solutions Review

What to Expect at Solutions Review's Spotlight with Rubrik on July 20

Solutions Review’s Solution Spotlight with Rubrik is entitled: See How University of Reading Safeguards Their Data with Rubrik.

What is a Solutions Spotlight?

Solutions Review’s Solution Spotlights are exclusive webinar events for industry professionals across enterprise technology. Since its first virtual event in June 2020, Solutions Review has expanded its multimedia capabilities in response to the overwhelming demand for these kinds of events. Solutions Review’s current menu of online offerings includes the Demo Day, Solution Spotlight, best practices or case study webinars, and panel discussions. And the best part about the “Spotlight” series? They are free to attend!

Why You Should Attend

Solutions Review is one of the largest communities of IT executives, directors, and decision-makers across enterprise technology marketplaces. Every year over 10 million people come to Solutions Review’s collection of sites for the latest news, best practices, and insights into solving some of their most complex problems.

With the next Solutions Spotlight event, the team at Solutions Review has partnered with leading zero trust data security vendor Rubrik. The webinar will showcase how the immense volumes of data in your Microsoft 365 environment are at risk, and how Rubrik’s partnership with Microsoft makes its Microsoft 365 protection even stronger. Speakers include:

  • Salvatore Buccoliero, Sales Engineer at Rubrik: Salvatore is a senior SaaS and security sales engineer who enjoys working with customers to secure enterprise data. He is motivated by disruptive products and fast-growing organizations, with experience launching new vendors and distributing products dating back to the early 2000s.
  • Kevin Mortimer, Head of Operations at the University of Reading: Kevin has been a Rubrik customer since 2018. He is a self-motivated, enthusiastic technologist at heart with a focus on service management, and innovative emerging technologies have long been a core value of his infrastructure services team.

About Rubrik

Rubrik is one of the most widely used enterprise data protection solutions in the world. Rubrik provides data protection and data management in hybrid IT environments. The platform is a scale-out-architecture-based data protection tool with cloud integration, live mount for Oracle databases, support for Office 365 backup, and support for SAP HANA backup. Rubrik’s solution is recommended to buyers looking to protect highly virtualized on-prem environments and hybrid environments that leverage Microsoft Azure and AWS.

FAQ

  • What: See How University of Reading Safeguards Their Data with Rubrik
  • When: Thursday, July 20, 2023, at 12:00 PM Eastern Time
  • Where: Zoom meeting (see registration page for more detail)

Register for Solutions Review’s Solution Spotlight with Rubrik FREE

Storage and Data Protection News for the Week of June 30; Updates from Cobalt Iron, DataGrail, Zscaler & More

Solutions Review editors curated this list of the most noteworthy storage and data protection news items for the week of June 30, 2023.

Keeping tabs on all the most relevant storage and data protection news can be a time-consuming task. As a result, our editorial team aims to summarize the week’s top headlines in this space. Solutions Review editors will curate vendor product news, mergers and acquisitions, venture capital funding, talent acquisition, and other noteworthy storage and data protection news items.

Top Storage and Data Protection News for the Week of June 30, 2023

BigID Launches Privacy and Security Connected Application at Snowflake Summit

BigID automatically classifies data and applies policies to identify sensitive data across a customer’s entire data landscape that is subject to security and/or privacy regulations such as CCPA or GDPR. The Powered by Snowflake partnership allows BigID to enable customers to leverage the Snowflake Data Cloud to run security analytics with BigID metadata, delivering advanced insight into and protection of their data.

Read on for more.

Cirrus Data Block Migrations Have Been Added to Azure Storage Migration Program

Cirrus Data’s Cirrus Migrate Cloud solution can assess your current system, find the most cost-efficient Azure resources to serve your running workloads, and then set up and migrate your data to those resources. The migration occurs in the background, and you get to plan your migration cutover – and thanks to Cirrus Data’s cMotion migration cutover technology, you can do this with nearly zero downtime, or no downtime at all for clustered enterprise applications.

Read on for more.

Cobalt Iron Patents Proactive Technology for Automated Remediation of Cyber and Storage Events

This patented technology is unique in that it enables automated health remediation of various failures and conditions affecting storage devices and backup operations. As a result, backup data and operations will become more resilient to storage device failures and cyber threats, thereby improving availability for storage and backup administrators and other IT professionals who are responsible for the health and security of storage and backup resources and operations.

Read on for more.

DataGrail Achieves More Than 2,000 Quality Integrations with Common Enterprise Apps

DataGrail provides that missing link so that companies can act and handle personal information accordingly, giving people choice and control. With DataGrail, customers can see at a glance which integrations are connected, who owns each one, and its historical activity, and make changes as necessary.

Read on for more.

Keepit Launches Backup & Recovery for Microsoft Azure DevOps

Keepit is the world’s only independent, vendor-neutral cloud dedicated to Software-as-a-Service (SaaS) data protection with a blockchain-verified solution, and the Azure DevOps service adds to the company’s already industry-leading coverage for Microsoft’s cloud services.

Read on for more.

New Object First Research Finds Customers Demand Ransomware & Data Recovery

According to the research, 75 percent of consumers would switch to another company after a ransomware attack. Furthermore, consumers are requesting increased protection from vendors that hold their data, with 55 percent favoring companies with comprehensive data protection measures such as reliable backup and recovery, password protection, and identity and access management strategies.

Read on for more.

Quantum Integrates Backup & Data Protection Portfolio with Latest Veeam Data Platform

Quantum offers Veeam customers flexibility and choice when designing their backup storage infrastructure, with products that deliver fast performance to minimize RTO and RPO, lower costs for long-term data archiving, and easy scalability from terabytes to exabytes of capacity.

Read on for more.

Rubrik and Microsoft Partner on Generative AI-Powered Cyber Recovery & Remediation

Rubrik’s ability to provide time-series data insights directly into Microsoft Sentinel enables organizations to address evolving cyber threats and safeguard their most sensitive information. With this integration, the platform is designed to automatically create a recommended task workstream in Microsoft Sentinel by leveraging large language models and generative AI through OpenAI.

Read on for more.

Expert Insights Section

Watch this space each week as Solutions Review editors share new Expert Insights Series articles, Contributed Shorts videos, Expert Roundtable and event replays, and other curated content to help you gain forward-thinking analysis and remain on-trend. All to meet the demand for what its editors do best: bring industry experts together to publish the web’s leading insights for enterprise technology practitioners.

Solutions Review Set to Host Zscaler for Exclusive Spotlight Webinar on July 13

Solutions Review’s Solution Spotlights are exclusive webinar events for industry professionals across enterprise technology. With the next Solution Spotlight event, the team at Solutions Review has partnered with Zscaler and Banner Health to provide viewers with a unique webinar called Learn How Banner Health Ensures Seamless Digital Experiences.

Read on for more.

For consideration in future storage and data protection news roundups, send your announcements to the editor: tking@solutionsreview.com.

Storage and Data Protection News for the Week of June 23; Updates from Cisco, Catalogic Software, Zscaler & More

Solutions Review editors curated this list of the most noteworthy storage and data protection news items for the week of June 23, 2023.

Keeping tabs on all the most relevant storage and data protection news can be a time-consuming task. As a result, our editorial team aims to summarize the week’s top headlines in this space. Solutions Review editors will curate vendor product news, mergers and acquisitions, venture capital funding, talent acquisition, and other noteworthy storage and data protection news items.

Top Storage and Data Protection News for the Week of June 23, 2023

New Arcserve Survey Reveals Ransomware and Data Recovery Vulnerabilities

The survey focuses on government IT departments’ approach to and experience with ransomware and data recovery preparedness. The findings reveal several weaknesses that can hamper government departments’ fight against ransomware and their ability to recover data.

Read on for more.

Barracuda Hires Siroui Mushegian as Chief Information Officer

Mushegian joins Barracuda from BlackLine, where she was most recently responsible for all aspects of BlackLine’s internal corporate IT. Before BlackLine, Mushegian held executive IT leadership roles at WNET New York Public Media, the NBA, Ralph Lauren, and Time Inc.

Read on for more.

Cohesity Announces Inaugural Customer Excellence Awards

The 2023 award winners reflect customers around the globe who are deploying Cohesity solutions to transform the way they protect, store, and manage data. They are being celebrated for their innovation, forward-looking leadership, and impactful achievements in safeguarding company data to protect their business.

Read on for more.

CloudCasa for Velero Now Supports Red Hat OpenShift APIs for Data Protection

Red Hat OpenShift APIs for Data Protection provides APIs to back up and restore Red Hat OpenShift cluster resources, internal images, and persistent volume data. The Red Hat OpenShift APIs for Data Protection Operator installs Velero and the Red Hat OpenShift plugins that Velero uses for backup and restore operations.

Read on for more.

Hitachi Vantara and Cisco Partner on Simplifying Hybrid Cloud Management

As a member of Cisco’s Service Provider program, Hitachi Vantara offers consumption-based managed services to Cisco customers looking for data center and hybrid cloud services. The services can help address a critical shortage of skilled workers in the IT industry, enabling enterprises to adopt new and emerging technologies more effectively and rely on a trusted organization in Hitachi Vantara to streamline their hybrid cloud operations.

Read on for more.

Opaque Systems Unveils Confidential AI & Analytics for Data Protection

Through privacy-preserving generative AI and zero trust data clean rooms (DCRs) optimized for Microsoft Azure confidential computing, Opaque enables multiple organizations to easily and securely analyze their combined confidential data without sharing or revealing the underlying raw data.

Read on for more.

Expert Insights Section

Watch this space each week as Solutions Review editors share new Expert Insights Series articles, Contributed Shorts videos, Expert Roundtable and event replays, and other curated content to help you gain forward-thinking analysis and remain on-trend. All to meet the demand for what its editors do best: bring industry experts together to publish the web’s leading insights for enterprise technology practitioners.

Solutions Review Set to Host Zscaler for Exclusive Spotlight Webinar on July 13

Solutions Review’s Solution Spotlights are exclusive webinar events for industry professionals across enterprise technology. With the next Solution Spotlight event, the team at Solutions Review has partnered with Zscaler and Banner Health to provide viewers with a unique webinar called Learn How Banner Health Ensures Seamless Digital Experiences.

Read on for more.

Continuity Software’s Doron Pinhas and Veeam’s Eric Ellenberg Explain How to Demonstrate Storage & Backup Compliance

Compliance with industry standards and regulatory mandates can absorb a huge amount of time. Organizations need to verify they comply with the different requirements of security frameworks and regulations such as CIS, NIST, PCI DSS, ISO, and others. In this feature, Continuity Software CTO Doron Pinhas and Veeam’s Eric Ellenberg offer tips on how to demonstrate data storage and backup compliance.

Read on for more.

For consideration in future storage and data protection news roundups, send your announcements to the editor: tking@solutionsreview.com.

How To Demonstrate Storage & Backup Compliance: A Practical Guide

Solutions Review’s Premium Content Series is a collection of contributed articles written by industry experts in enterprise software categories. In this feature, Continuity Software CTO Doron Pinhas and Veeam’s Eric Ellenberg offer tips on how to demonstrate data storage and backup compliance.

Compliance with industry standards and regulatory mandates can absorb a huge amount of time. Organizations need to verify they comply with the different requirements of security frameworks and regulations such as CIS, NIST, PCI DSS, ISO, and others.

In addition, many of these standards require organizations to verify that they are carrying out their fiduciary responsibilities concerning Common Vulnerabilities & Exposures (CVEs).

The big problem is time.

Storage & Backup Compliance is Time Consuming

Some organizations spend countless hours manually preparing for compliance-related activities such as a PCI audit. Once the preparations are complete, even more time is absorbed in writing reports that demonstrate compliance—and this is only the beginning of an ongoing process.

According to NIST Special Publication 800-209, Security Guidelines for Storage Infrastructure, organizations are required to “periodically and proactively assess configuration compliance to storage security policy.” The publication breaks this assessment into three steps.

Historically, these have been weak areas within organizations. The reasons are not difficult to comprehend—the scope of compliance for storage and backup systems is immense.

Many of the tools used to scan for vulnerabilities and security misconfigurations do a poor job of identifying storage and backup risks. In fact, they may cause the organization to falsely claim compliance when numerous security threats remain. The reason for this is that compliance often requires specific configurations for systems at all levels of your stack—not just the guest operating system that hosts your applications—working in concert to fulfill the policy’s objective. This includes your storage and backup systems.

Let’s dive a little deeper into this and take a look at six steps to verify storage and backup compliance.

Demonstrating Storage & Backup Compliance – in 6 Steps

System Software

Storage and backup systems suffer from CVEs like any other software, yet many organizations are either unaware that they exist, or have been lulled into a false sense of security that all critical CVEs have been addressed. The plain fact is that storage and backup operating systems are often riddled with vulnerabilities that can enable threat actors to gain unauthorized access, elevate permissions, and run arbitrary code. As well as being present within storage and backup systems, vulnerabilities may also be found in underlying components and modules, including embedded switches, controllers, boards, drivers, firmware, and other components.

Unfortunately, most vulnerability scanners simply fail to assess storage and backup systems. They miss these critical CVEs and misconfigurations.
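To make this concrete, here is a hypothetical sketch of the kind of check a storage-aware scanner performs: comparing installed component firmware against an advisory list. The inventory, CVE IDs, and fixed versions below are invented, and the `packaging` library is assumed for version comparison.

```python
# Hypothetical illustration: flag storage/backup components whose firmware
# is older than the release that fixes a known CVE. The inventory and the
# advisory list are invented examples, not real CVE data.
from packaging.version import Version

# (component, installed_version) as reported by a storage inventory tool
inventory = [
    ("array-controller-fw", "4.2.1"),
    ("fc-switch-fw", "8.1.0"),
    ("backup-agent", "12.0.3"),
]

# minimal advisory feed: component -> (cve_id, first fixed version)
advisories = {
    "array-controller-fw": ("CVE-2023-0001", "4.3.0"),
    "backup-agent": ("CVE-2023-0002", "12.0.2"),
}

for component, installed in inventory:
    if component in advisories:
        cve, fixed = advisories[component]
        if Version(installed) < Version(fixed):
            print(f"VULNERABLE: {component} {installed} < {fixed} ({cve})")
        else:
            print(f"ok: {component} {installed} ({cve} patched)")
```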

SAN Zoning and Masking

A large portion of Enterprise Block Storage is implemented using dedicated, non-IP Storage Area Networking (SAN). To allow hosts to access block storage devices (often referred to as “LUNs”), these networks need to be configured to support “Zones” (somewhat similar to Ethernet VLANs) that pool together hosts and storage devices that can communicate with each other, and “Masking” (somewhat similar to IP ACLs) that further control which block devices can effectively be accessed at various points along the network path.  Network Zoning and Masking mistakes are more common than many people realize. LUNs may have been left accessible to unintended hosts. Replicated copies and snapshots, too, may not have been properly secured. If that is the case, a hacker may be able to mount sensitive data to unauthorized clients.
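As an illustration of auditing zoning and masking for drift, here is a toy sketch comparing an array’s reported masking views against the intended host-to-LUN map; all hosts and LUNs are invented.

```python
# Hypothetical illustration of a LUN-masking audit: compare the masking
# views a storage array actually reports against the intended host-to-LUN
# map. Both data sets are invented for the example.
intended = {
    "lun-001": {"db-host-1", "db-host-2"},
    "lun-002": {"app-host-1"},
}

# what the array reports (e.g. parsed from its CLI or API output)
actual = {
    "lun-001": {"db-host-1", "db-host-2"},
    "lun-002": {"app-host-1", "decommissioned-host-9"},  # drift!
}

for lun, hosts in actual.items():
    unexpected = hosts - intended.get(lun, set())
    if unexpected:
        print(f"{lun}: masked to unintended hosts {sorted(unexpected)}")
```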

Audit Logging Misconfigurations

Many backup systems are not configured sufficiently for audit logging. This manifests in ways such as missing audit log content, audit logs not relayed to central syslog servers, or logging settings that are tweakable by hackers to relay logs to unapproved hosts. These errors make it more difficult for the organization to detect brute force attacks and anomalous behavior patterns. They also impede forensic investigation and can curtail recovery efforts.
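A hypothetical sketch of such an audit-logging check appears below; the settings structure and key names are invented stand-ins for a real backup platform’s configuration export.

```python
# Hypothetical illustration: validate backup-system logging settings
# against policy. The settings dict stands in for whatever a backup
# platform's API or config export returns; the key names are invented.
APPROVED_SYSLOG_SERVERS = {"10.0.5.10", "10.0.5.11"}

settings = {
    "audit_logging_enabled": True,
    "syslog_targets": ["10.0.5.10", "192.168.1.99"],  # second host unapproved
    "log_admin_actions": False,
}

problems = []
if not settings["audit_logging_enabled"]:
    problems.append("audit logging disabled")
if not settings["log_admin_actions"]:
    problems.append("administrative actions are not logged")
for target in settings["syslog_targets"]:
    if target not in APPROVED_SYSLOG_SERVERS:
        problems.append(f"logs relayed to unapproved host {target}")

print("\n".join(problems) or "logging configuration compliant")
```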

Default Accounts and Passwords

A surprising number of storage systems still include default administrative usernames and passwords. These factory settings can easily be exploited to cause serious damage. Compliance efforts must carefully look over the different storage subsystems and respective user accounts to ensure access security policies are properly enforced.
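Below is a hypothetical sketch of sweeping storage systems for leftover factory accounts; the account names and systems are invented, and in practice the data would come from each array’s user-management API or CLI.

```python
# Hypothetical illustration: flag storage systems that still expose
# well-known factory accounts. The account data is invented.
FACTORY_DEFAULTS = {"admin", "root", "service", "manage"}

systems = {
    "array-a": ["svc_backup", "j.smith"],
    "array-b": ["admin", "j.smith"],  # factory account never removed
}

for system, accounts in systems.items():
    leftovers = FACTORY_DEFAULTS.intersection(accounts)
    if leftovers:
        print(f"{system}: default account(s) still present: {sorted(leftovers)}")
```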

Control Over Administrative Access

Configuration drift and oversights result in more user accounts with administrative access than required. An excessive number of administrator accounts increases the attack vectors that can be exploited by malicious actors. Furthermore, storage management components including Command Line and API interfaces often do not follow a least privilege design (aimed at making them accessible only by a minimal number of administrative accounts using an authentication system that complies with security and audit policies). This leaves many storage and backup systems open for data manipulation, theft, and destruction.
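As a rough illustration, the toy check below flags systems whose administrator count exceeds a least-privilege threshold; the role data and threshold are invented.

```python
# Hypothetical illustration: flag systems whose administrator count
# exceeds a least-privilege threshold. The role data is invented.
MAX_ADMINS = 3

role_assignments = {
    "backup-server-1": ["alice:admin", "bob:admin", "carol:operator"],
    "storage-array-1": ["alice:admin", "bob:admin", "dave:admin",
                        "erin:admin", "frank:admin"],  # configuration drift
}

for system, assignments in role_assignments.items():
    admins = [a for a in assignments if a.endswith(":admin")]
    if len(admins) > MAX_ADMINS:
        print(f"{system}: {len(admins)} admin accounts exceeds limit of {MAX_ADMINS}")
```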

Backup Isolation and Immutability

Various standards require that backup data shall be kept in an isolated, inaccessible environment that does not overlap with the production network.
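One way to spot-check the immutability half of this requirement in AWS is sketched below, assuming boto3 with configured credentials; the bucket name is invented, and S3 raises an error when no Object Lock configuration exists on the bucket.

```python
# A minimal sketch: does the S3 bucket holding backups have Object Lock
# enabled? Assumes boto3 with configured credentials; the bucket name is
# invented for the example.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
bucket = "example-backup-bucket"

try:
    config = s3.get_object_lock_configuration(Bucket=bucket)
    status = config["ObjectLockConfiguration"].get("ObjectLockEnabled")
    print(f"{bucket}: Object Lock = {status}")
except ClientError as err:
    if err.response["Error"]["Code"] == "ObjectLockConfigurationNotFoundError":
        print(f"{bucket}: backups are NOT immutable (no Object Lock)")
    else:
        raise
```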

These are just a few of the many security considerations and risks present in any storage and backup system.

Fines and Penalties Galore

Organizations that fail in any of the activities required to demonstrate compliance are subject to heavy fines and penalties. These days, when it comes to regulatory compliance, there are more eyes on backups than ever:

  • PII and PHI/HIPAA-HITECH, for example, are of interest to the SEC, PCI Council, and others
  • SOX and PCI-DSS are very much under the microscope of regulators in financial services, retail, and public corporations
  • Healthcare organizations must watch out for HIPAA compliance lawsuits in federal court
  • Too-big-to-fail organizations follow NIST, FFIEC and more
  • Federal organizations follow NIST
  • Critical Infrastructure organizations must adhere to NERC CIP
  • Retail, Financial, and many others follow PCI

Since GDPR became law in 2016, almost 900 organizations have been fined more than 1.25 billion euros for violations. Amazon Europe alone was fined three-quarters of a billion euros. Fines have been imposed on the likes of WhatsApp, Google, Target, Yahoo, Marriott, Equifax, and Facebook, all doled out for various PII violations.

Access NIST Special Publication 800-209 for a comprehensive set of recommendations on the secure deployment, configuration, and operation of storage and backup systems.

How Advertisers Can Adapt Data Clean Rooms for a Privacy-First World

Solutions Review’s Premium Content Series is a collection of contributed articles written by industry experts in enterprise software categories. In this feature, Opaque Systems’ VP of Partnerships Mark Ailsworth explains how advertisers can adapt data clean rooms for a privacy-first approach.

Ever since the phase-out of third-party cookies began, the AdTech industry has been scrambling. Understanding and using data is crucial to staying successful and well informed: it is how the industry identifies users across different websites to generate personalized ads, run frequency capping, and measure campaign performance and attribution, among other things. Yet, with tightening privacy regulations, it is increasingly difficult to harness valuable data without sacrificing privacy.

Many advertisers have turned to data clean rooms as a solution, but recent regulations – even on a state level – signify a breaking point.

In today’s privacy-first world, data clean rooms must evolve to comply with ever-evolving policies and keep the industry on track. Otherwise, organizations run the risk of paying lofty fines for non-compliance and, more importantly, suffering detrimental impacts on their reputation.

Data Clean Rooms are Important but Have Become Limited

Data clean rooms were developed to provide a secure environment where two or more parties could share data for multiple advertising use cases. Traditional data clean rooms have at least some industry-standard security measures in place to maintain the confidentiality of the data assets added to the environment. In AdTech specifically, a publisher or data provider leverages a data clean room to collaborate with a client by combining, comparing or modeling data across two or more datasets.

Not all data clean rooms provide the same levels of protection and privacy – but they are all a first step to enabling personal privacy while personalizing the consumer experience. Traditionally, there are three main categories of clean rooms, and it’s important to understand the nuances of each before implementation. The main categories include:

  • “Media-Relevant Clean Rooms” are developed by large publishers and walled garden entities to compare first-party customer data to their specific audiences – think Google, Meta, etc.
  • “Data Enhancement Clean Rooms” are developed by Marketing Service Providers to enhance their client’s first-party data for audience-specific and campaign-specific use cases. These use cases include audience insights and segmentation, and multi-touch attribution.
  • “Bring-Your-Own-Data Clean Rooms” are suited to solve big data challenges and allow clients to pull in any partner or data source required for their collaboration projects. This is where we see Confidential Computing and privacy-enhanced technology join the conversation with the likes of Snowflake and Databricks.

But the question remains: why are traditional data clean rooms falling short? The short answer is that new privacy regulations limit the efficacy and usability of traditional data clean rooms, and the industry needs to adjust accordingly. With new regulations in Connecticut, Utah, and Virginia – and more states following suit – advertisers need to prioritize securing data end to end. With penalties under these state-level regulations being incredibly costly (such as Sephora’s recent $1.2 million fine), failing to comply could bring business to a halt.

The Way Forward for Data Clean Rooms Combines Hardware & Software

Third parties need to be trusted when using data clean rooms of any type. Even when datasets are encrypted at rest and in transit, they must be decrypted manually to be used, processed, or modeled. This process opens up sensitive or unencrypted data to exposure. As evidenced by the plethora of data breaches over the years, this practice has diminished customer trust in AdTech organizations. To ensure data clean rooms holistically adhere to privacy regulations, there is a need for a hardware environment where data sets enter fully encrypted and remain that way throughout all data processing. In other words, it means transitioning from a human process to hardware- and software-enabled automation.

This level of complete privacy protection enables multiple parties within and across organizations to share confidential data and perform analytics and AI without violating privacy laws and regulations. For example, if a marketing professional at a Home Improvement Retailer can identify customers who are a few weeks away from moving residences, a tremendous targeting opportunity emerges. By comparing the customer data set to a “new mover” signal dataset in a privacy-safe and completely encrypted process, marketers can hone in on their target audience and take advantage of perfectly timed ad targeting – all without ever having to directly share PII with their 3rd-party data vendor.
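To make the matching idea concrete, here is a toy sketch of one privacy-preserving technique: comparing salted hashes of identifiers rather than raw PII. This is a simplified illustration only (the confidential-computing clean rooms described above go further, keeping data encrypted even during processing), and all names and data are invented.

```python
# A simplified sketch of privacy-safe matching between two parties using
# salted hashes of email addresses instead of raw PII. All data invented.
import hashlib

SHARED_SALT = b"agreed-per-collaboration"  # negotiated once per project

def pseudonymize(email: str) -> str:
    """Hash a normalized email so raw PII never leaves either party."""
    normalized = email.strip().lower().encode()
    return hashlib.sha256(SHARED_SALT + normalized).hexdigest()

retailer_customers = {pseudonymize(e) for e in ["ann@example.com", "bob@example.com"]}
new_mover_signals = {pseudonymize(e) for e in ["bob@example.com", "cid@example.com"]}

# Only the overlap is revealed; neither side sees the other's raw list.
overlap = retailer_customers & new_mover_signals
print(f"matched audience size: {len(overlap)}")
```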

With no risk of human error, consumers can shift focus from institutional trust to programmatic trust, and the AdTech industry can remain compliant in a privacy-first world.

The Critical Role of Safe and Secure Backup System Storage

Solutions Review’s Premium Content Series is a collection of contributed articles written by industry experts in enterprise software categories. In this feature, SANS Institute Dean of Research Dr. Johannes Ullrich explains the critical role of safe and secure backup system storage.

It’s no secret that backup systems are critical to preserving sensitive data files from ransomware, theft, sabotage, and accidental loss. However, it’s important to remember that merely leveraging backups isn’t the end-all-be-all solution to a challenge riddled with complexity. Just because organizations have backup systems in place does not always mean their data is fully protected in the wake of a loss-causing event. And amid sharp rises in the volume and velocity of attacks, the consequences of poor data backups are too severe to overlook. For example, IBM’s 2022 Cost of a Data Breach Report found:

  • Globally, the average total cost of a data breach increased by 13% YoY to a record-high $4.3 million in losses. U.S. organizations were most impacted, with an average loss of $9.4 million per breach.
  • The average duration of identifying and containing a data breach lasted more than 275 days – the equivalent of over nine months.

As attackers have grown more skilled and sophisticated, they are now leveraging hard-to-detect tactics, techniques, and procedures (TTPs) that capitalize on backup system vulnerabilities to either steal data or disrupt recovery operations. Remote access backups, for instance, are often reliant on password protections. Due to poor password hygiene or the absence of two-factor authentication, these backup systems can be easy targets for threat actors to utilize as attack vectors against protected systems.

When exploited, backup software vulnerabilities can also compound into giving attackers direct access to live system environments. Take, for example, the CVE-2022-36537 vulnerability that was publicized in early 2023. Threat actors used it to access additional servers that were backed up on the same system, essentially “surfing backward” into live environments to exfiltrate data and distribute malware. That same scenario is impacting organizations of all sizes and sectors, heightening the criticality of effectively implementing safe and secure backup system storage to maximize protection and agility.

The 3-2-1 Rule

Organizations should consider data assets at risk if they do not follow the 3-2-1 rule: keep at least three copies of data, on two different types of media, with at least one copy offsite. This approach combines a diverse mix of cloud, on-premises, and offline/remote copies to ensure data can be preserved even if an online backup is disrupted. Among all forms of backup systems, cloud-based backups are often the most vulnerable. In turn, organizations should be leveraging an on-premises backup that can drive rapid restoration at scale, especially in cases where there’s a high volume of critical data to recover.
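As a concrete illustration, here is a toy sketch that checks a backup plan against the 3-2-1 rule; the plan structure and names are invented for the example.

```python
# A toy check of a backup plan against the 3-2-1 rule: at least 3 copies,
# on 2 different media types, with 1 copy offsite. All data invented.
plan = [
    {"name": "primary",    "media": "disk", "offsite": False},
    {"name": "nas-copy",   "media": "disk", "offsite": False},
    {"name": "tape-vault", "media": "tape", "offsite": True},
]

checks = [
    ("at least 3 copies",      len(plan) >= 3),
    ("at least 2 media types", len({c["media"] for c in plan}) >= 2),
    ("at least 1 offsite",     any(c["offsite"] for c in plan)),
]

for label, ok in checks:
    print(f"{'PASS' if ok else 'FAIL'}: {label}")
```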

Always test recovery speed ahead of time. This provides an accurate barometer of how long it will take to recover sensitive files in the wake of a breach, when extended downtime can translate to millions in financial losses. It took the City of Atlanta’s municipal department seven full days to restore services from a ransomware event, and in a similar attack against Baltimore’s city municipal department, the recovery timeline lasted more than six weeks. Both city governments ultimately suffered a combined $20-plus million in losses, largely due to operational downtime.

When designing a cloud-based solution architecture, focus on access controls, authentication requests, and how the backup lifecycle – spanning creation, retrieval, and eventual deletion – is managed.

Best Practices to Consider

Any data leaving the direct control of an organization, whether it’s physical backup files being shipped offsite or online backups migrating to the cloud, must always be encrypted before exiting the environment.

Encrypting backups adds an additional layer of security by converting sensitive information into an unreadable format: if attackers intercept data while in transit, they still cannot access it without a decryption key. Beyond transit, data should also be encrypted while at rest at the secondary backup location (a minimal encryption sketch follows the list below). In addition, organizations should allocate equal prioritization to the three foundational components of effective data management:

  • Data Protection: Actively protect both primary and secondary data backups from loss, theft, compromise, and corruption with the ability to rapidly restore data after an incident.
  • Data Storage: Create a well-defined security architecture that promotes the safe storage of data backups both on-premises and in the cloud.
  • Data Compliance: Ensure all backup systems and network users continuously follow access policies that are compliant with federal and industry compliance regulations.
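As promised above, here is a minimal sketch of the encrypt-before-it-leaves principle using the Python `cryptography` package’s Fernet (authenticated, AES-based) encryption; the payload stands in for a real backup archive, and in production the key would live in a KMS or HSM rather than in the script.

```python
# A minimal sketch of encrypt-before-it-leaves using the `cryptography`
# package's Fernet encryption. The payload is a stand-in for a real
# backup archive; the key belongs in a KMS/HSM, never alongside the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()                  # store in a key management system
fernet = Fernet(key)

plaintext = b"contents of a backup archive"  # stand-in for the real file bytes
ciphertext = fernet.encrypt(plaintext)       # only this leaves the environment

# Restore path: decryption round-trips the original bytes.
assert fernet.decrypt(ciphertext) == plaintext
```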

It’s still important to understand that primary and secondary backup systems weren’t initially designed to defend against cybercrime, especially not from expert threat actors who leverage encrypted malware, double extortion, and phishing campaigns, among others, as core competencies of their TTP framework.

At their inception, backups were made to preserve data in cases of file corruption or accidental removals – not ransomware. However, as cyber threats targeting data assets have intensified, they have emerged as a must-have tool within the enterprise data security arsenal. By implementing effective backup practices at scale, organizations can take proactive steps to strengthen their data security posture and safeguard sensitive files.

Achieving Data Resiliency with Data Classification and the Shared Responsibility Model

Solutions Review’s Premium Content Series is a collection of contributed articles written by industry experts in enterprise software categories. In this feature, Clumio co-founder and CTO Woon Ho Jung offers commentary on achieving data resiliency with data classification and the shared responsibility model.

The last year brought with it a number of high-profile data breaches at prominent companies such as Uber, T-Mobile, Rackspace, and LastPass – leading to all manner of frustrated customers and compromised data, not to mention class action lawsuits. More specifically, ransomware attacks are becoming more expensive and time-intensive to recover from, increasing in 2022 by 41% in cost and 49 days in recovery time. This threat is lethal to businesses that overlook their data protection strategies.

If the past year has taught us anything, it’s that anyone can be vulnerable to attacks in the cloud without proper protection—from the largest organization to the smallest startup. Modern applications are powered by ephemeral compute, yet persistent data—vast data lakes and data warehouses. As this data continues to grow exponentially, the attack surface for breaches, ransomware, and even accidental deletions keeps increasing. While the last decade focused on putting data to use quickly, this decade will be about reining it in—bringing organization, structure, and resilience to this data in order to ensure proper protection.

The Call for Classification

As multi cloud environments become more common, it not only becomes more difficult for customers to enhance the resiliency of their production workloads, but even the process of identifying and locating their data across cloud environments becomes a greater challenge. Organizations must think of the cloud and data stores as rooms in a house. It has become incredibly important to go through each repository of information, clear out unnecessary material, and know where and how data is stored to ensure it is also being protected.

The ability to look inside storage and backups by means of an index and catalog also helps understand its usability and lineage. This is critical for compliance audits and proving disaster resilience. It’s time to clean out those old snapshots, replicas, and archives, and consolidate data archival and backups into a well-cataloged, searchable platform that ensures efficient storage and observable data trails that are easy to maintain on an ongoing basis.

Such organization calls for data classification—a key shift from protecting all data en-masse toward implementing tiering and group-based classification systems. This not only strengthens data security, but delivers financial savings for businesses. Take, for example, a healthcare data lake. A majority of information that is backed up from that data lake requires only 30 days of retention for operational recoveries, but the data lake may also contain health records that need to be retained for 6 years to comply with the Health Insurance Portability and Accountability Act (HIPAA).

In this case, rather than backing up all the objects that comprise the data lake for 6 years, data classification during backups can reduce costs by over 90% without any compromises to security and compliance. Classification by access patterns, object tags, tiers, and other metadata also allows businesses to store their data in a way that’s neither overprotected nor under-protected, but perfectly tailored to the unique aspects of that dataset. Classifying data in this way best protects it while reducing costs and meeting compliance standards.
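A toy sketch of such classification-driven retention follows; the tags, retention periods, and object names are invented, and real policies would be enforced by the backup platform itself.

```python
# A toy sketch of classification-driven backup retention, along the lines
# of the healthcare data-lake example above. All data invented.
RETENTION_DAYS = {
    "operational": 30,         # routine operational recoveries
    "health-record": 6 * 365,  # HIPAA-style long-term retention
}

objects = [
    {"key": "logs/2023/06/app.log",    "classification": "operational"},
    {"key": "records/patient-42.json", "classification": "health-record"},
]

for obj in objects:
    days = RETENTION_DAYS[obj["classification"]]
    print(f"{obj['key']}: retain backups for {days} days")
```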

Taking on the Shared Responsibility Model

There are two key threats to data resiliency in the cloud—the misconception that your cloud or SaaS provider will ‘automatically’ safeguard your data, and thinking that cybersecurity is the same as data security. Customers need to remember that the validity, security, and resilience of this data is the customers’ responsibility, clearly stated in the Shared Responsibility Model of most cloud service providers’ terms of service. Customers also need to understand that cybersecurity alone doesn’t suffice. A huge component of data security is protection against accidental deletions, disasters, and misconfigurations—most of which are user-driven.

In addition to the regulatory commitments of an organization, data needs to be operationally resilient. For example, many architectures on AWS, even those that split workloads into multiple availability zones, have one central data lake or bucket. The biggest myths in AWS architecture are often related to resilience. The service is resilient, yes, but there is no guarantee for the resiliency of the data, configuration, or other components that turn building blocks into functional applications.

Even though cloud providers are responsible for maintenance of the hardware and data centers that run their cloud services, customers still need to improve their data protection and resiliency in the event that the provider suffers a large-scale outage such as an availability zone failure. With the growing threat of ransomware, cloud customers should also consider adding immutable storage to ensure that hackers or other security threats do not delete, corrupt or encrypt valuable production data residing in a cloud environment.
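For example, here is a minimal sketch of provisioning immutable backup storage in AWS with S3 Object Lock, assuming boto3 and configured credentials; the bucket name and retention period are illustrative only.

```python
# A minimal sketch of immutable cloud backup storage: an S3 bucket created
# with Object Lock and a default compliance-mode retention. Assumes boto3
# with configured credentials; names and periods are illustrative.
import boto3

s3 = boto3.client("s3")
bucket = "example-immutable-backups"

s3.create_bucket(Bucket=bucket, ObjectLockEnabledForBucket=True)
s3.put_object_lock_configuration(
    Bucket=bucket,
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Days": 30}},
    },
)
# Objects written here cannot be deleted or overwritten for 30 days,
# even by account administrators, blunting ransomware's reach.
```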

Restructuring cloud storage with data classification and promoting the Shared Responsibility Model will be key to effectively protect organizational data in the cloud and ensure data resiliency. While business continuity is about emergency preparedness, data resiliency is an ongoing, 24/7 activity. Data resilience is about ensuring that any data that is deemed critical is protected from operational deletes, ransomware, cyberattacks, and the like — all the time.

3 Ways Channel Partners Can Expand Their Data Protection Revenue Stream in 2023

Solutions Review’s Premium Content Series is a collection of contributed articles written by industry experts in enterprise software categories. In this feature, Arcserve Director of Product Management Ahsan Siddiqi offers three ways channel partners can expand their data protection revenue right now.

With the ever-increasing risk of cyberattacks and data breaches, MSPs understand the critical role that data protection plays in keeping their clients safe and secure. That’s probably why 88 percent of MSPs say that expanding their data protection revenue stream is a top business priority for 2023, according to a recent poll conducted by Arcserve.

But establishing that priority is the easy part. Enacting it is more complicated. So how can MSPs expand their data protection revenue streams? Here are three strategies to make it happen.

Offer the Protection that SaaS Providers Don’t

Organizations large and small have wholeheartedly embraced SaaS applications like Microsoft 365, Google Workspace, and Salesforce. Consequently, channel partners have seen a surge in demand for their expertise in deploying and implementing those applications for customers. MSPs can also offer backup and recovery solutions for SaaS applications as part of their service.

These data protection solutions are much-needed because when an organization transitions to the cloud, the cloud provider does not provide data backup and recovery. It’s a shared responsibility between the customer and the provider, whether it be Microsoft or Google. Although they may not openly state it, their terms and conditions contain legal language that clearly says they are not liable for data loss due to data corruption, security breach, or accidental deletion. The onus is on the customer to recover the lost data and repair the damage, not the cloud provider.

It’s similar to the arrangement between a driver and an automaker. The manufacturer is responsible for meeting quality and safety standards, but it’s up to the driver not to be reckless and crash the car. Regarding data in the cloud, the customer is responsible for protecting their data. The fine print in SaaS provider contracts protects providers from lawsuits; it does not offer protection for customers against data loss and its financial implications.

There is an opportunity for channel partners to provide that layer of protection for their customers to help them safeguard their data and mitigate risks in the cloud. Specifically, MSPs can offer data protection services that meet the unique needs of their clients. For example, they can provide backup and recovery services tailored to virtualized environments, SaaS-based applications, and remote work environments.

Explain the Risks of Not Having Data Protection

Educating customers about the importance of safeguarding their SaaS data is crucial. Channel partners should illustrate to customers the risks associated with data loss, accidental deletion, ransomware attacks, and other threats that organizations may face in the SaaS environment. They should highlight how these risks can have severe consequences for business continuity, compliance with data protection regulations, and overall peace of mind for the organization.

Once they outline the risks, channel partners can explain why SaaS backup and protection are critical in mitigating them. They can demonstrate how backup and protection solutions deliver a necessary layer of security and assurance beyond what cloud service providers offer. They can highlight how these solutions enable customers to recover lost data, restore systems to previous states and protect against ransomware attacks, helping organizations minimize downtime and possible financial damage.

Furthermore, channel partners can emphasize the value of SaaS data protection in terms of business continuity. They can demonstrate how having a reliable backup and protection strategy ensures that organizations can quickly recover from data-loss incidents, maintain operational continuity, and reduce the impact of any potential data breach or accidental data deletion.

Finally, channel partners can point out the peace of mind that comes with robust data protection. Knowing that critical data is backed up and protected against potential threats gives organizations a sense of security and confidence. It allows them to focus on their core business operations without worrying about data loss and security breaches.

Provide Value-Added Services

MSPs and channel partners can boost their revenue from SaaS backup and protection services by offering extra features that add value for their customers. For example, MSPs should provide automated backups at regular intervals or on a scheduled basis, which reduces the risk of data loss due to human error. They should also offer the ability to recover data granularly, such as individual files, folders, or emails. This kind of granular recovery option enables customers to restore specific data items without restoring the entire backup, thus providing greater flexibility and efficiency in the data recovery process.
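As an illustrative sketch of granular recovery, the toy catalog and restore function below are invented; a real implementation would call the backup platform’s restore API for the single selected item.

```python
# An illustrative sketch of granular recovery: restoring one item from a
# backup catalog instead of the whole snapshot. All data invented.
catalog = [
    {"id": 1, "path": "mailboxes/ann/inbox/msg-001.eml"},
    {"id": 2, "path": "mailboxes/ann/inbox/msg-002.eml"},
    {"id": 3, "path": "sites/hr/policies.docx"},
]

def restore(item_id: int) -> None:
    item = next(i for i in catalog if i["id"] == item_id)
    # stand-in for the platform's real single-item restore call
    print(f"restoring only: {item['path']}")

restore(2)  # one email recovered; nothing else is touched
```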

One of the primary challenges with SaaS data is privacy and security. MSPs can offer services that ensure customer data is backed up securely and stored in compliance with relevant regulations such as GDPR or HIPAA. To ensure data protection and compliance with regulations, MSPs can provide features like data encryption, access controls, and regular security audits. These kinds of services are especially valuable for customers with specific compliance requirements. By offering these features, MSPs can differentiate themselves from competitors and maximize their revenue.

Final Takeaway

SaaS backup and protection is an essential service that MSPs and VARs can provide to their clients. But to do it well, they should stay updated with the evolving SaaS backup and protection market trends. By delivering the latest and most relevant solutions to their customers, channel partners can maintain a competitive edge and build a more profitable business in 2023.

The Secret to Stopping Human Error is Automating Cloud Governance

Solutions Review’s Premium Content Series is a collection of contributed articles written by industry experts in enterprise software categories. In this feature, ALTR CEO James Beecham offers commentary on how the secret to combating human error is automating cloud governance.

The White House has called for a major overhaul of its cloud computing systems, citing the existential threat of data breaches. The causes of data loss include misconfigured settings, human error, system failures, and malicious attacks by hackers and rogue nation-states. When privacy regulations cover 75 percent of the world, a breach has severe consequences. Data loss incidents can lead to financial loss, damage to reputation, and legal issues that could put organizations out of business or cost them the trust of stakeholders, investors, and customers.

Cloud and artificial intelligence (AI) technologies can facilitate work efficiency, but they put organizations at risk without proper data protection measures. Government regulation of cloud services can help mitigate the impact of human error, but it may not address the human tendency to take shortcuts. If we want to stop cloud data breaches, we should automate access controls.

The Problem is Not Really Human Error

Daily misconfiguration breaches used to plague public cloud service providers, but companies like AWS and Microsoft now help administrators understand when a misconfiguration might occur and alert users. Today, people rarely leave open cloud or data center-delivered services on the public Internet. But configuration mishaps and mistakes are caused not by ignorance but by human tendencies to prioritize speed and efficiency over thoroughness.

Government agencies may mandate changes to Infrastructure-as-a-Service (IaaS) offerings, but placing the security burden solely on cloud providers is a false hope. Every regulated industry continues to have problems. In healthcare, for example, HIPAA regulations have not prevented the loss of patient data. Earlier this year, the Department of Health and Human Services reported that healthcare data breaches grew from 2012 to 2021. With all these controls in place, human error and the desire to take shortcuts will still exist.

Security Automation is Essential

The flexibility and freedom of the cloud can lead its users to make unsafe decisions with their data and systems. Cybercriminals and hostile nation-states can use the scale, innovation, and flexibility of these technologies to launch attacks and hide from defenders. Bad actors don’t need to deliver secret information in person; they can simply send a message within a popular gaming site where millions of people are actively chatting and playing. Remember, Bin Laden used a Microsoft email server to hide his communications for years. Are we going to overhaul and regulate email servers?

Security must understand what the end-user wants to do, identify the manual tasks required, and perform those tasks for the end-user. For example, computers are very good at performing repetitive and detail-oriented tasks. A program can scan all the ports on a cloud firewall and report the open ones, allowing a human to take corrective action.
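As a toy version of that repetitive, detail-oriented task, the sketch below probes a short list of TCP ports on an example host and reports the open ones; the address comes from the documentation range, and you should only scan infrastructure you are authorized to test.

```python
# A toy version of the repetitive check described above: probe a short
# list of TCP ports on a host and report the open ones for human review.
# The address is from the documentation range (TEST-NET-1).
import socket

host = "192.0.2.10"
ports_to_check = [22, 80, 443, 3389, 8080]
open_ports = []

for port in ports_to_check:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(0.5)
        if sock.connect_ex((host, port)) == 0:  # 0 means connect succeeded
            open_ports.append(port)

print(f"open ports on {host}: {open_ports or 'none found'}")
```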

Big Data Demands Automated Access Controls

Automation is especially important in controlling access to sensitive information. Data that used to live in silos is being merged into a single pool. That's good for business, because an extensive database or repository makes it easier for analysts to run queries and gain insights from the accumulated data. It's also risky, because security teams must now rebuild barriers and segment data within the cloud platform itself; physically separated systems were easier to secure.

The complexity increases because people want direct access to information for their work. It's easy to add users and data when data sets are aggregated in a cloud-based platform like Snowflake, but every user you grant access to increases the risk of human error. An administrator could take a shortcut, such as granting access to the entire database rather than limiting permissions to a specific user or data set.

We need automation to check what’s in place and respond quickly. For example, if an analyst makes a mistake when creating a new data model or user group, security should flag the problem. It should prompt the user to review the process or block steps if something doesn’t look right.
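
A hypothetical guard rail of this kind might look like the following Python sketch. The `Grant` structure, the wildcard scope convention, and the review logic are all invented for illustration; they are not tied to Snowflake or any particular governance product:

```python
from dataclasses import dataclass

@dataclass
class Grant:
    user: str
    scope: str       # e.g. "db.sales.orders", or "db.*" for a whole database
    privilege: str   # e.g. "SELECT"

def review_grant(grant: Grant, approved_scopes: set[str]) -> str:
    """Flag grants that are broader than the scopes approved for this user."""
    if grant.scope.endswith(".*"):
        # A whole-database grant is the classic shortcut: block it and escalate
        return (f"BLOCKED: {grant.user} requested {grant.privilege} on "
                f"{grant.scope}; database-wide access requires review")
    if grant.scope not in approved_scopes:
        return f"FLAGGED: {grant.scope} is outside {grant.user}'s approved scopes"
    return "OK"

# Example: an administrator shortcut gets caught before it takes effect
print(review_grant(Grant("analyst_kim", "db.*", "SELECT"), {"db.sales.orders"}))
```

The point is not the specific rules but that the check runs on every grant, every time, without anyone having to remember to run it.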

That's fine for smaller organizations with dozens or hundreds of users, but large organizations must manage secure data access for thousands of users at a time, and the need for real-time performance is correspondingly greater. When a service slows down because of security features, the answer is usually to turn off security. It's just another way to take a shortcut.

Scaling Data Governance and Protection

Automation eliminates these problems and provides the speed that data teams need. Some data governance platforms today allow data teams to classify data types, apply controls, and automatically block access as data moves into Snowflake. Role-based controls and masking policies ensure users have only the level of access they need to do their jobs: a marketing specialist may need a customer's full email address, while an analyst only needs to know how many people use a Gmail address.
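
In code, role-based masking of that email example might look like this sketch. The role names and masking rules are invented; production platforms typically express these as declarative policies rather than application code:

```python
def mask_email(email: str, role: str) -> str:
    """Return an email address masked according to the caller's role."""
    local, _, domain = email.partition("@")
    if role == "marketing":
        return email                # full address needed for customer outreach
    if role == "analyst":
        return f"***@{domain}"      # only the domain matters for aggregate analysis
    return "***@***"                # default: reveal nothing

print(mask_email("jane.doe@gmail.com", "marketing"))  # jane.doe@gmail.com
print(mask_email("jane.doe@gmail.com", "analyst"))    # ***@gmail.com
```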

Policies can set time or capacity limits on data access, or restrict viewing to only a portion of a data set. When sensitive data is accessed, administrators can receive an alert showing what data is being used, by whom, when, and how much, and they can revoke privileges just as quickly if an activity violates policy. This automation scales governance, security, and privacy for organizations of all sizes.
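
To make the limit-and-revoke idea concrete, here is a small hypothetical sketch of a per-user policy that enforces a row quota and an expiry time and alerts an administrator on violation; the class and its fields are illustrative only:

```python
import time

class AccessPolicy:
    """Hypothetical per-user data access policy with a row quota and expiry."""

    def __init__(self, user: str, max_rows: int, ttl_seconds: float, alert):
        self.user = user
        self.max_rows = max_rows
        self.expires_at = time.time() + ttl_seconds
        self.alert = alert          # callable that notifies an administrator
        self.rows_read = 0
        self.revoked = False

    def record_read(self, rows: int) -> bool:
        """Return True if the read is allowed; alert and revoke on violation."""
        if self.revoked or time.time() > self.expires_at:
            return False
        self.rows_read += rows
        if self.rows_read > self.max_rows:
            self.alert(f"{self.user} exceeded {self.max_rows}-row quota; access revoked")
            self.revoked = True
            return False
        return True

policy = AccessPolicy("analyst_kim", max_rows=10_000, ttl_seconds=3600, alert=print)
print(policy.record_read(8_000))   # True: within quota
print(policy.record_read(5_000))   # False: quota exceeded, alert fired
```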

Now more than ever, governments are holding companies accountable for the security of their data. As privacy regulations increase, automating governance at scale could mean the difference between profit and pain for organizations. Data teams would be far more successful if they could easily provide and manage access for many users at once to facilitate collaboration while preventing breaches. Sure, people can forget. But the best way to prevent mistakes and costly shortcuts isn’t more regulation: it’s making sure security is turned on when you need it.

Federated Learning Promises to Change the Game for Money Laundering

Solutions Review’s Premium Content Series is a collection of contributed articles written by industry experts in enterprise software categories. In this feature, Consilient CEO Gary M. Shiffman penned a commentary on how federated learning promises to change the game for money laundering.

There's a new sheriff in town to stop money launderers, and it's a subset of machine learning known as Federated Learning (FL). This technology can analyze data without actually moving the data, so for the first time financial institutions can collaborate without violating privacy regulations or the evolving expectations that surround them.

Why is this a big deal? Let me explain: criminals place, layer, and spend illicit proceeds through numerous financial intermediaries to avoid detection; this is the essence of money laundering. These bad actors are usually not as smart as the movies portray them, but the system they exploit is so vast that catching a money launderer is the equivalent of finding the proverbial needle in a haystack. And that vastness only increases as the dark web and crypto assets expand the domains of illicit trade.

In my years in national security and law enforcement, I collaborated every chance I could. But for commercial institutions such as banks and fintechs, identifying criminal money launderers and human traffickers appears directly at odds with maintaining customer privacy. Financial institutions could discover more money laundering if they could share data with each other and across jurisdictions, but privacy regulations such as Europe's GDPR and virtually every U.S. state privacy law forbid this practice.

Federated Learning, however, breaks this historic tradeoff. Financial institutions can now, for the first time in history, collaborate and simultaneously protect privacy.

Risks, Technology & Regulatory Conundrum

FL is now available to the Financial Crimes Compliance market. Over the past several years, the cost organizations bear to comply with regulations and identify money launderers has skyrocketed, for two reasons. First, the regulatory burden expands each year, and banks are required to do more in a complex market that includes e-commerce, fintechs, crypto, the dark web, and more. Second, regulated financial institutions continue to use the same tools and technologies they have used for years; most were created decades ago and were never meant to cope with today's world. Criminals use the latest innovative technologies to evade discovery, while banks fail to adopt innovations or adopt them far more slowly.

This combination of a growing regulatory regime and the innovation gap between criminals and bankers creates many problems, one of which cybersecurity professionals know only too well: false-positive alerts. Legacy anti-money laundering (AML) systems generate false-positive rates of 97 to 98 percent, meaning that financial crime employees spend most of their days chasing down could-be money launderers who aren't. It is no wonder that turnover in this industry remains high. Worse, criminals understand what these AML systems can detect, so they are careful not to trigger them and know it is rare to be caught.

Enter Federated Learning

To dramatically decrease false positives, find more risk, and spend less time and money, financial institutions must collaborate. FL enables collaboration without moving data – thus protecting privacy.

Think of cyber and financial crimes as patterns in data. Criminals are human, and humans behave in largely predictable ways. Through FL, investigators can exploit this behavioral-science insight and share it with external organizations, whether other offices in different jurisdictions or entirely different companies. That's where FL platforms come into play: they enable organizations to share information without violating privacy.

The Power of Moving Analytics to Data

The next time a cyber-crime discussion gets hijacked by data-sharing concerns, shift the discussion to sharing the analytics. By moving algorithms to data through FL, models can continue to improve and evolve, and these insights can be shared with others without sharing any data.
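
One widely used way to move analytics to data is federated averaging, in which each institution trains on its own data and only model parameters travel. The following toy Python sketch, with synthetic data and a simple linear model, illustrates the mechanic; it is not drawn from any production FL platform:

```python
import numpy as np

def local_update(weights, X, y, lr=0.1):
    """One gradient step on an institution's private data; only weights leave."""
    residual = X @ weights - y
    grad = X.T @ residual / len(y)      # gradient of mean squared error
    return weights - lr * grad

def federated_round(weights, institutions):
    """Each institution trains locally; a coordinator averages the models."""
    updates = [local_update(weights, X, y) for X, y in institutions]
    return np.mean(updates, axis=0)     # federated averaging: raw data never moves

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])
# Two "banks", each holding private transaction features and risk scores
banks = []
for _ in range(2):
    X = rng.normal(size=(200, 3))
    banks.append((X, X @ true_w + rng.normal(scale=0.1, size=200)))

w = np.zeros(3)
for _ in range(100):
    w = federated_round(w, banks)
print("Jointly learned weights:", np.round(w, 2))  # approaches true_w
```

Each bank's transactions stay on its own infrastructure; only the weight vectors cross organizational boundaries, which is what makes the privacy argument work.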

Technology adoption in crime fighting usually lags that of the criminals. It’s a matter of incentives – criminals look to exploit holes that enable them to make money, and the “good guys” reactively patch those holes to stop the criminals. Fortunately, the era of FL is slingshotting the good guys ahead. Think about it: criminals can adopt AI and move to crypto and other places with alacrity, but FL allows financial institutions to more easily collaborate and stop them. By dramatically reducing false positives and increasing discovery of malfeasance, FL allows financial institutions to finally put a dent in the dirty money-laundering industry.
