Storage Archives - My TechDecisions
The end user’s first and last stop for making technology decisions
https://mytechdecisions.com/tag/storage/

Backup and Storage Devices Contain an Average of 14 Security Issues
https://mytechdecisions.com/it-infrastructure/backup-storage-devices-contain-average-14-security-issues/ (Wed, 05 Apr 2023)

With storage and backup devices often representing the last line of defense against ransomware attacks and outages, they should be as secure as possible so organizations can restore their data in a critical time of need.

However, new research from cyber resilience company Continuity shows that the average enterprise storage and backup device has 14 vulnerabilities, three of which are rated as high or critical. With backups such a crucial part of an organization’s infrastructure, a compromise could lead to a much more significant cyber incident.

The New York City-based company analyzed more than 700 enterprise storage and backup devices, uncovering nearly 10,000 security issues and finding that the average backup and storage device had more than a dozen vulnerabilities. Those security flaws include insecure network settings, unaddressed vulnerabilities, access rights issues, insecure user management and authentication, and insufficient logging and auditing.

According to Continuity, unpatched vulnerabilities in storage and backup systems are prime attack points for ransomware actors, who target them to cripple an organization’s restoration plans and force the victim to pay a ransom.

The company’s study, The State of Storage & Backup Security Report, finds several reasons why those security issues exist in backup and storage environments, including a growing divide between IT infrastructure and security teams.

The report suggests that security teams are developing policies and procedures that infrastructure teams are tasked with implementing, sometimes with minimal direction.

In addition, security teams may be unaware of the cyber resiliency capabilities offered by storage and backup systems, while infrastructure teams are more focused on day-to-day operations and less concerned with defending against cyberattacks.

In addition to leveraging automated security posture assessment tools, the report recommends that organizations identify storage and backup security knowledge gaps and develop a plan that puts storage and backup security on par with compute and network security.

Continuity also offers these questions that organizations should ask themselves to help clarify their level of storage security maturity:

  • Do our security policies cover specific storage, storage networking and backup risks?
  • Are we evaluating the security of our storage & backup infrastructure on an ongoing basis?
  • Do we have detailed plans and procedures for recovery from a successful attack on a storage or backup system? Do we test such procedures?
  • How confident are we that the key findings highlighted in this report, and similar ones, do not and cannot occur in our environment?

Organizations should also read NIST SP 800-209, Security Guidelines for Storage Infrastructure, which Continuity co-authored.

Amazon S3 Will Now Encrypt Objects by Default
https://mytechdecisions.com/it-infrastructure/amazon-s3-encrypt-objects-default/ (Mon, 09 Jan 2023)

AWS has announced that Amazon S3 will encrypt all new objects by default and automatically apply server-side encryption for each new object unless specified otherwise.

Server-side encryption (SSE) for Amazon’s Simple Storage Service (Amazon S3) was first introduced in 2011, giving users the ability to request encrypted storage when storing a new object or copying an existing object.

Encryption by default puts another security best practice into effect automatically, and the change has no impact on performance, according to AWS. No user action is needed: S3 buckets that do not have default encryption configured will now automatically apply SSE-S3 as the default setting.

However, users can choose one of three encryption options: the new default SSE-S3 setting, customer-provided encryption keys (SSE-C) or AWS Key Management Service keys (SSE-KMS).
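For readers who script against S3, a minimal boto3 sketch of those options might look like the following; the bucket, key, and KMS alias names are placeholders, and SSE-C follows the same pattern using the SSECustomerAlgorithm and SSECustomerKey parameters.

```python
import boto3

s3 = boto3.client("s3")

# After this change, omitting ServerSideEncryption entirely still
# results in the object being encrypted at rest with SSE-S3 (AES256).
s3.put_object(Bucket="example-bucket", Key="default.txt", Body=b"hello")

# Explicitly request SSE-S3.
s3.put_object(
    Bucket="example-bucket",
    Key="sse-s3.txt",
    Body=b"hello",
    ServerSideEncryption="AES256",
)

# Request SSE-KMS with a customer-managed key (hypothetical alias).
s3.put_object(
    Bucket="example-bucket",
    Key="sse-kms.txt",
    Body=b"hello",
    ServerSideEncryption="aws:kms",
    SSEKMSKeyId="alias/example-data-key",
)
```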

“To have an additional layer of encryption, you might also encrypt objects on the client side, using client libraries such as the Amazon S3 encryption client,” the company says in a blog post.

Previously, opting in to SSE-S3 meant that users had to be certain it was always configured on new buckets and verify that it remained configured properly over time. For organizations that require all objects to remain encrypted at rest with SSE-S3, this update helps meet those compliance requirements without any additional effort, the company says.

“With today’s announcement, we have now made it ‘zero click’ for you to apply this base level of encryption on every S3 bucket,” the company says.

The change is visible in AWS CloudTrail data event logs, and over the next few weeks users will also see it reflected in the S3 section of the AWS Management Console, in Amazon S3 Inventory and Amazon S3 Storage Lens, and as an additional header in the AWS CLI and the AWS SDKs.

To verify the change, users can configure CloudTrail to log data events.
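Beyond CloudTrail, a quick spot check from the SDK can confirm both the per-object encryption header and the bucket-level default; again the bucket and key names below are hypothetical.

```python
import boto3

s3 = boto3.client("s3")

# HEAD responses report which SSE variant protected the object at rest.
head = s3.head_object(Bucket="example-bucket", Key="default.txt")
print(head.get("ServerSideEncryption"))  # "AES256" indicates SSE-S3

# The bucket-level default encryption rule can be read back as well.
enc = s3.get_bucket_encryption(Bucket="example-bucket")
rule = enc["ServerSideEncryptionConfiguration"]["Rules"][0]
print(rule["ApplyServerSideEncryptionByDefault"]["SSEAlgorithm"])
```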

Read AWS’ blog for more information.

Google Announces New Cloud Storage Solutions
https://mytechdecisions.com/it-infrastructure/google-announces-new-cloud-storage-solutions/ (Fri, 17 Sep 2021)

Google has announced new cloud storage solutions designed to improve data protection and resiliency, including Filestore Enterprise and Backup for Google Kubernetes Engine.

According to the company, the new services will make it easier for enterprise customers to protect data out-of-the-box across a wide variety of applications and uses.

The company announced the preview of Backup for Google Kubernetes Engine (GKE), which it calls a simple, cloud-native way for organizations to protect, manage and restore containerized applications and data. The service is designed to help users meet service-level objectives, automate common backup and recovery tasks, and produce reporting for compliance and audit purposes.

Google says this means more applications can be deployed in GKE, making it easier for larger customers to expand their use of GKE and manage more demanding workflows.

According to Google, Backup for GKE orchestrates data protection and restores for the customer so IT only has to manage data at the container level.

This allows customers to create a backup plan to schedule periodic backups of both application data and GKE cluster state data. Customers can also restore each backup to a cluster in the same region or to a cluster in a different region.

“You can even customize your backups to ensure application consistency for the most demanding, tier-one database workloads,” reads a company blog post. “The result is a feature that drives down the operational cost for infrastructure teams at companies like Atos, while also making it easier for architects and developers to use GKE for their most critical applications.”

Google also announced Filestore Enterprise, which it calls a fully managed, cloud-native NFS solution that lets customers deploy critical file-based applications in Google Cloud. The service is designed for applications that demand high availability and is backed by a Service-Level Agreement of 99.99% regional availability, the company says.

According to Google, the Filestore family now includes:

  • Filestore Basic for file sharing, software development, and GKE workloads
  • Filestore High Scale for high-performance computing (HPC) applications such as genome sequencing and financial-services trading analysis
  • Filestore Enterprise for critical applications (e.g., SAP) and GKE workloads

According to Google, Filestore also lets customers take periodic snapshots of the file system and retain a desired number of recovery points, allowing an entire file system to be recovered in less than 10 minutes from any prior snapshot recovery point.

Film Companies Turn to Glass for Better Data Storage
https://mytechdecisions.com/it-infrastructure/film-companies-turn-to-glass-for-better-data-storage/ (Wed, 27 Nov 2019)

Project Silica is helping Warner Bros. store and protect its data more effectively.

Film companies are looking at new ways to preserve the media they produce and archive. Specifically, Warner Bros. and Microsoft recently joined forces to test Project Silica, a project that uses laser optics and artificial intelligence to store data in quartz glass, TechSpot reports.

Project Silica’s storage process involves burning “voxels” into glass in a 3D array “to allow for a high storage density”; a piece of glass that is 2 mm thick can contain over 100 layers of voxels “that physically deform the glass through laser pulses,” TechSpot reports.

One of the main purposes of the project is finding new ways to develop long-term storage solutions for the cloud, especially as storage demand increases and moves away from “traditional magnetic media.” Warner Bros. was looking for a solution like this for its “cold” data: valuable archived data that isn’t frequently accessed. Its existing approach relied on more fragile storage media that required continuous maintenance.

So Far So Good

One of the results of Microsoft’s pilot was storing the 1978 Superman movie, which is 75.6 GB of data, in glass “no bigger than a drink coaster.” To write the data into the glass, Microsoft encoded it with lasers, then tapped into machine learning to decode the images and patterns created by light shining through the glass.

Not only did the silica glass provide a more durable storage solution for the entertainment company, it also brought improvements to Warner Bros.’ data. Brad Collar, SVP of global archives and media engineering at Warner Bros., told TechSpot that the glass allows data to be read back the same way it comes out of the camera, preserving the original pixels in the best possible manner.

The glass could provide additional benefits down the road, too. Because of its preservation properties, Project Silica might enable entertainment companies to cut the cost of creating archival film negatives for digitally shot content, since it is a cheaper, higher-quality replacement for physical archives. It could also help companies lower their carbon footprint, since the glass doesn’t require “energy-intensive air conditioning” to maintain air quality.

InfiniteIO and Cloudian Solution Optimizes the Economics of Private Cloud Storage
https://mytechdecisions.com/it-infrastructure/infiniteio-hybrid-cloud-tiering-cloudian-hyperstore-object-storage/ (Fri, 16 Aug 2019)

Enterprises and organizations across industries can improve storage economics without making changes to existing IT operations using a new joint solution that combines InfiniteIO Hybrid Cloud Tiering with Cloudian HyperStore object storage.

The joint offering gives customers and channel partners a simple-to-use solution that optimizes storage cost and performance with no changes to users, applications or systems, the companies said in a joint release.

“Customers in industries such as scientific research, healthcare, surveillance, and media and entertainment are increasingly adopting private clouds to meet their data storage needs,” said Sanjay Jagad, senior director of products and solutions at Cloudian.

“The Cloudian-InfiniteIO joint solution enables these customers to overcome the scale and performance limitations of traditional storage at a cost savings of up to 70 percent.”

What the private cloud storage solution does

The Cloudian and InfiniteIO solution helps ensure data is properly placed across primary and secondary storage as well as public cloud, potentially saving millions of dollars in primary and backup storage costs, the companies say.

Organizations can install InfiniteIO like a network switch to offload file metadata operations and intelligently migrate hundreds of petabytes of inactive data from on-premises NAS systems to the exabyte scalable Cloudian object storage system.
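The policy at the heart of that kind of tiering is simple to illustrate. The sketch below is not InfiniteIO’s implementation, which works transparently at the metadata layer; it is a toy Python version of the same idea, using boto3 against an S3-compatible HyperStore endpoint, with all names hypothetical.

```python
import os
import time

import boto3

COLD_AFTER = 90 * 24 * 3600  # seconds of inactivity before a file is "cold"

# Cloudian HyperStore exposes an S3-compatible API, so a standard S3
# client pointed at a private endpoint works; the URL is hypothetical.
s3 = boto3.client("s3", endpoint_url="https://hyperstore.example.internal")

def tier_cold_files(root: str, bucket: str) -> None:
    """Copy files not accessed in COLD_AFTER seconds from an
    on-premises share into object storage."""
    now = time.time()
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if now - os.stat(path).st_atime > COLD_AFTER:
                key = os.path.relpath(path, root)
                s3.upload_file(path, bucket, key)
                # A real tiering product would leave a stub behind so
                # subsequent reads are redirected transparently.

tier_cold_files("/mnt/nas/projects", "cold-tier")
```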

Using Cloudian and InfiniteIO, customers can attain highly available enterprise-class storage with the performance of all-flash NAS in all storage tiers.

“Cloudian’s focus on delivering limitlessly scalable, highly cost-effective storage is the foundation enterprises need to manage and protect increasing data volumes across on-premises, hybrid cloud and multi-cloud environments,” said Liem Nguyen, vice president of marketing at InfiniteIO.

“The simplicity, performance and scale that InfiniteIO and Cloudian are bringing together will help organizations extend their existing IT investments to save money yet uniquely avoid disruption to their business.”

Startup Stores All of Wikipedia Data on a DNA Strand
https://mytechdecisions.com/it-infrastructure/startup-stores-all-of-wikipedia-data-on-a-dna-strand/ (Thu, 18 Jul 2019)

A technology as old as life itself, DNA strands provide a chemically stable storage alternative that resists the technological inevitability of becoming obsolete.

A startup called Catalog is working around storage disks and drives that quickly become obsolete by employing a more timeless technology: DNA. The company successfully uploaded all 16GB of text from Wikipedia’s English-language library onto synthetic genetic molecules that look and act just like the ones you would find in our bodies.

The Boston-based company was founded by Chief Executive Hyunjun Park and Chief Technology Innovation Officer Nathaniel Roquet in 2016; at the time they were an MIT postdoc and a Harvard graduate, respectively. They devised a technological process that records 4 megabits of information directly onto synthetic strands of DNA, and they hope to push that rate to 125 gigabytes per day, as much information as many upscale smartphones can hold.

Putting digital data onto a genetic molecule is an impressive process that requires a DNA writer, which CNET describes as being able to “fit easily in your house if you first got rid of your refrigerator, oven and some counter space.” Catalog’s DNA strands are much shorter than human DNA, but they use more of them to create more storage space. The idea of storing information on synthetic DNA strands may seem cumbersome, questionable, and unfathomable for many, but the strands themselves are compact and reliable. 
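Catalog’s actual encoding scheme is its own (and includes error correction), but the textbook idea of packing two bits of data into each nucleotide is easy to sketch:

```python
# Map each 2-bit pair to one of the four DNA bases.
BASE_FOR_BITS = {"00": "A", "01": "C", "10": "G", "11": "T"}

def bytes_to_bases(data: bytes) -> str:
    """Encode binary data as a DNA base sequence, two bits per base."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BASE_FOR_BITS[bits[i:i + 2]] for i in range(0, len(bits), 2))

print(bytes_to_bases(b"Hi"))  # the 16 bits of "Hi" become 8 bases: CAGACGGC
```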

That being said, you won’t be choosing between a DNA writer and a USB drive at your nearest convenience store anytime soon, so who are these synthetic strands helping? 

Catalog recently announced a partnership with the Arch Mission Foundation, whose goal is to store human knowledge elsewhere in the solar system, in locations such as the Tesla Roadster launched into orbit by SpaceX. Catalog’s potential customers are specific but scientifically ambitious.

“We have discussions underway with government agencies, major international science projects that generate huge amounts of test data, major firms in oil and gas, media and entertainment, finance, and other industries,” the company wrote in a statement.

Argonne Completes Largest Single File Transfer in Globus History
https://mytechdecisions.com/it-infrastructure/argonne-completes-largest-single-file-transfer-in-globus-history/ (Wed, 10 Jul 2019)

Argonne National Laboratory scientists transferred 2.9 petabytes of data on the Oak Ridge Summit supercomputer, the largest migration in the history of Globus data management.

This week, research data management service Globus announced the largest single file transfer in its history: a team of scientists at Argonne National Laboratory led the movement of 2.9 petabytes of data, part of a research project involving three of the largest cosmological simulations to date.

The data was stored on the Oak Ridge Summit supercomputer, currently the world’s fastest supercomputer. Globus was tasked with moving files from disk to tape, a typical migration for researchers. Globus is software-as-a-service for research data management used by hundreds of research institutions and high-performance computing (HPC) facilities worldwide. The service enables secure, reliable file transfer, sharing, and data publication for managing data throughout the research lifecycle.
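A disk-to-tape move like this is driven through the Globus transfer API. Here is a minimal sketch using the globus-sdk Python package; the access token and endpoint UUIDs are placeholders, and a real deployment would obtain tokens through a Globus Auth login flow.

```python
import globus_sdk

# Placeholder token; production code would run an OAuth2 login flow.
authorizer = globus_sdk.AccessTokenAuthorizer("TRANSFER_API_TOKEN")
tc = globus_sdk.TransferClient(authorizer=authorizer)

SRC = "11111111-2222-3333-4444-555555555555"  # hypothetical disk endpoint
DST = "66666666-7777-8888-9999-aaaaaaaaaaaa"  # hypothetical tape/HPSS endpoint

# checksum sync verifies each file's integrity after it lands.
tdata = globus_sdk.TransferData(
    tc, SRC, DST, label="cosmology backup", sync_level="checksum"
)
tdata.add_item("/scratch/cosmo/run1/", "/hpss/backups/run1/", recursive=True)

task = tc.submit_transfer(tdata)
print("submitted task:", task["task_id"])
```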

“Storage is in general a very large problem in our community — the Universe is just very big, so our work can often generate a lot of data,” says Katrin Heitmann, Argonne physicist and computational scientist and an Oak Ridge National Laboratory Leadership Computing Facility (OLCF) Early Science user. “Using Globus to easily move the data around between different storage solutions and institutions for analysis is essential.”

“Due to its uniqueness, the data is very precious and the analysis will take time,” says Dr. Heitmann. “The first step after the simulations were finished was to make a backup copy of the data to HPSS, so we can move the data back and forth between disk and tape and thus carry out the analysis in steps. We use Globus for this work due to its speed, reliability, and ease of use.”

“With exascale imminent, AI on the rise, HPC systems proliferating, and research teams more distributed than ever, fast, secure, reliable data movement and management are now more important than ever,” says Ian Foster, Globus co-founder and director of Argonne’s Data Science and Learning Division. “We tend to take these functions for granted, and yet modern collaborative research would not be possible without them.”

What Decision Makers Need to Know About Energy Consumption and Universal Memory
https://mytechdecisions.com/it-infrastructure/what-decision-makers-need-to-know-about-energy-consumption-and-universal-memory/ (Wed, 10 Jul 2019)

A university’s recent breakthrough on Universal Memory promises a solution to the spike in residential energy consumption, and an idea of which direction memory research needs to go.

According to Lancaster University, a new type of computer memory, Universal Memory, will solve “the digital technology energy crisis.”

A type of electronic memory device, Universal Memory provides ultra-low energy consumption, which is expected to “reduce peak power consumption” at data centers by one-fifth. Lancaster University says this is especially important as residential energy savings are being eroded by end users’ growing use of computers and mobile devices; “by 2025 a ‘tsunami of data’ is expected to consume a fifth of global electricity.”

While the development of Universal Memory has been challenging, the university says its researchers used quantum mechanics to “solve the dilemma of choosing between stable, long-term data storage and low-energy writing and erasing.” With this issue solved, Universal Memory might be able to replace the $100bn market for Dynamic Random Access Memory (DRAM), the working memory of computers, as well as the long-term memory provided by flash drives.

As many decision makers may already know, DRAM has served well for a long time. Writing data to DRAM is fast and low-energy; the downside is that the data is volatile and needs to be refreshed continuously to prevent it from being lost. Flash, meanwhile, stores data reliably, but writing and erasing it is slow and uses a great deal of energy, wearing the medium out and making it unsuitable as working memory. “This is clearly inconvenient and inefficient,” Lancaster University says.

In their article on Universal Memory, published in Scientific Reports, the Lancaster University researchers suggest that their solution will pay off in the long run, especially as end users’ and decision makers’ memory needs shift. They also predict that research on memory utilization won’t stop at Universal Memory; it will be ongoing. “Whilst the different forms of conventional (charge-based) memories are well suited to their individual roles in computers and other electronic devices, flaws in their properties mean that intensive research into alternative, or emerging, memories continues.”

Q&A: An Opinion on Data Loss Detection and Response Platforms
https://mytechdecisions.com/it-infrastructure/cybersecurity-data-loss-and-response-platforms/ (Wed, 29 May 2019)

In a world of ever-evolving threats, organizations must be able to protect data regardless of where it travels while enabling collaboration and information sharing, so people can get work done. To react to today’s problems and proactively anticipate tomorrow’s, it takes a whole new set of rules, especially as more businesses push data into public clouds.

My TechDecisions sat down with GRA Quantum’s Chief Information Security Officer Antonio Garcia to learn more about the portfolio of cybersecurity services the company offers, how it achieved greater control and visibility over its own public cloud deployment with a data loss detection and response platform, and why it’s important for the company to try its partners’ software solutions before selling them to customers.

TD: What are some of the biggest concerns your customers are facing in today’s business environment?

Antonio: The causes of today’s security threats involve people as much as the digital and physical limits of software solutions. Right now, businesses are in the midst of a significant shift as more organizations move critical data into public clouds to reap the many benefits they offer.

The downside to this rapid shift is that documents containing sensitive information are moving to the cloud faster than businesses can protect them.

This puts organizations at greater risk. Users are not following security policies or paying attention to the potential security dangers of sharing data from a public cloud. This is causing some anxiety for the security and risk professionals charged with protecting data and keeping companies in compliance with data privacy regulations.

They’re well aware that once files are downloaded from a public cloud and forwarded to outside users, they have lost the ability to protect that information from unsanctioned users.

TD: Could you tell us more about GRA Quantum? Who are your customers and how do you help them?

Antonio: GRA Quantum helps small firms facing big threats build and implement tailored cybersecurity programs that protect their business and reputation.

We offer managed security services and an array of professional services, including penetration testing, security risk assessments, incident response, insider threat planning, and security awareness training.

TD: How do you select which technology vendors will be part of your portfolio of solutions?

Antonio: As a provider of cybersecurity services, GRA Quantum takes the trust of its customers and their security infrastructure very seriously. These things go hand-in-hand when it comes to selecting technology to use and sell.

A majority of the technology solutions we sell to our clients are the same ones we use and rely on to protect our business. We wouldn’t promote or endorse anything we hadn’t used or weren’t completely confident in.

TD: Can you give us an example of a solution you used that you now recommend to your clients?

Antonio: Our team was in need of additional visibility and control over documents stored in our Microsoft OneDrive deployment across our global GRA offices.

We started using Allure Security, a data loss detection and response platform, internally.

In a short amount of time, it gave us a greater understanding of OneDrive use in various locations. We were able to monitor document access, in real time, and know where and when documents were being downloaded and shared.

We gained a lot more visibility into user and file activities, and can better inform our data loss responses based on Allure’s unique document and geolocation indicators.

TD: Why is this important to know?

Antonio: For global organizations like ours, it is becoming increasingly vital to have more control over where data travels due to data privacy regulations such as GDPR and others.

It’s also just good business to add more controls around data stored in public clouds. Having more context around where and how documents are being shared allows us to establish a baseline of normal behavior within our OneDrive deployment.

We can then monitor and measure against that baseline, so when unusual behavior is detected, we can drill down into that and determine if there’s a risk of data loss.

For example, with the ability to know when large volumes of files are being downloaded, by whom, and where these documents are being sent, we can determine whether a data breach has occurred or it’s just a matter of a user needing these files to do their work.

TD: Is “alert fatigue” or information overload ever a concern when using solutions like this?

Antonio: With the Allure data loss detection and response platform, we were able to configure data loss risk monitoring based on specific criteria. Our deployment is set up to issue alerts only when a document is accessed in a location or region where GRA Quantum has no office.

For other regions, we only receive alerts when an attempt to access a document falls outside of our company security policy.
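In code terms, the rule Antonio describes boils down to something like the following sketch; the office list and event fields are hypothetical, and this illustrates the policy rather than Allure’s actual API.

```python
OFFICE_REGIONS = {"US", "GB", "AU"}  # hypothetical office footprint

def complies_with_policy(event: dict) -> bool:
    # Placeholder policy check, e.g. sanctioned user, known device,
    # access within working hours.
    return event.get("user_sanctioned", False)

def should_alert(event: dict) -> bool:
    """Alert on any access from a region with no office; elsewhere,
    alert only when the access violates the security policy."""
    if event["country"] not in OFFICE_REGIONS:
        return True
    return not complies_with_policy(event)

# A document opened from the Philippines always raises an alert.
print(should_alert({"country": "PH", "user_sanctioned": True}))  # True
```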

Using these criteria, the team rarely receives false positive alerts, but even in these cases, they are not a waste of time to investigate. For example, we received an alert that a GRA administrative assistant who was working in the United States opened a file in the Philippines.

We don’t have an office there, so once we saw the alert about this potentially unusual activity, we made a few calls and learned that this individual user had utilized a VPN with an IP address in the Philippines to access the file. It was reassuring to know that Allure was able to detect the activity, and it gave us peace of mind to be able to quickly determine that this wasn’t an attack.

TD: How does this solution differ from other security tools that you rely on?

Antonio: Anyone who manages or secures a OneDrive cloud instance is familiar with something called a cloud activity log. Security operations teams comb through these logs and look for any suspicious user behavior, but the difference is that Allure is able to flag and escalate any unusual activity ahead of time.

In the example I talked about earlier involving the VPN in the Philippines, our Managed Security Services Director actually checked for this activity in our raw Microsoft logs, and found that the event was incredibly difficult to locate. Allure was able to alert us to activity that would have otherwise been buried.

TD: Aside from the effectiveness and usability of a software solution, what are some other factors that help GRA Quantum when evaluating partner vendors?

Antonio: Once we have personally experienced the benefits of using a data loss detection and response platform, we look at other deciding factors, such as the responsiveness of the vendor. What kind of support can they provide after the sale?

In this case, Allure’s detection and response technology inspired us to become a partner, and now we resell it to our clients. In addition to having great technology, Allure is also a true partner. They are receptive to our product feedback.

They listen to our needs and ideas, and are incorporating some of our suggestions directly into the product. In my view, our sales and support team can better position solutions than some of our competitors because everything we sell has gone through our own internal review.

It gives us the confidence to stand behind the data loss detection and response platforms we recommend to clients, because we are customers, too.

Conquering Security Risks in the Cloud
https://mytechdecisions.com/network-security/conquering-security-risks-in-the-cloud/ (Thu, 04 Apr 2019)

Are you ready to tackle the challenges of protecting data in the cloud?

The cloud is responsible for a lot of good things in the world of enterprise. It has simplified storage, reduced costs, and maximized productivity. But there are still many lingering challenges and concerns as new cyber threats evolve.

According to findings from Barracuda, backing up data in the cloud is getting more complex: 57 percent of survey respondents are responsible for backing up more than two sites, and 35 percent are using multiple cloud services.

In the study titled Closing Backup and Recovery Gaps, Barracuda surveyed more than 1,000 IT professionals, business executives, and backup administrators worldwide to find out more about their data protection strategies.

Here are some of the key findings:

  • IT decision makers are warming up to the cloud, and the use of the cloud as a secondary backup location is on the rise.
    • 64 percent of global organizations say they replicate backup data to the cloud
    • 36 percent still do not follow this best practice
  • IT teams view email, SQL, and proprietary application data as the most common workloads to protect with backup, but SaaS data is not viewed as critical, which puts business continuity at risk.
    • Only 16 percent of respondents report that they back up their SaaS data.
  • Office 365 is one of the most popular cloud-based productivity platforms, but Office 365 confusion is exposing firms to significant risk
    • More than 60 percent of SMBs are using Office 365 to drive business success
    • 40 percent are not using any third-party backup tools to protect mission-critical data because they believe Office 365 provides all the backup they need, which is unlikely to be true

“While more IT professionals are embracing ways the cloud can support data protection, such as replicating backup data to the cloud, many are making dangerous assumptions about SaaS and cloud data that are putting organizations at risk,” says Chris King, director, Product Management, Data Protection at Barracuda Networks. “IT still needs to consider how data is protected, even after migrating to cloud or SaaS applications.”
