data Archives - My TechDecisions
https://mytechdecisions.com/tag/data-1/
The end user's first and last stop for making technology decisions

What is It About AI That Brings Excitement, Fear and Curiosity All at Once?
https://mytechdecisions.com/news-1/what-is-it-about-ai-that-brings-excitement-fear-and-curiosity-all-at-once/ (Mon, 20 Nov 2023)

Artificial Intelligence (AI) has taken over most business conversations. The progress is astounding. The use cases are promising. The possibilities are intriguing. The concept is not new. We’ve been toying around with AI since the 1950s. Research and experimentation slowed down a bit in the early 2000s, and now it is skyrocketing for obvious reasons. The capacity to capture, store and compute on data is no longer a limitation. Let’s skip the rest of the history lesson and focus on today. AI is here. It is happening. And it is leaving mixed feelings around its presence, its essence and its role. What do we do with it? How do we use it? How much can we trust it? How can it help our businesses? And when will it take our jobs?

The Exciting Part of AI

Watching images and scenery get constructed in front of our eyes into flawless visuals is breathtaking. Receiving on-point responses to our various inquiries and engaging in a constructive conversation with a web application is fascinating. Viewing personalities recorded on video having a discussion they never had in real life is jaw-dropping. This is where the excitement starts. But it is not all AI can do. It can do real work that makes us more productive. It can make supply chain decisions. It can make medical diagnoses. It can build personalized curricula for students based on their individual capabilities.

At the core of it, AI is possible because of two things: an advanced algorithm and data—lots of data. AI algorithms differ from what we’ve traditionally been accustomed to in previous technology innovations. They do not rely on if/then/else logic. There is no human-authored decision logic at all. It is left to the machine to derive decisions through deep learning, with the help of neural networks and data. AI feeds on data. The more data there is, the better AI becomes at developing synopses and characterizations.
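To make the contrast concrete, here is a minimal, hypothetical sketch (the loan scenario, features and numbers are invented for illustration, not taken from the article): traditional software encodes the decision rule by hand, while a machine-learning model infers its own rule from examples.

```python
from sklearn.linear_model import LogisticRegression

# Traditional software: a human writes the decision logic explicitly.
def approve_loan_rule_based(income, debt):
    # Hand-coded if/then/else rule supplied by a human expert.
    return income > 50_000 and debt / income < 0.4

# AI/ML: the "rule" is learned from historical examples instead.
X = [[60_000, 10_000], [30_000, 20_000], [80_000, 5_000], [25_000, 15_000]]
y = [1, 0, 1, 0]  # past outcomes: 1 = approved, 0 = denied

model = LogisticRegression().fit(X, y)    # the machine infers the boundary
print(model.predict([[55_000, 12_000]]))  # decision comes from data, not hand-written logic
```

The more representative examples the model sees, the better its learned boundary becomes, which is exactly why AI "feeds on data."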

AI has become very good at making decisions on content development, customer insight, production processes and even medical diagnosis. It is one thing for AI to do what we do. It is another thing for AI to do it better than us. On some tasks, AI now has lower error rates than a human. That’s where excitement turns into fear.

The Frightening Part of AI

The fear of AI falls into three camps. The first is around the bias generated by AI. AI reflects what it learns from data, and it is hard to ensure that data is inclusive of all situations. If it is a loan-approval AI, is it influenced by demographics or environmental conditions? There is no standard validation of AI training data. Thus, there is a concern that although AI can be precise based on the data feeding it, it can also be biased. And if there is a probability of bias, how can we trust its reasoning to be objective in sensitive areas?

The second fear is the loss of jobs. If AI is destined to be as good as, or better than, a human at resolving issues, accomplishing tasks and making decisions, it is only a matter of time before AI replaces people in the workforce. Forecasts range from 10% to 70% of jobs lost by 2030. We haven’t been good at forecasting the impact of previous innovations; nonetheless, there will be change. It is not necessarily all bad. There will be opportunities in roles that never existed before, like prompt engineering. Humans can adapt to change. The only difference here is that it is happening at an astounding speed.

Living with AI is the third fear. AI as we know it today does not yet fit its theoretical definition of mimicking how a human brain thinks, rationalizes and directs behavior. People can do their taxes, paint a canvas, serve on a jury and hold a conversation about ancient literature. We may not do all of them with expertise, but we can cross-learn and think about a multitude of topics. In contrast, AI today is deep and narrow. An AI image generator cannot review financial statements. But then again, we rarely find a judge who is also a surgeon. We don’t know if, or when, AI capabilities will converge with those of humans. But if they do, what will become of our purpose? Even in the current state of multiple specialized AI solutions, each generating efficiencies in its respective area, would a business with a single owner employing AI to run its entire operations be a state we are willing to accept and live with?

The Curiosity to Keep AI Going

We’ve been curious about AI since 1997, when IBM’s Deep Blue supercomputer beat the world chess champion, and again in 2011, when IBM Watson beat the Jeopardy! champions. The fears may be big, but the curiosity is bigger. What else is possible? And how much is possible?

None of the fears above has a strong rebuttal, so they are all valid. AI can redefine who we are and what we do, especially if AI starts generating and managing new AI without any human guidance. Time magazine’s “End of Humanity” cover and the open letter from AI practitioners requesting a pause make these fears more real. But it is all hypothetical. We don’t know what we don’t know. That’s why it is hard to pause now. We are all curious to know. Curiosity will continue to drive us to learn, experiment and innovate. The result can be something we never dreamed of before. Will AI help us travel in time? Will we gain the knowledge to explore outer galaxies at the speed of light because of the extra artificial neural power we will have? Will this become our new purpose and identity rather than a job title?

What to Do Today

As individuals, it is important that we get comfortable with AI: if not to use it, then to know it well. We will all be touched by it one way or another in the near future, if not already. We should ask questions—not out of fear but out of curiosity. And we should actively participate in developing it and making it better as it pertains to our domains and our expertise.

A business or an organization should engage in AI adoption to advance operational efficiency and decision-making precision. But in parallel, it should focus equally on social responsibility. Social responsibility is a mandatory track for the diffusion and evolution of AI. Understanding anticipated impacts and actively working to transition societies to a workable future state is the responsibility of every business benefiting from AI. The work around social responsibility is lagging, hence the fear. For it to catch up, AI advancement does not have to slow down or pause. We just have to turbocharge social awareness and practical problem-solving. That goes beyond setting restrictive laws and protocols. Ironically, AI may be an option to help set its own social guardrails.

In practice, an organization needs to find its AI opportunities. AI is not a blanket solution across the chain of operations; targeting efficiencies in a particular area (acquisition, production, fulfillment) is where progress and success can be achieved. From there, an appropriate model or algorithm is identified and secured. The marketplace has plenty of models to choose from, many with simple setup. Third-party data can be leveraged for the AI, but if an organization wants or depends on insights distinctive to its own customer base, then increased investment in first-party data collection, curation, cleansing, storage and analysis is necessary. This will empower its AI solution to provide tailored decisions and directions supporting its unique efficiency and growth trajectory.
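As a rough sketch of what that first-party data work can look like in practice (the file, columns and steps are hypothetical, and real pipelines are considerably more involved):

```python
import pandas as pd

# Hypothetical first-party data: raw customer records collected by the business.
df = pd.read_csv("customers.csv")

# Curation and cleansing: deduplicate, drop records missing key fields,
# and normalize types so downstream models see consistent inputs.
df = df.drop_duplicates(subset="customer_id")
df = df.dropna(subset=["customer_id", "email"])
df["signup_date"] = pd.to_datetime(df["signup_date"], errors="coerce")

# Store the curated set for analysis and model training.
df.to_parquet("customers_curated.parquet")
```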


Raghid El-Yafouri is the Technology and Digital Transformation Consultant at Bottle Rocket Studios.

Spike in Cyberattacks Exposes Vulnerabilities in University Security Measures
https://mytechdecisions.com/physical-security/spike-in-cyberattacks-exposes-vulnerabilities-in-university-security-measures/ (Mon, 21 Aug 2023)

Note: The views expressed by guest bloggers and contributors are those of the authors and do not necessarily represent the views of, and should not be attributed to, My TechDecisions.

As expected from authorities anticipating an increase in threats to the education sector, cyberattacks are continuing to wreak havoc on colleges and universities across the United States. As of the beginning of May, there had already been 27 confirmed ransomware attacks against U.S. institutions. These ransomware numbers only tell part of the story as data breaches, malware attacks, and more account for an even greater number of threats, not all of which are reported to the public as they occur.

The second quarter of 2023 has seen a flurry of cyberattacks strike higher education institutions, including West Virginia’s Bluefield University, Tennessee’s Chattanooga State Community College and Georgia’s Mercer University, among others. Beyond the obvious consequences of ransom payments and leaked personal data, some of the most severe attacks in recent memory have culminated in the delay and cancellation of classes, as well as the outright closure of one college in Illinois.

With attacks against higher education on the rise year over year, campuses have become one of the top targets for attempted data breaches, ransomware attacks, malware and more. Hampered by financial and technological hurdles, most schools are not currently equipped with the security controls to adequately defend themselves against the increasingly sophisticated cyber threats that continue to plague the community.

This increase in cyberactivity should serve as a wake-up call for higher education institutions to reevaluate and enhance their cybersecurity postures. Here are some of the top considerations for higher education leaders seeking to plug the gaps in their cybersecurity strategy.

Securing Data

One of the recurring themes in attacks against higher education is the vulnerability of sensitive data. From student, staff, and faculty information to sensitive school records, there are countless data assets that, if breached, can be weaponized against institutions.

Data exfiltration, or unauthorized data transfer, is a leading threat to data security in higher education. To help prevent data loss, colleges and universities need user and entity behavior analytics (UEBA) to monitor individual behavior and a network detection and response (NDR) tool to watch their networks. Together, these allow schools to detect, qualify and remediate anomalous activity at the individual level, as well as malicious or unauthorized exfiltration attempts.
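As a greatly simplified illustration of the behavioral baselining such tools perform (real UEBA/NDR products model many more signals; the volumes and threshold below are invented):

```python
from statistics import mean, stdev

# Hypothetical daily outbound transfer volume (MB) per user over the past week.
history = {"jsmith": [120, 95, 110, 130, 105, 98, 115],
           "akhan": [80, 75, 90, 85, 70, 88, 82]}
today = {"jsmith": 118, "akhan": 4200}  # akhan suddenly uploads ~4 GB

for user, past in history.items():
    mu, sigma = mean(past), stdev(past)
    # Flag activity more than three standard deviations above the user's own baseline.
    if today[user] > mu + 3 * sigma:
        print(f"ALERT: possible exfiltration by {user}: "
              f"{today[user]} MB today vs. ~{mu:.0f} MB baseline")
```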

Managing Access

For colleges and universities, student information, research data, and assessment criteria are all critical to daily operations. However, it can be common for institutions to encounter unauthorized access to these types of crucial information due to a lack of IT resources and necessary safeguards. This can result in the loss of confidentiality, integrity, and availability of technological assets, among other things.

To better facilitate and manage user access to sensitive data, schools should implement an effective IT security strategy intentionally designed to protect critical assets. This strategy should include the compartmentalization of data and a least-privilege approach to accessing that data. Under a least-privilege approach, users are granted access only to the data required for their specific roles. This prioritizes the protection of the intellectual property that is so valuable to higher education institutions. In doing so, schools can better protect the privacy of their students and employees, along with their own reputations.
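At its core, least privilege is a deny-by-default mapping from roles to the data those roles genuinely need. A minimal sketch (the roles and datasets are hypothetical):

```python
# Each role is granted only the datasets its duties require, nothing more.
ROLE_ACCESS = {
    "registrar": {"student_records"},
    "researcher": {"research_data"},
    "assessor": {"assessment_criteria"},
}

def can_access(role: str, dataset: str) -> bool:
    """Deny by default; allow only explicitly granted role/dataset pairs."""
    return dataset in ROLE_ACCESS.get(role, set())

assert can_access("registrar", "student_records")
assert not can_access("researcher", "student_records")  # compartmentalized
```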

Detecting Threats

Even with cybersecurity mechanisms in place, no security threat can be resolved if it goes undetected. Colleges and universities must be able to detect threats, alert on them and automate response actions when they arise. Institutions should consider adopting security orchestration, automation, and response (SOAR) tools to help standardize and scale their incident response.

By relying on SOAR, schools can automate workflows to accelerate various stages of the threat investigation and response processes. Depending on the severity of a particular threat, it can be escalated to key decision-makers for a manual response or remediated automatically (or semi-automatically) from a playbook of preselected actions. Ultimately, SOAR is intended to help security teams cut through the noise and allow them to prioritize and direct their attention toward the most pressing threats.
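Conceptually, a SOAR playbook maps each class of detected threat either to a preselected automated action or to escalation for a human decision. A hedged sketch (real SOAR platforms express playbooks declaratively; these threat types and actions are placeholders):

```python
def quarantine_host(alert):   # placeholder remediation actions
    print(f"Quarantining host {alert['host']}")

def disable_account(alert):
    print(f"Disabling account {alert['user']}")

def escalate(alert):
    print(f"Escalating alert {alert['id']} to the on-call analyst")

# Known, lower-risk threats get automatic responses; anything else goes to a human.
PLAYBOOK = {"commodity_malware": quarantine_host,
            "credential_stuffing": disable_account}

def respond(alert):
    PLAYBOOK.get(alert["type"], escalate)(alert)

respond({"id": 1, "type": "commodity_malware", "host": "lab-pc-17"})
respond({"id": 2, "type": "ransomware"})  # unrecognized type: escalated to a human
```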

Protecting and Prospering

Given the attack patterns of the last two years, cyberattacks in higher education are not going away overnight. Colleges and universities continue to be targeted by malicious actors for a reason. As long as institutions remain underequipped to monitor and respond to cybersecurity threats, they will find themselves with a target on their back.

Regardless of an institution’s budgetary constraints, there are tried-and-true precautions that can be taken to better protect its campus. Implementing threat detection, stricter access controls and stronger data security measures are all foundational components of an effective cybersecurity strategy. By solidifying that foundation, colleges and universities can do their part to avoid being next in the line of higher education victims.

Another version of this article originally appeared on our sister-site Campus Safety on August 14, 2023. It has since been updated for My TechDecisions’ audience.


Kevin Kirkwood is Deputy CISO for LogRhythm.

AWS Launches General Availability of Amazon Security Lake
https://mytechdecisions.com/network-security/aws-launches-general-availability-of-amazon-security-lake/ (Wed, 31 May 2023)

AWS is launching the general availability of Amazon Security Lake, a new service designed to automatically centralize an organization’s security data from across their AWS environments, leading SaaS providers, on-premises environments, and cloud sources into a purpose-built data lake.

According to AWS, this allows customers to act on security data faster and helps them simplify security data management across hybrid and multicloud environments.

The Seattle-based tech giant says Amazon Security Lake converts and conforms incoming security data to the Open Cybersecurity Schema Framework (OCSF) open standard to make it easier for security teams to automatically collect, combine and analyze security data from more than 80 sources. Those sources include AWS, security partners and analytics providers.
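To illustrate what normalizing to OCSF means in practice, here is a hedged sketch that maps a made-up vendor log entry onto a few OCSF-style fields; the vendor format and the specific class/severity values are illustrative, not an authoritative mapping.

```python
# A made-up record in some security product's native log format.
vendor_event = {"ts": "2023-05-31T15:12:08Z", "action": "login_failed",
                "src": "203.0.113.7", "account": "jdoe"}

def to_ocsf(e):
    """Map a vendor event onto a simplified OCSF-style record.

    Field names follow OCSF conventions; the values chosen here are
    illustrative rather than a complete, validated mapping.
    """
    return {
        "class_uid": 3002,  # OCSF Authentication event class
        "activity_id": 1,   # logon attempt
        "time": e["ts"],
        "severity_id": 3,   # e.g., Medium
        "status": "Failure" if "failed" in e["action"] else "Success",
        "src_endpoint": {"ip": e["src"]},
        "user": {"name": e["account"]},
    }

print(to_ocsf(vendor_event))
```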

Some of those source, subscriber and service partners include Barracuda, Cisco Secure, CrowdStrike, Darktrace, ExtraHop, Lacework, Netscout, Netskope, Okta, Palo Alto Networks, Ping Identity, Trellix, Trend Micro, VMware Aria Automation for Secure Clouds, Wiz, Zscaler, Rapid7, IBM Security, Splunk, Accenture, Booz Allen Hamilton, Deloitte, PwC and many more. Read the full list of partners here.

AWS calls Amazon Security Lake part of a “broad set of AWS Cloud security services built on AWS infrastructure to help make it the most flexible and secure cloud trusted by millions of customers, including some of the most security-sensitive organizations, and is supported by a broad community of security partners to help customers elevate their security in the cloud.”

The company says Amazon Security Lake essentially aggregates and optimizes large volumes of disparate log and event data to help enable faster threat detection, investigation and response so organizations can effectively address potential threats more quickly using their preferred analytics tools.

Amazon Security Lake is designed to help companies aggregate and normalize security data into one consistent schema to help analyze it and understand their vulnerabilities and monitor threats, which can be difficult in hybrid IT environments.

This can also help organizations centralize their security operations and eliminate the need to duplicate and process the same data multiple times in different security solutions, AWS says.

In addition, monitoring new users, tools and data sources means managing a complex set of data access rules and security policies to track how data is used while ensuring that employees can still access the information needed to do their jobs. Some security teams create a central repository for all of their security data in a data lake, but AWS says these systems require specialized skills and can take a long time to build.

AWS says the service uses Amazon Simple Storage Service (Amazon S3) and AWS Lake Formation to automatically set up security data lake infrastructure in a customer’s AWS account, giving the customer full control and ownership over its security data.
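Because the lake lands normalized data in S3 and Lake Formation tables, subscribers can query it with standard analytics tools such as Amazon Athena. A minimal sketch using boto3; the database, table and bucket names are placeholders, since the actual Glue resources are created by Security Lake in each account:

```python
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Placeholder names: the real database/table identifiers are generated
# by Security Lake in your account and will differ.
resp = athena.start_query_execution(
    QueryString="""
        SELECT time, severity_id, metadata
        FROM security_lake_db.cloudtrail_events
        WHERE severity_id >= 4
        LIMIT 100
    """,
    QueryExecutionContext={"Database": "security_lake_db"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
print(resp["QueryExecutionId"])
```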

Amazon Security Lake is generally available today in US East (Ohio), US East (N. Virginia), US West (Oregon), Asia Pacific (Singapore), Asia Pacific (Sydney), Asia Pacific (Tokyo), Europe (Frankfurt), Europe (Ireland), Europe (London), and South America (São Paulo) with availability in additional AWS Regions coming soon.

In a statement, Jon Ramsey, the vice president for Security Services at AWS, said security has been the company’s priority since the beginning.

“We also know that customers need trusted partners to extend the benefits of the cloud and make sure their organizations are secure end-to-end,” Ramsey said. “With more than 80 sources providing data to Amazon Security Lake, security teams can achieve greater visibility into potential security threats and how to respond to them, further protecting the workloads, applications, and data that are critical to driving business forward.”

Hitachi Vantara Introduces Data Reliability Engineering Services to Optimize Data Ecosystems
https://mytechdecisions.com/it-infrastructure/hitachi-vantara-introduces-data-reliability-engineering-services-to-optimize-data-ecosystems/ (Tue, 30 May 2023)

Hitachi Vantara, the data management and digital solutions subsidiary of Hitachi, is launching Hitachi Data Reliability Engineering, a new suite of consulting services designed to help organizations improve the quality and consistency of business-critical data.

This comes as organizations are struggling with the increasing complexity of data and IT environments, connected devices, and applications, in addition to a surge of AI-enabled tools.

According to Hitachi, Data Reliability Engineering (DRE) allows organizations to embed quality data into applications and enhance internal processes and customer-centric business strategies.

The solution combines cutting-edge tools with proven DataOps processes and employs metadata engineering, data lineage, data cost optimization, self-healing mechanisms and AI-driven automation to provide visibility, reliability and resilience throughout the data lifecycle, the company says.

According to Hitachi, DRE ensures high-quality data systems and pipelines via an automated and secure self-service approach that helps deliver consistent, trustworthy data.
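Hitachi has not published the implementation behind these services, but as a generic, hedged sketch of the kind of automated checks data reliability engineering implies (thresholds and column names are invented):

```python
import pandas as pd

def reliability_checks(df, key, ts_col, max_age_hours=24):
    """Generic data-reliability assertions: completeness, uniqueness, freshness."""
    issues = []
    if df[key].isna().any():
        issues.append(f"{key}: missing values")
    if df[key].duplicated().any():
        issues.append(f"{key}: duplicate keys")
    age = pd.Timestamp.now(tz="UTC") - pd.to_datetime(df[ts_col], utc=True).max()
    if age > pd.Timedelta(hours=max_age_hours):
        issues.append(f"{ts_col}: data is stale (last update {age} ago)")
    return issues

orders = pd.DataFrame({"order_id": [1, 2, 2],
                       "updated_at": ["2023-05-29T10:00:00Z"] * 3})
print(reliability_checks(orders, "order_id", "updated_at"))
```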

Roger Lvin, president of digital solutions at Hitachi Vantara, says many organizations are grappling with the unprecedented volume and complexity of their data environments and don’t have the resources to maintain the trustworthy, highly functional data needed to fuel complex analytics and modern applications.

“Hitachi DRE is Site Reliability Engineering (SRE) for data,” Lvin said in a statement. “Addressing the incredible pace of Generative AI and tsunami of data from connected devices, it has become an imperative to manage data pipelines safely and accurately through AI-driven automation. Hitachi DRE’s brand-new approach allows our customers to regain control of their data and maximize the value it provides to their organization.”

OpenAI Rolls Out New Data Control Tools for ChatGPT; Business Offering Soon
https://mytechdecisions.com/it-infrastructure/chatgpt-data-control/ (Tue, 25 Apr 2023)

Seemingly in response to data privacy and security concerns related to how data is used to train the AI models powering ChatGPT, OpenAI is introducing the ability to turn off chat history in ChatGPT to prevent conversations from being used to train and improve its models.

According to the AI research and development firm that has ushered in a new way of working defined by highly intelligent “copilots” capable of generating text, images and other media, the controls will be rolling out to all users.

The controls can be found in ChatGPT’s settings and can be changed at any time. Previously, OpenAI had an opt-out process for users who wanted to protect their data.

When users disable chat history, new conversations will be retained for 30 days and will be reviewed only when needed to monitor for abuse. Then, they will be permanently deleted, the company says in a new blog.

In addition, the company says it is working on a new ChatGPT Business subscription for professionals who want more control over their data, as well as enterprises who want to manage their end users.

OpenAI says the ChatGPT Business subscription offering, which will be launched in the coming months, will follow its API’s data usage policies, meaning that end users’ data won’t be used to train models by default.

The company is also introducing a new Export option in settings to make it easier for users to export their ChatGPT data and understand what information ChatGPT stores. Users who choose this option will receive a file containing their conversations and all other relevant data by email.

OpenAI has previously said that its large language models (LLMs) are trained on a broad range of data, including publicly available content, licensed content and content generated by human reviewers. The company has pledged to not use data to sell services, advertise, or build profiles of people. Instead, data is used to help improve the models powering new AI tools.

“While some of our training data includes personal information that is available on the public internet, we want our models to learn about the world, not private individuals,” the company said in a blog earlier this month. “So we work to remove personal information from the training dataset where feasible, fine-tune models to reject requests for personal information of private individuals, and respond to requests from individuals to delete their personal information from our systems.”

Data privacy and security have been a major concern of IT and security leaders, with some even calling for organizations to block unsanctioned use of ChatGPT and similar tools.

Rocket Software Launches Rocket Content Automation for Data Modernization
https://mytechdecisions.com/it-infrastructure/rocket-software-rocket-content-automation-data-modernization/ (Wed, 05 Apr 2023)

Enterprise software firm Rocket Software is unveiling Rocket Content Automation, a new tool designed to enable critical operational connections between disparate systems and data sources without introducing mainframe migration or compliance risks.

According to the Waltham, Mass.-based company, Rocket Content Automation helps drive collaboration between business and IT by creating seamless automation experiences across the Rocket Content Services portfolio and modernizing enterprises’ content and data landscapes.

The company says the solution helps organizations save money and resources by connecting mission-critical content across mainframe, distributed and cloud technologies with a hybrid approach that accelerates modernization efforts.

“Modernization efforts can take years to implement, but Rocket Content Automation accelerates the timeline to integrated, end-to-end processes that more closely align internal processes to support customer-facing initiatives,” the company says in a news release.

Rocket Software says Content Automation is a closed-loop environment that provides an end-to-end audit of all activity in a single dashboard, enabling organizations to scale the solution as they grow. The solution combines data from different sources into a single dashboard and drives integration with other systems.

Additional benefits of Rocket Content Automation include a standards-compliant automation engine that works with ERPs and with cloud and on-premises systems and solutions. The tool also features the revamped modern APIs, user experience tools and information governance services of Rocket’s Content Platform.

In addition, Rocket Content Automation includes self-service capabilities with dashboards and reports using familiar tools like Tableau and Power BI.

Chris Wey, Rocket Software’s president of data modernization, calls automation the “lynchpin” for data and information modernization.

“With Rocket Content Automation, customers can connect their legacy systems to modern business processes designed with modern tools,” Wey says in a statement. “Rocket Software is the only software provider to offer a unified solution for automation that is integrated into the product architecture and built for security and scalability.”

AWS Launches Clean Rooms to Help Companies Collaborate on Datasets
https://mytechdecisions.com/it-infrastructure/aws-clean-rooms-companies-collaborate-datasets/ (Wed, 05 Apr 2023)

Amazon Web Services is launching the general availability of AWS Clean Rooms, an analytics service of AWS Applications designed to help organizations and their partners more easily and securely collaborate on their collective datasets without sharing or copying each other’s data.

According to the tech giant, the solution allows customers to quickly create a secure data clean room on the AWS Cloud and collaborate with partners using a broad set of built-in, privacy-enhancing clean room controls that let organizations customize restrictions on the queries run by each clean room participant.

The restrictions include query controls, query outputs and query logging, as well as advanced cryptographic computing tools that keep data encrypted as queries are processed.

AWS says Clean Rooms currently supports up to five collaboration members, including the collaboration creator.

The company uses a marketing use case as an example, saying brands, publishers and partners need to collaborate using datasets that are stored across many channels and applications to improve campaigns and better engage customers.

“At the same time, they also want to protect sensitive consumer information and eliminate the sharing of raw data,” AWS says in a blog. “Data clean rooms can help solve this challenge by allowing multiple companies to analyze their collective data in a private environment.”

The tool works by analyzing Amazon S3 data in place, which AWS says eliminates the need for collaboration members to copy and load their data into destinations outside their respective AWS environments or to use third-party services.

AWS Clean Rooms also includes a broad set of privacy and security tools to protect each party’s data, such as analysis rules that allow customers to tailor queries to specific business needs.

Any functionality offered by AWS Clean Rooms can be accessed via the API using the AWS SDKs or the AWS CLI, the company says. This allows for the integration of AWS Clean Rooms into products and workflows.
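For instance, a collaboration member might run a restricted query programmatically. A minimal sketch assuming boto3's cleanrooms client; the membership ID, table, bucket and exact parameter shapes are placeholders and may differ from the live API:

```python
import boto3

cleanrooms = boto3.client("cleanrooms")

# Placeholder identifiers, supplied by your collaboration in practice.
resp = cleanrooms.start_protected_query(
    type="SQL",
    membershipIdentifier="membership-id-placeholder",
    sqlParameters={
        # The query must satisfy the collaboration's analysis rules,
        # e.g., aggregate-only output over the shared tables.
        "queryString": "SELECT city, COUNT(*) FROM customers GROUP BY city"
    },
    resultConfiguration={
        "outputConfiguration": {
            "s3": {"resultFormat": "CSV",
                   "bucket": "my-results-bucket",
                   "keyPrefix": "queries/"}
        }
    },
)
print(resp["protectedQuery"]["id"])
```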

Customers can also access query logs to ensure their data is being used as intended, and a cryptographic computing feature gives customers the option to perform client-side encryption for sensitive data.

The general availability release comes after the company first announced the tool at AWS re:Invent last year and released the preview in January.

Gartner: Less than Half of Data & Analytics Teams Effectively Drive Value to Business
https://mytechdecisions.com/news-1/gartner-less-than-half-of-data-analytics-teams-effectively-drive-value-to-business/ (Wed, 22 Mar 2023)

Less than half of data and analytics (D&A) leaders (44%) reported that their team is effective in providing value to their organization, according to Gartner, Inc.’s latest survey of 566 D&A leaders globally. Gartner analysts say chief data and analytics officers (CDAOs) must focus on presence, persistence and performance to succeed in their role and deliver measurable business results.

“D&A is in the business of driving stakeholder value,” said Donna Medeiros, senior director analyst at Gartner, in a statement. “The most successful CDAOs are outperforming their peers by projecting an executive presence and building an agile and strategic D&A function that shapes data-driven business performance and operational excellence.”

Traits of Successful CDAOs

The survey found that D&A leaders who rated themselves as “effective” or “very effective” across 17 different executive leadership traits correlated with those reporting high organizational and team performance. For example, 43% of top-performing D&A leaders reported effectiveness in committing time to their own professional development, compared with 19% of low performers.

“Successful CDAOs must be elite leaders,” said Alan Duncan, distinguished VP analyst, Gartner, in a statement. “Top-performing CDAOs invest in their success by developing skills to thrive in ambiguous circumstances, articulate compelling value stories and identify D&A products and services that can drive business impact.”

New Demands on CDAOs

The survey found that CDAOs are tasked with a broad range of responsibilities, including defining and implementing D&A strategy (60%), oversight of D&A strategy (59%), creating and implementing D&A governance (55%) and managing data-driven culture change (54%).

Furthermore, many D&A functions are receiving increased investment, including data management (65%), data governance (63%) and advanced analytics (60%). The mean reported D&A budget is $5.41 million, and 44% of D&A teams increased in size in the last year.

“The demands being placed upon D&A, as well as increased investment, reflect a growing confidence in CDAOs’ abilities and recognition of the data office as an indispensable business function,” said Medeiros. “However, this leads to more work as pressure grows for D&A to achieve tangible business results.”

Roadblocks to the Success of D&A Initiatives 

Given the scope and complexity of demands being placed on D&A teams, the lack of available talent has quickly become a top impediment to success, as reported by 39% of respondents. The top six roadblocks reported in the survey are all human-related challenges, as shown in the figure below.

[Figure: Top Roadblocks to the Success of D&A Initiatives. Courtesy Gartner.]

To build an effective D&A team, CDAOs must have a robust talent management strategy that goes beyond hiring ready-made talent, says Gartner. This should include education, training and coaching for data-driven culture and data literacy, both within the core D&A team and the broader business and technology communities.

D&A Performance & Business Strategy Alignment

The survey found that 78% of respondents rank corporate or organizational strategy and vision as one of the top three inputs to the D&A strategy. Additionally, 68% are prioritizing D&A initiatives based on alignment to strategic goals.

“CDAOs who prioritize strategy over tactics are the most successful,” said Duncan. “Because the CDAO serves multiple stakeholders across the business, they must align with organizational strategic priorities and focus on selling the D&A vision to the CEO, CIO and CFO as key influencers.”

Orchestrating the Data Symphony: Start with ‘What’ and ‘Why’ Before ‘How’
https://mytechdecisions.com/compliance/orchestrating-the-data-symphony-start-with-what-and-why-before-how/ (Thu, 09 Mar 2023)

Managing data is a lot like conducting an orchestra. Many diverse parts combine to create a symphony, and data analysts are the conductors responsible for achieving it. They do this by harmonizing multiple data points into a single repository, and then extracting insights thoughtfully. Data teams, i.e., data engineers and data analysts, are the core problem solvers who provide the right information at the right time to assist in empowering the C-suite. The resulting insights enable pharmaceutical business executives to plan effectively for the future and course correct for their organizations.

Defining the Data Harmonization Challenge

Data can be extremely powerful. The volume of data has burgeoned over the past decade, presenting both unique challenges and enormous potential for life science companies to generate valuable insights that can enable major healthcare advancements. Data is the key to improving the speed and agility of clinical research—and ultimately uncovering breakthrough treatments and therapies. However, much of this data currently languishes in company silos, and integrating these assets is a significant problem pharmaceutical companies face.

Furthermore, managing information on a global scale while balancing different countries’ regulations and restrictions is a daunting task. Executives who are able to do so have a better view of their operations and can make informed decisions on how best to achieve goals, such as commercializing a new therapy or conducting a clinical trial. Two approaches exist for solving these issues; the challenge lies in finding a middle ground.

Connecting Opposite Ends of the Spectrum

The first approach to harmonization is to clean and organize all the data before starting to generate insights. This involves building a consolidated, organized data structure with governance, quality metrics, stewardship and more to develop business and end-user trust. However, this approach can take considerable time to execute, making it impractical for immediate value generation in the decision-making process.

The second option is to move quickly, using the data in real time or speeding up the organization process, as the market did during the COVID-19 pandemic. This fast-moving approach avoids waiting years to get the data ready for insight generation, but it can compromise data integrity. Ideally, we must move from these two opposite ends of the spectrum toward the center. Pharmaceutical companies must find a path that is not so rigid that the business cannot move forward, yet still maintains compliance without compromising security, overstepping legislative boundaries or putting data assets at risk.


Data teams must partner with internal stakeholders and executives to determine their intent for the information. This begins with identifying what the data can do and why they need this intelligence before considering how they will achieve it.

Taking an ‘Intent’ Approach to Insights

Before deploying data across the company with a goal of generating broader insights, data analysts should start by asking what the business is trying to solve and how the solution will drive innovation for maximum impact, then applying relevant data harmonization principles to generate practical insights.

These questions and the resulting answers help them understand the prospective business impact and the rationale behind generating and collecting the data. This strategy filters out some noise and enables analysts to focus on delivering the insights they need. It minimizes the time, effort, and cost to propagate and treat the data and maximizes its value.

Using Software to Accelerate the Process

Appropriate software enables data teams to use pre-built connectors to pull real-world evidence and data from applications such as CRM programs, marketing software and social media channels. This integrates pre-ready, pre-curated first-, second- and third-party data into the system and accelerates the generation of analytics and insights. The software can then send the insights back to end-user applications accessible by customers, representatives and clinical trial administrators for implementation. However, organizations should be wary of selecting software on hype and feature lists alone. Software should be deployed into the ecosystem with a clear understanding of how it will solve business problems, integrate with other products in the ecosystem and be operationalized.
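As a greatly simplified illustration of that harmonization flow (the source systems, join key and figures are invented for the example):

```python
import pandas as pd

# Hypothetical extracts from pre-built connectors: CRM, marketing, engagement.
crm = pd.DataFrame({"hcp_id": [1, 2], "specialty": ["oncology", "cardiology"]})
marketing = pd.DataFrame({"hcp_id": [1, 2], "emails_opened": [5, 1]})
engagement = pd.DataFrame({"hcp_id": [1, 2], "minutes_engaged": [19, 7]})

# Harmonize on a shared identifier into a single analytical repository.
unified = crm.merge(marketing, on="hcp_id").merge(engagement, on="hcp_id")

# A downstream insight: which prescribers to prioritize within limited time.
print(unified.sort_values("minutes_engaged", ascending=False))
```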

Given today’s rapidly evolving, dynamic healthcare and life science ecosystem, positive outcomes rely on a flawless overall data life cycle to maximize the end-user and patient experience. Data indicates that the average engagement for a prescriber is roughly 19 minutes overall. This means that in just 19 minutes, commercial teams must deliver the exact right messaging in the correct context. Ensuring that messaging resonates with prescribers in a meaningful way requires software that extracts actionable insights from the data at hand.

Looking Ahead to an Exciting Future

Pharmaceutical companies now have a much greater understanding of the power of data. This has led to more structure and better organization in data collection, storage and management. Their focus has now shifted to harmonizing data and generating insights, and technological innovations are making it easier to try new tactics. For example, data analysts now have tools for advanced techniques such as distributed processing, high-volume processing and graph technology. These options represent exciting opportunities for analysts to produce excellent new insights for end users.

In sum, data harmonization is part science and part art, and data analysts are the conductors who pull it all together to make the beautiful music happen.


Avinob Roy is vice president & general manager of product offerings at IQVIA. Durham, N.C.-based IQVIA, formerly Quintiles and IMS Health, Inc., is an American multinational company serving the combined industries of health information technology and clinical research.

Varonis Introduces Least Privilege Automation for Microsoft 365, Google Drive and Box
https://mytechdecisions.com/network-security/varonis-introduces-least-privilege-automation-for-microsoft-365-google-drive-and-box/ (Wed, 18 Jan 2023)

New York-based data security and analytics provider Varonis Systems, Inc. announced a new least privilege automation capability for Microsoft 365, Google Drive and Box. The new capability continuously removes unnecessary data risk without human intervention. The autonomous remediation engine furthers Varonis’ mission to deliver effortless data security outcomes to customers.

According to Varonis, unlike other solutions that take an all-or-nothing approach, its cloud-native platform makes intelligent decisions about who needs access to data and who doesn’t based on usage, data sensitivity and exposure. Organizations can customize remediation policies to fit their security and compliance requirements, and least privilege automation continually enforces them without impacting collaboration.

“When excessive data access goes unchecked, a single compromised user or rogue insider can inflict untold damage on a business,” says Jim Reavis, co-founder and chief executive officer of the Cloud Security Alliance. “Reducing the data blast radius is a top priority for CISOs, but manual remediation isn’t possible with today’s pace of data growth and collaboration.”

The average company’s cloud environment has more than 40 million unique permissions and 157,000 sensitive records exposed to the internet. Least privilege automation ends collaboration risk by removing public and organization-wide exposure created via sharing links and unused entitlements.
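Varonis has not disclosed its engine's internals, but conceptually, automated least-privilege remediation weighs usage against sensitivity and exposure. A generic, hedged sketch (the records, fields and 90-day threshold are invented):

```python
from datetime import datetime, timedelta, timezone

permissions = [  # hypothetical entitlement records
    {"grantee": "org-wide-link", "resource": "Q3-financials.xlsx",
     "sensitive": True, "last_used": None},
    {"grantee": "jdoe", "resource": "team-notes.docx",
     "sensitive": False, "last_used": datetime.now(timezone.utc)},
]

def should_revoke(perm, unused_days=90):
    """Revoke broad or stale access to sensitive data; keep active, needed access."""
    stale = perm["last_used"] is None or (
        datetime.now(timezone.utc) - perm["last_used"] > timedelta(days=unused_days))
    return perm["sensitive"] and stale

for perm in permissions:
    if should_revoke(perm):
        print(f"Revoking {perm['grantee']} -> {perm['resource']}")
```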

“With the launch of our new SaaS platform, our mission is to solve our customers’ critical data security challenges with automation — and that starts by ending excessive data access with the industry’s first fully autonomous remediation engine,” says Varonis Chief Technology Officer David Bass. “We offer the only scalable way to eliminate collaboration risk and continually keep data exposure low across today’s most critical data stores.”
