data security Articles / Blogs / Perficient
Expert Digital Insights

Understanding Clean Rooms: A Comparative Analysis Between Databricks and Snowflake
Fri, 27 Jun 2025

“Clean rooms” have emerged as a pivotal data-sharing innovation, with both Databricks and Snowflake offering enterprise implementations.

Clean rooms are secure environments designed to allow multiple parties to collaborate on data analysis without exposing the sensitive details of the underlying data. They serve as a sandbox where participants can perform computations on shared datasets while keeping raw data isolated and secure. Clean rooms are especially beneficial in scenarios such as cross-company research collaborations, ad measurement in marketing, and secure financial data exchanges.

Uses of Clean Rooms:

  • Data Privacy: Ensures that sensitive information is not revealed while still enabling data analysis.
  • Collaborative Analytics: Allows organizations to combine insights without sharing the actual data, which is vital in sectors like finance, healthcare, and advertising.
  • Regulatory Compliance: Assists in meeting stringent data protection norms such as GDPR and CCPA by maintaining data sovereignty.
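To make the data-privacy point concrete: a clean room typically releases only aggregate results and suppresses groups too small to anonymize. The plain-Python sketch below illustrates that gating logic; the threshold and field names are illustrative assumptions, not tied to either vendor’s API.

```python
# Toy clean-room aggregation gate: raw rows are never released; only
# aggregates over groups meeting a minimum size threshold escape.
# MIN_GROUP_SIZE and the field names are illustrative assumptions.

MIN_GROUP_SIZE = 3  # groups smaller than this are suppressed

def clean_room_aggregate(rows, group_key, value_key):
    """Return {group: (count, total)} for groups of size >= MIN_GROUP_SIZE."""
    groups = {}
    for row in rows:
        count, total = groups.get(row[group_key], (0, 0.0))
        groups[row[group_key]] = (count + 1, total + row[value_key])
    return {g: agg for g, agg in groups.items() if agg[0] >= MIN_GROUP_SIZE}

rows = [
    {"region": "east", "spend": 100.0},
    {"region": "east", "spend": 150.0},
    {"region": "east", "spend": 50.0},
    {"region": "west", "spend": 900.0},  # lone row: suppressed, never exposed
]
result = clean_room_aggregate(rows, "region", "spend")
```

The key design choice is that participants only ever see the filtered aggregate dictionary, never the `rows` list itself.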

Clean Rooms vs. Data Sharing

While clean rooms provide an environment for secure analysis, data sharing typically involves the actual exchange of data between parties. Here are the major differences:

  • Security:
    • Clean Rooms: Offer a higher level of security by allowing analysis without exposing raw data.
    • Data Sharing: Involves sharing of datasets, which requires robust encryption and access management to ensure security.
  • Control:
    • Clean Rooms: Data remains under the control of the originating party, and only aggregated results or specific analyses are shared.
    • Data Sharing: Data consumers can retain and further use shared datasets, often requiring complex agreements on usage.
  • Flexibility:
    • Clean Rooms: Provide flexibility in analytics without the need to copy or transfer data.
    • Data Sharing: Offers more direct access, but less flexibility in data privacy management.

High-Level Comparison: Databricks vs. Snowflake

Implementation

Databricks:
  1. Setup and Configuration:
    • Utilize an existing Databricks workspace
    • Create a new Clean Room environment within the workspace
    • Configure Delta Lake tables for shared data
  2. Data Preparation:
    • Use Databricks’ data engineering capabilities to ETL and anonymize data
    • Leverage Delta Lake for ACID transactions and data versioning
  3. Access Control:
    • Implement fine-grained access controls using Unity Catalog
    • Set up row-level and column-level security
  4. Collaboration:
    • Share Databricks notebooks for collaborative analysis
    • Use MLflow for experiment tracking and model management
  5. Analysis:
    • Utilize Spark for distributed computing
    • Support for SQL, Python, R, and Scala in the same environment

Snowflake:
  1. Setup and Configuration:
    • Set up a separate Snowflake account for the Clean Room
    • Create shared databases and views
  2. Data Preparation:
    • Use Snowflake’s data engineering features or external tools for ETL
    • Load prepared data into Snowflake tables
  3. Access Control:
    • Implement Snowflake’s role-based access control
    • Use secure views and row access policies
  4. Collaboration:
    • Use Snowflake Secure Data Sharing to collaborate across accounts
  5. Analysis:
    • Primarily SQL-based analysis
    • Use Snowpark for more advanced analytics in Python or Java
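Both implementation outlines include a data-preparation step that anonymizes data before it is shared. A minimal, platform-agnostic sketch of pseudonymization in plain Python follows; the salt, field names, and token length are assumptions for illustration, and a real pipeline would rely on platform-native masking and ETL features instead.

```python
# Pseudonymize identifiers before loading rows into a shared clean-room
# table. The salt, field names, and token length are assumptions for
# illustration; real pipelines would use platform-native masking/ETL.
import hashlib

SALT = "per-collaboration-secret"  # assumed shared secret, not a real value

def pseudonymize(record, sensitive_fields):
    out = dict(record)
    for field in sensitive_fields:
        digest = hashlib.sha256((SALT + str(record[field])).encode()).hexdigest()
        out[field] = digest[:16]  # deterministic token replaces the raw value
    return out

row = {"account_id": "ACCT-001", "balance": 2500}
masked = pseudonymize(row, ["account_id"])
```

Because the token is deterministic, the same account hashes to the same value on every load, so collaborators can still join on the pseudonym without ever seeing the raw identifier.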
Business and IT Overhead

Databricks:
  • Lower overhead if already using Databricks for other data tasks
  • Unified platform for data engineering, analytics, and ML
  • May require more specialized skills for advanced Spark operations

Snowflake:
  • Easier setup and management for pure SQL users
  • Less overhead for traditional data warehousing tasks
  • Might need additional tools for complex data preparation and ML workflows
Cost Considerations

Databricks:
  • More flexible pricing based on compute usage
  • Can optimize costs with proper cluster management
  • Potential for higher costs with intensive compute operations

Snowflake:
  • Predictable pricing with credit-based system
  • Separate storage and compute pricing
  • Costs can escalate quickly with heavy query usage
Security and Governance

Databricks:
  • Unity Catalog provides centralized governance across clouds
  • Native integration with Delta Lake for ACID compliance
  • Comprehensive audit logging and lineage tracking

Snowflake:
  • Strong built-in security features
  • Automated data encryption and key rotation
  • Detailed access history and query logging
Data Format and Flexibility

Databricks:
  • Supports various data formats (structured, semi-structured, unstructured)
  • Supports various file formats (Parquet, Iceberg, CSV, JSON, images, etc.)
  • Better suited for large-scale data processing and transformations

Snowflake:
  • Optimized for structured and semi-structured data
  • Excellent performance for SQL queries on large datasets
  • May require additional effort for unstructured data handling
Advanced Analytics, AI and ML

Databricks:
  • Native support for advanced analytics and AI/ML workflows
  • Integrated with popular AI/ML libraries and MLflow
  • Easier to implement end-to-end AI/ML pipelines

Snowflake:
  • Requires additional tools or Snowpark for advanced analytics
  • Integration with external ML platforms needed for comprehensive ML workflows
  • Strengths lie more in data warehousing than in ML operations
Scalability

Databricks:
  • Auto-scaling of compute clusters and serverless compute options
  • Better suited for processing very large datasets and complex computations

Snowflake:
  • Automatic scaling and performance optimization
  • May face limitations with extremely complex analytical workloads
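Several of the access-control items above mention row access policies and row-level security. Conceptually, a row policy is a per-role predicate evaluated against every row a query touches. The toy Python below illustrates the idea; the role names and predicates are invented for illustration and are not either vendor’s API.

```python
# Toy row-level security: each role maps to a predicate evaluated per
# row, and unknown roles see nothing. Role names and predicates are
# invented for illustration; they are not a vendor API.
ROW_POLICIES = {
    "analyst": lambda row: row["region"] == "east",  # region-restricted
    "admin": lambda row: True,                       # unrestricted
}

def secure_query(rows, role):
    predicate = ROW_POLICIES.get(role, lambda row: False)  # deny by default
    return [row for row in rows if predicate(row)]

rows = [{"region": "east", "value": 1}, {"region": "west", "value": 2}]
```

The deny-by-default fallback mirrors how both platforms treat principals without an applicable policy grant: no matching rule means no rows returned.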

Use Case Example: Financial Services Research Collaboration

Consider a research department within a financial services firm that wants to collaborate with other institutions on developing market insights through data analytics. They face a challenge: sharing proprietary and sensitive financial data without compromising security or privacy. Here’s how utilizing a clean room can solve this:

Implementation in Databricks:

  • Integration: By setting up a clean room in Databricks, the research department can securely integrate its datasets with those of other institutions, sharing data insights under precise access controls.
  • Analysis: Researchers from various departments can perform joint analyses on combined datasets without ever directly accessing each other’s raw data.
  • Security and Compliance: Databricks’ security features such as encryption, audit logging, and RBAC will ensure that all collaborations comply with regulatory standards.

Through this setup, the financial services firm’s research department can achieve meaningful collaboration and derive deeper insights from joint analyses, all while maintaining data privacy and adhering to compliance requirements.

By leveraging clean rooms, organizations in highly regulated industries can unlock new opportunities for innovation and data-driven decision-making without the risks associated with traditional data sharing methods.

Conclusion

Both Databricks and Snowflake offer robust solutions for implementing this financial research collaboration use case, but with different strengths and considerations.

Databricks excels in scenarios requiring advanced analytics, machine learning, and flexible data processing, making it well-suited for research departments with diverse analytical needs. It offers a more comprehensive platform for end-to-end data science workflows and is particularly advantageous for organizations already invested in the Databricks ecosystem.

Snowflake, on the other hand, shines in its simplicity and ease of use for traditional data warehousing and SQL-based analytics. Its strong data sharing capabilities and familiar SQL interface make it an attractive option for organizations primarily focused on structured data analysis and those with less complex machine learning requirements.

Regardless of the chosen platform, the implementation of Clean Rooms represents a significant step forward in enabling secure, compliant, and productive data collaboration in the financial sector. As data privacy regulations continue to evolve and the need for cross-institutional research grows, solutions like these will play an increasingly critical role in driving innovation while protecting sensitive information.

Perficient is both a Databricks Elite Partner and a Snowflake Premier Partner. Contact us to learn more about how to empower your teams with the right tools, processes, and training to unlock your data’s full potential across your enterprise.

 

Data Breaches: The Prime Target in Today’s Digital Landscape
Tue, 29 Oct 2024

Data isn’t just an asset—it’s the lifeblood of most organizations. As businesses continue to amass vast amounts of information, the exposure to potential breaches grows exponentially. According to IBM, the global cost of data breaches continues to rise, with the average incident now costing companies $4.88 million in 2024, up 10% from the previous year.

Yet surprisingly, security often remains an afterthought in data management strategies, creating vulnerabilities that can prove costly.

The Growing Threat of Data Breaches

Data breaches represent far more than security incidents—they’re existential threats that can unravel years of carefully built revenue streams and customer trust. As organizations’ digital footprints expand and data volumes grow exponentially, the stakes continue to rise. Companies face mounting pressure to protect sensitive information while maintaining operational efficiency, a balance that becomes increasingly precarious as traditional data-sharing methods evolve and regulatory frameworks grow more complex. The potential impact of these breaches scales with our growing reliance on digital systems, making them one of the most significant risks facing modern organizations.

Why Data is a Prime Target

  1. Value: Personal and corporate data can be sold on the dark web or used for identity theft.

  2. Ransom: Cybercriminals can encrypt data and demand payment for its release.

  3. Competitive Advantage: Stolen intellectual property can give competitors an unfair edge.

  4. Political Motives: State-sponsored attacks may target sensitive government or infrastructure data.

The Challenge of Modern Data Sharing

Many organizations still rely on physical data-sharing methods, creating unnecessary risks in an increasingly complex digital world. Modern approaches offer more sophisticated solutions, including the ability to quickly revoke user access and remove shared data—a crucial feature for modern, enterprise-scale organizations. These modern data-sharing capabilities offer greater security and stronger governance while enabling effective, real-time data sharing. However, companies often struggle with the “how” of implementing these solutions, particularly from a business process perspective.

Building a Security-First Framework

At Perficient, we understand that effective data security isn’t inherited—it’s built through careful planning and regular evaluation. Our approach centers on three key principles:

  1. Tactical Assessment and Planning: We specialize in quickly assessing an organization’s current security posture and developing actionable plans for improvement. No two companies are identical, so this starts with understanding where you are and creating a custom roadmap to where you need to be.
  2. Relationship-Driven Implementation: Success in data security isn’t just about the technology—it’s also about people and processes surrounding it. We work closely with key constituents across your organization, recognizing the industry-specific requirements and regulations that often drive security needs.
  3. Principle of Least Privilege: We advocate for and implement the practice of granting users only the minimum access necessary for their roles, significantly reducing potential exposure points. This principle matters not only when roles and user accounts are first created; it must also be reapplied on a routine basis.
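The principle of least privilege can be sketched as an explicit allow-list with deny-by-default semantics. The role and permission names below are illustrative assumptions, not drawn from any particular access-management product.

```python
# Least privilege as an explicit allow-list: each role is granted only
# the permissions it needs, and anything not granted is denied. Role
# and permission names are illustrative assumptions.
ROLE_GRANTS = {
    "report_viewer": {"read:reports"},
    "data_engineer": {"read:reports", "write:staging"},
}

def is_allowed(role, permission):
    # Deny by default: unknown roles and ungranted permissions both fail.
    return permission in ROLE_GRANTS.get(role, set())
```

Routine reviews then amount to auditing and pruning the grant table, not rewriting access logic.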

Taking Action

Organizations looking to strengthen their data security posture should start by:

  • Evaluating current data sharing processes and identifying potential vulnerabilities
  • Implementing modern data-sharing solutions that offer greater control and visibility
  • Developing clear protocols for access management and regular security assessments
  • Creating industry-specific frameworks that align with regulatory requirements
  • Continuously reevaluating the security posture to realign with business needs
  • Conducting regular cybersecurity training to educate employees about phishing scams, password security, and recognizing suspicious activity. Employees are often the first line of defense against breaches caused by human error.

Moving Forward

As data continues to grow in volume and importance, organizations can’t afford to treat security as an afterthought. By taking a proactive approach to data security and working with experienced partners, businesses can better protect their most valuable asset while maintaining the efficiency they need to compete in today’s market.

Security isn’t a one-time implementation—it’s an ongoing process requiring regular evaluation and adjustment. Every data implementation, regardless of its primary purpose, has security implications that need to be carefully considered. This is why we emphasize the importance of building frameworks that incorporate more secure steps from the ground up.

Remember: data security isn’t just about preventing breaches—it’s about building a foundation for sustainable business success. Learn how Perficient can support your data security needs.

Maximize Your Data Management with Unity Catalog
Fri, 23 Aug 2024

Databricks Unity Catalog is a unified and open governance solution for data and AI, built into the Databricks Data Intelligence Platform.

Unity Catalog offers a comprehensive solution for enhancing data governance, operational efficiency, and technological performance. By centralizing metadata management, access controls, and data lineage tracking, it simplifies compliance, reduces complexity, and improves query performance across diverse data environments. The seamless integration with Delta Lake unlocks advanced technical features like predictive optimization, leading to faster data access and cost savings. Unity Catalog plays a crucial role in machine learning and AI by providing centralized data governance and secure access to consistent, high-quality datasets, enabling data scientists to efficiently manage and access the data they need while ensuring compliance and data integrity throughout the model development lifecycle.

Unity Catalog brings governance to data across your enterprise. Lakehouse Federation capabilities in Unity Catalog allow you to discover, query, and govern data across data platforms including MySQL, PostgreSQL, Amazon Redshift, Snowflake, Azure SQL Database, Azure Synapse, Google’s BigQuery, and more from within Databricks without moving or copying the data, all within a simplified and unified experience. Unity Catalog supports advanced data-sharing capabilities with Delta Sharing, enabling secure, real-time data sharing across organizations and platforms without the need for data duplication. Additionally, Unity Catalog facilitates the creation of secure data Clean Rooms, where multiple parties can collaborate on shared datasets without compromising data privacy. Its support for multi-cloud and multi-region deployments ensures operational flexibility and reduced latency, while robust security features, including fine-grained access controls, automated compliance auditing, and encryption, help future-proof your data infrastructure.
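Unity Catalog organizes assets in a three-level namespace (catalog.schema.table) governed by centralized grants. The toy resolver below only illustrates that idea in plain Python; the catalog, principal, and table names are invented, and this is not the Unity Catalog API.

```python
# Unity Catalog addresses assets with a three-level namespace
# (catalog.schema.table) governed by centralized grants. This toy
# resolver only illustrates the idea; the catalog, principal, and
# table names are invented and this is not the Unity Catalog API.
CATALOGS = {
    "finance": {"silver": {"transactions": ["txn_id", "amount"]}},
}
GRANTS = {("research_team", "finance.silver.transactions"): {"SELECT"}}

def resolve(full_name):
    """Split catalog.schema.table and return the table's columns."""
    catalog, schema, table = full_name.split(".")
    return CATALOGS[catalog][schema][table]

def can_select(principal, full_name):
    return "SELECT" in GRANTS.get((principal, full_name), set())
```

Because both metadata lookup and the grant check go through one central store, governance policies apply uniformly no matter which workspace or engine issues the query.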

These capabilities position your organization for scalable, secure, and efficient data management, driving innovation and maintaining a competitive edge. However, this fundamental transition will need to be implemented with minimal disruption to ongoing operations. This is where the Unity Catalog Migration Tool comes into play.

Unity Catalog Migration Tool

UCX, the Unity Catalog Migration Tool, is an open-source project from Databricks Labs designed to streamline and automate the Unity Catalog migration process. UCX automates much of the work involved in transitioning to Unity Catalog, including migrating metadata, access controls, and governance policies. Migrating metadata ensures the enterprise will have access to its data and AI assets after the transition. In addition to data, the migration tool ensures that security policies and access controls are accurately transferred and enforced in Unity Catalog. This capability is critical for maintaining data security and compliance during and after migration.

Databricks is continually developing UCX to better ensure that all your data assets, governance policies, and security controls are seamlessly transferred to Unity Catalog with minimal disruption to ongoing operations. Tooling and automation help avoid costly downtime or interruptions in data access that could impact business performance, thereby maintaining continuity and productivity. While automating these processes significantly reduces the time, effort, and cost required for migration, the process is not fully automatic: evaluation, planning, quality control, change management, and additional coding and development tasks must be performed alongside, and outside of, the tool. This need for knowledge and expertise is where Unity Catalog migration partners come into play.

Unity Catalog Migration Partner

An experienced Unity Catalog migration partner leads the process of transitioning your data assets, governance policies, and security controls by planning, executing, and managing the migration process, ensuring that it is smooth, efficient, and aligned with your organization’s data governance and security requirements. Their duties typically include assessing the current data environment, designing a custom migration strategy, executing the migration while minimizing downtime and disruptions, and providing post-migration support to optimize Unity Catalog’s features. Additionally, they offer expertise in data governance best practices and technical guidance to enhance your organization’s data management capabilities.

Databricks provides its system integrators with tools, guidance and best practices to ensure a smooth transition to Unity Catalog. Perficient has built upon those valuable resources to enable a more effective pipeline with our Unity Catalog Migration Accelerator.

Unity Catalog Migration Accelerator

Our approach to Unity Catalog migration is differentiated by our proprietary Accelerator, which includes a suite of project management artifacts and comprehensive code and data quality checks. This Accelerator streamlines the migration process by providing a structured framework that ensures all aspects of the migration are meticulously planned, tracked, and executed, reducing the risk of errors and delays. The built-in code and data quality checks automatically identify and resolve potential issues before they become problems, ensuring a seamless transition with minimal impact on business operations. By leveraging our Accelerator, clients benefit from a more efficient migration process, higher data integrity, and enhanced overall data governance, setting us apart from other Unity Catalog migration partners who may not offer such tailored and robust solutions.

In summary, Unity Catalog provides a powerful solution for modernizing data governance, enhancing performance, and supporting advanced data operations like machine learning and AI. With our specialized Unity Catalog migration services and unique Accelerator, we offer a seamless transition that optimizes data management and security while ensuring data quality and operational efficiency. If you’re ready to unlock the full potential of Unity Catalog and take your data infrastructure to the next level, contact us today to learn how we can help you achieve a smooth and successful migration. Contact us for a complimentary Migration Analysis and let’s work together on your data and AI journey!

How Bilahari Appukuttan Nair Enhances Perficient’s Data Integrity and Security Measures
Fri, 12 Jul 2024

Our colleagues at Perficient are incredibly talented, compassionate, and committed to accelerating innovation and making meaningful connections around the world. We recently sat down with Bilahari (Hari) Appukuttan Nair, HCM manager, to discover how he utilizes his Human Resources expertise to drive growth for Perficient and our global teams.  

Located in Bangalore, India, Hari is a key contributor for the coordination of office-wide activities and events, as well as a project leader for Perficient’s Global Compliance Training Program. Every colleague has a responsibility to protect our business, safeguard the integrity of client data, and foster a supportive work environment for all people.

Through annual compliance training, we’re ensuring Perficient remains a supportive and safe workplace. Continue reading to learn about the incredible insight Hari brings to our India team, the difference his contributions are making, and how he exemplifies Perficient’s vision and mission.  

What is your role? Describe a typical day in the life. 

Hari celebrating Holi with colleagues in Perficient’s Bangalore office.

I joined Perficient in May 2022 and have been here for two years. In my role, I am responsible for the Bangalore business unit’s Human Resources functions, ensuring seamless operations and addressing employee needs effectively. I generate comprehensive Management Information System (MIS) reports to support informed decision-making. I plan and execute engagement activities and corporate events to cultivate a positive work environment.

Additionally, I conduct and analyze employee surveys to gain valuable insights and drive continuous improvement. Conducting performance review discussions is an important part of my responsibilities, and I also play a key role in the monthly payroll process. 

Relating to the Global Compliance Training Program, I am responsible for tracking colleague completion status, providing biweekly updates to senior leadership, coordinating efforts with global points of contact, and assisting with any issues related to the compliance training platform, Percipio.  

How do you explain your job to family, friends, or children? 

I help the people who work at my company. I plan events to boost employee morale by solving problems, motivating colleagues, planning and executing reward programs, and providing essential training to new team members.

Also, I track progress and continuously improve initiatives, processes, and policies. Ultimately, my role involves organization, support, and creating a positive environment for everyone to thrive.

 

 

Whether big or small, how do you make a difference for our clients, colleagues, communities, or teams? 

Hari participating in an event supporting Wish Tree.

The goal of my job is to motivate skilled, engaged consultants and technologists to complete great work for our clients. As a compliance training lead, I ensure everyone is equipped with the necessary knowledge of our data security and workplace integrity policies. I help create a positive work environment through engagement activities, prompt issue resolution, and being there for our colleagues. I also organize corporate social responsibility initiatives that contribute to societal well-being.

What are your proudest accomplishments, personally and professionally? Any milestone moments at Perficient? 

I created an HR dashboard for our Bangalore office, which has earned appreciation from Perficient leadership. Following the guidance of our vice president, I expanded this initiative to our Chennai and Hyderabad locations. I am responsible for leading a team that successfully launched Perficient’s 2024 Compliance Training globally. Through a series of engagement initiatives, I have contributed to improving the employee engagement score for our Bangalore location. 

Colleagues at Perficient celebrating Hari’s birthday.

What advice would you give to colleagues who are starting their career with Perficient? 

Embrace learning, build trust, and take ownership. Seek feedback, stay flexible, and always strive for growth. 

How do you shatter boundaries?  

I consistently strive to step out of my comfort zone and exceed expectations in my HR role. I’m proud to have successfully completed the Learning to Lead program, and I must extend my gratitude to the Talent Development team for forging this invaluable opportunity. Additionally, I am familiarizing myself with a few tools that will enhance my contributions further.  

Continuous learning remains a top priority for me, as I firmly believe that challenging the status quo is essential for achieving excellence. I believe this way I can contribute to Perficient’s ongoing success and growth.  

READ MORE: Learn How Perficient is Enabling Colleague Career Development 

Why do you think we obsess over client outcomes?  

I believe the success of our customers is a direct reflection of our own success. We understand client needs and exceed their expectations, which showcases our commitment to excellence.

This client-centric approach not only enhances our reputation and helps us build lasting relationships, but also fosters a culture of continuous improvement and innovation within our organization. This is supported by our thousands of skilled strategists and technologists, ensuring we stay ahead in a competitive market. 

LEARN MORE: Perficient and Our Colleagues Are Enabling Client Success  

How do you forge our future?  

I am proud to be a part of this beautiful team dedicated to fostering an environment where every voice is heard and valued. We actively seek input from our colleagues through gathering quarterly surveys, scheduling meetings to ensure open communication, and organizing engaging events to keep spirits high. 

Celebrating the successes and achievements of our colleagues is a priority. I support leadership with insightful reports and dashboards, and I had the opportunity to take the lead in compliance training to ensure everyone is equipped with the necessary knowledge. Thus, I contribute to creating a more resilient and forward-thinking Perficient with engaged and motivated employees who are well-prepared to navigate future opportunities and challenges. 

SEE MORE PEOPLE OF PERFICIENT 

It’s no secret our success is because of our people. No matter the technology or time zone, our colleagues are committed to delivering innovative, end-to-end digital solutions for the world’s biggest brands, and we bring a collaborative spirit to every interaction. We’re always seeking the best and brightest to work with us. Join our team and experience a culture that challenges, champions, and celebrates our people.  

Visit our Careers page to see career opportunities and more!  

Go inside Life at Perficient and connect with us on LinkedIn, YouTube, Twitter, Facebook, TikTok, and Instagram.

Perficient Listed in Forrester Now Tech: Data Management Service Providers, Q4 2021
Fri, 08 Oct 2021

Data plays an essential role in today’s digital economy and keeping up with modern data management processes is key to staying competitive. In fact, enterprises with advanced data practices are more productive, innovate faster, are able to enter new markets quickly, and are more likely to be directly monetizing their data compared to their less-mature peers. But achieving that level of competency often requires the assistance of a data management service provider.

Partnering with a data management service provider can help your organization:

  • Establish a strategy and operating model for data
  • Build an enterprise data foundation
  • Mature and scale data governance

In the Now Tech: Data Management Service Providers, Q4 2021 report, Forrester defines data management service providers as:

“Service firms that provide talent, technology, and best practices through strategy and deployment partnerships in order to improve an enterprise’s use of data management to drive insights and business results.”

Forrester Now Tech: Data Management Service Providers, Q4 2021 Report

Identifying the right service provider to partner with can help you realize the benefits of data and increase your data competency.

In the report, Forrester segmented vendors based on market presence and functionality. Market presence was determined by data management service revenue and vendors were placed into one of three categories: large established players, midsize players, and small players. Functionality was based on varying capabilities and broken down into four segments: platform providers, data and analytics services, specialized service providers, and system integrators.

Each vendor was asked a number of detailed questions about their services, including geographic presence, industry expertise, and data management experience and expertise. Based on the responses, Forrester supplies information about these service providers to help you determine the best vendor for your data management needs.

Perficient’s Primary Functionality Lies in Specialized Services

Forrester listed Perficient in its midsize category ($100M to $1B in annual category revenue) as a specialized service provider. According to the report, “Specialized service providers concentrate on data and governance foundations. These firms have extensive data engineering, data management, data security, and data governance expertise for data-driven initiatives prioritized by CIOs, chief data officers, and enterprise architects. Engagements focus on data strategy, data architecture, data operations (DataOps), and data governance, helping enterprises transition into insight-driven businesses.”

Perficient’s listing in this Now Tech includes our geographic presence (100% North America); industry focus areas (healthcare and life sciences, financial services, and retail); and sample customers (Novant Health, StorageMart, and United Wholesale Mortgage).

Perficient’s Approach to Data Management

One of the greatest attributes of data is that it becomes more valuable the more you use it. But seeing that value is difficult if you’re not managing your data properly.

As a specialized service provider, we're helping leading companies create actionable business insights based on accurate, scalable, and comprehensive data. We bring the thought leadership, technology expertise, and processes to help our customers become data-driven organizations capable of leveraging data for competitive advantage. We do this through:

  • Taking the time to understand your business
  • Collecting, organizing and managing data from all over your organization
  • Delivering insights via intelligent applications
  • Deploying data and insights to any user via any interface

Learn More about Perficient

We’re ready to help you realize the benefits of data no matter where you are on your journey. Our experience, our technology partnerships, and most importantly our people are what make us a great partner. Visit us on Perficient.com and learn more about how we can help you master the realities of the data-driven world. And listen to the Intelligent Data Podcast where we interview thought leaders on a variety of topics around using data and technology to reshape your business.

You can read the entire Forrester Now Tech: Data Management Service Providers, Q4 2021 report via the Forrester website where it’s available to Forrester subscribers and for purchase.

]]>
https://blogs.perficient.com/2021/10/08/perficient-listed-in-forrester-now-tech-data-management-service-providers-q4-2021/feed/ 0 298645
EU GDPR Compliance – Securing Data in Oracle HCM Cloud https://blogs.perficient.com/2021/06/02/eu-gdpr-compliance-securing-data-in-oracle-hcm-cloud/ https://blogs.perficient.com/2021/06/02/eu-gdpr-compliance-securing-data-in-oracle-hcm-cloud/#respond Wed, 02 Jun 2021 15:00:21 +0000 https://blogs.perficient.com/?p=292774

Does your organization conduct business and have a workforce in the EU? If so, the EU General Data Protection Regulation (GDPR) applies to you.

The EU General Data Protection Regulation (GDPR) is the data privacy and security law that came into effect in May 2018. It requires organizations that gather and process personal data in the EU to follow strict data privacy and security standards. GDPR imposes heavy fines for violations; penalties can run into millions of dollars and damage your organization's reputation.

How Oracle HCM Cloud Can Help With EU GDPR Compliance

Oracle HCM Cloud is an ideal solution for staying in compliance with GDPR while managing your workforce effectively. You can mitigate the risk of data breaches and meet both GDPR's data privacy requirements and its strict data security standards. Implementing the rich and powerful features available within the HCM Cloud product offerings can help you stay compliant.

GDPR requires that access to personal data be limited to only those employees in your organization who need it. Oracle Cloud HCM security profiles provide a mechanism to control and limit access to personal data and its processing and reporting. Security profiles are defined and assigned to specific job roles, which are then assigned to the system users who can view, transact on, and report on personal data. Data can be secured by users' area of responsibility and business unit to further limit access. Security profiles also control access to other HCM data objects such as organizations, payrolls, and positions. Data security preview, diagnostic, and audit tools are available to test and verify that users are correctly configured with their data roles.
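Conceptually, a security profile maps a job role to the subset of person records its holders may see. A minimal sketch of that pattern follows; the role names, business units, and record fields are hypothetical illustrations, not Oracle's actual implementation:

```python
# Illustrative role-based row filtering in the spirit of HCM security
# profiles. All names and data here are made up for the example.
PROFILES = {
    "hr_specialist_us": {"business_units": {"US"}},
    "hr_specialist_emea": {"business_units": {"EMEA"}},
}

PEOPLE = [
    {"name": "Ana", "business_unit": "US", "salary": 90000},
    {"name": "Luc", "business_unit": "EMEA", "salary": 80000},
]

def visible_people(role: str) -> list:
    """Return only the person records the role's security profile permits."""
    allowed = PROFILES[role]["business_units"]
    return [p for p in PEOPLE if p["business_unit"] in allowed]

us_view = visible_people("hr_specialist_us")
```

A user holding only `hr_specialist_us` would see Ana's record but never Luc's, which is the essence of limiting personal-data access by area of responsibility.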

Oracle Risk Management Access Certification and Advanced Controls 

The Oracle Risk Management Access Certification offering, together with Cloud HCM, enables your organization to perform periodic audits of users' access. You can define and set up a certification project within Risk Management to audit all existing roles or only the user-role assignments added since the last audit. Auditors can also receive email reminders and follow links from those emails to complete their audit tasks and stay on track with the audit process.

The Advanced Controls offering enables separation of duties and proactive monitoring of risky user behavior through points of access, mitigating the risk of unwanted transactions and data breaches within the applications. Access models define the risk logic using combinations of user roles and application privileges that could allow undesired personal data processing or transactions. Controls built on those models trigger incidents when an access violation occurs, which auditors can then investigate and resolve within the application, aided by intuitive graphic visualizations. Simulations can also be created to preview the steps needed to resolve access conflicts identified by incidents and to prevent risky role assignments in the future.

Oracle Database Vault and Transparent Data Encryption

GDPR requires that organizations safeguard data with technology such as encryption. Oracle Cloud HCM offers two data protection features, Oracle Database Vault and Transparent Data Encryption (TDE), as part of the Oracle Advanced Data Security option. Oracle Database Vault mitigates the risk of unauthorized behind-the-scenes access by system administrators and enables keystroke auditing to monitor suspicious data access activity. TDE encrypts sensitive personal data on the file system so it cannot be accessed or used wrongfully. The encryption master key is stored within the Oracle Wallet and can be retained with the organization's data protection authorities to comply with the regulation. These technologies can be implemented by subscribing to the Break-Glass service in Oracle Applications Cloud.

With the comprehensive and powerful set of tools and product offerings in Oracle HCM Cloud, your organization can safely store and process personal data while meeting EU GDPR compliance requirements.

Perficient has the skills and expertise to help your organization address the data privacy and security needs of EU GDPR.

]]>
https://blogs.perficient.com/2021/06/02/eu-gdpr-compliance-securing-data-in-oracle-hcm-cloud/feed/ 0 292774
Why Healthcare is Moving to Cloud: Data Security https://blogs.perficient.com/2019/08/13/why-healthcare-is-moving-to-cloud-data-security/ https://blogs.perficient.com/2019/08/13/why-healthcare-is-moving-to-cloud-data-security/#respond Tue, 13 Aug 2019 14:30:33 +0000 https://blogs.perficient.com/?p=243245

The following is the first blog in a series about why healthcare organizations are moving to the cloud.

Gone are the days of healthcare organizations wondering if they need to utilize the cloud. They must now decide how to best utilize it.

Business and tech leaders report that increasing cloud usage is one of their top priorities, and adoption rates in healthcare mirror that trend. In early 2019, HIMSS reported that 39% of IT workloads were deployed in the cloud in healthcare organizations. That number is expected to reach 50% by early next year.

Cloud adoption is prevailing in healthcare for a multitude of reasons. Some are attracted by the broad range of cloud-based offerings, while others are interested in a specific cloud-enabled solution. Those organizations that begin their cloud journey with a specific solution in mind typically end up broadening their scope to enjoy even greater benefits.

In this series, we will look at some of the major reasons for healthcare organizations moving to the cloud, starting with data security in the cloud.

Specialists keep data secure

Data security has been a hot-button topic ever since cloud storage came into prominence, but doubts over public-cloud security are subsiding, and with good reason. Cloud vendors take security extremely seriously because their ability to keep data safe is integral to their business.

Cloud vendors' perimeter surveillance is automated, and it is built and maintained by security architects with extensive cybersecurity expertise to protect against external threats. These same security architects work in tandem across networks and continents to tackle invasive threats. They perform thorough and frequent system audits with vigilance on access controls for both internal and external threats. The skill, capability, and sheer number of these dedicated security specialists exceed what many healthcare organizations can afford to dedicate to security themselves.

Stats from the healthcare industry reflect both the initial concern and improved security that cloud brings. According to HIMSS, 26% of healthcare organizations find cybersecurity concerns restrict their cloud usage. However, only 7% have experienced any security concerns known to be related to the cloud.

This illustrates that the cloud is more secure than many believe, and the security benefits go beyond keeping data safe. Disaster recovery significantly improves with cloud, with multiple off-site data centers ensuring that interrupted functionality can be restored in seconds. A single, isolated event is unlikely to cause a large-scale outage. This is often a vast improvement in disaster recovery compared to on-premises data centers.

For many healthcare organizations, protected health information or personally identifiable information isn’t at stake with the cloud. Many useful cloud-based solutions are utilized to pass basic information to patients, such as provider office locations, phone and contact information, hours of operation, provider credentials and publicly available photos, maps, or clinical patient instruction documents. The benefit of cloud-based applications is they can serve up this key information to patients and consumers in a way that is seamless and intuitive as they navigate through a public website or access digital health applications and tools.

Learn more

Download our guide here and continue to check out our blogs to learn more about why healthcare organizations are moving to the cloud.

]]>
https://blogs.perficient.com/2019/08/13/why-healthcare-is-moving-to-cloud-data-security/feed/ 0 243245
How to Build a Winning Data Platform https://blogs.perficient.com/2019/06/25/build-a-data-platform/ https://blogs.perficient.com/2019/06/25/build-a-data-platform/#respond Tue, 25 Jun 2019 11:00:02 +0000 https://blogs.perficient.com/?p=241032

Recently, at Informatica World 2019, I heard about the importance of a data platform in building AI capabilities for an organization. Interestingly, Informatica, known for products that deliver the “Switzerland of Data”, is now using AI to enhance its own suite of products with CLAIRE capabilities. In exploring a few other articles on the importance of data, I also came across Monica Rogati’s Data Science Hierarchy of Needs and was impressed by the way she relates the AI structure to Maslow’s Hierarchy of Needs.

In a way, the “self-actualization” that Maslow defines as “achieving one’s full potential” is the AI capability. To get there, however, you need a solid data platform foundation. An important distinction between Monica Rogati’s Data Science Hierarchy and my pyramid structure is the assumption that you would use capabilities from software products such as Informatica, which offer GUI-based tooling that lets you spend more time on governance, analysis, and quality and less time writing custom code. Please keep that in mind as you read this article.

Data Platform Model

Data Platform Path

FIND
It’s paramount to identify and clearly define the “use case” the AI team is going after. Without a meaningful use case, building machine learning and automation just for the sake of exploration doesn’t provide any value. Once the use case is defined, find where the data resides, inside the enterprise or outside it (benchmarks, 3rd-party data, etc.).

COLLECT
With the commercial and open-source tools available in the data marketplace, you can quickly build data integrations to collect real-time or batch data into a data lake. Don’t overthink data quality at this point.

UNDERSTAND
Once you collect data into the data lake, understand what you collected by profiling the datasets and mapping them back to your use case. You can also tag your data to give your datasets business context. In addition, take the time to classify the data you collected into categories that make meaningful business sense.
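Profiling can start very simply: per-column null counts and distinct-value counts already reveal a lot about what landed in the lake. A minimal sketch, using made-up rows rather than any particular profiling product:

```python
def profile(rows):
    """Per-column null count and distinct-value count for a small dataset.

    rows: list of dicts sharing the same keys (one dict per record).
    """
    stats = {}
    for col in rows[0].keys():
        values = [r.get(col) for r in rows]
        stats[col] = {
            "nulls": sum(v is None for v in values),
            "distinct": len({v for v in values if v is not None}),
        }
    return stats

sample = [
    {"id": 1, "country": "DE"},
    {"id": 2, "country": None},
    {"id": 3, "country": "DE"},
]
report = profile(sample)
```

Here `report["country"]` shows one null and one distinct value, the kind of signal that tells you whether a column is usable for your use case before you invest in modeling it.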

INTEGRATE & TRANSFORM
Once you tag and classify your datasets, integrate data from multiple sources into one data model that can support your defined use cases. In some cases, this can also be enhancement of your existing data model to support multiple use cases.
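At its core, this integration step is a keyed merge: match records from multiple sources on a shared identifier and combine their attributes into one model. A toy sketch (the source names and fields are hypothetical):

```python
def merge_by_key(primary, enrichment, key):
    """Combine records from two sources into one row per key.

    Fields from the primary (source-of-record) rows win on conflicts;
    enrichment rows contribute any additional attributes.
    """
    index = {r[key]: r for r in enrichment}
    merged = []
    for row in primary:
        combined = dict(index.get(row[key], {}))  # start with enrichment
        combined.update(row)                      # primary fields override
        merged.append(combined)
    return merged

crm = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}]
addresses = [{"id": 1, "verified_address": "1 Main St"}]
model = merge_by_key(crm, addresses, "id")
```

Records with no enrichment match (Globex above) still flow into the model unchanged, which is usually the behavior you want when third-party coverage is partial.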

ENRICH
Integration should also include data enrichment. So many open datasets such as weather, traffic patterns, currency, disaster, health conditions are available for the public to consume. In addition, third party datasets such as Dun & Bradstreet can help validate customer addresses.

SCALE
It’s clear that integrating such large, disparate datasets and building data models from them requires a cloud or on-premises data platform that performs at scale. Use performance tuning and storage/compute techniques that deliver on-time results.

EXPERIENCE
Good-quality data doesn’t mean anything without results presented in a format that each audience level (line staff to executives) can consume. Reporting platforms such as Power BI, Tableau, and MicroStrategy have been market leaders for a reason: their ability to build beautiful visualizations on streaming or large batch datasets. It’s also why large cloud vendors such as Salesforce have been acquiring BI companies like Tableau to enhance their visualization capabilities.

Defining Metrics

One other important factor is to define the metrics and measures clearly to take actions based on facts.

MONITOR
Building the data platform is not a one-time activity. Data, like infrastructure, needs continuous monitoring and improvement based on feedback from business subject matter experts (SMEs), who also act as data SMEs. Therefore, as you build your data platform, use monitoring services and build notifications and alerts based on thresholds driven by business needs. Additionally, you can rate your data based on the relevance of each dataset to your decision-making process. This improves the quality of the data that matters most to the organization and helps prioritize critical datasets over others, much like putting tighter SLAs on important systems and their recovery procedures.
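Threshold-driven alerting of the kind described above can be sketched in a few lines; the metric names and limits here are invented examples, and a real platform would wire this to its monitoring service rather than a dict:

```python
def check_thresholds(metrics, thresholds):
    """Return an alert message for every metric breaching its threshold.

    metrics: current observed values, e.g. {"null_rate": 0.20}
    thresholds: business-driven limits, e.g. {"null_rate": 0.05}
    """
    alerts = []
    for name, limit in thresholds.items():
        value = metrics.get(name)
        if value is not None and value > limit:
            alerts.append(f"ALERT: {name}={value} exceeds {limit}")
    return alerts

alerts = check_thresholds(
    {"null_rate": 0.20, "load_lag_minutes": 5},
    {"null_rate": 0.05, "load_lag_minutes": 30},
)
```

Only the null-rate breach produces an alert here; keeping the thresholds in configuration lets the business adjust them without code changes.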

AI & DEEP LEARN
All of the steps above lead to building machine learning algorithms and automation processes that surface relevant opportunities and directly impact your organization’s bottom line.

While the sequence above manages data throughout the data preparation lifecycle, data security and data governance play a key role in managing the data lifecycle as well. In addition, DevOps brings agility to building the data platform, keeping the business moving and changing as mergers and acquisitions dominate the current landscape.

]]>
https://blogs.perficient.com/2019/06/25/build-a-data-platform/feed/ 0 241032
Oracle Fusion SaaS Security with Oracle Analytics Cloud https://blogs.perficient.com/2018/12/28/oracle-fusion-saas-security-with-oracle-analytics-cloud/ https://blogs.perficient.com/2018/12/28/oracle-fusion-saas-security-with-oracle-analytics-cloud/#comments Fri, 28 Dec 2018 22:56:32 +0000 https://blogs.perficient.com/?p=234386

The question that is often asked is: Can we leverage the same security we already have in Oracle Fusion SaaS (which includes users, duties, roles, and security policies) to secure data in Oracle Analytics Cloud (OAC – an Oracle PaaS)? The answer is yes. To understand how this is possible, keep reading. This blog follows my previous two blog posts about Integrating Oracle SaaS Data into OAC and Creating OAC Data Replication from Oracle SaaS. While the prior posts describe how to load SaaS data into OAC, this one focuses on how to make OAC inherit Oracle Fusion SaaS security, and therefore avoid the hassle of manually maintaining security setups in multiple places.

Before delving into the details, it is important to differentiate between securing Oracle SaaS data that is flowing over to OAC directly through a Data Set Connection vs the Oracle SaaS data that is replicated into an OAC Data Warehouse, through any of the data copying techniques (Data Sync, Data Replication, Data Flows, or other ETL means).

1. OAC Data Set Connection against Oracle SaaS: This approach leverages the OAC Oracle Application Connection Adapter. It allows authenticating with either a shared admin user or an end-user login. Choosing to make end-users log in with their own Oracle Fusion App credentials automatically enforces their Fusion App security roles and data policies on any reporting they do against the Fusion App. Therefore, with a Data Set Connection, no additional configuration is necessary to inherit Fusion App security into OAC, since it all kicks in once an end-user logs in with their Fusion credentials.

2. OAC Data Warehouse Connection: This approach queries a replica of the Fusion App data that has been copied over to a data warehouse. Accordingly, the replicated data requires that object-level and data-level security controls be defined in OAC. Luckily, while this requires a one-time manual configuration, it relies on whatever security role assignments and data policies are set up in the source Fusion App.

The rest of this blog post elaborates on the second type of connection, and how to make OAC inherit Fusion App security against a data warehouse.

I am going to start my explanation by describing how authentication works and then move on to discuss how to setup authorization for both object security as well as data security.

Authentication:

  • OAC Authentication: Depending on how your OAC instance is provisioned, you may be using either the OAC embedded Weblogic server as an Identity Provider or Oracle Identity Cloud Service (IDCS). IDCS foundation is something you already have as part of your OAC subscription (if you have Universal Credits), and you are highly encouraged to start using it, if you haven’t already. You will need to use IDCS as the Identity Provider for OAC to establish Single Sign-On (SSO) with your Fusion App. IDCS is where users, roles, and role assignments are defined. In addition, IDCS serves as a common Identity Provider across multiple Oracle Cloud services, as well as non-Oracle Cloud and on-prem systems that may need to be integrated for user federation and SSO. For the purpose of this blog, the main idea is to enable IDCS to inherit users, roles and role assignments from your Fusion App so they can be shared with OAC.
  • Fusion App Authentication: Ideally, IDCS is configured as the Identity Provider for the Oracle Fusion App as well. However, there is a good chance that this is not the case: many Fusion Cloud App subscribers didn’t have IDCS at the time they provisioned their Oracle SaaS apps and therefore ended up using the built-in Fusion App Identity Provider to manage Fusion user accounts. If this sounds familiar, there is no need to worry. Below I elaborate on how the inheritance of security setups from Fusion Apps to OAC is possible in both scenarios.

Authorization: There are 2 different levels of authorizations that need to be configured: Object Level and Data Level Security.

  1. Object Level Security: This defines what Catalog objects (dashboards, reports, data visualization projects, etc…) and what subject areas and data models an OAC user has access to, and with what type of permission (such as read-only or editable). To seamlessly make OAC objects secured per Fusion App security, we first identify which Fusion App roles we want to use for the purpose of setting up OAC object permissions. For example, if the Fusion App is HCM, you may want to inherit the HR Specialist, HR Analyst and Payroll Manager roles. Users who have these roles in Fusion will automatically be granted access to corresponding objects in OAC. Such a configuration is a great time saver from a maintenance perspective on the analytics side. Making OAC inherit Fusion App roles and role assignments relies on making IDCS serve as a bridge between Fusion Apps and OAC. This integration looks a little different depending on whether you are using IDCS or the Fusion App as the Identity Provider for the Fusion App. Here is how things work in both of these scenarios:
    • Scenario 1: IDCS is an Identity Provider to OAC only, while the Fusion App uses its built-in Identity Provider for user management. In this scenario, IDCS is configured to act as a Service Provider for the Fusion App (in other words, the Fusion App is the Identity Provider for IDCS). Passwords continue to be stored and maintained in the Fusion App. Users, roles, and role-to-user assignments are all defined in the Fusion App and then synchronized over to IDCS. New creations, updates, and inactivations of Fusion App users flow through automatically into IDCS and OAC. This automatic synchronization from Fusion Apps to IDCS happens through Oracle Enterprise Scheduler (ESS) jobs. More details about setting up the synchronization are available in this Oracle doc.
    • Scenario 2: IDCS as Identity Provider to both OAC and the Fusion App. In this case, user accounts and their passwords are stored and maintained in IDCS. Users may be defined in either IDCS or the Fusion App and synced to the other side. However, roles and role assignments are always defined in the Fusion App, as usual, and synchronized over to IDCS as in Scenario 1. In a nutshell, whichever scenario applies, Fusion App administrators continue maintaining security the same way they did prior to activating OAC; there is no ongoing maintenance overhead on their part.
  2. Data Level Security: This is what tells OAC which user has access to what subsets of the data, for example restricting access to information based on position, supervisor hierarchy, or an HCM payroll list. As with securing OAC objects, it is highly advisable to tie OAC data-level security to the Fusion App data security policies: invest the time upfront in a one-time setup and avoid the hassle of dual, complicated maintenance. First identify the data security objects to secure from the Fusion app (such as by location or by business unit). Fusion Data Roles combine a user’s job with the data the user accesses (for example, a country-level HR Specialist) and are defined in Fusion App security profiles. So we need to make IDCS, and accordingly OAC, inherit the Fusion Data Roles and apply security filters on those roles in the OAC Data Model. For setting up data security in OAC, we need to be aware of the Fusion Public View Objects (PVOs) that provide the user-permitted security data object identifiers (such as the list of departments a logged-in user has access to). Once the Fusion source is identified, we form extraction SQL to load a data-warehouse-side security metadata table inherited from the Fusion App setup. After the warehouse security table is loaded, we define OAC Session Variables to query it. (Note that unlike OBIEE on-prem, OAC doesn’t support a direct connection to Fusion SaaS ADF PVOs from the OAC Data Model, hence the security session variable initialization blocks need to be defined against data warehouse tables. Refer to this Oracle doc to see if the OAC Data Model supports direct connection to Oracle Applications in later updates.) When initially applying security filters on the inherited data roles in OAC, we mimic the security policies defined in the Fusion App.
Note that securing OAC Application Roles by applying data level security filters to OAC Subject Areas may be done either in the OAC Thin Data Modeler or the OAC Client Repository Administration Tool. For more complex data-level security across several Application Roles, the Client Admin Tool offers a better way of defining such filters.
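The warehouse-side pattern can be illustrated with plain SQL: a security metadata table holds the user-to-department grants extracted from the Fusion app, and every query against the fact data is joined through it. The sketch below uses SQLite and invented table and column names purely to show the shape of the filter, not Oracle's actual schema or the OAC session-variable syntax:

```python
import sqlite3

# Hypothetical warehouse tables: sec_user_dept is the security metadata
# table loaded from the Fusion app; payroll stands in for a fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sec_user_dept (user_name TEXT, dept_id INTEGER);
    CREATE TABLE payroll (dept_id INTEGER, amount REAL);
    INSERT INTO sec_user_dept VALUES ('jsmith', 10), ('jsmith', 20);
    INSERT INTO payroll VALUES (10, 100.0), (20, 200.0), (30, 300.0);
""")

def payroll_for(user):
    # Mimics a session-variable-style lookup: the join against the
    # security table restricts rows to the user's permitted departments.
    return conn.execute("""
        SELECT p.dept_id, p.amount
        FROM payroll p
        JOIN sec_user_dept s ON s.dept_id = p.dept_id
        WHERE s.user_name = ?
        ORDER BY p.dept_id
    """, (user,)).fetchall()

rows = payroll_for("jsmith")
```

Department 30 never appears in jsmith's results because no grant row exists for it, which is exactly the effect a data filter on an inherited data role produces in the subject area.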

To conclude, integrating Oracle Fusion SaaS Security into OAC is an essential part of a successful Oracle Analytics implementation. Performing a comprehensive security integration with SaaS that covers the various layers including users, objects and data is crucial. The success of the implementation is determined by how secure corporate data is and how feasible it is to avoid the maintenance overhead that would have been necessary without a well-planned and integrated security solution for Oracle SaaS and PaaS.

]]>
https://blogs.perficient.com/2018/12/28/oracle-fusion-saas-security-with-oracle-analytics-cloud/feed/ 2 234386
Is the Scene Safe? https://blogs.perficient.com/2018/09/18/is-the-scene-safe/ https://blogs.perficient.com/2018/09/18/is-the-scene-safe/#respond Tue, 18 Sep 2018 14:00:51 +0000 https://blogs.perficient.com/?p=230304

Being a paramedic during the formative years of my working life, I’ve been surprised at how many of the lessons that I learned on the job have translated to the business world.


EMS can be a dangerous profession; more EMS workers are killed per year than firefighters.[1] One study showed that 2/3 reported some form of abuse on the job in the previous year.[2] In my time in the field, I was hurt several times, got stitches once, was in a number of physical altercations with people that were abusing alcohol or drugs, and pulled weapons off of several patients. I had friends that were hurt in ambulance wrecks and associates killed in helicopter crashes.


Ingrained from the first day of EMT school is scene safety. In practical exams, proctors will fail you if you fail to address it in your initial approach to the scenario. On the streets it’s considered poor form to strain the system by taking your rig out of service, especially if you’re adding to the patient count on a scene. While we couldn’t anticipate every threat, situational awareness combined with active threat mitigation meant that most of the time we were ready for what was waiting for us.


In business, safety should also be front-of-mind. In addition to physical safety concerns, IT security is a consistent failure point, and lessons from others’ failures seem to be lost rather than learned from. Even at an average cost of over $7M per data breach in 2017, we see continued failure to invest in the mechanisms necessary to create a secure data environment.[3] Whether it’s outdated, unpatched, or inadequate infrastructure; policy design or enforcement failures; or the continuing challenge of employee irresponsibility, inadequate data security will continue to be an existential risk for many businesses. In fact, 60% of small- and medium-sized businesses (the primary target of attacks) that suffer a cybersecurity attack fail within six months.[4]


We the employees are the weakest brick in the cybersecurity wall; from front-line workers through the C-suite, we all contribute to an insecure environment. 95% of incidents result from mistakes made by people with system access.[5] Poor training is clearly part of the issue: people are still clicking links they don’t recognize, opening PDFs and other files from people they don’t know, and providing data over the phone. Something as simple as password management is still a major problem, with 80% of breaches resulting from password issues.[6] However, there is evidence that the problem stems more from policies that drive password-defeating behavior, like post-its on monitors, than from actual password strength issues.[7] The major breaches have resulted from phishing attacks, not password breaches.[8]


If business leadership wants to prevent their business from being the next victim in the hurricane of data and security breaches, dedication and investment in security must be a committed focus from the C-suite. When security is left to a position of unfunded lip service, a costly and potentially business-ending breach may be more an eventuality than a risk.

[1] https://www.jems.com/articles/print/volume-36/issue-11/health-and-safety/studies-show-dangers-working-ems.html?c=1

[2] https://io9.gizmodo.com/5872364/the-hidden-dangers-of-being-a-paramedic

[3] https://www.businessinsider.com/sc/data-breaches-cost-us-businesses-7-million-2017-4

[4] https://www.inc.com/thomas-koulopoulos/the-biggest-risk-to-your-business-cant-be-eliminated-heres-how-you-can-survive-i.html

[5] https://hbr.org/2015/07/why-cybersecurity-is-so-difficult-to-get-right

[6] https://www2.trustwave.com/GlobalSecurityReport.html

[7] https://arstechnica.com/information-technology/2011/10/when-passwords-attack-the-problem-with-aggressive-password-policies/

[8] ibid

]]>
https://blogs.perficient.com/2018/09/18/is-the-scene-safe/feed/ 0 230304
GDPR Implications: Email & Marketing Automation https://blogs.perficient.com/2018/05/22/gdpr-implications-email-marketing-automation/ https://blogs.perficient.com/2018/05/22/gdpr-implications-email-marketing-automation/#respond Tue, 22 May 2018 19:00:10 +0000 https://blogs.perficient.com/perficientdigital/?p=15536

As the General Data Protection Regulation (GDPR) deadline approaches, we’re seeing two general trends: 1) marketers are moving away from their reliance on third-party data, and 2) upcoming changes to European data privacy rules are restricting the use of cookies, meaning that companies will face challenges in tracking prospective customers across the web. Because GDPR is essentially a set of sweeping rules that govern the handling of European Union users’ data no matter where those users are located, it has implications for North American and global companies with European customers and universal global customer databases used for ongoing marketing communications.

GDPR Data Elements – First Party Data

Every business uses some form of customer data, and it is usually first-party, meaning that the company has collected and stored the data itself. This first-party data is then typically used to manage ongoing communications with current customers. This is all well and good, but when companies want to find new customers, they must use third-party cookies to both target and then retarget prospective customers across multiple touch points or channels. The reason for this is that marketers typically wanted to buy categories of potential buyers based on attributes like socioeconomic status, age, income, role/function, persona, etc.
A second reason third-party data became important was that different ad systems used it to create customer/prospect models, relying on cookie sharing to build cookie pools of customers to target and market to. Given the anonymous cookie-pooling approach to data we’ve seen in the marketplace, the future will be about further protecting consumer data in ways that keep marketers flexible. On this point, first-party data satisfies more GDPR requirements and will be one of the key reasons companies reconsider the primary use of third-party data in their data management strategies.
We are now seeing a definite shift in brands’ omnichannel marketing models as they turn away from third-party data amid rising consumer concerns about data privacy. We expect this shift to continue as companies are forced to rethink their brand engagement strategies and marketing channel mix and begin to look for more direct engagement tools to reach their customers. This, in turn, will change how data management platform providers handle third-party data and shift their models to meet this trend. It will have a negative impact on smaller third-party ad networks and will further elevate the Facebook/Google data duopoly within the overall ad network ecosystem. Organic email opt-in and universal management of user preference centers will evolve, and email marketing tactics like email list acquisition and brokering, email appending services, and other non-organic email acquisition techniques will change.
It is our view that GDPR is already playing a critical role in how global brands think about and collect data on their customers, and we fully expect this trend to continue as similar regulations are put in place in the US market in the coming years.

GDPR Data Elements – Third Party Data

All of this said, most experts agree that third-party data will remain in some form as part of marketers’ future customer data strategies; uses such as data append aren’t going anywhere. But now, with GDPR, the recent Facebook data issues, and the general climate around customer data privacy, ownership, and control, companies will be in search of alternatives to cookie sharing, and the most likely solutions may lie in enhanced consent policies and expanded use of first-party data.

]]>
Preparing for GDPR https://blogs.perficient.com/2018/05/22/preparing-for-gdpr/ https://blogs.perficient.com/2018/05/22/preparing-for-gdpr/#respond Tue, 22 May 2018 14:32:27 +0000 https://blogs.perficient.com/perficientdigital/?p=15534

With the General Data Protection Regulation (GDPR) set to go into effect on May 25th, 2018, we wanted to run through some highlights for those who haven’t already spent many months knee-deep in research and preparation.
So what is GDPR? In a nutshell, GDPR is an effort to standardize data protection regulations across all EU member states. One of the chief goals is to ease compliance by establishing a single rule of law, which is definitely preferable to having 28 different sets of regulations. The main focus is to protect citizens’ personal data, and to do so, the GDPR addresses data security, transparency, accountability, and user consent.
A key principle underlying the regulation is the assumption that the consumer (“data subject”) owns and should have full control over their personal data. Any entity collecting, storing, or processing this data may do so only with the consent of the data subject.  It is no longer sufficient to simply provide an “opt-out” option. The days of simply placing text like “by using this site, you agree…” in a website’s terms of service are ending. To achieve GDPR compliance, the data subject must grant consent by making “a statement or by a clear affirmative action.”
When requesting consent, the purpose(s) for the data collection and processing must be clearly communicated. Personal data should only be collected for “specified, explicit and legitimate purposes and not further processed in a way incompatible with those purposes.”
The first thing you need to determine is whether or not your organization is considered “in-scope” with regard to the GDPR. Article 3 and Recital 23 of the GDPR provide guidance on this question. Like many aspects of the GDPR, these provisions are the subject of considerable debate. Only your organization, with legal counsel, can decide whether this applies to your business. The most important thing to understand is that you are not automatically exempt because of your business’s physical location.
Even if your legal team has determined that no immediate action is required for your organization to comply with GDPR, it would be prudent to take this opportunity to get your data collection and processing in order. It’s safe to assume that the future will bring similar regulations in other jurisdictions and that these regulations will resemble GDPR. Completing some prep work now will put you in a better position to handle compliance moving forward.
We’ve covered just some of the large aspects of GDPR. Here are some other things GDPR does:

  • Allows data subjects to request any personal data that has been collected about them
  • Allows data subjects to withdraw their consent, thus requiring deletion of their personal data
  • Requires that data breaches be reported to a supervisory authority
  • Requires the appointment of a Data Protection Officer within certain organizations
  • Prohibits making decisions about the data subject “based solely on automated processing, including profiling” without explicit consent to do so
  • Prohibits making a service conditional upon consent, unless the processing is necessary for providing your service.
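The consent-related obligations above (explicit purposes, withdrawal, purpose limitation) can be sketched as a minimal consent-record model. All names here are hypothetical illustrations, not any particular product’s API; a real implementation would also need audit logging, secure storage, and deletion workflows.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional, Tuple

@dataclass
class ConsentRecord:
    """One affirmative grant of consent, tied to explicitly stated purposes."""
    subject_id: str
    purposes: Tuple[str, ...]        # e.g. ("order_fulfillment", "newsletter")
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

    def is_active(self) -> bool:
        return self.withdrawn_at is None

    def withdraw(self) -> None:
        # Withdrawal triggers the duty to delete the subject's personal data.
        self.withdrawn_at = datetime.now(timezone.utc)

    def permits(self, purpose: str) -> bool:
        # Data may only be processed for purposes the subject agreed to.
        return self.is_active() and purpose in self.purposes

record = ConsentRecord("subject-42", ("newsletter",), datetime.now(timezone.utc))
assert record.permits("newsletter")
assert not record.permits("profiling")   # profiling requires its own explicit consent
record.withdraw()
assert not record.permits("newsletter")  # consent withdrawn, processing must stop
```

The key design point is that consent is granted per purpose and defaults to denied, mirroring the regulation’s opt-in (rather than opt-out) posture.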

Regardless of your immediate next steps, we recommend thinking about and discussing the following with your legal team as you evolve your digital services in the future:

  • For all data being collected by your organization, understand:
    • With whom it is being shared, and why?
    • How it is being processed, and why?
    • How it is being stored, and why?
  • Are your privacy policy and consent-gathering process updated to ensure that these details are clearly communicated?

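One way to organize the answers to these questions is a simple data inventory that records, for each collected element, who it is shared with, how it is processed, and where it is stored, each with a stated reason. The structure and field names below are illustrative only.

```python
# Hypothetical data inventory: each personal-data element maps its
# sharing, processing, and storage uses to a documented reason.
data_inventory = {
    "email_address": {
        "shared_with": {"email_service_provider": "campaign delivery"},
        "processing": {"segmentation": "relevant offers"},
        "storage": {"crm_database": "customer communications"},
    },
    "purchase_history": {
        "shared_with": {},
        "processing": {"recommendations": "personalization"},
        "storage": {"order_database": "order fulfillment and returns"},
    },
}

def undocumented_elements(inventory):
    """Flag elements where any sharing/processing/storage use lacks a stated reason."""
    return [
        name
        for name, uses in inventory.items()
        if any(not reason for category in uses.values() for reason in category.values())
    ]

print(undocumented_elements(data_inventory))  # prints []
```

An inventory like this makes gaps visible: any use without a documented reason is a candidate for either justification or removal before regulators (or customers) ask.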
Thinking about and understanding the answers to these questions will aid in preparation around future data protection regulation needs.
Disclaimer: The content above is provided for informational purposes only. The information shared here is not meant to serve as legal advice. You should work closely with legal and other professional counsel to determine exactly how the GDPR may or may not apply to you. There are still several aspects and definitions within the regulations that are open to interpretation. For this reason, it is crucial that your organization evaluate its data programs as appropriate.

]]>