Salesforce Articles / Blogs / Perficient - Expert Digital Insights
https://blogs.perficient.com/category/partners/salesforce/

Sales Cloud to Data Cloud with No Code!
https://blogs.perficient.com/2025/01/31/sales-cloud-to-data-cloud-with-no-code/
Fri, 31 Jan 2025

Salesforce has been giving us a ‘No Code’ way to have Data Cloud notify Sales Cloud of changes through Data Actions and Flows.   But did you know you can go the other direction too?

The Data Cloud Ingestion API allows us to set up a ‘No Code’ way of sending changes in Sales Cloud to Data Cloud.

Why would you want to do this with the Ingestion API?

  1. You are right that we could surely set up a ‘normal’ Salesforce CRM Data Stream to pull data from Sales Cloud into Data Cloud.  This is also a ‘No Code’ way to integrate the two.  But maybe you want to do some complex filtering or logic before sending the data on to Data Cloud, and that is where a Flow can really help.
  2. CRM Data Streams only run on a schedule, as frequently as every 10 minutes.  With the Ingestion API we can send data to Data Cloud immediately; we just have to wait for the Ingestion API to process that specific request.  The current processing time is about 3 minutes, though I have seen it run faster at times.  It is not ‘real-time’, so do not use this for ‘real-time’ use cases, but it is faster than CRM Data Streams for incremental, smaller syncs that need better control.
  3. You could also ingest data into Data Cloud easily through an Amazon S3 bucket.  But again, here we have data in Sales Cloud that we want to get to Data Cloud with no code.
  4. We can do very cool integrations by leveraging the Ingestion API outside of Salesforce like in this video, but we want a way to use Flows (No Code!) to send data to Data Cloud.

Use Case:

You have Sales Cloud, Data Cloud and Marketing Cloud Engagement.  As a Marketing Campaign Manager you want to send an email through Marketing Cloud Engagement when a Lead fills out a certain form.

You only want to send the email if the Lead is from a certain state like ‘Minnesota’ and that Email address has ordered a certain product in the past.  The historical product data lives in Data Cloud only.  This email could come out a few minutes later and does not need to be real-time.

Solution A:

If you need to do this in near real-time, I would suggest not using the Ingestion API.  Instead, we can query the Data Cloud product data in a Flow and then update the Lead or another record in a way that triggers a ‘Journey Builder Salesforce Data Event’ in Marketing Cloud Engagement.

Solution B:

But our requirements above do not call for real-time, so let’s solve this with the Ingestion API.  Because the data lands in Data Cloud, the Salesforce Data Action gives us more power to reference other Data Cloud data, so the Flow does not need ‘Get Records’ for every data lookup.

We can build an Ingestion API Data Stream that we can use in a Salesforce Flow.  The flow can check to make sure that the Lead is from a certain state like ‘Minnesota’.  The Ingestion API can be triggered from within the flow.  Once the data lands in the DMO object in Data Cloud we can then use a ‘Data Action’ to listen for that data change, check if that Lead has purchased a certain product before and then use a ‘Data Action Target’ to push to a Journey in Marketing Cloud Engagement.  All that should occur within a couple of minutes.

Sales Cloud to Data Cloud with No Code!  Let’s do this!

Here is the base Salesforce post sharing that this is possible through Flows, but let’s go deeper for you!

The following are those deeper steps for getting the data from Sales Cloud to Data Cloud.  In my screenshots you will see data moving from a VIN (Vehicle Identification Number) custom object to a VIN DLO/DMO in Data Cloud, but the same process could be used for our ‘Lead’ use case above.

  1. Create a YAML file that we will use to define the fields in the Data Lake Object (DLO).  I put an example YAML structure at the bottom of this post.
  2. Go to Setup, Data Cloud, External Integrations, Ingestion API.   Click on ‘New’
    Newingestionapi

    1. Give your new Ingestion API Source a Name.  Click on Save.
      Newingestionapiname
    2. In the Schema section click on the ‘Upload Files’ link to upload your YAML file.
      Newingestionapischema
    3. You will see a screen to preview your Schema.  Click on Save.
      Newingestionapischemapreview
    4. After that is complete you will see your new Schema Object
      Newingestionapischemadone
    5. Note that at this point there is no Data Lake Object created yet.
  3. Create a new ‘Ingestion API’ Data Stream.  Go to the ‘Data Streams’ tab and click on ‘New’.   Click on the ‘Ingestion API’ box and click on ‘Next’.
    Ingestionapipic

    1. Select the Ingestion API that was created in Step 2 above.  Select the Schema object that is associated to it.  Click Next.
      Newingestionapidsnew
    2. Configure your new Data Lake Object by setting the Category, Primary Key and Record Modified Fields
      Newingestionapidsnewdlo
    3. Set any Filters you want with the ‘Set Filters’ link and click on ‘Deploy’ to create your new Data Stream and the associated Data Lake Object.
      Newingestionapidsnewdeploy
    4. If you want to also create a Data Model Object (DMO) you can do that and then use the ‘Review’ button in the ‘Data Mapping’ section on the Data Stream detail page to do that mapping.  You do need a DMO to use the ‘Data Action’ feature in Data Cloud.
  4. Now we are ready to use this new Ingestion API Source in our Flow!  Yeah!
  5. Create a new ‘Start from Scratch’, ‘Record-Triggered Flow’ on the Standard or Custom object you want to use to send data to Data Cloud.
  6. Configure an Asynchronous path.  We cannot connect to this ‘Ingestion API’ from the ‘Run Immediately’ part of the Flow because this Action will be making an API call to Data Cloud.  This is similar to how we have to use a ‘future’ call with an Apex trigger.
    Newingestionapiflowasync
  7. Once you have configured your base Flow, add the ‘Action’ to the ‘Run Asynchronously’ part of the Flow.    Select the ‘Send to Data Cloud’ Action and then map your fields to the Ingestion API inputs that are available for that ‘Ingestion API’ Data Stream you created.
    Newingestionapiflowasync2
  8. Save and Activate your Flow.
  9. To test, update your record in a way that will trigger your Flow to run.
  10. Go into Data Cloud and see your data has made it there by using the ‘Data Explorer’ tab.
  11. The standard Salesforce Debug Logs will show the details of your Flow steps if you need to troubleshoot something.

Congrats!

You have sent data from Sales Cloud to Data Cloud with ‘No Code’ using the Ingestion API!

Setting up the Data Action and connecting to Marketing Cloud Journey Builder is documented here to round out the use case.

Here is the base Ingestion API Documentation.

At Perficient we have experts in Sales Cloud, Data Cloud and Marketing Cloud Engagement.  Please reach out and let’s work together to reach your business goals on these platforms and others.

Example YAML Structure:


openapi: 3.0.3
components:
  schemas:
    VIN_DC:
      type: object
      properties:
        VIN_Number:
          type: string
        Description:
          type: string
        Make:
          type: string
        Model:
          type: string
        Year:
          type: number
        created:
          type: string
          format: date-time
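
For reference, a record pushed to this schema through the Ingestion API's streaming endpoint is wrapped in a 'data' array in the request body. The values below are made-up sample data, and you should confirm the exact request format against the Ingestion API documentation linked above. When you use the 'Send to Data Cloud' Flow action, the payload is built for you, so the no-code promise still holds.

{
  "data": [
    {
      "VIN_Number": "1HGCM82633A004352",
      "Description": "Sample vehicle record",
      "Make": "Honda",
      "Model": "Accord",
      "Year": 2003,
      "created": "2025-01-31T18:00:00.000Z"
    }
  ]
}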

Is it really DeepSeek FTW?
https://blogs.perficient.com/2025/01/30/is-it-really-deepseek-ftw/
Thu, 30 Jan 2025

So, DeepSeek just dropped their latest AI models, and while it’s exciting, there are some cautions to consider. Because of the US export controls around advanced hardware, DeepSeek has been operating under a set of unique constraints that have forced them to get creative in their approach. This creativity seems to have yielded real progress in reducing the amount of hardware required for training high-end models in reasonable timeframes and for inferencing off those same models. If reality bears out the claims, this could be a sea change in the monetary and environmental costs of training and hosting LLMs.

In addition to the increased efficiency, DeepSeek’s R1 model is continuing to swell the innovation curve around reasoning models. Models that follow this emerging chain of thought paradigm in their responses, providing an explanation of their thinking first and then summarizing into an answer, are providing a step change in response quality. Especially when paired with RAG and a library of tools or actions in an agentic framework, baking this emerging pattern into the models instead of including it in the prompt is a serious innovation. We’re going to see even more open-source model vendors follow OpenAI and DeepSeek in this.

Key Considerations

One of the key factors in considering the adoption of DeepSeek models will be data residency requirements for your business. For now, self-managed private hosting is the only option for maintaining full US, EU, or UK data residency with these new DeepSeek models (the most common needs for our clients). The same export restrictions limiting the hardware available to DeepSeek have also prevented OpenAI from offering their full services with comprehensive Chinese data residency. This makes DeepSeek a compelling offering for businesses needing an option within China. It’s yet to be seen if the hyperscalers or other providers will offer DeepSeek models on their platforms (before I managed to get this published, Microsoft made a move and is now offering DeepSeek-R1 in Azure AI Foundry).  The good news is that the models are highly efficient, and self-managed hosting is feasible and not overly expensive for inferencing with these models. The downside is managing provisioned capacity when workloads can be uneven, which is why pay-per-token models are often the most cost-efficient.

We are expecting that these new models and the reduced prices associated with them will have serious downward pressure on per-token costs for other models hosted by the hyperscalers. We’ll be paying specific attention to Microsoft as they are continuing to diversify their offerings beyond OpenAI, especially with their decision to make DeepSeek-R1 available. We also expect to see US-based firms replicate DeepSeek’s successes, especially given that Hugging Face has already started work within their Open R1 project to take the research behind DeepSeek’s announcements and make it fully open source.

What to Do Now

This is a definite leap forward and progress in the direction of what we have long said is the destination—more and smaller models targeted at specific use cases. For now, when looking at our clients, we advise a healthy dose of “wait and see.” As has been the case for the last three years, this technology is evolving rapidly, and we expect there to be further developments in the near future from other vendors. Our perpetual reminder to our clients is that security and privacy always outweigh marginal cost savings in the long run.

The comprehensive FAQ from Stratechery is a great resource for more information.

Security and Privacy in Experience Cloud: Best Practices for Protecting Customer Data
https://blogs.perficient.com/2025/01/30/security-and-privacy/
Thu, 30 Jan 2025

Why Security and Privacy Matter

Security and privacy are not just best practices but legal requirements when handling customer data. Regulations like the General Data Protection Regulation (GDPR) in the EU, the California Consumer Privacy Act (CCPA) in California, and other global privacy laws mandate that businesses protect the privacy of individuals and ensure their data is stored and processed securely.

Failure to comply with these regulations can result in hefty fines, legal consequences, and damage to your reputation. On top of that, data breaches can undermine customer trust and loyalty.

Best Practices for Security and Privacy in Experience Cloud

Here are some best practices that businesses should follow to protect customer data in Salesforce Experience Cloud:

1. Use Strong Authentication Methods

One of the first lines of defense in protecting sensitive data is authentication. You must ensure that only authorized users can access the Experience Cloud portals.

  • Multi-Factor Authentication (MFA): MFA adds an extra layer of security by requiring users to verify their identity using more than just a password. This could include a one-time passcode sent to their mobile device, biometric scans, or security questions.
  • Single Sign-On (SSO): Implementing SSO allows users to log in once and access multiple systems without repeatedly entering their credentials. This reduces the chances of credentials being stolen or misused.
  • User Roles and Permissions: Assign clear roles and permissions to ensure that only the right people can access sensitive data. For example, a customer may only need access to their account information, while a support agent may require access to a broader data set.

2. Data Encryption

Encryption ensures that customer data is unreadable to unauthorized individuals, even if it is intercepted during transmission or if someone gains unauthorized access to your systems.

  • Encryption at Rest: This protects data stored on Salesforce servers. Experience Cloud automatically encrypts sensitive data at rest, so your data is secure even if someone gains access to the database.
  • Encryption in Transit: Ensure that data is encrypted during transmission (for example, when a customer submits information through your portal). Use secure protocols like HTTPS and TLS to protect the data as it travels across the internet.

3. Implement Secure Data Sharing

Salesforce Experience Cloud often involves collaboration between different users, such as customers, partners, and employees. However, not all users should have access to all data. Proper data-sharing rules help ensure that users only access the data they need to perform their tasks.

  • Sharing Settings: Customize sharing settings to define who can see specific records. For example, a customer should only see their own support cases and not those of others.
  • Record-Level Security: Use Salesforce’s built-in record-level security features like profiles and permission sets to control what data users can access and edit.
  • Public Link Restrictions: Be cautious when sharing public links to Experience Cloud resources. Use these links sparingly and apply restrictions where possible, especially when sensitive data is involved.

4. Compliance with Privacy Regulations

Ensuring compliance with data privacy laws is essential not just for protecting customer data but also to avoid legal risks. Salesforce provides tools that can help businesses comply with privacy regulations.

  • GDPR Compliance: Salesforce has features designed to help businesses comply with GDPR requirements, such as the ability to request and delete personal data upon customer request.
  • Data Retention Policies: Create policies for retaining and deleting customer data following legal requirements. For instance, you should not store personal data longer than necessary for business operations or legal reasons.
  • Right to Be Forgotten: Under GDPR, customers can request that their personal data be deleted. Ensure that your Experience Cloud implementation includes features that allow for easy removal of customer data when requested.

5. Regular Audits and Monitoring

Continuous monitoring of your Salesforce Experience Cloud environment is crucial for detecting security vulnerabilities or breaches. Regular audits can help you identify potential risks before they escalate.

  • Login History Tracking: Monitor login attempts to detect suspicious activity, such as failed login attempts or logins from unusual locations (see the example query after this list).
  • Audit Trails: Salesforce’s audit trail feature tracks changes to your system, helping you monitor who is accessing what data and when. Review these logs regularly to identify irregularities.
  • Security Health Checks: Use Salesforce’s built-in Health Check tool to evaluate your system’s security settings and ensure you follow the best practices.
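
As a concrete illustration of the login-history tracking mentioned above, a quick way to review recent failed logins is a SOQL query against the LoginHistory object. This is a minimal sketch to run in Anonymous Apex or adapt into a scheduled job; the seven-day window and 200-row limit are arbitrary choices:

// Pull the last week of non-successful logins for review.
List<LoginHistory> suspiciousLogins = [
    SELECT UserId, LoginTime, SourceIp, Status
    FROM LoginHistory
    WHERE LoginTime = LAST_N_DAYS:7
    AND Status != 'Success'
    ORDER BY LoginTime DESC
    LIMIT 200
];
for (LoginHistory entry : suspiciousLogins) {
    System.debug(entry.Status + ' | ' + entry.SourceIp + ' | ' + entry.UserId);
}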

6. Educate and Train Your Team

Security isn’t just about technology; it’s also about the people who use it. Educate your employees and users about security best practices.

  • Phishing Awareness: Teach your team to recognize phishing emails and avoid clicking on suspicious links or downloading unverified attachments.
  • Password Best Practices: Encourage strong, unique passwords and the use of password managers to store credentials securely.
  • Security Training: Regularly conduct security training sessions for all stakeholders, including partners, to ensure they understand the importance of data protection.

7. Backup and Disaster Recovery Plan

Despite all the precautions, data loss can still occur. Implement a robust backup and disaster recovery plan to quickly recover data during an attack or system failure.

  • Regular Backups: Ensure all critical data is backed up regularly, and store backups in secure locations.
  • Disaster Recovery Procedures: Develop and test a disaster recovery plan to ensure that you can restore your systems and data with minimal downtime if a breach or other incident occurs.

Visit the articles below to learn more about Salesforce Experience Cloud:

The Art of Writing Test Classes in Salesforce Apex
https://blogs.perficient.com/2025/01/29/the-art-of-writing-test-classes-in-salesforce-apex/
Wed, 29 Jan 2025

Imagine you are building a skyscraper. Before you allow people to move in, you ensure it can withstand earthquakes, high winds, and other stress factors. Similarly, when you develop in Salesforce Apex, you need to test your code to ensure it works seamlessly under all scenarios. This is where the art of writing test classes comes into play. For beginners, understanding test classes is not just about code coverage; it’s about quality and confidence in your applications.

Let’s dive into the story of crafting test classes—one step at a time, simplifying complexities and answering your questions along the way.

Why Are Test Classes Important?

Think of test classes as safety checks for your Salesforce org. Without them, you might deploy code that breaks critical business processes. Here are some key reasons why they are essential:


  • Error Prevention: Test classes ensure your code behaves as expected, even in edge cases.
  • Code Coverage: Salesforce requires at least 75% code coverage to deploy Apex to production.
  • Regression Testing: They help ensure new changes don’t break existing functionality.
  • Improved Confidence: Good test classes give developers and stakeholders peace of mind.

Now that we know why test classes matter, let’s learn how to write them effectively.

Getting Started with Test Classes

The Basics

Test classes in Salesforce are written in Apex and are annotated with @isTest. These classes validate the behavior of your Apex code by simulating different scenarios. Here’s a simple example:

@isTest
public class AccountHandlerTest {
    @isTest
    static void testCreateAccount() {
        // Arrange: Set up test data
        Account acc = new Account(Name = 'Test Account');
        
        // Act: Perform the action to test
        insert acc;

        // Assert: Verify the outcome
        Account insertedAcc = [SELECT Id, Name FROM Account WHERE Id = :acc.Id];
        System.assertEquals('Test Account', insertedAcc.Name);
    }
}

Key Concepts to Remember

  1. Isolation: Test classes don’t affect real data in your org. Salesforce provides a separate testing environment.
  2. Data Creation: Always create test data in your test classes. Don’t rely on existing data (see the @testSetup sketch after this list).
  3. Assertions: Use System.assert methods to validate outcomes. For example, System.assertEquals(expected, actual) ensures the code produces the expected result.
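
One common way to satisfy the data-creation rule above is a @testSetup method, which creates records once and makes them available to every test method in the class. The class and data below are illustrative, not part of the original example:

@isTest
public class AccountServiceTest {
    // Runs once; the records it inserts are available to every test method in this class.
    @testSetup
    static void makeData() {
        insert new Account(Name = 'Shared Test Account');
    }

    @isTest
    static void testSharedDataIsAvailable() {
        Account acc = [SELECT Id, Name FROM Account WHERE Name = 'Shared Test Account' LIMIT 1];
        System.assertEquals('Shared Test Account', acc.Name);
    }
}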

Writing Effective Test Classes

1. Follow the Arrange-Act-Assert Pattern

This is a golden rule in testing. Break your test methods into three clear sections:

  • Arrange: Prepare the data and environment.
  • Act: Perform the action you want to test.
  • Assert: Verify the result.

Example:

@isTest
static void testCalculateDiscount() {
    // Arrange
    Opportunity opp = new Opportunity(Name = 'Test Opp', StageName = 'Prospecting', CloseDate = Date.today());
    insert opp;

    // Act
    Decimal discount = DiscountCalculator.calculateDiscount(opp.Id);

    // Assert
    System.assert(discount > 0, 'Discount should be greater than zero.');
}

2. Use Test.startTest and Test.stopTest

Salesforce limits the number of queries and DML operations you can perform in a single transaction. Wrapping the code under test in Test.startTest() and Test.stopTest() gives it a fresh set of governor limits, separate from those consumed by your test setup. It also forces asynchronous work started in that block, such as future calls or batch jobs, to finish before Test.stopTest() returns, so you can assert on the results.

@isTest
static void testFutureMethod() {
    // Arrange
    Account acc = new Account(Name = 'Test Account');
    insert acc;

    // Act
    Test.startTest();
    MyFutureClass.myFutureMethod(acc.Id);
    Test.stopTest();

    // Assert
    Account updatedAcc = [SELECT Status__c FROM Account WHERE Id = :acc.Id];
    System.assertEquals('Processed', updatedAcc.Status__c);
}

3. Test for Positive, Negative, and Edge Cases

Cover all possible scenarios:

  • Positive Test: Validate expected behavior for valid inputs.
  • Negative Test: Handle invalid inputs gracefully.
  • Edge Cases: Test boundaries (e.g., null values, empty lists).

4. Use Mocking for Callouts

Salesforce doesn’t allow HTTP callouts in test methods. Instead, use the HttpCalloutMock interface to simulate responses.

@isTest
static void testCallout() {
    Test.setMock(HttpCalloutMock.class, new MockHttpResponseGenerator());

    // Act
    HttpResponse response = MyCalloutService.makeCallout();

    // Assert
    System.assertEquals(200, response.getStatusCode());
}
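
The test above references a MockHttpResponseGenerator class without showing it. A minimal sketch of such a mock is below; the JSON body and status code are placeholders rather than the original author's implementation:

@isTest
global class MockHttpResponseGenerator implements HttpCalloutMock {
    // Returns a canned response so the test never makes a real HTTP callout.
    global HttpResponse respond(HttpRequest req) {
        HttpResponse res = new HttpResponse();
        res.setHeader('Content-Type', 'application/json');
        res.setBody('{"status":"ok"}');
        res.setStatusCode(200);
        return res;
    }
}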

5. Check Governor Limits

You can use Limits methods in your test classes to ensure your code efficiently handles large datasets and respects governor limits.

System.assert(Limits.getQueries() < 100, 'Query limit exceeded!');

Common Questions Answered

Q1. How many test classes should I write?

Write enough test classes to cover all your code paths. Every method and branch of logic should be tested.

Q2. How can I achieve 100% code coverage?

While 100% coverage isn’t always realistic, aim to cover every possible branch in your code. Use tools like Developer Console to identify uncovered lines.

Q3. Can I use real data in test classes?

No, it’s a best practice to create your own test data to ensure reliability and isolation.

Q4. How do I handle exceptions in test methods?

Use try-catch blocks to validate that exceptions are thrown when expected.

try {
    MyClass.myMethod(null);
    System.assert(false, 'Expected an exception but none was thrown.');
} catch (Exception e) {
    System.assertEquals('Expected Exception Message', e.getMessage());
}

Wrapping Up

Writing test classes is not just a task to check off your list; it’s an art that ensures the reliability of your Salesforce applications. Start with small, clear methods, cover various scenarios, and always validate your outcomes. Over time, you’ll find yourself building robust, error-free solutions that stand the test of time—just like a well-built skyscraper.

Happy coding!

Salesforce Apex Tokenization: Enhancing Data Security
https://blogs.perficient.com/2025/01/29/salesforce-apex-tokenization-enhancing-data-security/
Wed, 29 Jan 2025

In today’s digital landscape, ensuring data security is not just a best practice—it’s a necessity. As organizations store increasing amounts of sensitive information, protecting that data becomes paramount. As a leading CRM platform, Salesforce offers various mechanisms to secure sensitive data, and one of the advanced techniques is Apex Tokenization. This blog will explore tokenization, how it works in Salesforce, and the best practices for securely implementing it.

What is Tokenization?

Tokenization involves substituting sensitive data with a non-sensitive identifier, a token. These tokens are unique identifiers that retain essential information without exposing the actual data. For instance, a randomly generated token can be used rather than storing a customer’s credit card number directly. This process protects the original data, making it harder for unauthorized parties to access sensitive information.


Benefits of Tokenization

Tokenization offers several significant benefits for organizations:

  • Enhanced Security: Tokens are meaningless outside their intended system, significantly reducing the risk of data breaches.
  • Compliance: Tokenization helps businesses meet regulatory requirements like PCI DSS (Payment Card Industry Data Security Standard), GDPR (General Data Protection Regulation), and HIPAA (Health Insurance Portability and Accountability Act), ensuring that sensitive data is protected.
  • Scalability: Tokens can be used across multiple systems to maintain data integrity without compromising security.

Tokenization in Salesforce

Salesforce provides a robust platform for implementing tokenization within your Apex code. While Salesforce does not offer native tokenization APIs, developers can integrate external tokenization services or create custom solutions using Apex. This flexibility allows businesses to ensure their data is protected while still benefiting from Salesforce’s powerful CRM capabilities.

Key Use Cases for Tokenization in Salesforce

  • Payment Information: Replace credit card details with tokens to reduce the risk of data breaches.
  • Personally Identifiable Information (PII): Tokenize sensitive customer data, such as Social Security Numbers, to protect individual privacy.
  • Data Sharing: Share tokens instead of actual data across systems to maintain confidentiality.

Implementing Tokenization in Apex

Here’s a step-by-step guide to implementing tokenization in Apex:

1. Define Custom Metadata or Custom Settings

Use Custom Metadata or Custom Settings to store configurations like tokenization keys or API endpoints for external tokenization services.
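
As a sketch of that idea, a custom metadata type can be read in Apex without consuming a SOQL query. The type and field names below (Tokenization_Setting__mdt, Endpoint__c, Key_Alias__c) are hypothetical and only for illustration:

// Hypothetical custom metadata type and fields; adjust the names to your own configuration.
Tokenization_Setting__mdt config = Tokenization_Setting__mdt.getInstance('Default');
if (config != null) {
    String endpoint = config.Endpoint__c;   // external tokenization service URL
    String keyAlias = config.Key_Alias__c;  // reference to the tokenization key
    System.debug('Tokenization endpoint: ' + endpoint + ', key alias: ' + keyAlias);
}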

2. Create an Apex Class for Tokenization

Develop a utility class to handle tokenization and detokenization logic. Below is an example:

public class TokenizationUtil {
    // Method to convert sensitive data into a secure token
    public static String generateToken(String inputData) {
        // Replace with actual tokenization process or external service call
        return EncodingUtil.base64Encode(Blob.valueOf(inputData));
    }

    // Method to reverse the tokenization and retrieve original data
    public static String retrieveOriginalData(String token) {
        // Replace with actual detokenization logic or external service call
        return Blob.valueOf(EncodingUtil.base64Decode(token)).toString();
    }
}

3. Secure Data During Transit and Storage

Always ensure data is encrypted during transmission by using HTTPS endpoints. Additionally, store tokens securely in Salesforce, leveraging its built-in encryption capabilities to protect sensitive information.

4. Test Your Tokenization Implementation

Write comprehensive unit tests to verify tokenization logic. Ensure coverage for edge cases, such as invalid input data or service downtime.

@IsTest
public class TokenizationUtilTest {
    @IsTest
    static void testTokenizationProcess() {
        // Sample data to validate the tokenization and detokenization flow
        String confidentialData = 'Confidential Information';

        // Converting the sensitive data into a token
        String generatedToken = TokenizationUtil.generateToken(confidentialData);

        // Ensure the token is not the same as the original sensitive data
        System.assertNotEquals(confidentialData, generatedToken, 'The token must differ from the original data.');

        // Reversing the tokenization process to retrieve the original data
        String restoredData = TokenizationUtil.retrieveOriginalData(generatedToken);

        // Verify that the detokenized data matches the original data
        System.assertEquals(confidentialData, restoredData, 'The detokenized data should match the original information.');
    }
}

Best Practices for Apex Tokenization

  • Use External Tokenization Services: Consider integrating with trusted tokenization providers for high-security requirements. You could look into options like TokenEx or Protegrity.
  • Encrypt Tokens: Store tokens securely using Salesforce’s native encryption capabilities to add an extra layer of protection (see the sketch after this list).
  • Audit and Monitor: Implement logging and monitoring for tokenization and detokenization processes to detect suspicious activity.
  • Avoid Storing Sensitive Data: Where possible, replace sensitive fields with tokens instead of storing raw data in Salesforce.
  • Regulatory Compliance: Ensure your tokenization strategy aligns with relevant compliance standards (e.g., PCI DSS, GDPR, HIPAA) for your industry.
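
To make the 'Encrypt Tokens' recommendation concrete, here is a minimal sketch using the standard Crypto class. Generating the AES key inline is for brevity only; in practice the key would come from a protected custom setting or custom metadata, or you would rely on Shield Platform Encryption instead:

// Encrypt a token before persisting it, and decrypt it when it is needed again.
Blob aesKey = Crypto.generateAesKey(256);
Blob encryptedToken = Crypto.encryptWithManagedIV('AES256', aesKey, Blob.valueOf('tok_3f9a1c'));
String storedValue = EncodingUtil.base64Encode(encryptedToken);

Blob decryptedToken = Crypto.decryptWithManagedIV('AES256', aesKey, EncodingUtil.base64Decode(storedValue));
System.debug(decryptedToken.toString()); // tok_3f9a1c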

Conclusion

Tokenization is a powerful technique for enhancing data security and maintaining compliance in Salesforce applications. You can safeguard sensitive information by implementing tokenization in your Apex code while enabling seamless operations across systems. Whether through custom logic or integrating external services, adopting tokenization is essential to a more secure and resilient Salesforce ecosystem.

How to Subscribe to Salesforce Dashboards?
https://blogs.perficient.com/2025/01/27/how-to-subscribe-to-salesforce-dashboards/
Mon, 27 Jan 2025

Hello Trailblazers!

Salesforce Dashboards are powerful tools that allow users to visualize and analyze data at a glance. To stay updated on key metrics without manually checking dashboards, Salesforce provides a subscription feature. Subscribing to dashboards ensures that you and your team receive timely updates via email, helping you stay informed and make data-driven decisions.

In this blog, we’ll learn how to subscribe to Salesforce Dashboards.

Before you Begin:

In the earlier sections of this Salesforce Dashboards series, we explored what Salesforce Dashboards are, the step-by-step process to create them, and an in-depth look at Dynamic Dashboards in Salesforce. To ensure a thorough understanding and get the maximum benefit from this series, I highly recommend reviewing those parts before moving forward.

Benefits of Subscribing to Salesforce Dashboards

  1. Automated Updates: Receive dashboard data directly in your email without manual intervention.
  2. Timely Insights: Get updates on key metrics at regular intervals.
  3. Collaboration: Share insights with team members effortlessly by including them in subscriptions.
  4. Customization: Choose specific schedules and recipients for dashboard updates.

Prerequisites for Subscribing to Dashboards

  1. Permissions: Ensure you have the “Subscribe to Dashboards” permission enabled. Check with your Salesforce Administrator if you are unsure.
  2. Access to Dashboard: You must have view access to the dashboard you want to subscribe to.
  3. Email Configuration: Your Salesforce org must have email delivery settings configured.

At the end of this blog, I have demonstrated how you can receive automated email updates for Salesforce Dashboards by subscribing to them. So stay tuned for all the details!

Steps to Subscribe to a Salesforce Dashboard

Step 1: Navigate to the Dashboard

  1. Go to the Dashboards tab in Salesforce.
  2. Locate the dashboard you want to subscribe to using the search bar or browsing the folders.
  3. Click to open the desired dashboard as illustrated below.

Img1

Step 2: Click on the Subscribe Button

  1. Once the dashboard is open, locate the Subscribe button at the top right corner of the screen.
  2. Click on the Subscribe button to begin the subscription process.

Img2

 

Step 3: Configure Subscription Settings

  1. Set Frequency: Choose how often you want to receive the dashboard updates. Options include:
    • Daily
    • Weekly
    • Monthly
  2. Select Time: Specify the time of day when the dashboard email should be sent, as shown below.
  3. Choose Conditions (Optional):
    • Add filters or conditions for triggering the subscription.
    • For example, “Send only if revenue is below $50,000.”

Img3

 

Step 4: Add Recipients

  1. Include Yourself: By default, you will be subscribed to the dashboard.
  2. Add Team Members: Add colleagues or other Salesforce users who should receive the email. Enter their names or select them from the user list.

Note: Only users with access to the dashboard can be added as recipients.

Img4

Step 5: Save the Subscription

  1. Review your subscription settings to ensure everything is correct.
  2. Click Save to activate the subscription.

So you can see your subscription in the Subscribed column as shown below.
Img5

So in this way, you can subscribe to the Salesforce Dashboards.

Note: If you’re interested in learning “how to subscribe to Salesforce Reports”, please explore the detailed blog by clicking on the provided link.

Managing Dashboard Subscriptions

  1. View Existing Subscriptions:
    • Open the dashboard and click on the Subscribe button.
    • You can see and manage your existing subscriptions.
  2. Edit Subscription Settings:
    • Adjust frequency, time, or recipients as needed.
    • Save changes to update the subscription.
  3. Unsubscribe:
    • If you no longer wish to receive dashboard emails, click on Unsubscribe to stop the updates.

Img6

Best Practices for Dashboard Subscriptions

  1. Limit Recipients: Only include essential stakeholders to avoid overwhelming users with emails.
  2. Optimize Frequency: Choose a schedule that aligns with the dashboard’s relevance and data update frequency.
  3. Use Filters Wisely: Apply conditions to ensure emails are sent only when specific criteria are met.
  4. Test Email Delivery: Verify that emails are being sent and received correctly.

Result – How do you receive emails for Salesforce Dashboards?

Here, I’m showing the result of receiving the Salesforce Dashboard after subscribing to it.


Troubleshooting Subscription Issues

  1. Not Receiving Emails:
    • Check your spam or junk folder.
    • Confirm that your email address is correct in Salesforce.
    • Verify that your organization’s email server is not blocking Salesforce emails.
  2. Permission Issues:
    • Ensure you have the necessary permissions to subscribe to dashboards.
    • Contact your Salesforce Administrator for assistance.
  3. Dashboard Access Issues:
    • Confirm that you have access to the dashboard and its data.

 

Conclusion

Subscribing to Salesforce Dashboards is a simple yet effective way to stay informed about your business metrics. So by following the steps outlined in this guide, you can automate dashboard updates, share insights with your team, and make timely decisions.

Happy Reading!

 “Self-learning is the art of unlocking your potential, where curiosity becomes your guide and perseverance your greatest teacher.”

 

Related Posts:

  1. Subscribe to Dashboards in Lightning Experience
  2. Subscribe to Dashboards by Group or Role

You Can Also Read:

1. Introduction to the Salesforce Queues – Part 1
2. Mastering Salesforce Queues: A Step-by-Step Guide – Part 2
3. How to Assign Records to Salesforce Queue: A Complete Guide
4. An Introduction to Salesforce CPQ
5. Revolutionizing Customer Engagement: The Salesforce Einstein Chatbot

 

How to Create a Bucket Column for the Picklist Type Field in Salesforce Report
https://blogs.perficient.com/2025/01/23/how-to-create-a-bucket-column-for-the-picklist-type-field-in-salesforce-report/
Thu, 23 Jan 2025

Hello Trailblazers!

Salesforce provides powerful reporting tools to analyze and visualize data effectively. Among these tools, the Bucket Field stands out as a feature that enables categorization of data directly within reports.

In this blog post, we’ll focus on creating a Bucket Column specifically for Picklist type fields in Salesforce Reports, offering a step-by-step guide to help users categorize data efficiently.

Before you Begin:

In the previous part of this blog, we explored “What is a Bucket Field in Salesforce?”, the ways to create one, and more. Before proceeding, I highly recommend revisiting that section to gain a comprehensive understanding of the fundamentals.

What is a Bucket Column in Salesforce?

A Bucket Column/Field is a feature that allows users to group values of a specific field into categories (buckets) without altering the underlying Salesforce object. Here in this blog, we’re particularly using it for picklist fields, where predefined values can be grouped into broader categories to simplify analysis.

For example:

  • Group opportunity stages into “Early,” “Mid,” and “Closed” categories.
  • Categorize lead sources into “Digital,” “Offline,” and “Referral” groups.

Benefits of Bucket Columns for Picklist Fields

  1. Ease of Use: Group picklist values dynamically without modifying the schema.
  2. Enhanced Insights: Aggregate data into meaningful categories for better analysis.
  3. Time-Saving: No need for custom fields or formula fields to categorize data.
  4. Improved Collaboration: Share reports with categorized data easily across teams.

Note: If you’re interested in learning “How to Share Reports or Report Folders in Salesforce,” you can explore the detailed guide provided in this link. It offers step-by-step instructions to help you seamlessly manage report sharing and collaborate effectively.

Steps to Create a Bucket Column for Picklist Fields

Step 1: Open or Create a Report

  1. Navigate to the Reports tab in Salesforce.
  2. Click New Report or open an existing report that contains the picklist field you want to bucket.
  3. Select the relevant report type (e.g., Opportunities, Leads).
  4. Here we are selecting the standard “Leads” report type, as shown in the figure below.

Img1

Note: If you are interested in learning “What is a Custom Report Type in Salesforce?” and learn how to create one, I recommend you to explore the detailed guide available through the provided link.

Step 2: Add a Bucket Column

Once you select the report type, it will open the Report Builder.

  1. Navigate to the Outline section in the left-hand panel.
  2. Locate the Columns section and click the dropdown menu.
  3. From the dropdown options, select Add Bucket Column to proceed, as shown in the figure below.

Img2

Note: We’ve previously explored an alternative method for creating a bucket column. If you’d like to learn that approach, kindly refer to the earlier part of this blog post. The relevant link is provided in the “Before You Begin” section and is also included at the end for your convenience.

Step 3: Configure the Bucket Column

  1. Name Your Bucket Column: Enter a descriptive name, such as “Lead Source Group.”
  2. Select the Source Field: Choose the desired picklist field that you want to use for creating a bucket column. (e.g., Lead Source).
  3. Define Buckets:
    • Click Add Bucket to create a new category.
    • Enter a name for the bucket (e.g., “Digital Sources“).
    • Select picklist values to include in this bucket (e.g., “Website,”).
    • Click on “Move to” and choose “Digital Sources” from the options, as illustrated in the figure below.
      Img3
  4. Repeat for Other Buckets: Create additional buckets for other categories (e.g., “Offline Sources,” “Referral Sources”). It should look like this:
      Img4
  5. Click Apply to save your configuration.

Step 4: Use the Bucket Column in the Report

  1. Drag the newly created bucket column into the report canvas anywhere you want.
  2. Use it for grouping, filtering, or summarizing data as needed.
  3. If you summarize the report by the newly created Bucket Column/Field, it will look like this:

Img5

 

Note: If you would like to learn more about “How to create Summary Reports in Salesforce?”, then please follow the provided link.

Step 5: Save and Run the Report

  1. Save the report by clicking Save
  2. Provide a meaningful name, description, and folder location for the report.
  3. Click Save & Run to visualize your categorized data.

Best Practices for Bucket Columns

  1. Keep Categories Meaningful: Ensure that bucket names are intuitive and easy to understand.
  2. Test with Small Datasets: Verify the categorization before applying it to larger datasets.
  3. Document Your Configuration: Provide descriptions for each bucket to clarify their purpose.
  4. Limit Buckets: Avoid creating too many buckets to maintain report clarity and focus.

 

Conclusion

Bucket Columns in Salesforce Reports are a simple yet powerful way to organize and analyze data dynamically. For Picklist fields, they provide a flexible solution to group values into meaningful categories without altering the underlying schema.

By following the steps in this guide, you can quickly set up bucket columns and unlock deeper insights into your Salesforce data.

Happy Reading!

 “A disciplined mind leads to a focused life; when you control your actions, you control your destiny.”

 

Related Posts:

  1. Bucket Field in Salesforce
  2. Bucket Field Limitations

You Can Also Read:

1. Introduction to the Salesforce Queues – Part 1
2. Mastering Salesforce Queues: A Step-by-Step Guide – Part 2
3. How to Assign Records to Salesforce Queue: A Complete Guide
4. An Introduction to Salesforce CPQ
5. Revolutionizing Customer Engagement: The Salesforce Einstein Chatbot

 

Personalization to Boost Customer Engagement
https://blogs.perficient.com/2025/01/21/personalization/
Tue, 21 Jan 2025

In today’s digital age, personalization is key to keeping customers engaged and satisfied. Salesforce Experience Cloud offers various features that allow businesses to deliver tailored experiences, making customers feel valued and enhancing their connection with your brand. Here’s a simple guide on how to leverage personalization to boost engagement.

1. Personalization: Creating Tailored Experiences

Personalization is all about showing customers the content and information that matters most to them. With Salesforce’s powerful data and AI capabilities, businesses can offer personalized experiences in real-time, improving customer satisfaction and loyalty. Here are some ways to use personalization:

Custom Dashboards

One way to personalize the experience is by displaying custom dashboards for your customers. These dashboards can show relevant data, such as recent orders or support cases, giving users easy access to the information that matters most to them.

Targeted Content

You can provide custom content based on customers’ interests, regions, and purchasing history. For example, showing product recommendations or blog posts that align with their past behavior can increase engagement and make customers feel understood.

User-Specific Recommendations

Salesforce also allows you to suggest knowledge articles, products, or services based on a customer’s past interactions. This makes the experience more relevant and helpful, improving both satisfaction and engagement.

2. Self-Service Capabilities: Empowering Customers

Customers want quick solutions, and many prefer to solve issues on their own without waiting for support. Salesforce Experience Cloud Communities offer a great way to provide self-service options that empower customers to help themselves. This reduces frustration and enhances their overall experience.

Knowledge Base

A well-organized knowledge base helps customers find articles, FAQs, and troubleshooting guides whenever they need them. By offering these resources, you can reduce the need for customers to contact support and encourage them to find solutions independently.

Case Management

Through case management tools, customers can submit support cases, track their progress, and view solutions directly within the portal. This feature streamlines the support process and makes it easier for customers to get the help they need.

Community Forums

Allow customers to interact with each other through community forums. These forums enable peer-to-peer support, where customers can ask questions, share experiences, and even provide answers to others. This helps increase interaction and creates a community feel around your brand.

3. Real-Time Collaboration and Communication

Effective communication plays a big role in building strong customer relationships. Salesforce Experience Cloud Communities offer various ways for businesses to engage with customers in real-time, creating a more interactive experience.

Chatter

Salesforce’s Chatter is an enterprise social network that allows customers to communicate directly with your brand. Customers can ask questions and get quick responses, which helps build trust and improve customer satisfaction.

Forums and Discussion Boards

Forums and discussion boards give customers a space to ask questions, engage in discussions, and receive answers from both peers and experts. This type of communication increases engagement and helps customers feel more connected to your brand.

Live Chat

With integrated live chat tools, customers can quickly interact with support representatives. Whether they need help with a product or have a question about services, live chat provides instant solutions, reducing wait times and improving engagement.

4. Gamification: Making Engagement Fun

Gamification is a fun way to keep customers engaged by rewarding them for certain actions. Salesforce Experience Cloud makes it easy to implement gamification strategies that encourage users to interact more with your content and community.

Badges and Points

Award badges or points to customers for actions like submitting feedback, sharing knowledge, or completing tasks. These small rewards can make a big difference in keeping customers engaged.

Leaderboards

Display a leaderboard that shows the top contributors to incentivize participation and foster a sense of friendly competition. This encourages customers to engage more and be part of an active community.

Challenges and Quizzes

Create challenges or quizzes where customers can earn rewards for answering questions or completing activities. Gamification adds an element of excitement and makes the customer experience more enjoyable.

5. Seamless Integration with Other Salesforce Products

Salesforce Experience Cloud integrates seamlessly with other Salesforce products, such as Sales Cloud, Service Cloud, and Marketing Cloud, offering a 360-degree view of your customer. This integration enables businesses to deliver a unified experience across various touchpoints.

Customer Insights

Use data from other Salesforce tools to personalize the customer experience within Experience Cloud. By understanding your customer’s needs and preferences, you can tailor content and recommendations to be more relevant.

Automation

Automate key processes such as customer onboarding, case management, and communication to improve efficiency and create a smoother experience for customers.

Marketing Campaigns

By integrating with Marketing Cloud, businesses can deliver personalized email campaigns, newsletters, and promotions directly to community members, keeping them engaged with your brand.

6. Mobile-Optimized Communities: Engaging Customers on the Go

In today’s mobile-first world, ensuring that your community is mobile-optimized is essential. Salesforce Experience Cloud Communities are fully responsive, meaning they offer a great user experience no matter the device.

Anytime, Anywhere Access

Customers can engage with your community whenever they want, from any device—whether it’s a smartphone, tablet, or desktop.

Push Notifications

Use push notifications to send timely updates and alerts directly to your customers’ mobile devices, keeping them informed and engaged.

Mobile-Friendly Interfaces

A smooth, intuitive mobile experience ensures that customers will keep coming back. By optimizing your community for mobile, you ensure customers can easily interact with your brand, no matter where they are.

7. Rich Analytics for Continuous Improvement

To keep improving your community and customer engagement, Salesforce Experience Cloud provides powerful analytics tools that track customer behavior. By analyzing these metrics, you can identify trends and make informed decisions.

Community Activity Tracking

Monitor how customers interact with your content and see which topics generate the most interest. This insight helps you improve your content strategy.

Engagement Metrics

Measure customer engagement by tracking logins, forum activity, and support cases. This helps you understand how customers are using your community and where improvements can be made.

Also, visit the articles below:

Salesforce Documentation : Experience Cloud

Experience Cloud Key Features

Setting Up and Customizing Experience Cloud

Streamline Your Code with Salesforce Apex Collection Conversion Hacks
https://blogs.perficient.com/2025/01/21/streamline-your-code-with-salesforce-apex-collection-conversion-hacks/
Tue, 21 Jan 2025

Imagine you’re building a Lego masterpiece. You’ve got blocks of all shapes and sizes—cylinders, squares, rectangles—but they all need to come together in harmony to create something amazing. Salesforce Apex collections work in a similar way. Collections help you organize and manipulate data efficiently, and sometimes, you need to convert one collection type into another to get the job done.

Today, I’ll take you on a story-driven tour of Salesforce Apex collections and their conversions. By the end, you’ll know exactly when and how to use these tools like a pro—even if you’re just starting out.

Understanding the Cast of Characters


In the Apex world, there are three main types of collections:

  1. Lists: Think of these as a row of chairs in a theater. Each chair (element) has a fixed position (index).
    Example: A list of account names—['Acme Inc.', 'TechCorp', 'DreamWorks'].
  2. Sets: Sets are like your box of unsorted chocolates—no duplicates allowed, and order doesn’t matter.
    Example: A set of unique product IDs—{P001, P002, P003}.
  3. Maps: Maps are like dictionaries, with keys and their corresponding values. You can quickly look up information using a key.
    Example: A map of employee IDs to names—{101 => 'John', 102 => 'Alice', 103 => 'Bob'}.

Why Convert Collections?

Let’s say you’re tasked with creating a report of unique leads from multiple campaigns. You initially gather all the leads in a List, but you notice duplicates. To clean things up, you’ll need a Set. Or perhaps you have a Map of IDs to records, and your boss asks for a simple list of names. Voilà—collection conversions to the rescue!

Key Scenarios for Conversions:

  • Removing duplicates (List → Set)
  • Extracting values or keys from a Map (Map → List/Set)
  • Searching with custom logic or preparing data for another operation

The Magic of Conversion

Here’s where the fun begins! Let’s dive into common collection conversions and their Apex implementations.

1. List to Set

Scenario: You have a list of product categories, but some are repeated. You need unique categories for a dropdown.

List<String> categories = new List<String>{'Electronics', 'Books', 'Books', 'Toys'};
Set<String> uniqueCategories = new Set<String>(categories);

System.debug(uniqueCategories);
// Output: {Electronics, Books, Toys}

Key Takeaway: A Set automatically removes duplicates.

2. Set to List

Scenario: You have a Set of user IDs and need to process them in a specific order.

Set<Id> userIds = new Set<Id>{'005xx000001Sv7d', '005xx000001Sv7e'};

List<Id> orderedUserIds = new List<Id>(userIds);

System.debug(orderedUserIds);
// Output: A list of IDs in no specific order

Tip: If order matters, sort the list using List.sort().

3. Map to List (Keys or Values)

Scenario: You have a Map of account IDs to names but need only the names.

Map<Id, String> accountMap = new Map<Id, String>{
    '001xx000003NGc1' => 'Acme Inc.',
    '001xx000003NGc2' => 'TechCorp'
};

List<String> accountNames = accountMap.values();

System.debug(accountNames);
// Output: ['Acme Inc.', 'TechCorp']

Bonus: To get the keys, use accountMap.keySet() and convert it to a list if needed.

4. List to Map

Scenario: You have a list of contacts and want to create a Map of their IDs to records.

List<Contact> contacts = [SELECT Id, Name FROM Contact LIMIT 5];

Map<Id, Contact> contactMap = new Map<Id, Contact>(contacts);

System.debug(contactMap);
// Output: A map of Contact IDs to records

Key Takeaway: This is super handy for quick lookups!

5. Set to Map

Scenario: You need a Map of product IDs (keys) to default stock values.

Set<String> productIds = new Set<String>{'P001', 'P002'};

Map<String, Integer> stockMap = new Map<String, Integer>();

for (String productId : productIds) {
    stockMap.put(productId, 100); // Default stock is 100
}
System.debug(stockMap);
// Output: {P001=100, P002=100}

Common Pitfalls (and How to Avoid Them)


  1. Null Collections: Always initialize your collections before using them.
    Example: List<String> names = new List<String>();
  2. Duplicate Data: Remember that Sets discard duplicates, but Lists don’t. Convert wisely based on your use case.
  3. Order Dependency: Lists maintain insertion order; Sets and Maps don’t. If order is critical, stick with Lists.
  4. Type Mismatches: Ensure the types match when converting. For example, converting a List<String> to a Set<Integer> will fail.

A Pro’s Perspective

Once you’ve mastered these basics, you’ll start seeing patterns in your day-to-day Salesforce development:

  • Cleaning up data? Convert Lists to Sets.
  • Need efficient lookups? Use Maps and their keys/values.
  • Preparing for DML operations? Leverage List to Map conversions for easy processing.

Quick Tip: If you find yourself repeatedly converting collections, consider creating utility methods for common tasks.
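
For example, a small utility class along these lines (the class and method names are my own, not a standard library) keeps common conversions in one place:

public class CollectionUtil {
    // Collect record Ids from any SObject list, e.g. before a follow-up query.
    public static Set<Id> toIdSet(List<SObject> records) {
        Set<Id> ids = new Set<Id>();
        for (SObject record : records) {
            ids.add(record.Id);
        }
        return ids;
    }

    // Remove duplicate strings while preserving first-seen order.
    public static List<String> dedupe(List<String> values) {
        List<String> result = new List<String>();
        Set<String> seen = new Set<String>();
        for (String value : values) {
            if (seen.add(value)) { // Set.add returns true only when the value is new
                result.add(value);
            }
        }
        return result;
    }
}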

The Final Word

Collections are the backbone of Salesforce Apex, and converting between them is an essential skill. Whether you’re cleaning data, optimizing queries, or preparing for integrations, understanding how and when to convert collections can save you hours of frustration.

Now it’s your turn—try these examples in a developer org or a Trailhead playground. The more you practice, the more intuitive this will become. And remember, every pro was once a beginner who didn’t give up!

Happy coding! 🚀

]]>
https://blogs.perficient.com/2025/01/21/streamline-your-code-with-salesforce-apex-collection-conversion-hacks/feed/ 0 375916
Preparing for Salesforce Spring ’25 Release Updates: Miscellaneous Updates https://blogs.perficient.com/2025/01/21/preparing-for-salesforce-spring-25-release-updates-miscellaneous-updates/ https://blogs.perficient.com/2025/01/21/preparing-for-salesforce-spring-25-release-updates-miscellaneous-updates/#respond Tue, 21 Jan 2025 06:45:45 +0000 https://blogs.perficient.com/?p=375241

The Salesforce Spring ’25 release is just around the corner, bringing a mix of exciting new features, critical updates, and changes designed to enhance the platform’s usability and security. However, with each new release comes the responsibility of preparation, ensuring that your Salesforce environment keeps running smoothly and efficiently. Whether you’re a developer, admin, or end user, staying informed and proactive is key to leveraging these updates to their full potential. This blog explores the essential updates in the Spring ’25 release, offering practical steps and tips to help you prepare your Salesforce org, minimize disruption, and make the most of the new features.


Image source: Salesforce

General Preparation Tips

  • Use Sandboxes

    Perform all updates in sandboxes before production deployment.

  • Schedule Changes

    Make changes during non-working hours to allow rollback if issues arise.

  • Review Managed Packages

    Check compatibility with package providers for impacted features.

These steps will ensure your Salesforce org is ready for the Spring ’25 release updates with minimal disruptions.

Key Changes in Spring ’25

  1. Change Einstein Activity Capture Permissions for Sales Engagement Basic Users

  • Impact

    Users with the Sales Engagement Basic User permission set will lose access to Einstein Activity Capture. They need to be assigned the Standard Einstein Activity Capture permission set. Cloned permission sets will also lose access.

    Image Source: Salesforce Help
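    To gauge how many users rely on this permission set before adjusting anything, a quick SOQL sweep can help (a sketch; the label below is assumed to match the permission set name in your org):

    List<PermissionSetAssignment> assignments = [
        SELECT Assignee.Name, PermissionSet.Label
        FROM PermissionSetAssignment
        WHERE PermissionSet.Label = 'Sales Engagement Basic User'
    ];
    System.debug(assignments.size() + ' users may need the Standard Einstein Activity Capture permission set.');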

  • Preparation

    • No Code:
      • Create a Permission Set List View showing permission sets with “Use Einstein Activity Capture.”
    • Pro Code:
      • Search metadata for AutomaticActivityCapture references.
    • Adjust permissions in a sandbox first. Ensure users have appropriate access without losing needed permissions from Sales Engagement Basic User.
  • Production Steps

After sandbox testing, update permissions in production and validate Einstein Activity Capture functionality.

  2. Enable ICU Locale Formats

  • Impact

    Salesforce is moving from Oracle’s JDK locale formats to ICU locale formats, which affect how dates, times, names, addresses, and numbers are formatted.
    Enforcement rolls out through Summer ’25. Users with the en_CA locale should make sure ICU formats are enabled; for details on the formats, follow the guide on Salesforce Help.
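    As a quick sandbox spot-check of locale-sensitive formatting, compare output like this with the test run enabled and then disabled (a minimal sketch):

    System.debug(UserInfo.getLocale());             // e.g. en_CA
    Datetime nowDt = Datetime.now();
    System.debug(nowDt.format());                   // Locale-dependent; this is what can shift between JDK and ICU
    System.debug(nowDt.format('yyyy-MM-dd HH:mm')); // Explicit numeric patterns are generally stable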

  • Preparation

    • Check current formats using Salesforce Help.
    • For JDK Locales:
      • Enable a test run in a sandbox.
      • Update Apex classes, triggers, and Visualforce pages as needed.
    • Test impacted functionality thoroughly before enabling the update.
  • Production Steps

    • After sandbox validation, deploy updates outside working hours.
    • Enable the update well ahead of enforcement to allow rollback if needed.
  3. Verify Your Return Email Address for Sender Verification

  • Impact

    Return email addresses must be verified before being used. Users will receive verification emails once per release until Spring ’25.

  • Preparation

    • No Code and Pro Code:
      • Users should check their email settings at:
        /lightning/settings/personal/EmailSettings/home.
      • If the “Resend Verification Email” option is available, users should click it and follow the email instructions.

    Image Source: Salesforce

  • Production Steps

Have users complete the same verification in production so their return email addresses are verified before the update is enforced.

Conclusion

Remember, preparation is not just about addressing immediate impacts but also about future-proofing your Salesforce org for sustained success. Stay informed, collaborate with your team, and leverage Salesforce tools to make this release work to your advantage. With the right approach, you can turn these updates into opportunities to optimize processes and enhance the user experience.

]]>
https://blogs.perficient.com/2025/01/21/preparing-for-salesforce-spring-25-release-updates-miscellaneous-updates/feed/ 0 375241
Preparing for Salesforce Release Updates in Spring ’25: Integration Updates https://blogs.perficient.com/2025/01/21/preparing-for-salesforce-release-updates-in-spring-25-integration-updates/ https://blogs.perficient.com/2025/01/21/preparing-for-salesforce-release-updates-in-spring-25-integration-updates/#respond Tue, 21 Jan 2025 06:44:35 +0000 https://blogs.perficient.com/?p=375232

Salesforce introduces release updates in every cycle, which may modify how specific features or products behave. To ensure a smooth transition, it’s crucial to test these updates in a sandbox environment before they are activated in your production system. This is especially important as these updates will take effect from January 2025.


Steps to Prepare

  1. Use a Sandbox for Testing:

    Enable updates in a sandbox environment to evaluate their impact on your systems and processes before deploying them live.

  2. Utilize Development Tools:

    If you have development expertise, leverage tools like Salesforce CLI (SFDX) to identify potential impacts on metadata. Additional tools such as Salesforce Inspector Reloaded or Workbench can be helpful for querying records or checking permissions.

  3. Monitor Release Notes:

    Salesforce may delay or cancel specific updates. Always refer to the official release notes for the latest information.

For more insights, refer to the Ultimate Guide to Salesforce Release Updates.

Key Updates in Spring ’25

The Spring ’25 release will bring several updates that will be auto-enabled. Review the details below to prepare your organization and ensure enough time for testing and adjustments.

💡 Tip: If your organization uses managed packages, contact the provider to verify that their metadata supports the upcoming changes. When planning changes in a production environment, schedule them outside of business hours to minimize disruptions.

  1. Enforcing Rollbacks for Apex Action Exceptions in REST API

Impact:

Previously, exceptions triggered by Apex actions via the REST API weren’t rolled back. Starting with this update, these exceptions will result in rollbacks.

Preparation Steps:

  • No-Code Users: From Setup, review your Apex classes and flow actions for methods annotated with @InvocableMethod.
  • Pro-Code Users: Scan Apex metadata for @InvocableMethod references, for example with Salesforce CLI or a quick SOQL sweep (see the sketch below).
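A rough way to sweep for invocable actions directly with SOQL (a sketch; managed classes return a hidden body, and very large orgs may need to page through results):

for (ApexClass cls : [SELECT Name, Body FROM ApexClass LIMIT 200]) {
    if (cls.Body != null && cls.Body.contains('@InvocableMethod')) {
        System.debug('Invocable action found in: ' + cls.Name);
    }
}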

Testing in Sandbox:

  • Use tools like Workbench or Postman to manually call REST API endpoints.
  • Enable the test run and ensure Apex actions behave as expected after triggering.

Testing in Production:

  • Schedule changes outside of business hours.
  • Deploy modifications from your sandbox, enable the test run, and confirm functionality.

For details, see: Enforce Rollbacks for Apex Action Exceptions in REST API.

  2. Migrating to a Multiple-Configuration SAML Framework

Impact:

Organizations using a single-configuration SAML framework must transition to a multiple-configuration framework. If your org doesn’t utilize this feature, no action is required.

Preparation Steps:

Salesforce provides comprehensive instructions within the setup page. Follow them to migrate seamlessly.

For guidance, refer to: Migrate to a Multiple-Configuration SAML Framework.

  3. Updating LinkedIn Lead Gen Integration Settings

Impact:

Due to changes in LinkedIn’s APIs, you must update the integration by December 16, 2024, to maintain connectivity.

Preparation Steps:

  1. Navigate to LinkedIn Lead Gen settings in Salesforce Setup.
  2. Disconnect the LinkedIn account.
  3. Enable the “Use LinkedIn Lead Sync APIs with Lead Forms” setting.
  4. Reconnect the LinkedIn account and verify settings.

💡 Tip: Testing this integration in a sandbox environment may disrupt live data. Assess your processes to ensure no leads are lost during testing or migration.

Learn more: Review and Update Settings to Capture Leads from LinkedIn.

  4. Maintaining Access to Salesforce Outlook Integration

Impact:

Microsoft’s updated authentication methods may affect the Salesforce Outlook integration.

Preparation Steps:

Coordinate with your Microsoft 365 admin to ensure required scopes are granted. Salesforce recommends enabling the Admin Consent Flow to simplify the process.

For more information, refer to the Salesforce Help documentation on the Outlook integration.

Summary

The Spring ’25 release introduces significant changes that require proactive preparation and testing. Plan ahead to avoid disruptions and ensure your Salesforce instance operates smoothly during and after the updates.

]]>
https://blogs.perficient.com/2025/01/21/preparing-for-salesforce-release-updates-in-spring-25-integration-updates/feed/ 0 375232
Preparing for Salesforce Spring 25 Release Updates: Apex Updates https://blogs.perficient.com/2025/01/21/preparing-for-salesforce-spring-25-release-updates-apex-updates/ https://blogs.perficient.com/2025/01/21/preparing-for-salesforce-spring-25-release-updates-apex-updates/#respond Tue, 21 Jan 2025 06:43:45 +0000 https://blogs.perficient.com/?p=375225

Salesforce enforces release updates during each major release cycle, which may alter the behavior of specific features or products. To ensure smooth operations, it’s essential to prepare by testing these updates in a sandbox environment before they are automatically enabled in your production environment starting January 2025.


Image Source: Salesforce

Preparation Steps

  1. Test in Sandbox First

Enable release updates in a sandbox to identify potential impacts on your system or workflows. This precaution helps ensure a seamless transition when updates are enforced in production.

  2. Tools for Developers

If you have development expertise, leverage tools like Salesforce DX (SFDX) to scan metadata for references affected by the updates. Additionally, tools such as Salesforce Inspector Reloaded or Workbench can be used to query records or permissions for deeper insights.

  3. Stay Updated with Release Notes

Salesforce occasionally delays or cancels certain updates. Regularly review the latest release notes to stay informed about changes.

Spring ’25 Release Updates

When the Spring ’25 release is deployed, several updates will be auto-enabled. Below are key updates, their impacts, and steps to prepare in sandbox and production environments.

  1. Enforce Permission Requirements on Built-In Apex Classes Used as Inputs

Impact:

Flows or Process Builders that invoke Apex methods could behave differently. Apex called in this manner will now execute in the current context if it interacts with built-in Apex classes as inputs.

Preparation Steps:

  • Identify Impacted Apex Methods
    • Search for @InvocableMethod in your Apex classes.
    • Review references in Flows, Process Builders, or Einstein features.
  • Sandbox Testing:
    • Enable the test run for this update.
    • Validate that all identified Invocable methods function as expected.
    • If applicable, test these methods via the REST API.
  • Production Deployment:
    • Plan deployment during non-working hours to minimize disruptions.
    • Test changes post-deployment to ensure functionality.

For detailed guidance on this update, consult the official Salesforce documentation and release notes.

  2. Enforce Rollbacks for Apex Action Exceptions in REST API

Impact:

REST API calls to Apex Actions will now be rolled back if an exception occurs, ensuring data consistency.
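To picture what changes, consider a hypothetical invocable action like this (class and field values are illustrative only):

public with sharing class FlagAccountsAction {
    public class ActionException extends Exception {}

    @InvocableMethod(label='Flag Accounts')
    public static void flagAccounts(List<Id> accountIds) {
        List<Account> accounts = [SELECT Id FROM Account WHERE Id IN :accountIds];
        for (Account acc : accounts) {
            acc.Description = 'Flagged by API';
        }
        update accounts; // DML performed before the validation below

        if (accounts.size() != accountIds.size()) {
            // With the update enforced, this uncaught exception rolls back the DML above
            // when the action is invoked through the REST API; previously the update could persist.
            throw new ActionException('One or more account Ids could not be resolved.');
        }
    }
}

If any of your actions rely on work surviving an exception, add explicit error handling before the update is enforced.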

Preparation Steps:

  • Sandbox Testing:
    • Use tools like Workbench or Postman to manually trigger REST API calls.
    • Enable the test run and verify that Apex actions behave as expected under various scenarios.
  • Production Deployment:
    • Deploy changes during off-peak hours.
    • Enable the test run and validate impacted functionality in production.

For detailed guidance on this update, consult the official Salesforce documentation and release notes.

  3. Sort Apex Batch Action Results by Request Order

Impact:

Batch Apex job results will now be sorted based on the order of requests, changing the current arrangement where errors appear at the top.

Preparation Steps:

  • Sandbox Testing:
    • Identify Batch Apex jobs from Setup.
    • Enable the test run and validate logging mechanisms or custom error handling.
  • Production Deployment:
    • Deploy changes during off-hours.

Ensure Batch Apex jobs function as expected post-deployment.

  4. Use Apex-Defined Variables for All Intelligence Signal Types

Impact:

Service Cloud Voice flows that use intelligenceSignals must include an Apex-defined variable as an input to prevent errors.
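For reference, an Apex-defined type that a flow can accept as an input variable is simply a class whose members are annotated with @AuraEnabled; the fields below are illustrative and not the actual intelligenceSignals schema:

public class IntelligenceSignalInput {
    @AuraEnabled public String signalType;
    @AuraEnabled public String payload;
}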

Preparation Steps:

  • Sandbox Testing:
    • Ensure Service Cloud Voice is properly configured.
    • Modify affected flows to accept Apex-defined variables as inputs.
    • Enable the test run and validate use cases triggering these flows.
  • Production Deployment:

Deploy updated flows during off-hours and ensure all intelligence rules function correctly.

For detailed guidance on this update, consult the official Salesforce documentation and release notes.

Best Practices for Deployment

  • Always test updates in a sandbox before enabling them in production.
  • Schedule production changes outside business hours to minimize disruption.
  • Collaborate with managed package providers to ensure compatibility with the updates.

By following these guidelines and staying proactive, you can ensure a smooth transition to the new features and functionalities introduced by Salesforce release updates.

 

]]>
https://blogs.perficient.com/2025/01/21/preparing-for-salesforce-spring-25-release-updates-apex-updates/feed/ 0 375225