Salesforce Spring ’21 – Developer Highlights

Salesforce Spring ’21 is nearly upon us, and these are my developer highlights from the release notes. There is, of course, a lot more in there, so I always encourage you to read through them fully yourself.


1. Track Changes between your local Project and a Sandbox

This hasn’t really been a big issue for me, since I can rely on Git to determine changes in my metadata. However, there may be scenarios where you want to make a lot of configuration changes, and this will help you identify and track them.

2. Create Scratch Orgs with More Features

Great to have more feature control. I just checked, and it seems the Person Accounts feature (the one I was looking for) has actually been available for a while now!

3. SOQL – Select predefined groups of Fields (SELECT *)

I’m not really sure about this one. The lazy SQL developer uses SELECT *, which usually causes issues down the track. Being able to do this in SOQL with SELECT FIELDS(ALL) is convenient, but I’m concerned that developers will just use it by default rather than consider which fields they truly need. This may or may not have performance implications, but it could easily have security implications where some PII data hasn’t been secured. On the other hand, if you need to select all the fields anyway, then this will make for cleaner code.
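For example, the following runs in the Developer Console Query Editor or via the API. As I read the release notes, FIELDS(ALL) and FIELDS(CUSTOM) must be bounded with a LIMIT of at most 200 rows, while FIELDS(STANDARD) has no such restriction and is the only form supported in Apex:

```soql
SELECT FIELDS(ALL) FROM Account LIMIT 200

SELECT FIELDS(STANDARD) FROM Account
```

The first query returns every accessible field (standard and custom) for up to 200 Accounts; the second returns standard fields only.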

4. Lightning – DOM API changes

Do you have automated UI tests for your org (Selenium, Tosca, etc.)? Salesforce has been changing the way Lightning component HTML elements are named, so you will probably want to check your test suites for breakages ASAP.

5. Lightning – Customize lightning-map control

You now have greater control over the behavior of the map control, such as disabling panning and buttons, and you can define your own map marker icons as well. This isn’t a huge leap, but it is great to see that Salesforce is still improving this control. There is always going to be the issue that it won’t be allowed to compete with MapAnything, but I find it a valuable control for limited scenarios.

6. Lightning UI components to be deprecated

This was announced a year ago for Spring ’20, but now the time is almost upon us! As of May 1st 2021, components in the ui namespace will no longer be updated or supported.

7. Salesforce Functions (beta)

This has been a long time coming. Formerly known as Evergreen (I think; it was hard to keep track), there has always been some rather mixed messaging from Salesforce around this. Still, it is an interesting new beta feature that could provide you with alternatives to Apex, although it is not yet clear whether additional licensing will be required.

8. Apex Job Transaction Finalizer (beta)

This is a strange beta in that you can use it in Production from Spring ’21. A finalizer runs at the end of your asynchronous Apex job, letting you handle or re-enqueue failed jobs. I’ll need to spend some time on a POC to understand this one fully.
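From my reading of the beta docs, finalizers attach to Queueable jobs. A minimal sketch might look like this (the class names are my own, and the API may change while in beta):

```apex
// Runs after the Queueable job completes, even if it threw an unhandled exception
public class RetryFinalizer implements System.Finalizer {
    public void execute(System.FinalizerContext ctx) {
        if (ctx.getResult() == System.ParentJobResult.UNHANDLED_EXCEPTION) {
            // Re-enqueue the failed job (a real implementation needs a retry limit!)
            System.enqueueJob(new MyQueueable());
        }
    }
}

public class MyQueueable implements Queueable {
    public void execute(QueueableContext qc) {
        // Attach the finalizer before doing any work that might fail
        System.attachFinalizer(new RetryFinalizer());
        // ... job logic ...
    }
}
```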

Creating a Salesforce Permission Set subset from multiple Profiles

This tool reduces the size of your Profiles by extracting common elements and placing them in a common Permission Set.

Complex Salesforce orgs often have large Profiles, and Users can only have one Profile. This means that when you create a new App or implement a Release Update (for example), then your Admins will need to go through and ensure all Profiles are updated the same way. You will then also have to test each Profile fully.

I created a simple command line utility to identify the most common elements in a set of Salesforce Profiles and generate a common subset Permission Set as well as the newly reduced Profiles.

Salesforce provides Permission Sets and Permission Set Groups that allow you to define a set of permissions not tied to any Profile. This means, for example, that if you have 100 class accesses needed by multiple Profiles, you could define one Permission Set and assign it to every user who has one of those Profiles.

In a really simple example, I cloned three new Profiles from the standard ‘Minimum Access – Salesforce’ Profile. (In a real world example this hierarchy would be better defined as Roles). Each has Profile level access to different Apex Classes in our Org:

Initial Profile state

We can see that all these Profiles have access to the X-Wing and Y-Wing classes, so we can add those to a Permission Set:

Reduced Profiles with common Permission Set
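As a sketch, the generated Permission Set metadata for the common classes might look something like this (assuming the Apex classes have the API names XWing and YWing):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<PermissionSet xmlns="http://soap.sforce.com/2006/04/metadata">
    <label>Common Rebel Access</label>
    <classAccesses>
        <apexClass>XWing</apexClass>
        <enabled>true</enabled>
    </classAccesses>
    <classAccesses>
        <apexClass>YWing</apexClass>
        <enabled>true</enabled>
    </classAccesses>
</PermissionSet>
```

The corresponding classAccesses elements are then removed from each Profile, and the Permission Set is assigned to the affected users.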

Furthermore, we can see that the ‘General’ and ‘Major’ profiles have other common classes, meaning we can divide up further:

Reduced Profiles with two common Permission Sets

Now, the Rebel Alliance has just acquired a new B-Wing vehicle for everyone. To grant access the Admin can now just add this to the main Permission Set, instead of to each Profile:

Adding a new permission element that everyone has access to

For such a simple example it is easy for an Admin to set this up by hand, however what if you have 15 profiles that are thousands of configuration lines long?

The tool generates a Permission Set, however if your common files are still large then I would suggest looking at creating a Permission Set Group and splitting the generated Permission Sets up inside it instead.


This post only covers the generation of Profile and Permission Set metadata files. The process for pulling and pushing Profile metadata from an org really depends on how well your devops is set up. SFDX tools can pull and deploy Profiles, but not always consistently or without issues.

Warning and Disclaimer

Always use this tool in sandboxes and never directly in Production! Try to design an improved Permission Set and Profile combination in a sandbox, and then test it extensively. You need personal confidence that the new Permission Set is going to work for your users.

If you do progress to Production, then also have a rollback strategy back to the old Profiles.

Finally, please ensure that you have all the training and knowledge required to pass the ‘Sharing & Visibility Designer’ Certification. You really need to understand how Profiles and Permission Sets work, and all potential impacts to your org. Unit Tests, manual testing and automated testing are rarely comprehensive enough to catch every scenario.


  1. Ensure Microsoft .Net 5.0 is installed on your system (Windows, Mac or Linux)
  2. Place all Profile metadata files in the \input directory (you can specify another directory for input and output if desired)
  3. Run this command (sample input files provided): dotnet SalesforceProfileRefactor.dll \input \output

MacOS Example:

Run the tool from the command line

You will see a new folder generated under ‘Output’ with the current date and time stamp. Inside is the common Permission Set as well as the newly reduced Profiles:

How the tool looks

Open the generated Report.csv in Excel (or similar), and you will see that it has moved 2 Apex Class access elements and 5 user Permission elements into the common Permission Set:

The output report in MS Excel

Changing the code:

The tool has been developed in Microsoft .Net 5.0 and should run on Windows, MacOS and Linux. You can change the code in community (free) versions of Microsoft Visual Studio 2019 for either Mac or Windows.


  1. I chose to do this in C# .Net because I already knew how to manipulate XML with it. Additionally Visual Studio is great at generating the XML schema wrapper class.
  2. At this early stage I doubt there are many people who need this utility. Even I am hoping to use it very rarely. Java would have been a more natural choice for the Salesforce community, but would have taken me longer.
  3. Code Optimization – I traded code speed for accuracy and reliability. I have a recent i7 running at 2.8 GHz, and the whole operation takes less than 7 seconds to process 600,000 Profile configuration lines, so I am not too concerned about it.
  4. GUI – I have some thoughts about extending this with a user interface to help visualize the process better, perhaps by comparing specific types of elements and attributes between files rather than all.
  5. Naming – Ensure that the file names are correct for either SFDX or older metadata deployments. I might include this as a command line option in the future.

Develop Salesforce DX with Visual Studio Code Remote

Scott Hanselman has a great post on the new remote development feature with Visual Studio Code, and I wondered how this would work with Salesforce DX.

Most of my learning for Salesforce DX has been on my personal MacBook, mostly because my company laptop is heavily restricted. I can run Visual Studio Code from my user space under Windows 10, but all the other developer tooling either can’t be installed or has issues.

The awesome thing about the new Remote feature is that I was able to install all the required development tooling to a Ubuntu 18 server that I provisioned on a Microsoft Azure subscription and can now control it with Visual Studio as the client.


Note: Please pay close attention to where I have specified Client machine (your laptop, whatever) versus Remote machine (the Linux server with all the dev tools)

Great! This is the basic setup, now the tricky bits so please also refer to the Microsoft SSH guidance:

  • Step 8: Client machine: Run the remote-ssh: Connect to Host command for your Linux server. Visual Studio Code will setup everything for you.
  • Step 9: Client machine: Install the Visual Studio Code Salesforce Extension on the remote machine (obviously not on the Client machine)
  • Step 10: Remote machine: Login to your remote machine with a remote desktop session and run a new instance of Visual Studio Code with Salesforce CI integration. Authorize an org with the ‘SFDX: Authorize an Org’ command
  • Step 11: Remote machine: Still in the Remote Desktop session, run the following command in the Visual Studio Code terminal: ‘sfdx force:org:display -u vscodeOrg --verbose’ (where vscodeOrg is the alias of your authorized org). Copy the value of “Sfdx Auth Url”, which will be in the format force:// (Danger: This value will let anyone log into your org without a username or password!)
  • Step 12: Remote machine: Save this value in a text file in an easy location. eg. /home/MyUser/src/login.txt
  • Step 13: Client machine: Run the following command in the Visual Studio Code terminal window: ‘sfdx force:auth:sfdxurl:store -f login.txt -s -a vscodeOrg’
  • Step 14: Client machine: (optional) Setup access to your source control repository in Visual Studio Code. Note that the source code will now be held on the remote machine, not your local machine.


And there you have it. You can now use your local machine as a development client, with all the tools needed for Salesforce DX running remotely.


Thoughts & observations:

  1. The configuration could have been changed to use Docker Containers locally. I didn’t go that route because I’m not convinced my company laptop has the right amount of system resources. It would also be very hard to get Docker Desktop installed.
  2. Debugging JavaScript with this configuration might be too hard without a browser on the remote machine. Local extensions are probably the way to go.
  3. Setup is non-trivial. Reading through my steps I can see that this is not something that someone who has only ever used Salesforce would want to configure. If you don’t want to jump through the hoops above then I’d advise waiting for a few months for things to advance some more.
  4. There is exciting potential for ‘code from anywhere’ with this model. It is probably unrealistic (for example) to have the full Salesforce DX development suite on an Apple iPad, but definitely more realistic to have the Visual Studio Code client running and connecting to a development server in the cloud.




Delete all Opportunity Records in Salesforce

This seems rather simple, but it was hard to find out how to delete all Opportunity records from your Salesforce organization.

The easiest way is to use Data Loader to extract all the record IDs and then bulk delete them. Unfortunately, this method requires me to leave my laptop open for a few hours while it deletes the 1 million records.

As an alternative, I adapted the Apex class below, which implements the Database.Batchable interface. This sets up the required number of individual batches to delete all the records (up to 2,000 at a time, the maximum scope for Database.executeBatch).


global class example_Batch implements Database.Batchable<sObject> {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id FROM Opportunity');
    }

    global void execute(Database.BatchableContext bc, List<sObject> scope) {
        // Delete this batch; allOrNone=false continues past individual record failures
        Database.delete(scope, false);
    }

    global void finish(Database.BatchableContext bc) {
        // Optionally notify yourself here when the job completes
    }
}

Kick it off from Anonymous Apex with the maximum scope size:

Database.executeBatch(new example_Batch(), 2000);

It then runs in the background freeing up my laptop.


CRM Data Migration Part 4: Connections with Informatica Cloud

CRM Data Migration Part 4: Connections with Informatica Cloud

This article will show how to connect source and target through the framework described earlier.

We need to define three connections: Legacy, Staging, and Salesforce.

Informatica Cloud is not completely straightforward to set up in your environment, but then again not so difficult that a day of infrastructure configuration wouldn’t fix it. You download a piece of software called ‘Secure Agent’ that runs the ETL process. It is worth noting that you can only have one Secure Agent for each ETL, so both the source and target data sources, as well as Informatica Cloud itself, need connectivity from the server running the Agent. I also found that quite a bit of time was needed to ensure all database connections were running smoothly.

Setting up a mapping configuration is quite straightforward. The tool will determine the sources and target schemas, and allow some transformations in the middle. Here I am joining the email and phone tables with the contacts before inserting into Staging.



This is a good example of a compromise where I am just taking the first email address. In practice you may want to consider how to deal with all those extra addresses.

(If you were doing this for real, then it would be better to create views within the source SQL Server and just export out of those. There would be less scope for an ETL configuration error.)
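Such a view might look something like the sketch below (the table and column names are entirely hypothetical, standing in for whatever your legacy schema uses):

```sql
-- Flatten contacts to one row each, taking only the first email and phone.
-- Taking "first" rows is the same deliberate compromise as in the mapping above.
CREATE VIEW dbo.vw_ContactStaging AS
SELECT
    c.ContactId,
    c.FirstName,
    c.LastName,
    e.EmailAddress,
    p.PhoneNumber
FROM dbo.Contact AS c
OUTER APPLY (
    SELECT TOP 1 EmailAddress
    FROM dbo.ContactEmail
    WHERE ContactId = c.ContactId
    ORDER BY EmailId
) AS e
OUTER APPLY (
    SELECT TOP 1 PhoneNumber
    FROM dbo.ContactPhone
    WHERE ContactId = c.ContactId
    ORDER BY PhoneId
) AS p;
```

The ETL then simply reads from the view, so any join or de-duplication mistakes live in one reviewable place in the database rather than scattered through the mapping configuration.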

You can then just keep running until you get the migration result that you are looking for. Ensure that the ‘Truncate’ option is turned on.


Check that everything is looking good in Staging (using a SQL query):


And then define a simple ETL for loading into Salesforce. Given that the Staging should be more or less how you want things in Salesforce this should be easy:


Run and check out the new Contacts in Salesforce:





Informatica Cloud and Windows Server 2012 R2 – CreateProcess error=14001

I just setup a new Windows 2012 R2 server with SQL Server 2012, and hooked it up to Salesforce with Informatica Cloud.

The configuration of the connections was very easy, as was the mapping of source to target fields.

However when I tried to run the mapping I got the following error:

“Internal error. The DTM process failed to start due to the following error: [CreateProcess error=14001, The application has failed to start because its side-by-side configuration is incorrect. Please see the application event log or use the command-line sxstrace.exe tool for more detail.]”

Apparently this is simply down to the Informatica Cloud Secure Agent not having the VC++ runtime binaries installed. This is rather confusing, given there isn’t very much information on this error message. Anyway, I found and installed the VC++ binaries and all worked well.


Informatica KB on the issue:


Link to the VC++ binaries that everyone seems to insist you don’t need:


Quick Overview of the Salesforce Marketing Cloud Audience Builder

Quick Overview of the Salesforce Marketing Cloud Audience Builder

The Audience Builder in the Salesforce Marketing Cloud is a way to create specific groups of people to send emails to, as well as further split that group up into smaller more targeted groups for different email content.

There is an official overview on the ExactTarget help pages (public), as well as an online course with video walkthroughs in the course catalog of the 3sixty website (private). The videos, although professionally produced, are a bit hard to follow owing to the user interface having been recently revamped. There is also a ‘certification’ exam, which is rather easy if you have been paying attention through the course. I can see the exam being useful if you need to verify whether someone actually knows how Audience Builder hangs together.

The first question is where to find it. It does have to be enabled by ExactTarget, and if it has been then you should see it under Data & Analytics (if you can’t, then you need to contact ExactTarget).

Where is audience builder

Before diving in, it is probably helpful to get familiar with some of the terms used in Audience Builder, such as Universe, Population and Contact. I made a quick cheat sheet to help remember them, although if your attributes and dimensions are already set up then you just have to use them rather than know how to define them.

AB Attributes and Dimensions

Once you have that, you must understand how Segments work on your resulting Audience. You should note how Priority between Segments works, as well as how Waterfall Suppression determines which Audience members enter which Segment. This will help avoid unexpected questions about why certain members were not included in an intended email mailing.

AB Segments

If you’re now comfortable with that, it is worth just looking at the Audience Builder screens to get an idea of how it works. The Overview screen shows a long list of all currently defined Audiences:

Audience Builder Overview

If you click any of these, the Summary tab has everything you would expect: a high-level breakdown of your Audience, including how much of the Population your Audience is using and its current Segmentation:

Audience Builder Summary

The Filter tab is also straightforward. It defines the rule or rules that make up your Audience before segmenting. In this case all males between certain ages:

Audience Builder Filter Definition

The Segment screen allows you to further split up your Audience. You don’t need to define any Segments, but you can edit or add new Segments here if you wish, and check the impact (in terms of Audience numbers) of any change. Again, Waterfall Suppression comes into effect here, with priority given to the lowest numbered Segments in the list.

Audience Builder Segments

Finally when you are ready, you can Preview or Publish your Audience on the Publish tab:

Audience Builder Publish

Once your Audience is published, then you are ready to start defining emails to send to them, as well as particular variations on each segment.

There is a good deal more detail if you drill down into each screen, however I feel that the online course will do it more justice. If you have access to the Salesforce Marketing Cloud then you should be able to see all the details in the associated 3sixty course.