
Data Quality Analysis in Salesforce files with MuleSoft Anypoint

GitHub Repository:

I have created a solution that can be added to any record page (Case, Account, whatever) to report the data quality / encryption / password-protection status of all the attachments.

It comprises a Salesforce LWC Lightning component and a MuleSoft Anypoint 4.3 runtime service. You extend the ContentVersion standard object, then drag and drop the component onto the page in the page designer.

The interactions of my solution are detailed in the Sequence Diagram below. The data quality process is initiated by the Salesforce LWC component calling Anypoint with a list of DocumentIds to check:

The MuleSoft Anypoint 4.3 Flow takes a list of Document IDs and starts processing them. Currently it uses a For loop to check each file:

You don’t have to call this process from Salesforce. Just call the service from a REST client or your browser: https://<YOUR HOSTING URL>/FileCheck?"GUID1","GUID2","GUID3"

What does this version detect?

  • PDF – Password protection
  • Microsoft Word (doc and docx) – Password protection
  • Zip file password encryption
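
As a taste of how such a check works, here is a standalone sketch (not the project's actual class; the names are illustrative) that inspects a ZIP archive's first local file header using only the standard Java libraries: bit 0 of the general-purpose bit flag signals encryption. It builds a small unencrypted archive in memory to test against.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZipEncryptionCheck {

    // Returns true if the first entry's general-purpose bit flag has the
    // encryption bit (bit 0) set. Only the local file header is inspected.
    static boolean isZipEncrypted(InputStream in) throws IOException {
        DataInputStream data = new DataInputStream(in);
        byte[] header = new byte[8];
        data.readFully(header);
        // Local file header starts with the signature PK\3\4
        if (header[0] != 'P' || header[1] != 'K') {
            throw new IOException("Not a ZIP archive");
        }
        // Bytes 6-7 hold the general-purpose bit flag (little-endian)
        int flags = (header[6] & 0xFF) | ((header[7] & 0xFF) << 8);
        return (flags & 0x1) != 0;
    }

    public static void main(String[] args) throws IOException {
        // Build a small unencrypted ZIP in memory for demonstration
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        try (ZipOutputStream zip = new ZipOutputStream(buffer)) {
            zip.putNextEntry(new ZipEntry("hello.txt"));
            zip.write("hello".getBytes());
            zip.closeEntry();
        }
        boolean encrypted = isZipEncrypted(new ByteArrayInputStream(buffer.toByteArray()));
        System.out.println("encrypted=" + encrypted);
    }
}
```

A password-protected archive created by a desktop ZIP tool would set that bit and return true.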

What may future versions include?

  • Microsoft Excel password protection
  • Microsoft PowerPoint password protection
  • Image validation
  • Corrupt files

This proof of concept introduces a framework for working with Salesforce files in MuleSoft. You can analyze your files with any Java code you like. Do you need to scan all Word documents for ‘Copyright of Acme’? Just write another Java class.
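
As a sketch of what such a class might look like (the interface and names here are illustrative assumptions, not the repository's actual API), each analysis can be a small Java class behind a common interface:

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

// Hypothetical plug-in interface: one small class per file analysis.
interface FileCheck {
    String name();
    boolean applies(String fileName);
    boolean passes(byte[] content);
}

// Example check: fail any text file containing a given phrase.
class PhraseCheck implements FileCheck {
    private final String phrase;
    PhraseCheck(String phrase) { this.phrase = phrase; }
    public String name() { return "Contains phrase: " + phrase; }
    public boolean applies(String fileName) { return fileName.endsWith(".txt"); }
    public boolean passes(byte[] content) {
        return !new String(content, StandardCharsets.UTF_8).contains(phrase);
    }
}

public class FileCheckDemo {
    public static void main(String[] args) {
        List<FileCheck> checks = new ArrayList<>();
        checks.add(new PhraseCheck("Copyright of Acme"));

        byte[] file = "Draft. Copyright of Acme.".getBytes(StandardCharsets.UTF_8);
        for (FileCheck check : checks) {
            if (check.applies("notes.txt")) {
                System.out.println(check.name() + " -> "
                        + (check.passes(file) ? "pass" : "fail"));
            }
        }
    }
}
```

A new detection is then just another implementation of the interface, registered in the list.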


  • Question: Is this Production ready?
  • Answer: No. This is currently at a ‘working proof of concept’ stage, but needs a lot more in terms of performance management, error handling and testing. You should only use this in your Salesforce developer sandboxes.

Problem Background

The Salesforce CRM platform does not yet have a coding capability to read and analyze large files. For example, if a user were to password-protect a large PDF file, it would require a lot of inventive Apex coding from scratch to determine that the file was password protected and unusable.

Java, on the other hand, can do this kind of file analysis and take advantage of the strong, capable open source libraries available for a variety of file formats. MuleSoft Anypoint is a popular integration product, owned by Salesforce and used by many Salesforce orgs, that can be further extended with Java code.

There are other options, such as licensing Salesforce Heroku, or perhaps Serverless Functions once they are formally available. You could also create web services hosted on Microsoft Azure, AWS, or anywhere else. Many Salesforce customers have, however, invested in MuleSoft, which has strong Salesforce support out of the box. This design is not a compelling reason by itself to procure MuleSoft if you don’t have it already, but it is interesting if you already have MuleSoft and some spare capacity on it.


  1. All this can be run on free trial services. Salesforce and MuleSoft have signup pages.
  2. You can run the MuleSoft Anypoint service locally on your machine, however Salesforce won’t be able to connect to it (or at least you will have a hard job making the connection). You will need to deploy to Anypoint Cloud with an SSL certificate for a full end-to-end test.
  3. I’m running this on a medium Anypoint vCore in the cloud (the highest available on the trial service). Performance seems fine, but there has been no real performance testing yet. There is probably a limit to how much you can throw at this service before it starts failing.
  4. You will need to add your MuleSoft service’s URL to CSP Trusted Sites in Salesforce Setup.
  5. The web service is called from an LWC component directly. This can be secured to the calling host, but in the next version I would probably put it behind Apex so that it is called from the Salesforce org rather than directly from the browser.
  6. The ability to detect whether a document is password protected was actually not as easy as I had imagined. Open source libraries are great, but they really lack a simple isDocumentEncrypted() function.
    • PDF password detection comes courtesy of Apache PDFBox.
    • ZIP password detection is done with the standard Java libraries.
    • Microsoft document password detection uses Apache POI.
    • Other file types and other kinds of file quality detection can be added to the MuleSoft solution just by adding a new Java class and libraries.
  7. The next step is to extend this to detect issues when a user uploads a file through the UI, for example rejecting an upload of an encrypted PDF.
  8. I’m not so experienced with Anypoint, so my Flow is rather long. I’ll also look to break it up in anticipation of other services that will come along and reuse common parts.
  9. I should also create a ‘how to set up’ page with detailed instructions. Please ping me if that is of interest.
Develop Salesforce DX with Visual Studio Code Remote

Scott Hanselman has a great post on the new remote development feature with Visual Studio Code, and I wondered how this would work with Salesforce DX.

Most of my learning for Salesforce DX has been on my personal MacBook, mostly because my company laptop is heavily restricted. I can run Visual Studio Code from my user space under Windows 10, but all the other developer tooling can’t be installed / has issues.

The awesome thing about the new Remote feature is that I was able to install all the required development tooling on an Ubuntu 18 server that I provisioned on a Microsoft Azure subscription, and can now control it with Visual Studio Code as the client.


Note: Please pay close attention to where I have specified Client machine (your laptop, whatever) and Remote machine (Linux server with all dev tools).

Great! This is the basic setup, now the tricky bits so please also refer to the Microsoft SSH guidance:

  • Step 8: Client machine: Run the remote-ssh: Connect to Host command for your Linux server. Visual Studio Code will setup everything for you.
  • Step 9: Client machine: Install the Visual Studio Code Salesforce Extension on the remote machine (obviously not on the Client machine)
  • Step 10: Remote machine: Log in to your remote machine with a remote desktop session and run a new instance of Visual Studio Code with the Salesforce CLI integration. Authorize an org with the ‘SFDX: Authorize an Org’ command
  • Step 11: Remote machine: Still in the remote desktop session, run the following command in the Visual Studio Code terminal: ‘sfdx force:org:display -u vscodeOrg --verbose’ (where vscodeOrg is the name of your authorized org). Copy the value of “Sfdx Auth Url”, which will be in the format force:// (Danger: this value will let anyone log into your org without a username or password!)
  • Step 12: Remote machine: Save this value in a text file in an easy location. eg. /home/MyUser/src/login.txt
  • Step 13: Client machine: Run the following command in the Visual Studio Code terminal window: ‘sfdx force:auth:sfdxurl:store -f login.txt -s -a vscodeOrg’
  • Step 14: Client machine: (optional) Setup access to your source control repository in Visual Studio Code. Note that the source code will now be held on the remote machine, not your local machine.


And there you have it. You can now use your local machine as a development client, with all the tools needed for Salesforce DX running remotely.


Thoughts & observations:

  1. The configuration could have been changed to use Docker Containers locally. I didn’t go that route because I’m not convinced my company laptop has the right amount of system resources. It would also be very hard to get Docker Desktop installed.
  2. Debugging JavaScript with this configuration might be too hard without a browser on the remote machine. Local extensions are probably the way to go.
  3. Setup is non-trivial. Reading through my steps I can see that this is not something that someone who has only ever used Salesforce would want to configure. If you don’t want to jump through the hoops above then I’d advise waiting for a few months for things to advance some more.
  4. There is exciting potential for ‘code from anywhere’ with this model. It is probably unrealistic (for example) to have the full Salesforce DX development suite on an Apple iPad, but definitely more realistic to have the Visual Studio Code client running and connecting to a development server in the cloud.

Learning Vlocity

Vlocity is a new industry accelerator on top of Salesforce, and it provides a great deal of functionality that will significantly reduce your customization effort within Salesforce. You can design agent workflows, user interfaces and interactions with configurable scripts, with more simplicity than coding in Salesforce Apex or using Flows. Additionally, they provide processes for each industry to download, so you can just browse the library for processes that allow you to (for example):

  • apply for Child Care benefit (Government)
  • change a SIM Card (Telecom)
  • get a travel policy quote (Insurance)

In addition there is a CPQ (Configure Price Quote) engine that ties in nicely with the processes above.

Salesforce is an awesomely open platform in terms of learning. Almost all the functionality you need to develop enterprise-scale applications is available in a development environment that anyone can provision within a minute, and most of the learning material is available on the learning site (although, just to be clear: while learning is free, using it in production does require licensing!).

Vlocity is for the moment behind walls, and you will need either a partner agreement or to partake in a training course to get experience. There is a steep learning curve, since you need to understand how the different components work together (DataRaptors, OmniScripts and Cards) before you can do something useful.

I found that the in-person training course is good and gets through a lot of material in 4 days, but you should go straight back afterwards and work through it all again, building your own ideas, before you truly ‘get’ it. You can follow the steps in the exercise book and still not have a good feel for what is actually happening. For example, I wrote a ‘hello world’ Card to understand the user interface better, a simple DataRaptor that extracted Contact fields to understand the data model better, and finally my own OmniScript that solicited customer feedback to see how it all tied together.

And do bear in mind that after the steep learning curve, it will be a much easier way to implement complex process functionality in Salesforce!

Vlocity Cards “Hello World”

I’ve recently completed a Vlocity Administrator and Developer Essentials course, and am preparing for the final exam.

There is a lot to say about the Vlocity industry accelerators. The official documentation and training certainly throw you in at the deep end. Here are some blog posts that will break each technical component up into ‘Hello World’ pieces.

What are Vlocity Cards?

Vlocity Cards are visual components that work in the larger Vlocity framework. They can display data and actions. The interface can be customized with HTML, CSS and Javascript (AngularJS).


Prerequisites:

  • Salesforce org
  • Vlocity app installed (not freely available unfortunately)
  • Knowledge of Salesforce Administration



Goal: Display ‘Hello World’ on a Vlocity Card on a Contact record page.


Step 1: Create Vlocity Layout & Card

We will create a layout, card and state with a really simple structure (In real life you would likely want lots of cards and lots of states to represent your data).

[Screenshot: layout, card and state structure]

We will create a simple layout that will query basic details from the Contact record and display them on a simple card. It will look like this:


Note about the data source: I used SOQL to make this as simple as possible. A ‘Hello World’ DataRaptor blog post will follow soon to go into the best way to acquire data for your cards.

Activate the Card and click the ‘Preview’ tab, and you should see a (badly) formatted card with a single ‘Call’ action:

[Screenshot: card preview with the ‘Call’ action]


Step 2: Create a new Lightning Page for Contact

We want to see this card in action, so we create a new Lightning Page, and choose the ‘Vlocity Three Columns’ Template:

[Screenshots: creating the Lightning Page with the ‘Vlocity Three Columns’ template]

Activate the page (set it as org default if you are just playing around in a sandbox) and then view a random Contact:

[Screenshot: the card on a Contact record page]

Obviously the formatting is not right yet, so for the next blog post we will get into how to present this correctly.

Salesforce DX CLI Cheat Sheet

Salesforce Trailhead is good, although its verbose style makes it hard to use as a quick reference. This list is mainly for my own benefit, to quickly recall the relevant commands, but it may be of wider use as well:

Salesforce DX Trailheads:


  • Update the CLI: sfdx update
  • Set the dev hub instance: sfdx force:auth:web:login --setdefaultdevhubusername --setalias my-hub-org
  • Create a new project: sfdx force:project:create -n myAwesomeNewProject
  • Create a scratch org: sfdx force:org:create -s -f config/project-scratch-def.json -a GeoAppScratch
  • Open the scratch org: sfdx force:org:open
  • Pull the configurations & customizations from your scratch org to your local project: sfdx force:source:pull
  • Pull data from your scratch org to your local project: sfdx force:data:tree:export -q "SELECT Name, Location__Latitude__s, Location__Longitude__s FROM Account WHERE Location__Latitude__s != NULL AND Location__Longitude__s != NULL" -d ./data
  • Push your local project back to your scratch org: sfdx force:source:push
  • Create a new Lightning component in your local project: sfdx force:lightning:component:create -n AccountListItem -d force-app/main/default/aura
  • Create a new Lightning event in your local project: sfdx force:lightning:event:create -n AccountsLoaded -d force-app/main/default/aura
  • Create another scratch org: sfdx force:org:create -f config/project-scratch-def.json -a GeoTestOrg
  • Push the local project to the new scratch org: sfdx force:source:push -u GeoTestOrg
  • Assign a permission set in the new scratch org: sfdx force:user:permset:assign -n Geolocation -u GeoTestOrg
  • Upload data to the new scratch org: sfdx force:data:tree:import -f data/Account.json -u GeoTestOrg


Manage Dev Hub and packaging

  • Set the dev hub user name: sfdx force:config:set
  • Create a new scratch org: sfdx force:org:create -f config/project-scratch-def.json -a TempUnmanaged
  • Generate a new password for the scratch org: sfdx force:user:password:generate -u TempUnmanaged
  • Display scratch org details: sfdx force:org:display -u TempUnmanaged
  • Pull the source: sfdx force:source:pull -u TempUnmanaged
  • Pull in a package: sfdx force:mdapi:retrieve -s -r ./mdapipackage -p DreamInvest -u TempUnmanaged -w 10
  • Open the resulting package: unzip -d
  • Convert to metadata: sfdx force:mdapi:convert -r mdapipackage/
  • Delete the scratch org: sfdx force:org:delete -u TempUnmanaged
  • Create a new scratch org: sfdx force:org:create -s -f config/project-scratch-def.json
  • Push the local project to the new scratch org: sfdx force:source:push
  • Assign a permission set in the new org: sfdx force:user:permset:assign -n DreamInvest
  • Convert to metadata: sfdx force:source:convert -d mdapioutput/
  • List all orgs in the dev hub: sfdx force:org:list
  • Deploy to a new scratch org: sfdx force:mdapi:deploy -d mdapioutput/ -u MyTPO -w 100

Microsoft Dynamics 365 Employee Self-Service Portal Access with Azure AD Integrated Apps settings

Microsoft Dynamics 365 Portal is a great new addition, having matured greatly from the ADX acquisition, and it is possible to set up in a few clicks. It doesn’t offer a massive amount of extensibility, but it does the job well (i.e. let a Contact log in, raise a case, check case progress, browse a knowledge base, and add simple access to other entities).

I did face an interesting problem in a customer scenario where I set up an Employee Self-Service Portal and found that users simply could not log in.

[Screenshot: portal login error]

Google didn’t provide any hits for the phrase “Microsoft CRM Portals needs permission to access resources in your organization that only an Admin can grant”, so it seemed a good candidate for an article!

After some investigation, we found that an earlier Security Audit had recommended turning off the ‘Integrated Apps’ setting in Azure AD. This meant that users could not consent to have the Portal read their AD profile:

[Screenshots: the ‘Integrated Apps’ setting in Azure AD]

The immediate fix is simply to enable this setting again. I also had to restart my Portal to get this working.

[Screenshot: restarting the portal]



Mac OS Taskbar Calendar

Calendars are important to consultants, and Windows has a quick and easy one available when you click on the time. When using a Mac you don’t have this, and going into the Calendar view in Microsoft Outlook (or another app) can often distract you from the task at hand (i.e. you are writing an email suggesting to meet next Wednesday, but you want to be really sure that next Wednesday is the 9th).

Itsycal is a great utility that does this for your Mac; it is free and simple:


Create Azure ASP.NET applications on a Mac

I recently started to get back into ASP.NET after a long absence and was surprised by the ‘reboot’ of ASP.NET Core (formerly known as ASP.NET 5). It is wrong to think of it as the next big version after ASP.NET 4, since it is a completely different beast. It is smaller and (for now) far less featured than ASP.NET 4, but it is highly portable across Windows, Azure, Mac, Linux, etc.

If I were architecting a new project then I might be tempted to stay with the safety of the ASP.NET 4 platform, since it is a mature product with great tooling that is still being actively developed.

Luckily this was just for fun, and I wanted to try some things out with Azure services. I shunned Visual Studio 2015 for the simplicity of Visual Studio Code on the Mac. The two products are not comparable for the most part, but since I was relearning ASP.NET, doing so in a new, simple way was very compelling.

Microsoft have a tutorial for this.

The main issue was installation. Installing the .NET Core SDK on the Mac required installing a few additional layers of separate dependencies (Yeoman, Homebrew, etc.), which didn’t seem to work at first. After some Googling of error messages I got everything working in about 20 minutes.

[Screenshot: .NET Core SDK installation]

Building the scaffolding of a simple ASP.NET page was trivial, and running it on my local Mac was extremely easy.

[Screenshots: the scaffolded ASP.NET page running locally]

I wasn’t looking forward to deploying it to Azure. I guess my history with ASP.NET made me expect that deploying to a new site would be painful. In fact, once you set up Git in Azure and locally, the push is really easy and worked as expected the first time!


So basically I could build an ASP.NET application on my MacBook, try it out, and then publish it to Azure with no Microsoft Windows or Visual Studio required in the process at all.


I’m definitely a convert to this new way of working now. If you are considering learning .NET Core then I’d really recommend ditching Visual Studio, even if you have a Windows environment. You might miss out on some graphical familiarity, but it is easy to start from the basics.

Delete all Opportunity Records in Salesforce

This seems rather simple, but it was surprisingly hard to find out how to delete all Opportunity records from your Salesforce organization.

The main reason is that the easiest way is to use Data Loader to extract all the record IDs and then bulk delete them. Unfortunately, this method requires me to leave my laptop open for a few hours whilst it deletes the 1 million records.

As an alternative, I adapted this Apex class, which implements the Database.Batchable interface. It then sets up the required number of individual batches to delete all the records (5,000 at a time).


global class example_Batch implements Database.Batchable<sObject> {

    global Database.QueryLocator start(Database.BatchableContext BC) {
        // Select the Ids of all Opportunity records for deletion
        return Database.getQueryLocator('SELECT Id FROM Opportunity');
    }

    global void execute(Database.BatchableContext BC, List<sObject> scope) {
        // Delete the current batch of records
        delete scope;
    }

    global void finish(Database.BatchableContext BC) {
        // Nothing further to do once all batches have completed
    }
}

Kick it off from the Developer Console with Database.executeBatch(new example_Batch());
It then runs in the background freeing up my laptop.

[Screenshot: the batch job running in Setup]

CRM Data Migration Part 3: Framework

In 2016 most new CRM installations are Cloud-based, including MS Dynamics Online and Oracle Sales Cloud.

There are many options for tools that will move data into both the Staging area and the CRM. For the purpose of this article I will use Informatica Cloud.


Performance depends on a lot of variables such as the speed of your internet pipes, servers, types of cloud systems used, etc.

It is worth noting that a considerable data migration may place a significant load on your framework. For example, you may need to move a copy of your data to an area that can be reached through your DMZ. How long will it take to copy 10k records over to your staging environment? How long will it take to copy those fixed records into your new Cloud platform? How long can you take your CRM system offline during the migration?
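
To make these questions concrete, here is a quick back-of-the-envelope calculation; the record count and throughput figures are illustrative assumptions, not measurements.

```java
public class MigrationEstimate {
    public static void main(String[] args) {
        // Illustrative assumptions - replace with your own measured figures
        long records = 1_000_000;        // records to migrate
        double recordsPerSecond = 50.0;  // sustained load throughput to the target

        long seconds = (long) (records / recordsPerSecond);
        long minutes = seconds / 60;
        System.out.println("Estimated migration time: " + seconds
                + " seconds (" + minutes + " minutes)");
    }
}
```

Running the same numbers for each hop (source to staging, staging to target) quickly shows whether your cutover window is realistic.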

On-Premise Approach

You can set up your staging environment and ETL tools on-premise. This has the advantage that you have some control over environmental variables. Remember that you may not have total control, which could prove significant. What if there are other ETL processes running overnight that use up all the local bandwidth you were counting on for your migration?

Cloud Approach

Although you lack some of the direct control that on-premise offers, you do nevertheless have a more reliable environment that is dedicated to you and segregated from other services running on the cloud platform.

To keep with the Cloud theme, I will also use a Microsoft Windows Server with SQL Server on a Microsoft Azure VM for the Staging database. Any other cloud (or on-premise) database solution, such as one on AWS, would work just as well, as would a solution leveraging the Azure SQL Database Service.


Given that you can get free trials of most Cloud software, you can actually set up a ‘proof of concept’ and see how it works for you quite quickly. In my example this applies to Microsoft Azure, Salesforce and Informatica Cloud.

One benefit of the Cloud-based approach is that you can provision your framework just for the duration of the migration. When the migration has been completed and signed off, you can just turn it off and stop paying for it.