Mac OS Taskbar Calendar

Calendars are important to consultants, and Windows shows a quick, easy-to-use one when you click on the time. On a Mac you don't have this, and switching to the Calendar view in Microsoft Outlook (or another app) can often distract you from the task at hand (i.e. you are writing an email suggesting to meet next Wednesday, but you want to be really sure that next Wednesday is the 9th).
Itsycal is a great utility that brings this to your Mac; it is free and simple.


Start and connect to your Azure VM from the Mac command line

Update, 11th October: OK, I didn't realize it is now very trivial to set a static IP and a DNS name in the Public IP Address configuration in Azure. Still, I think the commands in this article are interesting to illustrate how to interact with Azure.

Microsoft Azure is great, but I'm not really a big fan of the Azure portal. I find it rather messy to navigate, and it takes multiple clicks to find the function that you want.

My first simple scenario is to start and connect to one of my Azure developer machines. It costs a great deal to run it 24 hours a day, and I may only use it for between one and four hours each day. It can be started and stopped in the portal, but with the Azure CLI tools and some bash scripting I can start it and be connected quickly.

First let’s check the version installed:

azure --version


Then log in; this will ask you to open a verification page in your browser and enter the given code. You can then link your subscription through the browser.

azure login


If successful, all your MSDN subscriptions will be added (in this example I have three).


So now your environment is ready.

List all your available VMs with

azure vm list


and start one with

azure vm start <resource-group-name> <vm-name>


I was looking for a command that would download the remote desktop (RDP) file, but it seems there isn't one. No problem, because it is easy to create one from the VM's public IP address:
azure vm show <resource-group-name> <vm-name> | grep "Public IP address" | awk -F ":" '{print $3}'
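To see what that pipeline extracts, here is the same field split applied to an illustrative line of `azure vm show` output (the IP address and exact label spacing are assumptions for the example):

```shell
# Illustrative line as printed by the classic Azure CLI
line='data:    Public IP address              :13.82.100.7'
# Splitting on ':' puts the address itself in field 3
ip=$(echo "$line" | awk -F ":" '{print $3}')
echo "$ip"   # 13.82.100.7
```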

So with a little Bash scripting, you can derive the address and launch the remote session automatically. This assumes that you have installed Microsoft Remote Desktop for Mac. Just create a new bash script file with the following contents, changing the variables to match your Azure and local configuration:


#!/bin/bash
echo "RDP Generator"

#Change to your Azure resource group (placeholder value)
VarResourceGroupName="MyResourceGroup"
#Change to your Azure VM name (placeholder value)
VarVMName="MyDevVM"
#Change to the user name on that VM (placeholder value)
VarUserName="azureuser"
#Set the location of the temp RDP file
VarRDPLocation="/tmp/azurevm.rdp"
#This should be the default location for the Microsoft Remote Desktop install
VarRemoteDesktopLocation="/Applications/Microsoft Remote Desktop.app"

#Pull the public IP address out of the azure vm show output
VarResult1=$(azure vm show $VarResourceGroupName $VarVMName | grep "Public IP address")
VarResult2="$(echo $VarResult1 | cut -f3 -d ':')"

VarBegin="full address:s:"

echo "Public IP: $VarResult2"

touch "$VarRDPLocation"
#Overwrite any existing file with the connection settings
echo $VarBegin$VarResult2 > "$VarRDPLocation"
echo "prompt for credentials:i:1" >> "$VarRDPLocation"
echo "administrative session:i:1" >> "$VarRDPLocation"
echo "username:s:"$VarUserName >> "$VarRDPLocation"

#Launch remote desktop session with new file
open -a "$VarRemoteDesktopLocation" "$VarRDPLocation"

Just run your new script file and it will launch the Microsoft RDP client automatically.
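For reference, the generated .rdp file ends up looking something like this (the IP address and user name are illustrative):

```
full address:s:13.82.100.7
prompt for credentials:i:1
administrative session:i:1
username:s:azureuser
```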


Create Azure ASP.NET applications on a Mac

I recently started to get back into ASP.NET after a long absence and was surprised by the 'reboot' of ASP.NET Core (formerly known as ASP.NET 5). It is wrong to think of it as the next big version after ASP.NET 4, since it is a completely different beast. It is smaller and (for now) far less featured than ASP.NET 4, but it is highly portable across Windows, Azure, Mac, Linux, etc.

If I were architecting a new project then I might be tempted to stay with the safety of the ASP.NET 4 platform, since it is a mature product with great tooling that is still being actively developed.

Luckily this was just for fun, and I wanted to try some things out with Azure services. I shunned Visual Studio 2015 for the simplicity of Visual Studio Code on the Mac. The two products are not comparable for the most part, but since I was relearning ASP.NET, doing so in a new, simple way was very compelling.

Microsoft have a tutorial for this.

The main issue was installation. Installing the .NET Core SDK on the Mac required a few layers of separate dependencies (Yeoman, Homebrew, etc.), which didn't seem to work at first. After some Googling of the error messages I had everything working in about 20 minutes.


Building the scaffolding of a simple ASP.NET page was trivial, and running it on my local Mac was extremely easy.


I wasn't looking forward to deploying it to Azure. I guess my history with ASP.NET made me expect that deploying to a new site would be painful. In fact, once you set up Git in Azure and locally, the push is really easy and worked as expected first time!


So basically I could build an ASP.NET application on my MacBook, try it out, and then publish it to Azure with no Microsoft Windows or Visual Studio required in the process at all.


I'm definitely a convert to this new way of working now. If you are considering learning .NET Core then I'd really recommend ditching Visual Studio, even if you have a Windows environment. You might miss out on some graphical familiarity, but it is easy to start from the basics.












Delete all Opportunity Records in Salesforce

This seems rather simple, but it was hard to find out how to delete all Opportunity records from your Salesforce organization.

The main reason was that the easiest way is to use Data Loader to extract all the record IDs and then bulk delete them. Unfortunately this method requires me to leave my laptop open for a few hours whilst it deletes the 1 million records.

As an alternative I adapted this Apex class, which implements the Database.Batchable interface. It then sets up the required number of individual batches to delete all the records (up to 2,000 at a time, the Batch Apex maximum scope size).


global class example_Batch implements Database.Batchable<sObject> {

    global Database.QueryLocator start(Database.BatchableContext BC) {
        String query = 'SELECT Id FROM Opportunity';
        return Database.getQueryLocator(query);
    }

    global void execute(Database.BatchableContext BC, List<sObject> scope) {
        //Delete the current batch of records
        delete scope;
    }

    global void finish(Database.BatchableContext BC) {
        //Nothing further to do on completion
    }
}
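The batch is then kicked off from Execute Anonymous in the Developer Console. The scope-size argument is optional; 2,000 is the maximum Batch Apex allows per execute call:

```
//Launch the batch job; the returned Id can be used to monitor the AsyncApexJob
Id jobId = Database.executeBatch(new example_Batch(), 2000);
```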
It then runs in the background freeing up my laptop.


CRM Data Migration Part 4: Connections with Informatica Cloud


This article will show how to connect source and target through the framework described earlier.

We need to define three connections: Legacy, Staging and Salesforce.

Informatica Cloud is not completely straightforward to set up in your environment, but then again not so difficult that a day of infrastructure configuration wouldn't fix it. You download a piece of software called 'Secure Agent' that will run the ETL process. It is worth noting that you can only have one Secure Agent for each ETL, so the source and target data sources, as well as Informatica Cloud itself, need connectivity from the server running the Agent. I also found that quite a bit of time was needed to ensure all database connections were running smoothly.

Setting up a mapping configuration is quite straightforward. The tool will determine the sources and target schemas, and allow some transformations in the middle. Here I am joining the email and phone tables with the contacts before inserting into Staging.



This is a good example of a compromise where I am just taking the first email address. In practice you may want to consider how to deal with all those extra addresses.

(If you were doing this for real, then it would be better to create views within the source SQL Server and just export out of those. There would be less scope for an ETL configuration error.)

You can then just keep running until you get the migration result that you are looking for. Ensure that the ‘Truncate’ option is turned on.


Check that everything looks good in Staging using a SQL query.


And then define a simple ETL for loading into Salesforce. Given that the Staging data should already be more or less how you want it in Salesforce, this should be easy.


Run it, and check out the new Contacts in Salesforce.





CRM Data Migration Part 3: Framework


In 2016 most new CRM installations are Cloud based, including Salesforce, MS Dynamics Online and Oracle Sales Cloud.

There are many options for tools that will move data into both the Staging area and the CRM. For the purpose of this article I will use Informatica Cloud.


Performance depends on a lot of variables such as the speed of your internet pipes, servers, types of cloud systems used, etc.

It is worth noting that a considerable data migration may have a significant impact on your framework. For example, you may need to move a copy of your data to an area that can be reached through your DMZ. How long will it take to copy 10k records over to your staging environment? How long will it take to copy those fixed records into your new Cloud platform? How long can you take your CRM system offline during the migration?

On-Premise Approach

You can set up your staging environment and ETL tools on-premise. This has the advantage that you have some control over environmental variables. Remember that you may not have total control, which could prove significant. What if there are other ETL processes running overnight that use up all the local bandwidth you were counting on for your migration?

Cloud Approach

Although you lack some of the direct control that on-premise offers, you do nevertheless have a more reliable environment that is dedicated to you and segregated from other services running on the cloud platform.

To keep with the Cloud theme I will also use a Microsoft Windows Server with SQL Server, running as a Microsoft Azure VM, for the Staging database. Any other cloud (or on-premise) database solution, such as one on AWS, would work just as well, as would a solution leveraging the Azure SQL Database service.


Given that you can get free trials of most Cloud software, you can actually set up a 'proof of concept' and see how it works for you quite quickly. In my example this applies to Microsoft Azure, Salesforce and Informatica Cloud.

One benefit of the Cloud based approach is that you can provision your framework just for the duration of the migration. When the migration has been completed and signed-off then you can just turn it off and stop paying for it.



CRM Data Migration Part 2: Scenario


In order to illustrate how a migration will work, I have decided to migrate my (hypothetical) sports business from a custom solution based on Microsoft SQL Server to Salesforce.

Luckily Microsoft provide a good example of a legacy system: AdventureWorks! You can download the database, attach it to whatever SQL Server instance you have (even the free Express edition) and treat that as your legacy source.

The great thing about AdventureWorks is that it illustrates very well how differently two CRM systems can treat a customer. The 'Person' data model in AdventureWorks is what we will have to migrate to Salesforce's 'Contact' object.


We can already see that a good amount of the data structure is not required. We don't have to migrate the data in 'StateProvince' and 'CountryRegion' because it already exists within Salesforce (although the values will still need to be matched). The 'Password' table should almost certainly not be migrated, although it leaves a tricky question about how to bring that (presumably) web site security login functionality to the new system. Many aspects are highly normalized (limitless phone numbers, email addresses, etc.).

Basically a simple mapping will not be possible. The flexibility of the relational database will make your migration task difficult. It is important to determine what data is truly important to you.