Start and connect to your Azure VM from the Mac command line

Update, 11th October: OK, I didn't realize it is now very trivial to set a static IP and a DNS name in the Azure Public IP Address configuration. Still, I think the commands in this article are interesting to illustrate how to interact with Azure.

Microsoft Azure is great, but I'm not really a big fan of the Azure portal. I find it rather messy to navigate, and it takes multiple clicks to find the function that you want.

My first simple scenario is to start and connect to one of my Azure developer machines. It costs a great deal to run one 24 hours a day, and I may only use it for between one and four hours each day. It can be started and stopped from the portal, but with the Azure CLI tools and a little bash scripting I can start it and be connected quickly.

First let’s check the version installed:

azure --version


Then log in, which will ask you to open https://aka.ms/devicelogin and enter the given code. You can then link your subscription through the browser.

azure login


If successful, all of your MSDN subscriptions will be added (in this example I have three).


So now your environment is ready.
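One thing that caught me out: the CLI has two modes, and VMs created through the newer Resource Manager model only show up in ARM mode. If the vm commands below come back empty, switching modes may be the fix (this is from memory, so check azure help config if it complains):

azure config mode arm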

List all of your available VMs with

azure vm list


and start one with

azure vm start -g <resource-group> -n <vm-name>


I was looking for a command that would download a Remote Desktop (RDP) file, but it seems there isn't one. No problem, because it is easy to create one from the VM's public IP address:

azure vm show <resource-group> <vm-name> | grep "Public IP address" | awk -F ":" '{print $3}'

So with a little bash scripting, you can derive the connection details and launch the remote session automatically. This assumes that you have installed Microsoft Remote Desktop for Mac. Just create a new bash script file with the following contents, changing the variables to match your Azure and local configuration:

#!/bin/bash

echo "RDP Generator"
#Change to your Azure resource group
VarResourceGroupName="DEFAULT-STORAGE-SOUTHEASTASIA"

#Change to your Azure VM name
VarVMName="DevBox1"

#Change to the user name on that VM
VarUserName="andrewwhitten"

#Set the location of the temp RDP file
VarRDPLocation="/Users/Andrew/temp.rdp"

#This should be the default location for Microsoft RDP install
VarRemoteDesktopLocation="/Applications/Microsoft Remote Desktop.app"

#Get the public IP address from the VM details (third colon-separated field of the 'Public IP address' line)
VarResult1=$(azure vm show "$VarResourceGroupName" "$VarVMName" | grep "Public IP address")
VarResult2="$(echo $VarResult1 | cut -f3 -d ':')"

VarBegin="full address:s:"
VarEnd=":3389"

echo $VarRemoteDesktopLocation

touch $VarRDPLocation
#overwrite existing file
echo $VarBegin$VarResult2$VarEnd > $VarRDPLocation
echo "prompt for credentials:i:1" >> $VarRDPLocation
echo "administrative session:i:1" >> $VarRDPLocation
echo "username:s:"$VarUserName >> $VarRDPLocation

#Launch remote desktop session with new file
open -a "$VarRemoteDesktopLocation" $VarRDPLocation

Just run your new script file and it will launch the Microsoft RDP client automatically.
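When I am finished for the day, the reverse is just as easy. As far as I recall, a plain stop leaves the VM allocated (and still billing), so deallocating is what actually saves the money:

azure vm deallocate <resource-group> <vm-name>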


Create Azure ASP.NET applications on a Mac

I recently started to get back into ASP.NET after a long absence and was surprised by the 'reboot' of ASP.NET Core (formerly known as ASP.NET 5). It is wrong to think of it as the next big version after ASP.NET 4, since it is a completely different beast. It is smaller and (for now) far less featured than ASP.NET 4, but it is highly portable across Windows, Azure, Mac, Linux, etc.

If I were architecting a new project I might be tempted to stay with the safety of the ASP.NET 4 platform, since it is a mature product with great tooling that is still being actively developed.

Luckily this was just for fun, and I wanted to try some things out with Azure services. I shunned Visual Studio 2015 for the simplicity of Visual Studio Code on the Mac. The two products are not really comparable for the most part, but since I was relearning ASP.NET, doing so in a new, simple way was very compelling.

Microsoft have a tutorial for this.

The main issue was installation. Installing the .NET Core SDK on the Mac required a few additional layers of separate dependencies (Homebrew, Yeoman, etc.), which didn't work first time. After some Googling of error messages I had everything working in about 20 minutes.
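For what it's worth, the sequence that eventually worked for me looked roughly like the commands below. Treat this as a sketch rather than the official steps; the OpenSSL symlinks and generator names came from the tutorial and error messages of the time, so yours may differ:

# OpenSSL workaround the .NET Core 1.0 SDK needed on the Mac (via Homebrew)
brew update
brew install openssl
ln -s /usr/local/opt/openssl/lib/libcrypto.1.0.0.dylib /usr/local/lib/
ln -s /usr/local/opt/openssl/lib/libssl.1.0.0.dylib /usr/local/lib/

# Yeoman and the ASP.NET generator for scaffolding (needs Node.js/npm)
npm install -g yo generator-aspnet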


Building the scaffolding of a simple ASP.NET page was trivial, and running it on my local Mac was extremely easy.
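The whole scaffold-and-run loop is only a handful of commands (the project name here is just an example):

yo aspnet              # pick the 'Web Application' template and name it, e.g. MyFirstApp
cd MyFirstApp
dotnet restore         # pull down the NuGet packages
dotnet run             # build and serve, by default on http://localhost:5000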


I wasn't looking forward to deploying it to Azure. I guess my history with ASP.NET made me expect that deploying to a new site would be painful. In fact, once you set up Git in Azure and locally, the push is really easy and it worked as expected the first time!
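For anyone who hasn't done it before: once local Git deployment is enabled on the Web App and deployment credentials are set, the push looks something like this, with the remote URL taken from the app's deployment settings (the URL below is a placeholder):

git init
git add .
git commit -m "Initial ASP.NET Core site"
git remote add azure https://<deployment-user>@<your-app>.scm.azurewebsites.net/<your-app>.git
git push azure master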


So basically I could build an ASP.NET application on my MacBook, try it out, and then publish it to Azure with no Microsoft Windows or Visual Studio required in the process at all.


I'm definitely a convert to this new way of working now. If you are considering learning .NET Core then I'd really recommend ditching Visual Studio, even if you have a Windows environment. You might miss out on some graphical familiarity, but it is easy to start from the basics.


Delete all Opportunity Records in Salesforce

This seems rather simple, but it was hard to find out how to delete all Opportunity records from your Salesforce organization.

The main reason is that the easiest way is to use Data Loader to extract all the record IDs and then bulk delete them. Unfortunately, that method requires me to leave my laptop open for a few hours whilst it deletes the one million records.

As an alternative I adapted this Apex class, which implements the Database.Batchable interface. Salesforce then sets up the required number of individual batches and deletes the records in chunks (I requested 5,000 at a time).


global class example_Batch implements Database.Batchable<sObject> {

    global Database.QueryLocator start(Database.BatchableContext BC) {
        // Select the Id of every Opportunity in the org
        String query = 'SELECT Id FROM Opportunity';
        return Database.getQueryLocator(query);
    }

    global void execute(Database.BatchableContext BC, List<sObject> scope) {
        // Delete this batch; allOrNone = false so one bad record does not fail the whole chunk
        Database.delete(scope, false);
    }

    global void finish(Database.BatchableContext BC) {
    }
}
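The class does nothing on its own; you kick it off from the Developer Console (Execute Anonymous) with the batch size as the second argument. My understanding is that Salesforce caps QueryLocator-based batches at 2,000 records each, so a larger request is simply chunked:

// Start the batch delete; request large batches (Salesforce may chunk these down to 2,000)
Database.executeBatch(new example_Batch(), 5000);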

It then runs in the background freeing up my laptop.


CRM Data Migration Part 4: Connections with Informatica Cloud

This article will show how to connect source and target through the framework described earlier.

We need to define three connections: Legacy, Staging and Salesforce.

Informatica Cloud is not completely straightforward to set up in your environment, but then again not so difficult that a day of infrastructure configuration wouldn't fix it. You download a piece of software called the 'Secure Agent' that runs the ETL process. It is worth noting that you can only have one Secure Agent for each ETL, so the source and target data sources, as well as Informatica Cloud itself, all need connectivity from the server running the Agent. I also found that quite a bit of time was needed to ensure all the database connections were running smoothly.

Setting up a mapping configuration is quite straightforward. The tool will determine the source and target schemas, and allows some transformations in the middle. Here I am joining the email and phone tables with the contacts before inserting them into Staging.



This is a good example of a compromise where I am just taking the first email address. In practice you may want to consider how to deal with all those extra addresses.

(If you were doing this for real, then it would be better to create views within the source SQL Server and just export out of those. There would be less scope for an ETL configuration error.)
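As a sketch of what that could look like against the AdventureWorks schema (the view name and the 'first email and phone only' rule are my own illustrative choices, not part of the sample database):

-- Flatten Person, EmailAddress and PersonPhone into one exportable row per person
CREATE VIEW dbo.vw_ContactExport AS
SELECT  p.BusinessEntityID,
        p.FirstName,
        p.LastName,
        e.EmailAddress,
        ph.PhoneNumber
FROM    Person.Person AS p
OUTER APPLY (SELECT TOP 1 ea.EmailAddress
             FROM Person.EmailAddress AS ea
             WHERE ea.BusinessEntityID = p.BusinessEntityID
             ORDER BY ea.EmailAddressID) AS e
OUTER APPLY (SELECT TOP 1 pp.PhoneNumber
             FROM Person.PersonPhone AS pp
             WHERE pp.BusinessEntityID = p.BusinessEntityID
             ORDER BY pp.PhoneNumber) AS ph;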

You can then just keep running the mapping until you get the migration result that you are looking for. Ensure that the 'Truncate' option is turned on so that each run starts with a clean staging table.


Check that everything is looking good in Staging (using a SQL query):
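A row count and a quick sample are usually enough here; the staging table name below is just an example:

-- Quick sanity checks against the staging table (table name is illustrative)
SELECT COUNT(*) AS StagedContacts FROM dbo.Contact_Staging;
SELECT TOP 20 * FROM dbo.Contact_Staging ORDER BY LastName;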


And then define a simple ETL for loading into Salesforce. Given that Staging should already be more or less how you want things to look in Salesforce, this should be easy.


Run it, and then check out the new Contacts in Salesforce.



CRM Data Migration Part 3: Framework

In 2016 most new CRM installations are Cloud based, including Salesforce.com, MS Dynamics Online and Oracle Sales Cloud.

There are many options for tools that will move data into both the Staging area and the CRM. For the purpose of this article I will use Informatica Cloud.

Performance

Performance depends on a lot of variables such as the speed of your internet pipes, servers, types of cloud systems used, etc.

It is worth noting that a considerable data migration may place a significant load on your framework. For example, you may need to move a copy of your data to an area that can be reached through your DMZ. How long will it take to copy 10k records over to your staging environment? How long will it take to copy those corrected records into your new Cloud platform? How long can you take your CRM system offline during the migration?

On-Premise Approach

You can set up your staging environment and ETL tools on-premise. This has the advantage that you have some control over environmental variables. Remember that you may not have total control, which could prove significant. What if there are other ETL processes running overnight that use up all the local bandwidth you were counting on for your migration?

Cloud Approach

Although you lack some of the direct control that on-premise offers, you do nevertheless have a more reliable environment that is dedicated to you and segregated from other services running on the cloud platform.

To keep with the Cloud theme I will also use a Microsoft Windows Server VM with SQL Server, running on Microsoft Azure, for the Staging database. Any other cloud (or on-premise) database solution, such as one on AWS, would work just as well, as would a solution leveraging the Azure SQL Database service.


Given that you can get free trials of most Cloud software, you can actually set up a 'proof of concept' and see how it works for you quite quickly. In my example this applies to Microsoft Azure, Salesforce and Informatica Cloud.

One benefit of the Cloud based approach is that you can provision your framework just for the duration of the migration. When the migration has been completed and signed off, you can just turn it off and stop paying for it.


CRM Data Migration Part 2: Scenario

In order to illustrate how a migration will work, I have decided to migrate my (hypothetical) sports business from a custom solution based on Microsoft SQL Server to Salesforce.com.

Luckily Microsoft provide a good example of a legacy system: AdventureWorks! You can download the database, attach it to whatever SQL Server instance you have (even the free Express edition), and treat that as your legacy source.
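If you download the .bak backup, restoring it is a single statement. The paths below are only examples, and the logical file names can be checked first with RESTORE FILELISTONLY if yours differ:

-- Restore the downloaded AdventureWorks backup (file paths and logical names are illustrative)
RESTORE DATABASE AdventureWorks2014
FROM DISK = 'C:\Backups\AdventureWorks2014.bak'
WITH MOVE 'AdventureWorks2014_Data' TO 'C:\Data\AdventureWorks2014.mdf',
     MOVE 'AdventureWorks2014_Log' TO 'C:\Data\AdventureWorks2014_log.ldf',
     RECOVERY;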

The great thing about AdventureWorks is that it illustrates very well how differently two CRM systems can treat a customer. This is the data model of 'Person' in AdventureWorks, which we will have to migrate to Salesforce.com's 'Contact' object:

(Diagram: the AdventureWorks 'Person' data model and its related tables)

We can already see that a good amount of the data structure is not required. We don't have to migrate the data in 'StateProvince' and 'CountryRegion' because it already exists within Salesforce.com (although the values will still need to be matched). The 'Password' table should almost certainly not be migrated, although it leaves a tricky question about how to bring that (presumably) web site login functionality across to the new system. Many aspects are highly normalized (a person can have any number of phone numbers, email addresses, etc.).

Basically a simple mapping will not be possible. The flexibility of the relational database will make your migration task difficult. It is important to determine what data is truly important to you.

CRM Data Migration Part 1: Conceptual

Introduction

So you've decided to invest in a brand new CRM system. Given that most CRM systems are customized to some degree, moving data from any source system to the target has many issues.

Please note that these articles are only about data migration. CRM system design is a different subject altogether.

This is also not a step by step guide. It is really just to describe some (and not all) of the challenges that you may face, as well as some examples to show how a migration framework could work.

Migration frameworks cannot be purchased off the shelf or downloaded; rather, they are a combination of working out an approach that suits your business scenario and matching it with the appropriate technologies. You need to consider a good number of aspects before you even begin to think about it technically. The conceptual areas below may help with this.


1. High Level Steps

Conceptually you can have three steps:

(Diagram: the three conceptual steps of Source, Stage and Target)

Step 1: Source can be any data repository you have: an Excel spreadsheet, an existing CRM system, a data warehouse, etc. A migration can also require multiple sources.

Step 2: Stage is where you consolidate, massage, enhance and prepare your data sets for loading.

Step 3: Target is where the data ends up (the new CRM system).

It is worth noting that Step 2 can be optional. If your source data is of acceptable quality, your transformation requirements are simple and your ETL tool has enough functionality, then you could go straight from Source to Target.

2. Master Data

Does your source CRM contain all of your master data? What if it needs to be combined with other data such as a separate system that manages products? What if other departments in your company want to maintain their own master lists of customer information?


3. Multiple Source Systems

It is common to merge multiple systems together into the new CRM. For example, your company has acquired another, and you want to use this opportunity to consolidate your customer systems. You will need to think about merging rules, such as what to do when the same customer exists in both systems: which data takes precedence? Do you accept that the 'losing' data source will not be used?


4. Change Deltas / Cutover Plan

Data may not always migrate cleanly into a new CRM system. For example, an Opportunity may be in an open state and waiting for further workflow actions. Is it a good idea to move it now or wait for it to close? Moving it now is the quickest way forward, but may lose your business valuable opportunities if it is not migrated correctly. Waiting for it to close means maintaining your old CRM system and then migrating the delta, which may itself be complicated to work out (e.g. a Contact's details could be updated in both the old and new systems whilst the related Opportunity remains open; which has primacy?).


5. Difference in Data Structures

Many consultancies will direct you not to consider the design of your old CRM system when building the new one, but rather to 'focus on the business outcome' desired for the upgrade. This is all fine in theory, but legacy data structures were created for a reason (even the bad ones), and you will probably find that you either have to compromise on the amount of data each structure brings across, or invest heavily in ETL techniques to achieve full data coverage.

6. Data Completeness and Business Buy In

Realistically you are not going to migrate 100% of the data to be found in your legacy CRM system. Do try and do so would incur a considerable cost, and probably impact the effectiveness of your new CRM by filling it with low value data. It is better to identify the key data sets early and get agreement with what you are leaving out. For example, does the new CRM require customer records that have been disabled? By reducing the scope of the data to be migrated you are increasing the chances of a successful migration.