Tuesday, 29 September 2020

Using a Key Vault reference in an ARM template parameters file, another scenario

This follows on from the last post, where an example was given of how to use Key Vault in an ARM template parameters file.

That's a great approach for using Key Vault when deploying a resource with DevOps. However, that's not the only challenge you have; another important question to ask is: what about a service URL that you might be using in a Logic App, for example?

The first thing that probably comes to mind is: OK, let's create a parameter like [Service URL], set its value to the URL needed for the purpose, and do that in every parameters file for each environment.

Now, assume a disaster-recovery type of issue, where that [Service URL] is no longer available and instead you are provided a new one to configure in your Logic App. (Assume the next steps happen only after the new URL is provided.)

Yes, you go back to your source control, change the relevant value in the parameters file, check in the code, most likely promote the change, and finally, with continuous integration, it gets deployed to the specific environment. Now, I will ask you: how much time did that change take, from the moment it reached you until it was deployed? If your team has coding and deployment well oiled, probably an hour, but as we know, not all companies have that so well implemented, which means it can take quite some time.

Depending on how the outage affects the customer, most likely someone will be asked why such a small change, pointing to a new URL, took so long.

This is where Key Vault can play an important role. Why not use Key Vault for this purpose? Probably another interesting use for it, don't you think? Instead of hard-coding the URL in every environment's parameters file, just create a secret in the Key Vault for that purpose and reference it.

For this specific scenario, it works really well. Assuming the same disaster scenario, the only thing needed would be to ask someone to create a new version of the secret, set the value, and that's it. How much time needed? Probably 2 minutes.
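
As a sketch (the parameter name and secret name here are assumptions, following the pattern from the previous post), the parameters file entry for the service URL would be just another Key Vault reference:

```json
"ServiceUrl": {
  "reference": {
    "keyVault": {
      "id": "/subscriptions/[Subscription Id]/resourceGroups/[Resource Group]/providers/Microsoft.KeyVault/vaults/[Key Vault Name]"
    },
    "secretName": "ServiceUrl"
  }
}
```

Inside the Logic App template, the URL is then consumed like any other parameter, so nothing environment-specific is hard coded in source control.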

Any comments/questions are welcome.

Hope it helps.

Thursday, 24 September 2020

Azure Key Vault - Using it in ARM template parameter files

Let's start from the beginning, Microsoft says:

"Azure Key Vault is a tool for securely storing and accessing secrets. A secret is anything that you want to tightly control access to, such as API keys, passwords, or certificates. A vault is a logical group of secrets."

A colleague of mine told me about using a Key Vault reference in an ARM template parameters file; the idea would be to access data like the password of a user for a SQL Server connection.

The idea was brilliant: first, no passwords referenced in any file; second, when deploying to a new environment, the only things we need to change are the Resource Group name and the artifact name (Key Vault name) in the resource ID.

That part is easy if you work with some standard naming convention for Azure, as mentioned in my previous post. Link: https://dynamicsmonster.wordpress.com/2020/09/22/configuring-arm-templates-deploying-to-different-environments/

As an example, see the JSON below; this is a parameter taken from a parameters file.

"sql_1_password": {
  "reference": {
    "keyVault": {
      "id": "/subscriptions/[Subscription Id]/resourceGroups/[Resource Group]/providers/Microsoft.KeyVault/vaults/[Key Vault Name]"
    },
    "secretName": "[Sql Password]"
  }
}

So, you would have one parameters file per environment; the only things you would need to change from the example above are [Resource Group] and [Key Vault Name]. Keeping the same secret name in the other Key Vaults, in the other resource groups, will work with no problems at all.
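
For instance, a hypothetical UAT parameters file would carry the exact same parameter, with only the resource ID changed (the resource group and vault names below are made-up examples following an environment naming convention):

```json
"sql_1_password": {
  "reference": {
    "keyVault": {
      "id": "/subscriptions/[Subscription Id]/resourceGroups/MyResourceGroup-Uat/providers/Microsoft.KeyVault/vaults/MyKeyVault-Uat"
    },
    "secretName": "[Sql Password]"
  }
}
```

Because the secret name is the same in every vault, no other change is needed when promoting between environments.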

In the next post I will describe another scenario where using Key Vault in ARM parameters files is useful.

For anything, please comment here or contact us.

Hope it helps.

Follow us.

Thank you.

Tuesday, 22 September 2020

Configuring Arm Templates (Deploying to different environments)

When developing around Azure, more specifically with Azure Logic Apps, Azure Functions, or Azure API Management, and with little or no experience at all, a good starting point is using the browser.

This allows us to start doing our tasks, practising, testing and then deploying to a different environment.

I have read many articles with different views on how to achieve this. Usually, in any IT company, when we are developing we all need to use a source control tool; in this case it only applies to Microsoft DevOps, and for some artifacts I think we need to be using a Premium account.

I started the same way: I went to the Azure Portal and created a few Logic Apps to become proficient with the technology. Not to get into a long story here, I will cut things short now; if you need anything, comment here or contact us.

Using Visual Studio and installing the Azure extensions allowed me to start using Visual Studio in conjunction with my TFS Online (Git in my case).

The reason for this post is more to have in mind a way to automate (not the focus of this post, but it helps in terms of configuration) the deployment of the resources to Azure.

Assumptions here:

  • Use Visual Studio
  • A resource group in Azure means an "Environment" 
  • Full automation of deployment through Azure DevOps

High level steps below:

  1. Created a Visual Studio 'Azure Resource Group' project and used a Logic App template
    1. Created 2 files
      1. LogicApp.json
      2. LogicApp.parameters.json
    2. I changed the naming convention of the parameters file, creating one file per environment
      1. LogicApp.parameters.dev.json
      2. LogicApp.parameters.uat.json
    3. Opened LogicApp.json in design mode
      1. Configured with recurrence trigger
      2. Added an 'Initialize variable' action and set the value to 'Hello World'
    4. Opened LogicApp.json as a JSON file and configured the following:
    5. Inside the parameters node:
      1. "Environment": { "type": "string", "defaultValue": "Dev" }
      2. "LogicAppName": { "type": "string", "defaultValue": "[concat('MyLogicApp-', parameters('Environment'))]" }
    6. In the LogicApp.parameters.dev.json inside the parameters node:
      1. "Environment": { "type": "string", "value": "Dev" }
    7. In the LogicApp.parameters.uat.json inside the parameters node:
      1. "Environment": { "type": "string", "value": "Uat" }
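
Putting the steps together, a complete LogicApp.parameters.dev.json would look something like this (a minimal sketch; strictly, a parameters file only needs the value for each parameter):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "Environment": {
      "value": "Dev"
    }
  }
}
```

The UAT file is identical except for the value "Uat", and each release picks the file matching its target environment.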

This is a personal option for deploying to different environments, but as I have seen, there are many ways of doing the same. Hope it is clear.

The result in terms of naming convention would be something like:

  1. Azure Resource Groups
    1. Dev
      1. Logic Apps
        1. MyLogicApp-Dev
    2. Uat
      1. Logic Apps
        1. MyLogicApp-Uat

Why this option? Because when you navigate, for instance, to the Logic Apps area, you see the apps from every resource group listed together.

Without this naming convention the result would be something like:

  Name         Resource Group
  MyLogicApp   Dev
  MyLogicApp   Uat

With this naming convention the result would be something like:

  Name             Resource Group
  MyLogicApp-Dev   Dev
  MyLogicApp-Uat   Uat

If you have one or two logic apps, it's probably fine without appending the environment to the name, but honestly, working on larger Azure projects, I found this actually beneficial.

For any additional information or help, please don't hesitate to contact us.

Monday, 14 September 2020

CRM Transaction Mode operations (Dynamics 365 V9)

I hope I'm not late with this post. Who has never had to develop some code where the requirement was "in transaction"?

I don't think this will apply to all possible scenarios but it is a start.

Let's assume the following scenario: in CRM 365, someone asked you to apply a change to an entity based on a really easy filter. For example, if a customer address is in city "A", set a custom field to value "B"; however, if you can't update one of the records for some reason (for example, an exception is thrown), you need to roll back everything.

Here, and we all know there are many ways to achieve this, I'm giving one.

Hypothetically, let's assume we opt for a plugin at first sight; however, after some testing we start to get timeouts, because we need to update too many records and the 2-minute limit just passes.

Another option would be a console application; this is a personal option, not 100% right nor 100% wrong, just one of many ways.

Not doing it with a plugin brings me to a problem that needs to be addressed based on the requirements: the "transaction" requirement. If one of the records fails, I need to undo the others, so, more code to deal with that. However, for this specific case you can avoid all of that with the CRM SDK alone; no need to reinvent the wheel.

I found in the SDK the message ExecuteTransactionRequest, which does exactly the transaction I needed: it allows me to perform multiple requests as one single database transaction. Using this message, two or more organization service requests execute in a single database transaction, and if one of the requests fails, the whole transaction is rolled back.
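
As a minimal sketch of the scenario above (the custom field name is hypothetical, and `service` is assumed to be an already-authenticated IOrganizationService):

```csharp
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Query;

// Retrieve the accounts whose address is in city "A".
var query = new QueryExpression("account") { ColumnSet = new ColumnSet("accountid") };
query.Criteria.AddCondition("address1_city", ConditionOperator.Equal, "A");
EntityCollection accounts = service.RetrieveMultiple(query);

// Wrap one UpdateRequest per record in a single transaction.
var transaction = new ExecuteTransactionRequest
{
    Requests = new OrganizationRequestCollection()
};

foreach (Entity account in accounts.Entities)
{
    var update = new Entity("account", account.Id);
    update["new_customfield"] = "B";   // hypothetical custom field
    transaction.Requests.Add(new UpdateRequest { Target = update });
}

// All updates commit together; if any request throws,
// the server rolls back the whole transaction.
service.Execute(transaction);
```

Note that the service caps the number of requests allowed in a single ExecuteTransactionRequest, so very large updates may still need to be split into batches.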

For more information visit Microsoft page below:

https://docs.microsoft.com/en-us/powerapps/developer/common-data-service/org-service/use-executetransaction

Any questions, please reply through this post or contact us.

Hope it helps.

CRM 365 Cloud - Disassociate 2 records using typescript
