
This is a bit of a follow-up to my achievement last June, when I passed the 70-486 MVC exam. I thought that would earn me the MCSD, because I had completed two (older) UWP-related exams before, but there was a bug in the certification planner. After a long wait to hear back from Microsoft, it turned out I had to take one more exam to get the MCSD. I chose 70-532 because I also wanted to go for the MCSA: Cloud Platform, so I ordered the second edition of the exam guide and studied every day during my holiday.

I found out (later) that 70-532 and 70-535, and some more, will be retired by the end of the year… and replaced:

70-532 → AZ-200 + AZ-201
70-535 → AZ-300 + AZ-301

But if you pass 70-532 or 70-535 before the 31st of December, you can take a cheaper transition exam to move to the equivalent new “AZ” certification. Since I passed 70-532, I decided to give AZ-202 a go, and I hope I will pass. I have to pass before June 2019 to get the new Azure Developer certification. After that, I have added AZ-300 and AZ-301 to my to-do list to get the MCASA (Microsoft Certified Azure Solutions Architect).

[image: Azure Developer badge]

But first, let me enjoy my MCSD.

Good luck studying!


I have blogged before about this Excel NuGet package that doesn’t require interop or an Excel installation on the server, and about my journey to start this Azure Function. The package can do this because the modern Excel format is just XML under the covers, in a zipped file stored with an .xlsx extension. Since you do not have hard disk access in a serverless environment like Azure Functions, you need to generate the Excel file in memory (or store it in blob storage). I chose in-memory generation to leave no footprint and take up no space in the cloud.

I wanted to use an Azure Function so it runs in the cloud, without depending on a server that needs updates, reboots, etc. Since the database is already in Azure (Azure SQL), this seemed a perfect fit.

I got the option to go for Azure Functions v1 or v2, which is in preview. This was a nice opportunity to use v2 and .NET Core/Standard. https://docs.microsoft.com/en-us/azure/azure-functions/functions-versions

V2 also has support for the Office 365 Graph, but that was out of scope for me.

I took a timer-based project because I wanted it to send an overview of invoices on a monthly basis. The timer trigger is configured through an attribute (data annotation) that uses CRON-style scheduling. There is, however, a small difference: instead of 5 fields, the Azure Functions expression has 6, which also lets you schedule the seconds.

https://en.wikipedia.org/wiki/Cron#CRON_expression

So not just minute, hour, day of month, month, and day of week, but second, minute, hour, day of month, month, and day of week. Of course, the order is really important. https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-timer#cron-expressions
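To make the 6-field format concrete, here are a few expressions with my own annotations (the last one is the schedule I use in the function below):

// NCRONTAB format: {second} {minute} {hour} {day} {month} {day-of-week}
// "0 */5 * * * *"  -> every five minutes, on the minute
// "0 30 9 * * 1-5" -> at 09:30:00 on weekdays (Monday to Friday)
// "10 0 0 1 * *"   -> at 00:00:10 on the 1st of every month
[FunctionName("EveryFiveMinutes")]
public static void RunEveryFiveMinutes([TimerTrigger("0 */5 * * * *")] TimerInfo timer, TraceWriter log)
{
    log.Info($"Tick at: {DateTime.Now}");
}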

I used this NuGet package for the Excel export: https://www.nuget.org/packages/EPPlus/

It has .NET Core support and works perfectly.

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Net;
using System.Net.Mail;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using OfficeOpenXml;

namespace MonthlyMailInvoices
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static void Run([TimerTrigger("10 0 0 1 * *")]TimerInfo myTimer, TraceWriter log) // runs at 00:00:10 on the 1st of every month
        {
            log.Info($"C# Timer trigger function executed at: {DateTime.Now}");

            // Parameterized query: all invoices from one month ago up to yesterday
            var com = new SqlCommand("SELECT * FROM [dbo].[INVOICES] where invoicedate > @startdt and invoicedate < @enddt");

            com.Parameters.AddWithValue("@startdt", DateTime.Now.AddMonths(-1));
            com.Parameters.AddWithValue("@enddt", DateTime.Now.AddDays(-1));

            var dt = new DataTable();

            // The connection string should really come from the Function App settings
            using (var con = new SqlConnection("connectionstring goes here"))
            {
                con.Open();
                com.Connection = con;
                var da = new SqlDataAdapter(com);
                da.Fill(dt);

                log.Info($"start: {DateTime.Now.AddMonths(-1)} and end { DateTime.Now.AddDays(-1) } gave {dt.Rows.Count}");
            }

            using (var wb = new ExcelPackage()) // EPPlus workbook, built entirely in memory
            {
                wb.Workbook.Worksheets.Add("Our company");
                var ws = wb.Workbook.Worksheets[0];

                FillData(ws, dt, "Our company B.V.");

                var msg = new MailMessage();
                msg.To.Add("mymail@companydomain.com");
                msg.Subject = "Monthly invoices";
                msg.From = new MailAddress("the@cloud.com");
                msg.Body = $"Invoices from {DateTime.Now.AddMonths(-1)} to { DateTime.Now.AddDays(-1) } in the Excel attachment.";
                // EPPlus hands back the whole workbook as a byte array: no disk access needed
                var ms = new MemoryStream(wb.GetAsByteArray());
                ms.Position = 0;

                // the legacy Excel MIME type below did not match the .xlsx (OpenXML) format
                //msg.Attachments.Add(new Attachment(ms, "Invoices.xlsx", "application/vnd.ms-excel"));
                msg.Attachments.Add(new Attachment(ms, "Invoices.xlsx", "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"));
                var smtp = new SmtpClient
                {
                    Host = "smtp.gmail.com",
                    Port = 587,
                    EnableSsl = true,
                    Credentials = new NetworkCredential("my@gmailaccount.com", "incorrectpassword")
                };
                smtp.Send(msg);
            }
        }

        // Writes a title, a header row, and one row per invoice to the worksheet
        private static void FillData(ExcelWorksheet ws, DataTable dt, string title)
        {
            ws.Cells[1, 1].Value = title;

            ws.Cells[2, 1].Value = "Invoice nr";
            ws.Cells[2, 2].Value = "Invoice date";
            ws.Cells[2, 3].Value = "Amount inc. VAT";
            ws.Cells[2, 4].Value = "VAT";
            ws.Cells[2, 5].Value = "Amount exc. VAT";

            int row = 3;

            foreach (DataRow dr in dt.Rows)
            {
                ws.Cells[row, 1].Value = dr[0].ToString();
                ws.Cells[row, 2].Value = dr[1].ToString();
                ws.Cells[row, 3].Value = dr[2].ToString();
                ws.Cells[row, 4].Value = dr[3].ToString();
                ws.Cells[row++, 5].Value = dr[4].ToString();
            }
        }
    }
}
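A side note on the hard-coded connection string: during local development it would normally live in the Values section of local.settings.json and be read with Environment.GetEnvironmentVariable (in Azure, the same key becomes an App Setting). A minimal sketch, with a key name of my own:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "SqlConnectionString": "Server=(localdb)\\MSSQLLocalDB;Database=Invoices;Integrated Security=true"
  }
}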

I could not test it locally because I had some issues with logins for my LocalDB, so I hit Publish to deploy it to Azure. Republishing failed, however. I found the answer (as always) on Stack Overflow: add the app setting “MSDEPLOY_RENAME_LOCKED_FILES” and set it to 1 (true), so Web Deploy can rename files that are locked by the running function app.

[screenshot: the Application settings blade with MSDEPLOY_RENAME_LOCKED_FILES set to 1]

Tony gave the correct solution.

I also had issues with generating the Excel file in memory, getting the MemoryStream to a byte array, and providing the right MIME type for the attachment. Found that on SO too.

The last bit was automating deployment. I had my code in VSTS (Git) and configured a CI/CD pipeline (build + release), but I had issues granting my personal account global admin rights in our company account, which it needed to access the Azure resources to deploy to. It was just a matter of time before the Azure rights/role changes became active. It’s a nice, small serverless function, and you can (should) put it in source control and use CI/CD to push the latest builds to a test or production environment in the cloud.


Good luck!


I have written PowerShell scripts in runbooks in Azure Automation before. It’s not a new concept; it dates all the way back to 2014:

https://azure.microsoft.com/en-us/blog/azure-automation-your-sql-agent-in-the-cloud/

I started using it because there is no SQL Agent for Azure SQL databases, and I relied on the SQL Agent to run Ola Hallengren’s database maintenance scripts. I have been using Azure Automation runbooks for a long time now to build reports from Azure SQL and send them to people over SMTP.

The problem is that I string-concatenate HTML in the PowerShell script and just put the result in an HTML-enabled email message. That is still a good option… until a coworker requests an Excel file attached to the mail…
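For context, the pattern looks roughly like this. A minimal sketch: the server, database, query, and credential names are placeholders of mine, not the real runbook:

# Query Azure SQL, string-concatenate an HTML report, and mail it
$cred = Get-AutomationPSCredential -Name "SmtpCredential"
$rows = Invoke-Sqlcmd -ServerInstance "myserver.database.windows.net" -Database "mydb" `
    -Username "reportuser" -Password "secret" -Query "SELECT InvoiceNr, Amount FROM dbo.INVOICES"

$html = "<h2>Invoices</h2><table>"
foreach ($row in $rows) {
    $html += "<tr><td>$($row.InvoiceNr)</td><td>$($row.Amount)</td></tr>"
}
$html += "</table>"

# Send it as an HTML-enabled mail
Send-MailMessage -To "mymail@companydomain.com" -From "the@cloud.com" -Subject "Invoices" `
    -Body $html -BodyAsHtml -SmtpServer "smtp.gmail.com" -Port 587 -UseSsl -Credential $cred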

Excel in an Azure Runbook (PowerShell)

I built the PowerShell locally first.


When I was using the Azure Automation ISE add-on for the Windows PowerShell ISE, it hit me: the cloud probably has no Excel COM/interop…

So I found this module on GitHub to work with Excel from PowerShell without Excel installed. It uses EPPlus, which I mentioned in my post from 6 years ago.
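If that module is Doug Finke’s ImportExcel (my assumption; it is the well-known EPPlus-based one on GitHub), exporting query results becomes a one-liner:

# Assuming the ImportExcel module: pipe any objects straight into an .xlsx
$rows | Export-Excel -Path .\Invoices.xlsx -WorksheetName "Our company" -AutoSize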

But I realised that I could also just use Azure Functions, code in C#, and have a timer trigger. This lets me write my beloved C# rather than scripting PowerShell, and I can use the EPPlus NuGet package directly.

Azure Functions v2 is now in preview and has .NET Standard support (which is great!).

The Visual Studio new-project dialog can be unclear the first time you see it, if you have no clue that the schedule uses CRON notation. Maybe they will change it, but now you know.


Good luck!


Managed disks have a lot of benefits and are the preferred way to create a new VM. However, because of my tight budget, I wanted to move to unmanaged disks. I did not expect the costs to continue while the VM was stopped, but they did, because of the managed disk(s).

I found this SO answer by Jason Ye (MSFT):

# Get a temporary read-only SAS on the managed disk, then copy it to blob storage as a .vhd
$sas = Grant-AzureRmDiskAccess -ResourceGroupName "[ResourceGroupName]" -DiskName "[ManagedDiskName]" -DurationInSecond 3600 -Access Read
$destContext = New-AzureStorageContext -StorageAccountName "[StorageAccountName]" -StorageAccountKey "[StorageAccountAccessKey]"
$blobcopy = Start-AzureStorageBlobCopy -AbsoluteUri $sas.AccessSAS -DestContainer "[ContainerName]" -DestContext $destContext -DestBlob "[NameOfVhdFileToBeCreated].vhd"

I had to create the containers in the storage account myself, but with this I copied both the OS disk and the data disk to blob storage as .vhd files.

I used the PowerShell script below to create the VHDs in blob storage, and a template to create the VM from them. The data disk can be added later in the web GUI.

# Log in and select the right subscription
Login-AzureRMAccount
Get-AzureRmSubscription
Set-AzureRmContext -SubscriptionName "my subscription name here"

# Grant read access to the managed disk (45000 s = 12.5 hours) and start the async copy
$sas = Grant-AzureRmDiskAccess -ResourceGroupName "resourcegroup" -DiskName "manageddiskname" -DurationInSecond 45000 -Access Read
$destContext = New-AzureStorageContext -StorageAccountName "storageaccount" -StorageAccountKey "myprivatekey"
$blobcopy = Start-AzureStorageBlobCopy -AbsoluteUri $sas.AccessSAS -DestContainer "vhd-containers" -DestContext $destContext -DestBlob "givetheunmanageddiskaname.vhd"

# Block until the copy has completed
Get-AzureStorageBlobCopyState -Container "vhd-containers" -Blob "givetheunmanageddiskaname.vhd" -Context $destContext -WaitForComplete

My mistake was using the value 3600 for ‘DurationInSecond’, which is just one hour (60 seconds × 60 minutes). The 512 GB data disk could not be copied to blob storage within an hour (or two), so the SAS expired mid-copy. I only found out that an hour was insufficient once I discovered ‘Get-AzureStorageBlobCopyState’.
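If you would rather watch the progress than block on -WaitForComplete, you can also poll the copy state (a small sketch using the same names as in the script above):

# Poll the asynchronous blob copy instead of blocking
do {
    $state = Get-AzureStorageBlobCopyState -Container "vhd-containers" `
        -Blob "givetheunmanageddiskaname.vhd" -Context $destContext
    Write-Output ("{0:N0} of {1:N0} bytes copied" -f $state.BytesCopied, $state.TotalBytes)
    Start-Sleep -Seconds 60
} while ($state.Status -eq "Pending")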

Because I already had a VNet from my VM with managed disks, I used this template to create a new VM with the OS disk from blob storage: https://github.com/Azure/azure-quickstart-templates/tree/master/201-vm-specialized-vhd-existing-vnet

If you do not have a VNet yet, you should use https://github.com/Azure/azure-quickstart-templates/tree/master/101-vm-from-user-image instead. The “Deploy to Azure” button is a useful tool!

[image: the “Deploy to Azure” button]

Once you have the new VM with an unmanaged disk up and running, stop it to add the data disk. Once you have done that and have a Remote Desktop connection, go to Disk Management and bring the data disk online again. It took me some time to get my head around the ASM and ARM differences in the PowerShell tooling, also because there is now the Azure CLI and the cross-platform PowerShell 6.0.

The cloud is moving fast, so hop on. Don’t miss out!


Good luck!


The documentation for Azure SQL DB performance tuning on docs.microsoft.com (instead of the old MSDN location) is accurate and wonderfully detailed:

https://docs.microsoft.com/en-us/azure/sql-database/sql-database-query-performance

I was looking at an Azure SQL DB and noticed some spikes. Here is my graph:

[screenshot: Azure portal resource utilization graph showing the spikes]

I could drill in via the Azure portal to see what was causing the spikes, and I recognized the SQL statement, so I knew which product to update the SQL for. However, that turned out not even to be necessary.

Here are the client statistics after copy-pasting the query that caused the spikes into SQL Server Management Studio. Check out the “Total execution time”:

[screenshot: SSMS client statistics for the slow query, with a high “Total execution time”]

My next step was to take a look at the execution plan of the query. There is an exclamation mark on the Sort operator:

[screenshot: execution plan with an exclamation-mark warning on the Sort operator]

When you hover over it, you get this tooltip:

[screenshot: operator tooltip warning that the sort spilled to tempdb]

So the sort spilled to tempdb. I still had no clue how to fix this, so I reached out to Stack Exchange, and a user named T.H. gave me the solution: create two rather simple indexes:

CREATE NONCLUSTERED INDEX TEST ON STOCKDEBUGTRIGGERED (ChangeDate)
CREATE NONCLUSTERED INDEX TEST ON STOCKDEBUG (ProductID, StockOld, StockNew)
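A quick way to verify that the optimizer actually picks the new indexes up after rerunning the query (standard DMVs; only the index name TEST comes from the statements above):

-- user_seeks/user_scans should increase once the query uses the new indexes
SELECT OBJECT_NAME(s.[object_id]) AS table_name,
       i.[name] AS index_name,
       s.user_seeks, s.user_scans
FROM sys.dm_db_index_usage_stats AS s
JOIN sys.indexes AS i
  ON i.[object_id] = s.[object_id] AND i.index_id = s.index_id
WHERE i.[name] = 'TEST';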

Here is the query plan after adding the two new indexes:

[screenshot: the execution plan after adding the two indexes]

No more yellow exclamation mark, and fewer steps! The client statistics also show that this is a lot faster:

[screenshot: client statistics after adding the indexes]

From an average total execution time of 4250 down to 50 (milliseconds)!

This is also backed by the dramatic drop in resource usage in the Azure portal:

[screenshot: the resource utilization graph after the fix]

So the lesson is: do not trust the Azure DB performance advisor 100% :-)

T.H. commented on Stack Exchange:

“The automatic index advice is extremely limited, and often misleading, so can only be considered as a starting point.”

Hope this info helps someone troubleshooting Azure SQL DB performance!

Good luck!
