infuerno.github.io

Microsoft Azure: AZ-204: Developing Solutions for Microsoft Azure

Learning Path: Create serverless applications

Azure services for automating business processes

Business processes modelled in software are known as workflows. Azure includes four different technologies for implementing workflows that integrate different systems:

  1. Logic Apps - design-first for IT pros and developers - over 200 connectors
  2. Microsoft Power Automate - built on top of Logic Apps - design-first for non-technical staff
  3. WebJobs - continuous or triggered (scheduled or manual) - code written in script or .NET using the WebJobs SDK - the only technology that permits developers to control retry policies - useful if deploying an Azure App Service anyway
  4. Azure functions - cost effective if using the consumption plan
|   | Azure WebJobs | Azure Functions |
| --- | --- | --- |
| Supported languages | C# if you are using the WebJobs SDK | C#, Java, JavaScript, PowerShell, etc. |
| Automatic scaling | No | Yes |
| Development and testing in a browser | No | Yes |
| Pay-per-use pricing | No | Yes |
| Integration with Logic Apps | No | Yes |
| Package managers | NuGet if you are using the WebJobs SDK | NuGet and NPM |
| Can be part of an App Service application | Yes | No |
| Provides close control of JobHost | Yes | No |

Azure Functions

Serverless

Serverless doesn’t mean there are no servers - it just means the developer doesn’t have to worry about servers. Instead, a cloud provider, such as Azure, manages the servers.

Bindings

The power of Azure Functions comes mainly from the integrations that it offers with a range of data sources and services, which are defined with bindings. With bindings, developers interact with other data sources and services without worrying about how the data flows to and from their function.

Possible bindings: https://docs.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings?tabs=csharp#supported-bindings

Binding configuration

Bindings require at least three properties:

  1. Name - Defines the function parameter through which data is accessed. For example, in a queue input binding, this is the name of the function parameter that receives the queue message content.
  2. Type - Identifies the type of binding, i.e., the type of data or service we want to interact with.
  3. Direction - Indicates the direction data is flowing, i.e., is it an input or output binding?

Additionally, most binding types also need a fourth property:

  4. Connection - Provides the name of an app setting key that contains the connection string. Bindings use connection strings stored in app settings to keep secrets out of the function code. This makes your code more configurable and secure.

Depending on the binding type, other properties may be required, e.g. the path property in a blob storage trigger.

Bindings are defined in the function.json configuration file.
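For illustration, a minimal function.json for a queue-triggered function with a blob output binding might look like this (the queue name, container name, and AzureWebJobsStorage setting are placeholder assumptions):

```json
{
  "bindings": [
    {
      "name": "myQueueItem",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "myqueue-items",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "outputBlob",
      "type": "blob",
      "direction": "out",
      "path": "outcontainer/{rand-guid}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```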

Binding expressions

A binding expression is specialized text in function.json, function parameters, or code that is evaluated when the function is invoked to yield a value.

Most expressions are identified by wrapping them in curly braces, except app setting binding expressions, which are wrapped in percent signs, e.g. %Environment%/newblob.txt
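As a sketch, a blob output path can combine both kinds of expression - a trigger expression in curly braces ({queueTrigger} resolves to the queue message text) and an app setting in percent signs (Environment is an assumed app setting):

```json
{
  "name": "outputBlob",
  "type": "blob",
  "direction": "out",
  "path": "%Environment%/{queueTrigger}.txt",
  "connection": "AzureWebJobsStorage"
}
```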

Triggers

Durable Functions

Durable Functions enable performing long-lasting, stateful operations. Azure provides the infrastructure for maintaining state information, so they can be used to orchestrate a long-running workflow. This is useful when there is a manual approval process in the middle of custom business logic.

Function Types

Application Patterns

Also see: https://docs.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-overview?tabs=csharp

Install an NPM package to a function

Function > App Service Editor > Console > wwwroot > touch package.json > open package.json > create a barebones package.json > save > console > npm install durable-functions > Overview > Restart function
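The "barebones" package.json mentioned above needs little more than a name and version, for example (the name is a placeholder):

```json
{
  "name": "example-function-app",
  "version": "1.0.0"
}
```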

Timers

Use durable timers in orchestrator functions instead of the setTimeout() and setInterval() functions.

Create a durable timer by calling the createTimer() method of the DurableOrchestrationContext. This method returns a task that resumes on a specified date and time.

const df = require("durable-functions");
const moment = require("moment");

module.exports = df.orchestrator(function*(context) {
    for (let i = 0; i < 10; i++) {
        const deadline = moment.utc(context.df.currentUtcDateTime).add(i, 'd');
        yield context.df.createTimer(deadline.toDate());
        yield context.df.callActivity("SendReminder");
    }
});

Always use currentUtcDateTime to obtain the current date and time, instead of Date.now or Date.UTC.

Azure Functions Core Tools

Tools to develop functions locally and push to Azure.

Other Functions development tools, such as the Functions-related features in Visual Studio and the Azure Functions extension for Visual Studio Code, are built on top of the Core Tools.

Structure of an Azure Function

public static class Function1
{
  [FunctionName("Function1")]
  public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
    ILogger log)
  {
    log.LogInformation("C# HTTP trigger function processed a request.");

    string name = req.Query["name"];

    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);
    name = name ?? data?.name;

    return name != null
        ? (ActionResult)new OkObjectResult($"Hello, {name}")
        : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
  }
}

Function Apps

A Function App is the execution context under which one or more Azure Functions run. Functions are grouped into Function Apps, which define shared configuration such as the runtime version and hosting plan.

SignalR

SignalR is an abstraction for a series of technologies that allows apps to enjoy two-way communication between the client and server. It handles connection management automatically, and allows broadcasting messages to all connected clients simultaneously, as well as sending messages to specific clients. The connection between the client and server is persistent, unlike a classic HTTP connection, which is re-established for each communication.

A key benefit of the abstraction provided by SignalR is the way it supports “transport” fallbacks. A transport is a method of communicating between the client and server. SignalR connections begin with a standard HTTP request. As the server evaluates the connection, the most appropriate communication method (transport) is selected. Transports are chosen depending on the APIs available on the client.

For clients that support HTML5, the WebSockets API transport is used by default. If the client doesn’t support WebSockets, then SignalR falls back to Server-Sent Events (also known as EventSource). For older clients, Ajax long polling or Forever Frame (IE only) is used to mimic a two-way connection.

Update polling app to use SignalR

The web client uses the SignalR client SDK to establish a connection to the server. By convention, the client retrieves the connection information via a function named negotiate.
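As an illustrative sketch, the negotiate function is typically an HTTP-triggered function with a signalRConnectionInfo input binding that simply returns the connection information to the client (the hub name chat is an assumption):

```json
{
  "bindings": [
    {
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["post"]
    },
    {
      "type": "signalRConnectionInfo",
      "direction": "in",
      "name": "connectionInfo",
      "hubName": "chat"
    }
  ]
}
```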

Azure API Management

Functions can be added to Azure API Management to be presented to users as part of a single API.

Learning Path: Connect your services together

Azure provides several technologies to communicate more reliably, including:

Events or Messages

Messaging solutions

Azure Queue Storage

Queue storage uses Azure Storage to store large numbers of messages that can be securely accessed using a simple REST-based interface. Queues can contain millions of messages, limited only by the capacity of the storage account that owns them.

Azure Service Bus Queues

Service Bus is a message broker system intended for enterprise applications, which often use multiple communication protocols and different data contracts, and have higher security requirements.

Azure Service Bus Topics

Topics are like queues, but can have multiple subscribers. Internally, topics use queues. When you post to a topic, the message is copied and dropped into the queue for each subscription.

Azure Service Bus Relays

Performs synchronous two-way communication between applications, but DOESN’T have a temporary storage mechanism. Use relays to cross problematic network boundaries.

Delivery guarantees

Transactional support

If multiple messages must either all succeed OR all fail, ensure transactions are supported, e.g. an order is sent AND the credit card is billed.

Which to choose

Use Storage queues when you want a simple and easy-to-code queue system. For more advanced needs, use Service Bus queues. If you have multiple destinations for a single message, but need queue-like behavior, use topics.

Service Bus Queues support:

  1. At-most-once delivery guarantees
  2. FIFO (first-in, first-out) guarantees
  3. Grouping messages into transactions
  4. Receiving messages without polling the queue
  5. Messages larger than 64 KB (up to 256 KB)

Queue storage supports:

  1. An audit trail of all messages
  2. Queues larger than 80 GB
  3. Tracking progress for processing a message inside the queue

Though distributed components can communicate directly, enhance reliability by using Azure Service Bus or Azure Event Grid.

Implementing Service Bus Queues

Sending messages to queues
var queueClient = new QueueClient(TextAppConnectionString, "PrivateMessageQueue");
string message = "Sure would like a large pepperoni!";
var encodedMessage = new Message(Encoding.UTF8.GetBytes(message));
await queueClient.SendAsync(encodedMessage);
Receiving messages from queues
var queueClient = new QueueClient(TextAppConnectionString, "PrivateMessageQueue");
queueClient.RegisterMessageHandler(MessageHandler, messageHandlerOptions);

// within the MessageHandler method, process message and then call
await queueClient.CompleteAsync(message.SystemProperties.LockToken);

Implementing Service Bus Topics

Sending messages to topics
var topicClient = new TopicClient(TextAppConnectionString, "GroupMessageTopic");
string message = "Cancel! I can't believe you use canned mushrooms!";
var encodedMessage = new Message(Encoding.UTF8.GetBytes(message));
await topicClient.SendAsync(encodedMessage);
Subscribing to topics
subscriptionClient = new SubscriptionClient(ServiceBusConnectionString, "GroupMessageTopic", "NorthAmerica");
subscriptionClient.RegisterMessageHandler(MessageHandler, messageHandlerOptions);

// within the MessageHandler method, process message and then call
await subscriptionClient.CompleteAsync(message.SystemProperties.LockToken);

Implementing Storage Queues

Creating the queue, sending a message, receiving a message

CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudQueueClient client = account.CreateCloudQueueClient();
CloudQueue queue = client.GetQueueReference("myqueue");

// sender application should always be responsible for creating the queue
await queue.CreateIfNotExistsAsync();

// send a message
var message = new CloudQueueMessage("your message here");
await queue.AddMessageAsync(message);

// receive a message
CloudQueueMessage message = await queue.GetMessageAsync();
if (message != null)
{
    // process the message
    // ...
    await queue.DeleteMessageAsync(message);
}

Azure Event Grid

Good for a publisher with many subscribers e.g. upload a music file, notify any potentially interested subscribers.

Azure Event Grid is a fully managed event routing service running on top of Azure Service Fabric. It distributes events from different sources (blob storage, media services) to different handlers (functions, webhooks).

Terminology

An event occurs at an event source, e.g. Azure Storage is the event source for “blob created” events. An event publisher is the user or organisation that publishes the events, e.g. Microsoft. The terms publisher and event source are often used interchangeably.

The source publishes events to a topic. Topics are created for groups of related event categories, e.g. user events and order events. System topics are provided by Azure and can be subscribed to (they are not shown in your subscription). Custom topics are for applications and third parties (visible in your subscription).

The events are routed according to subscriptions and sent to event handlers or subscribers.

Azure Event Grid

Many different Azure services can generate events.

Events are JSON messages with a particular format.

[
  {
    "topic": string,
    "subject": string,
    "id": string,
    "eventType": string,
    "eventTime": string,
    "data":{
      object-unique-to-each-publisher
    },
    "dataVersion": string,
    "metadataVersion": string
  }
]
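When the handler is a custom webhook, Event Grid first sends a subscription validation event that the endpoint must echo back before normal events are delivered. A minimal sketch in JavaScript (handleEvents is a hypothetical helper, not part of any SDK):

```javascript
// Hypothetical helper: inspect a batch of Event Grid events and answer
// the subscription validation handshake if one is present.
function handleEvents(events) {
  for (const e of events) {
    if (e.eventType === "Microsoft.EventGrid.SubscriptionValidationEvent") {
      // Echo the validation code back so Event Grid activates the subscription
      return { validationResponse: e.data.validationCode };
    }
    // ... otherwise process e.data as a normal event ...
  }
  return null;
}
```

The returned object would be serialised as the webhook's HTTP response body during the handshake.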

Reasons to choose Event Grid

Azure Event Hubs

Useful for handling a massive number of events, e.g. millions of events per second. Optimised for extremely high throughput, large numbers of publishers and subscribers, and resiliency.

As events are received, they are divided into partitions (buffers). By default, events stay in the buffers for 24 hours if subscribers are not yet ready to receive them. Events can be sent immediately to Azure Data Lake or Blob storage for immediate persistence. Event publishers can be authorised with tokens.

An event is a small packet of information (a datagram) that contains a notification. Events can be published individually, or in batches, but a single publication (individual or batch) can’t exceed 1 MB.

Event publishers are any app or device that can send out events using either HTTPS or AMQP 1.0.

Event subscribers are apps that use one of two supported programmatic methods to receive and process events from an Event Hub. Either:

  1. EventHubReceiver - A simple method that provides limited management options
  2. EventProcessorHost - Built on top of EventHubReceiver, providing a simpler interface with some automated options, e.g. distributing multiple instances across partitions

Multiple consumer groups can be used if required to process the events and provide different views.

Pricing: Basic, Standard, and Dedicated - differing in terms of supported connections, number of consumer groups and throughput. Defined at the Event Hub namespace level. Configure different hubs for different throughput requirements.

Reasons to choose Event Hub

Learning Path: Store data in Azure

Azure Blobs

There are three kinds of blobs:

  1. Block blobs - text or binary files designed to be read from beginning to end e.g. images - files larger than 100MB must be uploaded in “blocks”
  2. Page blobs - random access blobs - usually for backing storage for VHDs
  3. Append blobs - similar to block blobs, but optimized for append e.g. logging

Standard pattern using continuation tokens to get all resources:

BlobContinuationToken continuationToken = null;
BlobResultSegment resultSegment = null;

do
{
    resultSegment = await container.ListBlobsSegmentedAsync(continuationToken);

    // Do work here on resultSegment.Results

    continuationToken = resultSegment.ContinuationToken;
} while (continuationToken != null);

CODE BELOW FROM: git clone https://github.com/MicrosoftDocs/mslearn-store-data-in-azure.git

using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Extensions.Options;
using System.Linq;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

namespace FileUploader.Models
{
    public class BlobStorage : IStorage
    {
        private readonly AzureStorageConfig storageConfig;

        public BlobStorage(IOptions<AzureStorageConfig> storageConfig)
        {
            this.storageConfig = storageConfig.Value;
        }

        public Task Initialize()
        {
            var container = GetContainerReference();
            return container.CreateIfNotExistsAsync();
        }

       // The stream-based upload code shown here is more efficient than reading the file into a byte array
       // before sending it to Azure Blob storage. However, the ASP.NET Core IFormFile technique you use to
       // get the file from the client is not a true end-to-end streaming implementation, and is only
       // appropriate for handling uploads of small files.
        public Task Save(Stream fileStream, string name)
        {
            var container = GetContainerReference();
            var blockBlob = container.GetBlockBlobReference(name);
            return blockBlob.UploadFromStreamAsync(fileStream);
        }

        public async Task<IEnumerable<string>> GetNames()
        {
            List<string> names = new List<string>();

            var container = GetContainerReference();
            BlobContinuationToken continuationToken = null;
            BlobResultSegment resultSegment = null;

            do
            {
                resultSegment = await container.ListBlobsSegmentedAsync(continuationToken);

                // Get the name of each blob.
                names.AddRange(resultSegment.Results.OfType<ICloudBlob>().Select(b => b.Name));

                continuationToken = resultSegment.ContinuationToken;
            } while (continuationToken != null);

            return names;
        }

        public Task<Stream> Load(string name)
        {
            var container = GetContainerReference();
            return container.GetBlobReference(name).OpenReadAsync();
        }

        private CloudBlobContainer GetContainerReference()
        {
            var account = CloudStorageAccount.Parse(storageConfig.ConnectionString);
            var client = account.CreateCloudBlobClient();
            return client.GetContainerReference(storageConfig.FileContainerName);
        }
    }
}

Azure Files

File shares using SMB, e.g. for multiple VMs to access shared files, storing log files, etc.

Azure Queues

Use queues to loosely connect parts of an application together.

Learning Path: Deploy a website with Azure virtual machines

| What? | Typical tasks | Sizes |
| --- | --- | --- |
| General use computing/web | Testing and development, small to medium databases, or low to medium traffic web servers | B, Dsv3, Dv3, DSv2, Dv2 |
| Heavy computational tasks | Medium traffic web servers, network appliances, batch processes, and application servers | Fsv2, Fs, F |
| Large memory usage | Relational database servers, medium to large caches, and in-memory analytics | Esv3, Ev3, M, GS, G, DSv2, Dv2 |
| Data storage and processing | Big data, SQL, and NoSQL databases that need high disk throughput and I/O | Ls |
| Heavy graphics rendering or video editing … | … as well as model training and inferencing (ND) with deep learning | NV, NC, NCv2, NCv3, ND |
| High-performance computing (HPC) | Your workloads need the fastest and most powerful CPU virtual machines with optional high-throughput network interfaces | H |

Azure Automation Services

Instead of using PowerShell, the Azure CLI, etc. to manage VMs directly, higher-level services such as Azure Automation can be used.

Azure Automation allows automating frequent, time-consuming, and error-prone management tasks.

Process Automation

Process automation allows setting up watcher tasks that can respond to events that occur in the datacenter, e.g. monitoring a VM for a specific error event.

Configuration Management

Used to track software updates that become available for the OS. Microsoft Endpoint Configuration Manager is used to manage a company’s PCs, servers, and mobile devices. Extend this support to Azure VMs with Configuration Manager.

Update Management

Manage updates and patches for VMs. Assess the status of available updates, schedule installation, and review deployment results to verify updates were applied successfully. Update Management incorporates services that provide process and configuration management. Enable Update Management for a VM directly from the Azure Automation account, or for a single virtual machine from the virtual machine pane in the portal.

Manage availability

Fault Domains

Install an SSH key on an existing Linux VM

ssh-copy-id -i ~/.ssh/id_rsa.pub azureuser@myserver
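If no key pair exists yet, one can be generated first with ssh-keygen - a minimal sketch, assuming a throwaway path (/tmp/demo_key) for illustration:

```shell
# Generate a new 4096-bit RSA key pair at an assumed throwaway path
rm -f /tmp/demo_key /tmp/demo_key.pub
ssh-keygen -t rsa -b 4096 -f /tmp/demo_key -N "" -q
# Then copy the public half to the VM:
# ssh-copy-id -i /tmp/demo_key.pub azureuser@myserver
```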

Some Linux commands

Initialize the extra data disk

Check status of service e.g. Apache

sudo systemctl status apache2 --no-pager

Learning Path: Manage resources in Azure

There are three approaches to deploying cloud resources: public, private, and hybrid cloud.

Private cloud

Hybrid cloud

Azure CLI

Azure PowerShell

Manage costs

Total Cost of Ownership Calculator: https://azure.microsoft.com/en-gb/pricing/tco/calculator/

  1. Define workloads of current on premise solution: Servers, Databases, Storage, Networking
  2. Adjust assumptions: software licenses, electricity, IT administration

Resource Groups

POST https://management.azure.com/subscriptions/<your-subscription-id>/resourceGroups/<your-source-group>/validateMoveResources?api-version=2019-05-10
Authorization: Bearer <your-access-token>
Content-type: application/json

with the following JSON body

{
 "resources": ["<your-resource-id-1>", "<your-resource-id-2>", "<your-resource-id-3>"],
 "targetResourceGroup": "/subscriptions/<your-subscription-id>/resourceGroups/<your-target-group>"
}

Alternatively submit using Azure CLI:

az rest --method post \
   --uri https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/<your-source-group>/validateMoveResources?api-version=2019-05-10 \
   --body "{\"resources\": [\"<your-resource-id-1>\", \"<your-resource-id-2>\", \"<your-resource-id-3>\"], \"targetResourceGroup\": \"/subscriptions/<your-subscription-id>/resourceGroups/<your-target-group>\"}"

Learning Path: Deploy a website to Azure with Azure App Service

Install dotnet cli in the cloud Shell

wget -q -O - https://dot.net/v1/dotnet-install.sh | bash -s -- --version 3.1.102
export PATH="~/.dotnet:$PATH"
echo "export PATH=~/.dotnet:\$PATH" >> ~/.bashrc

Manual deployment

dotnet publish -o pub
cd pub
zip -r site.zip *
az webapp deployment source config-zip \
    --src site.zip \
    --resource-group learn-c5f54c43-2def-448d-b6b8-25db523f586f \
    --name <your-app-name>

Learning Path: Secure your cloud data