Azure Virtual Network: Communication Between vNets

I was able to set up a VM in a vNet and RDP to it. That is the simplest scenario for using Azure IaaS. A virtual network provides isolation for a group of related resources that talk to each other. However, a single isolated network is usually not how it is used in the real world.

In practice, many services live in their own isolated environments and expose endpoints that other services can call. Note: I am not discussing microservices here. Regardless of the term, each service stays inside a virtual machine in a virtual network. What would it take to make them talk to each other?

Following the steps from the previous post, I created another setup in Central US.

Network Peering

There are 2 different virtual networks in different locations, with different address spaces.

2 vNet ready for peering

To connect 2 virtual networks, there is Virtual Network Peering. From each virtual network, create a peering to the other.

A peering can:

  1. Connect 2 virtual networks (of course there must be 2), even when they are in different regions.
  2. Connect virtual networks in different subscriptions. It is possible to select another subscription when creating the peering.

Creating a peering is pretty simple

Create a peer resource

The above creates a peering from ps-az300-vnet to ps-vnet. To finish the peering, create another one from ps-vnet to ps-az300-vnet.

The peering is ready. Let’s see if these virtual machines can talk to each other

2 VMs from different regions

Let’s RDP to each machine and test a connection to the other. This picture makes my day

Test connection between 2 virtual machines
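
For the record, the reachability check can also be scripted. Below is a minimal C# sketch (my own, not from the original test) run from inside one VM, probing a hypothetical peer IP on port 3389, the port the NSG rule opens:

using System;
using System.Net.Sockets;

class ConnectivityCheck
{
    static void Main()
    {
        var peerIp = "10.1.0.4"; // hypothetical private IP of the VM in the other vNet
        var port = 3389;         // RDP, the port opened by the NSG rule

        using (var client = new TcpClient())
        {
            try
            {
                client.ConnectAsync(peerIp, port).Wait(TimeSpan.FromSeconds(5));
                Console.WriteLine(client.Connected ? "Reachable over the peering" : "Timed out");
            }
            catch (AggregateException)
            {
                Console.WriteLine("Connection refused or blocked by the NSG");
            }
        }
    }
}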

So far, I am able to:

  1. Create a virtual machine with its network setup. In more abstract terms, I create an isolated environment that allows me to deploy whatever I want.
  2. Connect the 2 isolated environments via an Azure peering resource.

Gateway, Hub-Spoke Topology

Another option is to use a VPN gateway or a hub-spoke topology. These are more advanced topics that I do not really need to grasp at the moment. There are step-by-step guides on the MS Docs site.

References

Azure virtual network peering

Hub-spoke

Azure Virtual Network

Azure has been around for a while. It is huge. I once said that I would study Azure. Then I started. Lost. There are so many materials out there: the wonderful MS Docs site, the excellent Pluralsight, and many other personal blogs. “How do I start? Where do I start?” I asked.

I took the chance to read around and tried to capture some Azure concepts, especially the mindset. Without the correct mindset, everything is a mess, and what I read would only confuse me more.

Almost everything in Azure is a resource. To manage resources, there is the Resource Manager. A resource can be created and managed using templates, so there are Resource Manager templates. As a developer, that part makes sense to me.

The design is modular and component-based. At a high level, it follows familiar software design principles.

A virtual machine is deployed to the cloud. Its connectivity is controlled by a Network Interface (NIC), a separate resource.

Let’s say we need to deploy a virtual machine in Azure, and we should be able to remote (RDP) into it. How many resources do we need? How does it look? Let’s find out.

All resources inside rg-az300 resource group

All my resources start with ps-az300. The rest are auto-generated by Azure or leftovers from my mistakes while experimenting.

  1. Resource group: rg-az300
  2. Virtual network (vNet): ps-az300-vnet
  3. Virtual machine: ps-az300
  4. Network interface (NIC): ps-az300-nic
  5. Network security group (NSG): ps-az300-nsg
  6. Public IP address (PIP): ps-az300-pip

Resource Group

Resource groups are logical containers for everything. All the resources used to set up our example are grouped in one resource group. Once the experiment is complete, deleting the resource group wipes out all of its resources.

Virtual Network

A virtual network (vNet) provides an isolated environment in which the resources inside it can talk to each other. It increases security.

Network Security Group (NSG)

An NSG is like the firewall in Windows: it defines the inbound and outbound rules. Besides the default rules generated by Azure, the inbound rule “RDP_3389” is created to allow remote desktop connections.

NSG with RDP inbound rule

Network Interface (NIC)

A NIC acts as a layer between a resource and the internet. A virtual machine should not define its firewall directly; instead, a network interface is attached to it.

A NIC with public IP address

A network interface belongs to a vNet, has an NSG, and attaches to a virtual machine. It may also have a public IP address, defined by a public IP address resource (ps-az300-pip).

This network interface allows the VM (ps-az300) to communicate with other resources or over the internet. What it can communicate with depends on the NSG settings.

Its public IP address is configured under Settings -> IP configurations.

NIC IP configurations

The interesting thing here is the public IP address. One can create a PIP easily; just remember to choose Static in the Assignment section.

Public IP Address (PIP)

As seen in the NIC section above, the public IP address for the NIC is 23.101.16.27, supplied by Azure. With the static assignment, the address is reserved for the public IP address resource; that is why I chose static.

Virtual Machine (VM)

Just go through the Azure wizard and choose the settings: resource group, network security group, network interface, virtual network. There is another course on creating virtual machines in Azure; I hope I can write something about it soon.

Since I am learning about virtual networks, the most interesting part of the virtual machine setup is the Networking section.

All components linked to build a networking for VM

The Virtual Machine is the actual resource that hosts the business services we want to deploy, say a website or an internal web service. It will:

  1. Use the network interface ps-az300-nic to communicate with the outside
  2. Run under the virtual network with the default subnet
  3. Have a public IP address 23.101.16.27 with a private IP (10.1.0.13) inside its virtual network
  4. Follow the inbound/outbound rules from the network security group ps-az300-nsg

With all of that set up, I can click the Connect button and download the RDP file.

There are many things in this process that I do not understand yet, and many concepts in the images I pasted here. That’s OK. Things are already making more sense to me.

The next challenge is to have 2 virtual machines in different virtual networks communicate with each other.

Azure Functions Getting Started

Update: I stopped for a while, and Azure has changed tremendously since. I am publishing this post so that I at least have a reference later. The content was never finished 🙁

When something happens, execute some logic and output the result (if there is one) somewhere. At the highest abstraction, Azure Functions is as simple as that. Yet that simplicity captures almost every scenario in real life.

I was about to write up some Azure Functions concepts. However, MS Docs is so good that you can go ahead and read it there. I posted the link below to save you some typing, and for my own reference next time:

Azure Functions Official Documentation from MS

Trigger – Action – Output

When a result is placed into storage, it can trigger another action, and then another. The chain keeps going and stops when the business requirement is met. This is where the power of Azure Functions lies: it gives us a flexible tool to build complex business processes. The limit is our design skill. Yes, I know there are limitations in terms of technical challenges, but with good design and architectural skills, you can build almost whatever you want to accomplish.
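
To make the chain concrete, here is a minimal sketch of my own (queue names and payload are made up, not from any official sample): a queue message triggers the function, the function does its work, and its output goes to another queue, which can trigger the next function.

using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class ChainDemo
{
    [FunctionName("ProcessUpload")]
    public static void Run(
        [QueueTrigger("incoming-files")] string fileName,       // Trigger: a new queue message
        [Queue("processed-files")] out string processedMessage, // Output: goes to the next queue
        TraceWriter log)
    {
        log.Info($"Processing {fileName}");                      // Action
        processedMessage = $"{fileName} processed at {DateTime.UtcNow:O}";
        // A second function with [QueueTrigger("processed-files")] picks this up,
        // and the chain continues until the business requirement is met.
    }
}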

Trigger Action Output

I am a developer. I need to go deeper. And most importantly, I have to answer this question:

What can I do with such a powerful tool?

After reading around the internet and watching courses on Pluralsight, I headed over to the Azure portal and started my first function. Most of what I write here is not new; many others have written about it. The point of writing is my own learning: I want to document what I learn in my own words.

Let’s build a simple team cooperation process using the available tools. Say that in my team, whenever we do a release, we:

  1. Prepare code, build
  2. Prepare release notes
  3. Deploy to test environment
  4. Smoke test
  5. Review system logs

There are more steps, but those are the basics.

Core Concepts

As a C# developer, I find this document perfect for my reference.

The biggest challenge, as a developer, is to get it done right. The infrastructure has a lot of built-in support: there is a variety of input and output bindings, and each binding comes with a set of convenient conversions as well as customization options at your disposal.
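
As a small illustration of that conversion support (a sketch with hypothetical names, not code from the docs), a queue trigger can deserialize a JSON message straight into a POCO:

using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public class ReleaseRequest
{
    public string Version { get; set; }
    public string Notes { get; set; }
}

public static class BindingConversionDemo
{
    [FunctionName("OnReleaseRequested")]
    public static void Run(
        [QueueTrigger("release-requests")] ReleaseRequest request, // the JSON payload is converted to the POCO by the binding
        TraceWriter log)
    {
        log.Info($"Preparing release {request.Version}: {request.Notes}");
    }
}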

Azure Function in Azure Portal

 

Azure Function from Visual Studio

Environment: Visual Studio 2017 Enterprise, .NET Core 2.0

After creating a new Functions project, here is what VS gives me:

using System.IO;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Azure.WebJobs.Host;
using Newtonsoft.Json;

namespace Aduze.Functions
{
    public static class HttpTriggerFunctions
    {
        [FunctionName("HttpTriggerDemo")]
        public static IActionResult Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]HttpRequest req, TraceWriter log)
        {
            log.Info("C# HTTP trigger function processed a request.");

            string name = req.Query["name"];

            string requestBody = new StreamReader(req.Body).ReadToEnd();
            dynamic data = JsonConvert.DeserializeObject(requestBody);
            name = name ?? data?.name;

            return name != null
                ? (ActionResult)new OkObjectResult($"Hello, {name}")
                : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
        }
    }
}

There are notions of AspNetCore and WebJobs here. The HttpTrigger function accepts an HttpRequest and returns an IActionResult; anyone who has written ASP.NET MVC code knows what they are.

Challenge for Architects

How should you architect a system with all the power you have from Azure?

Azure WebJob Getting Started

With the ASP.NET Core website set up and basic EF Core in place, let’s have some fun with Azure WebJobs.

Journey to Azure

In its simplest form, a WebJob is a background service (think of a Windows Service) running alongside the website (or web application). The typical usage is that it handles the long-running jobs for the web application, thereby freeing the web application to serve as many requests as possible.

For a textbook scenario, think of a book management website. A user has many books. One day, he wants to download all his books in a zip file. Assume the action takes time, a matter of minutes, so you do not want your users to wait. At a high design level, the steps are:

  1. Record a download all books request.
  2. Trigger a background job to handle the request: Read all books and create a zip file.
  3. Email the user with a link to download the zip file.

#1 and #3 are handled by the web application. #2 is a good candidate for a WebJob. There are other ways to implement #2, but in the context of this post, it is a WebJob.
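
Just to make #2 concrete, here is a rough sketch of my own (not production code); the queue name, blob container, and payload are made up for illustration:

using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public class BookJobs
{
    public static async Task ZipAllBooks(
        [QueueTrigger("book-download-requests")] string userId,                        // #1 drops the user's request here
        [Blob("book-archives/{queueTrigger}.zip", FileAccess.Write)] Stream zipOutput, // the zip file lands in blob storage
        TextWriter log)
    {
        await log.WriteLineAsync($"Creating the archive for user {userId}");
        // Read all of the user's books, write them into zipOutput,
        // then notify the web application so it can email the download link (#3).
    }
}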

That is the overview, the abstract level. It looks simple and easy to understand. But (everything has a but) the devil is in the details. Let’s get our hands dirty in the code.

Context

Everything has its own context. Here is mine.

Given that there is an ASP.NET Core 2.0 website running in Azure App Service, with configuration settings and connection strings configured through the portal dashboard,

I want to be able to build a WebJob that:

  1. Can consume those settings, so I can manage application settings in one place and change them at will without redeployment.
  2. Can take advantage of Dependency Injection from Microsoft.Extensions (the same as an ASP.NET Core application).

Simple as that!

Environment and Code

Visual Studio Enterprise 2017

Version 15.6.4

.NET Framework 4.7.02556

If you are using a different environment, some default settings might be different.

Add a WebJob project from VS installed templates

Visual Studio Azure WebJob

All the details are in the MS docs.

I found a perfect post that has what I need. To implement Dependency Injection, I need to tell the JobHostConfiguration to use my custom JobActivator.

NuGet packages

Microsoft.Extensions.Configuration

Microsoft.Extensions.DependencyInjection

Microsoft.Extensions.Configuration.Json

Microsoft.Extensions.Options.ConfigurationExtensions

Microsoft.Extensions.Configuration.EnvironmentVariables

Before writing the code, install those packages using NuGet; I prefer the Package Manager Console. Tip: press Tab for auto-completion.

    using System.IO;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Configuration;
    using Microsoft.Extensions.DependencyInjection;

    class Program
    {
        // Please set the following connection strings in app.config for this WebJob to run:
        // AzureWebJobsDashboard and AzureWebJobsStorage
        static void Main()
        {
            IServiceCollection serviceCollection = new ServiceCollection();
            ConfigureServices(serviceCollection);
            var config = new JobHostConfiguration
            {
                JobActivator = new ServiceCollectionJobActivator(serviceCollection.BuildServiceProvider())
            };
            if (config.IsDevelopment)
            {
                config.UseDevelopmentSettings();
            }
            
            // See full trigger extensions https://github.com/Azure/azure-webjobs-sdk-extensions/blob/master/README.md 
            config.UseTimers();
            var host = new JobHost(config);
            
            // The following code ensures that the WebJob will be running continuously
            host.RunAndBlock();
        }
        /// <summary>
        /// https://matt-roberts.me/azure-webjobs-in-net-core-2-with-di-and-configuration/
        /// </summary>
        /// <param name="serviceCollection"></param>
        private static void ConfigureServices(IServiceCollection serviceCollection)
        {
            var config = new ConfigurationBuilder()
                .SetBasePath(Directory.GetCurrentDirectory())
                .AddJsonFile("appsettings.json", optional:true, reloadOnChange:true)
                .AddEnvironmentVariables()
                .Build();
            serviceCollection.AddOptions();
            serviceCollection.Configure<AppSetting>(config);
            // Configure custom services
            serviceCollection.AddScoped<Functions>();
        }
    }

  1. First, create a ServiceCollection and configure it with all the dependencies. Pay attention to the use of AddEnvironmentVariables().
  2. Second, create a custom IJobActivator: ServiceCollectionJobActivator.
  3. Third, wire them up:
    public class ServiceCollectionJobActivator : IJobActivator
    {
        private readonly IServiceProvider _serviceProvider;

        public ServiceCollectionJobActivator(IServiceProvider serviceProvider)
        {
            _serviceProvider = serviceProvider;
        }
        public T CreateInstance<T>()
        {
            return _serviceProvider.GetService<T>();
        }
    }

A very simple implementation. It tells the JobHostConfiguration to use the IServiceProvider (built from the ServiceCollection) to create instances.

And I have DI at will:

    public class Functions
    {
        private readonly AppSetting _settings;

        public Functions(IOptions<AppSetting> settingAccessor)
        {
            _settings = settingAccessor.Value;
        }
        public void FetchTogglTimeEntry([TimerTrigger("00:02:00")] TimerInfo timer, TextWriter log)
        {
            log.WriteLine("Toggl job settings: {0}", _settings.TogglJobSettings.Url);
        }
    }

    public class AppSetting
    {
        public TogglJobSettings TogglJobSettings { get; set; }
    }
    public class TogglJobSettings
    {
        public string Url { get; set; }
        public string SecretKey { get; set; }
    }

The Functions class accepts an IOptions&lt;AppSetting&gt; injected into its constructor, just like an MVC controller.

I wanted to have only TogglJobSettings. However, it does not work when IOptions&lt;TogglJobSettings&gt; is injected.
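
For reference, this is roughly the section-based binding I tried first (a sketch of the attempt; in my case the values did not bind from the Azure Application settings):

    private static void ConfigureServices(IServiceCollection serviceCollection)
    {
        var config = new ConfigurationBuilder()
            .SetBasePath(Directory.GetCurrentDirectory())
            .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
            .AddEnvironmentVariables()
            .Build();

        serviceCollection.AddOptions();
        // Bind only the TogglJobSettings section; Functions would then take IOptions<TogglJobSettings>.
        serviceCollection.Configure<TogglJobSettings>(config.GetSection("TogglJobSettings"));
        serviceCollection.AddScoped<Functions>();
    }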

Application Settings

Looking at the syntax (the __ separator in the keys), the binding should have been correct. However, whatever is in the Application settings gets an APPSETTING_ prefix, as can be seen by looking at the environment variables.

Go to the Azure website, access the Console, and type env to see all environment variables.

Azure Console View Environment Variables

Pretty cool tool 🙂

By creating a top-level class AppSetting (think of appSettings in web.config), things just work out of the box.

Wrap Up

Once they are set up, I can start writing business code. The WebJobs SDK and its extensions supply many ways of triggering a job. The DI infrastructure setup might happen once and be reused (copy and paste) many times in other WebJobs; still, I gained so much confidence and knowledge by getting my hands on the code. It is always the right way to learn anything.

If you are learning Azure, I suggest you open Visual Studio and start typing.

EF Core Getting Started

I am learning Entity Framework Core as part of my Azure journey. The database is an important part of an application. In the old days, developers wrote raw SQL queries. Later we had ADO.NET; more recently we have ORMs. I have had the chance to work with (or at least know) the 2 big players: NHibernate and Entity Framework.

An ORM does more than just map between the object model and the database representation (SQL tables and columns). Each ORM framework comes with plenty of features and supports a variety of scenarios; an ORM helps you build a better application. Let’s discover some of this with the latest ORM from Microsoft: Entity Framework Core.

I was amazed when I visited the official documentation site. Everything you need to learn is there, in well-written, understandable pages. For my learning, I started with Julie Lerman’s courses on Pluralsight. If you happen to have a Pluralsight account, go ahead and watch them; they are worth your time. Then I read the EF documentation on the official site.

It is easy to say “Hey, I know Entity Framework Core“. Yes, I understand it. But I need the skill, not just a mental understanding. To make sure I build EF skill, I write blog posts and write code. That is also my advice to you, developers.

Journey to Azure

Getting Started Objectives

  1. Define a simple domain model and hook it up with EF Core in an ASP.NET Core + EF Core project
  2. Migration: from code to database
  3. API testing with Postman or Fiddler (I do not want to spend time building a UI)
  4. Unit testing with the In-Memory provider and real databases
  5. Running on Azure with Azure SQL
  6. Retry strategy

1 – Domain Model

To get started, I have only this super simple domain model:

namespace Aduze.Domain
{
    public abstract class Entity
    {
        public int Id { get; set; }
    }

    public class User : Entity
    {
        public string LoginName { get; set; }
        public string FullName { get; set; }
        public Image Avatar { get; set; }
    }

    public class Image : Entity
    {
        public string Uri { get; set; }
    }
}

A User with an avatar (Image).

Next, I have to set up the DbContext:

namespace Aduze.Data
{
    public class AduzeContext : DbContext
    {
        public DbSet<User> Users { get; set; }

        public AduzeContext(DbContextOptions options)
        :base(options)
        {
            
        }
        
        protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
        {
        }

        protected override void OnModelCreating(ModelBuilder modelBuilder)
        {
            base.OnModelCreating(modelBuilder);
        }
    }
}

Pretty simple, just like the example on the documentation site. A quick note: I organize domain classes in the Domain project and the data access layer in the Data project. I do not like the term Repository very much.

Wire them up in the ASP.NET Core Web project:

       public void ConfigureServices(IServiceCollection services)
        {
            services.AddMvc();
            services.AddDbContext<AduzeContext>(options =>
            {
                options.UseSqlServer(Configuration.GetConnectionString("AduzeSqlConnection"))
                    .EnableSensitiveDataLogging();
            });
            services.AddLogging(log =>
                log.AddAzureWebAppDiagnostics()
                    .AddConsole());
        }

Just call the extension method AddDbContext and done. God damn simple!

2 – Migration

The system cannot work unless there is a database. There are 2 possible approaches:

  1. Use your SQL skills and create the database with the correct schema.
  2. Use what EF offers.

I have done the former for many years. Let’s explore the latter.

With VS 2017 open, access the Package Manager Console window.

Add-Migration

EF Core Add Migration

  1. Default project: Aduze.Data, where the DbContext is configured.
  2. Add-Migration: a PowerShell command supplied by EF Core. Tip: type Get-Help Add-Migration to ask for help.
  3. InitializeUser: the migration name. One can give it whatever name makes sense.

After it executes, a “Migrations” folder is added to the Data project. Visit the EF Core documentation to understand what it does and its syntax.
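
For a rough idea of what lands in that folder, the scaffolded migration looks more or less like this (a trimmed sketch; the real file also creates the Image table, the foreign key, and the index):

using Microsoft.EntityFrameworkCore.Metadata;
using Microsoft.EntityFrameworkCore.Migrations;

public partial class InitializeUser : Migration
{
    protected override void Up(MigrationBuilder migrationBuilder)
    {
        // Creates the Users table; the Image table is created the same way.
        migrationBuilder.CreateTable(
            name: "Users",
            columns: table => new
            {
                Id = table.Column<int>(nullable: false)
                    .Annotation("SqlServer:ValueGenerationStrategy", SqlServerValueGenerationStrategy.IdentityColumn),
                AvatarId = table.Column<int>(nullable: true),
                FullName = table.Column<string>(nullable: true),
                LoginName = table.Column<string>(nullable: true)
            },
            constraints: table =>
            {
                table.PrimaryKey("PK_Users", x => x.Id);
            });
    }

    protected override void Down(MigrationBuilder migrationBuilder)
    {
        migrationBuilder.DropTable(name: "Users");
    }
}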

Script-Migration

So what does the SQL script look like?

PM> Script-Migration
IF OBJECT_ID(N'__EFMigrationsHistory') IS NULL
BEGIN
    CREATE TABLE [__EFMigrationsHistory] (
        [MigrationId] nvarchar(150) NOT NULL,
        [ProductVersion] nvarchar(32) NOT NULL,
        CONSTRAINT [PK___EFMigrationsHistory] PRIMARY KEY ([MigrationId])
    );
END;

GO

CREATE TABLE [Image] (
    [Id] int NOT NULL IDENTITY,
    [Uri] nvarchar(max) NULL,
    CONSTRAINT [PK_Image] PRIMARY KEY ([Id])
);

GO

CREATE TABLE [Users] (
    [Id] int NOT NULL IDENTITY,
    [AvatarId] int NULL,
    [FullName] nvarchar(max) NULL,
    [LoginName] nvarchar(max) NULL,
    CONSTRAINT [PK_Users] PRIMARY KEY ([Id]),
    CONSTRAINT [FK_Users_Image_AvatarId] FOREIGN KEY ([AvatarId]) REFERENCES [Image] ([Id]) ON DELETE NO ACTION
);

GO

CREATE INDEX [IX_Users_AvatarId] ON [Users] ([AvatarId]);

GO

INSERT INTO [__EFMigrationsHistory] ([MigrationId], [ProductVersion])
VALUES (N'20180420112151_InitializeUser', N'2.0.2-rtm-10011');

GO

Cool! I can take the script and run it in SQL Server Management Studio. Having the scripts ready, I can use them to create the Azure SQL database later on.

Update-Database

This command allows me to create the database directly from the Package Manager Console (which is PowerShell). Let’s see:

PM> Update-Database -Verbose

With Verbose turned on, it logs everything to the console. The result is my database, created:

EF Update Database

It is very smart. How does it do that?

  1. It reads the startup project Aduze.Web and extracts the connection string from appsettings.json.
  2. It runs the migrations created by the Add-Migration command.

3 – API Testing

So far, nothing has actually touched the database yet. Let’s add a controller:

namespace Aduze.Web.Controllers
{
    public class UserController : Controller
    {
        private readonly AduzeContext _context;

        public UserController(AduzeContext context)
        {
            _context = context;
        }
        [HttpPost]
        public async Task<IActionResult> Create([FromBody]User user)
        {
            _context.Add(user);
            await _context.SaveChangesAsync();
            return Json(user);
        }

        [HttpGet]
        public async Task<IActionResult> Index()
        {
            var users = await _context.Users.ToListAsync();
            return Json(users);
        }
    }
}

A typical Web API controller.

  1. Create: inserts a user. There is no validation, no mapping between request and domain, … It is not production code.
  2. Index: lists all users.

Here is the test using Postman:

API Test with Postman

If I invoke the /user endpoint, the user is on the list.

Hey, what was going on behind the scenes?

EF SQL Log

There is plenty of information you can inspect in the Debug window. When inserting a user, these are the queries sent to the database (you should also see the one that inserts the avatar image).

So far so good. I have gone from a domain model to a full-flow API endpoint. How about unit testing?

4 – Unit Test

One of the biggest concerns when writing unit tests is the database dependency. How can EF Core help? It has an In-Memory provider. But first, I have to refactor my code, since I do not want to test the API controller.

namespace Aduze.Data
{
    public class UserData
    {
        private readonly AduzeContext _context;

        public UserData(AduzeContext context)
        {
            _context = context;
        }

        public async Task<User> Create(User user)
        {
            _context.Add(user);
            await _context.SaveChangesAsync();
            return user;
        }

        public async Task<IEnumerable<User>> GetAll()
        {
            return await _context.Users.ToListAsync();
        }
    }
}

namespace Aduze.Web.Controllers
{
    public class UserController : Controller
    {
        private readonly UserData _userData;

        public UserController(UserData userData)
        {
            _userData = userData;
        }
        [HttpPost]
        public async Task<IActionResult> Create([FromBody]User user)
        {
            return Json(await _userData.Create(user));
        }

        [HttpGet]
        public async Task<IActionResult> Index()
        {
            return Json(await _userData.GetAll());
        }
    }
}

That should do the trick. Then just register the new UserData service with the IoC container:

        public void ConfigureServices(IServiceCollection services)
        {
            services.AddMvc();
            services.AddDbContext<AduzeContext>(options =>
            {
                options.UseSqlServer(Configuration.GetConnectionString("AduzeSqlConnection"))
                    .EnableSensitiveDataLogging();
            });
            services.AddScoped<UserData>();
        }

Time to create a test project, Aduze.Tests, and then install the Microsoft.EntityFrameworkCore.InMemory package:

PM> Install-Package Microsoft.EntityFrameworkCore.InMemory

This is really cool, see below:

Unit Test DbContext in Memory

Because my refactored UserData uses the async versions, it seems to have a problem with the MS Test runner. But the approach is the same as testing directly against AduzeContext.

  1. Use DbContextOptionsBuilder to tell EF Core that the context will use the In-Memory provider.
  2. Pass the options to the DbContext constructor.

Having the power to control which provider will be used is a powerful design. One can have a test suite that is independent of the provider. Most of the time we test with the In-Memory provider, but when the time comes to verify that the database schema is correct, we can switch to a real database.
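
A minimal sketch of such a test (MSTest, with made-up test data), assuming the Microsoft.EntityFrameworkCore.InMemory package is installed:

using System.Linq;
using System.Threading.Tasks;
using Aduze.Data;
using Aduze.Domain;
using Microsoft.EntityFrameworkCore;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class UserDataTests
{
    [TestMethod]
    public async Task Create_Then_GetAll_Returns_The_Created_User()
    {
        // 1. Tell EF Core that this context uses the In-Memory provider (the database name is arbitrary).
        var options = new DbContextOptionsBuilder<AduzeContext>()
            .UseInMemoryDatabase(databaseName: "user-data-tests")
            .Options;

        // 2. Pass the options to the DbContext constructor.
        using (var context = new AduzeContext(options))
        {
            var userData = new UserData(context);
            await userData.Create(new User { LoginName = "jdoe", FullName = "John Doe" });

            var users = await userData.GetAll();
            Assert.AreEqual(1, users.Count());
        }
    }
}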

5 – Azure SQL

Time to grow up … to the cloud, with these simple steps:

  1. Publish the web app to Azure.
  2. Create the Azure SQL database.
  3. Update the connection string.
  4. Run the script (remember the Script-Migration command?) to create the database schema.

Azure set ConnectionString

Just add the connection string AduzeSqlConnection (the same one defined in appsettings.json for local development).

Test again with Postman. Oh yeah baby. It works like a charm.

6 – Retry Strategy

This topic is not something I want to explore at this stage of my learning journey, but it is important to be aware of; at the very least, note down the reference link to Connection Resiliency.
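
For reference, enabling the built-in retry strategy is a small change when registering the DbContext; a minimal sketch:

        services.AddDbContext<AduzeContext>(options =>
        {
            options.UseSqlServer(
                Configuration.GetConnectionString("AduzeSqlConnection"),
                sqlOptions => sqlOptions.EnableRetryOnFailure()); // retries transient failures, e.g. Azure SQL throttling
        });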

 

Wrap Up

It is neither new nor complicated if we only look at the surface. However, when I get my hands dirty with the code and the writing, I learn so much. Knowing how to define a DbContext is easy; understanding why it was designed that way is a completely different story.

But is that all there is to EF Core? No, it is just the beginning. There are many things developers will look into when they hit problems in real projects. The documentation is there, the community is there. Oh, and S.O. has all the answers.

What I will look at next is how EF Core supports developers with DDD (Domain Driven Design).
