Database Creation with DbUp

This is going to be a quick post to show how to configure DbUp to create a new database on top of the migration functionality we have covered in the past. Use the following posts to catch up on the DbUp topics covered so far.

Database Migrations with DbUp
Code-based Database Migrations with DbUp
Always Run Migrations with DbUp
Logging Script Output with DbUp
Migrations in Transactions with DbUp

Database Creation

The code for database creation is a one-liner that just needs a connection string and should be run before any migrations. The following is part of the Main function for the sample application we have been working with, with the EnsureDatabase line added, which is the code that takes care of database creation.

static int Main(string[] args)
{
    var connectionString =
        args.FirstOrDefault()
        ?? "Server=localhost; Database=WideWorldImporters; Trusted_connection=true";

    EnsureDatabase.For.SqlDatabase(connectionString);

    var upgrader =
        DeployChanges.To
                     .SqlDatabase(connectionString)
                     .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                           f => !f.Contains(".AlwaysRun."))
                     .LogToConsole()
                     .WithTransaction()
                     .Build();

    var result = upgrader.PerformUpgrade();

Wrapping Up

As you can see, it is simple to add the ability to create databases in your DbUp applications. Database creation in the official DbUp docs is just a note and is easy to miss, so I wanted to do a post to call it out as it is a powerful feature when you need it. In our example, the application will always attempt to create the database if it doesn’t exist, but if that doesn’t fit your use case it might be worth wrapping database creation in an argument so the user could decide if the database should be created or not.
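
If you wanted to go that route, the following is a minimal sketch of gating database creation behind an argument. The --create-db flag and the argument handling here are hypothetical and not part of the sample project; the only DbUp call involved is the same EnsureDatabase line shown above.

// Hypothetical flag; any argument parsing approach would work here.
var createDatabase = args.Contains("--create-db");

var connectionString =
    args.FirstOrDefault(a => !a.StartsWith("--"))
    ?? "Server=localhost; Database=WideWorldImporters; Trusted_connection=true";

if (createDatabase)
{
    // Same call as above, just limited to runs where the caller asked for it.
    EnsureDatabase.For.SqlDatabase(connectionString);
}

// ... the rest of the upgrader setup and PerformUpgrade call stay the same.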

Don’t Launch a Browser Running ASP.NET Core Back-end Created from Web Template Studio

In last week’s post, I mentioned off-hand that we could stop VSCode from launching a web browser when starting a debug session on the back-end of our application. This is going to be a quick post on how to stop that browser launch.  The following are the previous Web Template Studio related posts if you want to catch up.

Create an Application with Web Template Studio
Debug ASP.NET Core Back-end Created from Web Template Studio

 

Launch Configurations

If you recall from last week, to run our back-end we used the debug tab in VSCode and then hit the Run button for the .NET Core Launch (web) configuration.

The options that are available for running in VSCode are controlled by a launch.json file found in the .vscode directory. The following screenshot is the launch.json for the sample project from last week.

Stop the Browser

To stop the browser from opening, delete the serverReadyAction section in the specific configuration you are dealing with. From the sample in the screenshot above, the following is what would be removed from the .NET Core Launch (web) configuration.

// Enable launching a web browser when ASP.NET Core starts. For more information: https://aka.ms/VSCode-CS-LaunchJson-WebBrowser
"serverReadyAction": {
    "action": "openExternally",
    "pattern": "\\bNow listening on:\\s+(https?://\\S+)"
},

With the above removed, hitting the Run button for this configuration will no longer launch a browser, but the rest of the debugging experience will remain as it was before.

Wrapping Up

Hopefully, this will help remove a little annoyance from development. For more details on serverReadyAction check out the docs. I also recommend checking out the full docs for launch.json to get a good feel for what all is possible.

Debug ASP.NET Core Back-end Created from Web Template Studio

Last week we covered how to Create an Application with Web Template Studio and now that we have an application we are going to work through how to debug the ASP.NET Core back-end.

Sample Application

The sample application used in this post has been expanded from what we used last week to include two instances of each page type provided by Web Template Studio. The grid, list, and master/detail page types trigger controllers to be added to our back-end application, which will give us something to work with. To add these extra page types you have to run through the full wizard in Web Template Studio and not use the Create Project button on the first page of the wizard. The following is a screenshot of the new sample application with the extra page types.

Debugging in VSCode

Our goal in this post is going to be to hit a breakpoint in the controller action that returns the data for the grid page in the screenshot above. Looking through the project under server/Controllers it looks pretty likely that we are interested in the GridController. This controller only has one action, so click to the left of the line number on the line that has the return statement of the Get function to set a breakpoint.
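
If you don’t have the generated project in front of you, the controller in question is roughly shaped like the following. This is only an approximation for illustration; the exact code, namespace, and sample data come from the Web Template Studio template, so treat the names here as placeholders.

using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc;

namespace SampleApp.Controllers
{
    [Route("api/[controller]")]
    [ApiController]
    public class GridController : ControllerBase
    {
        [HttpGet]
        public IEnumerable<string> Get()
        {
            // Set the breakpoint on this return statement by clicking to the left of its line number.
            return new[] { "Sample item 1", "Sample item 2" };
        }
    }
}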

Unlike in the post from last week, to debug the back-end we need to run it separately from the front-end, which means we need to move away from using npm start to run the whole application. Use the following command to start just the front-end.

npm run start-frontend

Now to run the back-end go to the Run section in VSCode and then click the Play button to run the back-end using the .NET Core Launch (web) profile.

This will launch a blank page, which we don’t need, and it can be closed. If the blank page bothers you too much it can be changed in the launch profile. Now that our back-end is running, use the front-end to navigate to the Grid page.

Wrapping Up

Hopefully, with the above steps, you are ready to debug the back-end part of your application. I’m betting for most of us running the front-end and back-end separately is going to more closely match how we normally work. For more information on debugging in VSCode check out the official doc.

Create an Application with Web Template Studio

Creating new applications is something I do a fair amount of. Most of the time they are throwaway projects used to test something out or demo projects used for this blog. With all the project creation going on I try and keep an eye out for tools that make the process easier. This post is going to cover a tool I came across a few weeks ago from Microsoft called Web Template Studio (WebTS).

What is WebTS?

From the project’s GitHub README:

Microsoft Web Template Studio (WebTS) is a Visual Studio Code Extension that accelerates the creation of new web applications using a wizard-based experience. WebTS enables developers to generate boilerplate code for a web application by choosing between different front-end frameworks, back-end frameworks, pages and cloud services. The resulting web app is well-formed, readable code that incorporates cloud services on Azure while implementing proven patterns and best practices.

For the front-end, we have options of Angular, React, or Vue, and on the back-end, there are options for ASP.NET Core, Flask, Moleculer, or Node.

Installation

We are assuming you already have a recent version of Visual Studio Code (VSCode) installed, but if not you can download it from here. Open VSCode and from the left side of the screen select Extensions. In the search box at the top enter Web Template Studio and then from the list select Web Template Studio. In the detail page that opens click the green Install button at the top.

Project Creation

Now that we have the extension installed press Ctrl + Shift + P to open the VSCode command palette and enter Web Template Studio: Launch (or as much as needed for the option to show) and then select Web Template Studio: Launch.

This will launch the Web Template Studio project creation process. The first page that shows gives you a very basic set of options, but enough for our example. To see the full array of options Web Template Studio provides, hit Next on this first screen instead of Create Project like we do in this example. Also, note that if you want an ASP.NET [Core] back-end you need to select it on this summary screen as it isn’t yet an option later in the process. Back to our example: on this page, you will need to enter a Project Name and pick a Front-end Framework and a Back-end Framework before finally clicking Create Project.

After the process finishes a dialog will show letting us know it is complete. Click Open Project to load the created project in a new instance of VSCode.

Running the Project

Before getting to a runnable state we need to run a couple of terminal commands first. This can be done from VSCode’s built-in terminal or from an external terminal of your choice. In VSCode if you don’t see a terminal you can use Ctrl + Shift + ` to open a new one, or from the menu select Terminal > New Terminal. The following are the two commands that need to be run using npm, but they can be adjusted to use yarn instead if you prefer it. The second command may be specific to an ASP.NET Core back-end, so make sure and check the project README.md for verification.

npm install
npm run restore-packages

Now the project can be run with the following command.

npm start

The resulting application will look something like this.

Wrapping Up

WebTS seems to be a great tool for quickly getting a project up and running. I do recommend going through the full project creation wizard as it has options to set up Azure integration as well as the ability to add up to 20 different pages to the generated application. Also, keep in mind that the ASP.NET back-end framework option is pretty new so I wouldn’t be surprised to see some changes as it progresses through the preview stage. Make sure and check out the project’s GitHub repo.

Migrations in Transactions with DbUp

This post will be continuing our exploration of DbUp with the addition of transactions as part of migration execution. If you are new to DbUp check out the following posts to catch up on this series.

Database Migrations with DbUp
Code-based Database Migrations with DbUp
Always Run Migrations with DbUp
Logging Script Output with DbUp

Transaction Types and Restrictions

DbUp has three different options for transactions. The default is no transaction, which is what we have been using so far. The other two options are a transaction per migration and a single transaction for all the migrations in a single upgrader. As far as I can tell, there is no way to use the same transaction across multiple upgraders, which means that in our sample application we could have our normal migrations work correctly and get committed but still have an issue in our always run migrations. The other thing to keep in mind is that some database providers don’t support transactions when changing the structure of the database. As pointed out by the DbUp docs, the Postgres docs have a good summary of transactional DDL support across database providers.

Adding Transactions

In the following example, we have changed the upgrader to use a single transaction. This is the full code for the upgrader with the WithTransaction line being the change.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                       f => !f.Contains(".AlwaysRun."))
                 .LogToConsole()
                 .WithTransaction()
                 .Build();

The other options are WithTransactionPerScript for, surprise, a transaction per migration, and WithoutTransaction for no transactions (or just leave it out since this is the default); a sketch of the per-script option follows the sample output below. The following is a sample of the output with transactions on and two different upgraders in play; note the two Beginning transaction lines.

Beginning transaction
Beginning database upgrade
Checking whether journal table exists..
Fetching list of already executed scripts.
No new scripts need to be executed - completing.
Beginning transaction
Beginning database upgrade
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.01-EverytimeNoPrints.sql'
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.02-EverytimeNoPrintsNoCountOn.sql'
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.03-EverytimePrints.sql'
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.04-EverytimePrintNoCountOn.sql'
Upgrade successful
Success!
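
For reference, the per-script option mentioned above is just a different call on the same builder. The following is a sketch of the earlier upgrader with WithTransaction swapped for WithTransactionPerScript; everything else is unchanged.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                       f => !f.Contains(".AlwaysRun."))
                 .LogToConsole()
                 // One transaction per migration instead of a single transaction for the whole run.
                 .WithTransactionPerScript()
                 .Build();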

Wrapping Up

DbUp makes adding transactions to your migrations super simple so this was a really short post. It would be nice to have an option to use a transaction across multiple upgraders. The official DbUp docs on transactions can be found here.

Logging Script Output with DbUp

In today’s post, we will be going over how to log script output during a DbUp run. If you are new to this series of posts or DbUp in general you might find the following post helpful to review.

Database Migrations with DbUp
Code-based Database Migrations with DbUp
Always Run Migrations with DbUp

Enable Script Output Logging

I will be using the always run migrations we covered last week to simplify testing, but the concepts are the same for normal migrations. The change to enable script output logging is a single line added to our upgrader setup, as you can see with the LogScriptOutput line in the following code.

var alwaysRunUpgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(), 
                                                       f => f.Contains(".AlwaysRun."))
                 .JournalTo(new NullJournal())
                 .LogToConsole()
                 .LogScriptOutput()
                 .Build();

Running the application will result in something like the following. Notice the RecordsAffected line, which is the output from our migration.

Beginning database upgrade
Checking whether journal table exists..
Fetching list of already executed scripts.
No new scripts need to be executed - completing.
Beginning database upgrade
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.Everytime.sql'
RecordsAffected: 0

Upgrade successful
Success!

Improve the Script Output

So the above is great for a simple migration that has a single statement, but for more complex migrations it might be helpful to use PRINT statements to provide more information. Let’s tweak our script to print the start and end times of the execution. The following is the sample script (and yes I know it would never actually update anything).

PRINT CONVERT(VarChar, GETDATE(), 121) + ' Start every time';

UPDATE Application.Cities SET CityName = 'Nothing' WHERE 1 = 0;

PRINT CONVERT(VarChar, GETDATE(), 121) + ' End every time';

The above migration results in the following output.

Beginning database upgrade
Checking whether journal table exists..
Fetching list of already executed scripts.
No new scripts need to be executed - completing.
Beginning database upgrade
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.Everytime.sql'
2020-08-05 06:15:21.877 Start every time

2020-08-05 06:15:21.877 End every time

RecordsAffected: 0

Upgrade successful
Success!

The prints are helpful, but the records affected line is noise in this case, especially since it is the records affected for the last statement and prints per SQL statement. To suppress the records affected message we can use SET NOCOUNT ON in our script like the following.

SET NOCOUNT ON;

PRINT CONVERT(VarChar, GETDATE(), 121) + ' Start every time with NoCount ON';

UPDATE Application.Cities SET CityName = 'Nothing' WHERE 1 = 0;

PRINT CONVERT(VarChar, GETDATE(), 121) + ' End every time with NoCount ON';

SET NOCOUNT OFF;

And as you can see in the following we got a cleaner output.

Beginning database upgrade
Checking whether journal table exists..
Fetching list of already executed scripts.
No new scripts need to be executed - completing.
Beginning database upgrade
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.Everytime.sql'
2020-08-05 06:38:18.863 Start every time

2020-08-05 06:38:18.863 End every time

Upgrade successful
Success!

Wrapping Up

Logging the output of scripts can be really helpful at times, but if you are going to use it as we saw above it is important to make sure and plan what that output is going to look like. Having 500 instances of the number of records affected isn’t necessarily helpful.

Always Run Migrations with DbUp

In this post, we are going to walk through how to use DbUp to run specific migrations on every run instead of just once. If you are new to DbUp check out the following posts to see how the sample project got to its current state.

Database Migrations with DbUp
Code-based Database Migrations with DbUp

 

Differentiating for Always Run

One of the keys to getting migrations that should run every time is being able to identify them separately from migrations that should only run once. This can be done at either the filename level or by placing all the always run files in a specific directory. This sample is going with the directory method, but the filename method would work basically the same way (a sketch of that variation follows the filter code below). In the sample project under the existing Scripts directory, I added an AlwaysRun directory. I also added an example script in this directory so we could verify that it is getting executed on every run. In the sample, the file is named Everytime.sql.

Based on the above when DbUp is processing it will see Everytime.sql as DbupTest.Scripts.AlwaysRun.Everytime.sql.

Filtering Out Always Run

Now that we have a way to identify the migrations we want to always run, we need to filter them out of the existing upgrade process so that they don’t get included in the log of already executed migrations. In DbUp this log is referred to as the journal. The extension methods that set up how migrations will be located, either WithScriptsEmbeddedInAssembly or WithScriptsAndCodeEmbeddedInAssembly in our examples, can be provided with a filter which we will use to exclude migrations that are in the AlwaysRun directory. The following code is the upgrader with the filter in place.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                       f => !f.Contains(".AlwaysRun."))
                 .LogToConsole()
                 .Build();

var result = upgrader.PerformUpgrade();
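
If you went with a filename convention instead of an AlwaysRun directory, the same filter approach applies, just keyed off the script name. The following is a sketch assuming always run scripts have Everytime in their filenames, which is a convention made up for illustration rather than something from the sample project.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 // Exclude always run scripts by filename instead of by directory.
                 .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                       f => !f.Contains("Everytime"))
                 .LogToConsole()
                 .Build();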

Executing Always Run Migrations

Now that we have adjusted our original upgrader to exclude migrations in the AlwaysRun directory, we need something to execute the always run migrations. To accomplish this we add a second upgrader that is filtered to just the AlwaysRun scripts. The other really important bit about this second upgrader is that it uses a NullJournal. The NullJournal is what keeps the execution of the scripts from being logged, which results in them always getting run. The following code was inserted after the above code; note the filter and the NullJournal.

if (result.Successful)
{
    var alwaysRunUpgrader =
        DeployChanges.To
                     .SqlDatabase(connectionString)
                     .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                           f => f.Contains(".AlwaysRun."))
                     .JournalTo(new NullJournal())
                     .LogToConsole()
                     .Build();

    result = alwaysRunUpgrader.PerformUpgrade();
}

The following is the full Main function just in case the above code didn’t give enough context on how the code should fit together.

static int Main(string[] args)
{
    var connectionString =
        args.FirstOrDefault()
        ?? "Server=localhost; Database=WideWorldImporters; Trusted_connection=true";

    var upgrader =
        DeployChanges.To
                     .SqlDatabase(connectionString)
                     .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                           f => !f.Contains(".AlwaysRun."))
                     .LogToConsole()
                     .Build();

    var result = upgrader.PerformUpgrade();

    if (result.Successful)
    {
        var alwaysRunUpgrader =
            DeployChanges.To
                         .SqlDatabase(connectionString)
                         .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(), 
                                                               f => f.Contains(".AlwaysRun."))
                         .JournalTo(new NullJournal())
                         .LogToConsole()
                         .Build();

        result = alwaysRunUpgrader.PerformUpgrade();
    }

    if (!result.Successful)
    {
        Console.ForegroundColor = ConsoleColor.Red;
        Console.WriteLine(result.Error);
        Console.ResetColor();
#if DEBUG
        Console.ReadLine();
#endif                
        return -1;
    }

    Console.ForegroundColor = ConsoleColor.Green;
    Console.WriteLine("Success!");
    Console.ResetColor();

    return 0;
}

Wrapping Up

With the above changes we now have a project that can run normal migrations, code-based migrations, and always run migrations. No matter what style of database change management you decide on, having the changes in some sort of source control system will be super helpful.

For more information on DbUp’s journalling check out the official docs.

Code-based Database Migrations with DbUp

In today’s post, we are going to check out the code-based migration feature of DbUp which allows code to run as part of the migration process instead of just SQL based scripts. The ability to run code as part of the migration provides a ton of flexibility in the migration process. This builds on last week’s post, Database Migrations with DbUp, make sure and check it out if you are new to DbUp.

Script Provider Change

In our example application from last week’s post, the upgrader in the Program class was set up with the following.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
                 .LogToConsole()
                 .Build();

The above only picks up embedded SQL script files. In order to also pick up code-based migrations, we have to change from the WithScriptsEmbeddedInAssembly script provider to WithScriptsAndCodeEmbeddedInAssembly. The following is the upgrader with the new script provider in place.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly())
                 .LogToConsole()
                 .Build();

Code-based Migrations

To add a code-based migration, add a new class and set it to implement DbUp’s IScript interface. The script provider we changed to above will pick up any non-abstract class that implements IScript as well as SQL script files. The results are combined and sorted by name to determine execution order, so make sure whatever naming convention you pick will sort properly for both your SQL scripts and your code.

The IScript interface defines a single function, ProvideScript, which takes a single parameter, a factory function for creating database commands, and returns a string. Depending on the need you can return a string with the SQL to be executed, or you can create a command and execute whatever you need directly against the database, or some combination of both (a sketch of the return-a-string style follows the console output below).

To test out code-based migrations, I added the following new class in the Scripts folder. It doesn’t actually change any data but instead logs some information to the console. Even without changing data it still conveys how code-based migrations work.

using System;
using System.Data;
using DbUp.Engine;

namespace DbupTest.Scripts
{
    public class Code0002Test : IScript
    {
        public string ProvideScript(Func<IDbCommand> dbCommandFactory)
        {
            using (var cmd = dbCommandFactory())
            {
                cmd.CommandText = "SELECT DeliveryMethodName FROM Application.DeliveryMethods";

                using (var dr = cmd.ExecuteReader())
                {
                    Console.WriteLine("  Delivery Methods");

                    while (dr.Read())
                    {
                        Console.WriteLine($"    {dr["DeliveryMethodName"]}");
                    }
                }
            }

            return string.Empty;
        }
    }
}

The following is the console output from running the application.

Beginning database upgrade
Checking whether journal table exists..
Fetching list of already executed scripts.
  Delivery Methods
    Air Freight
    Chilled Van
    Courier
    Customer Collect
    Customer Courier to Collect
    Delivery Van
    Post
    Refrigerated Air Freight
    Refrigerated Road Freight
    Road Freight
Executing Database Server script 'DbupTest.Scripts.Code0002Test.cs'
Checking whether journal table exists..
Upgrade successful
Success!
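
As mentioned above, ProvideScript can also just return the SQL to execute instead of running commands itself. The following is a minimal sketch of that style; the class name and the no-op SQL are made up for illustration.

using System;
using System.Data;
using DbUp.Engine;

namespace DbupTest.Scripts
{
    public class Code0003ReturnSql : IScript
    {
        public string ProvideScript(Func<IDbCommand> dbCommandFactory)
        {
            // Whatever SQL is returned here is executed by DbUp as this migration.
            return "UPDATE Application.Cities SET CityName = CityName WHERE 1 = 0;";
        }
    }
}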

Wrapping Up

The ability to run code as part of the database migration process is an awesome set of functionality. I would advise sticking with SQL based migrations as default and only using code when SQL doesn’t provide a way to do what is needed or doing the migration in code is much more understandable. Make sure and check out the official DbUp docs on this feature for more information.

Database Migrations with DbUp

Managing database schemas can be a challenging problem. It is also an area where there is no one size fits all solution (not sure that is ever really true for anything). Solutions range from manual script management to database vendor-specific options (such as DACPAC for Microsoft SQL Server) to migrations. This post will be looking at one of the options for a migration based approach using DbUp.

Sample Database

I needed a sample database to start off with so I dug up one of my posts for Getting a Sample SQL Server Database to guide me through getting Microsoft’s WideWorldImporters sample database downloaded and restored.

DbUp Console Application

There are quite a few ways DbUp can be used; for this example, we are going to be using it from a .NET Core console application. From a terminal use the following command to create a new console application in the directory you want the application in.

dotnet new console

Now we can use the following command to add the DbUp NuGet package to the sample project. In this case, we are using the SQL Server package, but there are packages for quite a few database providers so install the one that is appropriate for you.

dotnet add package dbup-sqlserver

Next, open the project in Visual Studio (or any editor, but part of how this example is set up is easier in Visual Studio). In the Program class replace all the code with the following. We will look at a couple of bits of this code below.

using System;
using System.Linq;
using System.Reflection;
using DbUp;

namespace DbupTest
{
    class Program
    {
        static int Main(string[] args)
        {
            var connectionString =
                args.FirstOrDefault()
                ?? "Server=localhost; Database=WideWorldImporters; Trusted_connection=true";

            var upgrader =
                DeployChanges.To
                             .SqlDatabase(connectionString)
                             .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
                             .LogToConsole()
                             .Build();

            var result = upgrader.PerformUpgrade();

            if (!result.Successful)
            {
                Console.ForegroundColor = ConsoleColor.Red;
                Console.WriteLine(result.Error);
                Console.ResetColor();
#if DEBUG
                Console.ReadLine();
#endif                
                return -1;
            }

            Console.ForegroundColor = ConsoleColor.Green;
            Console.WriteLine("Success!");
            Console.ResetColor();

            return 0;
        }
    }
}

The bit below is trying to pull the connection string for SQL Server out of the first argument passed from the terminal to the application, and if no arguments were passed in, it falls back to a hardcoded value. This works great for our sample, but I would advise against the fallback value for production use. It is always a bad day when you think your migrations have run successfully only to find they ran against the wrong database because of a fallback value.

var connectionString = 
    args.FirstOrDefault() 
    ?? "Server=localhost; Database=WideWorldImporters; Trusted_connection=true";

This next section is where all the setup happens for which database to deploy to, where to find the scripts to run, and where to log. There are a lot of options provided by DbUp in this area and I recommend checking out the docs under the More Info section for the details.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
                 .LogToConsole()
                 .Build();

Finally, the following is where the scripts are actually executed against the database.

var result = upgrader.PerformUpgrade();

The rest of the function is dealing with displaying to the console the results of the scripts running.

Adding Scripts

Keep in mind that we are using the Embedded Scripts option so DbUp is going to find all the embedded files that end with ‘.sql’. I have added a Scripts directory to the project so the scripts don’t clutter the root of the project, but that isn’t a requirement. Do note that how the files are named will control the order in which the scripts get executed, so make sure you establish a good naming convention upfront. For our sample, I added a file named 0001-TestScript.sql to the Scripts directory. Since we are using the embedded scripts provider it is critical that after adding the file we go to its properties and set the Build Action to Embedded resource.

The script file itself doesn’t do anything in this case except select a value.

Trying it out

At this point, we can hit Run in Visual Studio and it will execute our new script. The output will look something like the following.

If we were to run the application again it would tell us that no new scripts need to be executed. DbUp, like all migration based solutions I have encountered, uses a table in the database to keep a record of which migrations have been run. The default table for DbUp is SchemaVersions and it consists of Id, ScriptName, and Applied columns. Given this structure, it is important to keep in mind that if you rename a script that has already been executed on a database, DbUp will see it as a different script and execute it again.
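
If the default journal table doesn’t fit your conventions, the SQL Server package also lets you point the journal at a table of your choosing. The following is a sketch using the JournalToSqlTable extension; the dbo schema and MigrationHistory table name are made up for illustration, so check the DbUp docs for the exact options.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
                 // Record executed scripts in a custom schema/table instead of the default.
                 .JournalToSqlTable("dbo", "MigrationHistory")
                 .LogToConsole()
                 .Build();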

Wrapping Up

In the docs for DbUp there is a philosophy section and one of the points is that a database is a result of a set of transitions, not just its new state vs. its old state. This point is key to why I am a fan of a migration based approach.

Check out the DbUp docs and GitHub repo for more information.

GitHub: Use Actions to Run Multiple Jobs

In this post, we are going to take our sample Workflow that builds two ASP.NET Core web applications and split it so each web application is built individually. This post is using the repo and Workflow built in the following posts if you need to catch up.

GitHub: Import an Azure DevOps Repo
GitHub: Use Actions to build ASP.NET Core Application
GitHub: Use Actions to Publish Artifacts

Starting Point and the Plan

Our Workflow currently contains a single job that just happens to build two ASP.NET Core web applications based on the fact that the .NET CLI picks up and builds both applications. The following is the YAML for our current Workflow.

name: .NET Core

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  build:

    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v2
      
    - name: Setup .NET Core
      uses: actions/setup-dotnet@v1
      with:
        dotnet-version: 3.1.101
 
    - name: Install dependencies
      run: dotnet restore
      
    - name: Build
      run: dotnet build --configuration Release --no-restore
      
    - name: Test
      run: dotnet test --no-restore --verbosity normal

    - name: Publish
      run: dotnet publish 
    
    - name: Upload WebApp1 Build Artifact
      uses: actions/upload-artifact@v2
      with:
        name: WebApp1
        path: /home/runner/work/Playground/Playground/src/WebApp1/bin/Debug/netcoreapp3.1/publish/
        
    - name: Upload WebApp2 Build Artifact
      uses: actions/upload-artifact@v2
      with:
        name: WebApp2
        path: /home/runner/work/Playground/Playground/src/WebApp2/bin/Debug/netcoreapp3.1/publish/

This post is going to take this Workflow and split the build and publish of the two web applications into two jobs. By splitting the Workflow into multiple jobs we open the possibility that the jobs can run in parallel. One reason to do this would be to speed up the total Workflow run time if you have parts of your build that are independent. Another example of why you would need multiple jobs is if the jobs need to run on different operating systems, such as one needing to run on Windows and another on Linux.

Creating the Jobs

The following is the Workflow setup along with the job to build the first web application. The first change is the job ID, from build to build_web_app1, since each job has to have a unique ID. Most of the rest of the changes are related to the .NET CLI commands, which are now directed at a specific project. Do also note that we changed from a hardcoded path to using an expression to get the workspace path, which is the ${{ github.workspace }} bit instead of /home/runner/work/Playground/Playground/. See the expression syntax docs for more info.

name: .NET Core

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  build_web_app1:

    name: Build WebApp1
    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v2
      
    - name: Setup .NET Core
      uses: actions/setup-dotnet@v1
      with:
        dotnet-version: 3.1.101
 
    - name: Install dependencies
      run: dotnet restore ${{ github.workspace }}/src/WebApp1/WebApp1.csproj
      
    - name: Build
      run: dotnet build ${{ github.workspace }}/src/WebApp1/WebApp1.csproj --configuration Release --no-restore
      
    - name: Test
      run: dotnet test ${{ github.workspace }}/src/WebApp1/WebApp1.csproj --no-restore --verbosity normal

    - name: Publish
      run: dotnet publish ${{ github.workspace }}/src/WebApp1/WebApp1.csproj
    
    - name: Upload Build Artifact
      uses: actions/upload-artifact@v2
      with:
        name: WebApp1
        path: ${{ github.workspace }}/src/WebApp1/bin/Debug/netcoreapp3.1/publish/

Here is the full file with both jobs defined. As you can see the second job is basically the same thing as the first one with a different ID, name, and project.

name: .NET Core

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  build_web_app1:

    name: Build WebApp1
    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v2
      
    - name: Setup .NET Core
      uses: actions/setup-dotnet@v1
      with:
        dotnet-version: 3.1.101
 
    - name: Install dependencies
      run: dotnet restore ${{ github.workspace }}/src/WebApp1/WebApp1.csproj
      
    - name: Build
      run: dotnet build ${{ github.workspace }}/src/WebApp1/WebApp1.csproj --configuration Release --no-restore
      
    - name: Test
      run: dotnet test ${{ github.workspace }}/src/WebApp1/WebApp1.csproj --no-restore --verbosity normal

    - name: Publish
      run: dotnet publish ${{ github.workspace }}/src/WebApp1/WebApp1.csproj
    
    - name: Upload Build Artifact
      uses: actions/upload-artifact@v2
      with:
        name: WebApp1
        path: ${{ github.workspace }}/src/WebApp1/bin/Debug/netcoreapp3.1/publish/
        
  build_web_app2:

    name: Build WebApp2
    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v2
      
    - name: Setup .NET Core
      uses: actions/setup-dotnet@v1
      with:
        dotnet-version: 3.1.101
 
    - name: Install dependencies
      run: dotnet restore ${{ github.workspace }}/src/WebApp2/WebApp2.csproj
      
    - name: Build
      run: dotnet build ${{ github.workspace }}/src/WebApp2/WebApp2.csproj --configuration Release --no-restore
      
    - name: Test
      run: dotnet test ${{ github.workspace }}/src/WebApp2/WebApp2.csproj --no-restore --verbosity normal

    - name: Publish
      run: dotnet publish ${{ github.workspace }}/src/WebApp2/WebApp2.csproj
       
    - name: Upload Build Artifact
      uses: actions/upload-artifact@v2
      with:
        name: WebApp2
        path: ${{ github.workspace }}/src/WebApp2/bin/Debug/netcoreapp3.1/publish/

After all the edits are done commit the changes to the repo to run the Workflow. From the results of the Workflow run, you will see that it now has two jobs and we still got the artifacts for both applications as we had before.

Wrapping Up

If your applications need it, breaking them up into different jobs can be helpful not only with Workflow run times but also with your ability to reason about what each part of your build process is doing.
