#Automation: Getting myself #fired or #free

There is a belief that automation will create a crisis, and that people without jobs will rebel against society. The truth is that I would rather lose my job and look for something else than let myself slave away doing a repetitive task just because I don't want to move on.

https://anchor.fm/jorgeescobar/episodes/Automation-Getting-myself-fired-or-free-e3buej/a-ab6f9e

What are the differences between Standard SQL and Transact-SQL?

In this article, we'll explain the syntax differences between standard SQL and Transact-SQL, the dialect dedicated to interacting with SQL Server databases.

#1 Names of Database Objects

In relational database systems, we name tables, views, and columns, but sometimes we need to use the same name as a keyword or use special characters. In standard SQL, you can place this kind of name in quotation marks (""), but in T-SQL, you can also place it in brackets ([]). Look at these examples for the name of a table in T-SQL:

CREATE TABLE dbo.test."first name" ( Id INT, Name VARCHAR(100));
CREATE TABLE dbo.test.[first name]  ( Id INT, Name VARCHAR(100));

Only the first delimiter (the quotation marks) is part of the SQL standard; the brackets are specific to T-SQL.

What Is Different in a SELECT Statement?

#2 Returning Values

The SQL standard does not have a syntax for a query returning values or values coming from expressions without referring to any columns of a table, but MS SQL Server does allow for this type of expression. How? You can use a SELECT statement alone with an expression or with other values not coming from columns of the table. In T-SQL, it looks like the example below:

SELECT 12/6 ;

In this expression, we don’t need a table to evaluate 12 divided by 6; therefore, the FROM clause and the table name can be omitted.
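Note that with integer operands T-SQL performs integer division, which is worth remembering in such column-free queries. A quick sketch (the literal values are just for demonstration):

```sql
SELECT 12/6;    -- 2
SELECT 7/2;     -- 3 (integer division truncates)
SELECT 7.0/2;   -- 3.500000 (decimal division)
```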

#3 Limiting Records in a Result Set

In the SQL standard, you can limit the number of records in the results by using the syntax illustrated below:

SELECT * FROM tab FETCH FIRST 10 ROWS ONLY

T-SQL implements this syntax in a different way. The example below shows the MS SQL Server syntax:

SELECT * FROM tab ORDER BY col1 DESC OFFSET 0 ROWS FETCH FIRST 10 ROWS ONLY;

As you can see, this syntax requires an ORDER BY clause. Another way to select rows, but without ORDER BY, is by using the TOP clause in T-SQL:

SELECT TOP 10 * FROM tab;

#4 Automatically Generating Values

The SQL standard enables you to create columns with automatically generated values. The syntax to do this is shown below:

CREATE TABLE tab (id DECIMAL GENERATED ALWAYS AS IDENTITY);

In T-SQL we can also automatically generate values, but in this way:

CREATE TABLE tab (id INTEGER IDENTITY);
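T-SQL's IDENTITY also accepts an explicit seed and increment; a small sketch (the table name tab2 is illustrative):

```sql
-- Numbering starts at 100 and grows by 10 per row
CREATE TABLE tab2 (id INTEGER IDENTITY(100, 10));
```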

#5 Math Functions

Several common mathematical functions are part of the SQL standard. One of these is CEIL(x), which we don’t find in T-SQL; its T-SQL counterpart is CEILING(x). T-SQL also provides its own variants of other functions: SIGN(x); ROUND(x[, d]) to round the decimal value x to d decimal positions; LOG(x), which returns the natural logarithm of x; and RAND() to generate random numbers. There is no TRUNC function in T-SQL; truncating to a given number of decimal places is done with ROUND’s optional third argument, as in ROUND(x, d, 1). The highest or lowest number in a list is returned by the GREATEST(list) and LEAST(list) functions, which SQL Server supports starting with SQL Server 2022.

T-SQL’s ROUND requires the length argument:

SELECT ROUND(col, 0) FROM tab;
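For instance, rounding versus truncating with ROUND's optional third argument (literal values chosen only for illustration):

```sql
SELECT ROUND(123.4567, 2);     -- 123.4600 (rounded)
SELECT ROUND(123.4567, 2, 1);  -- 123.4500 (truncated)
```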

#6 Aggregate Functions

We find another syntax difference with the aggregate functions. The functions COUNT, SUM, and AVG all take a column or an expression as their argument. T-SQL allows the use of DISTINCT before this argument, so that only distinct values are aggregated; with COUNT, rows are counted only when their values differ from the values of other rows.

Standard SQL:
SELECT COUNT(col) FROM tab;

T-SQL:
SELECT COUNT(col) FROM tab;

SELECT COUNT(DISTINCT col) FROM tab;

However, in T-SQL we don’t find a population covariance function; COVAR_POP(x, y) is defined in the SQL standard but is not implemented in T-SQL.

#7 Retrieving Parts of Dates and Times

Most relational database systems deliver many functions to operate on dates and times.

In standard SQL, the EXTRACT(YEAR FROM x) function and similar functions to select parts of dates are different from the T-SQL functions like YEAR(x) or DATEPART(year, x).

There is also a difference in getting the current date and time. Standard SQL allows you to get the current date with the CURRENT_DATE function, but MS SQL Server has no function of that name, so we use the GETDATE function as an argument of the CAST function to convert the result to the DATE data type.
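A minimal sketch of getting the current date in T-SQL this way:

```sql
-- GETDATE() returns a DATETIME; CAST converts it to DATE
SELECT CAST(GETDATE() AS DATE);
```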

#8 Operating on Strings

Using functions to operate on strings is also different between the SQL standard and T-SQL. The main difference is in removing trailing and leading spaces from a string. In standard SQL, there is the TRIM function; in T-SQL, there are several related functions: TRIM (removing both leading and trailing spaces, available from SQL Server 2017), LTRIM (removing leading spaces), and RTRIM (removing trailing spaces).

Another very-often-used string function is SUBSTRING.

The standard SQL syntax for the SUBSTRING function looks like:

SUBSTRING(str FROM start [FOR len])

but in T-SQL, the syntax of this function looks like:

SUBSTRING(str, start, length)
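For example, extracting the first eight characters of a string literal (the literal is illustrative):

```sql
-- Standard SQL
SELECT SUBSTRING('Transact-SQL' FROM 1 FOR 8);
-- T-SQL
SELECT SUBSTRING('Transact-SQL', 1, 8);  -- 'Transact'
```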

There are sometimes reasons to concatenate values coming from other columns and/or additional strings. Standard SQL enables the following syntax to do this:

SELECT col1 || col2 FROM tab;

As you can see, this syntax makes use of the || operator to add one string to another.

But the equivalent operator in T-SQL is the plus sign character. Look at this example:

SELECT col1 + col2  FROM tab;

In SQL Server, we can also use the CONCAT function, which concatenates a list of strings:

SELECT CONCAT(col1, str1, col2, ...)  FROM tab;

We can also repeat one character several times. Standard SQL defines the function REPEAT(str, n) to do this. Transact-SQL provides the REPLICATE function. For example:

SELECT  REPLICATE(str, x);

where x indicates how many times to repeat the string or character.
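For example (the values are illustrative):

```sql
SELECT REPLICATE('ab', 3);  -- 'ababab'
```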

#9 Inequality Operator

During filtering records in a SELECT statement, sometimes we have to use an inequality operator. Standard SQL defines <> as this operator, while T-SQL allows for both the standard operator and the != operator:

SELECT col3 FROM tab WHERE col1 != col2;

#10 ISNULL Function

In T-SQL, we can replace NULL values coming from a column using the ISNULL function, which takes the column and a replacement value. ISNULL is specific to T-SQL and is not in the SQL standard, which defines the similar COALESCE function instead.

SELECT ISNULL(col1, 0) FROM tab;

Which Parts of DML Syntax Are Different?

In T-SQL, the basic syntax of DELETE, UPDATE, and INSERT queries is the same as the SQL standard, but differences appear in more advanced queries. Let’s look at them.

#11 OUTPUT Keyword

The OUTPUT keyword occurs in DELETE, UPDATE, and INSERT statements. It is not defined in standard SQL.

Using T-SQL, we can have a query return extra information: the old and new values in an UPDATE, or the values added by an INSERT or removed by a DELETE. To see this information, we reference the Inserted and Deleted pseudo-tables in the OUTPUT clause.

UPDATE tab SET col='new value'
OUTPUT Deleted.col, Inserted.col;

We see the result of changing records with the previous and new values in an updated column. The SQL standard does not support this feature.

#12 Syntax for INSERT INTO ... SELECT

Another structure of an INSERT query is INSERT INTO … SELECT. T-SQL allows you to insert data from another table into a destination table. Look at this query:

INSERT INTO tab SELECT col1,col2,... FROM tab_source;

The basic INSERT INTO … SELECT form is part of the SQL standard as well; SQL Server builds on it with its own extensions, such as the OUTPUT clause described above.

#13 FROM Clause in DELETE and UPDATE

SQL Server provides extended syntax for UPDATE and DELETE with FROM clauses. You can use DELETE with FROM to use the rows of one table to remove corresponding rows in another table by referring to a primary key and a foreign key. Similarly, you can use UPDATE with FROM to update rows in one table by referring to the rows of another table via common values (a primary key in one table and a foreign key in the other). Here is an example:

DELETE FROM Book
FROM Author
WHERE Author.Id=Book.AuthorId AND Author.Name IS NULL;

UPDATE Book
SET Book.Price=Book.Price*0.2
FROM Author
WHERE Book.AuthorId=Author.Id AND Author.Id=12;

The SQL standard doesn’t provide this syntax.

#14 INSERT, UPDATE, and DELETE With JOIN

You can also use INSERT, UPDATE, and DELETE using JOIN to connect to another table. An example of this is:

DELETE ItemOrder FROM ItemOrder
JOIN Item ON ItemOrder.ItemId=Item.Id
WHERE YEAR(Item.DeliveredDate) <= 2017;

This feature is not in the SQL standard.

Summary

This article does not cover all of the syntax differences between the SQL standard and T-SQL as implemented by MS SQL Server. However, it points out some basic features characteristic only of Transact-SQL and some SQL standard syntax that MS SQL Server doesn’t implement.

Thanks for reading. If you liked this post, share it with all of your programming buddies!

Originally published on https://dzone.com


Build a CRUD App with ASP.NET Core 2.2 and SQL Server

I’ve always said that you can tell a lot about a person by the kind of music they listen to. Don’t tell me you haven’t had serious doubts about whether you can be friends with someone when you find out that they like a particular band or artist. In that spirit, I created JudgeMyTaste, an ASP.NET Core web application where people can enter their favorite band or artist so that people on the Internet can judge them openly.

The combination of ASP.NET and SQL Server is probably the most common pairing in the enterprises that use ASP.NET. With ASP.NET Core and SQL Server both being cross-platform, you don’t have to run this combination on Windows anymore! I’ll show you how to create a basic CRUD application using ASP.NET Core 2.2 and SQL Server 2017. I’ll be running on Linux, but with the free tools used here, it won’t matter what operating system you’re using!

The tools I’ll be using that are available for all platforms are:

  • SQL Server 2017 (I’ll be running on Ubuntu 18.04)
  • Visual Studio Code
  • Azure Data Studio
  • ASP.NET Core 2.2

Once you’ve got all the tools installed for your platform, let’s rock and roll!

Scaffold Your ASP.NET Core 2.2 Application

No matter the platform you’re on, the dotnet CLI is available. The commands used here should be the same for everyone. To scaffold the ASP.NET Core 2.2 MVC application, create a new folder for it:

mkdir JudgeMyTaste

Change into that new directory:

cd JudgeMyTaste

Then run the following command:

dotnet new mvc

Then open the new application in VS Code.

code .

When you open the new application in VS Code, you should get a warning in the bottom right corner asking to add some missing assets. Go ahead and add the missing assets. You’ll see the .vscode folder added with a launch.json and a tasks.json file.

These will allow you to run the application from VS Code. To verify that everything scaffolded properly, run the base application by pressing F5. This will build the application, run it, and open it in a new browser window.

If you’ve never run an ASP.NET Core 2.x application before, you may see a strange error page come up. By default, ASP.NET Core wants to run on HTTPS, which is a recommended practice for web applications. You could avoid this message by removing the HTTPS redirect in your Startup.cs or by generating a certificate for your local machine, but this error screen only comes up once in a great while, so I just sidestep it by clicking Advanced and telling the browser that it’s okay to visit this site even though there is no certificate for it.

For your daily work, it will probably behoove you to create a local certificate for development so that you never have to see this message again.
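Assuming you have the .NET Core SDK installed, its built-in dev-certs tool can generate and trust such a certificate:

```shell
dotnet dev-certs https --trust
```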

Create Your SQL Server Database

Open Azure Data Studio and connect to your localhost server with the SA password you created when installing SQL Server on your machine. You’ll notice it is arranged very much like VS Code.

In the Connections Explorer, you will see localhost as a connection. Right-click on the connection and choose New Query, which will open a new query window on the right side.

Start typing the word CREATE and an IntelliSense dropdown will open; one of the choices will be sqlCreateDatabase. Choose that option and a query will be scaffolded with the database name highlighted in the three places that it occurs in the query. Start typing the database name “JudgeMyTaste” and it will be replaced in all three places, so that the final query looks like this.

-- Create a new database called 'JudgeMyTaste'
-- Connect to the 'master' database to run this snippet
USE master
GO
-- Create the new database if it does not exist already
IF NOT EXISTS (
  SELECT [name]
    FROM sys.databases
    WHERE [name] = N'JudgeMyTaste'
)
CREATE DATABASE JudgeMyTaste
GO

Now you can just click the green Run arrow at the top of the window to create the database. Simple, no?

Now when you expand the Databases folder in the Connection Explorer, you will see the JudgeMyTaste database in the list. Right-click on the new database and choose New Query again. Start typing CREATE again and this time choose sqlCreateTable from the options presented. Again, you can start typing the table name FavoriteBands and it will be filled in all the places it occurs in the query.

You’ll also need to add some other columns to the table. Add the columns for Id, Name, EnteredBy, and EnteredOn so that the query looks like this:

-- Create a new table called '[FavoriteBands]' in schema '[dbo]'
-- Drop the table if it already exists
IF OBJECT_ID('[dbo].[FavoriteBands]', 'U') IS NOT NULL
DROP TABLE [dbo].[FavoriteBands]
GO
-- Create the table in the specified schema
CREATE TABLE [dbo].[FavoriteBands](
  [Id] [int] IDENTITY(1,1) NOT NULL,
  [Name] [varchar](255) NULL,
  [EnteredBy] [varchar](255) NULL,
  [EnteredOn] [date] NULL
);
GO

Then run the query by clicking the green Run arrow as before.

It’s good practice to create a user specifically for your application to connect with, one that has only the permissions it needs to interact with your database. Here’s a script to create a login and a user for the database and assign that user db_owner permissions on the database.

USE master
GO

CREATE LOGIN webapp WITH PASSWORD=N'[email protected]!', DEFAULT_DATABASE=JudgeMyTaste
GO

ALTER LOGIN webapp ENABLE
GO

USE JudgeMyTaste
GO

CREATE USER webapp FOR LOGIN webapp
EXEC sp_addrolemember 'db_owner', 'webapp'
GO

It might seem like a lot is going on here, but it simply creates a login for SQL Server, makes that login a user for the JudgeMyTaste database, and adds it to the db_owner role for that database. This will allow the login to do all the CRUD operations that the application will need. Now your database is ready to be used by your application!

Connect SQL Server to Your ASP.NET Core 2.2 MVC Application

Before anything else, you’ll need the Entity Framework Core NuGet package. To install it, run the following command in the terminal.

dotnet add package Microsoft.EntityFrameworkCore.SqlServer --version 2.2.4

Start by adding the connection string to your appsettings.json file in the root of your MVC project, so that it looks like this:

{
  "Logging": {
    "LogLevel": {
      "Default": "Warning"
    }
  },
  "AllowedHosts": "*",
  "ConnectionStrings": {
    "JudgeMyTasteDatabase": "Server=.;Database=JudgeMyTaste;user id=webapp;[email protected]!"
  }
}

In the Models folder, create a class file called FavoriteBand.cs.

using System;
using System.ComponentModel.DataAnnotations;

namespace JudgeMyTaste.Models
{
  public class FavoriteBand
  {
    public int Id { get; set; }
    public string Name { get; set; }
    public string EnteredBy { get; set; }
    public DateTime EnteredOn { get; set; }
  }
}

This class will be used to work with the FavoriteBand entries.

In the root of the project, create a folder called Data to house the database context for the application. Create a C# file called JudgeMyTasteContext.cs with the following contents:

using JudgeMyTaste.Models;
using Microsoft.EntityFrameworkCore;

namespace JudgeMyTaste.Data
{
  public class JudgeMyTasteContext : DbContext
  {
    public JudgeMyTasteContext(DbContextOptions<JudgeMyTasteContext> options) : base(options)
    {
    }

    public DbSet<FavoriteBand> FavoriteBands { get; set; }
  }
}

In your Startup.cs file, in the ConfigureServices() method, right before the services.AddMvc()... line, add the newly created context with the connection string.

services.AddDbContext<JudgeMyTasteContext>(options => options.UseSqlServer(Configuration.GetConnectionString("JudgeMyTasteDatabase")));

Now your database is all hooked into your application. All you need to do is create some way for the user to enter their favorite bands. To get some more scaffolding goodness for the CLI, install the Code Generation tool.

dotnet add package Microsoft.VisualStudio.Web.CodeGeneration.Design

Now you can scaffold a controller to handle all the CRUD operations for the FavoriteBand class by running the following command from the terminal.

dotnet aspnet-codegenerator controller -name FavoriteBandsController -async -m JudgeMyTaste.Models.FavoriteBand -dc JudgeMyTaste.Data.JudgeMyTasteContext -namespace Controllers -outDir Controllers -udl

This is a long one but if you break it down into its component pieces, it’s easier to understand.

The first part just calls the dotnet CLI’s new aspnet-codegenerator command for a controller. You want the controller’s name to be “FavoriteBandsController” and for the controller actions to all be -async. The model being used to generate the controller is the JudgeMyTaste.Models.FavoriteBand class, and the database context will be the JudgeMyTaste.Data.JudgeMyTasteContext class you just created. The namespace and output directory for the controller will be Controllers and the -udl switch tells the generator to use the default layout for the views it will generate (yeah, it’s going to generate views for everything too!). Pretty cool, right?

Once you run the command, you should see the controller and all its views show up. The only thing left is to create a link so that users can easily get to the favorite bands section of the site.

In the Views/Shared folder, open the _Layout.cshtml file and add a link to the menu to get to the new section of the site.

<li class="nav-item">
  <a class="nav-link text-dark" asp-area="" asp-controller="FavoriteBands" asp-action="Index">Favorite Bands</a>
</li>

Now when you run the application, you can click on the Favorite Bands menu item and see a list of all the favorite bands that have been entered. Of course there aren’t any right now, so add one using the Create New link at the top of the page and see it show up in the listing.

It’s a little cumbersome to add the EnteredOn value manually, and the code generator you used can’t know that this field can simply be filled in as the entry is saved, so change the Create() method of the FavoriteBandsController to add it automatically.

// POST: FavoriteBands/Create
// To protect from overposting attacks, please enable the specific properties you want to bind to, for
// more details see http://go.microsoft.com/fwlink/?LinkId=317598.
[HttpPost]
[ValidateAntiForgeryToken]
public async Task<IActionResult> Create([Bind("Id,Name,EnteredBy")] FavoriteBand favoriteBand)
{
  if (ModelState.IsValid)
  {
    favoriteBand.EnteredOn = DateTime.Now;
    _context.Add(favoriteBand);
    await _context.SaveChangesAsync();
    return RedirectToAction(nameof(Index));
  }
  return View(favoriteBand);
}

The only things that have changed are that I removed the EnteredOn field from the Bind statement in the method signature and added DateTime.Now as the value right before saving to the database.

Add Authentication to Your ASP.NET Core 2.2 MVC + SQL Server Application

What you have now is okay, but there’s currently no way to keep users from editing other users’ entries. We want to make sure to judge people for the favorite band they actually entered, right?

There’s no reason to write this yourself. You can easily integrate Okta to handle the authentication for you:

Sign up for a forever-free developer account (or log in if you already have one).

Once you have signed up and logged in, you’ll be taken to your dashboard. Make note of your Org URL in the top right corner.

Click on the Applications menu item at the top, click Add Application, and from the first page of the wizard choose Web and click Next.

On the next screen, change the application name to “Judge My Taste App” and update the Base URIs value and the Login Redirect URIs to reflect to the correct port and the fact that you’re running on the HTTPS scheme.

Then click Done and you’re taken to the application page. On the General Settings tab, click Edit and add a URL to the Logout Redirect URIs with a value of https://localhost:5001/signout/callback. This is where Okta will redirect back to after the logout call; it is handled by the ASP.NET OIDC middleware.

Configure Your ASP.NET Core 2.2 MVC Application for Authentication

Now you need to tell your application how to use Okta for authentication. The easiest way is to use the ASP.NET SDK from Okta. You can install it from NuGet using the following command:

dotnet add package Okta.AspNetCore --version 1.1.5

Add some configuration values to your appsettings.json file so that the final file looks like this:

{
  "Logging": {
    "LogLevel": {
      "Default": "Warning"
    }
  },
  "AllowedHosts": "*",
  "ConnectionStrings": {
    "JudgeMyTasteDatabase": "Server=.;Database=JudgeMyTaste;user id=webapp;[email protected]!"
  },
  "Okta": {
    "ClientId": "{yourClientId}",
    "ClientSecret": "{yourClientSecret}",
    "OktaDomain": "https://{yourOktaDomain}",
    "PostLogoutRedirectUri": "https://localhost:5001/"
  }
}

This PostLogoutRedirectUri is the URL that the middleware will redirect to once Okta has redirected back to the signout/callback URL. You can use any valid URL in the MVC application. Here, I am just redirecting to the root of the application.

Back in the Startup.cs file, add the following using statements:

using Okta.AspNetCore;
using Microsoft.AspNetCore.Authentication.Cookies;

Then at the very beginning of the ConfigureServices() method add:

var oktaMvcOptions = new OktaMvcOptions();
Configuration.GetSection("Okta").Bind(oktaMvcOptions);
oktaMvcOptions.Scope = new List<string> { "openid", "profile", "email" };
oktaMvcOptions.GetClaimsFromUserInfoEndpoint = true;

services.AddAuthentication(options =>
{
  options.DefaultAuthenticateScheme = CookieAuthenticationDefaults.AuthenticationScheme;
  options.DefaultSignInScheme = CookieAuthenticationDefaults.AuthenticationScheme;
  options.DefaultChallengeScheme = OktaDefaults.MvcAuthenticationScheme;
})
.AddCookie()
.AddOktaMvc(oktaMvcOptions);

This is a pretty dense chunk of code, but most of it is boilerplate for the OIDC middleware that the Okta SDK is built on. The first part just binds all of those configuration values you just added in appsettings.json to the oktaMvcOptions. It also adds the scopes you want to receive (which are the OpenID information, the user’s profile, and the user’s email address). It also tells the middleware that it can get the claims from the user info endpoint, which all OIDC identity providers have.

When the code adds authentication, it tells the OIDC provider to use cookies for storing tokens and that you’ll be sending users to Okta from an MVC application.

To actually wire up authentication, you need to tell the Configure() method to use this service you just configured. Right before the app.UseMvc(...) line, add:

app.UseAuthentication();

Okta is now configured in your application! You still need to set up your application to challenge the user (send them to Okta to authenticate).

Create a new controller in the Controllers folder called AccountController with the following code:

using Microsoft.AspNetCore.Authentication.Cookies;
using Microsoft.AspNetCore.Mvc;
using Okta.AspNetCore;

namespace JudgeMyTaste.Controllers
{
  public class AccountController : Controller
  {
    public IActionResult Login()
    {
      if (!HttpContext.User.Identity.IsAuthenticated)
      {
        return Challenge(OktaDefaults.MvcAuthenticationScheme);
      }
      return RedirectToAction("Index", "Home");
    }

    public IActionResult Logout()
    {
      return new SignOutResult(new[]
      {
        OktaDefaults.MvcAuthenticationScheme,
        CookieAuthenticationDefaults.AuthenticationScheme
      });
    }
  }
}

This will give you a Login() and Logout() method to wire up some menu items. Speaking of which, add a new view in Views/Shared called _LoginPartial.cshtml. This will house all the code for the login menu items.

@if (User.Identity.IsAuthenticated)
{
  <ul class="navbar-nav ml-auto">
    <li>
      <span class="navbar-text">Hello, @User.Identity.Name</span> &nbsp;
      <a onclick="document.getElementById('logout_form').submit();" style="cursor: pointer;">Log out</a>
    </li>
  </ul>
  <form asp-controller="Account" asp-action="Logout" method="post" id="logout_form"></form>
}
else
{
  <ul class="navbar-nav">
    <li><a asp-controller="Account" asp-action="Login">Log in</a></li>
  </ul>
}

Change the main menu in Views/Shared/_Layout.cshtml to add this in and move the main menu to the left and have the login menu on the far right. The final div that houses the menu should look like this:

<div class="navbar-collapse collapse justify-content-between">
  <ul class="navbar-nav mr-auto">
    <li class="nav-item">
      <a class="nav-link text-dark" asp-area="" asp-controller="Home" asp-action="Index">Home</a>
    </li>
    <li class="nav-item">
      <a class="nav-link text-dark" asp-area="" asp-controller="Home" asp-action="Privacy">Privacy</a>
    </li>
    <li class="nav-item">
      <a class="nav-link text-dark" asp-area="" asp-controller="FavoriteBands" asp-action="Index">Favorite Bands</a>
    </li>
  </ul>
  <partial name="_LoginPartial" />
</div>

The class list for the navbar-collapse has changed to add the justify-content-between class, which keeps the menus apart. The ul’s class also changed to mr-auto, which keeps it to the left. Lastly, the login partial is added at the end of the menu.

Don’t just sit there, fire this thing up and judge me for liking Nickelback!

Now you have a complete CRUD slice built in an ASP.NET Core 2.2 MVC application, saving data to a SQL Server database! From here, you can take the same path to add things like favorite movie, favorite food, and favorite beverage so that you can easily and completely judge people for their taste online!

Learn How to Use SQL Server With Node.js

In this article, we discuss how to use SQL Server with Node.js. We walk you through every part of the process, starting from installation and ending with a demo application.

I have a passion for relational databases, particularly SQL Server. Throughout my career, I’ve been drawn to various aspects of databases, such as design, deployments, migrations, and carefully crafting stored procedures, triggers, and views.

I recently started building Node.js apps with SQL Server. Today, I’m going to show you how to do it in this step-by-step tutorial by creating a simple calendar application.

Set Up Your Node.js Development Environment

Before you get started, you’ll need a couple of things:

If you don’t already have an instance of SQL Server you can connect to, you can install one locally for development and testing.

Install SQL Server on Windows

Download and install SQL Server Developer Edition.

Install SQL Server on Mac or Linux
  1. Install Docker
  2. Run the following in a terminal. This will download the latest version of SQL Server 2017 for Linux and create a new container named sqlserver.

docker pull microsoft/mssql-server-linux:2017-latest
docker run -d --name sqlserver -e 'ACCEPT_EULA=Y' -e '[email protected]' -e 'MSSQL_PID=Developer' -p 1433:1433 microsoft/mssql-server-linux:2017-latest
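To verify that the container is running, and to open a SQL prompt inside it, something like the following should work (replace the password placeholder with the SA_PASSWORD value you chose above):

```shell
docker ps --filter name=sqlserver
docker exec -it sqlserver /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P '<your SA password>'
```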

Set Up the SQL Database

You will need a SQL database for this tutorial. If you are running SQL Server locally and don’t already have a database, you can create one with the following script.

Note: If you have Visual Studio Code, you can use the excellent mssql extension to run SQL scripts. Or, you can use an application like Azure Data Studio.

USE master;
GO

CREATE DATABASE calendar; -- change this to whatever database name you desire
GO

Next, create a new table named events. This is the table you will use to store calendar events.

-- Dropping events table...
DROP TABLE IF EXISTS events;

-- Create events table...
CREATE TABLE events (
   id int IDENTITY(1, 1) PRIMARY KEY CLUSTERED NOT NULL
   , userId nvarchar(50) NOT NULL
   , title nvarchar(200) NOT NULL
   , description nvarchar(1000) NULL
   , startDate date NOT NULL
   , startTime time(0) NULL
   , endDate date NULL
   , endTime time(0) NULL
   , INDEX idx_events_userId ( userId )
);

Create a Node.js Web Application

With Node.js you can choose from many different frameworks for creating web applications. In this tutorial, you will use hapi, my personal favorite. Originally created by Walmart engineers, it is suitable for building APIs, services, and complete web applications.

Open up a command prompt (Windows) or a terminal (Mac or Linux), and change the current directory to where you want to create your project. Create a folder for your project, and change to the new folder.

mkdir node-sql-tutorial
cd node-sql-tutorial

A package.json file is required for Node.js projects and includes things like project information, scripts, and dependencies. Use the npm command to create a package.json file in the project folder.

npm init -y

Next, install hapi as a dependency.

npm install hapi

Now, open the project in your editor of choice.

If you don’t already have a favorite code editor, I recommend installing Visual Studio Code. VS Code has exceptional support for JavaScript and Node.js, such as smart code completion and debugging. There’s also a vast library of free extensions contributed by the community.

Node.js Project Structure

Most “hello world” examples of Node.js applications start with everything in a single JavaScript file. However, it’s essential to set up a good project structure to support your application as it grows.

There are countless opinions on how you might organize a Node.js project. In this tutorial, the final project structure will be similar to the following.

├── package.json
├── client
├── src
│   ├── data
│   ├── plugins
│   ├── routes
│   └── views
└── test

Create a Basic Server with Routes

Create a folder named src. In this folder, add a new file named index.js. Open the file and add the following JavaScript.

"use strict";
const server = require( "./server" );
const startServer = async () => {
   try {
       // todo: move configuration to separate config
       const config = {
           host: "localhost",
           port: 8080
       };
       // create an instance of the server application
       const app = await server( config );
       // start the web server
       await app.start();
       console.log( `Server running at http://${ config.host }:${ config.port }...` );
   } catch ( err ) {
       console.log( "startup error:", err );
   }
};
startServer();

Create a new file under src named server.js. Open the file and add the following code.

"use strict";
const Hapi = require( "hapi" );
const routes = require( "./routes" );
const app = async config => {
   const { host, port } = config;
   // create an instance of hapi
   const server = Hapi.server( { host, port } );
   // store the config for later use
   server.app.config = config;
   // register routes
   await routes.register( server );
   return server;
};
module.exports = app;

Separating server configuration from application startup will make testing the application easier.
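To see why, consider this simplified stand-in for the factory pattern used in src/server.js (illustration only; the real module returns a hapi server, not this hypothetical object):

```javascript
"use strict";

// A simplified stand-in for the src/server.js factory above (illustration
// only): it builds an "app" from a config object without starting it.
const app = config => ( {
    config,
    started: false,
    start() { this.started = true; }
} );

// Because creation and startup are separate steps, a test can inspect a
// fully configured instance without ever binding to a real port.
const instance = app( { host: "localhost", port: 0 } );
```

The real module follows the same shape: server( config ) returns an unstarted instance, and only app.start() opens the port, so tests can create as many configured instances as they need.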

Next, create a folder under src named routes. In this folder, add a new file named index.js. Open the file and add the following code.

"use strict";
module.exports.register = async server => {
   server.route( {
       method: "GET",
       path: "/",
       handler: async ( request, h ) => {
           return "My first hapi server!";
       }
   } );
};

Finally, edit the package.json file and change the "main" property value to "src/index.js". This property instructs Node.js on which file to execute when the application starts.

 "main": "src/index.js"

Now, you can start the application. Go back to your command/terminal window and type in the following command.

node .

You should see the message Server running at http://localhost:8080.... Open your browser and navigate to http://localhost:8080. Your browser should display something like the following.

Success!

Note: To stop the Node.js application, go to the command/terminal window and press CTRL+C.

Manage Your Node.js Application Configuration

Before we get into writing code to interact with SQL Server, we need a good way to manage our application’s configuration, such as our SQL Server connection information.

Node.js applications typically use environment variables for configuration. However, managing environment variables can be a pain. dotenv is a popular Node.js package that exposes a .env configuration file to Node.js, as if it were all set using environment variables.
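Conceptually, dotenv just parses KEY=VALUE lines and copies them onto process.env. A simplified sketch of the idea (not the actual dotenv implementation, which also handles quoting and other edge cases):

```javascript
"use strict";

// Simplified illustration of what dotenv.config() does: parse KEY=VALUE
// lines and copy them onto process.env.
const parseEnv = source => {
    const result = {};
    for ( const line of source.split( /\r?\n/ ) ) {
        const trimmed = line.trim();
        // skip blank lines and comments
        if ( !trimmed || trimmed.startsWith( "#" ) ) continue;
        const eq = trimmed.indexOf( "=" );
        if ( eq === -1 ) continue;
        result[ trimmed.slice( 0, eq ).trim() ] = trimmed.slice( eq + 1 ).trim();
    }
    return result;
};

const parsed = parseEnv( "# comment\nPORT=8080\nHOST=localhost" );

// Like dotenv, never overwrite variables already set in the environment.
for ( const [ key, value ] of Object.entries( parsed ) ) {
    if ( process.env[ key ] === undefined ) process.env[ key ] = value;
}
```

Because real environment variables win over the .env file, the same code runs unchanged in development (values from .env) and production (values from the host environment).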

First, install dotenv as a project dependency.

npm install dotenv

Create a file named .env in the root folder of the project, and add the following configuration.

# Set NODE_ENV=production when deploying to production
NODE_ENV=development

# hapi server configuration
PORT=8080
HOST=localhost
HOST_URL=http://localhost:8080
COOKIE_ENCRYPT_PWD=superAwesomePasswordStringThatIsAtLeast32CharactersLong!

# SQL Server connection
SQL_USER=dbuser
SQL_PASSWORD=P@ssw0rd
SQL_DATABASE=calendar
SQL_SERVER=servername
# Set SQL_ENCRYPT=true if using Azure
SQL_ENCRYPT=false

# Okta configuration
OKTA_ORG_URL=https://{yourOktaDomain}
OKTA_CLIENT_ID={yourClientId}
OKTA_CLIENT_SECRET={yourClientSecret}

Update the SQL Server configuration with your database configuration information. We will cover some of the other settings later.

Note: When using a source control system such as git, do not add the .env file to source control. Each environment requires a custom .env file and may contain secrets that should not be stored in a repository. It is recommended that you document the values expected in the project README and in a separate .env.sample file.

Next, create a file under src named config.js, and add the following code.

"use strict";
const assert = require( "assert" );
const dotenv = require( "dotenv" );
// read in the .env file
dotenv.config();
// capture the environment variables the application needs
const { PORT,
   HOST,
   HOST_URL,
   COOKIE_ENCRYPT_PWD,
   SQL_SERVER,
   SQL_DATABASE,
   SQL_USER,
   SQL_PASSWORD,
   OKTA_ORG_URL,
   OKTA_CLIENT_ID,
   OKTA_CLIENT_SECRET
} = process.env;
const sqlEncrypt = process.env.SQL_ENCRYPT === "true";
// validate the required configuration information
assert( PORT, "PORT configuration is required." );
assert( HOST, "HOST configuration is required." );
assert( HOST_URL, "HOST_URL configuration is required." );
assert( COOKIE_ENCRYPT_PWD, "COOKIE_ENCRYPT_PWD configuration is required." );
assert( SQL_SERVER, "SQL_SERVER configuration is required." );
assert( SQL_DATABASE, "SQL_DATABASE configuration is required." );
assert( SQL_USER, "SQL_USER configuration is required." );
assert( SQL_PASSWORD, "SQL_PASSWORD configuration is required." );
assert( OKTA_ORG_URL, "OKTA_ORG_URL configuration is required." );
assert( OKTA_CLIENT_ID, "OKTA_CLIENT_ID configuration is required." );
assert( OKTA_CLIENT_SECRET, "OKTA_CLIENT_SECRET configuration is required." );
// export the configuration information
module.exports = {
   port: PORT,
   host: HOST,
   url: HOST_URL,
   cookiePwd: COOKIE_ENCRYPT_PWD,
   sql: {
       server: SQL_SERVER,
       database: SQL_DATABASE,
       user: SQL_USER,
       password: SQL_PASSWORD,
       options: {
           encrypt: sqlEncrypt
       }
   },
   okta: {
       url: OKTA_ORG_URL,
       clientId: OKTA_CLIENT_ID,
       clientSecret: OKTA_CLIENT_SECRET
   }
};

Update src/index.js to use the new config module you just created.

"use strict";
const config = require( "./config" );
const server = require( "./server" );
const startServer = async () => {
   try {
       // create an instance of the server application
       const app = await server( config );
       // start the web server
       await app.start();
       console.log( `Server running at http://${ config.host }:${ config.port }...` );
   } catch ( err ) {
       console.log( "startup error:", err );
   }
};
startServer();

Create a Node.js API With SQL Server

Now we can get to the fun part! In this step, you are going to add a route to hapi to query the database for a list of events and return them as JSON. You are going to create a SQL Server client plugin for hapi and organize the data access layer in a way that will make it easy to add new APIs in the future.

First, you need to install a few dependencies, the most important being the mssql package (https://www.npmjs.com/package/mssql).

npm install mssql fs-extra

Create the SQL Data Access Layer

Using SQL Server with Node.js and the mssql package usually follows these steps:

  1. Create an instance of the mssql package.
  2. Create a SQL connection with connect().
  3. Use the connection to create a new SQL request.
  4. Set any input parameters on the request.
  5. Execute the request.
  6. Process the results (e.g. recordset) returned by the request.

Creating connections to SQL Server is a relatively expensive operation. There is also a practical limit to the number of connections that can be established. By default, the mssql package’s .connect() function creates and returns a connection “pool” object. A connection pool increases the performance and scalability of an application.

When a query request is created, the SQL client uses the next available connection in the pool. After the query is executed, the connection is returned to the pool.
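The lazy pool-caching behavior can be sketched on its own. Here connect() is a stub standing in for mssql's sql.connect (this is an illustration of the caching logic, not the real driver):

```javascript
"use strict";

// Sketch of lazy connection-pool caching. connect() is a stub standing in
// for mssql's sql.connect -- same shape, no real database.
let connectCalls = 0;
const connect = async () => {
    connectCalls += 1;
    return { close: async () => {} }; // fake pool object
};

let pool = null;
const getConnection = async () => {
    if ( pool ) {
        return pool; // reuse the existing pool
    }
    pool = await connect(); // otherwise create one
    return pool;
};
```

Calling getConnection() any number of times creates exactly one pool; resetting pool to null (as closePool does on errors) forces the next call to create a fresh one.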

Create a folder under src named data. Create a new file under src/data named index.js. Add the following code to this file.

"use strict";
const events = require( "./events" );
const sql = require( "mssql" );
const client = async ( server, config ) => {
   let pool = null;
   const closePool = async () => {
       try {
           // try to close the connection pool
           await pool.close();
           // set the pool to null to ensure
           // a new one will be created by getConnection()
           pool = null;
       } catch ( err ) {
           // error closing the connection (could already be closed)
           // set the pool to null to ensure
           // a new one will be created by getConnection()
           pool = null;
           server.log( [ "error", "data" ], "closePool error" );
           server.log( [ "error", "data" ], err );
       }
   };
   const getConnection = async () => {
       try {
           if ( pool ) {
               // has the connection pool already been created?
               // if so, return the existing pool
               return pool;
           }
           // create a new connection pool
           pool = await sql.connect( config );
           // catch any connection errors and close the pool
           pool.on( "error", async err => {
               server.log( [ "error", "data" ], "connection pool error" );
               server.log( [ "error", "data" ], err );
               await closePool();
           } );
           return pool;
       } catch ( err ) {
           // error connecting to SQL Server
           server.log( [ "error", "data" ], "error connecting to sql server" );
           server.log( [ "error", "data" ], err );
           pool = null;
       }
   };
   // this is the API the client exposes to the rest
   // of the application
   return {
       events: await events.register( { sql, getConnection } )
   };
};
module.exports = client;

When using SQL Server with Node.js, one of the most critical things to get right is properly handling connection errors when they occur. Internally, the sql/data module has two important functions: getConnection and closePool. getConnection returns the active connection pool or creates one if necessary. When any connection error occurs, closePool makes sure the previously active pool is disposed to prevent the module from reusing it.

Create a new file under src/data named utils.js. Add the following code to this file.

"use strict";
const fse = require( "fs-extra" );
const { join } = require( "path" );
const loadSqlQueries = async folderName => {
   // determine the file path for the folder
   const filePath = join( process.cwd(), "src", "data", folderName );
   // get a list of all the files in the folder
   const files = await fse.readdir( filePath );
   // only files that have the .sql extension
   const sqlFiles = files.filter( f => f.endsWith( ".sql" ) );
   // loop over the files and read in their contents
   const queries = {};
   for ( let i = 0; i < sqlFiles.length; i++ ) {
       const query = fse.readFileSync( join( filePath, sqlFiles[ i ] ), { encoding: "UTF-8" } );
       queries[ sqlFiles[ i ].replace( ".sql", "" ) ] = query;
   }
   return queries;
};
module.exports = {
   loadSqlQueries
};

Although it’s possible to embed SQL queries as strings in JavaScript code, I believe it’s better to keep the queries in separate .sql files and load them at startup. This utils module loads all the .sql files in a given folder and returns them as a single object.

Create a new folder under src/data named events. Add a new file under src/data/events named index.js. Add the following code to this file.

"use strict";
const utils = require( "../utils" );
const register = async ( { sql, getConnection } ) => {
   // read in all the .sql files for this folder
   const sqlQueries = await utils.loadSqlQueries( "events" );
   const getEvents = async userId => {
       // get a connection to SQL Server
       const cnx = await getConnection();
       // create a new request
       const request = await cnx.request();
       // configure sql query parameters
       request.input( "userId", sql.VarChar( 50 ), userId );
       // return the executed query
       return request.query( sqlQueries.getEvents );
   };
   return {
       getEvents
   };
};
module.exports = { register };

Add a new file under src/data/events named getEvents.sql. Add the following SQL to this file.

SELECT  [id]
       , [title]
       , [description]
       , [startDate]
       , [startTime]
       , [endDate]
       , [endTime]
FROM    [dbo].[events]
WHERE   [userId] = @userId
ORDER BY
       [startDate], [startTime];

Notice in the last two files that you are using a parameterized query, passing @userId as a named parameter, which guards against SQL injection attacks.
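To see what parameterization protects against, here is a contrived sketch of the attack it prevents (illustration only; nothing here talks to a database, and you should never build queries this way):

```javascript
"use strict";

// Contrived illustration of why @userId is passed via request.input()
// rather than concatenated into the SQL string.
const naiveQuery = userId =>
    `SELECT * FROM events WHERE userId = '${ userId }'`;

// A malicious "user id" rewrites the query to return every row:
const malicious = "' OR '1'='1";
const injected = naiveQuery( malicious );
// injected is now: SELECT * FROM events WHERE userId = '' OR '1'='1'
```

With request.input(), the driver sends the value separately from the query text, so user input can never change the structure of the query.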

Create a Database Client Plugin

Next, you will add a database client plugin to make it easy to run SQL queries from other parts of the application, such as when a user requests an API. In other frameworks, this concept might be known as middleware, but hapi uses the term plugin.

Create a new folder under src named plugins. Create a new file under src/plugins named index.js. Add the following code.

"use strict";
const sql = require( "./sql" );
module.exports.register = async server => {
   // register plugins
   await server.register( sql );
};

Create a new file under src/plugins named sql.js. Add the following code.

"use strict";

// import the data access layer
const dataClient = require( "../data" );

module.exports = {
   name: "sql",
   version: "1.0.0",
   register: async server => {
       // get the sql connection information
       const config = server.app.config.sql;
       
       // create an instance of the database client
       const client = await dataClient( server, config );

       // "expose" the client so it is available everywhere "server" is available
       server.expose( "client", client );
   }
};

Next, update src/server.js to register plugins.

"use strict";

const Hapi = require( "hapi" );
const plugins = require( "./plugins" );
const routes = require( "./routes" );

const app = async config => {
   const { host, port } = config;
 
    // create an instance of hapi
    const server = Hapi.server( { host, port } );

    // store the config for later use
    server.app.config = config;

   // register plugins
   await plugins.register( server );

   // register routes
   await routes.register( server );

   return server;
};

module.exports = app;

Add an API Route

Now you will add an API route that will execute the getEvents query and return the results as JSON. You could add the route to the existing src/routes/index.js. However, as an application grows, it would be better to separate routes into modules that contain related resources.

Create a new folder under src/routes named api. Under src/routes/api, create a new file named index.js. Add the following code to this file.

"use strict";

const events = require( "./events" );

module.exports.register = async server => {
   await events.register( server );
};

Create a new file under src/routes/api named events.js. Add the following code to this file.

"use strict";

module.exports.register = async server => {
   server.route( {
       method: "GET",
       path: "/api/events",
       config: {
           handler: async request => {
               try {
                   // get the sql client registered as a plugin
                   const db = request.server.plugins.sql.client;

                    // TODO: Get the current authenticated user's ID
                   const userId = "user1234";

                   // execute the query
                   const res = await db.events.getEvents( userId );
 
                   // return the recordset object
                   return res.recordset;
               } catch ( err ) {
                   console.log( err );
               }
           }
       }
   } );
};

Now update src/routes/index.js to register the new api routes.

"use strict";

const api = require( "./api" );

module.exports.register = async server => {
   // register api routes
   await api.register( server );

   server.route( {
       method: "GET",
       path: "/",
       handler: async ( request, h ) => {
           return "My first hapi server!";
       }
   } );
};

Whew! You’re almost there! Insert a couple of test records into your database.

INSERT INTO [dbo].[events]
( userId, title, description, startDate, startTime, endDate, endTime )
VALUES
( 'user1234', N'doctor appt', N'Stuff', '2019-10-03', '14:30', NULL, NULL )
, ( 'user1234', N'conference', N'', '2019-09-17', NULL, '2019-09-20', NULL )

Start the web server from the command/terminal window.

node .

Now, navigate your browser to http://localhost:8080/api/events. If everything is set up correctly, you should see a JSON array of the records you just inserted!

Add Authentication to Your Node.js Application

Let’s get some real users in the application! Manually building authentication and user profile management for any application is no trivial task. And, getting it wrong can have disastrous results. Okta to the rescue!

To complete this step, you’ll need an Okta developer account. Go to the Okta Developer Portal and sign up for a forever free Okta account.

Okta sign up

After creating your account, click the Applications link at the top, and then click Add Application.

Add application

Next, choose a Web Application and click Next.

Add web application

Enter a name for your application, such as Node-SQL. Then, click Done to finish creating the application.

Application settings

Near the bottom of the application page you will find a section titled Client Credentials. Copy the Client ID and Client secret values and paste them into your .env file to replace {yourClientId} and {yourClientSecret}, respectively.

Client credentials

Click on the Dashboard link. On the right side of the page, you should find your Org URL. Copy this value into your .env file to replace the value for OKTA_ORG_URL.

Your org URL

Next, enable self-service registration. This will allow new users to create their own account. Click on the Users menu and select Registration.

User registration

Click on the Edit button.

  1. Change Self-service registration to Enabled.
  2. Click the Save button at the bottom of the form.

Enable self-service registration

Build a UI With Embedded JavaScript and Vue.js

In these next steps, you will add a frontend to your Node.js application using Embedded JavaScript (EJS) templates and Vue.js.

First, you will install a few dependencies needed to support authentication, rendering templates, and serving static files.

npm install bell boom ejs hapi-auth-cookie inert vision

Register UI and Authentication Plugins

You will use bell to authenticate with Okta and hapi-auth-cookie to manage user sessions. Create a file under src/plugins named auth.js and add the following code.

"use strict";

const bell = require( "bell" );
const authCookie = require( "hapi-auth-cookie" );

const isSecure = process.env.NODE_ENV === "production";

module.exports.register = async server => {
   // register plugins
   const config = server.app.config;
   await server.register( [ authCookie, bell ] );

   // configure cookie authorization strategy
   server.auth.strategy( "session", "cookie", {
       password: config.cookiePwd,
       redirectTo: "/authorization-code/callback", // If there is no session, redirect here
       isSecure // Should be set to true (which is the default) in production
   } );

   // configure bell to use your Okta authorization server
   server.auth.strategy( "okta", "bell", {
       provider: "okta",
       config: { uri: config.okta.url },
       password: config.cookiePwd,
       isSecure,
       location: config.url,
       clientId: config.okta.clientId,
       clientSecret: config.okta.clientSecret
   } );
};

Next, you will update src/plugins/index.js to register the auth.js module and add support for serving files related to the UI.

"use strict";

const ejs = require( "ejs" );
const inert = require( "inert" );
const { join } = require( "path" );
const vision = require( "vision" );

const auth = require( "./auth" );
const sql = require( "./sql" );

const isDev = process.env.NODE_ENV !== "production";

module.exports.register = async server => {
   // register plugins
   await server.register( [ inert, sql, vision ] );

   // configure ejs view templates
   const filePath = join( process.cwd(), "src" );
   server.views( {
       engines: { ejs },
       relativeTo: filePath,
       path: "views",
       layout: true
   } );

   // register authentication plugins
   await auth.register( server );
};

The inert plugin is used to serve static files, and vision adds support for rendering server-side templates. Here, ejs is configured as the template engine.

Add Server Views

Create a folder under src named views. Under src/views add a new file named layout.ejs and add the following code.

<!DOCTYPE html>
<html>
<head>
   <meta charset="utf-8" />
   <meta http-equiv="X-UA-Compatible" content="IE=edge">
   <title><%= title %></title>
   <meta name="viewport" content="width=device-width, initial-scale=1">
   <link href="https://fonts.googleapis.com/icon?family=Material+Icons" rel="stylesheet">
   <link rel="stylesheet" href="/index.css">
</head>
<body>
   <% include partials/navigation %>
   <%- content %>
   <script src="/index.js"></script>
</body>
</html>

Add a new file to src/views named index.ejs, and add the following code.

<div class="container">
   <% if ( isAuthenticated ) { %>
       <div id="app"></div>
   <% } else { %>
       <h1 class="header"><%= title %></h1>
       <p><%= message %></p>
   <% } %>
</div>

Create a new folder under src/views named partials. Under src/views/partials, add a new file named navigation.ejs, and add the following code.

<nav>
   <div class="nav-wrapper">
       <ul class="left">
           <% if ( isAuthenticated ) { %>
           <li><a class="waves-effect waves-light btn" href="/logout">Logout</a></li>
           <% } else { %>
           <li><a class="waves-effect waves-light btn" href="/login">Login</a></li>
           <% } %>
       </ul>
   </div>
</nav>

Update Routes to Support Views and Authentication

Under src/routes, add a new file named auth.js. Add the following code to this file.

"use strict";

const boom = require( "boom" );

module.exports.register = async server => {
   // login route
   server.route( {
       method: "GET",
       path: "/login",
       options: {
           auth: "session",
           handler: async request => {
               return `Hello, ${ request.auth.credentials.profile.email }!`;
           }
       }
   } );

   // OIDC callback
   server.route( {
       method: "GET",
       path: "/authorization-code/callback",
       options: {
           auth: "okta",
           handler: ( request, h ) => {
               if ( !request.auth.isAuthenticated ) {
                   throw boom.unauthorized( `Authentication failed: ${ request.auth.error.message }` );
               }
               request.cookieAuth.set( request.auth.credentials );
               return h.redirect( "/" );
           }
       }
   } );

   // Logout
   server.route( {
       method: "GET",
       path: "/logout",
       options: {
           auth: {
               strategy: "session",
               mode: "try"
           },
           handler: ( request, h ) => {
               try {
                   if ( request.auth.isAuthenticated ) {
                       // const idToken = encodeURI( request.auth.credentials.token );

                       // clear the local session
                       request.cookieAuth.clear();
                       // redirect to the Okta logout to completely clear the session
                       // const oktaLogout = `${ process.env.OKTA_ORG_URL }/oauth2/default/v1/logout?id_token_hint=${ idToken }&post_logout_redirect_uri=${ process.env.HOST_URL }`;
                       // return h.redirect( oktaLogout );
                   }

                   return h.redirect( "/" );
               } catch ( err ) {
                   request.log( [ "error", "logout" ], err );
               }
           }
       }
   } );
};

Now, edit src/routes/index.js to change the home page so it renders the new EJS view.

"use strict";

const api = require( "./api" );
const auth = require( "./auth" );

module.exports.register = async server => {
   // register api routes
   await api.register( server );

   // register authentication routes
   await auth.register( server );

   // home page route
   server.route( {
       method: "GET",
       path: "/",
       config: {
           auth: {
               strategy: "session",
               mode: "optional"
           }
       },
       handler: async ( request, h ) => {
           try {
               const message = request.auth.isAuthenticated ? `Hello, ${ request.auth.credentials.profile.firstName }!` : "My first hapi server!";
               return h.view( "index", {
                   title: "Home",
                   message,
                   isAuthenticated: request.auth.isAuthenticated
               } );
           } catch ( err ) {
               server.log( [ "error", "home" ], err );
           }
       }
   } );

   // Serve static files in the /dist folder
   server.route( {
       method: "GET",
       path: "/{param*}",
       handler: {
           directory: {
               path: "dist"
           }
       }
   } );
};

Update API Routes and Add SQL Queries

You need to update the application API to query the database based on the currently logged-in user. At a minimum, you need routes to create, update, and delete events, along with their respective SQL queries.

Create a new file under src/data/events named addEvent.sql. Add the following SQL to this file.

INSERT INTO [dbo].[events]
(
   [userId]
   , [title]
   , [description]
   , [startDate]
   , [startTime]
   , [endDate]
   , [endTime]
)
VALUES
(
   @userId
   , @title
   , @description
   , @startDate
   , @startTime
   , @endDate
   , @endTime
);

SELECT SCOPE_IDENTITY() AS id;

Create a new file under src/data/events named updateEvent.sql. Add the following SQL to this file.

UPDATE  [dbo].[events]
SET     [title] = @title
       , [description] = @description
        , [startDate] = @startDate
       , [startTime] = @startTime
       , [endDate] = @endDate
       , [endTime] = @endTime
WHERE   [id] = @id
 AND   [userId] = @userId;

SELECT  [id]
       , [title]
       , [description]
       , [startDate]
       , [startTime]
       , [endDate]
       , [endTime]
FROM    [dbo].[events]
WHERE   [id] = @id
 AND   [userId] = @userId;

Create a new file under src/data/events named deleteEvent.sql. Add the following SQL to this file.

DELETE  [dbo].[events]
WHERE   [id] = @id
 AND   [userId] = @userId;

Update src/data/events/index.js to contain the following code.

"use strict";

const utils = require( "../utils" );

const register = async ( { sql, getConnection } ) => {
   // read in all the .sql files for this folder
   const sqlQueries = await utils.loadSqlQueries( "events" );

   const getEvents = async userId => {
       // get a connection to SQL Server
       const cnx = await getConnection();

       // create a new request
       const request = await cnx.request();

       // configure sql query parameters
       request.input( "userId", sql.VarChar( 50 ), userId );

       // return the executed query
       return request.query( sqlQueries.getEvents );
   };

   const addEvent = async ( { userId, title, description, startDate, startTime, endDate, endTime } ) => {
       const pool = await getConnection();
       const request = await pool.request();
       request.input( "userId", sql.VarChar( 50 ), userId );
       request.input( "title", sql.NVarChar( 200 ), title );
       request.input( "description", sql.NVarChar( 1000 ), description );
       request.input( "startDate", sql.Date, startDate );
       request.input( "startTime", sql.Time, startTime );
       request.input( "endDate", sql.Date, endDate );
       request.input( "endTime", sql.Time, endTime );
       return request.query( sqlQueries.addEvent );
   };

   const updateEvent = async ( { id, userId, title, description, startDate, startTime, endDate, endTime } ) => {
       const pool = await getConnection();
       const request = await pool.request();
       request.input( "id", sql.Int, id );
       request.input( "userId", sql.VarChar( 50 ), userId );
       request.input( "title", sql.NVarChar( 200 ), title );
       request.input( "description", sql.NVarChar( 1000 ), description );
       request.input( "startDate", sql.Date, startDate );
       request.input( "startTime", sql.Time, startTime );
       request.input( "endDate", sql.Date, endDate );
       request.input( "endTime", sql.Time, endTime );
       return request.query( sqlQueries.updateEvent );
   };

   const deleteEvent = async ( { id, userId } ) => {
       const pool = await getConnection();
       const request = await pool.request();
       request.input( "id", sql.Int, id );
       request.input( "userId", sql.VarChar( 50 ), userId );
       return request.query( sqlQueries.deleteEvent );
   };

   return {
       addEvent,
       deleteEvent,
       getEvents,
       updateEvent
   };
};

module.exports = { register };

Update src/routes/api/events.js to contain the following code.

"use strict";
const boom = require( "boom" );
module.exports.register = async server => {
   server.route( {
       method: "GET",
       path: "/api/events",
       config: {
           auth: {
               strategy: "session",
               mode: "required"
           },
           handler: async request => {
               try {
                   // get the sql client registered as a plugin
                   const db = request.server.plugins.sql.client;
                   // get the current authenticated user's id
                   const userId = request.auth.credentials.profile.id;
                   // execute the query
                   const res = await db.events.getEvents( userId );
                   // return the recordset object
                   return res.recordset;
               } catch ( err ) {
                   server.log( [ "error", "api", "events" ], err );
                   return boom.boomify( err );
               }
           }
       }
   } );
   server.route( {
       method: "POST",
       path: "/api/events",
       config: {
           auth: {
               strategy: "session",
               mode: "required"
           },
           handler: async request => {
               try {
                   const db = request.server.plugins.sql.client;
                   const userId = request.auth.credentials.profile.id;
                   const { startDate, startTime, endDate, endTime, title, description } = request.payload;
                   const res = await db.events.addEvent( { userId, startDate, startTime, endDate, endTime, title, description } );
                   return res.recordset[ 0 ];
               } catch ( err ) {
                   server.log( [ "error", "api", "events" ], err );
                   return boom.boomify( err );
               }
           }
       }
   } );
   server.route( {
       method: "DELETE",
       path: "/api/events/{id}",
       config: {
           auth: {
               strategy: "session",
               mode: "required"
           },
           response: {
               emptyStatusCode: 204
           },
           handler: async request => {
               try {
                   const id = request.params.id;
                   const userId = request.auth.credentials.profile.id;
                   const db = request.server.plugins.sql.client;
                   const res = await db.events.deleteEvent( { id, userId } );
                   return res.rowsAffected[ 0 ] === 1 ? "" : boom.notFound();
               } catch ( err ) {
                   server.log( [ "error", "api", "events" ], err );
                   return boom.boomify( err );
               }
           }
       }
   } );
};
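Note how the DELETE handler turns the query result into an HTTP outcome: exactly one affected row returns an empty body (which hapi sends with the `emptyStatusCode` of 204), and zero rows returns a 404. A minimal sketch of that mapping, with a plain object standing in for `boom.notFound()`:

```javascript
// Sketch of the DELETE handler's result mapping; notFound is a hypothetical
// stand-in for boom.notFound(), not the real boom API.
const notFound = () => ( { statusCode: 404, error: "Not Found" } );

const mapDeleteResult = res =>
    // mssql reports per-statement affected row counts in rowsAffected.
    res.rowsAffected[ 0 ] === 1 ? "" : notFound();

console.log( mapDeleteResult( { rowsAffected: [ 1 ] } ) ); // → "" (sent as 204)
console.log( mapDeleteResult( { rowsAffected: [ 0 ] } ).statusCode ); // → 404
```

Filtering the delete by both `id` and `userId` is what makes the 404 case work: a user who guesses another user's event id affects zero rows and gets Not Found rather than deleting someone else's data.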

Add Vue.js

First, install dependencies for Vue.js and other packages used for the UI.

npm install [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected]

Create a new folder at the root of the project named client. In this folder, add a new file named index.js. Add the following code to this file.

import Datetime from "vue-datetime";
import Vue from "vue";
import "materialize-css";
import "materialize-css/dist/css/materialize.min.css";
import "vue-datetime/dist/vue-datetime.css";

import App from "./App";

Vue.use( Datetime );

new Vue( { // eslint-disable-line no-new
 el: "#app",
 render: h => h( App )
} );

Add a new file to client named App.vue. Add the following code to this file.

<template>
 <div id="app">
   <h1>{{ msg }}</h1>
   <div class="row" id="eventList">
       <h2>Event List</h2>
       <table v-if="hasEvents">
           <thead>
               <tr>
                   <th>Start</th>
                   <th>End</th>
                   <th>Title</th>
                   <th>Description</th>
                   <th></th>
               </tr>
           </thead>
           <tbody>
               <tr v-for="event in events" :key="event.id">
                   <td>{{ event.startDate }} {{ event.startTime }}</td>
                   <td>{{ event.endDate }} {{ event.endTime }}</td>
                   <td>{{ event.title }}</td>
                   <td>{{ event.description }}</td>
                   <td>
                       <button id="eventDelete" @click="confirmDeleteEvent(event.id)" class="btn-small"><i class="material-icons right">delete</i>Delete</button>
                   </td>
               </tr>
           </tbody>
       </table>
       <p v-if="noEvents">No events yet!</p>
   </div>
   <div class="row" id="eventEdit">
       <h2>Add an Event</h2>
       <form class="col s12" @submit.prevent="addEvent">
           <div class="row">
               <div class="input-field col s6">
                   <span class="datetime-label">Start Date</span>
                   <datetime v-model="startDate" input-id="startDate" type="date" value-zone="local" input-class="validate"></datetime>
                   <!-- <label for="startDate" class="datetime-label">Start Date</label> -->
               </div>
               <div class="input-field col s6">
                   <span class="datetime-label">Time</span>
                   <datetime v-model="startTime" input-id="startTime" type="time" minute-step="5" use12-hour="true" value-zone="local" input-class="validate"></datetime>
                   <!-- <label for="startTime" class="datetime-label">Time</label> -->
               </div>
           </div>
           <div class="row">
               <div class="input-field col s6">
                   <span class="datetime-label">End Date</span>
                   <datetime v-model="endDate" input-id="endDate" type="date" value-zone="local" input-class="validate"></datetime>
                   <!-- <label for="endDate">End Date</label> -->
               </div>
               <div class="input-field col s6">
                   <span class="datetime-label">Time</span>
                   <datetime v-model="endTime" input-id="endTime" type="time" minute-step="5" use12-hour="true" value-zone="local" input-class="validate"></datetime>
                   <!-- <input v-model="endTime" ref="endTime" placeholder="" id="endTime" type="text" class="validate"> -->
                   <!-- <label for="endTime">Time</label> -->
               </div>
           </div>
           <div class="row">
               <div class="input-field col s12">
                   <input v-model="title" ref="title" placeholder="Appointment" id="title" type="text" class="validate">
                   <label for="title">Title</label>
               </div>
           </div>
           <div class="row">
               <div class="input-field col s12">
                   <input v-model="description" ref="description" placeholder="Description" id="description" type="text" class="validate">
                   <label for="description">Description</label>
               </div>
           </div>
           <button id="eventEditSubmit" class="btn" type="submit"><i class="material-icons right">send</i>Submit</button>
       </form>
   </div>
   <div id="deleteConfirm" ref="deleteConfirm" class="modal">
       <div class="modal-content">
           <h2>Confirm delete</h2>
           <p>Delete {{ selectedEvent }}?</p>
       </div>
       <div class="modal-footer">
           <button @click="deleteEvent(selectedEventId)" class="modal-close btn-flat">Ok</button>
           <button class="modal-close btn-flat">Cancel</button>
       </div>
   </div>
 </div>
</template>
<script>
import axios from "axios";
import * as M from "materialize-css";
import moment from "moment";
export default {
 name: "app",
 computed: {
   hasEvents() {
     return this.isLoading === false && this.events.length > 0;
   },
   noEvents() {
     return this.isLoading === false && this.events.length === 0;
   }
 },
 data() {
   return {
     title: "",
     description: "",
     events: [],
     isLoading: true,
     startDate: "",
     startTime: "",
     endDate: "",
     endTime: "",
     selectedEvent: "",
     selectedEventId: 0
   };
 },
 methods: {
   addEvent() {
     const event = {
       startDate: this.startDate ? moment( this.startDate ).format( "YYYY-MM-DD" ) : null,
       startTime: this.startTime ? moment( this.startTime ).format( "YYYY-MM-DD HH:mm:00" ) : null,
       endDate: this.endDate ? moment( this.endDate ).format( "YYYY-MM-DD" ) : null,
       endTime: this.endTime ? moment( this.endTime ).format( "YYYY-MM-DD HH:mm:00" ) : null,
       title: this.title,
       description: this.description
     };
     axios
       .post( "/api/events", event )
       .then( () => {
         this.startDate = "";
         this.startTime = "";
         this.endDate = "";
         this.endTime = "";
         this.title = "";
         this.description = "";
         this.loadEvents();
       } )
       .catch( err => {
         this.msg = err.message;
         console.log( err );
       } );
   },
   confirmDeleteEvent( id ) {
     const event = this.events.find( e => e.id === id );
     this.selectedEvent = `'${ event.title }' on ${ event.startDate }${ event.startTime ? ` at ${ event.startTime }` : "" }`;
     this.selectedEventId = event.id;
     const dc = this.$refs.deleteConfirm;
     const modal = M.Modal.init( dc );
     modal.open();
   },
   deleteEvent( id ) {
     axios
       .delete( `/api/events/${ id }` )
       .then( this.loadEvents )
       .catch( err => {
         this.msg = err.message;
         console.log( err );
         this.loadEvents();
       } );
   },
   formatDate( d ) {
     return d ? moment.utc( d ).format( "MMM D, YYYY" ) : "";
   },
   formatTime( t ) {
     return t ? moment( t ).format( "h:mm a" ) : "";
   },
   formatEvents( events ) {
     return events.map( event => {
       return {
         id: event.id,
         title: event.title,
         description: event.description,
         startDate: this.formatDate( event.startDate ),
         startTime: this.formatTime( event.startTime ),
         endDate: this.formatDate( event.endDate ),
         endTime: this.formatTime( event.endTime )
       };
     } );
   },
   loadEvents() {
     axios
       .get( "/api/events" )
       .then( res => {
         this.isLoading = false;
         this.events = this.formatEvents( res.data );
       } )
       .catch( err => {
         this.msg = err.message;
         console.log( err );
       } );
   }
 },
 mounted() {
   return this.loadEvents();
 }
};
</script>
<style lang="css">
#app h2 {
 font-size: 2rem;
}
.datetime-label {
 color: #9e9e9e;
 font-size: .8rem;
}
</style>

Add a Build Process

You need a build process that transforms and bundles the client UI into a format compatible with most browsers. For Node.js applications, these build steps are typically added to the package.json file under scripts.

First, install the packages you will need for building the client files.

npm install --save-dev [email protected] [email protected] [email protected] @vue/[email protected] [email protected]

Note: The --save-dev argument instructs npm to install these as developer dependencies, as opposed to dependencies required for production at runtime.
Now, modify package.json and change the scripts section to match the following.

 "scripts": {
   "build": "parcel build client/index.js",
   "dev:start": "npm-run-all build start",
   "dev": "nodemon --watch client --watch src -e js,ejs,sql,vue,css --exec npm run dev:start",
   "start": "node .",
   "test": "echo \"Error: no test specified\" && exit 1"
 },

You can run any of these scripts from the command line or terminal using npm run [label], where label is any of the labels defined under scripts. For example, you can run just the build step using npm run build.

By the way, nodemon is a fantastic utility that watches for changes to files and automatically restarts the Node.js application. You can now start the new build process and launch the web application with one command.

npm run dev

Calendar demo

I hope you have enjoyed learning how to use SQL Server with Node.js! You can get the final source code for this project on GitHub, which also includes a few extras, such as example tests and a task to automate initializing the SQL database.