Improve your Docker workflow with this VS Code extension

There are quite a few things you need to do as part of your Docker workflow. You spend a lot of your time at the terminal, and a lot of it authoring Dockerfiles and/or docker-compose.yml files. Luckily, there exists an extension that can greatly help with all of the above, as well as with deploying to the Cloud.

Your Docker workflow

There are some actions we keep performing when dealing with Docker. Those are:

  • Authoring a Dockerfile or docker-compose.yml
  • Managing images: everything from building and tagging to pushing them to a repository, and much more
  • Running/starting/stopping/removing your containers; there are quite a few steps involved if we do this for every container or image. Luckily, Docker Compose can operate on groups.
  • Pushing your Docker image to some sort of registry, like Docker Hub or a registry in the Cloud
  • Taking it to production, either on premise or using some sort of Cloud solution

We are likely to spend a lot of our time in the terminal, unless we have something like Docker Kitematic or a similar tool at our disposal.
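For reference, one iteration of that manual terminal workflow looks something like the following (the image and repository names are placeholders):

```shell
# Build an image from the Dockerfile in the current directory
docker build -t my-app:latest .

# Tag it for a registry and push it
docker tag my-app:latest myuser/my-app:latest
docker push myuser/my-app:latest

# Run it, then stop and remove the container when done
docker run -d -p 3000:3000 --name my-app my-app:latest
docker stop my-app
docker rm my-app
```

That is a lot of typing per iteration, which is exactly what the extension helps reduce.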

Docker extension

The point of this article is to present a Visual Studio Code Extension that can really help your workflow. So what can it do?

  • Authoring: it helps with generating Dockerfiles as well as Docker Compose files. Furthermore, it offers autocompletion, lints your files, and much more.
  • Managing: it comes loaded with a set of commands that help with everything from file generation to managing your images and your containers
  • Browsing repositories: it allows you to browse Docker Hub as well as container registries in the Cloud
  • Deploying to the Cloud: the tool enables you to deploy to the Cloud in one click; just select your image and there you are, as simple as you want deployment to be

Install

We install this like we would any other extension. We open up Visual Studio Code, press the Extensions button, and type Docker, like so:
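If you prefer the command line, extensions can also be installed with VS Code's code CLI. The extension ID below is the Marketplace ID at the time of writing and is an assumption on my part; verify it in the Extensions view:

```shell
# Install the Docker extension from the terminal
code --install-extension ms-azuretools.vscode-docker
```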

Authoring

There are two ways we can go about this:

  • Create our Dockerfile or docker-compose.yml file and start authoring
  • Have the extension generate the files for us

Let's show the latter.

Generate files

Bring up your command menu, CMD + SHIFT + P on a Mac, and start typing Docker. It should show you this:

Select Add Docker Files to Workspace

Then we are prompted to select a platform:

We go with Node.js because that's what we are trying to build. If you have a Go or .NET Core project, select that instead.

Lastly, we are asked to select a port; we go with the suggested default, 3000.

Which dialogs you need to go through after selecting a platform may differ per choice of platform.

The generated files

Ok then, what did we get from this?
We got:

  • docker-compose.yml
  • docker-compose.debug.yml
  • Dockerfile
  • .dockerignore

Not just the files, but files loaded with content.

Dockerfile
Let’s look at the Dockerfile for example

FROM node:10.13-alpine
ENV NODE_ENV production
WORKDIR /usr/src/app
COPY ["package.json", "package-lock.json*", "npm-shrinkwrap.json*", "./"]
RUN npm install --production --silent && mv node_modules ../
COPY . .
EXPOSE 3000
CMD npm start

Above we see that everything is done for us. It has

  • Selected an image
  • Set an env variable
  • Set a workdir
  • Copied package.json and package-lock.json
  • Installed our libraries
  • Copied our application files
  • Exposed a port
  • Issued a command that will start our app up in the container

Quite impressive! Of course, we still need to author our app.

docker-compose.yml
Let’s look at the docker-compose.yml file next:

version: '2.1'

services:
  articles:
    image: articles
    build: .
    environment:
      NODE_ENV: production
    ports:
      - 3000:3000

It has set everything up in terms of how to build the image, set an environment variable, and mapped a port.

docker-compose.debug.yml
This gives us a file very similar to docker-compose.yml, with the difference that it runs Node in inspect mode, like so: command: node --inspect index.js
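As a sketch, the generated debug file typically looks like the one below. The extra port mapping for 9229 (Node's default inspector port) is an assumption based on what the extension usually generates; check your own file:

```yaml
version: '2.1'

services:
  articles:
    image: articles
    build: .
    environment:
      NODE_ENV: development
    ports:
      - 3000:3000
      # expose the Node inspector so a debugger can attach
      - 9229:9229
    command: node --inspect index.js
```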

.dockerignore
This file contains a lot of good patterns that match files that we don’t want to copy over like node_modules, .git, .env, Dockerfile. You might want to adjust this file to fit your needs.

Authoring with autocomplete

Ok. Let’s look at a scenario where we do everything from scratch. Let’s start off by creating a Dockerfile.

Let’s start typing FROM. As you can see below, we get help with typing the command and what it should look like.

We keep typing the name of our base image; in this case we are looking for a Node.js image, so we start typing the character n. Below we get a list of options matching what we are typing. It lists the base images by popularity and also adds some useful information so we understand what we are getting:

The next thing we try to type is ENV, but we only get as far as E before it starts suggesting which command we are writing and how to type it:


As you can see, it’s quite helpful in showing how we should type the command.

Next up is WORKDIR and it shows us:

Not only is it telling us how to type the command, but it also tells us that it affects commands like COPY and ADD.

At this point we want to tell it to copy some files we might need before running commands like installing a library:


This shows us the two different ways in which we can copy things: with relative or absolute paths.

As mentioned, we want to run a command so we can install things. Our autocomplete tells us the following:


Again, it suggests what such a command might look like.

This far into the Dockerfile, we might want to COPY our application files. We’ve already shown you how to use the autocomplete for that, so let’s look at EXPOSE:

As you can see, it shows all the different ways in which you can expose a port. Really educational.

Ok, one more command is usually needed at this point: either CMD or ENTRYPOINT, to start up our app in the container:
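Putting the walkthrough together, a hand-authored Dockerfile would end up along these lines (a minimal sketch that mirrors the generated one from earlier):

```dockerfile
FROM node:10.13-alpine
ENV NODE_ENV production
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install --production
COPY . .
EXPOSE 3000
CMD ["npm", "start"]
```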

Manage

We will use the Command Palette here. It lets us invoke an almost ridiculous number of commands. Let’s go through them by topic. The Command Palette contains a long, long list of Docker commands. We can:

  • Build a Docker image
  • Run a container
  • See logs
  • Stop/Remove a container
  • Remove an image and much, much more…

Let’s focus on getting an app up and running.

Build the app

Ok, this is a really simple app, so let’s turn it into a Node.js app by going to the terminal and running:

npm init -y

then run:

npm install express

followed by adding the following to app.js:

const express = require('express')
const app = express()
const port = 3000

app.get('/', (req, res) => res.send('Hello World!'))
app.listen(port, () => console.log(`Example app listening on port ${port}!`))

Lastly, update package.json by adding the following to scripts:

"start": "node app.js"

Now we are ready!

For all the commands below, bring up the Command Palette with View / Command Palette from the menu, or use the keyboard shortcut; on a Mac it is CMD + SHIFT + P.

Build the image

Start typing Docker: Build, and the autocomplete will narrow down the choices. Invoke the suggested command.

This asks us whether we want to use the Dockerfile in the current directory, and what to tag the image with. After we’ve made our choices, it sets about pulling down the base image and carrying out all the commands in the Dockerfile.
Once it’s done, you should be able to see the newly built image by typing docker images and looking for the tag name that you gave it; it should be listed at the top.

Run the image

Start typing Docker: Run and take the command it suggests. This gives you a list of Docker images you could run. Looking at the command it invokes in the terminal, it looks like so: docker run --rm -d -p 3000:3000/tcp articles:latest
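For clarity, here is that same invocation with the flags spelled out, plus a quick way to verify the app responds:

```shell
# --rm : remove the container automatically when it exits
# -d   : run detached, in the background
# -p   : map host port 3000 to container port 3000 over TCP
docker run --rm -d -p 3000:3000/tcp articles:latest

# verify the app answers
curl http://localhost:3000
```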

Docker Compose

Of course, we can also leverage the power of Docker Compose, both up and down.

Start typing Docker: Compose Up. The first time it’s run, this creates the Docker images and then runs the containers. Verify this with docker ps. Additionally, we have Docker: Compose Down and Docker: Compose Restart.
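These palette commands map onto the familiar Compose CLI calls, which you can always run yourself in the terminal:

```shell
docker-compose up -d      # build (on first run) and start the services
docker ps                 # verify the containers are running
docker-compose restart    # restart the services
docker-compose down       # stop and remove containers and networks
```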

Browse repositories

At the bottom of your Activity Bar, you should have an icon that looks like the Docker whale. Click that, and you should see:


As you can see above, you can view all the images on your machine, but you can also look in different registries such as Docker Hub, Azure, and any private registries you’ve added. To use the Azure one, you need the Azure Account extension installed. Once that is installed, you should see something like this:

There are more commands we can carry out if we right-click a Docker image in our container registry in Azure:

As you can see, we can look at our resource in the portal. We can remove the entire repository, but we can also pull down whatever is there to our local machine.

Deploy to Cloud

There is one way to deploy to the Cloud:

  • Deploy from Docker Hub

An article covering this extension says deployment from a Container Registry should also be possible. I’m sure it is; I just couldn’t figure out how to do it from the extension. I will update this article as soon as I do.

Anyway, to deploy from Docker Hub you just need to log in to it, right-click your Docker image, and select Deploy, like so:

Summary

We’ve shown you a lot of things you can do with this Visual Studio Code extension. You can manage your images and containers and do all sorts of things with them: build them, run them, see the logs, and even bring them to the Cloud.

I hope you found this useful and that you give the extension a go.

#docker #vscode #devops
