Learn to use the tools built into Deno, including a linter, a test runner, a dependency inspector, and more. We introduce each tool and provide usage advice.
One surprising difference between Deno and Node.js is the number of tools built into the runtime. Other than a Read-Eval-Print Loop (REPL) console, Node.js requires third-party modules to handle most indirect coding activities such as testing and linting. The Deno built-in tools provide almost everything you need out of the box.
Before we begin, a note. Deno is new! Use these tools with caution. Some may be unstable. Few have configuration options. Others may have undesirable side effects such as recursively processing every file in every subdirectory. It’s best to test tools from a dedicated project directory.
Install Deno on macOS or Linux using the following terminal command:
curl -fsSL https://deno.land/x/install/install.sh | sh
Or from Windows PowerShell:
iwr https://deno.land/x/install/install.ps1 -useb | iex
Further installation options are provided in the Deno manual.
Enter deno --version to check the installation has been successful. The version numbers for the V8 JavaScript engine, TypeScript compiler, and Deno itself are displayed.
Upgrade Deno to the latest version with:
deno upgrade
Or upgrade to a specific release, such as v1.3.0:
deno upgrade --version 1.3.0
Most of the tools below are available in all versions, but later editions may have more features and bug fixes.
A list of tools and options can be viewed by entering:
deno help
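Help for an individual tool can be viewed by appending its name, for example:
deno help run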
Like Node.js, a REPL expression evaluation console can be accessed by entering deno in your terminal. Each expression you enter returns a result or undefined:
$ deno
Deno 1.3.0
exit using ctrl+d or close()
> const w = 'World';
undefined
> w
World
> console.log(`Hello ${w}!`);
Hello World!
undefined
> close()
$
Previously entered expressions can be re-entered by using the cursor keys to navigate through the expression history.
A tree of all module dependencies can be viewed by entering deno info <module>, where <module> is the path/URL to an entry script.
Consider the following lib.js library code with exported hello and sum functions:
// general library: lib.js

/**
 * return "Hello <name>!" string
 * @module lib
 * @param {string} name
 * @returns {string} Hello <name>!
 */
export function hello(name = 'Anonymous') {
  return `Hello ${ name.trim() }!`;
}

/**
 * Returns total of all arguments
 * @module lib
 * @param {...*} args
 * @returns {*} total
 */
export function sum(...args) {
  return [...args].reduce((a, b) => a + b);
}
These can be used from a main entry script, index.js, in the same directory:
// main entry script: index.js

// import lib.js modules
import { hello, sum } from './lib.js';

const
  spr = sum('Site', 'Point', '.com', ' ', 'reader'),
  add = sum(1, 2, 3);

// output
console.log( hello(spr) );
console.log( 'total:', add );
The result of running deno run ./index.js:
$ deno run ./index.js
Hello SitePoint.com reader!
total: 6
The dependencies used by index.js can be examined with deno info ./index.js:
$ deno info ./index.js
local: /home/deno/testing/index.js
type: JavaScript
deps:
file:///home/deno/testing/index.js
└── file:///home/deno/testing/lib.js
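Since the introduction mentioned Deno's built-in test runner, here's a minimal sketch of how lib.js could be exercised with deno test. The file name lib.test.js and the unpinned std URL are illustrative assumptions rather than part of the original example:

// minimal test sketch: lib.test.js (run with: deno test)
import { assertEquals } from 'https://deno.land/std/testing/asserts.ts';
import { hello, sum } from './lib.js';

Deno.test('hello() greets by name', () => {
  assertEquals(hello('World'), 'Hello World!');
});

Deno.test('sum() totals its arguments', () => {
  assertEquals(sum(1, 2, 3), 6);
});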
#deno #javascript #node #typescript #web-development
Can you use WordPress for anything other than blogging? Perhaps to your surprise, yes. WordPress is more than just a blogging tool, and it has helped thousands of websites and web applications thrive. WordPress powers around 40% of online projects, and in this article we'll look at some of its impressive uses beyond blogging.
What Is The Use Of WordPress?
WordPress is the most popular website platform in the world. It is the first choice for businesses that want a feature-rich and dynamic content management system. So, if you ask what WordPress is used for, the answer is: everything. It is a super-flexible, feature-rich, and secure platform that offers everything you need to build unique websites and applications. Let's look at some of those uses:
1. Multiple Websites Under A Single Installation
WordPress Multisite allows you to develop multiple sites from a single WordPress installation. You can download WordPress once and launch every website you want under a single server. Practically speaking, you can manage hundreds of sites from one dashboard, which is genuinely impressive.
It is a highly efficient setup that lets you run several websites under the same login credentials. One of the best things about WordPress is the themes and plugins it offers: you can download them once and reuse them across your sites, saving space without sacrificing speed.
2. WordPress Social Network
WordPress can be used for high-end projects such as a social media network. If you don't have the money or patience to hire a developer and invest months in building a feature-rich social media site, WordPress is one of the most compelling options. Its CMS is hard to beat, and you can build community sites in the spirit of Facebook or Reddit; WordPress just makes the process a lot easier.
To set up a social network, download a WordPress plugin called BuddyPress. It lets you create a community page with ease and provides all the essential features of a community or social network: direct messaging, activity streams, user groups, extended profiles, and much more. You just have to download and configure it.
If BuddyPress doesn’t meet all your needs, don’t give up on your dreams. You can try out WP Symposium or PeepSo. There are also several themes you can use to build a social network.
3. Create A Forum For Your Brand’s Community
Communities are very important for your business. They help you stay in constant contact with your users and customers, and they allow you to turn them into a loyal customer base. While there are many good technologies for building a community page, good old WordPress is still one of the best.
WordPress is one of the best community-development technologies available. If you want to build your own online community, consider the features you get with it. Plugins such as bbPress provide open-source, template-driven PHP/MySQL forum software that is very simple and doesn't hamper the experience of the website.
Other tools such as wpForo and Asgaros Forum are equally good for creating a community forum. They are lightweight, easy to manage, and integrate with your WordPress site easily. There is only one small catch: you need some technical knowledge to build a WordPress community page.
4. Shortcodes
Since we raised a problem in the previous section, here is a neat solution: you might not know how to code, but you have shortcodes. Shortcodes let you execute functions without having to write code. They are an easy way to build a website, add new features, and customize plugins: short snippets that you drop into your content, so even with zero technical knowledge you can build a feature-rich website or application (see the example below).
There are also plugins like Shortcoder, Shortcodes Ultimate, and the Basics available on WordPress, so you don't even have to remember the shortcodes.
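For a sense of what a shortcode looks like in practice, here is WordPress's built-in gallery shortcode placed directly in post content (the attachment IDs are purely illustrative):

[gallery ids="1,2,3"]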
5. Build Online Stores
If you're still wondering why to use WordPress, use it to build an online store. It is an affordable way to start selling your goods online through a feature-rich eCommerce store.
WooCommerce is an extension of WordPress and one of the most widely used eCommerce solutions. WooCommerce holds a 28% share of the global market and is one of the best ways to set up an online store. It allows you to build user-friendly, professional online stores and has thousands of free and paid extensions. Moreover, it's open source, so you don't have to pay for a license.
Apart from WooCommerce, there are Easy Digital Downloads, iThemes Exchange, the Shopify eCommerce plugin, and many more.
6. Security Features
WordPress takes security very seriously, and there are tons of external solutions that help you safeguard your WordPress site. While there is no way to ensure 100% security, WordPress provides regular updates with security patches and offers several plugins to help with backups, two-factor authentication, and more.
Choosing a hosting provider like WP Engine can improve your site's security further, with threat detection, managed patching and updates, internal security audits for customers, and much more.
#use of wordpress #use wordpress for business website #use wordpress for website #what is use of wordpress #why use wordpress #why use wordpress to build a website
Frameworks and libraries are the fundamental building blocks developers use when building software and applications. These tools help eliminate repetitive tasks and reduce the amount of code developers need to write for a particular piece of software.
Recently, the Stack Overflow Developer Survey 2020 surveyed nearly 65,000 developers, who voted for their go-to tools and libraries. Here, we list the top 12 frameworks and libraries from the survey that were most used by developers around the globe in 2020.
(The libraries are listed according to their number of Stars in GitHub)
**TensorFlow**
**GitHub Stars:** 147k
**Rank:** 5
**About:** Originally developed by researchers on the Google Brain team, TensorFlow is an end-to-end open-source platform for machine learning. It has a comprehensive, flexible ecosystem of tools, libraries, and community resources that lets researchers push the state of the art in ML. It allows developers to easily build and deploy ML-powered applications.
Know more here.
**Flutter**
**GitHub Stars:** 98.3k
**Rank:** 9
**About:** Created by Google, Flutter is a free and open-source software development kit (SDK) that enables fast user experiences for mobile, web, and desktop from a single codebase. The SDK works with existing code and is used by developers and organisations around the world.
#opinions #developer tools #frameworks #java tools #libraries #most used tools by developers #python tools
Deno is a JavaScript/TypeScript runtime with secure defaults and a great developer experience. It's built on V8, Rust, and Tokio. I suggest you watch Ryan Dahl's talks: one where he discusses his regrets about Node.js, and another that takes a more in-depth look at Deno.
In this article, we're going to build a simple CLI tool to demonstrate some of the features of Deno. Our CLI will interact with a COVID API to fetch live data.
Requirement: make sure you have Deno installed. If you don't, refer to the official installation instructions; it's pretty straightforward.
Deno projects conventionally use mod.ts as the entry file, so we'll follow the same convention in this article. If you're coding along, create a folder named covid-cli, create a file called mod.ts inside it, and copy the code below into it.
const { args } = Deno;
import { parse } from "https://deno.land/std/flags/mod.ts";
console.dir(parse(args));
Here parse(args, options = {}) takes two parameters: args is an array and options is an object. Let's try running the above code using this command:
deno run mod.ts arg1 -f hello --flag World --booleanFlag
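Given the minimist-style parsing that std/flags performs, the printed object should look roughly like this (exact formatting may vary):

{ _: [ "arg1" ], f: "hello", flag: "World", booleanFlag: true }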
#deno #cli tool #tool
The DevOps methodology, a software and team management approach defined by the portmanteau of Development and Operations, was first coined in 2009 and has since become a buzzword concept in the IT field.
DevOps has come to mean many things to each individual who uses the term as DevOps is not a singularly defined standard, software, or process but more of a culture. Gartner defines DevOps as:
“DevOps represents a change in IT culture, focusing on rapid IT service delivery through the adoption of agile, lean practices in the context of a system-oriented approach. DevOps emphasizes people (and culture), and seeks to improve collaboration between operations and development teams. DevOps implementations utilize technology — especially automation tools that can leverage an increasingly programmable and dynamic infrastructure from a life cycle perspective.”
As you can see from the above definition, DevOps is a multi-faceted approach to the Software Development Life Cycle (SDLC), but its main underlying strength is how it leverages technology and software to streamline this process. So with the right approach to DevOps, notably adopting its philosophies of co-operation and implementing the right tools, your business can deploy 30 times more frequently, with lead times up to 8,000 times faster than traditional methods, according to a CapGemini survey.
This list is designed to be as comprehensive as possible. The article comprises both very well established tools for those who are new to the DevOps methodology and those tools that are more recent releases to the market — either way, there is bound to be a tool on here that can be an asset for you and your business. For those who already live and breathe DevOps, we hope you find something that will assist you in your growing enterprise.
With such a litany of tools to choose from, there is no "right" answer to which ones you should adopt. No single tool will cover all your needs, and tools will be deployed across a variety of development and operational teams, so let's break down what you need to consider before choosing which tool might work for you.
With all that in mind, I hope this selection of tools will aid you as your business continues to expand into the DevOps lifestyle.
Infrastructure as Code
AWS CloudFormation is an absolute must if you are currently working, or planning to work, in the AWS Cloud. CloudFormation allows you to model your AWS infrastructure and provision all your AWS resources swiftly and easily. All of this is done within a JSON or YAML template file and the service comes with a variety of automation features ensuring your deployments will be predictable, reliable, and manageable.
Link: https://aws.amazon.com/cloudformation/
Azure Resource Manager (ARM) is Microsoft’s answer to an all-encompassing IAC tool. With its ARM templates, described within JSON files, Azure Resource Manager will provision your infrastructure, handle dependencies, and declare multiple resources via a single template.
Link: https://azure.microsoft.com/en-us/features/resource-manager/
Much like the tools mentioned above, Google Cloud Deployment Manager is Google’s IAC tool for the Google Cloud Platform. This tool utilizes YAML for its config files and JINJA2 or PYTHON for its templates. Some of its notable features are synchronistic deployment and ‘preview’, allowing you an overhead view of changes before they are committed.
Link: https://cloud.google.com/deployment-manager/
Terraform is brought to you by HashiCorp, the makers of Vault and Nomad. Terraform differs from the above-mentioned tools in that it is not restricted to a specific cloud environment, which brings benefits when tackling complex distributed applications without being tied to a single platform. And much like Google Cloud Deployment Manager, Terraform also has a preview feature.
Link: https://www.terraform.io/
Chef is an ideal choice for those who favor CI/CD. At its heart, Chef utilizes self-described recipes, templates, and cookbooks; a collection of ready-made templates. Cookbooks allow for consistent configuration even as your infrastructure rapidly scales. All of this is wrapped up in a beautiful Ruby-based DSL pie.
Link: https://www.chef.io/products/chef-infra/
#tools #devops #devops 2020 #tech tools #tool selection #tool comparison
Hello everyone, today we will talk about Ansible, a fantastic software tool that allows you to automate cross-platform computer support in a simple but effective way.
Ansible is a tool that generates written instructions for automating IT professionals' work throughout the entire system infrastructure.
It's designed particularly for IT professionals who use it for application deployment, configuration management, intra-service orchestration, and practically anything else a systems administrator does on a weekly or daily basis.
Ansible is simple to install because it doesn't require any agent software or other security infrastructure.
While Ansible is at the cutting edge of automation, systems administration, and DevOps, it's also valuable as a tool for devs to use in their daily work.
Ansible allows you to set up not just one machine but a complete network of them all at once, and it doesn't require any programming knowledge.
Ansible connects to nodes on a network (clients, servers, and so on) and then sends a small program called an Ansible module to each node.
It then runs these modules through SSH and deletes them once they're done.
Your Ansible control node must have login access to the managed nodes for this interaction to work.
The most frequent method of authentication is SSH keys, but alternative methods are also allowed.
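As a quick sketch (the key type shown and the username@IP_Address placeholder are illustrative), key-based access from the control node to a managed node is typically set up with two commands. Generate a key pair on the control node:
ssh-keygen -t ed25519
Then copy the public key to the managed node:
ssh-copy-id username@IP_Address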
If you want to see how to install and start using Ansible, we'll cover that below.
Now let's take a look at Ansible's architecture and how it manages operations.
Plugins are supplementary pieces of code that enhance functionality, and you've probably used them in many other tools and platforms. You can use Ansible's built-in plugins or create your own.
Examples are connection plugins, lookup plugins, filter plugins, and callback plugins.
Modules are short programs that Ansible distributes to all nodes or remote hosts from a central control workstation. Modules control things like services and packages and can be executed via playbooks.
Ansible runs all of the modules needed to install updates or complete whatever operation is required and then removes them after they're done.
Ansible uses an inventory file to track which hosts are part of your infrastructure and then accesses them to perform commands and playbooks.
Ansible works in parallel with various systems in your infrastructure. It accomplishes this by selecting the systems listed in Ansible's inventory file, which is stored in /etc/ansible/hosts by default.
Once the inventory is registered, you can use a simple text file to assign variables to any of the hosts, and you may retrieve inventory from a variety of sources.
IT professionals can use Ansible playbooks to program applications, services, server nodes, and other devices without starting from scratch. Ansible playbooks, along with the conditions, variables, and tasks included within them, can be stored, shared, and reused forever.
Ansible playbooks function similarly to task manuals. They're simple YAML files, a human-readable data serialization language.
Playbooks are at the heart of what makes Ansible so popular. They specify activities that can be completed quickly without requiring the user to know or remember any specific syntax.
In a more extensive or more uniform system, Ansible may be a better fit. It also provides a set of modules for managing various methods and cloud infrastructure.
Modernization and digital transformation require automation that's both necessary and purposeful. We need a new management solution in today's dynamic contexts to increase speed, scale, and stability throughout IT infrastructure.
Technology is our most potent instrument for product improvement. Previously, accomplishing this required a significant amount of manual labor and intricate coordination. But today, Ansible - a simple yet powerful IT automation engine used by thousands of enterprises to simplify their setups and speed DevOps operations - is available.
Run the following commands to configure the PPA on your machine and install Ansible:
Update the repository:
sudo apt-get update
Install the software properties:
sudo apt-get install software-properties-common
Then add the Ansible PPA:
sudo apt-add-repository --yes --update ppa:ansible/ansible
And finally, install Ansible:
sudo apt-get install ansible
Once the command completes, Ansible and its dependencies will be installed.
Now that you have successfully installed Ansible, let's test if it's working by using the command below:
ansible --version
We'll use the command below to instruct Ansible to target all systems in an inline inventory consisting of just localhost, running the ping module from your local console (rather than over SSH).
ansible all -i localhost, --connection=local -m ping
You should get a response similar to what you can see below:
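A successful ping typically returns something like this (newer Ansible versions may include extra fields):
localhost | SUCCESS => {
    "changed": false,
    "ping": "pong"
}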
We'll make changes to the hosts file in /etc/ansible/hosts. This is the default file where Ansible looks for any defined hosts (and groups) on which the given commands should be executed remotely.
sudo nano /etc/ansible/hosts
Add the lines below to the file and save the modifications:
[local]
localhost
Execute this command with your adjusted inventory file:
ansible all --connection=local -m ping
The response should look similar to the ping output shown above.
Next, we'll deploy our Ansible test to a remote server, in this case a DigitalOcean droplet.
Use the command below to ssh into the server:
ssh username@IP_Address
Note: we have already configured an ssh key in our profile, which was selected when creating the droplet.
We will edit our hosts file in /etc/ansible/hosts using the command below:
sudo nano /etc/ansible/hosts
Add the lines below to the file and save the modifications:
[remote]
remote_test
[remote:vars]
ansible_host=IP_ADDRESS_OF_VIRTUAL_MACHINE
ansible_user=USERNAME
To see if Ansible can connect to your remote compute instance over SSH, let's type the following command:
ansible remote -m ping
We'll create an Ansible playbook using the command below. A playbook is the typical way of telling Ansible which commands to run on the remote server and in what order. It is written in YAML (a .yml file) and follows a strict format.
In the official Ansible documentation, you can learn more about playbooks.
nano my-playbook.yml
Add the following code, which tells Ansible to install Docker in several steps:
---
- name: install docker
  hosts: remote
  become_method: sudo
  become_user: root
  vars: #local variables
    docker_packages:
      - apt-transport-https
      - ca-certificates
      - curl
      - software-properties-common
  tasks:
    - name: Update apt packages
      become: true #make sure you execute the task with sudo privileges
      apt: #use apt module
        update_cache: yes #equivalent of apt-get update
    - name: Install packages needed for Docker
      become: true
      apt:
        name: "{{ docker_packages }}" #uses our declared variable docker_packages
        state: present #indicates the desired package state
        force_apt_get: yes #forces to use apt-get
    - name: Add official GPG key of Docker
      shell: curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
    - name: Save the current Ubuntu release version into a variable
      shell: lsb_release -cs
      register: ubuntu_version #output gets stored in this variable
    - name: Set right Docker directory
      become: true
      shell: add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu {{ ubuntu_version.stdout }} stable"
    - name: Update apt packages
      become: true
      apt:
        update_cache: yes
    - name: Install Docker
      become: true
      apt:
        name: docker-ce
        state: present
        force_apt_get: yes
    - name: Test Docker with hello world example
      become: true
      shell: docker run hello-world
      register: hello_world_output
    - name: Show output of hello world example
      debug: #use debug module
        msg: "Container Output: {{hello_world_output.stdout}}"
We can now execute it with the command below:
ansible-playbook my-playbook.yml -l remote
After that, we'll see some magic happen (it might take a while), and somewhere in the last debug message in our terminal, we should see "Hello from Docker!"
In this article, we had a detailed look into Ansible, its benefits, how it works and what it can do, its architecture, plugins, playbook, inventory, and how to configure and deploy Docker with Ansible on a remote server.
Thank you for reading!