Because objects in JavaScript are reference values, you can't simply copy them using =. But no worries, here are 3 ways for you to clone an object 👍
const food = { beef: '🥩', bacon: '🥓' }
// "Spread"
{ ...food }
// "Object.assign"
Object.assign({}, food)
// "JSON"
JSON.parse(JSON.stringify(food))
// RESULT:
// { beef: '🥩', bacon: '🥓' }
Your first question might be, why can't I just use =? Let's see what happens if we do that:
const obj = {one: 1, two: 2};
const obj2 = obj;
console.log(
obj, // {one: 1, two: 2};
obj2 // {one: 1, two: 2};
)
So far, both objects seem to output the same thing. So no problem, right? But let's see what happens if we edit our second object:
obj2.three = 3;
console.log(obj2);
// {one: 1, two: 2, three: 3}; <-- ✅
console.log(obj);
// {one: 1, two: 2, three: 3}; <-- 😱
WTH?! I changed obj2, so why was obj also affected? That's because objects are reference types. When you use =, it copies the pointer to the memory space the object occupies. Reference types don't hold values; they are a pointer to the value in memory.
If you want to learn more about this, check out Gordon Zhu's Watch and Code course. It's free to enroll and watch the video "Comparison with objects". He gives a super awesome explanation of it.
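Here's a tiny sketch (my own addition, not from the course) that shows the pointer behaviour:
const a = { one: 1 };
const b = a; // copies the reference, not the value
const c = { one: 1 }; // a brand new object with the same shape
console.log(a === b); // true <-- same pointer
console.log(a === c); // false <-- different object in memory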
Using spread will clone your object. Note this will be a shallow copy. As of this post, the spread operator for object literals is a Stage 4 proposal, so it hasn't landed in an official ECMAScript release yet. If you need to support older environments, you would compile it with Babel (or something similar).
const food = { beef: '🥩', bacon: '🥓' };
const cloneFood = { ...food };
console.log(cloneFood);
// { beef: '🥩', bacon: '🥓' }
Alternatively, Object.assign is part of the official spec and also creates a shallow copy of the object.
const food = { beef: '🥩', bacon: '🥓' };
const cloneFood = Object.assign({}, food);
console.log(cloneFood);
// { beef: '🥩', bacon: '🥓' }
This final way will give you a deep copy. Now I will mention, this is a quick and dirty way of deep cloning an object. For a more robust solution, I would recommend using something like lodash's cloneDeep.
const food = { beef: '🥩', bacon: '🥓' };
const cloneFood = JSON.parse(JSON.stringify(food))
console.log(cloneFood);
// { beef: '🥩', bacon: '🥓' }
Here’s a comment from the community. Yes, it was for my previous post, How to Deep Clone an Array. But the idea still applies to objects.
Alfredo Salzillo: I’d like you to note that there are some differences between deepClone and JSON.stringify/parse.
Here’s an example:
const lodashClonedeep = require("lodash.clonedeep");
const arrOfFunction = [() => 2, {
test: () => 3,
}, Symbol('4')];
// deepClone copies functions and Symbols by reference
console.log(lodashClonedeep(arrOfFunction));
// JSON replaces functions with null and functions in objects with undefined
console.log(JSON.parse(JSON.stringify(arrOfFunction)));
// function and symbol are copied by reference in deepClone
console.log(lodashClonedeep(arrOfFunction)[0] === lodashClonedeep(arrOfFunction)[0]);
console.log(lodashClonedeep(arrOfFunction)[2] === lodashClonedeep(arrOfFunction)[2]);
@OlegVaraksin: The JSON method has trouble with circular dependencies. Furthermore, the order of properties in the cloned object may be different.
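To see that circular-dependency problem in action, here's a small sketch of my own (reusing lodashClonedeep from the snippet above):
const circular = { name: 'loop' };
circular.self = circular; // the object now references itself
JSON.parse(JSON.stringify(circular));
// TypeError: Converting circular structure to JSON
console.log(lodashClonedeep(circular).self.name);
// 'loop' <-- cloneDeep handles the cycle just fine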
When I use spread ... to copy an object, I'm only creating a shallow copy. If the object is nested or multi-dimensional, it won't work. Here's the example we will be using:
const nestedObject = {
country: '🇨🇦',
detail: {
city: 'vancouver'
}
};
Let’s clone our object using spread:
const shallowClone = { ...nestedObject };
// Change our cloned object
shallowClone.country = '🇹🇼';
shallowClone.detail.city = 'taipei';
So we changed our cloned object's country and its nested city. Let's see the output.
console.log(shallowClone);
// { country: '🇹🇼', detail: { city: 'taipei' } } <-- ✅
console.log(nestedObject);
// { country: '🇨🇦', detail: { city: 'taipei' } } <-- 😱
A shallow copy means the first level is copied, deeper levels are referenced.
Let's take the same example, starting from a fresh nestedObject, but apply a deep copy using "JSON":
const deepClone = JSON.parse(JSON.stringify(nestedObject));
deepClone.country = '🇹🇼';
deepClone.detail.city = 'taipei';
console.log(deepClone);
// { country: '🇹🇼', detail: { city: 'taipei' } } <-- ✅
console.log(nestedObject);
// { country: '🇨🇦', detail: { city: 'vancouver' } } <-- ✅
As you can see, the deep copy is a true copy for nested objects. Often a shallow copy is good enough and you don't really need a deep copy. It's like a nail gun vs a hammer. Most of the time the hammer is perfectly fine. Using a nail gun for some small arts and crafts is often overkill; a hammer is just fine. It's all about using the right tool for the right job 🤓
Unfortunately, I can't write a test for spread because it's not officially in the spec yet. Nevertheless, I included it in the test so you can run it in the future 😝. But the result shows Object.assign is a lot faster than JSON.
@d9el: It’s important to note that Object.assign is a function which modifies and returns the target object. In Samantha’s example using the following,
const cloneFood = Object.assign({}, food)
{} is the object that is modified. The target object is not referenced by any variable at that point, but because Object.assign returns the target object, we are able to store the resulting assigned object into the cloneFood variable. We could switch our example up and use the following:
const food = { beef: '🌽', bacon: '🥓' };
Object.assign(food, { beef: '🥩' });
console.log(food);
// { beef: '🥩', bacon: '🥓' }
Obviously, the value of beef in our food object is wrong, so we can assign the correct value of beef using Object.assign. We aren't actually using the returned value of the function at all, but we are modifying our target object, which we have referenced with the const food.
Spread, on the other hand, is an operator which copies properties of one object into a new object. If we wanted to replicate the above example using spread to modify our variable food...
const food = { beef: '🌽', bacon: '🥓' };
food = {
...food,
beef: '🥩',
}
// TypeError: invalid assignment to const `food'
...
we get an error, because we use spread when creating new objects, and are therefore assigning a whole new object to food, which was declared with const, which is illegal. So we can either choose to declare a new variable to hold our new object, like the following:
const food = { beef: '🌽', bacon: '🥓' };
const newFood = {
...food,
beef: '🥩',
}
console.log(newFood);
// { beef: '🥩', bacon: '🥓' }
or we could declare food with let or var, which would allow us to assign a whole new object:
let food = { beef: '🌽', bacon: '🥓' };
food = {
...food,
beef: '🥩',
}
console.log(food);
// { beef: '🥩', bacon: '🥓' }
Thanks: @d9el
Here are a few more ways to clone an object that came up in the community:
$.extend({}, food) (jQuery)
_.clone(food) (Lodash/Underscore)
Object.fromEntries(Object.entries(food))
All of these create a shallow clone of the object.
How can I find the correct ulimit values for a user account or process on Linux systems?
For proper operation, we must ensure that the correct ulimit values are set after installing various software. The Linux system provides a means of restricting the amount of resources that can be used. Limits are set for each Linux user account. However, system limits are also applied separately to each process running for that user. For example, if certain thresholds are too low, the system might not be able to serve web pages using Nginx/Apache or a PHP/Python app. System resource limits can be viewed or set with the ulimit command. Let us see how to use ulimit, which provides control over the resources available to the shell and to processes.
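For example, here's a quick sketch of inspecting and adjusting limits for the current shell (exact values depend on your system):
$ ulimit -a # list all current soft limits for this shell
$ ulimit -n # show the maximum number of open file descriptors
$ ulimit -n 4096 # raise the open-file soft limit for the current shell session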
We are going to build a full stack Todo App using the MEAN stack (MongoDB, ExpressJS, AngularJS and NodeJS). This is the last part of a three-post tutorial series.
MEAN Stack tutorial series:
AngularJS tutorial for beginners (Part I)
Creating RESTful APIs with NodeJS and MongoDB Tutorial (Part II)
MEAN Stack Tutorial: MongoDB, ExpressJS, AngularJS and NodeJS (Part III) 👈 you are here
Before completing the app, let's cover some background about this stack. If you'd rather jump to the hands-on part, click here to get started.
The integrated debugger in Visual Studio Code makes it easy to debug ASP.NET Core and .NET Core applications. The debugger also supports remote debugging, so .NET Core programs running inside a Docker container, for example, can be debugged as well.
The default MVC template created with dotnet new mvc is enough as a sample application:
$ md docker-core-debugger
$ cd docker-core-debugger
$ dotnet new mvc
With dotnet run we briefly check that the application starts and is reachable at http://localhost:5000.
$ dotnet run
$ Hosting environment: Production
$ Content root path: D:\Temp\docker-aspnetcore
$ Now listening on: http://localhost:5000
We build the .NET Core application with dotnet build and publish everything with dotnet publish.
$ dotnet build
$ dotnet publish -c Debug -o out --runtime linux-x64
Note that the build configuration is set to -c Debug and the output directory to -o out; otherwise Docker won't find the required binaries. For the Docker container we now need a Dockerfile that installs the .NET Core command line debugger (VSDBG) up front. The installation script for VSDBG is available at https://aka.ms/getvsdbgsh.
FROM microsoft/aspnetcore:latest
WORKDIR /app
RUN apt-get update \
&& apt-get install -y --no-install-recommends \
unzip procps \
&& rm -rf /var/lib/apt/lists/* \
&& curl -sSL https://aka.ms/getvsdbgsh | bash /dev/stdin -v latest -l /vsdbg
COPY ./out .
ENTRYPOINT ["dotnet", "docker-core-debugger.dll"]
We build the Docker image with the docker build command
$ docker build -t coreapp .
and start the application with docker run.
$ docker run -d -p 8080:80 --name coreapp coreapp
Now Visual Studio Code just needs to know where our application is running. For that we define a launch.json of type attach and configure the parameters the debugger needs.
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": ".NET Core Remote Attach",
      "type": "coreclr",
      "request": "attach",
      "processId": "${command:pickRemoteProcess}",
      "pipeTransport": {
        "pipeProgram": "docker",
        "pipeArgs": ["exec", "-i coreapp ${debuggerCommand}"],
        "quoteArgs": false,
        "debuggerPath": "/vsdbg/vsdbg",
        "pipeCwd": "${workspaceRoot}"
      },
      "logging": {
        "engineLogging": true,
        "exceptions": true,
        "moduleLoad": true,
        "programOutput": true
      }
    }
  ]
}
We start the debugger with F5. If everything works, a list of the processes running in the Docker container should appear.
Now select the dotnet process. The Visual Studio Code debugger then connects to VSDBG and we can debug our code as usual. To try it, we set a breakpoint in the Index action of the HomeController and open http://localhost:8080/ in the browser.
CentOS Linux 8.2 (2004) has been released. It is a Linux distribution derived from RHEL (Red Hat Enterprise Linux) 8.2 source code. CentOS was created when Red Hat stopped providing RHEL for free. CentOS 8.2 gives complete control of its open-source software packages and is fully customizable for research needs or for running a high-performance website without the need for license fees. Let us see what's new in CentOS 8.2 (2004) and how to upgrade an existing CentOS 8.1.1911 server to 8.2.2004 using the command line.
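Sketching the usual dnf workflow (take backups first), the upgrade itself boils down to:
$ sudo dnf clean all
$ sudo dnf update
$ sudo reboot
$ cat /etc/centos-release
CentOS Linux release 8.2.2004 (Core)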
How do I configure Amazon SES with the Postfix mail server to send email from a CentOS/RHEL/Fedora/Ubuntu/Debian Linux server?
Amazon Simple Email Service (SES) is a hosted email service for you to send and receive email using your email addresses and domains. Typically SES is used for sending bulk email or routing emails without hosting an MTA. We can use Perl/Python/PHP APIs to send email via SES. Another option is to configure a Linux or Unix box running Postfix to route all outgoing emails via SES.
Before getting started with Amazon SES and Postfix, you need to sign up for AWS, including SES. You need to verify your email address and other settings. Make sure you create a user for SES access and download the credentials too.
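Identity verification can also be started from the AWS CLI if you have it installed and configured (the address below is just a placeholder):
$ aws ses verify-email-identity --email-address you@example.com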
If sendmail is installed, remove it. Debian/Ubuntu Linux users type the following apt command/apt-get command:
$ sudo apt --purge remove sendmail
CentOS/RHEL users type the following yum command, or dnf command on Fedora/CentOS/RHEL 8.x:
$ sudo yum remove sendmail
$ sudo dnf remove sendmail
Sample outputs from CentOS 8 server:
Dependencies resolved.
===============================================================================
Package Architecture Version Repository Size
===============================================================================
Removing:
sendmail x86_64 8.15.2-32.el8 @AppStream 2.4 M
Removing unused dependencies:
cyrus-sasl x86_64 2.1.27-1.el8 @BaseOS 160 k
procmail x86_64 3.22-47.el8 @AppStream 369 k
Transaction Summary
===============================================================================
Remove 3 Packages
Freed space: 2.9 M
Is this ok [y/N]: y
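With sendmail removed, the next step is pointing Postfix at the SES SMTP endpoint. Here is a minimal sketch of the relevant /etc/postfix/main.cf settings, assuming the us-east-1 endpoint; adjust the region and file paths for your setup:
relayhost = [email-smtp.us-east-1.amazonaws.com]:587
smtp_sasl_auth_enable = yes
smtp_sasl_security_options = noanonymous
smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd
smtp_use_tls = yes
smtp_tls_security_level = encrypt
smtp_tls_note_starttls_offer = yes
Put your SES SMTP credentials in /etc/postfix/sasl_passwd in the form "[email-smtp.us-east-1.amazonaws.com]:587 USERNAME:PASSWORD", then hash the file and restart Postfix:
$ sudo postmap hash:/etc/postfix/sasl_passwd
$ sudo systemctl restart postfix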