Archie Clayton

Creating a custom shader in Three.js

3D stuff in the browser is awesome. After playing around with Three.js for some time and making a mini-game at school, I started to like it a lot. A classmate who is really into graphics programming told me a little bit about WebGL and shaders. It seemed really cool and I promised myself I would make my own shader. Of course, some other shiny thing caught my attention and I forgot about it, but from today on I can finally say that I have created a shader and used it within Three.js.

Three JS

Before going all in on shaders, it is probably a good idea to explain what Three.js is. Three.js is a JavaScript library that eases the process of creating 3D scenes on a canvas. Other popular solutions like A-Frame and Whitestorm.js are built on top of it. If you have ever played around with those but want even more control, definitely try it out! (If you are a TypeScript lover, Three.js has type definitions 😉).

The most popular intro to this library is creating a cube and making it spin. There is a written tutorial in the Three.js documentation and a brilliant YouTube tutorial by CJ Gammon that is part of his 'diving in: three js' series.


Creating this cube is basically like preparing a film set and placing an object inside of it. You create a scene and a camera and pass these to a renderer to say: "hey, this is my movie set". Then you can place a mesh, which is basically an object, within the scene. This mesh consists of a geometry (the shape of the object) and a material (the color, its behavior towards light, and more). Based on the material you have chosen, you might want to add different kinds of lights to the scene. To animate the object and actually display everything, you create a loop. Within this loop you tell the renderer to show the scene. Your code might look like this:

window.addEventListener('load', init)

let scene
let camera
let renderer
let sceneObjects = []

function init() {
  scene = new THREE.Scene()

  camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000)
  camera.position.z = 5

  renderer = new THREE.WebGLRenderer()
  renderer.setSize(window.innerWidth, window.innerHeight)
  document.body.appendChild(renderer.domElement)

  adjustLighting()
  addBasicCube()
  animationLoop()
}

function adjustLighting() {
  let pointLight = new THREE.PointLight(0xdddddd)
  pointLight.position.set(-5, -3, 3)
  scene.add(pointLight)

  let ambientLight = new THREE.AmbientLight(0x505050)
  scene.add(ambientLight)
}

function addBasicCube() {
  let geometry = new THREE.BoxGeometry(1, 1, 1)
  let material = new THREE.MeshLambertMaterial()

  let mesh = new THREE.Mesh(geometry, material)
  mesh.position.x = -2
  scene.add(mesh)
  sceneObjects.push(mesh)
}

function animationLoop() {
  renderer.render(scene, camera)

  for (let object of sceneObjects) {
    object.rotation.x += 0.01
    object.rotation.y += 0.03
  }

  requestAnimationFrame(animationLoop)
}

Shaders

Shaders are basically functions or small scripts that are executed by the GPU. This is where WebGL and GLSL (OpenGL Shading Language) come into play. WebGL is a browser API that allows JavaScript to run code on the GPU. This can increase the performance of certain scripts because your GPU is optimized for doing graphics-related calculations. WebGL even allows us to write code that will be executed directly by the GPU, in the GLSL language. These pieces of GLSL code are our shaders, and since Three.js has a WebGL renderer we can write shaders to modify our mesh. In Three.js you can create a custom material by using the ShaderMaterial. This material accepts two shaders: a vertex shader and a fragment shader. Let's try to make a 'gradient material'.

Vertex Shader

A vertex shader is a function that is applied to every vertex (point) of a mesh. It is usually used to distort or animate the shape of a mesh. Within our script it looks something like this:

function vertexShader() {
  return `
    varying vec3 vUv;

    void main() {
      vUv = position;

      vec4 modelViewPosition = modelViewMatrix * vec4(position, 1.0);
      gl_Position = projectionMatrix * modelViewPosition;
    }
  `
}

The first thing that you probably notice is that all our GLSL code is in a string. We do this because WebGL will pass this piece of code to our GPU, and we have to hand the code to WebGL from within JavaScript. The second thing you might notice is that we are using variables that we did not create. This is because Three.js passes those variables to the GPU for us.

Within this piece of code we calculate where the points of our mesh should be placed. We do this by calculating where the points are in the scene, multiplying the position of the mesh in the scene (modelViewMatrix) with the position of the point. After that we multiply this value with the camera's relation to the scene (projectionMatrix), so the camera settings within Three.js are respected by our shader. gl_Position is the value that the GPU takes to draw our points.

Right now this vertex shader doesn't change anything about our shape. So why even bother creating it at all? We will need the positions of parts of our mesh to create a nice gradient. By creating a 'varying' variable we can pass the position on to another shader.

Fragment shader

A fragment shader is a function that is applied to every fragment of our mesh. A fragment is the result of a process called rasterization, which turns the entire mesh into a collection of triangles. For every pixel that is covered by our mesh there will be at least one fragment. The fragment shader is usually used to do color transformations on pixels. Our fragment shader looks like this:

function fragmentShader() {
  return `
    uniform vec3 colorA;
    uniform vec3 colorB;
    varying vec3 vUv;

    void main() {
      gl_FragColor = vec4(mix(colorA, colorB, vUv.z), 1.0);
    }
  `
}

As you can see, we take the value of the position that was passed in by the vertex shader. We want to apply a mix of colors A and B based on the position of the fragment on the z axis of our mesh. But where do colors A and B come from? These are 'uniform' variables, which means they are passed into the shader from the outside. The mix function calculates the RGB value we want to draw for this fragment. This color, plus an additional value for the opacity, is passed to gl_FragColor. Our GPU will set the color of the fragment to this value.
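GLSL's mix is plain linear interpolation, applied component-wise to vectors. A quick JavaScript sketch of what the GPU computes per fragment (the function names and the array representation of a vec3 are mine, for illustration only):

```javascript
// The same math as GLSL's mix(a, b, t): a * (1 - t) + b * t
function mix(a, b, t) {
  return a * (1 - t) + b * t;
}

// Applied component-wise to an RGB triple, like mix does for a vec3:
function mixColor(colorA, colorB, t) {
  return colorA.map((channel, i) => mix(channel, colorB[i], t));
}

// Halfway between black and white is mid-grey:
console.log(mixColor([0, 0, 0], [1, 1, 1], 0.5)); // [0.5, 0.5, 0.5]
```

So a fragment at one end of the z axis gets colorA, a fragment at the other end gets colorB, and everything in between blends smoothly.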

Creating the material

Now that we’ve created the shaders we can finally build our threejs mesh with a custom material.

function addExperimentalCube() {
  let uniforms = {
    colorB: {type: 'vec3', value: new THREE.Color(0xACB6E5)},
    colorA: {type: 'vec3', value: new THREE.Color(0x74ebd5)}
  }

  let geometry = new THREE.BoxGeometry(1, 1, 1)
  let material = new THREE.ShaderMaterial({
    uniforms: uniforms,
    fragmentShader: fragmentShader(),
    vertexShader: vertexShader(),
  })

  let mesh = new THREE.Mesh(geometry, material)
  mesh.position.x = 2
  scene.add(mesh)
  sceneObjects.push(mesh)
}

This is where everything comes together. Our 'uniforms' colorA and colorB are created and passed, along with the vertex shader and fragment shader, into the ShaderMaterial. The material and geometry are used to create a mesh, and the mesh is added to the scene.
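A nice property of this setup is that the material keeps a reference to the uniforms object rather than a copy, so mutating the values later (for example, once per frame in the animation loop) is how you would animate the material. A minimal sketch of that idea, with plain objects standing in for THREE.Color and THREE.ShaderMaterial so it runs without three.js (the updateColors helper and the object shapes are illustrative, not three.js API):

```javascript
// Plain-object stand-ins, just to show that changes to the uniforms
// object remain visible through the material after creation.
const uniforms = {
  colorA: { value: { r: 0.5, g: 0.9, b: 0.8 } },
  colorB: { value: { r: 0.7, g: 0.7, b: 0.9 } },
};

const material = { uniforms }; // a reference, not a copy

// Something you might call once per frame to shift colorA toward red:
function updateColors(material) {
  const c = material.uniforms.colorA.value;
  c.r = Math.min(1, c.r + 0.25);
}

updateColors(material);
console.log(uniforms.colorA.value.r); // 0.75 — visible through both references
```

With the real ShaderMaterial the idea is the same: change `uniforms.colorA.value` between frames and the shader picks up the new color on the next render.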



I built this in Glitch. A friend recommended it and it is great! Some ad blockers prevent the embed from loading though, so here is a direct link just in case.

The left cube uses MeshLambertMaterial; the right cube uses our own 'gradient material'. As you can see, our material looks pretty sweet but ignores the light settings in the scene. This is because we didn't do the math in our fragment shader to take the light into account. That is hopefully something I will figure out soon 😝.

Resources

It took some time to figure this out, and if you liked this you should really check out the sources I used to learn and understand it:


#javascript #three-js


NBB: Ad-hoc CLJS Scripting on Node.js

Nbb

Not babashka. Node.js babashka!?

Ad-hoc CLJS scripting on Node.js.

Status

Experimental. Please report issues here.

Goals and features

Nbb's main goal is to make it easy to get started with ad hoc CLJS scripting on Node.js.

Additional goals and features are:

  • Fast startup without relying on a custom version of Node.js.
  • Small artifact (current size is around 1.2MB).
  • First class macros.
  • Support building small TUI apps using Reagent.
  • Complement babashka with libraries from the Node.js ecosystem.

Requirements

Nbb requires Node.js v12 or newer.

How does this tool work?

CLJS code is evaluated through SCI, the same interpreter that powers babashka. Because SCI works with advanced compilation, the bundle size, especially when combined with other dependencies, is smaller than what you get with self-hosted CLJS. That makes startup faster. The trade-off is that execution is less performant and that only a subset of CLJS is available (e.g. no deftype, yet).

Usage

Install nbb from NPM:

$ npm install nbb -g

Omit -g for a local install.

Try out an expression:

$ nbb -e '(+ 1 2 3)'
6

And then install some other NPM libraries to use in the script. E.g.:

$ npm install csv-parse shelljs zx

Create a script which uses the NPM libraries:

(ns script
  (:require ["csv-parse/lib/sync$default" :as csv-parse]
            ["fs" :as fs]
            ["path" :as path]
            ["shelljs$default" :as sh]
            ["term-size$default" :as term-size]
            ["zx$default" :as zx]
            ["zx$fs" :as zxfs]
            [nbb.core :refer [*file*]]))

(prn (path/resolve "."))

(prn (term-size))

(println (count (str (fs/readFileSync *file*))))

(prn (sh/ls "."))

(prn (csv-parse "foo,bar"))

(prn (zxfs/existsSync *file*))

(zx/$ #js ["ls"])

Call the script:

$ nbb script.cljs
"/private/tmp/test-script"
#js {:columns 216, :rows 47}
510
#js ["node_modules" "package-lock.json" "package.json" "script.cljs"]
#js [#js ["foo" "bar"]]
true
$ ls
node_modules
package-lock.json
package.json
script.cljs

Macros

Nbb has first class support for macros: you can define them right inside your .cljs file, like you are used to from JVM Clojure. Consider the plet macro to make working with promises more palatable:

(defmacro plet
  [bindings & body]
  (let [binding-pairs (reverse (partition 2 bindings))
        body (cons 'do body)]
    (reduce (fn [body [sym expr]]
              (let [expr (list '.resolve 'js/Promise expr)]
                (list '.then expr (list 'clojure.core/fn (vector sym)
                                        body))))
            body
            binding-pairs)))

Using this macro we can make async code look more like sync code. Consider this puppeteer example:

(-> (.launch puppeteer)
      (.then (fn [browser]
               (-> (.newPage browser)
                   (.then (fn [page]
                            (-> (.goto page "https://clojure.org")
                                (.then #(.screenshot page #js{:path "screenshot.png"}))
                                (.catch #(js/console.log %))
                                (.then #(.close browser)))))))))

Using plet this becomes:

(plet [browser (.launch puppeteer)
       page (.newPage browser)
       _ (.goto page "https://clojure.org")
       _ (-> (.screenshot page #js{:path "screenshot.png"})
             (.catch #(js/console.log %)))]
      (.close browser))

See the puppeteer example for the full code.

Since v0.0.36, nbb includes promesa which is a library to deal with promises. The above plet macro is similar to promesa.core/let.

Startup time

$ time nbb -e '(+ 1 2 3)'
6
nbb -e '(+ 1 2 3)'   0.17s  user 0.02s system 109% cpu 0.168 total

The baseline startup time for a script is about 170ms on my laptop. When invoked via npx this adds another 300ms or so, so for faster startup either use a globally installed nbb or use $(npm bin)/nbb script.cljs to bypass npx.

Dependencies

NPM dependencies

Nbb does not depend on any NPM dependencies. All NPM libraries loaded by a script are resolved relative to that script. When using the Reagent module, React is resolved in the same way as any other NPM library.

Classpath

To load .cljs files from local paths or dependencies, you can use the --classpath argument. The current dir is added to the classpath automatically. So if there is a file foo/bar.cljs relative to your current dir, then you can load it via (:require [foo.bar :as fb]). Note that nbb uses the same naming conventions for namespaces and directories as other Clojure tools: foo-bar in the namespace name becomes foo_bar in the directory name.

To load dependencies from the Clojure ecosystem, you can use the Clojure CLI or babashka to download them and produce a classpath:

$ classpath="$(clojure -A:nbb -Spath -Sdeps '{:aliases {:nbb {:replace-deps {com.github.seancorfield/honeysql {:git/tag "v2.0.0-rc5" :git/sha "01c3a55"}}}}}')"

and then feed it to the --classpath argument:

$ nbb --classpath "$classpath" -e "(require '[honey.sql :as sql]) (sql/format {:select :foo :from :bar :where [:= :baz 2]})"
["SELECT foo FROM bar WHERE baz = ?" 2]

Currently nbb only reads from directories, not jar files, so you are encouraged to use git libs. Support for .jar files will be added later.

Current file

The name of the file that is currently being executed is available via nbb.core/*file* or on the metadata of vars:

(ns foo
  (:require [nbb.core :refer [*file*]]))

(prn *file*) ;; "/private/tmp/foo.cljs"

(defn f [])
(prn (:file (meta #'f))) ;; "/private/tmp/foo.cljs"

Reagent

Nbb includes reagent.core which will be lazily loaded when required. You can use this together with ink to create a TUI application:

$ npm install ink

ink-demo.cljs:

(ns ink-demo
  (:require ["ink" :refer [render Text]]
            [reagent.core :as r]))

(defonce state (r/atom 0))

(doseq [n (range 1 11)]
  (js/setTimeout #(swap! state inc) (* n 500)))

(defn hello []
  [:> Text {:color "green"} "Hello, world! " @state])

(render (r/as-element [hello]))

Promesa

Working with callbacks and promises can become tedious. Since nbb v0.0.36 the promesa.core namespace is included with the let and do! macros. An example:

(ns prom
  (:require [promesa.core :as p]))

(defn sleep [ms]
  (js/Promise.
   (fn [resolve _]
     (js/setTimeout resolve ms))))

(defn do-stuff
  []
  (p/do!
   (println "Doing stuff which takes a while")
   (sleep 1000)
   1))

(p/let [a (do-stuff)
        b (inc a)
        c (do-stuff)
        d (+ b c)]
  (prn d))
$ nbb prom.cljs
Doing stuff which takes a while
Doing stuff which takes a while
3

Also see API docs.

Js-interop

Since nbb v0.0.75 applied-science/js-interop is available:

(ns example
  (:require [applied-science.js-interop :as j]))

(def o (j/lit {:a 1 :b 2 :c {:d 1}}))

(prn (j/select-keys o [:a :b])) ;; #js {:a 1, :b 2}
(prn (j/get-in o [:c :d])) ;; 1

Most of this library is supported in nbb, except the following:

  • destructuring using :syms
  • property access using .-x notation. In nbb, you must use keywords.

See the example of what is currently supported.

Examples

See the examples directory for small examples.

Also check out these projects built with nbb:

API

See API documentation.

Migrating to shadow-cljs

See this gist on how to convert an nbb script or project to shadow-cljs.

Build

Prerequisites:

  • babashka >= 0.4.0
  • Clojure CLI >= 1.10.3.933
  • Node.js 16.5.0 (a lower version may work, but this is the one I used to build)

To build:

  • Clone and cd into this repo
  • bb release

Run bb tasks for more project-related tasks.

Download Details:
Author: borkdude
Official Website: https://github.com/borkdude/nbb 
License: EPL-1.0

#node #javascript


