Semantic segmentation is the problem of detecting and delineating each object of interest appearing in an image. Currently, there are several approaches that solve this problem and produce results as seen below.
Figure 1: Semantic segmentation example (Source)
This kind of segmentation predicts a class for every pixel in the image and is also known as Dense Prediction. It is important to note that instances of the same class are not separated; the model only cares about each pixel's category. As shown in Figure 1, the method can say that there are chairs at certain positions, but it cannot distinguish one chair from another.
One of the main applications of this technique is in autonomous vehicles, where cars need to understand their environment. Semantic segmentation assigns meaning to the scene and puts the car in context: it indicates the lane position, flags obstructions such as fallen trees or pedestrians crossing the road, and recognizes other cars.
Video 1: Example of Semantic Segmentation for Autonomous Driving
Therefore, applying semantic segmentation algorithms to urban street scenes is one of the main computer vision challenges today. A popular dataset for evaluating model performance is Cityscapes, which contains 30 classes collected from 50 different cities under varying seasons and weather conditions.
Figure 2 shows how different algorithms have reached the state of the art in this dataset over time.
Figure 2: Semantic Segmentation on Cityscapes dataset
In this project, we used a mid-level model that delivers reasonable precision and runs in real time. RefineNet [2] was first introduced at the end of 2016 by researchers at The University of Adelaide and was converted to a lightweight model in 2018 [3], allowing real-time inference.
#tensorflowjs #machine-learning #artificial-intelligence #tensorflow
Not babashka. Node.js babashka!?
Ad-hoc CLJS scripting on Node.js.
Experimental. Please report issues here.
Nbb's main goal is to make it easy to get started with ad hoc CLJS scripting on Node.js.
Additional goals and features are:
Nbb requires Node.js v12 or newer.
CLJS code is evaluated through SCI, the same interpreter that powers babashka. Because SCI works with advanced compilation, the bundle size, especially when combined with other dependencies, is smaller than what you get with self-hosted CLJS. That makes startup faster. The trade-off is that execution is less performant and that only a subset of CLJS is available (e.g. no deftype, yet).
Install nbb from NPM:
$ npm install nbb -g
Omit -g for a local install.
Try out an expression:
$ nbb -e '(+ 1 2 3)'
6
And then install some other NPM libraries to use in the script. E.g.:
$ npm install csv-parse shelljs zx
Create a script which uses the NPM libraries:
(ns script
  (:require ["csv-parse/lib/sync$default" :as csv-parse]
            ["fs" :as fs]
            ["path" :as path]
            ["shelljs$default" :as sh]
            ["term-size$default" :as term-size]
            ["zx$default" :as zx]
            ["zx$fs" :as zxfs]
            [nbb.core :refer [*file*]]))
(prn (path/resolve "."))
(prn (term-size))
(println (count (str (fs/readFileSync *file*))))
(prn (sh/ls "."))
(prn (csv-parse "foo,bar"))
(prn (zxfs/existsSync *file*))
(zx/$ #js ["ls"])
Call the script:
$ nbb script.cljs
"/private/tmp/test-script"
#js {:columns 216, :rows 47}
510
#js ["node_modules" "package-lock.json" "package.json" "script.cljs"]
#js [#js ["foo" "bar"]]
true
$ ls
node_modules
package-lock.json
package.json
script.cljs
Nbb has first class support for macros: you can define them right inside your .cljs file, like you are used to from JVM Clojure. Consider the plet macro to make working with promises more palatable:
(defmacro plet
  [bindings & body]
  (let [binding-pairs (reverse (partition 2 bindings))
        body (cons 'do body)]
    (reduce (fn [body [sym expr]]
              (let [expr (list '.resolve 'js/Promise expr)]
                (list '.then expr (list 'clojure.core/fn (vector sym)
                                        body))))
            body
            binding-pairs)))
Using this macro, we can make async code look more like sync code. Consider this puppeteer example:
(-> (.launch puppeteer)
    (.then (fn [browser]
             (-> (.newPage browser)
                 (.then (fn [page]
                          (-> (.goto page "https://clojure.org")
                              (.then #(.screenshot page #js{:path "screenshot.png"}))
                              (.catch #(js/console.log %))
                              (.then #(.close browser)))))))))
Using plet this becomes:
(plet [browser (.launch puppeteer)
       page (.newPage browser)
       _ (.goto page "https://clojure.org")
       _ (-> (.screenshot page #js{:path "screenshot.png"})
             (.catch #(js/console.log %)))]
  (.close browser))
See the puppeteer example for the full code.
Since v0.0.36, nbb includes promesa, which is a library to deal with promises. The above plet macro is similar to promesa.core/let.
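For comparison, the same puppeteer flow could be written with promesa.core/let; the following is only a minimal sketch (not the project's own example) and assumes puppeteer is required as in the snippets above:
;; sketch: assumes (:require [promesa.core :as p]) and a `puppeteer` binding,
;; as in the earlier examples
(p/let [browser (.launch puppeteer)
        page (.newPage browser)
        _ (.goto page "https://clojure.org")
        _ (-> (.screenshot page #js{:path "screenshot.png"})
              (.catch #(js/console.log %)))]
  (.close browser))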
$ time nbb -e '(+ 1 2 3)'
6
nbb -e '(+ 1 2 3)' 0.17s user 0.02s system 109% cpu 0.168 total
The baseline startup time for a script is about 170ms on my laptop. When invoked via npx this adds another 300ms or so, so for faster startup, either use a globally installed nbb or use $(npm bin)/nbb script.cljs to bypass npx.
Nbb does not depend on any NPM dependencies. All NPM libraries loaded by a script are resolved relative to that script. When using the Reagent module, React is resolved in the same way as any other NPM library.
To load .cljs files from local paths or dependencies, you can use the --classpath argument. The current dir is added to the classpath automatically. So if there is a file foo/bar.cljs relative to your current dir, then you can load it via (:require [foo.bar :as fb]). Note that nbb uses the same naming conventions for namespaces and directories as other Clojure tools: foo-bar in the namespace name becomes foo_bar in the directory name.
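As a minimal sketch of both rules (the function names and the my-lib.core namespace are made up for illustration; foo.bar and the fb alias come from the text above):
;; foo/bar.cljs, relative to the current dir
(ns foo.bar)
(defn greet [] (println "hello from foo.bar"))

;; my_lib/core.cljs -- the dashed namespace my-lib.core lives in my_lib/core.cljs
(ns my-lib.core)
(defn shout [] (println "HELLO from my-lib.core"))

;; script.cljs
(ns script
  (:require [foo.bar :as fb]
            [my-lib.core :as ml]))
(fb/greet)
(ml/shout)
Running nbb script.cljs from that directory should print both greetings, since the current dir is already on the classpath.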
To load dependencies from the Clojure ecosystem, you can use the Clojure CLI or babashka to download them and produce a classpath:
$ classpath="$(clojure -A:nbb -Spath -Sdeps '{:aliases {:nbb {:replace-deps {com.github.seancorfield/honeysql {:git/tag "v2.0.0-rc5" :git/sha "01c3a55"}}}}}')"
and then feed it to the --classpath argument:
$ nbb --classpath "$classpath" -e "(require '[honey.sql :as sql]) (sql/format {:select :foo :from :bar :where [:= :baz 2]})"
["SELECT foo FROM bar WHERE baz = ?" 2]
Currently nbb only reads from directories, not jar files, so you are encouraged to use git libs. Support for .jar files will be added later.
The name of the file that is currently being executed is available via nbb.core/*file* or on the metadata of vars:
(ns foo
  (:require [nbb.core :refer [*file*]]))
(prn *file*) ;; "/private/tmp/foo.cljs"
(defn f [])
(prn (:file (meta #'f))) ;; "/private/tmp/foo.cljs"
Nbb includes reagent.core which will be lazily loaded when required. You can use this together with ink to create a TUI application:
$ npm install ink
ink-demo.cljs:
(ns ink-demo
  (:require ["ink" :refer [render Text]]
            [reagent.core :as r]))
(defonce state (r/atom 0))
(doseq [n (range 1 11)]
  (js/setTimeout #(swap! state inc) (* n 500)))
(defn hello []
  [:> Text {:color "green"} "Hello, world! " @state])
(render (r/as-element [hello]))
Working with callbacks and promises can become tedious. Since nbb v0.0.36 the promesa.core namespace is included with the let and do! macros. An example:
(ns prom
  (:require [promesa.core :as p]))
(defn sleep [ms]
  (js/Promise.
   (fn [resolve _]
     (js/setTimeout resolve ms))))
(defn do-stuff
  []
  (p/do!
   (println "Doing stuff which takes a while")
   (sleep 1000)
   1))
(p/let [a (do-stuff)
        b (inc a)
        c (do-stuff)
        d (+ b c)]
  (prn d))
$ nbb prom.cljs
Doing stuff which takes a while
Doing stuff which takes a while
3
Also see API docs.
Since nbb v0.0.75 applied-science/js-interop is available:
(ns example
  (:require [applied-science.js-interop :as j]))
(def o (j/lit {:a 1 :b 2 :c {:d 1}}))
(prn (j/select-keys o [:a :b])) ;; #js {:a 1, :b 2}
(prn (j/get-in o [:c :d])) ;; 1
Most of this library is supported in nbb, except the following:
- destructuring using :syms
- property access using .-x notation; in nbb, you must use keywords instead.
See the example of what is currently supported.
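As a small illustration (a sketch continuing with the o defined above, not taken from the library's docs; the :e key and value are made up), keyword-based reads and writes work as usual:
(prn (j/get o :a))   ;; 1, keyword lookup
(j/assoc! o :e 5)    ;; mutate o using a keyword key
(prn (j/get o :e))   ;; 5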
See the examples directory for small examples.
Also check out these projects built with nbb:
See API documentation.
See this gist on how to convert an nbb script or project to shadow-cljs.
Prerequisites:
To build:
bb release
Run bb tasks for more project-related tasks.
Download Details:
Author: borkdude
Download Link: Download The Source Code
Official Website: https://github.com/borkdude/nbb
License: EPL-1.0
#node #javascript
Semantic segmentation laid down the fundamental path to advanced computer vision tasks such as object detection, shape recognition, autonomous driving, robotics, and virtual reality. Semantic segmentation can be defined as the process of pixel-level classification of an image into two or more object classes. It differs entirely from image classification, as the latter performs image-level classification. For instance, consider an image that consists mainly of a zebra surrounded by grass fields, a tree and a flying bird. Image classification tells us that the image belongs to the ‘zebra’ class; it cannot tell where the zebra is or what its size or pose is. Semantic segmentation of that image, however, may tell us that there is a zebra, a grass field, a bird and a tree in the image (it classifies parts of the image into separate classes), and it tells us which pixels belong to which class.
In this article, we discuss semantic segmentation using TensorFlow Keras. Readers are expected to have a fundamental knowledge of deep learning, image classification and transfer learning. Nevertheless, the following articles can help you fulfil these prerequisites quickly and clearly:
Let’s dive deeper into hands-on learning.
#developers corner #densenet #image classification #keras #object detection #object segmentation #pix2pix #segmentation #semantic segmentation #tensorflow #tensorflow 2.0 #unet
Data management, analytics, data science, and real-time systems will converge this year enabling new automated and self-learning solutions for real-time business operations.
The global pandemic of 2020 has upended social behaviors and business operations. Working from home is the new normal for many, and technology has accelerated and opened new lines of business. Retail and travel have been hit hard, and tech-savvy companies are reinventing e-commerce and in-store channels to survive and thrive. In biotech, pharma, and healthcare, analytics command centers have become the center of operations, much like network operation centers in transport and logistics during pre-COVID times.
While data management and analytics have been critical to strategy and growth over the last decade, COVID-19 has propelled these functions into the center of business operations. Data science and analytics have become a focal point for business leaders to make critical decisions like how to adapt business in this new order of supply and demand and forecast what lies ahead.
In the next year, I anticipate a convergence of data, analytics, integration, and DevOps to create an environment for rapid development of AI-infused applications that address business challenges and opportunities. We will see a proliferation of API-led microservices developer environments for real-time data integration, the emergence of data hubs as a bridge between at-rest and in-motion data assets, and event-enabled analytics with deeper collaboration between data scientists, DevOps, and ModelOps developers. From this, an ML engineer persona will emerge.
#analytics #artificial intelligence technologies #big data #big data analysis tools #from our experts #machine learning #real-time decisions #real-time analytics #real-time data #real-time data analytics
Build a real-time chat application that can be integrated into your social handles. Add more life to your website or support portal with real-time chat solutions for mobile apps that include online presence indicators, typing status, timestamps, multimedia sharing and much more. Users can also log into the live chat app using their social media logins, sparing them from the need to remember usernames and passwords. For more information call us at +18444455767, email us at hello@sisgain.com or visit: https://sisgain.com/instant-real-time-chat-solutions-mobile-apps
#real time chat solutions for mobile apps #real time chat app development solutions #live chat software for mobile #live chat software solutions #real time chat app development #real time chat applications in java script