Dominic Feeney

High-Fidelity Pose Tracking with MediaPipe and TensorFlow.js

Today we’re excited to launch MediaPipe's BlazePose in our new pose-detection API. With today’s release, developers can use the same models on the web that power ML Kit Pose Detection and MediaPipe in Python, unlocking the same great performance across all devices. The new MediaPipe runtime for TensorFlow.js capitalizes on WASM with GPU-accelerated processing and provides faster out-of-the-box inference speed.

The MediaPipe runtime currently lacks Node.js and iOS Safari support, but we'll be adding it soon.
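As a minimal sketch of the new pose-detection API (assuming the @tensorflow-models/pose-detection package with the MediaPipe runtime; the exact options and CDN path may differ in your setup), detecting poses looks roughly like this:

import * as poseDetection from '@tensorflow-models/pose-detection';

// Create a BlazePose detector backed by the MediaPipe runtime.
// solutionPath tells the runtime where to fetch the MediaPipe pose assets.
const detector = await poseDetection.createDetector(
  poseDetection.SupportedModels.BlazePose,
  {runtime: 'mediapipe', solutionPath: 'https://cdn.jsdelivr.net/npm/@mediapipe/pose'}
);

// Estimate poses for a video element; each pose holds 33 keypoints.
const video = document.querySelector('video');
const poses = await detector.estimatePoses(video);
console.log(poses[0]?.keypoints);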

#mediapipe #tensorflow #javascript #BlazePose


NBB: Ad-hoc CLJS Scripting on Node.js

Nbb

Not babashka. Node.js babashka!?

Ad-hoc CLJS scripting on Node.js.

Status

Experimental. Please report issues here.

Goals and features

Nbb's main goal is to make it easy to get started with ad hoc CLJS scripting on Node.js.

Additional goals and features are:

  • Fast startup without relying on a custom version of Node.js.
  • Small artifact (current size is around 1.2MB).
  • First class macros.
  • Support building small TUI apps using Reagent.
  • Complement babashka with libraries from the Node.js ecosystem.

Requirements

Nbb requires Node.js v12 or newer.

How does this tool work?

CLJS code is evaluated through SCI, the same interpreter that powers babashka. Because SCI works with advanced compilation, the bundle size, especially when combined with other dependencies, is smaller than what you get with self-hosted CLJS. That makes startup faster. The trade-off is that execution is less performant and that only a subset of CLJS is available (e.g. no deftype, yet).

Usage

Install nbb from NPM:

$ npm install nbb -g

Omit -g for a local install.

Try out an expression:

$ nbb -e '(+ 1 2 3)'
6

And then install some other NPM libraries to use in the script. E.g.:

$ npm install csv-parse shelljs zx

Create a script which uses the NPM libraries:

(ns script
  (:require ["csv-parse/lib/sync$default" :as csv-parse]
            ["fs" :as fs]
            ["path" :as path]
            ["shelljs$default" :as sh]
            ["term-size$default" :as term-size]
            ["zx$default" :as zx]
            ["zx$fs" :as zxfs]
            [nbb.core :refer [*file*]]))

(prn (path/resolve "."))

(prn (term-size))

(println (count (str (fs/readFileSync *file*))))

(prn (sh/ls "."))

(prn (csv-parse "foo,bar"))

(prn (zxfs/existsSync *file*))

(zx/$ #js ["ls"])

Call the script:

$ nbb script.cljs
"/private/tmp/test-script"
#js {:columns 216, :rows 47}
510
#js ["node_modules" "package-lock.json" "package.json" "script.cljs"]
#js [#js ["foo" "bar"]]
true
$ ls
node_modules
package-lock.json
package.json
script.cljs

Macros

Nbb has first-class support for macros: you can define them right inside your .cljs file, just like you are used to from JVM Clojure. Consider the plet macro to make working with promises more palatable:

(defmacro plet
  [bindings & body]
  (let [binding-pairs (reverse (partition 2 bindings))
        body (cons 'do body)]
    (reduce (fn [body [sym expr]]
              (let [expr (list '.resolve 'js/Promise expr)]
                (list '.then expr (list 'clojure.core/fn (vector sym)
                                        body))))
            body
            binding-pairs)))

Using this macro we can make async code look more like sync code. Consider this puppeteer example:

(-> (.launch puppeteer)
      (.then (fn [browser]
               (-> (.newPage browser)
                   (.then (fn [page]
                            (-> (.goto page "https://clojure.org")
                                (.then #(.screenshot page #js{:path "screenshot.png"}))
                                (.catch #(js/console.log %))
                                (.then #(.close browser)))))))))

Using plet this becomes:

(plet [browser (.launch puppeteer)
       page (.newPage browser)
       _ (.goto page "https://clojure.org")
       _ (-> (.screenshot page #js{:path "screenshot.png"})
             (.catch #(js/console.log %)))]
      (.close browser))

See the puppeteer example for the full code.

Since v0.0.36, nbb includes promesa, a library for dealing with promises. The above plet macro is similar to promesa.core/let.

Startup time

$ time nbb -e '(+ 1 2 3)'
6
nbb -e '(+ 1 2 3)'   0.17s  user 0.02s system 109% cpu 0.168 total

The baseline startup time for a script is about 170ms on my laptop. When invoked via npx this adds another 300ms or so, so for faster startup, either use a globally installed nbb or use $(npm bin)/nbb script.cljs to bypass npx.

Dependencies

NPM dependencies

Nbb has no NPM dependencies of its own. All NPM libraries loaded by a script are resolved relative to that script. When using the Reagent module, React is resolved in the same way as any other NPM library.

Classpath

To load .cljs files from local paths or dependencies, you can use the --classpath argument. The current dir is added to the classpath automatically. So if there is a file foo/bar.cljs relative to your current dir, then you can load it via (:require [foo.bar :as fb]). Note that nbb uses the same naming conventions for namespaces and directories as other Clojure tools: foo-bar in the namespace name becomes foo_bar in the directory name.
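For example (foo.bar here is a hypothetical helper namespace), given a file foo/bar.cljs relative to the current dir:

;; foo/bar.cljs
(ns foo.bar)

(defn greet [name]
  (str "Hello, " name "!"))

it can be loaded from a script like this:

(ns script
  (:require [foo.bar :as fb]))

(println (fb/greet "nbb"))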

To load dependencies from the Clojure ecosystem, you can use the Clojure CLI or babashka to download them and produce a classpath:

$ classpath="$(clojure -A:nbb -Spath -Sdeps '{:aliases {:nbb {:replace-deps {com.github.seancorfield/honeysql {:git/tag "v2.0.0-rc5" :git/sha "01c3a55"}}}}}')"

and then feed it to the --classpath argument:

$ nbb --classpath "$classpath" -e "(require '[honey.sql :as sql]) (sql/format {:select :foo :from :bar :where [:= :baz 2]})"
["SELECT foo FROM bar WHERE baz = ?" 2]

Currently nbb only reads from directories, not jar files, so you are encouraged to use git libs. Support for .jar files will be added later.

Current file

The name of the file that is currently being executed is available via nbb.core/*file* or on the metadata of vars:

(ns foo
  (:require [nbb.core :refer [*file*]]))

(prn *file*) ;; "/private/tmp/foo.cljs"

(defn f [])
(prn (:file (meta #'f))) ;; "/private/tmp/foo.cljs"

Reagent

Nbb includes reagent.core which will be lazily loaded when required. You can use this together with ink to create a TUI application:

$ npm install ink

ink-demo.cljs:

(ns ink-demo
  (:require ["ink" :refer [render Text]]
            [reagent.core :as r]))

(defonce state (r/atom 0))

(doseq [n (range 1 11)]
  (js/setTimeout #(swap! state inc) (* n 500)))

(defn hello []
  [:> Text {:color "green"} "Hello, world! " @state])

(render (r/as-element [hello]))

Promesa

Working with callbacks and promises can become tedious. Since nbb v0.0.36 the promesa.core namespace is included with the let and do! macros. An example:

(ns prom
  (:require [promesa.core :as p]))

(defn sleep [ms]
  (js/Promise.
   (fn [resolve _]
     (js/setTimeout resolve ms))))

(defn do-stuff
  []
  (p/do!
   (println "Doing stuff which takes a while")
   (sleep 1000)
   1))

(p/let [a (do-stuff)
        b (inc a)
        c (do-stuff)
        d (+ b c)]
(prn d))

$ nbb prom.cljs
Doing stuff which takes a while
Doing stuff which takes a while
3

Also see API docs.

Js-interop

Since nbb v0.0.75 applied-science/js-interop is available:

(ns example
  (:require [applied-science.js-interop :as j]))

(def o (j/lit {:a 1 :b 2 :c {:d 1}}))

(prn (j/select-keys o [:a :b])) ;; #js {:a 1, :b 2}
(prn (j/get-in o [:c :d])) ;; 1

Most of this library is supported in nbb, except the following:

  • destructuring using :syms
  • property access using .-x notation. In nbb, you must use keywords.

See the example of what is currently supported.

Examples

See the examples directory for small examples.

Also check out these projects built with nbb:

API

See API documentation.

Migrating to shadow-cljs

See this gist on how to convert an nbb script or project to shadow-cljs.

Build

Prerequisites:

  • babashka >= 0.4.0
  • Clojure CLI >= 1.10.3.933
  • Node.js 16.5.0 (lower version may work, but this is the one I used to build)

To build:

  • Clone and cd into this repo
  • bb release

Run bb tasks for more project-related tasks.

Download Details:
Author: borkdude
Download Link: Download The Source Code
Official Website: https://github.com/borkdude/nbb 
License: EPL-1.0

#node #javascript


Chiko Nakamura

Face and hand tracking in the browser with MediaPipe and TensorFlow.js

Originally published by Ann Yuan and Andrey Vakunov, Software Engineers at Google at blog.tensorflow.org

Today we’re excited to release two new packages: facemesh and handpose for tracking key landmarks on faces and hands respectively. This release has been a collaborative effort between the MediaPipe and TensorFlow.js teams within Google Research.

[Image: Facemesh package demo]

[Image: Handpose package demo]

Try the demos live in your browser

The facemesh package finds facial boundaries and landmarks within an image, and handpose does the same for hands. These packages are small, fast, and run entirely within the browser so data never leaves the user’s device, preserving user privacy. You can try them out right now using these links:

These packages are also available as part of MediaPipe, a library for building multimodal perception pipelines:

We hope real time face and hand tracking will enable new modes of interactivity. For example, facial geometry location is the basis for classifying expressions, and hand tracking is the first step for gesture recognition. We’re excited to see how applications with such capabilities will push the boundaries of interactivity and accessibility on the web.

Deep dive: Facemesh


The facemesh package infers approximate 3D facial surface geometry from an image or video stream, requiring only a single camera input without the need for a depth sensor. This geometry locates features such as the eyes, nose, and lips within the face, including details such as lip contours and the facial silhouette. This information can be used for downstream tasks such as expression classification (but not for identification). Refer to our model card for details on how the model performs across different datasets. This package is also available through MediaPipe.

Performance characteristics

Facemesh is a lightweight package containing only ~3MB of weights, making it ideally suited for real-time inference on a variety of mobile devices. When testing, note that TensorFlow.js also provides several different backends to choose from, including WebGL and WebAssembly (WASM) with XNNPACK for devices with lower-end GPUs. The table below shows how the package performs across a few different devices and TensorFlow.js backends:

[Table: facemesh inference performance across devices and TensorFlow.js backends]
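As a minimal sketch of choosing a backend before loading the model (assuming the @tensorflow/tfjs-backend-wasm package is installed; WebGL is typically the default):

import * as tf from '@tensorflow/tfjs-core';
// Importing the package registers the WASM backend (with XNNPACK).
import '@tensorflow/tfjs-backend-wasm';

// Prefer WASM on devices with lower-end GPUs.
await tf.setBackend('wasm');
await tf.ready();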

Installation

There are two ways to install the facemesh package:

  1. Through NPM:

import * as facemesh from '@tensorflow-models/facemesh';

  2. Through script tags:

<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-core"></script>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-converter"></script>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow-models/facemesh"></script>


Usage

Once the package is installed, you only need to load the model weights and pass in an image to start detecting facial landmarks:


// Load the MediaPipe facemesh model assets.
const model = await facemesh.load();
 
// Pass in a video stream to the model to obtain 
// an array of detected faces from the MediaPipe graph.
const video = document.querySelector("video");
const faces = await model.estimateFaces(video);
 
// Each face object contains a `scaledMesh` property,
// which is an array of 468 landmarks.
faces.forEach(face => console.log(face.scaledMesh));


The input to estimateFaces can be a video, a static image, or even an ImageData interface for use in Node.js pipelines. Facemesh then returns an array of prediction objects for the faces in the input, which include information about each face (e.g. a confidence score and the locations of 468 landmarks within the face). Here is a sample prediction object:


{
    faceInViewConfidence: 1,
    boundingBox: {
        topLeft: [232.28, 145.26], // [x, y]
        bottomRight: [449.75, 308.36],
    },
    mesh: [
        [92.07, 119.49, -17.54], // [x, y, z]
        [91.97, 102.52, -30.54],
        ...
    ],
    scaledMesh: [
        [322.32, 297.58, -17.54],
        [322.18, 263.95, -30.54]
    ],
    annotations: {
        silhouette: [
            [326.19, 124.72, -3.82],
            [351.06, 126.30, -3.00],
            ...
        ],
        ...
    }
}

Refer to our README for more details about the API.

Deep dive: Handpose

The handpose package detects hands in an input image or video stream, and returns twenty-one 3-dimensional landmarks locating features within each hand. Such landmarks include the locations of each finger joint and the palm. In August 2019, we released the model through MediaPipe - you can find more information about the model architecture in our blogpost accompanying the release. Refer to our model card for details on how handpose performs across different datasets. This package is also available through MediaPipe.

Performance characteristics

Handpose is a relatively lightweight package consisting of ~12MB of weights, making it suitable for real-time inference. The table below shows how the package performs across different devices:

[Table: handpose inference performance across devices]

Installation

There are two ways to install the handpose package.

  1. Through NPM:

import * as handpose from '@tensorflow-models/handpose';

  2. Through script tags:

<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-core"></script>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-converter"></script>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow-models/handpose"></script>


Usage

Once the package is installed, you just need to load the model weights and pass in an image to start tracking hand landmarks:


// Load the MediaPipe handpose model assets.
const model = await handpose.load();
 
// Pass in a video stream to the model to obtain 
// a prediction from the MediaPipe graph.
const video = document.querySelector("video");
const hands = await model.estimateHands(video);
 
// Each hand object contains a `landmarks` property,
// which is an array of 21 3-D landmarks.
hands.forEach(hand => console.log(hand.landmarks));

As with facemesh, the input to estimateHands can be a video, a static image, or an ImageData interface. The package then returns an array of objects describing hands in the input. Here is a sample prediction object:


{
    handInViewConfidence: 1,
    boundingBox: {
        topLeft: [162.91, -17.42], // [x, y]
        bottomRight: [548.56, 368.23],
    },
    landmarks: [
        [472.52, 298.59, 0.00], // [x, y, z]
        [412.80, 315.64, -6.18],
        ...
    ],
    annotations: {
        indexFinger: [
            [412.80, 315.64, -6.18],
            [350.02, 298.38, -7.14],
            ...
        ],
        ...
    }
}


Refer to our README for more details about the API.

Looking ahead

We plan to continue improving facemesh and handpose. We will add support for multi-hand tracking in the near future. We are also always working on speeding up our models, especially on mobile devices. In the past months of development, we have seen performance for facemesh and handpose improve significantly, and we believe this trend will continue. The MediaPipe team is developing more streamlined model architectures, and the TensorFlow.js team is always investigating ways to speed up inference, such as operator fusion. Faster inference will in turn unlock larger, more accurate models for use in real time pipelines.

#TensorFlow.js #tensorflow #javascript

How to Categorize Images with TensorFlow.js, Made Easy

TensorFlow.js Image Classification Made Easy
In this video you’re going to discover an easy way to train a convolutional neural network for image classification and then use the trained TensorFlow.js image classifier to score x-ray images locally in your web browser.

TensorFlow.js is a great JavaScript-based machine learning framework for running your models locally in the web browser as well as on your server using Node.js.
But defining your model structure and training it is far more complex than just using a trained model.
Azure Custom Vision - one of the various Cognitive Services - offers you an easy way to avoid this hassle.
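As a rough sketch of the scoring step (the model URL and class labels below are hypothetical, and the exact format exported by Custom Vision may differ), classifying an image in the browser looks roughly like this:

import * as tf from '@tensorflow/tfjs';

// Hypothetical URL of an exported model.json plus weight files.
const model = await tf.loadGraphModel('/models/xray/model.json');

// Preprocess: read pixels, resize to the model's input size, add a batch dimension.
const img = document.querySelector('img');
const input = tf.browser.fromPixels(img)
  .resizeBilinear([224, 224])
  .toFloat()
  .expandDims(0);

// Run inference and pick the highest-scoring class.
const scores = await model.predict(input).data();
const classes = ['normal', 'abnormal']; // hypothetical labels
console.log(classes[scores.indexOf(Math.max(...scores))]);
input.dispose();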

#TensorFlow.js #TensorFlow #js

Beth Cooper

Easy Activity Tracking for Models, Similar to Github's Public Activity

PublicActivity

public_activity provides easy activity tracking for your ActiveRecord, Mongoid 3 and MongoMapper models in Rails 3 and 4.

Simply put: it can record what happens in your application and gives you the ability to present those recorded activities to users - in a similar way to how GitHub does it.

!! WARNING: README for unreleased version below. !!

You probably don't want to read the docs for this unreleased version 2.0.

For the stable 1.5.X readme see: https://github.com/chaps-io/public_activity/blob/1-5-stable/README.md

About

Here is a simple example showing what this gem is about:

Example usage

Tutorials

Screencast

Ryan Bates made a great screencast describing how to integrate Public Activity.

Tutorial

A great step-by-step guide on implementing activity feeds using public_activity by Ilya Bodrov.

Online demo

You can see an actual application using this gem here: http://public-activity-example.herokuapp.com/feed

The source code of the demo is hosted here: https://github.com/pokonski/activity_blog

Setup

Gem installation

You can install public_activity as you would any other gem:

gem install public_activity

or in your Gemfile:

gem 'public_activity'

Database setup

By default public_activity uses Active Record. If you want to use Mongoid or MongoMapper as your backend, create an initializer file in your Rails application with the corresponding code inside:

For Mongoid:

# config/initializers/public_activity.rb
PublicActivity.configure do |config|
  config.orm = :mongoid
end

For MongoMapper:

# config/initializers/public_activity.rb
PublicActivity.configure do |config|
  config.orm = :mongo_mapper
end

(ActiveRecord only) Create migration for activities and migrate the database (in your Rails project):

rails g public_activity:migration
rake db:migrate

Model configuration

Include PublicActivity::Model and add tracked to the model you want to keep track of:

For ActiveRecord:

class Article < ActiveRecord::Base
  include PublicActivity::Model
  tracked
end

For Mongoid:

class Article
  include Mongoid::Document
  include PublicActivity::Model
  tracked
end

For MongoMapper:

class Article
  include MongoMapper::Document
  include PublicActivity::Model
  tracked
end

And now, by default, create/update/destroy activities are recorded in the activities table. This is all you need to start recording activities for basic CRUD actions.
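For example (a minimal sketch using the Article model above), each CRUD action now leaves a record you can read back:

article = Article.create(title: 'New article')

activity = PublicActivity::Activity.last
activity.key       # => "article.create"
activity.trackable # => article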

Optional: If you don't need #tracked but still want the comfort of #create_activity, you can include only the lightweight Common module instead of Model.

Custom activities

You can trigger custom activities by setting all your required parameters and triggering create_activity on the tracked model, like this:

@article.create_activity key: 'article.commented_on', owner: current_user

See this entry http://rubydoc.info/gems/public_activity/PublicActivity/Common:create_activity for more details.

Displaying activities

To display them you simply query the PublicActivity::Activity model:

# notifications_controller.rb
def index
  @activities = PublicActivity::Activity.all
end

And in your views:

<%= render_activities(@activities) %>

Note: render_activities is an alias for render_activity and does the same.

Layouts

You can also pass options to both the activity#render and #render_activity methods, which are passed on to the internally used render_partial method. A useful example would be to render activities wrapped in a layout, which shares common elements of an activity, like a timestamp, the owner's avatar etc.:

<%= render_activities(@activities, layout: :activity) %>

The activity will be wrapped with the app/views/layouts/_activity.html.erb layout, in the above example.

Important: please note that layouts for activities are also partials. Hence the _ prefix.

Locals

Sometimes, it's desirable to pass additional local variables to partials. It can be done this way:

<%= render_activity(@activity, locals: {friends: current_user.friends}) %>

Note: Before 1.4.0, one could pass variables directly to the options hash for #render_activity and access it from activity parameters. This functionality is retained in 1.4.0 and later, but the :locals method is preferred, since it prevents bugs from shadowing variables from activity parameters in the database.

Activity views

public_activity looks for views in app/views/public_activity.

For example, if you have an activity with :key set to "activity.user.changed_avatar", the gem will look for a partial in app/views/public_activity/user/_changed_avatar.html.(|erb|haml|slim|something_else).

Hint: the "activity." prefix in :key is completely optional and kept for backwards compatibility, you can skip it in new projects.

If you would like to fall back to a partial, you can use the fallback parameter to specify the path of a partial to use when one is missing:

<%= render_activity(@activity, fallback: 'default') %>

When used in this manner, if a partial with the specified :key cannot be located it will use the partial defined in the fallback instead. In the example above this would resolve to public_activity/_default.html.(|erb|haml|slim|something_else).

If a view file does not exist, then ActionView::MissingTemplate will be raised. If you wish to fall back to the old behaviour and use an i18n-based translation in this situation, you can specify a :fallback parameter of :text, like so:

<%= render_activity(@activity, fallback: :text) %>

i18n

Translations are used by the #text method, to which you can pass additional options in the form of a hash. The #render method uses translations when view templates have not been provided. You can render pure i18n strings by passing {display: :i18n} to #render_activity or #render.

Translations should be put in your locale .yml files. Example structure:

activity:
  article:
    create: 'Article has been created'
    update: 'Someone has edited the article'
    destroy: 'Some user removed an article!'

This structure is valid for activities with keys "activity.article.create" or "article.create". As mentioned before, "activity." part of the key is optional.
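With translations like these in place, you can render the pure i18n string for an activity, for example:

<%= render_activity(@activity, display: :i18n) %>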

Testing

For RSpec you can disable public_activity and require the testing helper methods in rails_helper.rb with:

#rails_helper.rb
require 'public_activity/testing'

PublicActivity.enabled = false

In your specs you can then decide, block by block, whether to turn public_activity on or off.

# file_spec.rb
PublicActivity.with_tracking do
  # your test code goes here
end

PublicActivity.without_tracking do
  # your test code goes here
end

Documentation

For more documentation go here

Common examples

Set the Activity's owner to current_user by default

You can set up a default value for :owner by doing this:

  1. Include PublicActivity::StoreController in your ApplicationController like this:

class ApplicationController < ActionController::Base
  include PublicActivity::StoreController
end

  2. Use a Proc in the :owner attribute of the tracked class method in your desired model. For example:

class Article < ActiveRecord::Base
  tracked owner: Proc.new{ |controller, model| controller.current_user }
end

Note: current_user applies to Devise, if you are using a different authentication gem or your own code, change the current_user to a method you use.

Disable tracking for a class or globally

If you need to disable tracking temporarily, for example in tests or db/seeds.rb then you can use PublicActivity.enabled= attribute like below:

# Disable p_a globally
PublicActivity.enabled = false

# Perform some operations that would normally be tracked by p_a:
Article.create(title: 'New article')

# Switch it back on
PublicActivity.enabled = true

You can also disable public_activity for a specific class:

# Disable p_a for Article class
Article.public_activity_off

# p_a will not do anything here:
@article = Article.create(title: 'New article')

# But will be enabled for other classes:
# (creation of the comment will be recorded if you are tracking the Comment class)
@article.comments.create(body: 'some comment!')

# Enable it again for Article:
Article.public_activity_on

Create custom activities

Besides standard, automatic activities created on CRUD actions on your model (deactivatable), you can post your own activities that can be triggered without modifying the tracked model. There are a few ways to do this, as PublicActivity gives three tiers of options to be set.

Instant options

Because every activity needs a key (otherwise: NoKeyProvided is raised), the shortest and minimal way to post an activity is:

@user.create_activity :mood_changed
# the key of the action will be user.mood_changed
@user.create_activity action: :mood_changed # this is exactly the same as above

Besides assigning your key (which is obvious from the code), it will take global options from the User class (given in the #tracked method during class definition) and overwrite them with instance options (set on @user by the #activity method). You can read more about options and how PublicActivity inherits them for you here.

Note the action parameter builds the key like this: "#{model_name}.#{action}". You can read further on options for #create_activity here.

To provide more options, you can do:

@user.create_activity action: 'poke', parameters: {reason: 'bored'}, recipient: @friend, owner: current_user

In this example, we have provided all the things we could for a standard Activity.

Use custom fields on Activity

Besides the few fields that every Activity has (key, owner, recipient, trackable, parameters), you can also set custom fields. This could be very beneficial, as parameters are a serialized hash, which cannot be queried easily from the database. That being said, use custom fields when you know that you will set them very often and search by them (don't forget database indexes :) ).

Set owner and recipient based on associations

class Comment < ActiveRecord::Base
  include PublicActivity::Model
  tracked owner: :commenter, recipient: :commentee

  belongs_to :commenter, :class_name => "User"
  belongs_to :commentee, :class_name => "User"
end

Resolve parameters from a Symbol or Proc

class Post < ActiveRecord::Base
  include PublicActivity::Model
  tracked only: [:update], parameters: :tracked_values
  
  def tracked_values
   {}.tap do |hash|
     hash[:tags] = tags if tags_changed?
   end
  end
end

Setup

Skip this step if you are using ActiveRecord in Rails 4 or Mongoid

The first step is similar in every available ORM (except Mongoid):

PublicActivity::Activity.class_eval do
  attr_accessible :custom_field
end

Place this code in config/initializers/public_activity.rb (you will have to create the file first). This moves the field to the mass assignment sanitizer's whitelist so it can be assigned.

Migration

If you're using ActiveRecord, you will also need to provide a migration to add the actual field to the Activity. Taken from our tests:

class AddCustomFieldToActivities < ActiveRecord::Migration
  def change
    change_table :activities do |t|
      t.string :custom_field
    end
  end
end

Assigning custom fields

Assigning is done by the same methods that you use for normal parameters: #tracked, #create_activity. You can just pass the name of your custom variable and assign its value. Even better, you can pass it to #tracked to tell us how to harvest your data for custom fields so we can do that for you.

class Article < ActiveRecord::Base
  include PublicActivity::Model
  tracked custom_field: proc {|controller, model| controller.some_helper }
end
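Assuming a custom_field column was added as in the migration above, the value can also be set directly when posting an activity (a sketch; options passed to #create_activity are assigned to matching Activity attributes):

@article.create_activity :published, custom_field: 'some value'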

Help

If you need help with using public_activity please visit our discussion group and ask a question there:

https://groups.google.com/forum/?fromgroups#!forum/public-activity

Please do not ask general questions in the GitHub Issues.


Author: public-activity
Source code: https://github.com/public-activity/public_activity
License: MIT license

#ruby  #ruby-on-rails