CLBLAS.jl: CLBLAS integration for Julia

CLBLAS.jl

AMD clBLAS bindings for Julia.

Installation notes

If the download of the binary fails on Linux, you can also install clBLAS with sudo apt-get install libclblas-dev and then rerun Pkg.build("CLBLAS").

Current status

  • Low-level bindings
  • Partial implementation of high-level API similar to Base.LinAlg.BLAS

Example

using OpenCL
import OpenCL.cl.CLArray
import CLBLAS
const clblas = CLBLAS


clblas.setup()
device, ctx, queue = clblas.get_next_compute_context()
alpha = 1.0
beta = 0.0

hA = rand(5, 10)
hB = rand(10, 5)
A = CLArray(queue, hA)
B = CLArray(queue, hB)
C = cl.zeros(queue, 5, 5)

clblas.gemm!('N', 'N', alpha, A, B, beta, C)
hC = cl.to_host(C)
if isapprox(hC, hA * hB)
    info("Success!")
else
    error("Results diverged")
end

Caveats

  • Complex64/CL_float2 doesn't work by default. This is caused by an incompatibility between clang (the default compiler for Julia) and gcc (the default for clBLAS), so the only way to fix it right now is to compile clBLAS manually using clang.

Download Details:

Author: JuliaGPU
Source Code: https://github.com/JuliaGPU/CLBLAS.jl 
License: Apache-2.0 license

#julia #integration 

Elian Harber

Concourse: A Container-based Continuous Thing-doer Written in Go

Concourse: the continuous thing-doer.  

Concourse is an automation system written in Go. It is most commonly used for CI/CD, and is built to scale to any kind of automation pipeline, from simple to complex.

Concourse is very opinionated about a few things: idempotency, immutability, declarative config, stateless workers, and reproducible builds.

booklit pipeline

The road to Concourse v10

Concourse v10 is the code name for a set of features which, when used in combination, will have a massive impact on Concourse's capabilities as a generic continuous thing-doer. These features, and how they interact, are described in detail in the Core roadmap: towards v10 and Re-inventing resource types blog posts. (These posts are slightly out of date, but they get the idea across.)

Notably, v10 will make Concourse not suck for multi-branch and/or pull-request driven workflows - examples of spatial change, where the set of things to automate grows and shrinks over time.

Because v10 is really an alias for a ton of separate features, there's a lot to keep track of - here's an overview:

| Feature | RFC | Status |
|---|---|---|
| set_pipeline step | ✔ [#31][rfc-31] | ✔ v5.8.0 (experimental) |
| Var sources for creds | ✔ [#39][rfc-39] | ✔ v5.8.0 (experimental), TODO: [#5813][issue-5813] |
| Archiving pipelines | ✔ [#33][rfc-33] | ✔ v6.5.0 |
| Instanced pipelines | ✔ [#34][rfc-34] | ✔ v7.0.0 (experimental) |
| Static across step | 🚧 [#29][rfc-29] | ✔ v6.5.0 (experimental) |
| Dynamic across step | 🚧 [#29][rfc-29] | ✔ v7.4.0 (experimental, not released yet) |
| Projects | 🚧 [#32][rfc-32] | 🙏 RFC needs feedback! |
| load_var step | ✔ [#27][rfc-27] | ✔ v6.0.0 (experimental) |
| get_var step | ✔ [#27][rfc-27] | 🚧 [#5815][issue-5815] in progress! |
| [Prototypes][prototypes] | ✔ [#37][rfc-37] | ⚠ Pending first use of protocol (any of the below) |
| run step | 🚧 [#37][rfc-37] | ⚠ Pending its own RFC, but feel free to experiment |
| Resource prototypes | ✔ [#38][rfc-38] | 🙏 [#5870][issue-5870] looking for volunteers! |
| Var source prototypes | | 🚧 [#6275][issue-6275] planned, may lead to RFC |
| Notifier prototypes | 🚧 [#28][rfc-28] | ⚠ RFC not ready |

The Concourse team at VMware will be working on these features, however in the interest of growing a healthy community of contributors we would really appreciate any volunteers. This roadmap is very easy to parallelize, as it is comprised of many orthogonal features, so the faster we can power through it, the faster we can all benefit. We want these for our own pipelines too! 😆

If you'd like to get involved, hop in Discord or leave a comment on any of the issues linked above so we can coordinate. We're more than happy to help figure things out or pick up any work that you don't feel comfortable doing (e.g. UI, unfamiliar parts, etc.).

Thanks to everyone who has contributed so far, whether in code or in the community, and thanks to everyone for their patience while we figure out how to support such common functionality the "Concoursey way!" 🙏

Installation

Concourse is distributed as a single concourse binary, available on the Releases page.

If you want to just kick the tires, jump ahead to the Quick Start.

In addition to the concourse binary, there are a few other supported formats. Consult their GitHub repos for more information:

Quick Start

$ wget https://concourse-ci.org/docker-compose.yml
$ docker-compose up
Creating docs_concourse-db_1 ... done
Creating docs_concourse_1    ... done

Concourse will be running at 127.0.0.1:8080. You can log in with the username/password test/test.

:warning: If you are using an M1 Mac: M1 Macs are incompatible with the containerd runtime. After downloading the docker-compose file, change CONCOURSE_WORKER_RUNTIME: "containerd" to CONCOURSE_WORKER_RUNTIME: "houdini". This feature is experimental.
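The runtime change described above can be sketched as a docker-compose.yml fragment (the service name and nesting shown here are illustrative; keep the rest of the file as downloaded):

```yaml
services:
  concourse:                               # the worker-running service in the file
    environment:
      CONCOURSE_WORKER_RUNTIME: "houdini"  # was "containerd"; needed on M1 Macs
```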

Next, install fly by downloading it from the web UI and target your local Concourse as the test user:

$ fly -t ci login -c http://127.0.0.1:8080 -u test -p test
logging in to team 'main'

target saved

Configuring a Pipeline

There is no GUI for configuring Concourse. Instead, pipelines are configured as declarative YAML files:

resources:
- name: booklit
  type: git
  source: {uri: "https://github.com/vito/booklit"}

jobs:
- name: unit
  plan:
  - get: booklit
    trigger: true
  - task: test
    file: booklit/ci/test.yml

Most operations are done via the accompanying fly CLI. If you've got Concourse installed, try saving the above example as booklit.yml, target your Concourse instance, and then run:

fly -t ci set-pipeline -p booklit -c booklit.yml

These pipeline files are self-contained, maximizing portability from one Concourse instance to the next.

Learn More

Contributing

Our user base is basically everyone that develops software (and wants it to work).

It's a lot of work, and we need your help! If you're interested, check out our contributing docs.

Download Details:

Author: Concourse
Source Code: https://github.com/concourse/concourse 
License: Apache-2.0 license

#go #golang #integration #concourse #hacktoberfest 

Nigel Uys

9 Best Libraries for Continuous integration in Go

In today's post, we will learn about the 9 best libraries for continuous integration in Go.

What is continuous integration?

Continuous integration (CI) is the practice of automating the integration of code changes from multiple contributors into a single software project. It’s a primary DevOps best practice, allowing developers to frequently merge code changes into a central repository where builds and tests then run. Automated tools are used to assert the new code’s correctness before integration.

A source code version control system is the crux of the CI process. The version control system is also supplemented with other checks like automated code quality tests, syntax style review tools, and more.  

Table of contents:

  • CDS - Enterprise-Grade CI/CD and DevOps Automation Open Source Platform.
  • Drone - Drone is a Continuous Integration platform built on Docker, written in Go.
  • Duci - A simple CI server with no need for domain-specific languages.
  • Go-fuzz-action - Use Go 1.18's built-in fuzz testing in GitHub Actions.
  • Gomason - Test, Build, Sign, and Publish your go binaries from a clean workspace.
  • Gotestfmt - Go test output for humans.
  • Goveralls - Go integration for Coveralls.io continuous code coverage tracking system.
  • Overalls - Multi-Package go project coverprofile for tools like goveralls.
  • Roveralls - Recursive coverage testing tool.

1 - CDS: Enterprise-Grade CI/CD and DevOps Automation Open Source Platform.

CDS is an Enterprise-Grade Continuous Delivery & DevOps Automation Platform written in Go(lang).

Intuitive UI

CDS provides an intuitive UI that allows you to build complex workflows, run them and dig into the logs when needed.

Create and run workflow with the CDS UI.

The most powerful Command Line for a CI/CD Platform

cdsctl is the CDS command line. You can script everything with it, and cdsctl also provides some handy commands, such as cdsctl shell to browse your projects and workflows without needing to open a browser.

See all cdsctl commands

Create workflow as code with the CDS command line.

Want a try?

Docker-Compose is your friend; see the Ready To Run Tutorials.

Blog posts and talks

View on Github

2 - Drone: Drone is a Continuous Integration platform built on Docker, written in Go.

What is Drone?

Drone is a continuous delivery system built on container technology. Drone uses a simple YAML build file to define and execute build pipelines inside Docker containers.

Setup Documentation

This section of the documentation will help you install and configure the Drone Server and one or many Runners. A runner is a standalone daemon that polls the server for pending pipelines to execute.

Usage Documentation

Our documentation can help you get started with the different types of pipelines/builds. There are different runners, plugins, and extensions designed for different use cases to help you build an efficient and simple pipeline.

Plugin Index

Plugins are used in build steps to perform actions, e.g. sending a message to Slack or pushing a container to a registry. We have an extensive list of community plugins to customize your build pipeline; you can find those here.

Example .drone.yml build file.

This build file contains a single pipeline (you can have multiple pipelines too) that builds a Go application, builds the front end with npm, publishes the Docker container to a registry, and announces the results to a Slack room.

kind: pipeline
type: docker
name: default

steps:
- name: backend
  image: golang
  commands:
    - go get
    - go build
    - go test

- name: frontend
  image: node:6
  commands:
    - npm install
    - npm test

- name: publish
  image: plugins/docker
  settings:
    repo: octocat/hello-world
    tags: [ 1, 1.1, latest ]
    registry: index.docker.io

- name: notify
  image: plugins/slack
  settings:
    channel: developers
    username: drone

View on Github

3 - Duci: A simple CI server with no need for domain-specific languages.

duci [zushi] (Docker Under Continuous Integration) is a simple CI server.

DSL is Unnecessary For CI

Define the task in your task runner, and define the infrastructure the task needs in the Dockerfile.
duci then simply executes the task in a Docker container.

How to use

Target Repository

The target repository must have a Dockerfile in the repository root or at .duci/Dockerfile.
If .duci/Dockerfile exists, duci reads it preferentially.

In the Dockerfile, I suggest using ENTRYPOINT.

e.g.

ENTRYPOINT ["mvn"]
CMD ["compile"]

or

ENTRYPOINT ["fastlane"]
CMD ["build"]

On a push to GitHub, duci executes mvn compile / fastlane build.
When the comment ci test is left on a GitHub pull request, it executes mvn test / fastlane test.

Using host environment variables

If an ARG instruction exists in the Dockerfile, its value is overridden by the host environment variable of the same name.

ARG FOO=default
ARG BAR

and you can use it as an environment variable in your command.

ARG FOO=default
ENV FOO=$FOO

Runtime configuration

volumes

You can use the volumes option for external dependencies, caching, etc.
Set the configuration in .duci/config.yml:

volumes:
  - '/path/to/host/dir:/path/to/container/dir'

environment variable

You can set environment variables in the Docker container.
Add the following to .duci/config.yml:

environments:
  - ENVIRONMENT_VARIABLE=value

View on Github

4 - Go-fuzz-action: Use Go 1.18's built-in fuzz testing in GitHub Actions.

GitHub Action for Go fuzz testing. This Action runs Go's built-in fuzz testing, added in Go 1.18, on your code.

Do you find this useful?

You can sponsor me here!

Inputs

  • fuzz-time [REQUIRED]: Fuzz target iteration duration, specified as a time.Duration (for example 1h30s). Corresponds to -fuzztime flag for the go test command. Ensure this is less than your job/workflow timeout.
  • packages [optional]: Run fuzz test on these packages. Corresponds to the [packages] input for the go test command.
    • Default: .
  • fuzz-regexp [optional]: Run the fuzz test matching the regular expression. Corresponds to the -fuzz flag for the go test command.
    • Default: Fuzz
  • fuzz-minimize-time [optional]: Fuzz minimization duration, specified as a time.Duration (for example 1h30s). Corresponds to -fuzzminimizetime flag for the go test command. If you provide this input, ensure it is less than your job timeout.
    • Default: 10s
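The fuzz-regexp default above matches Go's convention that fuzz targets are named FuzzXxx. As a point of reference, a minimal Go 1.18 fuzz target looks like the sketch below (the Reverse function is a made-up example for illustration, not part of this Action):

```go
// A hypothetical function plus a fuzz target; in a real repository this
// would live in a *_test.go file.
package main

import (
	"testing"
	"unicode/utf8"
)

// Reverse returns s with its runes in reverse order.
func Reverse(s string) string {
	r := []rune(s)
	for i, j := 0, len(r)-1; i < j; i, j = i+1, j-1 {
		r[i], r[j] = r[j], r[i]
	}
	return string(r)
}

// FuzzReverse checks that reversing twice round-trips any valid UTF-8 input.
// The name matches the Action's default fuzz-regexp, "Fuzz".
func FuzzReverse(f *testing.F) {
	f.Add("hello") // seed corpus entry
	f.Fuzz(func(t *testing.T, s string) {
		if !utf8.ValidString(s) {
			t.Skip() // rune reversal only round-trips valid UTF-8
		}
		if got := Reverse(Reverse(s)); got != s {
			t.Errorf("Reverse(Reverse(%q)) = %q", s, got)
		}
	})
}
```

Running go test -fuzz=Fuzz -fuzztime=30s locally exercises the same target that the Action drives in CI.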

Returns:

  • SUCCESS: if your fuzz tests don't raise a failure within the fuzz-time input constraint.
  • FAILURE: if your fuzz tests raise a failure within the fuzz-time input constraint.
    • The workflow run logs will include instructions on how to download (using the GitHub CLI) the failing seed corpus to your local machine for remediation, regardless of run trigger.
    • If you run this Action in a PR workflow, it'll comment these instructions on your PR.

Usage

⚠️ This Action is not tested on Windows GitHub Actions runners! Use with a Windows runner OS at your own risk!

Create a .github/workflows/go-fuzz-test.yml in your repository containing:

name: Go fuzz test
on:
  push:
  pull_request:
jobs:
  fuzz-test:
    name: Fuzz test
    runs-on: ubuntu-latest
    steps:
      - uses: jidicula/go-fuzz-action@v1.1.0
        with:
          fuzz-time: 30s

View on Github

5 - Gomason: Test, Build, Sign, and Publish your go binaries from a clean workspace.

Tool for testing, building, signing, and publishing binaries. Think of it as an on-premises CI/CD system that also performs code signing and publishing of artifacts.

You could do this via a CI/CD System and an artifact repository of some flavor. But wiring that up properly takes time, experience, and tends to be very specific to your particular system and repository.

Gomason attempts to abstract all of that. It will:

Run tests and report on results

Build binaries for the target OS/Arch and other files based on templates.

Sign the binaries and files thus built.

Publish the files, their signatures, and their checksums to the destination of your choice.

It does all of this based on a config file called 'metadata.json', which you place in the root of your repository.

None of this is exactly rocket science, but I have done it enough times, in enough different ways, that it was finally time to say 'enough' and be done with it.

Gomason comes from an experience I had where management was so astoundingly anti-testing that I needed a way to do clean-room CI testing quickly, easily and transparently, but also fly under the radar. They didn't need to know I was 'wasting time' testing my work. (Yeah, I couldn't believe it either. Blew my mind.)

What started out as a sort of subversive method of continuing to test my own code expanded after we parted ways. See, signing binaries and uploading the various bits to a repo isn't exactly rocket science, but it's also dreadfully boring once you've done it a few times. I figured DRY, so I made a 'one and done' means for doing so.

Artifact repositories like Artifactory Pro can sign binaries, but they don't really have provenance of who the author was. The bits arrived there, presumably after authentication (but that depends on the config), but you don't really know who did the signing.

Enter gomason, which can do the building and signing locally with personal keys and then upload. Presumably you'd also require authentication on upload, but now you've actually established 2 things- someone with credentials has uploaded this, and they've personally signed what they uploaded. Whether you trust that signature is up to you, but we've provided an easy means to extend what a traditional CI system can do.

Gomason uses gox, the Go cross-compiler, to do its compiling. It builds whatever versions you like, but they need to be specified in the metadata file, detailed below, in gox-like format.

Code is downloaded via go get. If you have your VCS configured so that you can do that without authentication, then everything will just work.

Signing is currently done via GPG. I intend to support other signing methods such as Keybase.io, but at the moment, gpg is all you get. If your signing keys are in gpg, and you have the gpg-agent running, it should just work.

Installation

go get github.com/nikogura/gomason

Usage

Test the master branch in a clean GOPATH and return success/failure:

gomason test

Test the master branch and see what's going on behind the scenes:

gomason test -v

Test another branch verbosely:

gomason test -v -b <branch name>

Publish the master branch after building:

gomason publish -v

Build and publish a branch without testing (I know, I know, don't test?!!?!)

This can be occasionally useful for publishing 3rd party tools internally when you need to make internal tweaks to support your use case.

Sometimes you don't have the wherewithal to build and maintain a full test suite for a 3rd party tool.

gomason publish -vs -b <branch name>

Other options can be found by running:

gomason help

View on Github

6 - Gotestfmt: Go test output for humans.

Are you tired of scrolling through endless Golang test logs in GitHub Actions (or other CI systems)?

Then this is the tool for you. Run it locally, or in any CI system, with a command line like this:

set -euo pipefail
go test -json -v ./... 2>&1 | tee /tmp/gotest.log | gotestfmt

Tadam, your tests will now show up in a beautifully formatted fashion. Plug it into your CI and you're done. Installation is also easy:

Note: Please always save the original log. You will need it if you have to file a bug report for gotestfmt.

Installing

You can install gotestfmt using the following methods.

Manually

You can download the binary manually from the releases section. The binaries have no dependencies and should run without any problems on any of the listed operating systems.

Using go install

You can install gotestfmt using the go install command:

go install github.com/haveyoudebuggedit/gotestfmt/v2/cmd/gotestfmt@latest

You can then use the gotestfmt command, provided that your Go bin directory is added to your system path.

Using a container

You can also run gotestfmt in a container. For example:

go test -json ./... | docker run ghcr.io/haveyoudebuggedit/gotestfmt:latest

If you have a high volume of requests you may want to mirror the image to your own registry.

View on Github

7 - Goveralls: Go integration for Coveralls.io continuous code coverage tracking system.

Installation

goveralls requires a working Go installation (Go-1.2 or higher).

$ go install github.com/mattn/goveralls@latest

Usage

First you will need an API token. It is found at the bottom of your repository's page when you are logged in to Coveralls.io. Each repo has its own token.

$ cd $GOPATH/src/github.com/yourusername/yourpackage
$ goveralls -repotoken your_repos_coveralls_token

You can set the environment variable $COVERALLS_TOKEN to your token so you do not have to specify it at each invocation.

You can also run this reporter for multiple passes with the flag -parallel or by setting the environment variable COVERALLS_PARALLEL=true (see coveralls docs for more details).

Continuous Integration

There is no need to run go test separately, as goveralls runs the entire test suite.

Github Actions

shogo82148/actions-goveralls is available on GitHub Marketplace. It provides a shorthand for the GitHub Actions YAML configuration.

name: Quality
on: [push, pull_request]
jobs:
  test:
    name: Test with Coverage
    runs-on: ubuntu-latest
    steps:
    - name: Set up Go
      uses: actions/setup-go@v2
      with:
        go-version: '1.16'
    - name: Check out code
      uses: actions/checkout@v2
    - name: Install dependencies
      run: |
        go mod download
    - name: Run Unit tests
      run: |
        go test -race -covermode atomic -coverprofile=covprofile ./...
    - name: Install goveralls
      run: go install github.com/mattn/goveralls@latest
    - name: Send coverage
      env:
        COVERALLS_TOKEN: ${{ secrets.GITHUB_TOKEN }}
      run: goveralls -coverprofile=covprofile -service=github
    # or use shogo82148/actions-goveralls
    # - name: Send coverage
    #   uses: shogo82148/actions-goveralls@v1
    #   with:
    #     path-to-profile: covprofile

View on Github

8 - Overalls: Multi-Package go project coverprofile for tools like goveralls.

Package overalls takes multi-package go projects, runs test coverage tests on all packages in each directory and finally concatenates into a single file for tools like goveralls and codecov.io.

Usage and documentation

Example

overalls -project=github.com/go-playground/overalls -covermode=count -debug

then with other tools such as goveralls

goveralls -coverprofile=overalls.coverprofile -service semaphore -repotoken $COVERALLS_TOKEN

or codecov.io

mv overalls.coverprofile coverage.txt
export CODECOV_TOKEN=###
bash <(curl -s https://codecov.io/bash)

note:

goveralls and codecov currently do not calculate coverage the same way as go tool cover (see here and here).

overalls (and go test) by default will not calculate coverage "across" packages. E.g. if a test in package A covers code in package B, overalls will not count it. You may or may not want this, depending on whether you're more concerned about unit test coverage or integration test coverage. To enable it, add the -coverpkg flag: overalls -project=github.com/go-playground/overalls -covermode=count -debug -- -coverpkg=./...

$ overalls -help

usage: overalls -project=[path] -covermode[mode] OPTIONS -- TESTOPTIONS

overalls recursively traverses your projects directory structure
running 'go test -covermode=count -coverprofile=profile.coverprofile'
in each directory with go test files, concatenates them into one
coverprofile in your root directory named 'overalls.coverprofile'

OPTIONS
  -project
	Your project path as an absolute path or relative to the '$GOPATH/src' directory
	example: -project=github.com/go-playground/overalls

  -covermode
    Mode to run when testing files.
    default:count

OPTIONAL

  -ignore
    A comma separated list of directory names to ignore, relative to project path.
    example: -ignore=[.git,.hiddentdir...]
    default: '.git'

  -debug
    A flag indicating whether to print debug messages.
    example: -debug
    default:false

  -concurrency
    Limit the number of packages being processed at one time.
    The minimum value must be 2 or more when set.
    example: -concurrency=5
    default: unlimited

TESTOPTIONS

Any flags after -- will be passed as-is to go test. For example:

overalls -project=$PROJECT -debug -- -race -v

Will call go test -race -v under the hood in addition to the -coverprofile commands.

View on Github

9 - Roveralls: Recursive coverage testing tool.

roveralls runs coverage tests on a package and all its sub-packages. The coverage profile is output as a single file called 'roveralls.coverprofile' for use by tools such as goveralls.

This tool was inspired by github.com/go-playground/overalls written by Dean Karn, but I found it difficult to test and brittle so I decided to rewrite it from scratch. Thanks for the inspiration Dean.

Usage

At its simplest, to test the current package and sub-packages and create a roveralls.coverprofile file in the directory that you run the command:

$ roveralls

To see the help for the command:

$ roveralls -help

    roveralls runs coverage tests on a package and all its sub-packages.  The
    coverage profile is output as a single file called 'roveralls.coverprofile'
    for use by tools such as goveralls.

    Usage of roveralls:
      -covermode count,set,atomic
          Mode to run when testing files: count,set,atomic (default "count")
      -help
          Display this help
      -ignore dir1,dir2,...
          Comma separated list of directory names to ignore: dir1,dir2,... (default ".git,vendor")
      -short
          Tell long-running tests to shorten their run time
      -v	Verbose output

View Output in a Web Browser

To view the code coverage for your package in a browser:

$ go tool cover -html=roveralls.coverprofile

View on Github

Thank you for following this article.

Related videos:

5 Best CI/CD Tools for 2022

#go #golang #continuous #integration 

NIntegration.jl: Multidimensional Numerical Integration in Pure Julia

NIntegration.jl 

This is a library intended to provide multidimensional numerical integration routines in pure Julia.

Status

For the time being this library can only perform integrals in three dimensions.

TODO

  •  Add rules for other dimensions
  •  Make sure it works properly with complex valued functions
  •  Parallelize
  •  Improve the error estimates (the Cuba library and consequently Cuba.jl seem to calculate tighter errors)

Installation

NIntegration.jl should work on Julia 1.0 and later versions and can be installed from a Julia session by running

julia> using Pkg
julia> Pkg.add(PackageSpec(url = "https://github.com/pabloferz/NIntegration.jl.git"))

Usage

Once installed, run

using NIntegration

To integrate a function f(x, y, z) on the hyperrectangle defined by xmin and xmax, just call

nintegrate(
    f::Function, xmin::NTuple{N}, xmax::NTuple{N};
    reltol = 1e-6, abstol = eps(), maxevals = 1000000
)

The above returns a tuple (I, E, n, R) of the calculated integral I, the estimated error E, the number of integrand evaluations n, and a list R of the subregions in which the integration domain was subdivided.

If you need to evaluate multiple functions (f₁, f₂, ...) on the same integration domain, you can evaluate the function f with more "features" and use its subregions list to estimate the integral for the rest of the functions in the list, e.g.

(I, E, n, R) = nintegrate(f, xmin, xmax)
I₁ = nintegrate(f₁, R)

Technical Algorithms and References

The integration algorithm is based on the one described in:

  • J. Berntsen, T. O. Espelid, and A. Genz, "An Adaptive Algorithm for the Approximate Calculation of Multiple Integrals," ACM Trans. Math. Soft., 17 (4), 437-451 (1991).

Acknowledgments

The author expresses his gratitude to Professor Alan Genz for some useful pointers.

This work was financially supported by CONACYT through grant 354884.

Download Details:

Author: Pabloferz
Source Code: https://github.com/pabloferz/NIntegration.jl 
License: MIT license

#julia #integration 

Cuba.jl: Library for Multidimensional Numerical integration

Cuba.jl

Introduction

Cuba.jl is a library for multidimensional numerical integration with different algorithms in Julia.

This is just a Julia wrapper around the C Cuba library, version 4.2, by Thomas Hahn. All the credit goes to him for the underlying functions; blame me for any problems with the Julia interface. Feel free to report bugs and make suggestions at https://github.com/giordano/Cuba.jl/issues.

All algorithms provided by Cuba library are supported in Cuba.jl:

  • vegas (type: Monte Carlo; variance reduction with importance sampling)
  • suave (type: Monte Carlo; variance reduction with globally adaptive subdivision + importance sampling)
  • divonne (type: Monte Carlo or deterministic; variance reduction with stratified sampling, aided by methods from numerical optimization)
  • cuhre (type: deterministic; variance reduction with globally adaptive subdivision)

Integration is performed on the n-dimensional unit hypercube [0, 1]^n. For more details on the algorithms see the manual included in Cuba library and available in deps/usr/share/cuba.pdf after successful installation of Cuba.jl.

Cuba.jl is available for GNU/Linux, FreeBSD, Mac OS, and Windows (i686 and x86_64 architectures).

Installation

The latest version of Cuba.jl is available for Julia 1.3 and later versions, and can be installed with Julia's built-in package manager. In a Julia session, after entering the package manager mode with ], run the command

pkg> update
pkg> add Cuba

Older versions are also available for Julia 0.4-1.2.

Usage

After installing the package, run

using Cuba

or put this command into your Julia script.

Cuba.jl provides the following functions to integrate:

vegas(integrand, ndim, ncomp[; keywords...])
suave(integrand, ndim, ncomp[; keywords...])
divonne(integrand, ndim, ncomp[; keywords...])
cuhre(integrand, ndim, ncomp[; keywords...])

These functions wrap the 64-bit integer functions provided by the Cuba library.

The only mandatory argument is:

  • integrand: the function to be integrated

Optional positional arguments are:

  • ndim: the number of dimensions of the integration domain. Defaults to 1 in vegas and suave, and to 2 in divonne and cuhre. Note: ndim must be at least 2 with the last two methods.
  • ncomp: the number of components of the integrand. Defaults to 1

ndim and ncomp arguments must appear in this order, so you cannot omit ndim but not ncomp. integrand should be a function integrand(x, f) taking two arguments:

  • the input vector x of length ndim
  • the output vector f of length ncomp, used to set the value of each component of the integrand at point x

Anonymous functions are also allowed as integrand. For those familiar with the Cubature.jl package, this is the same syntax used for integrating vector-valued functions.

For example, the integral

∫_0^1 cos(x) dx = sin(1) = 0.8414709848078965

can be computed with one of the following commands

julia> vegas((x, f) -> f[1] = cos(x[1]))
Component:
 1: 0.8414910005259609 ± 5.2708169787733e-5 (prob.: 0.028607201257039333)
Integrand evaluations: 13500
Number of subregions:  0
Note: The desired accuracy was reached

julia> suave((x, f) -> f[1] = cos(x[1]))
Component:
 1: 0.8411523690658836 ± 8.357995611133613e-5 (prob.: 1.0)
Integrand evaluations: 22000
Number of subregions:  22
Note: The desired accuracy was reached

julia> divonne((x, f) -> f[1] = cos(x[1]))
Component:
 1: 0.841468071955942 ± 5.3955070531551656e-5 (prob.: 0.0)
Integrand evaluations: 1686
Number of subregions:  14
Note: The desired accuracy was reached

julia> cuhre((x, f) -> f[1] = cos(x[1]))
Component:
 1: 0.8414709848078966 ± 2.2204460420128823e-16 (prob.: 3.443539937576958e-5)
Integrand evaluations: 195
Number of subregions:  2
Note: The desired accuracy was reached

The integrating functions vegas, suave, divonne, and cuhre return an Integral object whose fields are

integral    :: Vector{Float64}
error       :: Vector{Float64}
probability :: Vector{Float64}
neval       :: Int64
fail        :: Int32
nregions    :: Int32

The first three fields are vectors of length ncomp; the last three are scalars. The Integral object can also be iterated over like a tuple. In particular, if you assign the output of an integration function to a variable named result, you can access the value of the i-th component of the integral with result[1][i] or result.integral[i], and the associated error with result[2][i] or result.error[i]. The details of the other quantities can be found in the Cuba manual.

All other arguments listed in Cuba documentation can be passed as optional keywords.

Documentation

A more detailed manual of Cuba.jl, with many complete examples, is available at https://giordano.github.io/Cuba.jl/stable/.

Related projects

There are other Julia packages for multidimensional numerical integration:

Download Details:

Author: Giordano
Source Code: https://github.com/giordano/Cuba.jl 
License: View license

#julia #integration #math 

Nat Grady

ShinyTree: Shiny integration with The jsTree Library

shinyTree

The shinyTree package enables Shiny application developers to use the jsTree library in their applications.

shiny tree screenshot

Installation

You can install the latest development version of the code using the devtools R package.

# Install devtools, if you haven't already.
install.packages("devtools")

library(devtools)
install_github("shinyTree/shinyTree")

Getting Started

01-simple (Live Demo)

library(shiny)
runApp(system.file("examples/01-simple", package = "shinyTree"))

A simple example to demonstrate the usage of the shinyTree package.

02-attributes (Live Demo)

library(shiny)
runApp(system.file("examples/02-attributes", package = "shinyTree"))

Manage properties of your tree by adding attributes to your list when rendering.

03-checkbox (Live Demo)

library(shiny)
runApp(system.file("examples/03-checkbox", package = "shinyTree"))

Use checkboxes to allow users to more easily manage which nodes are selected.

04-selected (Live Demo)

library(shiny)
runApp(system.file("examples/04-selected", package = "shinyTree"))

An example demonstrating how to set an input to the value of the currently selected node in the tree.

05-structure (Live Demo)

library(shiny)
runApp(system.file("examples/05-structure", package = "shinyTree"))

Demonstrates the low-level usage of a shinyTree as an input in which all attributes describing the state of the tree can be read.

06-search (Live Demo)

library(shiny)
runApp(system.file("examples/06-search", package = "shinyTree"))

An example showing the use of the search plugin to allow users to more easily navigate the nodes in your tree.

07-drag-and-drop (Live Demo)

library(shiny)
runApp(system.file("examples/07-drag-and-drop", package = "shinyTree"))

An example demonstrating the use of the drag-and-drop feature which allows the user to reorder the nodes in the tree.

08-class (Live Demo)

library(shiny)
runApp(system.file("examples/08-class", package = "shinyTree"))

An example demonstrating the use of the ability to style nodes using custom classes.

09-themes

library(shiny)
runApp(system.file("examples/09-themes", package = "shinyTree"))

An example demonstrating the use of built-in tree themes.

10-node-ids

library(shiny)
runApp(system.file("examples/10-node-ids", package = "shinyTree"))

An example demonstrating the ability to label and return node identifiers and classes.

11-tree-update

library(shiny)
runApp(system.file("examples/11-tree-update", package = "shinyTree"))

An example demonstrating the ability to update a tree with a new tree model. This was broken in the original version as the tree was destroyed upon initialization.

12-types

library(shiny)
runApp(system.file("examples/12-types", package = "shinyTree"))

An example demonstrating node types with custom icons.

13-icons

library(shiny)
runApp(system.file("examples/13-icons", package = "shinyTree"))

An example demonstrating various ways to use icons on nodes.

14-files

library(shiny)
runApp(system.file("examples/14-files", package = "shinyTree"))

Demonstrates how to create a file browser tree.

15-data

library(shiny)
runApp(system.file("examples/15-data", package = "shinyTree"))

Demonstrates how to attach metadata to a node and retrieve it.

16-async

library(shiny)
runApp(system.file("examples/16-async", package = "shinyTree"))

Demonstrates how to render a tree asynchronously.

17-contextmenu

library(shiny)
runApp(system.file("examples/17-contextmenu", package = "shinyTree"))

Demonstrates how to enable the contextmenu.

18-modules

library(shiny)
runApp(system.file("examples/18-modules/app.R", package="shinyTree"))
runApp(system.file("examples/18-modules/app_types.R", package="shinyTree"))

Demonstrates how to use shinyTree with shiny modules.

19-data.tree

library(shiny)
runApp(system.file("examples/19-data.tree", package = "shinyTree"))

Demonstrates how to pass a data.tree to shinyTree.

20-api

library(shiny)
runApp(system.file("examples/20-api", package = "shinyTree"))

An example demonstrating how to extend the operations on the tree to the rest of the jsTree's core functionality.

21-options

library(shiny)
runApp(system.file("examples/21-options/app_setState_refresh.R", package="shinyTree"))

Demonstrates how to fine-tune shinyTree's behaviour with options. Specifically: When internal jstree code calls set_state or refresh, a callback is made so that the shiny server is notified and observe and observeEvents for the tree are fired. This can be useful if the developer would like observe and observeEvents to run after using updateTree. (By default, updateTree does not run observe or observeEvent because it is assumed that the shiny application knows that the tree is being changed already.)

23-file-icons

library(shiny)
library(shinyTree)
runApp(system.file("examples/23-file-icons", package = "shinyTree"))

An example demonstrating how to create a file tree with individual icons.

Known Bugs

See the Issues page for information on outstanding issues.

Download Details:

Author: shinyTree
Source Code: https://github.com/shinyTree/shinyTree 
License: View license

#r #tree #integration 


Cubature.jl: The Cubature Module for Julia

The Cubature module for Julia

This module provides one- and multi-dimensional adaptive integration routines for the Julia language, including support for vector-valued integrands and facilitation of parallel evaluation of integrands, based on the Cubature Package by Steven G. Johnson.

See also the HCubature package for a pure-Julia implementation of h-adaptive cubature using the same algorithm (which is therefore much more flexible in the types that it can integrate).

h-adaptive versus p-adaptive integration

Adaptive integration works by evaluating the integrand at more and more points until the integrand converges to a specified tolerance (with the error estimated by comparing integral estimates with different numbers of points). The Cubature module implements two schemes for this adaptation: h-adaptivity (routines hquadrature, hcubature, hquadrature_v, and hcubature_v) and p-adaptivity (routines pquadrature, pcubature, pquadrature_v, and pcubature_v). The h- and p-adaptive routines accept the same parameters, so you can use them interchangeably, but they have very different convergence characteristics.

h-adaptive integration works by recursively subdividing the integration domain into smaller and smaller regions, applying the same fixed-order (fixed number of points) integration rule within each sub-region and subdividing a region if its error estimate is too large. (Technically, we use a Gauss-Kronrod rule in 1d and a Genz-Malik rule in higher dimensions.) This is well-suited for functions that have localized sharp features (peaks, kinks, etcetera) in a portion of the domain, because it will adaptively add more points in this region while using a coarser set of points elsewhere. The h-adaptive routines should be your default choice if you know very little about the function you are integrating.

p-adaptive integration works by repeatedly doubling the number of points in the same domain, fitting to higher and higher degree polynomials (in a stable way) until convergence is achieved to the specified tolerance. (Technically, we use Clenshaw-Curtis quadrature rules.) This is best-suited for integrating smooth functions (infinitely differentiable, ideally analytic) in low dimensions (ideally 1 or 2), especially when high accuracy is required.

One technical difference that is sometimes important for functions with singularities at the edges of the integration domain: our h-adaptive algorithm only evaluates the integrand at the interior of the domain (never at the edges), whereas our p-adaptive algorithm also evaluates the integrand at the edges.

(The names "h-adaptive" and "p-adaptive" refer to the fact that the size of the subdomains is often denoted h while the degree of the polynomial fitting is often called p.)

Usage

Before using any of the routines below (and after installing, see above), you should include using Cubature in your code to import the functions from the Cubature module.

One-dimensional integrals of real-valued integrands

The simplest case is to integrate a single real-valued integrand f(x) from xmin to xmax, in which case you can call (similar to Julia's built-in quadgk routine):

(val,err) = hquadrature(f::Function, xmin::Real, xmax::Real;
                        reltol=1e-8, abstol=0, maxevals=0)

for h-adaptive integration, or pquadrature (with the same arguments) for p-adaptive integration. The return value is a tuple of val (the estimated integral) and err (the estimated absolute error in val, usually a conservative upper bound). The required arguments are:

f is the integrand, a function f(x::Float64) that accepts a real argument (in the integration domain) and returns a real value.

xmin and xmax are the boundaries of the integration domain. (That is, f is integrated from xmin to xmax.) They must be finite; to compute integrals over infinite or semi-infinite domains, you can use a change of variables.

There are also the following optional keyword arguments:

reltol is the required relative error tolerance: the adaptive integration will terminate when err ≤ reltol*|val|; the default is 1e-8.

The optional argument abstol is a required absolute error tolerance: the adaptive integration will terminate when err ≤ abstol. More precisely, the integration will terminate when either the relative- or the absolute-error tolerance is met. abstol defaults to 0, which means that it is ignored, but it can be useful to specify an absolute error tolerance for integrands that may integrate to zero (or nearly zero) because of large cancellations, in which case the problem is ill-conditioned and a small relative error tolerance may be unachievable.

The optional argument maxevals specifies a (rough) maximum number of function evaluations: the integration will be terminated (and the current estimates returned) if this number is exceeded. The default maxevals is 0, in which case maxevals is ignored (no maximum).

Here is an example that integrates f(x) = x^3 from 0 to 1, printing the x coordinates that are evaluated:

hquadrature(x -> begin println(x); x^3; end, 0,1)

and returning (0.25,2.7755575615628914e-15), which is the correct answer 0.25. If we instead integrate from -1 to 1, the function may never exit: the exact integral is zero, and it is nearly impossible to satisfy the default reltol bound in floating-point arithmetic. In that case, you have to specify an abstol as explained above:

hquadrature(x -> begin println(x); x^3; end, -1,1, abstol=1e-8)

in which case it quickly returns.

Multi-dimensional integrals of real-valued integrands

The next simplest case is to integrate a single real-valued integrand f(x) over a multidimensional box, with each coordinate x[i] integrated from xmin[i] to xmax[i].

(val,err) = hcubature(f::Function, xmin, xmax;
                      reltol=1e-8, abstol=0, maxevals=0)

for h-adaptive integration, or pcubature (with the same arguments) for p-adaptive integration. The return value is a tuple of val (the estimated integral) and err (the estimated absolute error in val, usually a conservative upper bound). The arguments are:

f is the integrand, a function f(x::Vector{Float64}) that accepts a vector x (in the integration domain) and returns a real value.

xmin and xmax are arrays or tuples (or any iterable container) specifying the boundaries xmin[i] and xmax[i] of the integration domain in each coordinate. They must have length(xmin) == length(xmax). (As above, the components must be finite, but you can treat infinite domains via a change of variables).

The optional keyword arguments reltol, abstol, and maxevals specify termination criteria as for hquadrature above.

Here is the same 1d example as above, integrating f(x) = x^3 from 0 to 1 while printing the x coordinates that are evaluated:

hcubature(x -> begin println(x[1]); x[1]^3; end, 0,1)

which again returns the correct integral 0.25. The only difference from before is that the argument x of our integrand is now an array, so we must use x[1] to access its value. If we have multiple coordinates, we use x[1], x[2], etcetera, as in this example integrating f(x,y) = x^3 y in the unit box [0,1]x[0,1] (the exact integral is 0.125):

hcubature(x -> begin println(x[1],",",x[2]); x[1]^3*x[2]; end, [0,0],[1,1])

Integrals of vector-valued integrands

In many applications, one wishes to compute integrals of several different integrands over the same domain. Of course, you could simply call hquadrature or hcubature multiple times, once for each integrand. However, in cases where the integrands are closely related functions, it is sometimes much more efficient to compute them together for a given point x than computing them separately. For example, if you have a complex-valued integrand, you could compute two separate integrals of the real and imaginary parts, but it is often more efficient and convenient to compute the real and imaginary parts at the same time.

The Cubature module supports this situation by allowing you to integrate a vector-valued integrand, computing fdim real integrals at once for any given dimension fdim (the dimension of the integrand, which is independent of the dimensionality of the integration domain). This is achieved by calling one of:

(val,err) = hquadrature(fdim::Integer, f::Function, xmin, xmax;
                        reltol=1e-8, abstol=0, maxevals=0,
                        error_norm = Cubature.INDIVIDUAL)
(val,err) = hcubature(fdim::Integer, f::Function, xmin, xmax;
                      reltol=1e-8, abstol=0, maxevals=0,
                      error_norm = Cubature.INDIVIDUAL)

for h-adaptive integration, or pquadrature/pcubature (with the same arguments) for p-adaptive integration. The return value is a tuple of two vectors of length fdim: val (the estimated integrals val[i]) and err (the estimated absolute errors err[i] in val[i]). The arguments are:

fdim the dimension (number of components) of the integrand, i.e. the number of real-valued integrals to perform simultaneously

f, the integrand. This is a function f(x, v) of two arguments: the point x in the integration domain (a Float64 for hquadrature and a Vector{Float64} for hcubature), and the vector v::Vector{Float64} of length fdim which is used to output the integrand values. That is, the function f should set v[i] to the value of the i-th integrand upon return. (The return value of f is ignored.) Note: the contents of v must be overwritten in-place by f. If you are not setting v[i] individually, you should do v[:] = ... and not v = ....

xmin and xmax specify the boundaries of the integration domain, as for hquadrature and hcubature of scalar integrands above.

The optional keyword arguments reltol, abstol, and maxevals specify termination criteria as for hquadrature above.

The optional keyword argument error_norm specifies how the convergence criteria for the different integrands are combined. That is, given a vector val of integral estimates and a vector err of error estimates, how do we decide whether to stop? error_norm should be one of the following constants:

Cubature.INDIVIDUAL, the default. This terminates the integration when all of the integrals, taken individually, converge. That is, it checks err[i] ≤ reltol*|val[i]| or err[i] ≤ abstol, and only stops when one of these is true for all i.

Cubature.PAIRED. This is like Cubature.INDIVIDUAL, but applies the convergence criteria to consecutive pairs of integrands, as if these integrands were real and imaginary parts of complex numbers. (This is mainly useful for integrating complex functions in cases where you only care about error in the complex plane as opposed to error in the real and imaginary parts taken individually.)

Cubature.L1, Cubature.L2, or Cubature.LINF. These terminate the integration when |err| ≤ reltol*|val| or |err| ≤ abstol, where |...| denotes a norm applied to the whole vector of errors or integrals. In particular, the L1 norm (sum of absolute values), the L2 norm (the root-mean-square value), or the L-infinity norm (the maximum absolute value), respectively. These are useful if you only care about the error in the vector of integrals taken as a whole in some norm, rather than the relative error in the components taken individually (which could be large if some of the components integrate almost to zero). We provide three different norms for completeness, but probably the choice of norm doesn't matter too much; pick Cubature.L1 if you aren't sure.

Here is an example, similar to above, which integrates a vector of three integrands (x, x^2, x^3) from 0 to 1:

hquadrature(3, (x,v) -> v[:] = x.^(1:3), 0, 1)

returning ([0.5, 0.333333, 0.25],[5.55112e-15, 3.70074e-15, 2.77556e-15]), which are of course the correct integrals.
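To illustrate the error_norm keyword from above, here is a hedged sketch (assuming the Cubature package is installed) of the same three integrands, but converging in the L1 norm of the whole error vector rather than component-by-component:

```julia
using Cubature

# Integrate (x, x^2, x^3) over [0, 1], terminating when the L1 norm of the
# error vector meets the tolerance, rather than each component individually.
(val, err) = hquadrature(3, (x, v) -> v[:] = x .^ (1:3), 0, 1;
                         error_norm = Cubature.L1)
```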

Parallelizing the integrand evaluation

These numerical integration algorithms actually call your integrand function for batches of points at a time, not just point-by-point. It is useful to expose this information for parallelization: your code may be able to evaluate the integrand in parallel for multiple points.

This is provided by a "vectorized" interface to the Cubature module: functions hquadrature_v, pquadrature_v, hcubature_v, and pcubature_v, which have exactly the same arguments as the functions described in the previous sections, except that the integrand function f must accept different arguments.

In particular, for the _v integration routines, the integrand must be a function f(x,v) where x is an array of n points to evaluate and v is an array in which to store the values of the integrands at those points. n is determined at runtime and varies between calls to f. The shape of the arrays depends upon which routine is called:

For hquadrature_v and pquadrature_v with real-valued integrands (no fdim argument), x and v are both 1d Float64 arrays of length n of the points (input) and values (output), respectively.

For hcubature_v and pcubature_v with real-valued integrands (no fdim argument) in d integration dimensions, x is a 2d Float64 array of size d×n holding the points x[:,i] at which to evaluate the integrand, and v is a 1d Float64 array of length n in which to store the resulting integrand values.

For hquadrature_v and pquadrature_v with vector-valued integrands (an fdim argument), x is a 1d Float64 array of length n of points at which to evaluate the integrands, and v is a 2d Float64 array of size fdim×n in which to store the values v[:,i] at these points.

For hcubature_v and pcubature_v with vector-valued integrands (an fdim argument) in d integration dimensions, x is a 2d Float64 array of length d×n of points x[:,i] at which to evaluate the integrands, and v is a 2d Float64 array of size fdim×n in which to store the values v[:,i] at these points.
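The simplest of the four shapes above can be sketched as follows (assuming the Cubature package is installed); here x and v are both length-n Float64 vectors, so all n points could in principle be evaluated in one parallel batch:

```julia
using Cubature

# Vectorized 1d scalar integrand: fill v with cos at every requested point x.
(val, err) = hquadrature_v((x, v) -> (v .= cos.(x)), 0, 1)
```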

Technical Algorithms and References

The h-adaptive integration routines are based on those described in:

  • A. C. Genz and A. A. Malik, "An adaptive algorithm for numeric integration over an N-dimensional rectangular region," J. Comput. Appl. Math., vol. 6 (no. 4), 295-302 (1980).
  • J. Berntsen, T. O. Espelid, and A. Genz, "An adaptive algorithm for the approximate calculation of multiple integrals," ACM Trans. Math. Soft., vol. 17 (no. 4), 437-451 (1991).

which we implemented in a C library, the Cubature Package, that is called from Julia.

Note that we do not use any of the original DCUHRE code by Genz, which is not under a free/open-source license. Our code is based in part on code borrowed from the HIntLib numeric-integration library by Rudolf Schürer and from code for Gauss-Kronrod quadrature (for 1d integrals) from the GNU Scientific Library, both of which are free software under the GNU GPL. (Another free-software multi-dimensional integration library, unrelated to our code here but also implementing the Genz-Malik algorithm among other techniques, is Cuba.)

The hcubature_v technique is adapted from I. Gladwell, "Vectorization of one dimensional quadrature codes," pp. 230--238 in Numerical Integration. Recent Developments, Software and Applications, G. Fairweather and P. M. Keast, eds., NATO ASI Series C203, Dordrecht (1987), as described in J. M. Bull and T. L. Freeman, "Parallel Globally Adaptive Algorithms for Multi-dimensional Integration," http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.42.6638 (1994).

The p-adaptive integration algorithm is simply a tensor product of nested Clenshaw-Curtis quadrature rules for power-of-two sizes, using a pre-computed table of points and weights up to order 2^20.

Download Details:

Author: JuliaMath
Source Code: https://github.com/JuliaMath/Cubature.jl 
License: View license

#julia #integration #math 


HCubature.jl: Pure-Julia Multidimensional H-adaptive integration

HCubature

The HCubature module is a pure-Julia implementation of multidimensional "h-adaptive" integration. That is, given an n-dimensional integral

n-dimensional integral

then hcubature(f, a, b) computes the integral, adaptively subdividing the integration volume into smaller and smaller pieces until convergence is achieved to the desired tolerance (specified by the optional rtol and atol keyword arguments, described in more detail below).

Because hcubature is written purely in Julia, the integrand f(x) can return any vector-like object (technically, any type supporting +, -, multiplication by real numbers, and norm: a Banach space). You can integrate real, complex, and matrix-valued integrands, for example.

Usage

Assuming you've installed the HCubature package (via Pkg.add) and loaded it with using HCubature, you can then use it by calling the hcubature function:

hcubature

hcubature(f, a, b; norm=norm, rtol=sqrt(eps), atol=0, maxevals=typemax(Int), initdiv=1)

This computes the n-dimensional integral of f(x), where n == length(a) == length(b), over the hypercube whose corners are given by the vectors (or tuples) a and b. That is, dimension x[i] is integrated from a[i] to b[i]. The return value of hcubature is a tuple (I, E) of the estimated integral I and an estimated error E.

f should be a function f(x) that takes an n-dimensional vector x and returns the integrand at x. The integrand can be any type that supports +, -, multiplication by real numbers, and the norm function. For example, the integrand can be real or complex numbers, vectors, matrices, etcetera. (For performance, the StaticArrays package is recommended for use with vector/matrix-valued integrands.)

The integrand f(x) will always be passed an SVector{n,T}, where SVector is an efficient vector type defined in the StaticArrays package and T is a floating-point type determined by promoting the coordinates of the endpoints a and b to a floating-point type. (Your integrand f should be type-stable: it should always return a value of the same type, given this type of x.)

The integrand will never be evaluated exactly at the boundaries of the integration volume. (So, for example, it is possible to have an integrand that blows up at the boundaries, as long as the integral is finite, though such singularities will slow convergence.)

The integration volume is adaptively subdivided, using a cubature rule due to Genz and Malik (1980), until the estimated error E satisfies E ≤ max(rtol*norm(I), atol), i.e. rtol and atol are the relative and absolute tolerances requested, respectively. It also stops if the number of f evaluations exceeds maxevals. If neither atol nor rtol are specified, the default rtol is the square root of the precision eps(T) of the coordinate type T described above. Initially, the volume is divided into initdiv segments along each dimension.

The error is estimated by norm(I - I′), where I′ is an alternative estimated integral (via an "embedded" lower-order cubature rule.) By default, the norm function used (for both this and the convergence test above) is norm, but you can pass an alternative norm by the norm keyword argument. (This is especially useful when f returns a vector of integrands with different scalings.)
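The usage described above can be sketched as follows (assuming HCubature is installed); the integrand and tolerance here are illustrative choices, not from the package docs:

```julia
using HCubature

# ∫∫ cos(x)·cos(y) dx dy over [0, π/2] × [0, π/2]; the exact value is 1.
I, E = hcubature(x -> cos(x[1]) * cos(x[2]), (0.0, 0.0), (π/2, π/2); rtol = 1e-8)
```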

hquadrature

hquadrature(f, a, b; norm=norm, rtol=sqrt(eps), atol=0, maxevals=typemax(Int), initdiv=1)

Compute the (1d) integral of f(x) from a to b. The return value of hquadrature is a tuple (I, E) of the estimated integral I and an estimated error E.

The other parameters are the same as hcubature (above). hquadrature is just a convenience wrapper around hcubature so that you can work with scalar x, a, and b, rather than 1-component vectors.

Alternatively, for 1d integrals you can import the QuadGK module and call the quadgk function, which provides additional flexibility e.g. in choosing the order of the quadrature rule. (QuadGK is used internally anyway by HCubature to compute the quadrature rule.)

Algorithm

The algorithm of hcubature is based on the one described in:

Download Details:

Author: JuliaMath
Source Code: https://github.com/JuliaMath/HCubature.jl 
License: View license

#julia #integration #math 


QuadGK.jl: Adaptive 1d Numerical Gauss–Kronrod integration In Julia

QuadGK.jl

This package provides support for one-dimensional numerical integration in Julia using adaptive Gauss-Kronrod quadrature. The code was originally part of Base Julia. It supports integration of arbitrary numeric types, including arbitrary precision (BigFloat), and even integration of arbitrary normed vector spaces (e.g. matrix-valued integrands).

The package provides three functions: quadgk, gauss, and kronrod. quadgk performs the integration, gauss computes Gaussian quadrature points and weights for integrating over the interval [a, b], and kronrod computes Kronrod points, weights, and embedded Gaussian quadrature weights for integrating over [-1, 1]. Typical usage looks like:

using QuadGK
integral, err = quadgk(x -> exp(-x^2), 0, 1, rtol=1e-8)

which computes the integral of exp(–x²) from x=0 to x=1 to a relative tolerance of 10⁻⁸, and returns the approximate integral = 0.746824132812427 and error estimate err = 7.887024366937112e-13 (which is actually smaller than the requested tolerance: convergence was very rapid because the integrand is smooth).

For more information, see the documentation.

In-place operations for array-valued integrands

For integrands whose values are small arrays whose length is known at compile-time, it is usually most efficient to modify your integrand to return an SVector from the StaticArrays.jl package.

However, for integrands that return large or variable-length arrays, we also provide a function quadgk!(f!, result, a,b...) in order to exploit in-place operations where possible. The result argument is used to store the estimated integral I in-place, and the integrand function is now of the form f!(r, x) and should write f(x) in-place into the result array r.
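A minimal sketch of the quadgk! form described above (assuming QuadGK is installed); the two-component integrand is an illustrative choice:

```julia
using QuadGK

# Preallocate the result array; the integrand writes into r in-place.
result = zeros(2)
integral, err = quadgk!((r, x) -> (r[1] = sin(x); r[2] = cos(x)), result, 0, pi)
```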

Gaussian quadrature and arbitrary weight functions

If you are computing many similar integrals of smooth functions, you may not need an adaptive integration — with a little experimentation, you may be able to decide on an appropriate number N of integration points in advance, and re-use this for all of your integrals. In this case you can use x, w = gauss(N, a, b) to find the quadrature points x and weights w, so that sum(f.(x) .* w) is an N-point approximation to ∫f(x)dx from a to b.
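For example, a precomputed rule can be applied to any smooth integrand on the same interval (a sketch, assuming QuadGK is installed):

```julia
using QuadGK

# Precompute a 10-point Gauss rule on [0, 1] and reuse it for many integrands.
x, w = gauss(10, 0, 1)
sum(x .^ 3 .* w)   # ≈ 0.25, the exact value of ∫₀¹ x³ dx
```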

For computing many integrands of similar functions with singularities, x, w = gauss(W, N, a, b) function allows you to pass a weight function W(x) as the first argument, so that sum(f.(x) .* w) is an N-point approximation to ∫W(x)f(x)dx from a to b. In this way, you can put all of the singularities etcetera into W and precompute an accurate quadrature rule as long as the remaining f(x) terms are smooth. For example,

using QuadGK
x, w = gauss(x -> exp(-x) / sqrt(x), 10, 0, -log(1e-10), rtol=1e-9)

computes the points and weights for performing ∫exp(-x)f(x)/√x dx integrals from 0 to -log(1e-10) ≈ 23, so that there is a 1/√x singularity in the integrand at x=0 and a rapid decay for increasing x. (The gauss function currently does not support infinite integration intervals, but for a rapidly decaying weight function you can approximate an infinite interval to any desired accuracy by a sufficiently broad interval, with a tradeoff in computational expense.) For example, with f(x) = sin(x), the exact answer is 0.570370556005742…. Using the points and weights above with sum(sin.(x) .* w), we obtain 0.5703706212868831, which is correct to 6–7 digits using only 10 f(x) evaluations. Obtaining similar accuracy for the same integral from quadgk requires nearly 300 function evaluations. However, the gauss function itself computes many (2N) numerical integrals of your weight function (multiplied by polynomials), so this is only more efficient if your f(x) is very expensive or if you need to compute a large number of integrals with the same W.

See the gauss documentation for more information. See also our example using a weight function interpolated from tabulated data.

Similar packages

The FastGaussQuadrature.jl package provides non-adaptive Gaussian quadrature with a variety of built-in weight functions; it is a good choice if you need to go to very high orders N, e.g. to integrate rapidly oscillating functions, or if you use weight functions that incorporate some standard singularity in your integrand. QuadGK, on the other hand, keeps the order N of the quadrature rule fixed and improves accuracy by subdividing the integration domain, which can be better if fine resolution is required only in a part of your domain (e.g. if your integrand has a sharp peak or singularity somewhere that is not known in advance).

For multidimensional integration, see the HCubature.jl, Cubature.jl, and Cuba.jl packages.


Download Details:

Author: JuliaMath
Source Code: https://github.com/JuliaMath/QuadGK.jl 
License: MIT license

#julia #integration #math 


Taylorintegration.jl: ODE Integration using Taylor's Method, in Julia

TaylorIntegration.jl

ODE integration using Taylor's method in Julia.

Installation

TaylorIntegration.jl is a registered package, and is simply installed by running

pkg> add TaylorIntegration

Supporting and citing

This package is developed as part of academic research. If you would like to help supporting it, please star the repository as such metrics may help us secure funding. If you use this software, we would be grateful if you could cite our work as follows (Bibtex entry can be found here):

J.A. Pérez-Hernández and L. Benet
TaylorIntegration.jl: Taylor Integration in Julia
https://github.com/PerezHz/TaylorIntegration.jl
DOI:[10.5281/zenodo.2562352](https://doi.org/10.5281/zenodo.2562352)

Examples
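A minimal sketch of the package's taylorinteg entry point (assuming TaylorIntegration is installed; the equation and parameter values are illustrative choices, not from the package docs):

```julia
using TaylorIntegration

# Integrate ẋ = -x with x(0) = 1 from t = 0 to t = 10,
# using a 20th-order Taylor expansion and absolute tolerance 1e-20.
f(x, p, t) = -x
tv, xv = taylorinteg(f, 1.0, 0.0, 10.0, 20, 1e-20)
```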

Acknowledgments

We acknowledge financial support from DGAPA-PAPIIT grants IG-100616 and IG-100819.


Authors

  • Jorge A. Pérez, Instituto de Ciencias Físicas, Universidad Nacional Autónoma de México (UNAM)
  • Luis Benet, Instituto de Ciencias Físicas, Universidad Nacional Autónoma de México (UNAM)

Comments, suggestions, and improvements are welcome and appreciated.

Download Details:

Author: PerezHz
Source Code: https://github.com/PerezHz/TaylorIntegration.jl 
License: View license

#julia #differentiators #integration 

Hermann Frami

Serverless plugin for Kumologica

Serverless plugin for Kumologica  

Serverless plugin that allows deployment of a Kumologica flow into an AWS account.

Installation

New Kumologica flow: using template

  1. Use the Kumologica serverless template

Create a new Kumologica project with a hello-world flow using the serverless template:

sls create --template-url https://github.com/KumologicaHQ/serverless-templates/tree/master/helloworld-api --path helloworld-api

This will create a new directory, helloworld-api, with the following files:

  • hello-world-flow.json
  • package.json
  • serverless.yml
  2. Install kumologica-serverless-plugin
sls plugin install --name kumologica-serverless-plugin

The flow is now ready to edit with Kumologica Designer and to use with Serverless.

Existing Kumologica flow: changes to serverless.yml

  1. Add the plugin to serverless.yml
plugins:
  - kumologica-serverless-plugin
  2. Install kumologica-serverless-plugin
sls plugin install --name kumologica-serverless-plugin
  3. Update functions

Replace the function name with the flow file name (without the .json extension). For example, for the flow demo-flow.json, the functions declaration will look like:


functions:
  demo-flow: # name of your flow file (without .json extension)
 

Development

Download Kumologica Designer to edit flows, implement business logic, and write unit tests.

Kumologica Designer Screenshot

This is the only tool you will need to build serverless integrations to run on your cloud.

How it Works

Kumologica executes flows on the AWS Lambda Node.js runtime. Artifacts such as the Lambda source file and package.json are generated by kumologica-serverless-plugin. The only file required is the Kumologica JSON flow file.

A Kumologica flow may interact with several AWS services. In that case, the correct permissions must be added to the policies attached to the Lambda's role. The Kumologica serverless plugin will introspect the flow, create a policy with all required permissions, and attach that policy to the Lambda role generated by Serverless. This feature can be disabled if required.

Usage

IAM Policy

An IAM policy with all required permissions is added to the Lambda role by default. To disable policy creation, add the custom property inferIamPolicies to the serverless.yml file and set it to false:

custom:
  kumologica:
    inferIamPolicies: false # true by default

The plugin can create the policy only if the resources used by the flow are defined as:

  • static string values provided inside flow properties
  • environment variables referenced in the flow using the env.{name} expression

Resources may also be referenced from the input message or calculated using variables. In that case the exact value is unknown at deploy time and the policy cannot be created. In this scenario:

  • the policy must be defined within the serverless.yml file, or the ARN of a policy must be provided for the flow/function
  • the inferIamPolicies custom parameter must be set to false

Test cases

A Kumologica flow is internally divided into two sections: main and test. The test section contains test cases, which are not needed to run the flow correctly in AWS Lambda. To remove test-related nodes from the flow during deployment, add the excludeTest custom property in serverless.yml and set it to true; the kumologica-serverless plugin will then remove test nodes during deployment:

custom:
  kumologica:
    excludeTest: true      # false by default

Examples

Most Basic example

Below is a serverless.yml file that will automatically update the role's policies. In this scenario the flow has ARNs entered as string values in its flow properties.

service: hello-world

provider:
  name: aws
  runtime: nodejs12.x

functions:
  demo-flow: # name of your flow file (without .json extension)
    events:
      - http:
          path: hello
          method: get

plugins:
  - kumologica-serverless-plugin

Use of Lambda's Environment variables

The example below shows a flow that references ARNs via the Lambda's environment variables (here, the ARN of the DynamoDB table the flow uses). This gives greater flexibility, allowing the same flow to be deployed into multiple accounts or configurations without changing the flow.

The kumologica-serverless-plugin will add the specific actions the flow performs on the resource arn:aws:dynamodb:ap-southeast-2:{account}:table/contacts to the Lambda's role during deployment.

service: hello-world

provider:
  name: aws
  runtime: nodejs12.x

functions:
  demo-flow: # name of your flow file (without .json extension)
    environment:
      dynamodbArn: arn:aws:dynamodb:${self:provider.region}:{accountId}:table/contacts
    events:
      - http:
          path: hello
          method: get

plugins:
  - kumologica-serverless-plugin

Explicit IAM Role statements

The example below relates to a flow whose resource ARN comes from the input message or is calculated at run time. In such a case the ARN is not known at deploy time, which requires disabling inferIamPolicies.

Additionally, the example sets excludeTest, so all test cases added to the test section of the flow will be removed during deployment.

service: hello-world

provider:
  name: aws
  runtime: nodejs12.x
  iamRoleStatements:
    - Effect: "Allow"
      Action:
      - dynamodb:Query
      - dynamodb:Scan
      Resource: "arn:aws:dynamodb:${self:provider.region}:{accountId}:table/contacts"

functions:
  demo-flow: # name of your flow file (without .json extension)
    events:
      - http:
          path: hello
          method: get

custom:
  kumologica:
    inferIamPolicies: false # true by default
    excludeTest: true       # false by default

plugins:
  - kumologica-serverless-plugin

Dependencies

Download Details:

Author: KumologicaHQ
Source Code: https://github.com/KumologicaHQ/kumologica-serverless-plugin 
License: MIT license

#serverless #plugin #lambda #integration 

Serverless plugin for Kumologica
Aketch Rachel

1638039600

Overview Of integration Testing For Beginners

👉Below, we'll take a closer look at integration testing, why it's important, best practices for integration testing, integration testing tools, and more.

⭐️ You can see more at the link at the end of the article. Thank you for your interest in the blog; if you find it interesting, please give it a like, comment, and share it with everyone. Thanks! ❤️

#integration 

Overview Of integration Testing For Beginners

Dart/Flutter Package for Nordigen EU PSD2 AISP API Integration

A null-safe Dart/Flutter package for Nordigen EU PSD2 AISP banking API integration, with relevant data models, proper encapsulation of exposed parameters, and succinct documentation.

For more information about the API view Nordigen's Account Information API documentation.

Usage Steps

Go through the Nordigen's Account Information API documentation.

Register and get the API Access Token from https://ob.nordigen.com.

Initialise the NordigenAccountInfoAPI Class with the token received from Step 2.

Call any of the NordigenAccountInfoAPI Class methods to directly interact with Nordigen Server's endpoints while having the internal requests and relevant headers abstracted, based on your need.

Utilize any of the available Data Classes to modularly and sufficiently store and process the information during any of the API usage steps. The Data Classes can be constructed fromMap() and easily converted back toMap(), as well as serialized, at any point.

Available Methods

NordigenAccountInfoAPI({required String accessToken}) (Class constructor)

Call it with the accessToken parameter, which is the access token received from https://ob.nordigen.com/, to access API features.

Analogous to Step 1 of Account Information API documentation.

getASPSPsForCountry({required String countryCode})

Gets the ASPSPs (Banks) in the Country represented by the given two-letter countryCode (ISO 3166).

Analogous to Step 2 of Account Information API documentation.

createEndUserAgreement({required String endUserID, required String aspspID, int maxHistoricalDays = 90})

Creates an End User Agreement for the given endUserID, aspspID and for the given maxHistoricalDays (default 90 days) and returns the resulting EndUserAgreementModel.

Analogous to Step 3 of Account Information API documentation.

createRequisition({required String endUserID, required String redirect, required String reference, List<String> agreements = const <String>[]})

Creates a Requisition for the given endUserID and returns the resulting RequisitionModel. reference is an additional layer of unique ID and should match Step 3 if done. redirect is the link where the end user will be redirected after finishing authentication in the ASPSP. agreements is an array of ID(s) from Step 3, or an empty array if that step was skipped.

Analogous to Step 4.1 of Account Information API documentation.

fetchRedirectLinkForRequisition({required String aspspID, required String requisitionID})

Provides a redirect link for the Requisition represented by the requisitionID passed in, for the ASPSP represented by the given aspspID.

Analogous to Step 4.2 of Account Information API documentation.

getRequisitionFromID({required String requisitionID})

Gets the Requisition identified by requisitionID.

getEndUserAccountIDs({required String requisitionID})

Gets the Account IDs of the User for the Requisition identified by requisitionID.

Analogous to Step 5 of Account Information API documentation.

getAccountDetails({required String accountID})

Gets the Details of the Bank Account identified by accountID. Account Model follows schema in https://nordigen.com/en/docs/account-information/overview/parameters-and-responses/.

Analogous to Step 6 of Account Information API documentation for Account Details.

getAccountTransactions({required String accountID})

Gets the Transactions of the Bank Account identified by accountID as a Map<String, List<TransactionData>> with keys 'booked' and 'pending', representing lists of booked and pending transactions, respectively.

Analogous to Step 6 of Account Information API documentation for Account Transactions.
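Fetching transactions can be sketched as follows (a hedged illustration: it assumes an accountID already obtained via getEndUserAccountIDs in Step 5, and uses only the method and model names documented above):

```dart
import 'package:nordigen_integration/nordigen_integration.dart';

/// Step 6 sketch: print the number of booked and pending transactions.
Future<void> printTransactionCounts(
    NordigenAccountInfoAPI apiInterface, String accountID) async {
  // The result map holds transaction lists under 'booked' and 'pending'.
  final Map<String, List<TransactionData>> transactions =
      await apiInterface.getAccountTransactions(accountID: accountID);

  print('Booked: ${transactions['booked']?.length ?? 0}');
  print('Pending: ${transactions['pending']?.length ?? 0}');
}
```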

getAccountBalances({required String accountID})

Gets the Balances of the Bank Account identified by accountID as dynamic. This will be deprecated later, when the documentation provides an example of potentially fetched Balance Data.

Analogous to Step 6 of Account Information API documentation for Account Balances.

There are also various other methods for implementing POST, GET and DELETE requests across various endpoints in the Nordigen Server, which are self-explanatory:

getASPSPUsingID({required String aspspID})

getEndUserAgreementUsingID({required String endUserAgreementID})

getEndUserAgreementsUsingUserID({required String endUserID})

deleteEndUserAgreementUsingID({required String endUserAgreementID})

getRequisitions({int limit = 100, int offset = 0,})

getRequisitionUsingID({required String requisitionID})

deleteRequisitionUsingID({required String requisitionID})

getAccountMetaData({required String accountID})

Available Data Classes

Refer https://nordigen.com/en/docs/account-information/overview/parameters-and-responses/ for most of the Data Schema and the mentioned URLs in the special cases.

ASPSP({required String id, required String name, String bic = '', int transactionTotalDays = 90, required List<String> countries})

ASPSP (Bank) Data Model for Nordigen. Contains the id of the ASPSP, its name, bic, transactionTotalDays and the countries associated with the ASPSP.

EndUserAgreementModel({required String id, String created, String? accepted, int maxHistoricalDays = 90, int accessValidForDays = 90, required String endUserID, required String aspspID}):

End-user Agreement Data Model for Nordigen. Contains the id of the Agreement, its created time string, accepted, the number of maxHistoricalDays and accessValidForDays, and the endUserID and aspspID relevant to the Agreement.

RequisitionModel({required String id, required String redirectURL, required String reference, String status = '', List<String> agreements = const <String>[], List<String> accounts = const <String>[], required String endUserID}):

Requisition Data Model for Nordigen. Contains the id of the Requisition, its status, end-user agreements, the redirectURL to which it should redirect, reference ID if any, accounts associated, and the associated endUserID.

AccountMetaData({required String id, String created, String? lastAccessed, String iban, String aspspIdentifier, String status = ''})

Account meta-data model for Nordigen. Contains the id of the Bank Account, its created and lastAccessed date and time, iban, status and the aspspIdentifier identifying its ASPSP. Refer to https://nordigen.com/en/docs/account-information/overview/parameters-and-responses/

AccountDetails({String? id, String? iban, String? msisdn, required String currency, String? ownerName, String? name, String? displayName, String? product, String? cashAccountType, String? status, String? bic, String? linkedAccounts, String? usage, String? details, List<Balance>? balances, List<String>? links}):

Bank Account Details Model for Nordigen. Refer to https://nordigen.com/en/docs/account-information/output/accounts/ for full Data Schema.

TransactionData({required String id, String? debtorName, Map<String, dynamic>? debtorAccount, String? bankTransactionCode, String bookingDate = '', String valueDate = '', required String transactionAmount, String? remittanceInformationUnstructured = '', ...}):

Transaction Data Model for Nordigen. Refer to https://nordigen.com/en/docs/account-information/output/transactions/ for full Data Schema.

Balance({required AmountData balanceAmount, required String balanceType, bool? creditLimitIncluded, String? lastChangeDateTime, String? referenceDate, String? lastCommittedTransaction})

Balance Data Model for Nordigen. Contains balanceAmount of Transaction, its balanceType, whether its creditLimitIncluded, its lastChangeDateTime and referenceDate as String and the lastCommittedTransaction.

Refer to https://nordigen.com/en/docs/account-information/output/balance/ for full Data Schema and the available balance types.

AmountData({required String amount, required String currency})

It is a simple Class that holds the transaction amount and the currency type, both as required parameters.

Example Usage

import 'package:nordigen_integration/nordigen_integration.dart';

Future<void> main() async {
    /// Step 1
    final NordigenAccountInfoAPI apiInterface = NordigenAccountInfoAPI(
        accessToken: 'YOUR_TOKEN',
    );

    /// Step 2 and then selecting the first ASPSP
    final ASPSP firstBank =
        (await apiInterface.getASPSPsForCountry(countryCode: 'gb')).first;

    /// Step 4.1
    final RequisitionModel requisition = await apiInterface.createRequisition(
        endUserID: 'exampleEndUser',
        redirect: 'http://www.yourwebpage.com/',
        reference: 'exampleRef42069666',
    );

    /// Step 4.2
    final String redirectLink =
        await apiInterface.fetchRedirectLinkForRequisition(
        requisitionID: requisition.id,
        aspspID: firstBank.id,
    );

    /// Open and Validate [redirectLink] and proceed with other functionality.
    print(redirectLink);
}

Dependencies

http is used for making API calls to the Nordigen Server Endpoints with proper response and error handling.

Tests Screenshot

Nordigen EU PSD2 AISP Integration Tests Successful Screenshot

In case of any bugs, reach out to me at @Dhi13man or file an issue.

The first release of this package was sponsored by Cashtic. Show them some love! This package would not otherwise be possible.

Big thanks to contributors, including @stantemo and @c-louis. Contribution is welcome, and makes my day brighter!

Getting Started

This project is a starting point for a Dart package, a library module containing code that can be shared easily across multiple Flutter or Dart projects.

For help getting started with Flutter, view our online documentation, which offers tutorials, samples, guidance on mobile development, and a full API reference.

Use this package as a library

Depend on it

Run this command:

With Dart:

 $ dart pub add nordigen_integration

With Flutter:

 $ flutter pub add nordigen_integration

This will add a line like this to your package's pubspec.yaml (and run an implicit dart pub get):


dependencies:
  nordigen_integration: ^1.5.2

Alternatively, your editor might support dart pub get or flutter pub get. Check the docs for your editor to learn more.

Import it

Now in your Dart code, you can use:

import 'package:nordigen_integration/nordigen_integration.dart';

#integration #flutter #mobile-apps 

Dart/Flutter Package for Nordigen EU PSD2 AISP API Integration
Marisol Kuhic

1627153200

Flutter facebook login integrations for beginners

Facebook login makes signing in to an app easy: rather than creating a new user registration for every app, the user's Facebook ID is used to log in to different apps.

This is a best practice to keep credentials consistent and easy to manage. Facebook also provides privacy for user data and supplies all the required information, such as the username, email address, phone number, and profile picture if required.

The link below provides a complete tutorial for a beginner to integrate Facebook login in their app.
Code Link : http://www.androidcoding.in/2020/08/06/flutter-facebook-login/

#flutter #integration #beginner

Flutter facebook login integrations for beginners
Marisol Kuhic

1627066800

Flutter youtube integration | Youtube

Flutter YouTube video integration in your app is explained here. We can play videos with the help of a video ID or URL. YouTube is a popular video platform with very high usage, and it integrates well into apps.

Flutter code can be used on both Android and iOS, so we can play videos on either platform.

For more interesting tutorials, you may visit my blog via the link below.
Code Link : http://www.androidcoding.in/2020/07/23/flutter-youtube/
TextEditingController : https://youtu.be/HJRowyHurww
Easy way to learn Bloc Pattern : https://youtu.be/dj8TqRlSMGs
Http Implementation : https://youtu.be/ZdN4_eAw1gI
Google Maps : https://youtu.be/ryd5uUc6auU
Firebase Push Notification : https://youtu.be/JE2PtCi0-fw
Retrofit Network Call : https://youtu.be/OZF9mqKbi3k

#flutter #integration

Flutter youtube integration | Youtube