Desmond Gerber


Top 3 Open Source Audio Tools for Creators

I came across these helpful open source tools for sampling sounds and creating loops for music.

Finding good quality, open source audio samples can be a challenge. I've been getting increasingly into composition and creating music in my spare time, using the open source tool Ardour and the creator-focused distribution Ubuntu Studio. I've been looking for samples of specific sounds or loops to include.

I'm familiar with many tools to find images, but until recently, I hadn't come across a similar option for audio resources.

Open source sound samples

Finding specific sounds can be challenging if you can't record them yourself. Several resources are available, but not many provide sounds under an open source license.


Enter the incredible treasure trove that is Freesound, a collaborative database of Creative Commons licensed sounds where you can browse, download, and even contribute.

You can find pretty much anything on Freesound, from the sounds of a sleepy tour bus on the road to a door opening and closing or a ghostly shriek. While Freesound mainly focuses on sound samples, there are also some loops on the site. Many sounds are licensed under Creative Commons 0, so you can do whatever you like with them from a licensing perspective. However, that's not true for all of them, so check the license before you use anything, as you may need to credit the creator.

The site allows you to check out the sample rate, bit depth, and channels so you can be sure that the sample will work with your composition, and it has a built-in rating system and download count. A waveform display allows you to get some insight into the character of the sound sample before you preview it.

The search filters on Freesound are not as strong as those on some other sites. Sounds are sometimes grouped into packs of similar sounds, like this one for scary noises, which can help you quickly grab a bunch of similar sounds to play with. Sample quality varies, so you might need to clean up the audio on some samples. If you're feeling bored, there's even an option to select a random sound from the database (and trust me, some are very random!). Freesound also has a community forum where you can participate and learn from others.

NASA space sounds

If you are looking for some otherworldly sounds or want to snoop on the conversations between Earth and space, the NASA Space Sounds database might be a great place to look. It's fascinating to explore the recordings from the various missions and listen in on the communications back and forth, some of which are narrated. The collection spans many missions, from the Sounds of Mars recorded by the Perseverance rover to audio from the Apollo missions.

Sounds from the NASA site are released under the Creative Commons Public Domain Mark 1.0, meaning they are free of known restrictions under copyright law.

Loops for music creation

If your focus is more on creating music, you might be looking for loops: short recordings of music that you can alter and tweak in your own compositions.

There are all kinds of sample packs out there from commercial sources, but there are also a lot of royalty-free loops available on Looperman. With over 200,000 loops uploaded by musicians, DJs, producers, and creators, there's everything from electronic dance music and trap to classical. There are also over 12,000 a cappella and spoken-word loops, and it's a great resource for finding things like bass lines or drum beats. You need to have an account to download, and you must download tracks before you can upload anything.

Looperman resources are not Creative Commons, but the site defaults to a similar concept: "All samples and loops are free to use in commercial and noncommercial projects," according to the site license, but "you can NOT claim copyright of those loops." A cappella and vocal samples are in a separate category, so checking the specific terms for any loop you consider using is important.

Each loop tells you the beats per minute, key signature (where relevant), and the software it was created in. A waveform shows the character of the loop, which gives you a good idea of whether it is likely to work with your project. You can preview loops within the browser and leave comments for the creator. There is a lively community and many great resources to help you create your own loops.

Get creative

I hope this gives you some ideas of where to find audio resources for your next project, and I look forward to hearing what you have created!


#opensource #audio #tools 

Oral Brekke


Best 10 DevOps Tools You Must Know

“DevOps is not a goal, but a never-ending process of continual improvement.” – Jez Humble

Before I give you the complete list of the top 10 DevOps tools, you should know what DevOps actually is. Well, if you are thinking of DevOps as a tool, you are wrong! DevOps is not a tool or a piece of software; it's a culture that you can adopt for continuous improvement. It helps you bring your developer team and operations team onto the same page, allowing them to work together with ease.

Top 10 DevOps Tools For 2022:

  • Git
  • Jenkins
  • Selenium
  • Docker
  • Puppet
  • Chef
  • Ansible
  • Nagios
  • ELK Stack
  • Splunk

To keep yourself updated with the most in-demand and industry-wide used DevOps tools, you can check our DevOps Certification Training.

10. GIT: Don’t Worry Keep Updating

Git is a version control system that allows you to track changes in your files; using it, you can easily coordinate the work among your team.


  • Free, open-source tool
  • Feature-branch workflow
  • Allows distributed development
  • Supports pull requests
  • Enables faster release cycles
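A minimal sketch of that feature-branch workflow (the repository, file, and branch names here are illustrative, not from the article):

```shell
# Create a repository with an initial commit
git init demo && cd demo
git config user.email "dev@example.com" && git config user.name "Dev"
git commit --allow-empty -m "initial commit"

# Develop each feature on its own branch
git checkout -b feature/login
echo "login stub" > login.txt
git add login.txt
git commit -m "add login stub"

# Merge the finished feature back into the main line
git checkout -
git merge feature/login
```

In a team, you would push the feature branch and open a pull request instead of merging locally.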



Download: Click here to download GIT

09. JENKINS: A Tool from Developers for Developers

Jenkins is a continuous integration server written in Java. You can use it for testing and reporting changes in near real time. As a developer, it helps you find and solve bugs in your code rapidly and automate the testing of your builds.


  • Free, open-source tool
  • Integrates all your DevOps stages with the help of around 1,000 plugins
  • Scripts your pipeline, with one or more build jobs, into a single workflow
  • Easily start Jenkins with its WAR file
  • Provides multiple ways of communication: web-based GUI, CLI, and REST API
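As a sketch of the "single workflow" idea above, a minimal declarative Jenkinsfile (the stage names and shell commands are illustrative) chains build and test jobs into one pipeline:

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // Replace with your project's real build command
                sh 'make build'
            }
        }
        stage('Test') {
            steps {
                // A second build job, chained into the same workflow
                sh 'make test'
            }
        }
    }
}
```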



Click here to download Jenkins

Check out our DevOps Certification in Top Cities

  • India: DevOps PGP in Hyderabad, Bangalore, and Chennai
  • United States: DevOps Certification in Dallas, Charlotte, and NYC
  • Other Countries: DevOps Certification in Canada, Mississauga, and Toronto

08. SELENIUM: Automation Doesn’t Kill Bugs, People DO

Well, Selenium is a portable software testing framework for web applications. It provides you with an easy interface for developing automated tests.


  • Free, open-source tool
  • Create robust, browser-based regression automation suites and tests
  • Write test scripts in multiple languages like Java, Python, C#, Ruby, Perl, PHP, and JavaScript
  • Supports multiple platforms for testing, like iOS and Android
  • Easy to build a keyword-driven framework for a WebDriver



Click here to Download Selenium

07. DOCKER: Build, Ship & Run Your Software Anywhere

Docker is a lightweight tool that uses containers to package an application with all its requirements and dependencies before shipping the complete container as one package.


  • Use Docker containers with any language
  • Ship the container wherever you want, be it QA, your team, or even the cloud
  • Scale up to thousands of nodes
  • Update with zero downtime
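To illustrate the "package an application with all its dependencies" idea, here is a minimal Dockerfile sketch for a hypothetical Python app (the file names are assumptions):

```dockerfile
# Base image providing the language runtime
FROM python:3.11-slim
WORKDIR /app
# Bake the dependencies into the image
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
# The container then runs the same way wherever you ship it
CMD ["python", "app.py"]
```

Build it with docker build -t myapp . and the resulting image can be shipped to QA, a teammate, or the cloud unchanged.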



Click here to Download Docker

06. Puppet: Deploy, Configure and Manage Your Servers 

Puppet is an open-source configuration management tool used to automate the methods of inspecting, delivering, and operating your software across the entire lifecycle, independent of platform.


  • Based on master-slave architecture
  • Open-source tool
  • Long commercial track record
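To give a flavor of Puppet's declarative style, here is a minimal manifest sketch (the package and service names are illustrative) that keeps a web server installed and running:

```puppet
package { 'nginx':
  ensure => installed,
}

service { 'nginx':
  ensure  => running,
  enable  => true,
  # Start the service only after the package is installed
  require => Package['nginx'],
}
```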



Click here to Download Puppet

05. Chef: Manage your data, attributes, roles, environments, and cookbooks

Chef is a powerful configuration management and automation tool with which you can transform infrastructure into code.


  • Another open-source configuration management tool
  • Supports multiple platforms like AIX, RHEL/CentOS, and FreeBSD
  • Easy to integrate with cloud-based platforms
  • Active, smart, and fast-growing community support
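In Chef, "infrastructure as code" takes the form of recipes written in a Ruby DSL; here is a minimal, illustrative recipe (package and service names are assumptions):

```ruby
# Chef recipe: install a web server and keep it running
package 'nginx'

service 'nginx' do
  action [:enable, :start]
end
```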



Click here to download Chef

04. Ansible: Automate Your Apps and IT Infrastructure

Ansible is an open-source tool that provides one of the simplest ways to automate your apps and IT infrastructure, such as network configuration, cloud deployments, and the creation of development environments.


  • Open-source configuration management tool
  • Supports push configuration
  • Based on master-slave architecture
  • Completely agentless, with simple syntax written in YAML
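Ansible's push configuration is written as YAML playbooks; a minimal sketch (the host group and package names are illustrative):

```yaml
- name: Configure web servers
  hosts: webservers
  become: true
  tasks:
    - name: Install nginx
      apt:
        name: nginx
        state: present
    - name: Start and enable nginx
      service:
        name: nginx
        state: started
        enabled: true
```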



Click here to download Ansible

03. Nagios: Keep Track of Your Infrastructure

Nagios is a powerful monitoring system which enables you and your organization to identify and resolve IT infrastructure problems before they affect critical business processes.


  • Monitors and troubleshoots server performance issues
  • Helps you plan infrastructure upgrades before outdated systems cause failures
  • Automatically fixes problems when they are detected
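Nagios checks are declared as objects in its configuration files; here is a minimal, illustrative service definition (the host name is an assumption):

```
define service {
    use                   generic-service   ; inherit defaults from a template
    host_name             webserver01
    service_description   HTTP
    check_command         check_http        ; alert when the HTTP check fails
}
```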



Click here to download Nagios

02. ELK Stack: Don't Miss Out on Important Insights from Your Logs

ELK is a combination of three powerful open-source tools (Elasticsearch, Logstash, and Kibana) used to collect insights from your logs or data.


  • Open-source tools with multiple plugins
  • Lightweight and easy to deploy
  • Performs searches in near real time
  • Collects and analyzes logs from sources ranging from an Excel file to a database or server
  • Active and supportive discussion forum
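The log-collection side of the stack is typically driven by a Logstash pipeline config; a minimal sketch (the file path and index name are illustrative):

```
input {
  file { path => "/var/log/app/*.log" }
}
filter {
  grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "app-logs"
  }
}
```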


Download: Click here to download Elasticsearch, Logstash and Kibana

01. Splunk: Keep Track of Your Logs

Splunk is a software platform for searching, analyzing, and visualizing machine-generated data and logs gathered from the websites, applications, sensors, and devices that make up your IT infrastructure and business.


  • Store, search, analyze, and visualize machine-generated data
  • Ingest data in multiple file formats
  • Create knowledge objects for operational intelligence
  • Monitor business metrics to get log insights
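Searches in Splunk are written in its Search Processing Language (SPL); an illustrative query (the index and field names are assumptions) that counts server errors per host:

```
index=web_logs status>=500
| stats count AS errors BY host
| sort -errors
```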



Click here to download Splunk

I hope my blog on “Top 10 DevOps Tools” was relevant to you. If you need more information on DevOps, you can browse other free and exciting content from Edureka via the links mentioned in this blog. If you're in search of a career that's both demanding and rewarding, then whether you've worked in DevOps or are new to the field, the DevOps Professional Certificate Program is precisely what you need to learn how to be successful. From the most basic to the most advanced methods, we cover everything. You can also check out our DevOps Engineer Course, which will help you gain expertise in DevOps tools.


#devops #tools 


RPA tools List and Comparison

Robotic Process Automation (RPA) is a new-age technology in today's market used to automate mundane tasks, and to do this we need RPA tools. To upskill your career in RPA, UiPath training or Automation Anywhere certification training is a must, as it can help you land a job as an RPA developer. In this article on RPA tools, the following topics will be covered:

  • Types Of RPA Tools
  • List Of RPA Tools
  • RPA Tools Comparison: UiPath vs Blue Prism vs Automation Anywhere
  • Checklist For Selecting RPA Tools

Before I give you a detailed list of the top RPA tools in today’s market, let me tell you the different types of RPA tools available.

Types Of RPA Tools

All RPA tools can be segregated into four types, each built as an extension of the previous generation of bots. Refer to the following descriptions.

  • Excel automation and macros: Simple automation solutions to automate basic processes.
  • Programmable solution bots: Interact with other systems based on the client's requirements/inputs.
  • Self-learning tools: Analyze human actions and perform the same on various platforms.
  • Cognitive automation bots: Self-learning bots that can handle unstructured data and make decisions based on complex, unstructured input.

Now that you have an idea of the types of tools available, let us look at the top RPA tools in today's market.

List of RPA Tools

  • Another Monday: 30-day free trial; smart process tracking and drag-and-drop cognitive automation; partners include KPMG and PwC.
  • AntWorks: collaboration tool, provides bot cloning; partners include CyberArk and Vincix.
  • Automation Edge: 30-day free trial; drag-and-drop technology and cognitive features; partners include Wipro and Keyvrox.
  • Automation Anywhere: provides a free community edition; drag-and-drop, with AI-augmented RPA; partners include Ernst and Young and Cognizant.
  • BluePrism: 30-day free trial; drag-and-drop features, used for enterprise automation; partners include Accenture and Capgemini.
  • Contextor (acquired by SAP): priced per block of 1,000 transactions per month, or pay-per-use; provides cloud deployment and a visual designer to create bots; partners include Worldline and IBM.
  • Jacada: desktop automation; partners include DirecTV.
  • Kofax: provides a free trial; unified design environment with built-in analytics; partners include BMW and Dominos.
  • Kryon Systems: provides a free trial; strong analytics and deployment efficiency; partners include PwC and EY.
  • NICE Systems: 30-day free trial; partners include Accenture and Cognizant.
  • Pega: 30-day free trial; visual designer studio; partners include Accenture and Capgemini.
  • Redwood Software: 30-day free trial; partners include Heineken and Airbus.
  • UiPath: free Community Edition; Studio license of $2,000 to $3,000 per year; drag-and-drop functionality with an easy-to-use visual designer; partners include Cognizant and Deloitte.
  • Visual Cron: 45-day free trial; priced per server; integration and task-scheduling tool; partners include Amazon and Apple.
  • WorkFusion: 30-day free trial; priced per process; drag-and-drop builder with machine learning capabilities; partners include Bank of America and PNC.

Well, as you can see above, each tool has its own strengths and weaknesses. But if you talk about the market leaders, it is the famous trio: Blue Prism, UiPath, and Automation Anywhere. Refer to the following comparison for the differences between these tools.

RPA Tools Comparison: UiPath vs Blue Prism vs Automation Anywhere

  • Free edition: UiPath has a Community Edition/free edition; Blue Prism has recently launched a free edition; Automation Anywhere recently launched a Community Edition.
  • Popularity: UiPath is the most popular of the three; Blue Prism is more popular than Automation Anywhere; Automation Anywhere is less popular than the others.
  • Programming knowledge: UiPath requires none; Blue Prism provides functionality that allows users to write code, but they can manage without it; Automation Anywhere requires none.
  • Certification: UiPath has free online training and certification programs; Blue Prism provides an official certification program; Automation Anywhere recently launched a $50 certification.
  • Automation scope: UiPath provides desktop, web, and Citrix automation; Blue Prism is designed for Citrix automation for BPO; Automation Anywhere is reasonable across all mediums.

So, now that I have explained the differences between the top tools, you can consider the following checklist to select the right tool for you.

Checklist for Selecting The Right Tool


  • Technology: Most organizations perform their day-to-day tasks outside the local desktop, using virtual machines or Citrix environments, so the tool should support any type of application and must be platform-independent.
  • Scalability: While selecting an RPA tool, consider how easily it can respond to client/business requirements and changes with high efficiency.
  • Security: Security is an important aspect in any field of technology. Since RPA tools are software, you need to consider a range of security measures when deploying bots in production.
  • Total Cost of Ownership: Includes the initial setup cost, maintenance cost, and ongoing vendor license fees. This is a very important parameter to consider while selecting a tool.
  • Ease of Use & Control: Any tool you choose must be user-friendly, to increase employee satisfaction and efficiency.
  • Vendor Experience: It is advisable to choose a vendor that serves companies similar to yours in both size and industry. This will help you improve the speed of implementation.
  • Maintenance & Support: The vendor must follow a support model to ensure that the required Service Level Agreements are met.

Now that you know the parameters to consider while selecting a tool, you should also have an understanding of when to choose which tool and which one suits your needs best.


Looking at the above tools, if you want to upskill your career in the field of RPA, then we at Edureka provide a structured curriculum on UiPath to help you master the tool. We are also one of the official training partners of Automation Anywhere and will provide you with the Enterprise Edition. If you wish to master these tools, check out our courses on UiPath and Automation Anywhere.


#rpa #tools #comparison 

Rupert Beatty


SwiftInfo: Extract and Analyze the Evolution of an iOS App's Code


Note: I don't intend to ship further updates to this tool.

SwiftInfo is a CLI tool that extracts, tracks and analyzes metrics that are useful for Swift apps. Besides the default tracking options that are shipped with the tool, you can also customize SwiftInfo to track pretty much anything that can be conveyed in a simple .swift script.

By default SwiftInfo will assume you're extracting info from a release build and send the final results to Slack, but it can be used to extract info from individual pull requests as well with the danger-SwiftInfo danger plugin.

Available Providers

  • 📦 IPASizeProvider: Size of the .ipa archive (not the App Store size!)
  • 📊 CodeCoverageProvider: Code coverage percentage
  • 👶 TargetCountProvider: Number of targets (dependencies)
  • 🎯 TestCountProvider: Sum of all test targets' test counts
  • ⚠️ WarningCountProvider: Number of warnings in a build
  • 🧙‍♂️ OBJCFileCountProvider: Number of OBJ-C files and headers (for mixed OBJ-C / Swift projects)
  • ⏰ LongestTestDurationProvider: The name and duration of the longest test
  • 🛏 TotalTestDurationProvider: Time it took to build and run all tests
  • 🖼 LargestAssetCatalogProvider: The name and size of the largest asset catalog
  • 🎨 TotalAssetCatalogsSizeProvider: The sum of the sizes of all asset catalogs
  • 💻 LinesOfCodeProvider: Executable lines of code
  • 🚚 ArchiveDurationProvider: Time it took to build and archive the app
  • 📷 LargestAssetProvider: The largest asset in the project (only considers files inside asset catalogs)

Each provider may have a specific set of requirements in order to work. Check their documentation to learn more.


SwiftInfo extracts information by analyzing the logs that your build system generates when you build and/or test your app. Because it requires these logs to work, SwiftInfo is meant to be used alongside a build automation tool like fastlane. The following topics describe how you can retrieve these logs and set up SwiftInfo itself.

We'll show how to get the logs first as you'll need them to configure SwiftInfo.

Note: This repository contains an example project. Check it out to see the tool in action -- just go to the example project folder and run make swiftinfo in your terminal.

Retrieving raw logs with fastlane

If you use fastlane, you can expose raw logs to SwiftInfo by adding the buildlog_path argument to scan (test logs) and gym (build logs). Here's a simple example of a fastlane lane that runs tests, submits an archive to TestFlight and runs SwiftInfo (make sure to edit the folder paths to what's being used by your project):

desc "Submits a new beta build and runs SwiftInfo"
lane :beta do
  # Run tests, copying the raw logs to the project folder
  scan(
    scheme: "MyScheme",
    buildlog_path: "./build/tests_log"
  )

  # Archive the app, copying the raw logs to the project folder and the .ipa to the /build folder
  gym(
    workspace: "MyApp.xcworkspace",
    scheme: "Release",
    output_directory: "build",
    buildlog_path: "./build/build_log"
  )

  # Send to TestFlight
  pilot(
    skip_waiting_for_build_processing: true
  )

  # Run the CocoaPods version of SwiftInfo
  sh("cd .. && Pods/SwiftInfo/bin/swiftinfo")

  # Commit and push SwiftInfo's output
  sh("git add ../SwiftInfo-output/SwiftInfoOutput.json")
  sh("git commit -m \"[ci skip] Updating SwiftInfo Output JSON\"")
  push_to_git_remote
end

Retrieving raw logs manually

An alternative that doesn't require fastlane is to manually run xcodebuild / xctest and pipe the output to a file. We don't recommend doing this in a real project, but it can be useful if you just want to test the tool without having to set up fastlane.

xcodebuild -workspace ./Example.xcworkspace -scheme Example 2>&1 | tee ./build/build_log/Example-Release.log

Configuring SwiftInfo

SwiftInfo itself is configured by creating an Infofile.swift file in your project's root. Here's an example with a detailed explanation:

import SwiftInfoCore

// Use `FileUtils` to configure the path of your logs. 
// If you're retrieving them with fastlane and don't know what the name of the log files are going to be, 
// just run it once to have it create them.

FileUtils.buildLogFilePath = "./build/build_log/MyApp-MyConfig.log"
FileUtils.testLogFilePath = "./build/tests_log/MyApp-MyConfig.log"

// Now, create a `SwiftInfo` instance by passing your project's information.

let projectInfo = ProjectInfo(xcodeproj: "MyApp.xcodeproj",
                              target: "MyTarget",
                              configuration: "MyConfig")

let api = SwiftInfo(projectInfo: projectInfo)

// Use SwiftInfo's `extract()` method to extract and append all the information you want into a single property.

let output = api.extract(IPASizeProvider.self) +
             api.extract(WarningCountProvider.self) +
             api.extract(TestCountProvider.self) +
             api.extract(TargetCountProvider.self, args: .init(mode: .complainOnRemovals)) +
             api.extract(CodeCoverageProvider.self, args: .init(targets: ["NetworkModule", "MyApp"])) +
             api.extract(LinesOfCodeProvider.self, args: .init(targets: ["NetworkModule", "MyApp"]))

// Lastly, process the output.

if isInPullRequestMode {
    // If called from danger-SwiftInfo, print the results to the pull request
    api.print(output: output)
} else {
    // If called manually, send the results to Slack...
    api.sendToSlack(output: output, webhookUrl: url)
    // ...and save the output to your repo so it serves as the basis for new comparisons.
    api.save(output: output)
}

Saving and visualizing the data

After successfully extracting data, you should call api.save(output: output) to have SwiftInfo add/update a JSON file in the {Infofile path}/SwiftInfo-output folder. It's important to add this file to version control after running the tool, as this is what SwiftInfo uses to compare new pieces of information.

You can then use SwiftInfo-Reader to transform this output into a more visual static HTML page.

Customizing Providers

To be able to support different types of projects, SwiftInfo provides customization options to some providers. See the documentation for each provider to see what it supports. If you wish to track something that's not handled by the default providers, you can also create your own providers. Click here to see how.

Customizing Runs

Any arguments you pass to SwiftInfo can be inspected inside your Infofile. This allows you to pass any custom information you want to the binary and use it to customize your runs.

For example, if you run SwiftInfo by calling swiftinfo --myCustomArgument, you can use ProcessInfo to check for its presence inside your Infofile.

if ProcessInfo.processInfo.arguments.contains("--myCustomArgument") {
    print("Yay, custom arguments!")

If the argument has a value, you can also fetch that value with UserDefaults.



CocoaPods

pod 'SwiftInfo'


Homebrew

To install SwiftInfo with Homebrew for the first time, simply run these commands:

brew tap rockbruno/SwiftInfo
brew install rockbruno/SwiftInfo/swiftinfo

To update to the newest Homebrew version of SwiftInfo when you have an old version already installed, run:

brew upgrade swiftinfo


Manually

Download the latest release and unzip the contents somewhere in your project's folder.

Swift Package Manager

SwiftPM is currently not supported due to the need to ship additional files with the binary, which SwiftPM does not support. We might find a solution for this, but for now there's no way to use the tool with it.

Download Details:

Author: Rockbruno
Source Code: 
License: MIT license

#swift #cli #ios #tools #xcode 

Nat Grady


r-lib/cli: Tools for Making Beautiful & Useful Command Line Interfaces


Helpers for Developing Command Line Interfaces

A suite of tools to build attractive command line interfaces (CLIs), from semantic elements: headers, lists, alerts, paragraphs, etc. Supports theming via a CSS-like language. It also contains a number of lower level CLI elements: rules, boxes, trees, and Unicode symbols with ASCII alternatives. It supports ANSI markup for terminal colors and font styles.


  • Build a CLI using semantic elements: headings, lists, alerts, paragraphs.
  • Theming via a CSS-like language.
  • Terminal colors and font styles.
  • All cli text can contain interpreted string literals, via the glue package.
  • Progress bars from R and C code.
  • Error and warning messages with rich text formatting.
  • Support for pluralized messages.
  • ANSI styled string manipulation.


Installation

Install the stable version from CRAN:

install.packages("cli")


Short tour

Some of the more commonly used cli elements and features.

Short alert messages

One-liner messages to inform or warn.

pkgs <- c("foo", "bar", "foobar")
cli_alert_success("Downloaded {length(pkgs)} packages.")

db_url <- ""
cli_alert_info("Reopened database {.url {db_url}}.")

cli_alert_warning("Cannot reach GitHub, using local database cache.")

cli_alert_danger("Failed to connect to database.")

cli_alert("A generic alert")


Headings

Three levels of headings.

cli_h1("Heading 1")

cli_h2("Heading 2")

cli_h3("Heading 3")


Lists

Ordered, unordered, and description lists, which can be nested.

fun <- function() {
  cli_ul()
  cli_li("Item 1")
  ulid <- cli_ul()
  cli_li("Subitem 1")
  cli_li("Subitem 2")
  cli_end(ulid)
  cli_li("Item 2")
  cli_end()
}
fun()

Themes

Theming via a CSS-like language.

fun <- function() {
  cli_div(theme = list(span.emph = list(color = "orange")))
  cli_text("This is very {.emph important}")
}
fun()
cli_text("Back to the {.emph previous theme}")

Command substitution

Automatic command substitution via the glue package.

size <- 123143123
dt <- 1.3454
cli_alert_info(
  "Downloaded {prettyunits::pretty_bytes(size)} in {prettyunits::pretty_sec(dt)}")


Pluralization

Pluralization support.

nfiles <- 3
ndirs <- 1
cli_alert_info("Found {nfiles} file{?s} and {ndirs} director{?y/ies}.")

Progress bars

clean <- function() {
  cli_progress_bar("Cleaning data", total = 100)
  for (i in 1:100) {
    Sys.sleep(5 / 100)
    cli_progress_update()
  }
  cli_progress_done()
}

See at and also in the installed package: help(package = "cli").

Code of Conduct

Please note that the cli project is released with a Contributor Code of Conduct. By contributing to this project, you agree to abide by its terms.

Download Details:

Author: r-lib
Source Code: 
License: Unknown and 2 other licenses found

#r #cli #tools #beautiful 

Lawrence Lesch


JSONHero-web: An Open-source, Beautiful JSON Explorer for the Web


Brought to you by API Hero

JSON Hero was created and is maintained by the team behind API Hero. API Hero makes it quick and easy to add popular APIs to your project using the frameworks you love, and scale without worry.


JSON Hero makes reading and understanding JSON files easy by giving you a clean and beautiful UI packed with extra features.

  • View JSON any way you'd like: Column View, Tree View, Editor View, and more.
  • Automatically infers the contents of strings and provides useful previews
  • Creates an inferred JSON Schema that could be used to validate your JSON
  • Quickly scan related values to check for edge cases
  • Search your JSON files (both keys and values)
  • Keyboard accessible
  • Easily sharable URLs with path support

JSON Hero Screenshot


Send to JSON Hero

Send your JSON to JSON Hero in a variety of ways

Head to and drag and drop a JSON file, or paste JSON or a JSON URL into the provided form

Include a Base64 encoded string of a JSON payload:

Include a JSON URL to the new endpoint:

Install the VS Code extension and open JSON from VS Code

Raycast user? Check out our extension here

Use the unofficial API:

  • Make a POST request to with the following JSON body:

Column view

Inspired by macOS Finder, Column View is a new way to browse a JSON document.

JSON Hero Column View

It has all the features you'd expect: Keyboard navigation, Path bar, history.

It also has a nifty feature that allows you to "hold" a selected descendant and travel up through the hierarchy, then move between siblings and view the different values found at that path. It's hard to describe, but here is an animation to help demonstrate:

Column View - Traverse with Context

As you can see, holding the Option key (or Alt on Windows) while moving to a parent keeps that part of the document selected and shows it in the context of its surrounding JSON. You can then traverse between items in an array and compare the values of the selection across deep hierarchy changes.

Editor view

View your entire JSON document in an editor, but keep the nice previews and related values you get from the sidebar as you move around the document:

Editor view

Tree view

Use a traditional tree view to traverse your JSON document, with collapsible sections and keyboard shortcuts. All while keeping the nice previews:

Tree view


Quickly open a search panel and fuzzy search your entire JSON file in milliseconds. It searches through key names, key paths, values, and even pretty-formatted values (e.g., searching for "Dec" will find datetime strings in the month of December).


Content Previews

JSON Hero automatically infers the content of strings and provides useful previews and properties of the value you've selected. It's "Show Don't Tell" for JSON:

Dates and Times

Preview colors

Image URLs

Preview colors

Website URLs

Preview websites

Tweet URLs

Preview tweets


Preview JSON


Preview colors

Related Values

Easily see all the related values across your entire JSON document for a specific field, including any undefined or null values.

Editor view

Bugs and Feature Requests

Have a bug or a feature request? Feel free to open a new issue.

You can also join our Discord channel to hang out and discuss anything you'd like.


To run locally, first clone the repo and install the dependencies:

git clone
cd jsonhero-web
npm install

Then, create a file at the root of the repo called .env and set the SESSION_SECRET value:
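The contents of that file are a single key-value line. One way to generate a random secret is sketched below (the use of openssl is an assumption — any sufficiently random string works):

```shell
# Create .env with a randomly generated SESSION_SECRET
# (openssl is just one convenient way to produce a random hex string)
echo "SESSION_SECRET=$(openssl rand -hex 16)" > .env
cat .env
```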


Then, run npm run build or npm run dev to build.

Now, run npm start and open your browser to http://localhost:8787

Download Details:

Author: Apihero-run
Source Code: 
License: Apache-2.0 license

#typescript #react #json #tools 

JSONHero-web: an Open-source, Beautiful JSON Explorer for The Web
Nat Grady


TSibble: Tidy Temporal Data Frames and Tools


The tsibble package provides a data infrastructure for tidy temporal data with wrangling tools. Adapting the tidy data principles, tsibble is a data- and model-oriented object. In tsibble:

  1. Index is a variable with inherent ordering from past to present.
  2. Key is a set of variables that define observational units over time.
  3. Each observation should be uniquely identified by index and key.
  4. Each observational unit should be measured at a common interval, if regularly spaced.


You can install the stable version from CRAN:

install.packages("tsibble")

You can install the development version from GitHub using:

# install.packages("remotes")
remotes::install_github("tidyverts/tsibble")

Get started

Coerce to a tsibble with as_tsibble()

To coerce a data frame to a tsibble, we need to declare the key and index. For example, in the weather data from the package nycflights13, the time_hour column containing the date-times should be declared as the index, and origin as the key. Other columns can be considered as measured variables.

weather <- nycflights13::weather %>% 
  select(origin, time_hour, temp, humid, precip)
weather_tsbl <- as_tsibble(weather, key = origin, index = time_hour)
weather_tsbl
#> # A tsibble: 26,115 x 5 [1h] <America/New_York>
#> # Key:       origin [3]
#>   origin time_hour            temp humid precip
#>   <chr>  <dttm>              <dbl> <dbl>  <dbl>
#> 1 EWR    2013-01-01 01:00:00  39.0  59.4      0
#> 2 EWR    2013-01-01 02:00:00  39.0  61.6      0
#> 3 EWR    2013-01-01 03:00:00  39.0  64.4      0
#> 4 EWR    2013-01-01 04:00:00  39.9  62.2      0
#> 5 EWR    2013-01-01 05:00:00  39.0  64.4      0
#> # … with 26,110 more rows

The key can consist of zero, one, or more variables. See package?tsibble and vignette("intro-tsibble") for details.

The interval is computed from index based on the representation, ranging from year to nanosecond, from numerics to ordered factors. The table below shows how tsibble interprets some common time formats.


A full list of index classes supported by tsibble can be found in package?tsibble.

fill_gaps() to turn implicit missing values into explicit missing values

Often there are implicit missing cases in time series. If the observations are made at regular time intervals, we can turn this implicit missingness into explicit missing values simply by using fill_gaps(), filling gaps in precipitation (precip) with 0 in the meantime. It is also quite common in time series analysis to replace NAs with the previous observation for each origin, which is easily done using fill() from tidyr.

full_weather <- weather_tsbl %>%
  fill_gaps(precip = 0) %>% 
  group_by_key() %>% 
  tidyr::fill(temp, humid, .direction = "down")
full_weather
#> # A tsibble: 26,190 x 5 [1h] <America/New_York>
#> # Key:       origin [3]
#> # Groups:    origin [3]
#>   origin time_hour            temp humid precip
#>   <chr>  <dttm>              <dbl> <dbl>  <dbl>
#> 1 EWR    2013-01-01 01:00:00  39.0  59.4      0
#> 2 EWR    2013-01-01 02:00:00  39.0  61.6      0
#> 3 EWR    2013-01-01 03:00:00  39.0  64.4      0
#> 4 EWR    2013-01-01 04:00:00  39.9  62.2      0
#> 5 EWR    2013-01-01 05:00:00  39.0  64.4      0
#> # … with 26,185 more rows

fill_gaps() also handles filling in time gaps by values or functions, and respects time zones for date-times. Want a quick overview of implicit missing values? Check out vignette("implicit-na").
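Beyond constants, gaps can be filled with the output of a function evaluated within each key. A small sketch (the choice of mean() here is purely illustrative):

```r
library(tsibble)

# Fill gaps in temp with the per-origin mean temperature
# instead of a constant value (illustrative choice of function)
weather_tsbl %>%
  group_by_key() %>%
  fill_gaps(temp = mean(temp, na.rm = TRUE))
```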

index_by() + summarise() to aggregate over calendar periods

index_by() is the counterpart of group_by() in a temporal context, but it only groups the index. In conjunction with index_by(), summarise() aggregates the variables of interest over time periods. index_by() goes hand in hand with the index functions, including as.Date(), yearweek(), yearmonth(), and yearquarter(), as well as other friends from lubridate. For example, it may be of interest to compute the average temperature and total precipitation per month, by applying yearmonth() to the index variable (referred to as .).

full_weather %>%
  group_by_key() %>%
  index_by(year_month = ~ yearmonth(.)) %>% # monthly aggregates
  summarise(
    avg_temp = mean(temp, na.rm = TRUE),
    ttl_precip = sum(precip, na.rm = TRUE)
  )
#> # A tsibble: 36 x 4 [1M]
#> # Key:       origin [3]
#>   origin year_month avg_temp ttl_precip
#>   <chr>       <mth>    <dbl>      <dbl>
#> 1 EWR      2013 Jan     35.6       3.53
#> 2 EWR      2013 Feb     34.2       3.83
#> 3 EWR      2013 Mar     40.1       3   
#> 4 EWR      2013 Apr     53.0       1.47
#> 5 EWR      2013 May     63.3       5.44
#> # … with 31 more rows

While collapsing rows (like summarise()), group_by() and index_by() will take care of updating the key and index respectively. This index_by() + summarise() combo can also help with regularising a tsibble of irregular time intervals.

Learn more about tsibble

An ecosystem, the tidyverts, is built around the tsibble object for tidy time series analysis.

  • The tsibbledata package curates a range of tsibble data examples to poke around the tsibble object.
  • The feasts package provides support for visualising the data and extracting time series features.
  • The fable package provides common forecasting methods for tsibble, such as ARIMA and ETS. The fabletools package, which fable is built upon, lays the modelling infrastructure to ease programming with tsibble.

Please note that this project is released with a Contributor Code of Conduct. By participating in this project you agree to abide by its terms.

Download Details:

Author: tidyverts
Source Code: 
License: GPL-3.0 license

#r #data #frames #tools 

TSibble: Tidy Temporal Data Frames and Tools

Ranking.jl: Tools for Ranking in Julia


Julia tools for ranking entities based on records of binary comparisons. Currently, we've implemented drafts of Elo, Bradley-Terry and TrueSkill.

Usage Example

All of the models we use expect a data matrix, D, in which each row represents a triple: ID of entity 1, ID of entity 2 and the outcome, which is 1.0 if 1 beat 2, 0.0 if 2 beat 1 and 0.5 if there was a tie. Let's create data now in which Player 1 beat Player 2 and also beat Player 3, then Player 2 and Player 3 played a match in which they came to a draw:

    n_players = 3

    D = [1 2 1.0;
         1 3 1.0;
         2 3 0.5;]

We can then fit Elo:

    using Ranking

    m1 = fit(Elo, D, n_players)

And then try Bradley-Terry(-Luce):

    m2 = fit(BradleyTerry, D, n_players)

Finally, let's try TrueSkill:

    m3 = fit(TrueSkill, D, n_players)

As you can see, Player 1 gets the highest score, whereas Players 2 and 3 get lower (and nearly equal) scores. Let's see what happens if we switch the data so that Player 2 definitively loses to Player 3:

    n_players = 3

    D = [1 2 1.0;
         1 3 1.0;
         2 3 0.0;]

    using Ranking

    m1 = fit(Elo, D, n_players)

    m2 = fit(BradleyTerry, D, n_players)

    m3 = fit(TrueSkill, D, n_players)

Here you can see that the order of scores now becomes Player 1, Player 3 and Player 2, which is just what we would expect.

All of these examples assume that you have a single group of players that compete against one another. This can be viewed as a unipartite graph.

Another common task in ranking comes from educational testing, where you have students completing questions that they either answer correctly (1) or incorrectly (0). In this case, we work with a bipartite graph. From the data perspective, what matters is that the first and second columns of our data matrix maintain completely separate indices:

    n_students = 2
    n_questions = 5

    D = [1 1 1.0;
         1 2 1.0;
         1 3 1.0;
         1 4 1.0;
         1 5 0.0;
         2 1 1.0;
         2 2 1.0;
         2 3 1.0;
         2 4 0.0;
         2 5 0.0;]

Given this data, we can fit the Rasch model, which is like Bradley-Terry, but for bipartite data:

    m = fit(Rasch, D, n_students, n_questions)

This produces separate estimates for all students and all questions, but puts them on a common scale. In reality, we could do the same thing with the Bradley-Terry model if we extended the indices to grow from 1 to n_students + n_questions. The Rasch model is simply more convenient when we would like to employ the "natural" ID assignment in which students and questions have independent ID counters.
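The index extension described above can be sketched as follows (a hedged illustration, not part of the package API): offset the question IDs so both columns share a single unipartite index space, then fit Bradley-Terry over n_students + n_questions entities:

```julia
    using Ranking

    # Shift question IDs into the range n_students+1 : n_students+n_questions
    # so that students and questions share one index space.
    D_uni = copy(D)
    D_uni[:, 2] .+= n_students

    m_bt = fit(BradleyTerry, D_uni, n_students + n_questions)
```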

Download Details:

Author: johnmyleswhite
Source Code: 
License: View license

#julia #tools 

Ranking.jl: Tools for Ranking in Julia

GitHub Action to Set Up PHP with Extensions, Php.ini Configuration

Setup PHP in GitHub Actions

Setup PHP with required extensions, php.ini configuration, code-coverage support and various tools like composer in GitHub Actions. This action gives you a cross-platform interface to set up the PHP environment you need to test your application. Refer to the Usage section and examples to see how to use this.

☁️ OS/Platform Support

Both GitHub-hosted and self-hosted runners are supported by setup-php on the following OS/Platforms.

GitHub-Hosted Runners

Virtual environment   | YAML workflow label            | Pre-installed PHP
Ubuntu 22.04          | ubuntu-22.04                   | PHP 8.1
Ubuntu 20.04          | ubuntu-latest or ubuntu-20.04  | PHP 7.4 to PHP 8.1
Ubuntu 18.04          | ubuntu-18.04                   | PHP 7.2 to PHP 8.1
Windows Server 2022   | windows-latest or windows-2022 | PHP 8.1
Windows Server 2019   | windows-2019                   | PHP 8.1
macOS Monterey 12.x   | macos-12                       | PHP 8.1
macOS Big Sur 11.x    | macos-latest or macos-11       | PHP 8.1
macOS Catalina 10.15  | macos-10.15                    | PHP 8.1

Self-Hosted Runners

Host OS/Virtual environment      | YAML workflow label
Ubuntu 22.04                     | self-hosted or Linux
Ubuntu 20.04                     | self-hosted or Linux
Ubuntu 18.04                     | self-hosted or Linux
Debian 11                        | self-hosted or Linux
Debian 10                        | self-hosted or Linux
Windows 7 and newer              | self-hosted or Windows
Windows Server 2012 R2 and newer | self-hosted or Windows
macOS Monterey 12.x x86_64/arm64 | self-hosted or macOS
macOS Big Sur 11.x x86_64/arm64  | self-hosted or macOS
macOS Catalina 10.15             | self-hosted or macOS
  • Refer to the self-hosted setup to use the action on self-hosted runners.
  • Operating systems based on the above Ubuntu and Debian versions are also supported on a best-effort basis.
  • If the requested PHP version is pre-installed, setup-php switches to it, otherwise it installs the PHP version.

🎉 PHP Support

On all supported OS/Platforms the following PHP versions are supported as per the runner.

  • PHP 5.3 to PHP 8.2 on GitHub-hosted runners.
  • PHP 5.6 to PHP 8.2 on self-hosted runners.
PHP Version | Stability | Release Support     | Runner Support
5.3         | Stable    | End of life         | GitHub-hosted
5.4         | Stable    | End of life         | GitHub-hosted
5.5         | Stable    | End of life         | GitHub-hosted
5.6         | Stable    | End of life         | GitHub-hosted, self-hosted
7.0         | Stable    | End of life         | GitHub-hosted, self-hosted
7.1         | Stable    | End of life         | GitHub-hosted, self-hosted
7.2         | Stable    | End of life         | GitHub-hosted, self-hosted
7.3         | Stable    | End of life         | GitHub-hosted, self-hosted
7.4         | Stable    | Security fixes only | GitHub-hosted, self-hosted
8.0         | Stable    | Active              | GitHub-hosted, self-hosted
8.1         | Stable    | Active              | GitHub-hosted, self-hosted
8.2         | Nightly   | In development      | GitHub-hosted, self-hosted


  • Specifying 8.2 in php-version input installs a nightly build of PHP 8.2.0-dev. See nightly build setup for more information.
  • To use JIT on PHP 8.0 and above, refer to the JIT configuration section.

➕ PHP Extension Support

PHP extensions can be set up using the extensions input. It accepts a string in csv-format.

  • On Ubuntu, extensions which are available as a package, on PECL, or from a git repository can be set up.
- name: Setup PHP with PECL extension
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    extensions: imagick, swoole

On Windows, extensions available on PECL which have the DLL binary can be set up.

On macOS, extensions available on PECL or a git repository can be set up.

On Ubuntu and macOS to compile and install an extension from a git repository follow this guide.

Extensions installed along with PHP, if specified, are enabled.

Specific versions of extensions available on PECL can be set up by suffixing the extension's name with the version. This is useful for installing old versions of extensions which support end of life PHP versions.

- name: Setup PHP with specific version of PECL extension
  uses: shivammathur/setup-php@v2
  with:
    php-version: '5.4'
    extensions: swoole-1.9.3
  • Extensions with pre-release versions available on PECL can be set up by suffixing the extension's name with its state, i.e. alpha, beta, devel or snapshot.
- name: Setup PHP with pre-release PECL extension
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    extensions: xdebug-beta

On Ubuntu and macOS to compile and install an extension from PECL with libraries or custom configuration follow this guide.

Shared extensions can be disabled by prefixing them with a :. All extensions depending on the specified extension will also be disabled.

- name: Setup PHP and disable opcache
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    extensions: :opcache
  • All shared extensions can be disabled by specifying none. When none is specified along with other extensions, it is hoisted to the start of the input, so all shared extensions are disabled first and then the rest of the extensions in the input are processed.

Note: This disables all core and third-party shared extensions and thus, can break some tools which need them. Required extensions are enabled again when the tools are set up on a best-effort basis. So it is recommended to add the extensions required for your tools after none in the extensions input to avoid any issues.

- name: Setup PHP without any shared extensions except mbstring
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    extensions: none, mbstring
  • Extension intl can be set up with specific ICU version for PHP 5.6 and above in Ubuntu workflows by suffixing intl with the ICU version. ICU 50.2 and newer versions are supported. Refer to ICU builds for the specific versions supported.
- name: Setup PHP with intl
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    extensions: intl-70.1

Extensions loaded by default after setup-php runs can be found on the wiki.

These extensions have custom support:

  • cubrid, pdo_cubrid and gearman on Ubuntu.
  • geos and event on Ubuntu and macOS.
  • blackfire, couchbase, ioncube, oci8, pdo_firebird, pdo_oci, pecl_http, phalcon3, phalcon4 and phalcon5 on all supported OS.

By default, extensions which cannot be added or disabled gracefully leave an error message in the logs; the execution is not interrupted. To change this behaviour, you can set the fail-fast flag to true.

- name: Setup PHP with fail-fast
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    extensions: oci8
    fail-fast: true

🔧 Tools Support

These tools can be set up globally using the tools input. It accepts a string in csv-format.

behat, blackfire, blackfire-player, churn, codeception, composer, composer-normalize, composer-prefetcher, composer-require-checker, composer-unused, cs2pr, deployer, flex, grpc_php_plugin, infection, parallel-lint, pecl, phan, phing, phinx, phive, php-config, php-cs-fixer, phpcbf, phpcpd, phpcs, phpdoc or phpDocumentor, phpize, phplint, phpmd, phpspec, phpstan, phpunit, phpunit-bridge, phpunit-polyfills, pint, prestissimo, protoc, psalm, rector, symfony or symfony-cli, vapor or vapor-cli, wp or wp-cli

- name: Setup PHP with tools
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    tools: php-cs-fixer, phpunit
  • In addition to the above tools, any composer tool or package can also be set up globally by specifying it as vendor/package, matching the listing on Packagist. This format accepts the same version constraints as composer.
- name: Setup PHP with tools
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    tools: vimeo/psalm

To set up a particular version of a tool, specify it in the form tool:version.

Version can be in the following format:

  • Semver. For example tool:1.2.3 or tool:1.2.3-beta1.
  • Major version. For example tool:1 or tool:1.x.
  • Major and minor version. For example tool:1.2 or tool:1.2.x.
- name: Setup PHP with tools
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    tools: php-cs-fixer:3.5, phpunit:9.5
  env:
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
  • The latest stable version of composer is set up by default. You can set up the required composer version by specifying the major version v1 or v2, or the version in major.minor or semver format. Additionally for composer snapshot and preview can also be specified to set up the respective releases.
- name: Setup PHP with composer v2
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    tools: composer:v2
  • If you do not use composer in your workflow, you can specify tools: none to skip it.
- name: Setup PHP without composer
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    tools: none

Tools pear, pecl, phpize and php-config are set up by default for all supported PHP versions on Linux and macOS.

The latest version of blackfire cli is set up when blackfire is specified in tools input. Please refer to the official documentation for using blackfire with GitHub Actions.

Tools prestissimo and composer-prefetcher will be skipped unless composer:v1 is also specified in tools input. It is recommended to drop prestissimo and use composer v2.

By default, except composer, tools which cannot be set up gracefully leave an error message in the logs; the execution is not interrupted. To change this behaviour, you can set the fail-fast flag to true.

- name: Setup PHP with fail-fast
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    tools: deployer
    fail-fast: true


  • Input tools is useful to set up tools which are only used in CI workflows, thus keeping your composer.json tidy.
  • If you do not want to use all your dev-dependencies in workflow, you can run composer with --no-dev and install required tools using tools input to speed up your workflow.
  • By default, COMPOSER_NO_INTERACTION is set to 1 and COMPOSER_PROCESS_TIMEOUT is set to 0. In effect, this means that Composer commands in your scripts do not need to specify --no-interaction.
  • Also, COMPOSER_NO_AUDIT is set to 1. So if you want to audit your dependencies for security vulnerabilities, it is recommended to add a composer audit step before you install them.

📶 Coverage Support


Specify coverage: xdebug to use Xdebug and disable PCOV.
Runs on all PHP versions supported.

- name: Setup PHP with Xdebug
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    coverage: xdebug
  • When you specify coverage: xdebug, the latest version of Xdebug compatible with the PHP version is set up by default.
  • If you need Xdebug 2.x on PHP 7.2, 7.3 or 7.4, you can specify coverage: xdebug2.
- name: Setup PHP with Xdebug 2.x
  uses: shivammathur/setup-php@v2
  with:
    php-version: '7.4'
    coverage: xdebug2

Note: Xdebug is enabled by default on Ubuntu GitHub Actions images, so if you are not using it in your workflow it is recommended to disable it as that will have a positive impact on your PHP performance. Please refer to the disable coverage section for details.


Specify coverage: pcov to use PCOV and disable Xdebug.
Runs on PHP 7.1 and newer PHP versions.

  • If your source code directory is other than src, lib or app, specify it using the ini-values input.
- name: Setup PHP with PCOV
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    ini-values: #optional, see above for usage.
    coverage: pcov
  • PHPUnit 8.x and above supports PCOV out of the box.
  • If you are using PHPUnit 5.x, 6.x or 7.x, you need to set up pcov/clobber before executing your tests.
- name: Setup PCOV
  run: |
    composer require pcov/clobber
    vendor/bin/pcov clobber

Disable Coverage

Specify coverage: none to disable both Xdebug and PCOV.

Disable coverage for these reasons:

  • You are not generating coverage reports while testing.
  • You are using phpdbg for running your tests.
  • You are profiling your code using blackfire.
  • You are using PHP in JIT mode. Please refer to JIT configuration section for more details.
- name: Setup PHP with no coverage driver
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    coverage: none

📝 Usage


Specify using with keyword

php-version (required)

  • Specify the PHP version you want to set up.
  • Accepts a string. For example '8.0'.
  • Accepts latest to set up the latest stable PHP version.
  • Accepts nightly to set up a nightly build from the master branch of PHP.
  • Accepts the format d.x, where d is the major version. For example 5.x, 7.x and 8.x.
  • See PHP support for supported PHP versions.

extensions (optional)

  • Specify the extensions you want to add or disable.
  • Accepts a string in csv-format. For example mbstring, :opcache.
  • Accepts none to disable all shared extensions.
  • Shared extensions prefixed with : are disabled.
  • See PHP extension support for more info.

ini-file (optional)

  • Specify the base php.ini file.
  • Accepts production, development or none.
  • By default, production php.ini file is used.

ini-values (optional)

  • Specify the values you want to add to php.ini.
  • Accepts a string in csv-format. For example post_max_size=256M, max_execution_time=180.
  • Accepts ini values with commas if wrapped in quotes. For example xdebug.mode="develop,coverage".

coverage (optional)

  • Specify the code-coverage driver you want to set up.
  • Accepts xdebug, pcov or none.
  • See coverage support for more info.

tools (optional)

  • Specify the tools you want to set up.
  • Accepts a string in csv-format. For example: phpunit, phpcs
  • See tools support for tools supported.



On GitHub Actions, you can assign the setup-php step an id and use it to get the outputs in a later step.

  • Provides the PHP version in semver format.
- name: Setup PHP
  id: setup-php
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'

- name: Print PHP version
  run: echo ${{ steps.setup-php.outputs.php-version }}


Specify using env keyword

fail-fast (optional)

  • Specify to mark the workflow as failed if an extension or tool fails to set up.
  • This changes the default mode from graceful warnings to fail-fast.
  • By default, it is set to false.
  • Accepts true and false.

phpts (optional)

  • Specify to set up thread-safe version of PHP on Windows.
  • Accepts ts and nts.
  • By default, it is set to nts.
  • See thread safe setup for more info.

update (optional)

  • Specify to update PHP on the runner to the latest patch version.
  • Accepts true and false.
  • By default, it is set to false.
  • See force update setup for more info.

See below for more info.

Basic Setup

Set up a particular PHP version.

- name: Setup PHP
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    extensions: mbstring, intl
    ini-values: post_max_size=256M, max_execution_time=180
    coverage: xdebug
    tools: php-cs-fixer, phpunit

Matrix Setup

Set up multiple PHP versions on multiple operating systems.

    runs-on: ${{ matrix.operating-system }}
    strategy:
      matrix:
        operating-system: ['ubuntu-latest', 'windows-latest', 'macos-latest']
        php-versions: ['7.4', '8.0', '8.1']
        phpunit-versions: ['latest']
        include:
        - operating-system: 'ubuntu-latest'
          php-versions: '7.2'
          phpunit-versions: '8.5.21'
    steps:
    - name: Setup PHP
      uses: shivammathur/setup-php@v2
      with:
        php-version: ${{ matrix.php-versions }}
        extensions: mbstring, intl
        ini-values: post_max_size=256M, max_execution_time=180
        coverage: xdebug
        tools: php-cs-fixer, phpunit:${{ matrix.phpunit-versions }}

Nightly Build Setup

Set up a nightly build of PHP 8.2.

  • This PHP version is currently in active development and might contain bugs and breaking changes.
  • Some user space extensions might not support this version currently.
- name: Setup nightly PHP
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.2'
    extensions: mbstring
    ini-values: post_max_size=256M, max_execution_time=180
    coverage: xdebug
    tools: php-cs-fixer, phpunit

Debug Build Setup

Set up a PHP build with debugging symbols.

  • Production release builds of PHP without debugging symbols are set up by default.
  • You can use the debug environment variable to set up a build with debugging symbols for PHP 5.6 and above.
- name: Setup PHP with debugging symbols
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
  env:
    debug: true # specify true or false

Thread Safe Setup

Set up TS or NTS PHP on Windows.

  • NTS versions are set up by default.
  • On Ubuntu and macOS only NTS versions are supported.
  • On Windows both TS and NTS versions are supported.
    runs-on: windows-latest
    name: Setup PHP TS on Windows
    steps:
    - name: Setup PHP
      uses: shivammathur/setup-php@v2
      with:
        php-version: '8.1'
      env:
        phpts: ts # specify ts or nts

Force Update Setup

Update to the latest patch of PHP versions.

  • Pre-installed PHP versions are not updated to their latest patch release by default.
  • If ppa:ondrej/php is missing on the Ubuntu GitHub environment, the PHP version is updated to the latest patch release.
  • You can specify the update environment variable to true for updating to the latest release.
- name: Setup PHP with latest versions
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
  env:
    update: true # specify true or false

Verbose Setup

Debug your workflow

To debug any issues, you can use the verbose tag instead of v2.

- name: Setup PHP with logs
  uses: shivammathur/setup-php@verbose
  with:
    php-version: '8.1'

Multi-Arch Setup

Set up PHP on multiple architecture on Ubuntu GitHub Runners.

  • PHP 5.6 to PHP 8.1 are supported by setup-php on multiple architecture on Ubuntu.
  • For this, you can use shivammathur/node images as containers. These have compatible Node.js installed for setup-php.
  • Currently, for ARM based setup, you will need self-hosted runners.
    runs-on: ubuntu-latest
    container: shivammathur/node:latest-${{ matrix.arch }}
    strategy:
      matrix:
        arch: ["amd64", "i386"]
    steps:
      - name: Install PHP
        uses: shivammathur/setup-php@v2
        with:
          php-version: '8.1'

Self Hosted Setup

Set up PHP on a self-hosted runner.

To set up a containerised self-hosted runner, refer to the following guides as per your base operating system.

To set up the runner directly on the host OS or in a virtual machine, follow this requirements guide before setting up the self-hosted runner.

If your workflow uses service containers, then set up the runner on a Linux host or in a Linux virtual machine. GitHub Actions does not support nested virtualization on Linux, so services will not work in a dockerized container.

It is recommended to specify the environment variable runner with the value self-hosted for self-hosted environments.

    runs-on: self-hosted
    strategy:
      matrix:
        php-versions: ['5.6', '7.0', '7.1', '7.2', '7.3', '7.4', '8.0']
    name: PHP ${{ matrix.php-versions }}
    steps:
    - name: Setup PHP
      uses: shivammathur/setup-php@v2
      with:
        php-version: ${{ matrix.php-versions }}
      env:
        runner: self-hosted


  • Do not set up multiple self-hosted runners on a single server instance as parallel workflows will conflict with each other.
  • Do not set up self-hosted runners alongside your development environment or on your production server.
  • Avoid using the same labels for your self-hosted runners which are used by GitHub-hosted runners.

Local Testing Setup

Test your Ubuntu workflow locally using nektos/act.

    runs-on: ubuntu-latest
    steps:
    - name: Setup PHP
      uses: shivammathur/setup-php@v2
      with:
        php-version: '8.1'

Run the workflow locally with act using shivammathur/node docker images.

Choose the image tag which matches the runs-on property in your workflow. For example, if you are using ubuntu-20.04 in your workflow, run act -P ubuntu-20.04=shivammathur/node:2004.

# For runs-on: ubuntu-latest
act -P ubuntu-latest=shivammathur/node:latest

# For runs-on: ubuntu-22.04
act -P ubuntu-22.04=shivammathur/node:2204

# For runs-on: ubuntu-20.04
act -P ubuntu-20.04=shivammathur/node:2004

# For runs-on: ubuntu-18.04
act -P ubuntu-18.04=shivammathur/node:1804

JIT Configuration

Enable Just-in-time (JIT) on PHP 8.0 and above.

  • To enable JIT, enable opcache in cli mode by setting opcache.enable_cli=1.
  • JIT conflicts with Xdebug, PCOV, and other extensions which override zend_execute_ex function, so set coverage: none and disable any such extension if added.
  • By default, opcache.jit=1235 and opcache.jit_buffer_size=256M are set which can be changed using ini-values input.
  • For detailed information about JIT related directives refer to the official PHP documentation.

For example, to enable JIT in tracing mode with a buffer size of 64 MB:

- name: Setup PHP with JIT in tracing mode
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    coverage: none
    ini-values: opcache.enable_cli=1, opcache.jit=tracing, opcache.jit_buffer_size=64M

Cache Extensions

You can cache PHP extensions using the shivammathur/cache-extensions and actions/cache GitHub Actions. Extensions which take a long time to set up, when cached, are available in the next workflow run and are enabled directly. This reduces the workflow execution time.
Refer to shivammathur/cache-extensions for details.

Cache Composer Dependencies

If your project uses composer, you can persist composer's internal cache directory. Cached dependencies are loaded directly instead of being downloaded during installation. The cached files are available across check-runs and will reduce the workflow execution time.

- name: Get composer cache directory
  id: composer-cache
  run: echo "dir=$(composer config cache-files-dir)" >> $GITHUB_OUTPUT

- name: Cache dependencies
  uses: actions/cache@v2
  with:
    path: ${{ steps.composer-cache.outputs.dir }}
    key: ${{ runner.os }}-composer-${{ hashFiles('**/composer.lock') }}
    restore-keys: ${{ runner.os }}-composer-

- name: Install dependencies
  run: composer install --prefer-dist


  • Please do not cache the vendor directory using actions/cache as that will have side effects.
  • If you do not commit composer.lock, you can use the hash of composer.json as the key for your cache.
key: ${{ runner.os }}-composer-${{ hashFiles('**/composer.json') }}
  • If you support a range of composer dependencies and use prefer-lowest and prefer-stable options, you can store them in your matrix and add them to the keys.
key: ${{ runner.os }}-composer-${{ matrix.prefer }}-${{ hashFiles('**/composer.lock') }}
restore-keys: ${{ runner.os }}-composer-${{ matrix.prefer }}-

GitHub Composer Authentication

If you have a number of workflows which set up multiple tools or have many composer dependencies, you might hit GitHub's rate limit for composer. Also, if you specify only the major version or the version in major.minor format, you can hit the rate limit. To avoid this, you can specify an OAuth token by setting the GITHUB_TOKEN environment variable. You can use the GITHUB_TOKEN secret for this purpose.

The COMPOSER_TOKEN environment variable has been deprecated in favor of GITHUB_TOKEN and will be removed in the next major version.

- name: Setup PHP
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
  env:
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

Private Packagist Authentication

If you use Private Packagist for your private composer dependencies, you can set the PACKAGIST_TOKEN environment variable to authenticate.

- name: Setup PHP
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
  env:
    PACKAGIST_TOKEN: ${{ secrets.PACKAGIST_TOKEN }}

Manual Composer Authentication

In addition to GitHub or Private Packagist, if you want to authenticate private repositories hosted elsewhere, you can set the COMPOSER_AUTH_JSON environment variable with the authentication methods and the credentials in json format. Please refer to the authentication section in composer documentation for more details.

- name: Setup PHP
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
  env:
    COMPOSER_AUTH_JSON: |
      {
        "http-basic": {
          "": {
            "username": "${{ secrets.EXAMPLE_ORG_USERNAME }}",
            "password": "${{ secrets.EXAMPLE_ORG_PASSWORD }}"
          }
        }
      }

Inline PHP Scripts

If you have to run multiple lines of PHP code in your workflow, you can do that easily without saving it to a file.

Put the code in the run property of a step and specify the shell as php {0}.

- name: Setup PHP
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'

- name: Run PHP code
  shell: php {0}
  run: |
    $welcome = "Hello, world";
    echo $welcome;

Problem Matchers

Problem matchers are json configurations which identify errors and warnings in your logs and surface them prominently in the GitHub Actions UI by highlighting them and creating code annotations.


Setup problem matchers for your PHP output by adding this step after the setup-php step.

- name: Setup problem matchers for PHP
  run: echo "::add-matcher::${{ runner.tool_cache }}/php.json"


Setup problem matchers for your PHPUnit output by adding this step after the setup-php step.

- name: Setup problem matchers for PHPUnit
  run: echo "::add-matcher::${{ runner.tool_cache }}/phpunit.json"


PHPStan supports error reporting in GitHub Actions, so it does not require problem matchers.

- name: Setup PHP
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    tools: phpstan

- name: Run PHPStan
  run: phpstan analyse src


Psalm supports error reporting in GitHub Actions with an output format github.

- name: Setup PHP
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    tools: psalm

- name: Run Psalm
  run: psalm --output-format=github

Tools with checkstyle support

For tools that support checkstyle reporting, like phpstan, psalm, php-cs-fixer and phpcs, you can use cs2pr to annotate your code.
For examples, refer to the cs2pr documentation.

Here is an example with phpcs.

- name: Setup PHP
  uses: shivammathur/setup-php@v2
  with:
    php-version: '8.1'
    tools: cs2pr, phpcs

- name: Run phpcs
  run: phpcs -q --report=checkstyle src | cs2pr


Examples of using setup-php with various PHP frameworks and packages.

Framework/Package                    Runs on                      Workflow
Blackfire                            macOS, ubuntu and windows    blackfire.yml
Blackfire Player                     macOS, ubuntu and windows    blackfire-player.yml
CakePHP with MySQL and Redis         ubuntu                       cakephp-mysql.yml
CakePHP with PostgreSQL and Redis    ubuntu                       cakephp-postgres.yml
CakePHP without services             macOS, ubuntu and windows    cakephp.yml
CodeIgniter                          macOS, ubuntu and windows    codeigniter.yml
Laminas MVC                          macOS, ubuntu and windows    laminas-mvc.yml
Laravel with MySQL and Redis         ubuntu                       laravel-mysql.yml
Laravel with PostgreSQL and Redis    ubuntu                       laravel-postgres.yml
Laravel without services             macOS, ubuntu and windows    laravel.yml
Lumen with MySQL and Redis           ubuntu                       lumen-mysql.yml
Lumen with PostgreSQL and Redis      ubuntu                       lumen-postgres.yml
Lumen without services               macOS, ubuntu and windows    lumen.yml
Phalcon with MySQL                   ubuntu                       phalcon-mysql.yml
Phalcon with PostgreSQL              ubuntu                       phalcon-postgres.yml
Slim Framework                       macOS, ubuntu and windows    slim-framework.yml
Symfony with MySQL                   ubuntu                       symfony-mysql.yml
Symfony with PostgreSQL              ubuntu                       symfony-postgres.yml
Symfony without services             macOS, ubuntu and windows    symfony.yml
Yii2 Starter Kit with MySQL          ubuntu                       yii2-mysql.yml
Yii2 Starter Kit with PostgreSQL     ubuntu                       yii2-postgres.yml

🔖 Versioning

  • Use the v2 tag as the setup-php version. It is a rolling tag synced with the latest minor and patch releases. With v2 you automatically get bug fixes, security patches, new features and support for the latest PHP releases.
  • Semantic release versions can also be used. It is recommended to use dependabot with semantic versioning to keep the actions in your workflows up to date.
  • Commit SHAs can also be used, but they are not recommended. They have to be updated manually with every release, without which you will not get any bug fixes, security patches or new features.
  • For debugging any issues, the verbose tag can be used temporarily. It outputs all the logs and is also synced with the latest releases.
  • It is highly discouraged to use the master branch as the version; it might break your workflow after major releases, as they have breaking changes.
  • If you are using the v1 tag or a 1.x.y version, you should switch to v2, as v1 only gets critical bug fixes. Maintenance support for v1 will be dropped with the last PHP 8.0 release.

📦 Dependencies

📑 Further Reading

Download Details:

Author: Shivammathur
Source Code:
License: MIT license

#php #composer #tools 

GitHub Action to Set Up PHP with Extensions, Php.ini Configuration
Rupert Beatty


A Code Rewrite tool for Structural Search & Replace That Supports


See the usage documentation.

A short example below shows how comby simplifies matching and rewriting compared to regex approaches like sed.

Comby supports interactive review mode (click here to see it in action).

Need help writing patterns or have other problems? Post them in Gitter.

Install (pre-built binaries)

Mac OS X

  • brew install comby

Ubuntu Linux

bash <(curl -sL

Other Linux distributions: the PCRE library is dynamically linked in the Ubuntu binary, so a fixup is needed on other distributions. For Arch Linux: sudo ln -s /usr/lib/ /usr/lib/. On Fedora, use sudo ln -s /usr/lib64/ /usr/lib64/. Alternatively, consider building from source.



  • docker pull comby/comby

click to expand an example invocation for the docker image

Running with docker on stdin:

docker run -a stdin -a stdout -a stderr -i comby/comby '(:[emoji] hi)' 'bye :[emoji]' lisp -stdin <<< '(👋 hi)'

Or try it live.

Isn't a regex approach like sed good enough?

Sometimes, yes. But often, small changes and refactorings are complicated by nested expressions, comments, or strings. Consider the following C-like snippet. Say the challenge is to rewrite the two if conditions to the value 1. Can you write a regular expression that matches the contents of the two if condition expressions, and only those two? Feel free to share your pattern with @rvtond on Twitter.

if (fgets(line, 128, file_pointer) == NULL) // 1) if (...) returns 0
      return 0;
if (scanf("%d) %d", &x, &y) == 2) // 2) if (scanf("%d) %d", &x, &y) == 2) returns 0
      return 0;

To match these with comby, all you need to write is if (:[condition]), and specify one flag that this language is C-like. The replacement is if (1). See the live example.
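
To see why a hole like :[condition] needs balanced-delimiter awareness rather than a regex, here is a hedged Python sketch of the core idea: scan forward from each if ( counting parentheses so nested closing parens do not end the match early. The function name is illustrative, and real tools like comby also skip over strings and comments, which this sketch omits.

```python
def if_conditions(src: str):
    """Extract the contents of each `if (...)` by counting balanced parens.
    Sketch only: string and comment handling is omitted for brevity."""
    results, i = [], 0
    while True:
        start = src.find("if (", i)
        if start == -1:
            return results
        depth, j = 0, start + 3  # position of the opening '('
        while j < len(src):
            if src[j] == "(":
                depth += 1
            elif src[j] == ")":
                depth -= 1
                if depth == 0:
                    break
            j += 1
        results.append(src[start + 4 : j])
        i = j

print(if_conditions("if (fgets(line, 128, fp) == NULL)\n    return 0;"))
```

Note how the nested parentheses of fgets(...) are handled correctly, which a naive regex like if \((.*?)\) would not manage.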

Build from source

Install opam. TL;DR do sh <(curl -sL

Run this if you don't have OCaml installed (it bootstraps the OCaml compiler):

opam init
opam switch create 4.11.0 4.11.0

Run eval $(opam env)

Install OS dependencies:

Linux: sudo apt-get install autoconf libpcre3-dev pkg-config zlib1g-dev m4 libgmp-dev libev4 libsqlite3-dev

Mac: brew install pkg-config gmp pcre libev

Then install the library dependencies:

git clone
cd comby 
opam install . --deps-only
  • Build and test
make test
  • Install comby on your PATH by running
make install

Download Details:

Author: Comby-tools
Source Code: 
License: Apache-2.0 license

#swift #tools #javascript #refactoring #python 


IRTools.jl: Mike's Little Intermediate Representation


IRTools aims to provide a simple and flexible IR format, expressive enough to work with both lowered and typed Julia code, as well as external IRs. It can be used with Julia metaprogramming tools such as Cassette.

julia> using IRTools

julia> function pow(x, n) # A simple Julia function
         r = 1
         while n > 0
           n -= 1
           r *= x
         end
         return r
       end

julia> ir = @code_ir pow(1, 1) # Get its IR
1: (%1, %2, %3)
  br 2 (%3, 1)
2: (%4, %5)
  %6 = %4 > 0
  br 4 unless %6
  br 3
3:
  %7 = %4 - 1
  %8 = %5 * %2
  br 2 (%7, %8)
4:
  return %5

julia> using IRTools: var, xcall

julia> ir[var(8)] = xcall(:+, var(5), var(2)) # Tweak it
:(%5 + %2)

julia> ir
1: (%1, %2, %3)
  br 2 (%3, 1)
2: (%4, %5)
  %6 = %4 > 0
  br 4 unless %6
  br 3
3:
  %7 = %4 - 1
  %8 = %5 + %2
  br 2 (%7, %8)
4:
  return %5

julia> f = IRTools.func(ir); # Turn the new IR into a lambda

julia> f(nothing, 10, 5) # r starts at 1 and 10 is added 5 times
51

julia> @code_llvm f(nothing, 10, 5)
define i64 @"julia_##399_17438"(i64, i64) {
  %2 = icmp slt i64 %1, 1
  %3 = mul i64 %1, %0
  %4 = add i64 %3, 1
  %value_phi1.lcssa = select i1 %2, i64 1, i64 %4
  ret i64 %value_phi1.lcssa
}
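
The tweak above, turning the multiplication in the loop into an addition without editing source text by hand, can be loosely mimicked in Python with the standard ast module. This is a hedged analogy only, not how IRTools works internally (IRTools operates on Julia's lowered IR, not on syntax trees):

```python
import ast
import textwrap

src = textwrap.dedent("""
    def pow(x, n):
        r = 1
        while n > 0:
            n -= 1
            r *= x
        return r
""")

class MulToAdd(ast.NodeTransformer):
    """Rewrite every '*=' in the tree into '+='."""
    def visit_AugAssign(self, node):
        if isinstance(node.op, ast.Mult):
            node.op = ast.Add()
        return node

tree = MulToAdd().visit(ast.parse(src))
ast.fix_missing_locations(tree)  # new Add() nodes need source positions
ns = {}
exec(compile(tree, "<rewritten>", "exec"), ns)
print(ns["pow"](10, 5))  # 1 + 10*5 = 51
```

As in the Julia example, the rewritten function now computes 1 + n*x instead of x^n.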

Download Details:

Author: FluxML
Source Code: 
License: MIT license

#julia #tools 

Rupert Beatty


DevtoysMac: DevToys for Mac


This is the mac app version of DevToys for Windows!

How to install



  • Install Homebrew. Then install DevToysMac with brew install --cask devtoys.

How to Build



Included tools:

  • Json <> Yaml Converter
  • Number Base Converter
  • HTML Encoder / Decoder
  • URL Encoder / Decoder
  • Base64 Encoder / Decoder
  • JSON Formatter
  • and more...

Download Details:

Author: ObuchiYuki
Source Code: 
License: MIT license

#swift #mac #developer #tools 


A Set Of tools for Dealing with Recursive Arrays Like Arrays Of Arrays


RecursiveArrayTools.jl is a set of tools for dealing with recursive arrays like arrays of arrays.

Tutorials and Documentation

For information on using the package, see the stable documentation. Use the in-development documentation for the unreleased features.


using RecursiveArrayTools
a = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
b = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
vA = VectorOfArray(a)
vB = VectorOfArray(b)

vA .* vB # Now all standard array stuff works!

a = (rand(5),rand(5))
b = (rand(5),rand(5))
pA = ArrayPartition(a)
pB = ArrayPartition(b)

pA .* pB # Now all standard array stuff works!
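
The convenience these wrappers provide, making a vector of arrays behave like a single array under broadcasting, can be illustrated with a small Python class. This is purely illustrative; RecursiveArrayTools achieves this by implementing Julia's array interface, and the class name here is invented:

```python
class VectorOfArrays:
    """Minimal wrapper making a list of lists multiply elementwise."""
    def __init__(self, arrays):
        self.arrays = arrays

    def __mul__(self, other):
        # Pair up inner arrays, then pair up their elements.
        return VectorOfArrays(
            [[x * y for x, y in zip(a, b)]
             for a, b in zip(self.arrays, other.arrays)]
        )

vA = VectorOfArrays([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
vB = VectorOfArrays([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
print((vA * vB).arrays)  # [[1, 4, 9], [16, 25, 36], [49, 64, 81]]
```

Without the wrapper, multiplying two plain lists of lists would raise a TypeError; the wrapper is what makes "all standard array stuff work".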

Download Details:

Author: SciML
Source Code: 
License: View license

 #julia #array #tools #vector 


LLLplus.jl: Lattice Reduction and Other Lattice tools in Julia


LLLplus provides lattice tools such as Lenstra-Lenstra-Lovász (LLL) lattice reduction which are of practical and theoretical use in cryptography, digital communication, integer programming, and more. This package is experimental and not a robust tool; use at your own risk :-)

LLLplus has functions for the LLL, Seysen, and Hermite-Korkine-Zolotarev lattice reduction techniques. Brun's integer-relation algorithm is included in the form of lattice reduction. Solvers for the shortest vector and the closest vector problems are also included; for more, see the help text for the lll, seysen, hkz, brun, svp, and cvp functions. Several toy (demo) functions are also included; see the subsetsum, integerfeasibility, rationalapprox, and spigotBBP functions.
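
For intuition about what lattice reduction does, here is a hedged Python sketch of the two-dimensional case (Lagrange/Gauss reduction, the classical ancestor of LLL): repeatedly subtract an integer multiple of the shorter basis vector from the longer one until no further shortening is possible. This is not LLLplus code, just an illustration of the underlying idea:

```python
def lagrange_reduce(b1, b2):
    """2-D lattice basis reduction; assumes a nonsingular integer basis.
    Returns a basis of short, nearly orthogonal vectors for the same lattice."""
    def dot(u, v):
        return u[0] * v[0] + u[1] * v[1]

    if dot(b1, b1) > dot(b2, b2):
        b1, b2 = b2, b1  # keep b1 the shorter vector
    while True:
        # Reduce b2 by the nearest-integer multiple of b1.
        m = round(dot(b1, b2) / dot(b1, b1))
        b2 = (b2[0] - m * b1[0], b2[1] - m * b1[1])
        if dot(b2, b2) >= dot(b1, b1):
            return b1, b2
        b1, b2 = b2, b1

print(lagrange_reduce((1, 0), (7, 1)))  # ((1, 0), (0, 1))
```

The skewed basis (1, 0), (7, 1) generates the same lattice as the orthogonal basis (1, 0), (0, 1); reduction recovers the short one. LLL generalizes this swap-and-reduce loop to arbitrary dimensions.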


Each function contains documentation and examples available via Julia's built-in documentation system (try ?lll or @doc(lll)). Documentation for all functions is available. A tutorial notebook is found in the docs directory or on nbviewer.

Here are a few examples of using the functions in the package on random lattices.

using LLLplus

# do lattice reduction on a matrix with randn entries
N = 40;
H = randn(N,N);
B,T = brun(H);
B,T = lll(H);
B,T = seysen(H);
B,T = hkz(H);

# check out the CVP solver
Q,Rtmp=qr(H); R = UpperTriangular(Rtmp);

Execution Time results 

In the first test we compare several LLL functions: the lll function from LLLplus, the l2avx function in the src\l2.jl file in LLLplus, the lll_with_transform function from Nemo.jl (which uses FLINT), and the lll_reduction function from fplll. Nemo is written by number theorists, while fplll is written by lattice cryptanalysis academics; they are good benchmarks against which to compare.

We first show how the execution time varies as the basis (matrix) size varies over [4 8 16 32 64]. For each matrix size, 20 random bases are generated using fplll's gen_qary function with a depth of 25 bits, with the average execution time shown; the eltype is Int64 except for Nemo, which can only use GMP (its own BigInt); in all cases δ=.99. The vertical axis shows execution time on a logarithmic scale; the x-axis is also logarithmic.

The lll function is slower, while l2avx is similar to fplll. Though not shown, using bases from gen_qary with a bit depth of 45 gives fplll a larger advantage. Though the LLLplus functions are not the fastest, they are in the same ballpark as the C and C++ tools; if this package gets more users, we'll spend more time on speed :-) This figure was generated using code in test/timeLLLs.jl.

Time vs basis size

One additional question that arises from the plot above is the quality of the resulting basis. In the next plot we show execution time vs the norm of the first vector in the reduced basis; this first vector is typically the smallest, and its norm is a rough indication of the quality of the reduced basis. We show results averaged over 20 random bases from gen_qary with depth 25 bits, this time with the dimension fixed at 32. The curve is created by varying the δ parameter from .29 to .99 in steps of .2; the larger times and smaller norms correspond to the largest δ values. Though the l2avx function is competitive with fplll in this case, in most cases the fplll code is faster.

Time vs reduction quality

Finally, we show execution time for several built-in datatypes (Int32, Int64, Int128, Float32, Float64, BigInt, and BigFloat) as well as types from external packages (Float128 from Quadmath.jl and Double64 from DoubleFloats.jl), which are used to generate 60 16x16 matrices over which execution time for the lattice reduction techniques is averaged. The vertical axis is a logarithmic representation of execution time, as in the previous figure. This figure was generated using code in test/perftest.jl.

Time vs data type


The 2020 Simons Institute lattice workshop, a survey paper by Wuebben, and the monograph by Bremner were helpful in writing the tools in LLLplus and are good resources for further study. If you are trying to break one of the Lattice Challenge records or are looking for robust, well-proven lattice tools, look at fplll. Also, for many number-theoretic problems the Nemo.jl package is appropriate; it uses the FLINT C library to do LLL reduction on Nemo-specific data types. Finally, no number theorists have worked on LLLplus; please treat the package as experimental.

Download Details:

Author: Christianpeel
Source Code: 
License: MIT license

#julia #tools 


Hexagons.jl: Useful tools for Working with Hexagonal Grids


This package provides some basic utilities for working with hexagonal grids. It largely builds on Amit Patel's terrific reference.


Hexagonal grids can be indexed a number of different ways. Indexes are represented with one of the Hexagon types. The following are currently provided:

HexagonAxial(q, r)
HexagonCubic(x, y, z)
HexagonOffsetOddR(q, r)
HexagonOffsetEvenR(q, r)

One indexing system can be converted to another with convert.

convert(HexagonOffsetOddR, HexagonAxial(2, 4))
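
The conversions behind convert are simple coordinate changes. For axial and cubic coordinates, the standard relation (per Amit Patel's reference) is x = q, z = r, y = -x - z, so the three cube coordinates always sum to zero. A hedged Python sketch, with illustrative function names:

```python
def axial_to_cubic(q, r):
    """Axial (q, r) -> cubic (x, y, z); the cube coordinates sum to zero."""
    x, z = q, r
    y = -x - z
    return x, y, z

def cubic_to_axial(x, y, z):
    """Cubic (x, y, z) -> axial (q, r); y is redundant since x + y + z == 0."""
    return x, z

print(axial_to_cubic(2, 4))      # (2, -6, 4)
print(cubic_to_axial(2, -6, 4))  # (2, 4)
```

Because y is determined by the other two coordinates, conversions in either direction lose no information.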

The six points (in cartesian space) of a hexagon can be iterated through with vertices.

for (x, y) in vertices(HexagonAxial(2, 3))
    # ...
end

The center point can be obtained with center.

x, y = center(HexagonAxial(2, 3))

A point in cartesian space can be mapped to the index of the hexagon that contains it with the cube_round function.

h = cube_round(23.5, 4.67)
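
Mapping a cartesian point to its containing hexagon ends with a cube-rounding step: round each fractional cube coordinate, then repair the component with the largest rounding error so the three still sum to zero. A hedged Python sketch of just that rounding step (the preceding pixel-to-fractional-hex transform depends on hexagon size and orientation and is omitted here):

```python
def cube_round(xf, yf, zf):
    """Round fractional cube coordinates to the nearest valid hexagon index."""
    x, y, z = round(xf), round(yf), round(zf)
    dx, dy, dz = abs(x - xf), abs(y - yf), abs(z - zf)
    # Repair the coordinate that moved the most, restoring x + y + z == 0.
    if dx > dy and dx > dz:
        x = -y - z
    elif dy > dz:
        y = -x - z
    else:
        z = -x - y
    return x, y, z

print(cube_round(1.2, -2.1, 0.9))  # (1, -2, 1)
```

Naive independent rounding can leave the coordinates summing to ±1, which is not a valid hexagon; fixing the worst-rounded component restores the invariant.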


This library is not mature or complete, but it provides just enough to implement hexagonal bin visualizations. If your hexagon project requires something that's not provided, file a bug or open a pull request.

Download Details:

Author: GiovineItalia
Source Code: 
License: View license

#julia #tools #grids 
