Simbody is a high-performance, open-source toolkit for science- and engineering-quality simulation of articulated mechanisms, including biomechanical structures such as human and animal skeletons, mechanical systems like robots, vehicles, and machines, and anything else that can be described as a set of rigid bodies interconnected by joints, influenced by forces and motions, and restricted by constraints. Simbody includes a multibody dynamics library for modeling motion in generalized/internal coordinates in O(n) time. This is sometimes called a Featherstone-style physics engine.
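Simbody itself exposes this through its C++ API, but the idea of modeling in internal coordinates is easy to illustrate. The sketch below (plain Python, not Simbody code; all values are assumed for illustration) integrates a single pendulum directly in its one joint coordinate theta — the same kind of minimal-coordinate state a Featherstone-style engine works with, rather than redundant Cartesian body coordinates plus constraint forces:

```python
import math

# Illustrative sketch only -- not Simbody code. A single pendulum modeled in
# its internal (joint) coordinate theta instead of Cartesian coordinates.
g, L = 9.8, 1.0           # gravity magnitude and rod length (assumed values)
theta, omega = 1.0, 0.0   # initial joint angle (rad) and joint rate (rad/s)
dt = 1e-4

for _ in range(100_000):  # simulate 10 s with semi-implicit Euler
    omega += -(g / L) * math.sin(theta) * dt  # joint-space dynamics
    theta += omega * dt

# Total energy per unit mass should stay near its initial value.
E0 = -g * L * math.cos(1.0)
E = 0.5 * (L * omega) ** 2 - g * L * math.cos(theta)
drift = abs(E - E0)
```

With one state pair per joint, the system state stays minimal no matter how long the chain — which is what makes the O(n) formulation possible.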
Simbody provides a C++ API that is used to build domain-specific applications; it is not a standalone application itself. For example, it is used by biomechanists in OpenSim, by roboticists in Gazebo, and for biomolecular research in MacroMoleculeBuilder (MMB). One artful simulation of several RNA molecules containing thousands of bodies was performed with MMB by Samuel Flores.
Read more about Simbody at the Simbody homepage.
Here's some code to simulate and visualize a 2-link chain:
#include "Simbody.h"
using namespace SimTK;

int main() {
    // Define the system.
    MultibodySystem system;
    SimbodyMatterSubsystem matter(system);
    GeneralForceSubsystem forces(system);
    Force::Gravity gravity(forces, matter, -YAxis, 9.8);

    // Describe mass and visualization properties for a generic body.
    Body::Rigid bodyInfo(MassProperties(1.0, Vec3(0), UnitInertia(1)));
    bodyInfo.addDecoration(Transform(), DecorativeSphere(0.1));

    // Create the moving (mobilized) bodies of the pendulum.
    MobilizedBody::Pin pendulum1(matter.Ground(), Transform(Vec3(0)),
                                 bodyInfo, Transform(Vec3(0, 1, 0)));
    MobilizedBody::Pin pendulum2(pendulum1, Transform(Vec3(0)),
                                 bodyInfo, Transform(Vec3(0, 1, 0)));

    // Set up visualization.
    system.setUseUniformBackground(true);
    Visualizer viz(system);
    system.addEventReporter(new Visualizer::Reporter(viz, 0.01));

    // Initialize the system and state.
    State state = system.realizeTopology();
    pendulum2.setRate(state, 5.0);

    // Simulate for 20 seconds.
    RungeKuttaMersonIntegrator integ(system);
    TimeStepper ts(system, integ);
    ts.initialize(state);
    ts.stepTo(20.0);
}
See Simbody's User Guide for a step-by-step explanation of this example.
Simbody depends on the following:
Simbody works on Windows, Mac, and Linux. For each operating system, you can use a package manager or build from source. In this file, we provide instructions for 6 different ways of installing Simbody:
If you use Linux, check Repology to see if your distribution provides a package for Simbody.
These are not the only ways to install Simbody, however. For example, on a Mac, you could use CMake and Xcode.
Simbody 3.6 and later uses C++11 features (the `-std=c++11` flag). Simbody 3.3 and earlier use only C++03 features, and Simbody 3.4 and 3.5 can use either C++03 or C++11; see the `SIMBODY_STANDARD_11` CMake variable in these versions. Note that if you want to use the C++11 flag in your own project, Simbody must have been compiled with the C++11 flag as well.
All needed library dependencies are provided with the Simbody installation on Windows, including linear algebra and visualization dependencies.
Download the source code into `C:/Simbody-source`.

Get git. There are many options.

Clone the github repository into `C:/Simbody-source`. Run the following in a Git Bash / Git Shell, or find a way to run the equivalent commands in a GUI client:

$ git clone https://github.com/simbody/simbody.git C:/Simbody-source
$ cd C:/Simbody-source
$ git checkout Simbody-3.7
In the last line above, we assumed you want to build a released version. Feel free to change the version you want to build. If you want to build the latest development version ("bleeding edge") of Simbody off the `master` branch, you can omit the `checkout` line.
To see the set of releases and check out a specific version, you can use the following commands:
$ git tag
$ git checkout Simbody-X.Y.Z
Configure with CMake. Set the source directory to `C:/Simbody-source` and the build directory to something like `C:/Simbody-build`, just not inside your source directory. This is not where we will install Simbody; see below.

Choose the installation location with the `CMAKE_INSTALL_PREFIX` variable. We'll assume you set it to `C:/Simbody`. If you choose a different installation location, make sure to use yours where we use `C:/Simbody` below.

There are a few CMake variables you may want to adjust:

- `BUILD_EXAMPLES` to see what Simbody can do. On by default.
- `BUILD_TESTING` to ensure your Simbody works correctly. On by default.
- `BUILD_VISUALIZER` to be able to watch your system move about! If building remotely, you could turn this off. On by default.
- `BUILD_DYNAMIC_LIBRARIES` builds the three libraries as dynamic libraries. On by default. Unless you know what you're doing, leave this one on.
- `BUILD_STATIC_LIBRARIES` builds the three libraries as static libraries, whose names will end with `_static`. Off by default. You must activate either `BUILD_DYNAMIC_LIBRARIES`, `BUILD_STATIC_LIBRARIES`, or both.
- `BUILD_TESTS_AND_EXAMPLES_STATIC` if static libraries, and tests or examples, are being built, creates statically-linked tests/examples. Can take a while to build, and it is unlikely you'll use the statically-linked libraries.
- `BUILD_TESTS_AND_EXAMPLES_SHARED` if tests or examples are being built, creates dynamically-linked tests/examples. Unless you know what you're doing, leave this one on.

Open `C:/Simbody-build/Simbody.sln` in Visual Studio.
Select your desired Solution configuration from the drop-down at the top; libraries built with the Debug configuration have names ending with `_d`.

Build the project ALL_BUILD by right-clicking it and selecting Build.
Run the tests by right-clicking RUN_TESTS and selecting Build. Make sure all tests pass. You can use RUN_TESTS_PARALLEL for a faster test run if you have multiple cores.
(Optional) Build the project doxygen to get API documentation generated from your Simbody source. You will get some warnings if your doxygen version is earlier than Doxygen 1.8.8; upgrade if you can.
Install Simbody by right-clicking INSTALL and selecting Build.
Within your build in Visual Studio (not the installation), right-click one of the example projects (their names begin with `Example - `) and select Select as Startup Project.

If you are only building Simbody to use it with OpenSim, you can skip this section.
Add Simbody's `bin/` directory to your `PATH` environment variable. In the Windows settings, search for `environment`. Add `C:/Simbody/bin;` to the front of the text field. Don't forget the semicolon!

Create an environment variable named `SIMBODY_HOME` and set it to `C:/Simbody`.

Test your installation by navigating to `C:/Simbody/examples/bin` and running `SimbodyInstallTest.exe` or `SimbodyInstallTestNoViz.exe`.

Note: Example binaries are not installed for Debug configurations. They are present in the build environment, however, so you can run them from there. They will run very slowly!
How is your Simbody installation organized?
- `bin/` the visualizer and shared libraries (.dll's, used at runtime).
- `doc/` a few manuals, as well as API docs (`SimbodyAPI.html`).
- `examples/src/` the source code for the examples.
- `examples/bin/` the examples, compiled into executables; run them! (Not installed for Debug builds.)
- `include/` the header (.h) files; necessary for projects that use Simbody.
- `lib/` "import" libraries, used during linking.
- `cmake/` CMake files that are useful for projects that use Simbody.

These instructions are for building Simbody from source on either a Mac or on Ubuntu.
Simbody uses recent C++ features that require a modern compiler. Before installing Simbody, check your compiler version with commands like these:
g++ --version
clang++ --version
If your compiler is not supported, you can upgrade it.

Upgrading GCC to 4.9 on Ubuntu 14.04

Here are some instructions to upgrade GCC on an Ubuntu 14.04 distribution.
$ sudo add-apt-repository ppa:ubuntu-toolchain-r/test
$ sudo apt-get update
$ sudo apt-get install gcc-4.9 g++-4.9
If you want to set `gcc-4.9` and `g++-4.9` as the default compilers, run the following command:
$ sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-4.9 60 --slave /usr/bin/g++ g++ /usr/bin/g++-4.9
Remember that when several compilers are installed, the CMake variables `CMAKE_C_COMPILER` and `CMAKE_CXX_COMPILER` can be used to select the desired ones. For example, Simbody can be configured with the following flags:
$ cmake -DCMAKE_C_COMPILER=gcc-4.9 -DCMAKE_CXX_COMPILER=g++-4.9
On a Mac, the Xcode developer package provides LAPACK and BLAS via the Accelerate framework. Macs also come with the visualization dependencies.
On Ubuntu, we need to get the dependencies ourselves. Open a terminal and run the following commands.
$ sudo apt-get install cmake liblapack-dev

If you want to use the CMake GUI, also install `cmake-qt-gui`.

For visualization:

$ sudo apt-get install freeglut3-dev libxi-dev libxmu-dev

For API documentation:

$ sudo apt-get install doxygen
LAPACK version 3.6.0 and higher may be required for some applications (OpenSim). LAPACK can be downloaded from http://www.netlib.org/lapack/ and compiled using the following method. It is sufficient to set `LD_LIBRARY_PATH` to your LAPACK install prefix and build Simbody using the `-DBUILD_USING_OTHER_LAPACK:PATH=/path/to/liblapack.so` option in cmake.
cmake ../lapack-3.6.0 -DCMAKE_INSTALL_PREFIX=/path/to/new/lapack/ -DCMAKE_BUILD_TYPE=RELEASE -DBUILD_SHARED_LIBS=ON
make
make install
There are two ways to get the source code.
Download the source code into `~/simbody-source`.

Get git. On a Mac, run `brew install git` in a terminal. On Ubuntu, run `sudo apt-get install git` in a terminal.

Clone the github repository into `~/simbody-source`:
$ git clone https://github.com/simbody/simbody.git ~/simbody-source
$ cd ~/simbody-source
$ git checkout Simbody-3.7
In the last line above, we assumed you want to build a released version. Feel free to change the version you want to build. If you want to build the latest development version ("bleeding edge") of Simbody off the `master` branch, you can omit the `checkout` line.
To see the set of releases and check out a specific version, you can use the following commands:
$ git tag
$ git checkout Simbody-X.Y.Z
Create a directory in which we'll build Simbody. We'll assume you choose `~/simbody-build`. Don't choose a location inside `~/simbody-source`.
$ mkdir ~/simbody-build
$ cd ~/simbody-build
Configure your Simbody build with CMake. We'll use the `cmake` command, but you could also use the interactive tools `ccmake` or `cmake-gui`. You have a few configuration options to play with here.
If you don't want to fuss with any options, run:
$ cmake ~/simbody-source
Where do you want to install Simbody? By default, it is installed to `/usr/local/`. That's a great default option, especially if you think you'll only use one version of Simbody at a time. You can change this via the `CMAKE_INSTALL_PREFIX` variable. Let's choose `~/simbody`:
$ cmake ~/simbody-source -DCMAKE_INSTALL_PREFIX=~/simbody
Do you want the libraries to be optimized for speed, or to contain debugger symbols? You can change this via the `CMAKE_BUILD_TYPE` variable. There are 4 options; libraries built with the Debug option have names ending with `_d`.

There are a few other variables you might want to play with:
- `BUILD_EXAMPLES` to see what Simbody can do. On by default.
- `BUILD_TESTING` to ensure your Simbody works correctly. On by default.
- `BUILD_VISUALIZER` to be able to watch your system move about! If building on a cluster, you could turn this off. On by default.
- `BUILD_DYNAMIC_LIBRARIES` builds the three libraries as dynamic libraries. On by default.
- `BUILD_STATIC_LIBRARIES` builds the three libraries as static libraries, whose names will end with `_static`.
- `BUILD_TESTS_AND_EXAMPLES_STATIC` if tests or examples are being built, creates statically-linked tests/examples. Can take a while to build, and it is unlikely you'll use the statically-linked libraries.
- `BUILD_TESTS_AND_EXAMPLES_SHARED` if tests or examples are being built, creates dynamically-linked tests/examples. Unless you know what you're doing, leave this one on.

Build the API documentation. This is optional, and you can only do this if you have Doxygen. You will get warnings if your doxygen installation is a version older than Doxygen 1.8.8.
$ make doxygen
Compile. Use the `-jn` flag to build using `n` processor cores. For example:
$ make -j8
Run the tests.
$ ctest -j8
Install. If you chose `CMAKE_INSTALL_PREFIX` to be a location that requires sudo access to write to (like `/usr/local/`), prepend this command with `sudo`.
$ make -j8 install
Just so you know, you can also uninstall (delete all files that CMake placed into `CMAKE_INSTALL_PREFIX`) if you're in `~/simbody-build`:
$ make uninstall
From your build directory, you can run Simbody's example programs. For instance, try:
$ ./ExamplePendulum
If you are only building Simbody to use it with OpenSim, you can skip this section.
Allow executables to find the Simbody libraries (.dylib's or .so's) by adding the Simbody lib directory to your linker path. On Mac, most users can skip this step.
If your `CMAKE_INSTALL_PREFIX` is `/usr/local/`, run:
$ sudo ldconfig
If your `CMAKE_INSTALL_PREFIX` is neither `/usr/` nor `/usr/local/` (e.g., `~/simbody`):
Mac:
$ echo 'export DYLD_LIBRARY_PATH=$DYLD_LIBRARY_PATH:~/simbody/lib' >> ~/.bash_profile
Ubuntu:
$ echo 'export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:~/simbody/lib/x86_64-linux-gnu' >> ~/.bashrc
Allow Simbody and other projects (e.g., OpenSim) to find Simbody. Make sure to replace `~/simbody` with your `CMAKE_INSTALL_PREFIX`.
Mac:
$ echo 'export SIMBODY_HOME=~/simbody' >> ~/.bash_profile
Ubuntu:
$ echo 'export SIMBODY_HOME=~/simbody' >> ~/.bashrc
Open a new terminal.
Test your installation:
$ cd ~/simbody/share/doc/simbody/examples/bin
$ ./SimbodyInstallTest # or ./SimbodyInstallTestNoViz
The installation creates the following directories in `CMAKE_INSTALL_PREFIX`. The directory `[x86_64-linux-gnu]` only exists if you did NOT install to `/usr/local/` and varies by platform; even in that case, the name of your directory may be different.
- `include/simbody/` the header (.h) files; necessary for projects that use Simbody.
- `lib/[x86_64-linux-gnu]/` shared libraries (.dylib's or .so's).
- `cmake/simbody/` CMake files that are useful for projects that use Simbody.
- `pkgconfig/` pkg-config files useful for projects that use Simbody.
- `simbody/examples/` the examples, compiled into executables; run them! (Not installed for Debug builds.)
- `libexec/simbody/` the `simbody-visualizer` executable.
- `share/doc/simbody/` a few manuals, as well as API docs (`SimbodyAPI.html`).
- `examples/src` source code for the examples.
- `examples/bin` symbolic link to the runnable examples.

If using a Mac and Homebrew, the dependencies are taken care of for you.
Install Homebrew.
Open a terminal.
Add the Open Source Robotics Foundation's list of repositories to Homebrew:
$ brew tap osrf/simulation
Install the latest release of Simbody.
$ brew install simbody
To install from the master branch instead, append `--HEAD` to the command above.
Simbody is now installed to `/usr/local/Cellar/simbody/<version>/`, where `<version>` is either the version number (e.g., `3.6.1`), or `HEAD` if you specified `--HEAD` above.
Some directories are symlinked (symbolically linked) to /usr/local/
, which is where your system typically expects to find executables, shared libraries (.dylib's), headers (.h's), etc. The following directories from the Simbody installation are symlinked:
include/simbody -> /usr/local/include/simbody
lib -> /usr/local/lib
share/doc/simbody -> /usr/local/share/doc/simbody
What's in the `/usr/local/Cellar/simbody/<version>/` directory?
- `include/simbody/` the header (.h) files; necessary for projects that use Simbody.
- `lib/` shared libraries (.dylib's), used at runtime.
- `cmake/simbody/` CMake files that are useful for projects that use Simbody.
- `pkgconfig/` pkg-config files useful for projects that use Simbody.
- `simbody/examples/` the examples, compiled into executables; run them! (Not installed for Debug builds.)
- `libexec/simbody/` the `simbody-visualizer` executable.
- `share/doc/simbody/` a few manuals, as well as API docs (`SimbodyAPI.html`).
- `examples/src` source code for the examples.
- `examples/bin` symbolic link to executable examples.

Starting with Ubuntu 15.04, Simbody is available in the Ubuntu (and Debian) repositories. You can see a list of all simbody packages for all Ubuntu versions at the Ubuntu Packages website. The latest version of Simbody is usually not available in the Ubuntu repositories; the process for getting a new version of Simbody into the Ubuntu repositories could take up to a year.
Open a terminal and run the following command:
$ sudo apt-get install libsimbody-dev simbody-doc
Simbody is installed into the `usr/` directory. The directory `[x86_64-linux-gnu]` varies by platform.
- `usr/include/simbody/` the header (.h) files; necessary for projects that use Simbody.
- `usr/lib/[x86_64-linux-gnu]` shared libraries (.so's).
- `cmake/simbody/` CMake files that are useful for projects that use Simbody.
- `pkgconfig/` pkg-config files useful for projects that use Simbody.
- `usr/libexec/simbody/` the `simbody-visualizer` executable.
- `usr/share/doc/simbody/` a few manuals, as well as API docs (`SimbodyAPI.html`).
- `examples/src` source code for the examples.
- `examples/bin` symbolic link to executable examples.

Simbody is available via the FreeBSD package repository.
Open a terminal and run the following command:
$ sudo pkg install simbody
Warning: The MinGW generation and build is experimental!
This build is still experimental because of:

Below are three sections that give a list of supported versions, command-line instructions, and the reasons why using MinGW is not so straightforward.

If you do not want to go into details, you need a MinGW version with:
Other versions are supported with additional configurations.
The table below lists the various MinGW versions tested:
| # | OS | Thread | Exception | Comment | URL |
|---|----|--------|-----------|---------|-----|
| 1 | 64 Bits | Posix | SJLJ | All features supported, all binaries included (recommended version) | MinGW64 GCC 5.2.0 |
| 2 | 64 Bits | Posix | SEH | Needs to be linked against user's Blas and Lapack | MinGW64 GCC 5.2.0 |
| 3 | 32 Bits | Posix | Dwarf | No visualization, all binaries included | MinGW64 GCC 5.2.0 |
| 4 | 32 Bits | Posix | SJLJ | No visualization, needs to be linked against user's Blas and Lapack | MinGW64 GCC 5.2.0 |
We recommend the first configuration, where all features are supported and no additional libraries are needed to compile and run. The URL allows you to download this version directly. The second version needs to be linked against the user's Blas and Lapack (a CLI example is given below); Blas and Lapack sources can be downloaded from netlib. For the 3rd and 4th versions, which target 32-bit systems, visualization is not possible for the time being (due to a compile and link problem with `glut`). Moreover, for the 4th one, you need to provide your own Blas and Lapack libraries.
Please note that only Posix versions of MinGW are supported. If your version is not supported, CMake will detect it while configuring and stop.
Below are some examples of command-line instructions for various cases. It is assumed you are running commands from a build directory that can reach the Simbody source with the command `cd ..\simbody`.
It is recommended to specify the installation directory with the flag `CMAKE_INSTALL_PREFIX` (e.g. `-DCMAKE_INSTALL_PREFIX="C:\Program Files\Simbody"`). If not set, the installation directory will be `C:\Program Files (x86)\Simbody` on a 64-bit computer. This might be confusing since it is the 32-bit installation location.
Example instructions using the provided Blas and Lapack libraries (to be used in a Windows terminal where MinGW is in the PATH):
rem CMake configuration
cmake ..\simbody -G "MinGW Makefiles" -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX="C:\Program Files\Simbody"
rem Compilation
mingw32-make
rem Test
mingw32-make test
rem Installation
mingw32-make install
Example instructions using the provided Blas and Lapack libraries (to be used in a Windows terminal where MinGW is NOT in the PATH):
rem Variable and path definition
set CMAKE="C:\Program Files\CMake\bin\cmake.exe"
set MinGWDir=C:\Program Files\mingw-w64\i686-5.2.0-posix-sjlj-rt_v4-rev0\mingw32
set PATH=%MinGWDir%\bin;%MinGWDir%\i686-w64-mingw32\lib
rem CMake configuration
%CMAKE% ..\simbody -G"MinGW Makefiles" -DCMAKE_BUILD_TYPE=Release ^
-DCMAKE_INSTALL_PREFIX="C:\Program Files\Simbody" ^
-DCMAKE_C_COMPILER:PATH="%MinGWDir%\bin\gcc.exe" ^
-DCMAKE_CXX_COMPILER:PATH="%MinGWDir%\bin\g++.exe" ^
-DCMAKE_MAKE_PROGRAM:PATH="%MinGWDir%\bin\mingw32-make.exe"
rem Compilation
mingw32-make
rem Test
mingw32-make test
rem Installation
mingw32-make install
Example instructions using the provided Blas and Lapack libraries (to be used in an MSYS terminal with MinGW in the PATH):
# CMake configuration
cmake ../simbody -G "MSYS Makefiles" -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX="C:\Program Files\Simbody"
# Compilation
make
# Test
make test
# Installation
make install
Example instructions providing your own Blas and Lapack libraries (to be used in an MSYS terminal with MinGW in the PATH):
# CMake configuration
cmake ../simbody -G"MSYS Makefiles" -DCMAKE_BUILD_TYPE=Release \
-DCMAKE_INSTALL_PREFIX="C:\Program Files\Simbody" \
-DCMAKE_C_COMPILER:PATH="C:\Program Files\mingw-w64\i686-5.2.0-posix-sjlj-rt_v4-rev0\mingw32\bin\gcc.exe" \
-DCMAKE_CXX_COMPILER:PATH="C:\Program Files\mingw-w64\i686-5.2.0-posix-sjlj-rt_v4-rev0\mingw32\bin\g++.exe" \
-DBUILD_USING_OTHER_LAPACK:PATH="C:\Program Files\lapack-3.5.0\bin\liblapack.dll;C:\Program Files\lapack-3.5.0\bin\libblas.dll"
make
# Test
make test
# Installation
make install
This section explains why not every MinGW version can be used.
MinGW is available with two thread models: win32 and posix. You have to use the Posix thread model, since the win32 model does not implement all C++11 thread functionality (e.g., `std::mutex`).
To ease building on Windows, Simbody provides compiled libraries for Blas and Lapack. If you choose a MinGW compilation, you need to respect the exception mechanism these libraries were built with; a program cannot rely on both mechanisms. This means that if you want to use the provided compiled libraries, your MinGW installation must use the same exception mechanism. Otherwise, you need to provide your own Blas and Lapack libraries.
To see which exception mechanism is used, look at the dlls located in the `bin` directory of MinGW. The name of the mechanism appears in the file name `libgcc_XXXX.dll`, where `XXXX` can be `dw`, `seh`, or `sjlj`. For some MinGW versions, this information is also available in the output of `gcc --version`.
CMake will check your MinGW version, and if the exception mechanism differs, the configuration stops. If you provide your own Blas and Lapack libraries via the CMake variable `BUILD_USING_OTHER_LAPACK`, compilation with MinGW is always possible.
Conda is a cross-platform package manager that can be used to install Simbody on Windows, Mac, or Linux. To install Simbody using Conda you must first install Miniconda or Anaconda. Either of these provides the `conda` command, which can be invoked at the command line to install Simbody from the Conda Forge channel as follows:
$ conda install -c conda-forge simbody
This command will install Simbody (both the libraries and headers) into the Miniconda or Anaconda installation directory as per the standard layout for each of the operating systems described above. The Conda Forge Simbody recipe can be found in Conda Forge's feedstock repository.
You can download and install simbody using the vcpkg dependency manager:
git clone https://github.com/Microsoft/vcpkg.git
cd vcpkg
./bootstrap-vcpkg.sh
./vcpkg integrate install
./vcpkg install simbody
The simbody port in vcpkg is kept up to date by Microsoft team members and community contributors. If the version is out of date, please create an issue or pull request on the vcpkg repository.
We are grateful for past and continuing support for Simbody's development in Stanford's Bioengineering department through the following grants:
Prof. Scott Delp is the Principal Investigator on these grants and Simbody is used extensively in Scott's Neuromuscular Biomechanics Lab as the basis for the OpenSim biomechanical simulation software application for medical research.
Author: Simbody
Source Code: https://github.com/simbody/simbody
License: Apache-2.0 license
PyDy, short for Python Dynamics, is a toolkit written in the Python programming language that utilizes an array of scientific programs to enable the study of multibody dynamics. The goal is to have a modular framework that can provide the user with their desired workflow, including:
We started by building the SymPy mechanics package which provides an API for building models and generating the symbolic equations of motion for complex multibody systems. More recently we developed two packages, pydy.codegen and pydy.viz, for simulation and visualization of the models, respectively. This Python package contains these two packages and other tools for working with mathematical models generated from SymPy mechanics. The remaining tools currently used in the PyDy workflow are popular scientific Python packages such as NumPy, SciPy, IPython, Jupyter, ipywidgets, pythreejs, and matplotlib which provide additional code for numerical analyses, simulation, and visualization.
We recommend the conda package manager and the Anaconda or Miniconda distributions for easy cross platform installation.
Once Anaconda (or Miniconda) is installed type:
$ conda install -c conda-forge pydy
Also, a simple way to install all of the optional dependencies is to install the `pydy-optional` metapackage using conda:
$ conda install -c conda-forge pydy-optional
Note that `pydy-optional` currently enforces the use of Jupyter 4.0, so you may not want to install it into your root environment. Create a new environment for working with PyDy examples that use the embedded Jupyter visualizations:
$ conda create -n pydy -c conda-forge pydy-optional
$ conda activate pydy
(pydy)$ python -c "import pydy; print(pydy.__version__)"
If you have the pip package manager installed you can type:
$ pip install pydy
Installing from source is also supported. The latest stable version of the package can be downloaded from PyPi[1]:
$ wget https://pypi.python.org/packages/source/p/pydy/pydy-X.X.X.tar.gz
[1] Change X.X.X to the latest version number.
and extracted and installed[2]:
$ tar -zxvf pydy-X.X.X.tar.gz
$ cd pydy-X.X.X
$ python setup.py install
[2] For system-wide installs you may need root permissions (perhaps prepend commands with `sudo`).
PyDy has hard dependencies on the following software[3]:
[3] We only test PyDy with these minimum dependencies; these module versions are provided in the Ubuntu 20.04 packages. Previous versions may work.
PyDy has optional dependencies for extended code generation on:
and animated visualizations with Scene.display_jupyter()
on:
or interactive animated visualizations with Scene.display_ipython()
on:
The examples may require these dependencies:
This is an example of a simple one degree of freedom system: a mass under the influence of a spring, damper, gravity and an external force:
/ / / / / / / / /
-----------------
| | | | g
\ | | | V
k / --- c |
| | | x, v
-------- V
| m | -----
--------
| F
V
Derive the system:
from sympy import symbols
import sympy.physics.mechanics as me
mass, stiffness, damping, gravity = symbols('m, k, c, g')
position, speed = me.dynamicsymbols('x v')
positiond = me.dynamicsymbols('x', 1)
force = me.dynamicsymbols('F')
ceiling = me.ReferenceFrame('N')
origin = me.Point('origin')
origin.set_vel(ceiling, 0)
center = origin.locatenew('center', position * ceiling.x)
center.set_vel(ceiling, speed * ceiling.x)
block = me.Particle('block', center, mass)
kinematic_equations = [speed - positiond]
force_magnitude = mass * gravity - stiffness * position - damping * speed + force
forces = [(center, force_magnitude * ceiling.x)]
particles = [block]
kane = me.KanesMethod(ceiling, q_ind=[position], u_ind=[speed],
                      kd_eqs=kinematic_equations)
kane.kanes_equations(particles, loads=forces)
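For this one-joint system, the equations of motion that Kane's method returns can be checked by hand. Using the symbols defined above (mass m, stiffness k, damping c, gravity g, specified force F), they reduce to:

```latex
\dot{x} = v, \qquad m\,\dot{v} = m g - k x - c v + F(t)
```

The second equation is just Newton's second law with the `force_magnitude` expression above as the net force along the ceiling's x direction.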
Create a system to manage integration and specify numerical values for the constants and specified quantities. Here, we specify sinusoidal forcing:
from numpy import array, linspace, sin
from pydy.system import System
sys = System(kane,
             constants={mass: 1.0, stiffness: 10.0,
                        damping: 0.4, gravity: 9.8},
             specifieds={force: lambda x, t: sin(t)},
             initial_conditions={position: 0.1, speed: -1.0},
             times=linspace(0.0, 10.0, 1000))
Integrate the equations of motion to get the state trajectories:
y = sys.integrate()
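Under the hood, `integrate()` numerically steps the first-order equations of motion over `sys.times`. As a rough illustration of that idea (not PyDy's actual implementation — this hard-codes the mass-spring-damper dynamics of this example and uses a fixed-step Euler scheme instead of a real ODE solver):

```python
import math

# Illustrative sketch only: integrate x' = v, v' = (m*g - k*x - c*v + F)/m
# with the same constants and initial conditions as the PyDy example above.
m, k, c, g = 1.0, 10.0, 0.4, 9.8
x, v = 0.1, -1.0            # initial_conditions
dt, t = 0.001, 0.0
trajectory = []

while t < 10.0:
    F = math.sin(t)                        # the sinusoidal specified force
    a = (m * g - k * x - c * v + F) / m    # acceleration from the EoM
    x += v * dt                            # explicit Euler step
    v += a * dt
    t += dt
    trajectory.append((t, x, v))
```

The real `System.integrate()` delegates to a proper variable-step ODE solver and evaluates the symbolically derived equations of motion, but the loop above captures the shape of the computation.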
Plot the results:
import matplotlib.pyplot as plt
plt.plot(sys.times, y)
plt.legend((str(position), str(speed)))
plt.xlabel('Time [s]')
plt.show()
The documentation for this package is hosted at http://pydy.readthedocs.org, but you can also build it from source using the following instructions.
To build the documentation you must install the dependencies:
To build the HTML docs, run Make from within the `docs` directory:
$ cd docs
$ make html
You can then view the documentation from your preferred web browser, for example:
$ firefox _build/html/index.html
This package provides code generation facilities. It generates functions that can numerically evaluate the right hand side of the ordinary differential equations generated with sympy.physics.mechanics with three different backends: SymPy's lambdify, Theano, and Cython.
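As a toy illustration of what "code generation" means here (this is not the `pydy.codegen` API; the names below are made up): generate Python source text for a function that evaluates the right-hand side of an ODE, then compile it into a callable with `exec`. The real backends emit SymPy-lambdified, Theano, or Cython code the same way in spirit:

```python
# Toy sketch of code generation (not the pydy.codegen API): build Python
# source for an ODE right-hand side and compile it into a callable.
rhs_expr = "(-k * x - c * v) / m"   # RHS for v' of a damped spring (assumed)

source = (
    "def rhs(x, v, k, c, m):\n"
    "    # generated function: returns (x', v')\n"
    f"    return (v, {rhs_expr})\n"
)

namespace = {}
exec(source, namespace)      # compile the generated source text
rhs = namespace["rhs"]       # the numerical evaluation function

xdot, vdot = rhs(x=1.0, v=0.0, k=10.0, c=0.4, m=1.0)
```

The generated `rhs` can then be handed to any numerical integrator, which is exactly the role the pydy.codegen output plays in the workflow.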
The models module provides some canned models of classic systems.
The System module provides a `System` class to manage the simulation of a single system.
This package provides tools to create 3D animated visualizations of the systems. The visualizations utilize WebGL and run in a web browser. They can also be embedded into an IPython notebook for added interactivity.
The source code is managed with the Git version control system. To get the latest development version and access to the full repository, clone the repository from Github with:
$ git clone https://github.com/pydy/pydy.git
You should then install the dependencies for running the tests:
It is typically advantageous to setup a virtual environment to isolate the development code from other versions on your system. There are two popular environment managers that work well with Python packages: virtualenv and conda.
The following installation assumes you have virtualenvwrapper in addition to virtualenv and all the dependencies needed to build the various packages:
$ mkvirtualenv pydy-dev
(pydy-dev)$ pip install numpy scipy cython nose theano sympy ipython "notebook<5.0" "ipywidgets<5.0" version_information
(pydy-dev)$ pip install matplotlib # make sure to do this after numpy
(pydy-dev)$ git clone git@github.com:pydy/pydy.git
(pydy-dev)$ cd pydy
(pydy-dev)$ python setup.py develop
Or with conda:
$ conda create -c pydy -n pydy-dev setuptools numpy scipy ipython "notebook<5.0" "ipywidgets<5.0" cython nose theano sympy matplotlib version_information
$ source activate pydy-dev
(pydy-dev)$ git clone git@github.com:pydy/pydy.git
(pydy-dev)$ cd pydy
(pydy-dev)$ conda develop .
The full Python test suite can be run with:
(pydy-dev)$ nosetests
For the JavaScript tests the Jasmine and blanket.js libraries are used. Both of these libraries are included in pydy.viz with the source. To run the JavaScript tests:
cd pydy/viz/static/js/tests && phantomjs run-jasmine.js SpecRunner.html && cd ../../../../../
Run the benchmark to test the n-link pendulum problem with the various backends:
$ python bin/benchmark_pydy_code_gen.py <max # of links> <# of time steps>
If you make use of PyDy in your work or research, please cite us in your publications or on the web. This citation can be used:
Gilbert Gede, Dale L Peterson, Angadh S Nanjangud, Jason K Moore, and Mont Hubbard, "Constrained Multibody Dynamics With Python: From Symbolic Equation Generation to Publication", ASME 2013 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, 2013, 10.1115/DETC2013-13470.
If you have any questions about installation, usage, etc., feel free to send a message to our public mailing list.
If you think there’s a bug or you would like to request a feature, please open an issue on Github.
These are various related and similar Python packages:
Author: pydy
Source Code: https://github.com/pydy/pydy
License: View license
The structure of the Chrono git repository was changed as follows:
- the default branch is now `main` (previously `develop`)
- the `master` branch, now obsolete, was deleted
- releases are located in branches named `release/*.*` and have tags of the form `*.*.*`

Project CHRONO
Distributed under a permissive BSD license, Chrono is an open-source multi-physics package used to model and simulate:
Chrono provides a mature and stable code base that continues to be augmented with new features and modules. The core functionality of Chrono provides support for the modeling, simulation, and visualization of rigid and flexible multibody systems with additional capabilities offered through optional modules. These modules provide support for additional classes of problems (e.g., granular dynamics and fluid-solid interaction), modeling and simulation of specialized systems (such as ground vehicles), co-simulation, run-time visualization, post-processing, interfaces to external linear solvers, or specialized parallel computing algorithms (multi-core, GPU, and distributed) for large-scale simulations.
Used in many different scientific and engineering problems by researchers from academia, industry, and government, Chrono has mature and sophisticated support for multibody dynamics, finite element analysis, granular dynamics, fluid-solid interaction, ground vehicle simulation and vehicle-terrain interaction.
Implemented almost entirely in C++, Chrono also provides Python and C# APIs. The build system is based on CMake. Chrono is platform-independent and is actively tested on Linux, Windows, and macOS using a variety of compilers.
Author: Projectchrono
Source Code: https://github.com/projectchrono/chrono
License: BSD-3-Clause license
#machinelearning #cplusplus #robotics #model #physics #engine
1678726080
Maintainer: michael AT openrobotics DOT org
Gazebo Sim is an open source robotics simulator. Through Gazebo Sim, users have access to high fidelity physics, rendering, and sensor models. Additionally, users and developers have multiple points of entry to simulation including a graphical user interface, plugins, and asynchronous message passing and services.
Gazebo Sim is derived from Gazebo Classic and represents over 16 years of development and experience in robotics and simulation. This library is part of the Gazebo project.
Features
Dynamics simulation: Access multiple high-performance physics engines through Gazebo Physics.
Advanced 3D graphics: Through Gazebo Rendering, it's possible to use rendering engines such as OGRE v2 for realistic rendering of environments with high-quality lighting, shadows, and textures.
Sensors and noise models: Generate sensor data, optionally with noise, from laser range finders, 2D/3D cameras, Kinect style sensors, contact sensors, force-torque, IMU, GPS, and more, all powered by Gazebo Sensors.
Plugins: Develop custom plugins for robot, sensor, and environment control.
Graphical interface: Create, introspect and interact with your simulations through plugin-based graphical interfaces powered by Gazebo GUI.
Simulation models: Access numerous robots including PR2, Pioneer2 DX, iRobot Create, and TurtleBot, and construct environments using other physically accurate models available through Gazebo Fuel. You can also build a new model using SDF.
TCP/IP Transport: Run simulation on remote servers and interface to Gazebo Sim through socket-based message passing using Gazebo Transport.
Command line tools: Extensive command line tools for increased simulation introspection and control.
Install
See the installation tutorial.
Usage
Gazebo Sim can be run from the command line, once installed, using:
gz sim
For help, and command line options use:
gz sim -h
In the event that the installation is a mix of Debian packages and from-source builds, command line tools from `gz-tools` may not work correctly.
A workaround for a single package is to define the environment variable `GZ_CONFIG_PATH` to point to the location of the Gazebo library installation, where the YAML file for the package is found, such as:
export GZ_CONFIG_PATH=/usr/local/share/gz
However, that environment variable only takes a single path, which means that if the from-source installations are in different locations, only one can be specified.
Another workaround for working with multiple Gazebo libraries on the command line is to create symbolic links to each library's YAML file:
mkdir ~/.gz/tools/configs -p
cd ~/.gz/tools/configs/
ln -s /usr/local/share/gz/fuel8.yaml .
ln -s /usr/local/share/gz/transport12.yaml .
ln -s /usr/local/share/gz/transportlog12.yaml .
...
export GZ_CONFIG_PATH=$HOME/.gz/tools/configs
This issue is tracked here.
Documentation
See the installation tutorial.
Testing
See the installation tutorial.
See the Writing Tests section of the contributor guide for help creating or modifying tests.
Folder Structure
Refer to the following table for information about important directories and files in this repository.
gz-sim
├── examples Various examples that can be run against binary or source installs of gz-sim.
│ ├── plugin Example plugins.
│ ├── standalone Example standalone programs that use gz-sim as a library.
│ └── worlds Example SDF world files.
├── include/gz/sim Header files that downstream users are expected to use.
│ └── detail Header files that are not intended for downstream use, mainly template implementations.
├── src Source files and unit tests.
│ ├── gui Graphical interface source code.
│ └── systems System source code.
├── test
│ ├── integration Integration tests.
│ ├── performance Performance tests.
│ ├── plugins Plugins used in tests.
│ ├── regression Regression tests.
│ └── tutorials Tutorials, written in markdown.
├── Changelog.md Changelog.
├── CMakeLists.txt CMake build script.
├── Migration.md Migration guide.
└── README.md This readme.
Contributing
Please see CONTRIBUTING.md.
Code of Conduct
Please see CODE_OF_CONDUCT.md.
Versioning
This library uses Semantic Versioning. Additionally, this library is part of the Gazebo project which periodically releases a versioned set of compatible and complementary libraries. See the Gazebo website for version and release information.
Author: Gazebosim
Source Code: https://github.com/gazebosim/gz-sim
License: Unknown, Apache-2.0 licenses found
1678698540
A Near Photo-Realistic Interactable Framework for Embodied AI Agents
iTHOR | ManipulaTHOR | RoboTHOR
---|---|---
A high-level interaction framework that facilitates research in embodied common sense reasoning. | A mid-level interaction framework that facilitates visual manipulation of objects using a robotic arm. | A framework that facilitates Sim2Real research with a collection of simulated scene counterparts in the physical world.
🏡 Scenes. 200+ custom built high-quality scenes. The scenes can be explored on our demo page. We are working on rapidly expanding the number of available scenes and domain randomization within each scene.
🪑 Objects. 2600+ custom designed household objects across 100+ object types. Each object is heavily annotated, which allows for near-realistic physics interaction.
🤖 Agent Types. Multi-agent support, a custom built LoCoBot agent, a Kinova 3 inspired robotic manipulation agent, and a drone agent.
🦾 Actions. 200+ actions that facilitate research in a wide range of interaction and navigation based embodied AI tasks.
🖼 Images. First-class support for many image modalities and camera adjustments. Some modalities include ego-centric RGB images, instance segmentation, semantic segmentation, depth frames, normals frames, top-down frames, orthographic projections, and third-person camera frames. Users can also easily change camera properties, such as the size of the images and field of view.
🗺 Metadata. After each step in the environment, there is a large amount of sensory data available about the state of the environment. This information can be used to build highly complex custom reward functions.
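As a hedged sketch of that idea (the `navigation_reward` helper and `TARGET` goal are hypothetical, and the `metadata["agent"]["position"]` keys should be checked against the current AI2-THOR documentation), a simple point-goal reward could be computed from the per-step metadata like this:

```python
import math

# Hypothetical goal position for a point-goal navigation task.
TARGET = {"x": 3.0, "y": 0.9, "z": -1.5}

def navigation_reward(metadata, step_penalty=0.01):
    """Dense reward: negative Euclidean distance to the goal, minus a step cost.

    `metadata` is the dictionary available after each step; the agent-position
    keys used here are an assumption to verify against the docs.
    """
    pos = metadata["agent"]["position"]
    dist = math.sqrt(sum((pos[k] - TARGET[k]) ** 2 for k in ("x", "y", "z")))
    return -dist - step_penalty
```

In practice you would call such a function after each step with the event's metadata, and perhaps add a success bonus once the distance falls below a threshold.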
Date | Announcement
---|---
5/2021 | RandomizeMaterials is now supported! It enables a massive amount of realistic looking domain randomization within each scene. Try it out on the demo!
4/2021 | We are excited to release ManipulaTHOR, an environment within the AI2-THOR framework that facilitates visual manipulation of objects using a robotic arm. Please see the full 3.0.0 release notes here.
4/2021 | RandomizeLighting is now supported! It includes many tunable parameters to allow for vast control over its effects. Try it out on the demo!
2/2021 | We are excited to host the AI2-THOR Rearrangement Challenge, RoboTHOR ObjectNav Challenge, and ALFRED Challenge, held in conjunction with the Embodied AI Workshop at CVPR 2021.
2/2021 | AI2-THOR v2.7.0 announces several massive speedups to AI2-THOR! Read more about it here.
6/2020 | We've released 🐳 AI2-THOR Docker, a mini-framework to simplify running AI2-THOR in Docker.
4/2020 | Version 2.4.0 update of the framework is here. All sim objects that aren't explicitly part of the environmental structure are now moveable with physics interactions. New object types have been added, and many new actions have been added. Please see the full 2.4.0 release notes here.
2/2020 | AI2-THOR now includes two frameworks: iTHOR and RoboTHOR. iTHOR includes interactive objects and scenes and RoboTHOR consists of simulated scenes and their corresponding real world counterparts.
9/2019 | Version 2.1.0 update of the framework has been added. New object types have been added. New Initialization actions have been added. Segmentation image generation has been improved in all scenes.
6/2019 | Version 2.0 update of the AI2-THOR framework is now live! We have over quadrupled our action and object states, adding new actions that allow visually distinct state changes such as broken screens on electronics, shattered windows, breakable dishware, liquid fillable containers, cleanable dishware, messy and made beds and more! Along with these new state changes, objects have more physical properties like Temperature, Mass, and Salient Materials that are all reported back in object metadata. To combine all of these new properties and actions, new context sensitive interactions can now automatically change object states. This includes interactions like placing a dirty bowl under running sink water to clean it, placing a mug in a coffee machine to automatically fill it with coffee, putting out a lit candle by placing it in water, or placing an object over an active stove burner or in the fridge to change its temperature. Please see the full 2.0 release notes here to view details on all the changes and new features.
AI2-THOR Colab can be used to run AI2-THOR freely in the cloud with Google Colab. Running AI2-THOR in Google Colab makes it extremely easy to explore functionality without having to set AI2-THOR up locally.
pip install ai2thor
conda install -c conda-forge ai2thor
🐳 AI2-THOR Docker can be used, which adds the configuration for running an X server to be used by Unity 3D to render scenes.
Once you've installed AI2-THOR, you can verify that everything is working correctly by running the following minimal example:
from ai2thor.controller import Controller
controller = Controller(scene="FloorPlan10")
event = controller.step(action="RotateRight")
metadata = event.metadata
print(event, event.metadata.keys())
Component | Requirement |
---|---
OS | Mac OS X 10.9+, Ubuntu 14.04+ |
Graphics Card | DX9 (shader model 3.0) or DX11 with feature level 9.3 capabilities. |
CPU | SSE2 instruction set support. |
Python | Versions 3.5+ |
Linux | X server with GLX module enabled |
Questions. If you have any questions on AI2-THOR, please ask them on our GitHub Discussions Page.
Issues. If you encounter any issues while using AI2-THOR, please open an Issue on GitHub.
Section | Description |
---|---
Demo | Interact and play with AI2-THOR live in the browser. |
iTHOR Documentation | Documentation for the iTHOR environment. |
ManipulaTHOR Documentation | Documentation for the ManipulaTHOR environment. |
RoboTHOR Documentation | Documentation for the RoboTHOR environment. |
AI2-THOR Colab | A way to run AI2-THOR freely on the cloud using Google Colab. |
AllenAct | An Embodied AI framework built at AI2 that provides first-class support for AI2-THOR. |
AI2-THOR Unity Development | A (sparse) collection of notes that may be useful if editing on the AI2-THOR backend. |
AI2-THOR WebGL Development | Documentation on packaging AI2-THOR for the web, which might be useful for annotation based tasks. |
If you use AI2-THOR or iTHOR scenes, please cite the original AI2-THOR paper:
@article{ai2thor,
author={Eric Kolve and Roozbeh Mottaghi and Winson Han and
Eli VanderBilt and Luca Weihs and Alvaro Herrasti and
Daniel Gordon and Yuke Zhu and Abhinav Gupta and
Ali Farhadi},
title={{AI2-THOR: An Interactive 3D Environment for Visual AI}},
journal={arXiv},
year={2017}
}
If you use 🏘️ ProcTHOR or procedurally generated scenes, please cite the following paper:
@inproceedings{procthor,
author={Matt Deitke and Eli VanderBilt and Alvaro Herrasti and
Luca Weihs and Jordi Salvador and Kiana Ehsani and
Winson Han and Eric Kolve and Ali Farhadi and
Aniruddha Kembhavi and Roozbeh Mottaghi},
title={{ProcTHOR: Large-Scale Embodied AI Using Procedural Generation}},
booktitle={NeurIPS},
year={2022},
note={Outstanding Paper Award}
}
If you use ManipulaTHOR agent, please cite the following paper:
@inproceedings{manipulathor,
title={{ManipulaTHOR: A Framework for Visual Object Manipulation}},
author={Kiana Ehsani and Winson Han and Alvaro Herrasti and
Eli VanderBilt and Luca Weihs and Eric Kolve and
Aniruddha Kembhavi and Roozbeh Mottaghi},
booktitle={CVPR},
year={2021}
}
If you use RoboTHOR scenes, please cite the following paper:
@inproceedings{robothor,
author={Matt Deitke and Winson Han and Alvaro Herrasti and
Aniruddha Kembhavi and Eric Kolve and Roozbeh Mottaghi and
Jordi Salvador and Dustin Schwenk and Eli VanderBilt and
Matthew Wallingford and Luca Weihs and Mark Yatskar and
Ali Farhadi},
title={{RoboTHOR: An Open Simulation-to-Real Embodied AI Platform}},
booktitle={CVPR},
year={2020}
}
AI2-THOR is an open-source project built by the PRIOR team at the Allen Institute for AI (AI2). AI2 is a non-profit institute with the mission to contribute to humanity through high-impact AI research and engineering.
Author: allenai
Source Code: https://github.com/allenai/ai2thor
License: Apache-2.0 license
#machinelearning #python #computervision #physics #engine #artificial #intelligence
1678499040
This Python module adds a quaternion dtype to NumPy.
The code was originally based on code by Martin Ling (which he wrote with help from Mark Wiebe), but has been rewritten with ideas from rational to work with both python 2.x and 3.x (and to fix a few bugs), and greatly expands the applications of quaternions.
See also the pure-python package quaternionic.
conda install -c conda-forge quaternion
or
python -m pip install --upgrade --force-reinstall numpy-quaternion
Optionally add `--user` after `install` in the second command if you're not using a python environment (though you should start using one).
The basic requirements for this code are reasonably current versions of `python` and `numpy`. In particular, `python` versions 3.8 through 3.10 are routinely tested. Earlier `python` versions, including 2.7, will work with older versions of this package; they might still work with more recent versions of this package, but even numpy no longer supports `python` versions previous to 3.8, so your mileage may vary. Also, any `numpy` version greater than 1.13.0 should work, but the tests are run on the most recent release at the time of the test.
However, certain advanced functions in this package (including `squad`, `mean_rotor_in_intrinsic_metric`, `integrate_angular_velocity`, and related functions) require `scipy` and can automatically use `numba`. Scipy is a standard python package for scientific computation, and implements interfaces to C and Fortran codes for optimization (among other things) needed for finding mean and optimal rotors. Numba uses LLVM to compile python code to machine code, accelerating many numerical functions by factors of anywhere from 2 to 2000. It is possible to run all the code without `numba`, but these particular functions can be anywhere from 4 to 400 times slower without it.
Both `scipy` and `numba` can be installed with `pip` or `conda`. However, because `conda` is specifically geared toward scientific python, it is generally more robust for these more complicated packages. In fact, the main `anaconda` package comes with both `numba` and `scipy`. If you prefer the smaller download size of `miniconda` (which comes with minimal extras), you'll also have to run this command:
conda install numpy scipy numba
Assuming you use `conda` to manage your python installation (which is currently the preferred choice for science and engineering with python), you can install this package simply as
conda install -c conda-forge quaternion
If you prefer to use `pip`, you can instead do
python -m pip install --upgrade --force-reinstall numpy-quaternion
(See here for a veteran python core contributor's explanation of why you should always use `python -m pip` instead of just `pip` or `pip3`.) The `--upgrade --force-reinstall` options are not always necessary, but will ensure that pip will update numpy if it has to.
If you refuse to use `conda`, you might want to install inside your home directory without root privileges. (Conda does this by default anyway.) This is done by adding `--user` to the above command:
python -m pip install --user --upgrade --force-reinstall numpy-quaternion
Note that pip will attempt to compile the code, which requires a working `C` compiler.
Finally, there's also the fully manual option of just downloading the code, changing to the code directory, and running
python -m pip install --upgrade --force-reinstall .
This should work regardless of the installation method, as long as you have a compiler hanging around.
The full documentation can be found on Read the Docs, and most functions have docstrings that should explain the relevant points. The following are mostly for the purposes of example.
>>> import numpy as np
>>> import quaternion
>>> np.quaternion(1,0,0,0)
quaternion(1, 0, 0, 0)
>>> q1 = np.quaternion(1,2,3,4)
>>> q2 = np.quaternion(5,6,7,8)
>>> q1 * q2
quaternion(-60, 12, 30, 24)
>>> a = np.array([q1, q2])
>>> a
array([quaternion(1, 2, 3, 4), quaternion(5, 6, 7, 8)], dtype=quaternion)
>>> np.exp(a)
array([quaternion(1.69392, -0.78956, -1.18434, -1.57912),
quaternion(138.909, -25.6861, -29.9671, -34.2481)], dtype=quaternion)
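The multiplication above follows the Hamilton convention. As an illustrative sketch (the `hamilton_product` helper is not part of the package), the componentwise formula can be written in plain Python and checked against the result shown:

```python
def hamilton_product(q1, q2):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return (
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,  # scalar part
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,  # x
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,  # y
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,  # z
    )

print(hamilton_product((1, 2, 3, 4), (5, 6, 7, 8)))  # (-60, 12, 30, 24)
```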
Note that this package represents a quaternion as a scalar, followed by the `x` component of the vector part, followed by `y`, followed by `z`. These components can be accessed directly:
>>> q1.w, q1.x, q1.y, q1.z
(1.0, 2.0, 3.0, 4.0)
However, this only works on an individual `quaternion`; for arrays it is better to use "vectorized" operations like `as_float_array`.
The following ufuncs are implemented (which means they run fast on numpy arrays):
add, subtract, multiply, divide, log, exp, power, negative, conjugate,
copysign, equal, not_equal, less, less_equal, isnan, isinf, isfinite, absolute
Quaternion components are stored as double-precision floating point numbers: `float`s, in python language, or `float64` in more precise numpy language. Numpy arrays with `dtype=quaternion` can be accessed as arrays of doubles without any (slow, memory-consuming) copying of data; rather, a `view` of the exact same memory space can be created within a microsecond, regardless of the shape or size of the quaternion array.
Comparison operations follow the same lexicographic ordering as tuples.
The unary tests isnan and isinf return true if they would return true for any individual component; isfinite returns true if it would return true for all components.
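A minimal sketch of those componentwise semantics in plain Python (the `quat_isnan` and `quat_isfinite` helpers here are illustrative stand-ins, not the package's actual ufuncs, which operate on whole arrays):

```python
import math

def quat_isnan(q):
    """True if ANY of the four components (w, x, y, z) is NaN."""
    return any(math.isnan(c) for c in q)

def quat_isfinite(q):
    """True only if ALL four components are finite."""
    return all(math.isfinite(c) for c in q)

print(quat_isnan((1.0, float("nan"), 0.0, 0.0)))     # True
print(quat_isfinite((1.0, float("inf"), 0.0, 0.0)))  # False
```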
Real types may be cast to quaternions, giving quaternions with zero for all three imaginary components. Complex types may also be cast to quaternions, with their single imaginary component becoming the first imaginary component of the quaternion. Quaternions may not be cast to real or complex types.
Several array-conversion functions are also included. For example, to convert an Nx4 array of floats to an N-dimensional array of quaternions, use `as_quat_array`:
>>> import numpy as np
>>> import quaternion
>>> a = np.random.rand(7, 4)
>>> a
array([[ 0.93138726, 0.46972279, 0.18706385, 0.86605021],
[ 0.70633523, 0.69982741, 0.93303559, 0.61440879],
[ 0.79334456, 0.65912598, 0.0711557 , 0.46622885],
[ 0.88185987, 0.9391296 , 0.73670503, 0.27115149],
[ 0.49176628, 0.56688076, 0.13216632, 0.33309146],
[ 0.11951624, 0.86804078, 0.77968826, 0.37229404],
[ 0.33187593, 0.53391165, 0.8577846 , 0.18336855]])
>>> qs = quaternion.as_quat_array(a)
>>> qs
array([ quaternion(0.931387262880247, 0.469722787598354, 0.187063852060487, 0.866050210100621),
quaternion(0.706335233363319, 0.69982740767353, 0.933035590130247, 0.614408786768725),
quaternion(0.793344561317281, 0.659125976566815, 0.0711557025000925, 0.466228847713644),
quaternion(0.881859869074069, 0.939129602918467, 0.736705031709562, 0.271151494174001),
quaternion(0.491766284854505, 0.566880763189927, 0.132166320200012, 0.333091463422536),
quaternion(0.119516238634238, 0.86804077992676, 0.779688263524229, 0.372294043850009),
quaternion(0.331875925159073, 0.533911652483908, 0.857784598617977, 0.183368547490701)], dtype=quaternion)
[Note that quaternions are printed with full precision, unlike floats, which is why you see extra digits above. But the actual data is identical in the two cases.] To convert an N-dimensional array of quaternions to an Nx4 array of floats, use `as_float_array`:
>>> b = quaternion.as_float_array(qs)
>>> b
array([[ 0.93138726, 0.46972279, 0.18706385, 0.86605021],
[ 0.70633523, 0.69982741, 0.93303559, 0.61440879],
[ 0.79334456, 0.65912598, 0.0711557 , 0.46622885],
[ 0.88185987, 0.9391296 , 0.73670503, 0.27115149],
[ 0.49176628, 0.56688076, 0.13216632, 0.33309146],
[ 0.11951624, 0.86804078, 0.77968826, 0.37229404],
[ 0.33187593, 0.53391165, 0.8577846 , 0.18336855]])
It is also possible to convert a quaternion to or from a 3x3 array of floats representing a rotation matrix, or an array of N quaternions to or from an Nx3x3 array of floats representing N rotation matrices, using `as_rotation_matrix` and `from_rotation_matrix`. Similar conversions are possible for rotation vectors using `as_rotation_vector` and `from_rotation_vector`, and for spherical coordinates using `as_spherical_coords` and `from_spherical_coords`. Finally, it is possible to derive the Euler angles from a quaternion using `as_euler_angles`, or create a quaternion from Euler angles using `from_euler_angles` — though be aware that Euler angles are basically the worst things ever.1 Before you complain about those functions using something other than your favorite conventions, please read this page.
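For orientation, the rotation-matrix conversion can be sketched in plain NumPy using the standard unit-quaternion formula (an illustrative stand-in, not the package's vectorized `as_rotation_matrix` implementation):

```python
import numpy as np

def rotation_matrix_from_quaternion(w, x, y, z):
    """3x3 rotation matrix for the quaternion (w, x, y, z)."""
    # Normalize so a non-unit quaternion still yields a proper rotation.
    n = np.sqrt(w * w + x * x + y * y + z * z)
    w, x, y, z = w / n, x / n, y / n, z / n
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - z * w), 2 * (x * z + y * w)],
        [2 * (x * y + z * w), 1 - 2 * (x * x + z * z), 2 * (y * z - x * w)],
        [2 * (x * z - y * w), 2 * (y * z + x * w), 1 - 2 * (x * x + y * y)],
    ])

# A rotation of 90 degrees about the z-axis:
R = rotation_matrix_from_quaternion(np.cos(np.pi / 4), 0, 0, np.sin(np.pi / 4))
```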
Bug reports and feature requests are entirely welcome (with very few exceptions). The best way to do this is to open an issue on this code's github page. For bug reports, please try to include a minimal working example demonstrating the problem.
Pull requests are also entirely welcome, of course, if you have an idea where the code is going wrong, or have an idea for a new feature that you know how to implement.
This code is routinely tested on recent versions of both python (3.8 through 3.10) and numpy (>=1.13). But the test coverage is not necessarily as complete as it could be, so bugs may certainly be present, especially in the higher-level functions like `mean_rotor_...`.
This code is, of course, hosted on github. Because it is an open-source project, the hosting is free, and all the wonderful features of github are available, including free wiki space and web page hosting, pull requests, a nice interface to the git logs, etc. Github user Hannes Ovrén (hovren) pointed out some errors in a previous version of this code and suggested some nice utility functions for rotation matrices, etc. Github user Stijn van Drongelen (rhymoid) contributed some code that makes compilation work with MSVC++. Github user Jon Long (longjon) has provided some elegant contributions to substantially improve several tricky parts of this code. Rebecca Turner (9999years) and Leo Stein (duetosymmetry) did all the work in getting the documentation onto Read the Docs.
Every change in this code is automatically tested on Travis-CI. This service integrates beautifully with github, detecting each commit and automatically re-running the tests. The code is downloaded and installed fresh each time, and then tested, on each of the five different versions of python. This ensures that no change I make to the code breaks either installation or any of the features that I have written tests for. Travis-CI also automatically builds the `conda` and `pip` versions of the code hosted on anaconda.org and pypi respectively. These are all free services for open-source projects like this one.
The work of creating this code was supported in part by the Sherman Fairchild Foundation and by NSF Grants No. PHY-1306125 and AST-1333129.
1 Euler angles are awful
Euler angles are pretty much the worst things ever and it makes me feel bad even supporting them. Quaternions are faster, more accurate, basically free of singularities, more intuitive, and generally easier to understand. You can work entirely without Euler angles (I certainly do). You absolutely never need them. But if you really can't give them up, they are mildly supported.
Author: Moble
Source Code: https://github.com/moble/quaternion
License: MIT license
1670407980
When it comes to game development, a lot of games will require interaction between the player and the environment or the player and another player. For example, you’d probably want to prevent your player from falling through floors or make them lose health every time an opponent collides with them.
There are numerous strategies towards handling collisions in your game. You could identify the position and bounding boxes for every sprite in your game, something I don’t recommend, or you could make use of the physics engine that’s available with your game development framework.
The past few tutorials have focused on Phaser, so we’re going to proceed with developing a game using that framework.
In this tutorial, we’re going to look at using arcade physics, one of several physics engines available in Phaser 3.x, and we’re going to see how to handle collisions.
To get an idea of what we’re going to accomplish, take a look at the following animated image:
In the above image, we’re working with two sprites, one of which we already saw in a previous tutorial titled, Animate a Compressed Sprite Atlas in a Phaser Game. Essentially we have our player, the plane, and a totally random obstacle. The obstacle moves towards the plane, and when it touches the collision boundaries, an event happens on the plane. The bounding boxes are visible for demonstration purposes.
With arcade physics in Phaser 3.x, you can use boxes or circles for collision boundaries. In a future tutorial we’re going to see how to use more tightly defined collision boundaries using another physics engine, but for now we’re sticking to arcade physics. The benefit of arcade physics is that it is very easy to use and will give you the best performance in your game.
If you haven’t already checked out the previous tutorial around animating the plane, I encourage you to do so. The code from that tutorial will not be thoroughly explained in this tutorial.
Before we start worrying about our physics engine and the collision details that come with it, we’re going to get our project ready to go. Create a directory on your computer and include an index.html file with the following markup:
<!DOCTYPE html>
<html>
<head>
<script src="//cdn.jsdelivr.net/npm/phaser@3.24.1/dist/phaser.min.js"></script>
</head>
<body>
<div id="game"></div>
<script>
const phaserConfig = {
type: Phaser.AUTO,
parent: "game",
width: 1280,
height: 720,
backgroundColor: "#5DACD8",
scene: {
init: initScene,
preload: preloadScene,
create: createScene,
update: updateScene
}
};
const game = new Phaser.Game(phaserConfig);
var plane, obstacle;
var isGameOver = false;
function initScene() { }
function preloadScene() {
this.load.atlas("plane", "plane.png", "plane.json");
this.load.image("obstacle", "obstacle.png");
}
function createScene() {
this.anims.create({
key: "fly",
frameRate: 7,
frames: this.anims.generateFrameNames("plane", {
prefix: "plane",
suffix: ".png",
start: 1,
end: 3,
zeroPad: 1
}),
repeat: -1
});
this.anims.create({
key: "explode",
frameRate: 7,
frames: this.anims.generateFrameNames("plane", {
prefix: "explosion",
suffix: ".png",
start: 1,
end: 3,
zeroPad: 1
}),
repeat: 2
});
}
function updateScene() {}
</script>
</body>
</html>
The above code configures our Phaser 3.x game and initializes some of our media assets as well as animations. If you’re not sure what a sprite atlas is, check out my previous tutorial. While I did include the spritesheet and atlas in my other tutorial, go ahead and create any kind of image that you want for the obstacle. In fact, you don’t need to be using any animated sprites at all for this example. Just come up with two different image files and you’ll be fine progressing through this tutorial.
If you ran the game right now, you should end up with a blue screen because we’re not showing any of our media assets and the animations we created are attached to nothing.
With the foundation of our game in place, we can start creating sprites from our image assets. However, we’re not going to create the sprites the same as we’ve seen in other tutorials. Instead we need to create physics sprites.
Within the index.html file, change your `phaserConfig` to look like the following:
const phaserConfig = {
type: Phaser.AUTO,
parent: "game",
width: 1280,
height: 720,
backgroundColor: "#5DACD8",
physics: {
default: "arcade",
arcade: {
debug: true
}
},
scene: {
init: initScene,
preload: preloadScene,
create: createScene,
update: updateScene
}
};
We’ve enabled our default physics engine and we’ve enabled debug mode. With debug mode enabled, we can see our collision boundaries on each of our objects. When you publish your game, just disable debug mode so those boxes or circles disappear.
Now that we’ve enabled arcade physics, we can add our sprites. Within the `createScene` function of our index.html file, add the following:
function createScene() {
// Animations ...
plane = this.physics.add.sprite(300, 360, "plane");
plane.play("fly");
obstacle = this.physics.add.sprite(1100, 360, "obstacle");
}
Like I said, we’re not creating our sprites how we’ve done it in other tutorials. However, creating a physics enabled sprite isn’t much different. The above code creates two different sprites with no physics beyond the boxed collision bodies. This means there is no gravity, no friction, nothing that you’d expect from game related physics.
Had we wanted to use a circle boundary instead of a box boundary, we could have done something like this:
plane = this.physics.add.sprite(300, 360, "plane");
plane.setCircle(300);
plane.play("fly");
You would call the `setCircle` method on the sprite and specify the size of the circle. This would replace the default bounding box with the circle.
Now that we have active collision bodies with the arcade physics on our sprites, we can monitor for when the collisions happen. If we did absolutely nothing right now, when the sprites collide, they’d just pass through each other and we wouldn’t even know a collision happened. If we had other physics data set, collisions might stop the movement due to friction, resistance, etc., but not for this example.
Before we start looking at the collision event, let’s make our obstacle move.
Within the `updateScene` function of our index.html file, include the following:
function updateScene() {
obstacle.x -= 4;
}
The `updateScene` function is constantly called by Phaser, so every time it is called, we decrease the obstacle’s horizontal position. This will simulate movement in the left direction.
Go ahead and test it out.
Now let’s figure out when the collision happens.
In the `createScene` function of our project, include the following:
function createScene() {
    // Animations ...
    plane = this.physics.add.sprite(300, 360, "plane");
    plane.play("fly");
    obstacle = this.physics.add.sprite(1100, 360, "obstacle");
    this.physics.add.collider(plane, obstacle, function (plane, obstacle) {
        if (!isGameOver) {
            plane.play("explode");
            plane.once(Phaser.Animations.Events.SPRITE_ANIMATION_COMPLETE, () => {
                plane.destroy();
            });
            isGameOver = true;
        }
    });
}
Notice that we’re using the collider method in the above code. We’re specifying what should happen when two sprites collide: the callback function is executed, and the two colliding sprites are passed into it. The names of the callback parameters do not need to match the variable names of the sprites, but I matched them for readability.
The collider method will trigger for as long as the two sprites are touching, including while they pass through each other. For this reason we don’t want to continuously replay our animation or try to destroy a sprite that might not exist, which is why we’re making use of an isGameOver boolean variable. If we’ve just now collided for the first time, we play the new animation and toggle the boolean; when the animation is done, we remove the sprite.
How you work with collisions is up to you. You could decrease player health, increase score, or do something else.
You just saw how to handle collisions in a Phaser 3.x game using arcade physics. Phaser is compatible with quite a few physics engines, but the arcade physics engine is the most efficient, at the cost of supporting only bounding boxes and circles as collision bodies. This tutorial expanded on a previous tutorial I wrote around animated sprite atlases, which is why I didn’t go into detail on the animation component.
While I didn’t show it in this tutorial, there is also an overlap method that behaves similarly to the collider method. Depending on the functionality you want, it might be worth checking out.
In a future tutorial we’re going to explore a different physics engine and see how we can get more detailed collision boundaries that aren’t restricted to boxes and circles.
Original article source at: https://www.thepolyglotdeveloper.com/
1670401755
I recently wrote about handling collisions in a Phaser 3.x game. In that previous tutorial, titled Handle Collisions Between Sprites in Phaser with Arcade Physics, the focus was on the arcade physics engine that Phaser integrates with.
While you should use the arcade physics engine whenever possible, due to its speed and efficiency, sometimes working with box and circle physics bodies isn’t enough.
This is where Matter.js comes in!
Matter.js is another physics engine supported in Phaser 3.x. It offers quite a bit of functionality that arcade physics doesn’t, including custom polygon physics bodies.
In this tutorial, we’re going to explore collisions once more in a Phaser game, but this time with Matter.js and more refined boundaries.
To get an idea of what we are going to build, take a look at the following animated image:
In the above example, we have two sprites, one of which is animated. Both sprites have a custom physics body which is fitted to the image. While this isn’t pixel perfect due to using polygons, it is a lot more refined than what we saw in the previous tutorial. The physics body is only visible for demonstration purposes and can be hidden in a realistic scenario.
For this tutorial we’re not going to explore how to animate sprites, even though some code around animation will be included. If you want to learn how to animate a spritesheet that has an atlas file, check out my tutorial titled Animate a Compressed Sprite Atlas in a Phaser Game.
To get us up to speed when it comes to collisions in a Phaser game, we need to add a foundation to our project. Essentially we need to add some code that was seen in other Phaser 3.x tutorials on the blog.
On your computer, create a new directory with an index.html file that contains the following markup:
<!DOCTYPE html>
<html>
    <head>
        <script src="//cdn.jsdelivr.net/npm/phaser@3.24.1/dist/phaser.min.js"></script>
    </head>
    <body>
        <div id="game"></div>
        <script>
            const phaserConfig = {
                type: Phaser.AUTO,
                parent: "game",
                width: 1280,
                height: 720,
                backgroundColor: "#5DACD8",
                scene: {
                    init: initScene,
                    preload: preloadScene,
                    create: createScene,
                    update: updateScene
                }
            };
            const game = new Phaser.Game(phaserConfig);
            var plane, obstacle;
            var spritePhysics;
            function initScene() { }
            function preloadScene() {
                this.load.atlas("plane", "plane.png", "plane.json");
                this.load.image("obstacle", "obstacle.png");
            }
            function createScene() {
                this.anims.create({
                    key: "fly",
                    frameRate: 7,
                    frames: this.anims.generateFrameNames("plane", {
                        prefix: "plane",
                        suffix: ".png",
                        start: 1,
                        end: 3,
                        zeroPad: 1
                    }),
                    repeat: -1
                });
                this.anims.create({
                    key: "explode",
                    frameRate: 7,
                    frames: this.anims.generateFrameNames("plane", {
                        prefix: "explosion",
                        suffix: ".png",
                        start: 1,
                        end: 3,
                        zeroPad: 1
                    }),
                    repeat: 2
                });
            }
            function updateScene() { }
        </script>
    </body>
</html>
Once again, the assumption is that you are either familiar with sprite atlas concepts or you’ve seen my previous tutorial on the subject. I’ve included the spritesheet and the atlas file in that tutorial, in case you want to use them here. For the obstacle, you can use any image.
With the foundation of our project in place, now we can move onto the part of the tutorial that matters.
Before we can add physics to our sprites, we need to define the physics engine that we plan to use in our Phaser 3.x game. To do this, we need to edit the phaserConfig object in the index.html file:
const phaserConfig = {
    type: Phaser.AUTO,
    parent: "game",
    width: 1280,
    height: 720,
    backgroundColor: "#5DACD8",
    physics: {
        default: "matter",
        matter: {
            debug: true
        }
    },
    scene: {
        init: initScene,
        preload: preloadScene,
        create: createScene,
        update: updateScene
    }
};
Notice that we’ve defined our default physics engine and we’ve also enabled debug mode for it. Debug mode for this example will outline the physics body on each of our sprites. If you don’t want to see the physics bodies, just disable debug mode.
This particular example doesn’t use gravity on our sprites. We could disable gravity on a per-sprite basis, or we could disable it for the entire scene; it probably makes sense to do it for the entire scene. This can be done by adding the following to our createScene function:
function createScene() {
    // Animations ...
    this.matter.world.disableGravity();
}
Things are about to get potentially more complicated.
To define a custom physics body for our sprites, we need to create a JSON configuration file with the appropriate information. This is similar to the configuration file that acts as the atlas for our spritesheet. You could create this file by hand, but I’d strongly recommend against it. I use a tool called PhysicsEditor, which is by the same creator as TexturePacker. You don’t have to use this particular tool, but I strongly recommend against creating the file by hand.
For simplicity, you can download my sprite-physics.json file.
Do note that the configuration information matches the spritesheet from my previous tutorial for the plane. It also matches an obstacle that I didn’t provide. Even so, the file is enough to get a general idea from.
So let’s add some Matter.js powered sprites!
Within the preloadScene function, we need to load the JSON file that represents our physics data. To do this, alter the function to look like the following:
function preloadScene() {
    this.load.atlas("plane", "plane.png", "plane.json");
    this.load.image("obstacle", "obstacle.png");
    this.load.json("sprites", "sprite-physics.json");
}
The above code assumes that your physics file is named sprite-physics.json like mine.
Now, within the createScene function of the index.html file, add the following:
function createScene() {
    // Animations ...
    this.matter.world.disableGravity();
    spritePhysics = this.cache.json.get("sprites");
    plane = this.matter.add.sprite(300, 360, "plane", "plane1.png", { shape: spritePhysics.plane });
    plane.play("fly");
    obstacle = this.matter.add.sprite(1100, 360, "obstacle", null, { shape: spritePhysics.obstacle });
}
In the above code, we are accessing the sprites asset, which is the name we associated with the sprite-physics.json file. As per my configuration, I have physics information for plane as well as obstacle, which we attach to each sprite. For clarity, the plane sprite uses plane1.png rather than null because plane1.png is the first frame of the animation, while the obstacle is not animated.
To move the obstacle, change the updateScene function to look like the following:
function updateScene() {
    obstacle.x -= 4;
}
If we did nothing else and ran our game, the obstacle would move until it collides with the plane. When a collision happens, we won’t know about it, and the obstacle will continue to move beyond the plane. This is because in the physics file, both physics bodies are sensor bodies, which means collisions are only reported, not physically resolved. Had these physics bodies not been sensors, the sprites would collide and stop moving, or one of the sprites would be pushed back.
Even though collisions are reported, we’re not looking for them yet.
To be useful, we’re going to want to do something when a collision happens. In our example we want the plane to explode when colliding with an obstacle. This means we need to track our collisions.
Modify the createScene function to look like the following:
function createScene() {
    // Animations ...
    this.matter.world.disableGravity();
    spritePhysics = this.cache.json.get("sprites");
    plane = this.matter.add.sprite(300, 360, "plane", "plane1.png", { shape: spritePhysics.plane });
    plane.play("fly");
    obstacle = this.matter.add.sprite(1100, 360, "obstacle", null, { shape: spritePhysics.obstacle });
    this.matter.world.on("collisionstart", (event, bodyA, bodyB) => {
        if ((bodyA.label == "plane" && bodyB.label == "obstacle") || (bodyB.label == "plane" && bodyA.label == "obstacle")) {
            if (plane.anims.getCurrentKey() != "explode") {
                plane.play("explode");
                plane.once(Phaser.Animations.Events.SPRITE_ANIMATION_COMPLETE, () => {
                    plane.destroy();
                });
            }
        }
    });
}
Tracking collisions with Matter.js is not as easy as tracking them with arcade physics.
We need to look for a collision event between two physics bodies. Then we need to make sure the collision happened between a plane and an obstacle, and not something else. If the correct collision happened, we need to make sure our plane isn’t already exploding. If it isn’t, we can start the explosion animation and destroy the plane when the animation ends.
You just saw how to use Matter.js as the physics engine in a Phaser 3.x game. While the focus of this tutorial was on collisions, a physics engine can be used for so much more.
If you want to see how to use arcade physics, I strongly recommend checking out my previous tutorial, which uses the same example.
Original article source at: https://www.thepolyglotdeveloper.com/
1669790472
The Fourier transform has a million applications across all sorts of fields in science and math. But one of the very deepest arises in quantum mechanics, where it provides a map between two parallel descriptions of a quantum particle: one in terms of the position space wavefunction, and a dual description in terms of the momentum space wavefunction. Understanding this connection is also one of the best ways of learning what the Fourier transform really means.
We'll start by thinking about the quantum mechanics of a particle on a circle, which requires that the wavefunction be periodic. That lets us expand it in a Fourier series---a superposition of many sine and cosine functions, or equivalently complex exponential functions. We'll see that these individual Fourier waves are the eigenfunctions of the quantum momentum operator, and the corresponding eigenvalues are the numbers we can get when we go to measure the momentum of the particle. The coefficients of the Fourier series tell us the probabilities of which value we'll get.
Then, by taking the limit where the radius of this circular space goes to infinity, we'll return to the quantum mechanics of a particle on an infinite line. And what we'll discover is that the full-fledged Fourier transform emerges directly from the Fourier series in this limit, and that gives us a powerful intuition for understanding what the Fourier transform means. We'll look at an example that shows that when the position space wavefunction is a narrow spike, so that we have a good idea of where the particle is in space, the momentum space wavefunction will be spread out across a huge range. By knowing the position of the particle precisely, we don't have a clue what the momentum will be, and vice-versa! This is the Heisenberg uncertainty principle in action.
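The limiting procedure described above can be summarized in a few formulas. On a circle of radius $R$, periodicity restricts the momenta to a discrete set and the wavefunction is a Fourier series; as $R \to \infty$ the momentum spacing shrinks to zero and the series becomes the Fourier transform:

```latex
% Particle on a circle of radius R: periodic wavefunction, discrete momenta
\psi(x) = \frac{1}{\sqrt{2\pi R}} \sum_{n=-\infty}^{\infty} c_n\, e^{i n x / R},
\qquad p_n = \frac{n\hbar}{R}

% The Fourier waves are momentum eigenfunctions:
\hat{p}\, e^{i p x/\hbar} = -i\hbar \frac{\partial}{\partial x}\, e^{i p x/\hbar} = p\, e^{i p x/\hbar}

% As R \to \infty the spacing \Delta p = \hbar / R \to 0 and the sum becomes an integral:
\psi(x) = \frac{1}{\sqrt{2\pi\hbar}} \int_{-\infty}^{\infty} \widetilde{\psi}(p)\, e^{i p x/\hbar}\, dp,
\qquad
\widetilde{\psi}(p) = \frac{1}{\sqrt{2\pi\hbar}} \int_{-\infty}^{\infty} \psi(x)\, e^{-i p x/\hbar}\, dx
```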
0:00 Introduction
2:56 The Fourier series
16:08 The Fourier transform
25:37 An example
Develop a deep understanding of the Fourier transform by appreciating the critical role it plays in quantum mechanics! Get the notes for free here: https://courses.physicswithelliot.com/notes-sign-up
Sign up for my newsletter for additional physics lessons: https://www.physicswithelliot.com/sign-up
Subscribe: https://www.youtube.com/@PhysicswithElliot/featured
1667793578
Magnetic Field using Biot-Savart law: Circular Loop and Long Wire
This video covers the Biot-Savart law and uses it to calculate the magnetic field produced by a current loop. In the second example I show how to evaluate the field produced by a long wire, solving the integral both by trigonometric substitution and by using integral tables.
The Biot-Savart law is an equation that gives the magnetic field produced by a current-carrying segment. This segment is treated as a vector quantity known as the current element.
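For reference, the quantities computed in the video take this standard form. The Biot-Savart law gives the field contribution of a current element, and integrating it yields the textbook results for a circular loop and an infinite straight wire:

```latex
% Biot-Savart law: field of a current element I\,d\vec{l}
d\vec{B} = \frac{\mu_0}{4\pi} \frac{I\, d\vec{l} \times \hat{r}}{r^2}

% Circular loop of radius R, on the axis at distance z from the center:
B = \frac{\mu_0 I R^2}{2\,(R^2 + z^2)^{3/2}}
\quad\xrightarrow{\;z = 0\;}\quad
B = \frac{\mu_0 I}{2R}

% Infinite straight wire, at perpendicular distance d:
B = \frac{\mu_0 I}{2\pi d}
```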
Subscribe: https://www.youtube.com/c/OnlinePhysicsNinja/featured
1667556060
A Julia package for dynamical billiard systems in two dimensions. The goal of the package is to provide a flexible and intuitive framework for fast implementation of billiard systems of arbitrary construction.
If you have used this package for research that resulted in a publication, please be kind enough to cite the papers listed in the CITATION.bib file.
Please see the documentation for a list of features, tutorials, and installation instructions.
This package is mainly developed by George Datseris. However, this development would not have been possible without significant help from other people.
Author: JuliaDynamics
Source Code: https://github.com/JuliaDynamics/DynamicalBilliards.jl
License: View license
1667536620
Photonic Crystals in Julia
Peacock.jl - or the Plane-wave Expansion Approach to Characterising Optical Crystals in k-space - is a Julia package for studying photonic crystals using the Plane Wave Expansion Method.
Photonic crystals are materials whose optical properties arise from the structuring of the material when the size of the structures is comparable to the wavelengths of light. Peacock.jl is named for the iridescent colours of peacock feathers, which arise not from pigmentation but from their photonic crystal structure, as shown below.
Peacock by allanlau2000 from pixabay. Feather by suju from pixabay. Electron microscope image of photonic crystal structure from Zi, Jian, et al. "Coloration strategies in peacock feathers.", Proceedings of the National Academy of Sciences 100.22 (2003): 12576-12578. Copyright (2003) National Academy of Sciences.
Photonic crystals occur naturally in animals such as peacocks, but advances in nanofabrication also mean that 'designer' photonic crystals can now be manufactured for unprecedented control over the flow of light, with applications ranging from optical fibers to photonic circuitry. Photonic crystals are also a promising platform for more exotic materials like topological insulators.
julia> ]
pkg> add Peacock
For more info on the package and usage instructions, see the documentation.
Solve for...
Focused on ease of use
Peacock.Zoo submodule.
If you use Peacock.jl in your work, please consider citing us as:
@article{palmer2020peacock,
    title={Peacock.jl: Photonic crystals in {Julia}},
    author={Palmer, Samuel J and Giannini, Vincenzo},
    journal={Journal of Open Source Software},
    volume={5},
    number={54},
    pages={2678},
    year={2020}
}
Author: sp94
Source Code: https://github.com/sp94/Peacock.jl
License: MIT license
1667513460
Chipmunk.jl is a Julia binding of the popular physics engine Chipmunk (https://chipmunk-physics.net/; GitHub: https://github.com/slembcke/Chipmunk2D). This is currently very much a work in progress.
Take a look at the examples/ folder to see what you can do with Chipmunk.jl.
Chipmunk.jl uses SFML.jl (https://github.com/zyedidia/SFML.jl) to render the world to the screen.
Chipmunk.jl also requires Julia 0.4.
Installation
Installation is quite simple. The package will clone and install Chipmunk from source to deps/. Make sure that you have cmake installed so that it can build Chipmunk.
I have not been able to get cmake to compile Chipmunk on Windows yet.
If you get an SFML build error about follow_symlinks, you should update your version of Julia 0.4.
julia> Pkg.clone("https://github.com/zyedidia/Chipmunk.jl")
julia> Pkg.build("Chipmunk")
Author: Zyedidia
Source Code: https://github.com/zyedidia/Chipmunk.jl
License: View license
1667392749
A lens is a transparent material that concentrates or disperses light rays that pass through it by refraction. Based on shape and purpose, lenses are classified into two types: convex and concave.
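The behavior of both lens types is captured by the thin lens equation. With $d_o$ the object distance, $d_i$ the image distance, and $f$ the focal length (positive for a convex lens, negative for a concave lens):

```latex
\frac{1}{f} = \frac{1}{d_o} + \frac{1}{d_i}

% Example: an object 30 cm from a convex lens with f = 10 cm
\frac{1}{10} = \frac{1}{30} + \frac{1}{d_i}
\;\Rightarrow\; d_i = 15\ \text{cm (a real image)}
```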
https://www.pw.live/physics-articles/convex-lens
#convexlens #lens #mirror #physics #reflection #physicsarticles
1666168920
An animation library for iOS, tvOS, and macOS that uses physics-based animations (including springs) to power interactions that move and respond realistically.
let view = UIView(frame: CGRect(x: 0, y: 0, width: 100, height: 100))
// Springs animate changes to a value
let spring = Spring(initialValue: view.center)
// The `onChange` closure will be called every time the spring updates
spring.onChange = { [view] newCenter in
    view.center = newCenter
}
/// The view's center will realistically animate to the new target value.
spring.target = CGPoint(x: 300, y: 200)
There are several ways to integrate Advance into your project.
- Manually: add Advance.xcodeproj to your project, then add Advance-{iOS|macOS|tvOS}.framework as an "Embedded Binary" to your application target (under General in target settings). From there, add import Advance to your code and you're good to go.
- Carthage: add github "timdonnelly/Advance" to your Cartfile.
- CocoaPods: add pod 'Advance' to your Podfile.
- Swift Package Manager: add a dependency to your Project.swift: .package(url: "http://github.com/timdonnelly/Advance", from: "3.0.0")
Requirements
API documentation is available here.
Advance animations are applied on every frame (using CADisplayLink on iOS/tvOS, and CVDisplayLink on macOS), allowing for fine-grained control at any time.
Spring
Spring instances animate changes to a value over time, using spring physics.
let spring = Spring(initialValue: 0.0)
spring.onChange = { [view] newAlpha in
    view.alpha = newAlpha
}

// Off it goes!
spring.target = 0.5

// Spring values can be adjusted at any time.
spring.tension = 30.0  // The strength of the spring
spring.damping = 2.0   // The resistance (drag) that the spring encounters
spring.threshold = 0.1 // The maximum delta between the current value and the spring's target (for each component) for which the simulation can enter a converged state.

// Update the simulation state at any time.
spring.velocity = 6.5
spring.value = 0.2

// Sets the spring's target and the current simulation value, and removes all velocity. This causes the spring to converge at the given value.
spring.reset(to: 0.5)
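To build intuition for what tension and damping do, here is a hand-rolled spring integrator, written in JavaScript so it can run standalone (Advance itself is Swift, and its actual solver may differ). The acceleration pulls the value toward the target proportionally to tension, while damping resists velocity:

```javascript
// Hand-rolled spring integrator (semi-implicit Euler) illustrating the
// tension/damping model; this is a sketch, not Advance's actual solver.
function simulateSpring({ value, target, tension = 30.0, damping = 2.0, steps = 2000 }) {
  let x = value;
  let v = 0;
  const dt = 1 / 60; // one display-link frame
  for (let i = 0; i < steps; i++) {
    const acceleration = tension * (target - x) - damping * v;
    v += acceleration * dt; // update velocity first (semi-implicit Euler)
    x += v * dt;            // then advance the value
  }
  return x;
}
```

With these defaults the spring is underdamped, so the value overshoots and oscillates before settling at the target.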
Animator
Animator allows for more flexibility in the types of animation that can be performed, but gives up some convenience in order to do so. Specifically, animators allow any type of animation or simulation to be performed for a single value.
let view = UIView(frame: CGRect(x: 0, y: 0, width: 100, height: 100))
/// Animators coordinate animations to drive changes to a value.
let sizeAnimator = Animator(initialValue: view.bounds.size)
sizeAnimator.onChange = { [view] newSize in
    view.bounds.size = newSize
}
/// A simple timed animation
sizeAnimator.animate(to: CGSize(width: 123, height: 456), duration: 0.25, timingFunction: .easeInOut)
/// Some time in the future (before the previous timed animation was complete)...
/// Spring physics will move the view's size to the new value, maintaining the velocity from the timed animation.
sizeAnimator.simulate(using: SpringFunction(target: CGSize(width: 300, height: 300)))
/// Some time in the future (before the previous spring animation was complete)...
/// The value will keep the same velocity that it had from the preceding spring
/// animation, and a decay function will slowly bring movement to a stop.
sizeAnimator.simulate(using: DecayFunction(drag: 2.0))
Animators support two fundamentally different types of animations: timed and simulated.
Timed animations are, well, timed: they have a fixed duration, and they animate to a final value in a predictable manner.
animator.animate(to: CGSize(width: 123, height: 456), duration: 0.25, timingFunction: .easeInOut)
TimingFunction describes the pacing of a timed animation, and comes with a standard set of functions:
TimingFunction.linear // No easing
TimingFunction.easeIn
TimingFunction.easeOut
TimingFunction.easeInOut
TimingFunction.swiftOut // Similar to Material Design's default curve
Custom timing functions can be expressed as unit beziers (described here).
let customTimingFunction = TimingFunction(x1: 0.1, y1: 0.2, x2: 0.6, y2: 0.0)
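A unit bezier timing function can be evaluated by solving x(t) = u for the curve parameter t (Newton's method works well), then returning y(t). This JavaScript sketch follows that common approach (JavaScript is used here so it can run standalone; Advance's own implementation is Swift and may differ):

```javascript
// Sketch of a unit-bezier timing function: endpoints fixed at (0,0) and
// (1,1), control points (x1, y1) and (x2, y2). Solve x(t) = u with
// Newton's method, then evaluate y at that t.
function cubicBezier(x1, y1, x2, y2) {
  // Polynomial coefficients (Horner-friendly factoring)
  const cx = 3 * x1, bx = 3 * (x2 - x1) - cx, ax = 1 - cx - bx;
  const cy = 3 * y1, by = 3 * (y2 - y1) - cy, ay = 1 - cy - by;
  const sampleX = (t) => ((ax * t + bx) * t + cx) * t;
  const sampleY = (t) => ((ay * t + by) * t + cy) * t;
  const derivX = (t) => (3 * ax * t + 2 * bx) * t + cx;
  return function (u) {
    let t = u; // good initial guess for well-behaved curves
    for (let i = 0; i < 8; i++) {
      const err = sampleX(t) - u;
      const d = derivX(t);
      if (Math.abs(err) < 1e-7 || d === 0) break;
      t -= err / d; // Newton step toward x(t) = u
    }
    return sampleY(t);
  };
}
```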
Simulated animations use a simulation function to power a physics-based transition. Simulation functions are types conforming to the SimulationFunction protocol.
Simulated animations may be started using two different methods:
// Begins animating with the custom simulation function, maintaining the previous velocity of the animator.
animator.simulate(using: MyCustomFunction())
// or...
// Begins animating with the custom simulation function, imparting the specified velocity into the simulation.
animator.simulate(using: DecayFunction(), initialVelocity: dragGestureRecognizer.velocity(in: view))
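A decay function can be modeled as exponential damping of velocity with no restoring force. This illustrative JavaScript model (not Advance's actual implementation) shows how an initial velocity, such as one handed off from a drag gesture, coasts to a stop:

```javascript
// Illustrative decay model (not Advance's actual solver): velocity decays
// exponentially under drag each frame, and the value coasts accordingly.
function simulateDecay(initialVelocity, drag = 2.0, seconds = 3) {
  let x = 0;
  let v = initialVelocity;
  const dt = 1 / 60; // one display-link frame
  for (let i = 0; i < seconds * 60; i++) {
    v *= Math.exp(-drag * dt); // exponential decay of velocity
    x += v * dt;               // value keeps moving while velocity lasts
  }
  return { value: x, velocity: v };
}
```

A larger drag brings the motion to rest sooner and over a shorter distance, which is roughly v0 / drag for small time steps.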
Values conforming to the VectorConvertible protocol can be animated by Advance. Conforming types can be converted to and from a Vector implementation.
public protocol VectorConvertible: Equatable, Interpolatable {
    associatedtype VectorType: SIMD where VectorType.Scalar == Double
    init(vector: VectorType)
    var vector: VectorType { get }
}
The library adds conformance for many common types through extensions.
If you encounter any issues or surprises, please open an issue.
For suggestions or new features, please consider opening a PR with a functional implementation. Issues may be used if you aren't sure how to implement the change, but working code is typically easier to evaluate.
Author: Timdonnelly
Source Code: https://github.com/timdonnelly/Advance
License: BSD-2-Clause license