PNLA_Julia: Polynomial Numerical Linear Algebra Package for Julia

Polynomial Numerical Linear Algebra package for Julia

A multi-functional package for solving a wide range of problems involving multivariate polynomials, in double-precision arithmetic.

Author: Kim Batselier

## 1. Basics

Only one monomial ordering is supported: the degree negative lex ordering. This graded ordering is defined for two n-variate monomials x^a and x^b as

x^a > x^b

if

|a| = \sum_{i=1}^n a_i > |b| = \sum_{i=1}^n b_i, or

|a| = |b| and the leftmost nonzero entry of a - b is negative.

In this definition x^a stands for

x^a = x_1^{a_1} * x_2^{a_2} * ... * x_n^{a_n},

where a is an n-tuple of nonnegative integers.
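
For example, with n = 3 this definition orders the monomials of degree at most two, from smallest to largest, as

1 < x_1 < x_2 < x_3 < x_1^2 < x_1 x_2 < x_1 x_3 < x_2^2 < x_2 x_3 < x_3^2,

since, for instance, x_2 > x_1 because the exponent difference (0,1,0) - (1,0,0) = (-1,1,0) has a negative leftmost nonzero entry.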

A system of s multivariate polynomials is represented by an s-by-2 Any array. Each element of the first column is a vector containing the coefficients of one polynomial; the corresponding element of the second column is a matrix whose rows contain the exponents of the monomials of that polynomial. For example, the system

f_1 = 5.3 x_1^2 + 9 x_2 x_3 -1

f_2 = 2 x_1^3 + .5 x_2^2 - 7.89 x_3 - 94

f_3 = x_1 - 2.13

is represented by

polysys[1,1]=[5.3,9,-1]

polysys[1,2]=[2 0 0;0 1 1;0 0 0]

polysys[2,1]=[2,.5,-7.89,-94]

polysys[2,2]=[3 0 0;0 2 0;0 0 1;0 0 0]

polysys[3,1]=[1,-2.13]

polysys[3,2]=[1 0 0;0 0 0]
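
As a minimal sketch, this representation can be built directly in the REPL (assuming a recent Julia version, where an uninitialized Any array is created with Array{Any}(undef, s, 2)):

julia> polysys = Array{Any}(undef, 3, 2);   # column 1: coefficient vectors, column 2: exponent matrices

julia> polysys[1,1] = [5.3, 9, -1];         polysys[1,2] = [2 0 0; 0 1 1; 0 0 0];

julia> polysys[2,1] = [2, 0.5, -7.89, -94]; polysys[2,2] = [3 0 0; 0 2 0; 0 0 1; 0 0 0];

julia> polysys[3,1] = [1, -2.13];           polysys[3,2] = [1 0 0; 0 0 0];

Each row of an exponent matrix holds the exponents of one term, listed in the same order as the corresponding coefficients.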

## 2. Available functions

index = feti(exponent)

Converts an exponent to its corresponding linear index with respect to the degree negative lex ordering. If the exponent argument is a matrix of exponents, then feti() is applied to each row of the matrix.

exponent = fite(index,n)

Converts a linear index with respect to the degree negative lex ordering to the corresponding n-variate exponent.
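
Since fite performs the inverse conversion of feti, mapping an exponent to its linear index and back should recover the exponent. A small sketch (the concrete index values and return types depend on the implementation and are not assumed here):

julia> i = feti([2 0 0]);   # linear index of x_1^2 in 3 variables

julia> fite(i, 3)           # converting back should recover the exponent [2 0 0]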

d0 = getD0(polysys)

Returns the maximal total degree of the given polynomial system.

dorig = getDorig(polysys)

dorig[i] (i = 1:s) contains the maximal total degree of the multivariate polynomial represented by polysys[i,1] and polysys[i,2].
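
For the example system above, the maximal total degrees of f_1, f_2 and f_3 are 2, 3 and 1, so one would expect (a sketch of the intended behaviour, not verified output):

julia> getDorig(polysys)   # expected to contain the degrees 2, 3 and 1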

(q,p) = getMDim(polysys,d)

Returns the number of rows q and the number of columns p of the Macaulay matrix of degree d.

M = getM(polysys,d,d0...)

Returns the Macaulay matrix of the given polynomial system polysys at degree d. The Macaulay matrix M(d) contains, as its rows, the coefficient vectors of each polynomial and of its monomial multiples of degree at most d, with the columns indexed by all monomials of degree at most d in the degree negative lex ordering. When the optional third argument d0 is also given, only the columns required to enlarge M(d0) to M(d) are returned.
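
A usage sketch combining the functions above (it assumes that getMDim returns the (rows, columns) tuple, so that it can be compared directly with size(M)):

julia> d0 = getD0(polysys);        # maximal total degree; 3 for the example system above

julia> M = getM(polysys, d0);      # Macaulay matrix of the system at degree d0

julia> size(M) == getMDim(polysys, d0)   # q rows and p columns, as reported by getMDim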

Download Details:

Author: kbatseli
Source Code: https://github.com/kbatseli/PNLA_Julia 
License: MIT license

#julia #numeral 

Bailee Streich

Course Review: Python for Linear Algebra

Because I am continuously endeavouring to improve my knowledge and skill in the Python programming language, I decided to take some free courses to build on my knowledge base. I found one such course on linear algebra on YouTube, and decided to watch the video and undertake the coursework because it focused on Python, something I wanted to improve my skill in. The YouTube video this course review was taken from is "Python for linear algebra (for absolute beginners)".

The course is for absolute beginners, which is good because I have never studied linear algebra and had no idea what the terms I would be working with were.

Linear algebra is the branch of mathematics concerning linear equations, such as linear maps and their representations in vector spaces and through matrices. Linear algebra is central to almost all areas of mathematics.

Whilst studying linear algebra, I learned about a few topics that were new to me. For example:

A scalar is simply a number, such as an integer or a float. Scalars are convenient in applications that don’t need to be concerned with all the ways that data can be represented in a computer.

A vector is a one-dimensional array of numbers. Unlike a fixed-size array, a vector is mutable, which is why vectors are also known as dynamic arrays.

A matrix is a two-dimensional rectangular array of data stored in rows and columns. The data stored in a matrix can be strings, numbers, et cetera.

In addition to the basic components of linear algebra, namely the scalar, the vector and the matrix, there are several ways in which vectors and matrices can be manipulated to make them suitable for machine learning.

I used Google Colab to code the programming examples and the assignments that were given in the 1 hour 51 minute video. It took a while to get into writing the code for the various subjects that were covered because, as the video stated, it is a course for absolute beginners.

The two main libraries used for this course were NumPy and Matplotlib. NumPy is used to carry out the algebraic operations, and Matplotlib is used to graphically plot the points that are created in the program.

#numpy #matplotlib #python #linear-algebra #course review: python for linear algebra #linear algebra

GenericLinearAlgebra.jl: Generic Numerical Linear Algebra in Julia

GenericLinearAlgebra.jl

A fresh approach to numerical linear algebra in Julia

The purpose of this package is partly to extend linear algebra functionality in base to cover generic element types, e.g. BigFloat and Quaternion, and partly to be a place to experiment with fast linear algebra routines written in Julia (except for optimized BLAS). It is my hope that it is possible to have implementations that are generic, fast, and readable.

So far, this has mainly been my playground but you might find some of the functionality here useful. The package has a generic implementation of a singular value solver which will make it possible to compute norm and cond of matrices of BigFloat. Hence

julia> using GenericLinearAlgebra

julia> A = big.(randn(10,10));

julia> cond(A)
1.266829904721752610946505846921202851190952179974780602509001252204638657237828e+03

julia> norm(A)
6.370285271475041598951769618847832429030388948627697440637424244721679386430589
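
Since norm and cond are built on the generic singular value solver, the singular values themselves should also be computable for BigFloat matrices. A sketch, under that assumption:

julia> svdvals(big.(randn(4,4)))   # generic SVD provided by GenericLinearAlgebra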

The package also includes functions for the blocked Cholesky and QR factorizations, as well as for the self-adjoint (symmetric) and general eigenvalue problems. These routines can be accessed by fully qualifying their names:

julia> using GenericLinearAlgebra

julia> A = randn(1000,1000); A = A'A;

julia> cholesky(A);

julia> @time cholesky(A);
  0.013036 seconds (16 allocations: 7.630 MB)

julia> GenericLinearAlgebra.cholRecursive!(copy(A), Val{:L});

julia> @time GenericLinearAlgebra.cholRecursive!(copy(A), Val{:L});
  0.012098 seconds (7.00 k allocations: 7.934 MB)

Download Details:

Author: JuliaLinearAlgebra
Source Code: https://github.com/JuliaLinearAlgebra/GenericLinearAlgebra.jl 
License: MIT license

#julia #algebra 

MKL.jl: Intel MKL Linear Algebra Backend for Julia

MKL.jl

Using Julia with Intel's MKL

MKL.jl is a Julia package that allows users to use the Intel MKL library for Julia's underlying BLAS and LAPACK, instead of OpenBLAS, which Julia ships with by default. Julia includes libblastrampoline, which enables picking a BLAS and LAPACK library at runtime. A JuliaCon 2021 talk provides details on this mechanism.

This package requires Julia 1.7+

Usage

If you want to use MKL.jl in your project, make sure it is the first package you load, before any other package. It is essential that MKL be loaded before other packages so that it can find the Intel OMP library and avoid issues resulting from GNU OMP being loaded first.

To Install:

Adding the package will replace the default BLAS and LAPACK with the MKL-provided ones at runtime. Note that the MKL package has to be loaded in every new Julia process. Upon quitting and restarting, Julia will start with the default OpenBLAS.

julia> using Pkg; Pkg.add("MKL")

To Check Installation:

julia> using LinearAlgebra

julia> BLAS.get_config()
LinearAlgebra.BLAS.LBTConfig
Libraries: 
└ [ILP64] libopenblas64_.0.3.13.dylib

julia> using MKL

julia> BLAS.get_config()
LinearAlgebra.BLAS.LBTConfig
Libraries: 
└ [ILP64] libmkl_rt.1.dylib

Using the 64-bit vs 32-bit version of MKL

We use ILP64 by default on 64-bit systems, and LP64 on 32-bit systems.

Download Details:

Author: JuliaLinearAlgebra
Source Code: https://github.com/JuliaLinearAlgebra/MKL.jl 
License: View license

#julia #algebra #backend 

Linear Algebra: The hidden engine of machine learning

The word "algebra" comes from a book written by al-Khwarizmi (780-850 CE) about calculation and equations. Algebra is a branch of mathematics in which letters are used instead of numbers. Each letter can represent a specific number in one place, and a completely different number in another. Notations and symbols are also used in algebra to show the relationships between numbers. I remember, about 17 years ago when I was an ordinary student in applied mathematics (an ordinary graduate today!), being curious about some research in algebra done by Maryam Mirzakhani (1977-2017) at Harvard University on an analogous counting problem. This science has evolved a lot throughout history and now includes many branches.

Elementary algebra deals with the four basic arithmetic operations. After defining the signs by which fixed numbers and variables are distinguished, methods are used to solve equations. A polynomial is an expression that is the sum of a finite number of non-zero terms, each term consisting of the product of a constant and a finite number of variables raised to whole-number powers.

Abstract algebra, or modern algebra, is a member of the algebra family that studies advanced algebraic structures such as groups, rings, and fields. Algebraic structures, with their associated homomorphisms, form mathematical categories. Category theory is a formalism that allows a unified way of expressing properties and constructions that are similar across various structures. Abstract algebra is popular and is used in many fields of mathematics and the engineering sciences. For instance, algebraic topology uses algebraic objects to study topologies. The Poincaré conjecture, proved in 2003, asserts that the fundamental group of a manifold, which encodes information about connectedness, can be used to determine whether a manifold is a sphere or not. Algebraic number theory studies various number rings that generalize the set of integers.

I believe that the most influential branch of algebra in other sciences is linear algebra. Let’s suppose that you went out for jogging, which can’t be easy these days with the Covid-19 lockdown, and suddenly a beautiful flower catches all your attention. Please don’t rush to pick it, just take a picture, so others can enjoy it as well. After a while, when you look at this picture, you can recognize the flower in the image, because the human brain has evolved over millions of years and is able to detect such a thing. We are unaware of the operations that take place in the background of our brains and enable us to recognize the colors in the image; they are trained to do this for us automatically. But it’s not easy to do such a thing with a machine, which is why this is one of the most active research areas in machine learning and deep learning. Actually, the fundamental question is: “How does a machine store this image?” You probably know that today’s computers are designed to process only two numbers, 0 and 1. Now, how can an image like this, with different features, be stored? This is done by storing the pixel intensities in a structure called a “matrix”.

The main topics in linear algebra are vectors and matrices. Vectors are geometric objects that have length and direction. For example, we can mention velocity and force, both of which are vector quantities. Each vector is represented by an arrow whose length and direction indicate the magnitude and direction of the vector.

The addition of two or more vectors can be carried out, depending on what is most convenient, with the parallelogram method or with the component method, in which each vector is decomposed into its components along the coordinate axes. A vector space is a collection of vectors that may be added together and multiplied by scalars. The scalars can generally be taken from any field but are normally real numbers.

#matrice #machine-learning #linear-algebra #algebra #deep-learning #deep learning