NLreg.jl: Nonlinear Regression in Julia

Nonlinear regression

This package is an experiment in using the Zygote automatic differentiation package and the lowrankupdate! function in the LinearAlgebra package to solve the linear least squares problem for a Gauss-Newton update.
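
In a Gauss-Newton step, the parameter update solves a linear least squares problem, and lowrankupdate! lets the underlying Cholesky factorization be built up one data row at a time as a sequence of rank-one updates. As a rough illustration of what such a rank-one update does (a pure-Python sketch of the classical algorithm, not the package's implementation):

```python
import math

def chol_rank1_update(L, x):
    """Given lower-triangular L with A = L L^T, update L in place so that
    afterwards L L^T = A + x x^T. The vector x is destroyed."""
    n = len(x)
    for k in range(n):
        r = math.hypot(L[k][k], x[k])
        c, s = r / L[k][k], x[k] / L[k][k]
        L[k][k] = r
        for i in range(k + 1, n):
            L[i][k] = (L[i][k] + s * x[i]) / c
            x[i] = c * x[i] - s * L[i][k]

# Start from A = I (so L = I) and add the rows of a small 3x2 Jacobian J.
L = [[1.0, 0.0], [0.0, 1.0]]
J = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
for row in J:
    chol_rank1_update(L, list(row))

# L L^T should now equal I + J^T J = [[36, 44], [44, 57]]
A = [[sum(L[i][k] * L[j][k] for k in range(2)) for j in range(2)]
     for i in range(2)]
```

Accumulating the factorization row by row like this avoids forming and refactoring the full normal-equations matrix at every iteration.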

The data are represented as a Tables.RowTable, which is a vector of NamedTuples. The model parameters are also a NamedTuple. The model function is given as a function of two arguments: the parameters and a data row.

Example - a Michaelis-Menten fit

In the Michaelis-Menten model for enzyme kinetics,

v = Vm * c / (K + c)

the relationship between the velocity, v, of a reaction and the concentration, c, of the substrate depends on two parameters: Vm, the maximum velocity, and K, the Michaelis parameter. The Vm parameter occurs linearly in this expression, whereas K is a nonlinear parameter.

julia> using CSV, DataFrames, NLreg

julia> datadir = normpath(joinpath(dirname(pathof(NLreg)), "..", "data"));

julia> PurTrt = first(groupby(CSV.read(joinpath(datadir, "Puromycin.csv")), :state))
12×3 SubDataFrame
│ Row │ conc    │ rate    │ state   │
│     │ Float64 │ Float64 │ String  │
├─────┼─────────┼─────────┼─────────┤
│ 1   │ 0.02    │ 76.0    │ treated │
│ 2   │ 0.02    │ 47.0    │ treated │
│ 3   │ 0.06    │ 97.0    │ treated │
⋮
│ 9   │ 0.56    │ 191.0   │ treated │
│ 10  │ 0.56    │ 201.0   │ treated │
│ 11  │ 1.1     │ 207.0   │ treated │
│ 12  │ 1.1     │ 200.0   │ treated │

julia> pm1 = fit(NLregModel, PurTrt, :rate, (p,d) -> p.Vm * d.conc/(p.K + d.conc),
                  (Vm = 200., K = 0.05))
Nonlinear regression model fit by maximum likelihood

Data schema (response variable is rate)
Tables.Schema:
 :conc   Float64
 :rate   Float64
 :state  String
Number of observations:                  12

Parameter estimates
───────────────────────────────────────
      Estimate   Std.Error  t-statistic
───────────────────────────────────────
Vm  212.684     6.94715        30.6145
K     0.064121  0.00828092      7.74322
───────────────────────────────────────

Sum of squared residuals at convergence: 1195.4488145417758
Achieved convergence criterion:          8.798637504793927e-6
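
The same Gauss-Newton iteration can be sketched in plain Python with an analytic Jacobian. This is an illustrative reimplementation, not the package's code; the data values are taken from the treated group of R's classic Puromycin dataset, which the table above excerpts:

```python
conc = [0.02, 0.02, 0.06, 0.06, 0.11, 0.11, 0.22, 0.22, 0.56, 0.56, 1.1, 1.1]
rate = [76.0, 47.0, 97.0, 107.0, 123.0, 139.0, 159.0, 152.0, 191.0, 201.0, 207.0, 200.0]

Vm, K = 200.0, 0.05              # starting values, as in the fit above
for _ in range(50):
    # residuals and the two Jacobian columns, df/dVm and df/dK
    r  = [y - Vm * c / (K + c) for c, y in zip(conc, rate)]
    jv = [c / (K + c) for c in conc]
    jk = [-Vm * c / (K + c) ** 2 for c in conc]
    # normal equations (J'J) delta = J'r for the two-parameter model
    a11 = sum(x * x for x in jv)
    a12 = sum(x * y for x, y in zip(jv, jk))
    a22 = sum(x * x for x in jk)
    b1  = sum(x * y for x, y in zip(jv, r))
    b2  = sum(x * y for x, y in zip(jk, r))
    det = a11 * a22 - a12 * a12
    dv, dk = (a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det
    Vm, K = Vm + dv, K + dk
    if abs(dv) + abs(dk) < 1e-10:
        break
# Vm converges to about 212.68 and K to about 0.0641, matching the fit above
```

A production implementation would add step halving to guard against divergence; for this well-conditioned example the plain iteration suffices.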

Download Details:

Author: Dmbates
Source Code: https://github.com/dmbates/NLreg.jl 

#julia #regression 


QuantileRegressions.jl: Quantile regression in Julia

Quantile Regression

Implementation of quantile regression.

  • Install using ]add QuantileRegressions
  • Main author: Patrick Kofod Mogensen

Example

The file examples/qreg_example.jl shows how to use the functions provided here. It replicates part of the analysis in:

  • Koenker, Roger and Kevin F. Hallock. "Quantile Regression". Journal of Economic Perspectives, Volume 15, Number 4, Fall 2001, Pages 143–156

We are interested in the relationship between income and expenditures on food for a sample of working-class Belgian households in 1857 (the Engel data), so we estimate a least absolute deviation (LAD) model.
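
A LAD (median regression) fit can be sketched via iteratively reweighted least squares, the approach this package's code descends from. The sketch below is an illustrative reimplementation on small synthetic data (hypothetical, not the Engel data), with two gross outliers that an ordinary least-squares fit would chase but the median fit largely ignores:

```python
def lad_fit(x, y, iters=100, eps=1e-8):
    """Median regression of y on x (model y = b0 + b1*x) via
    iteratively reweighted least squares with weights 1/|residual|."""
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        w = [1.0 / max(abs(yi - b0 - b1 * xi), eps) for xi, yi in zip(x, y)]
        # weighted normal equations for the straight-line model
        sw   = sum(w)
        swx  = sum(wi * xi for wi, xi in zip(w, x))
        swxx = sum(wi * xi * xi for wi, xi in zip(w, x))
        swy  = sum(wi * yi for wi, yi in zip(w, y))
        swxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
        det = sw * swxx - swx * swx
        b0 = (swxx * swy - swx * swxy) / det
        b1 = (sw * swxy - swx * swy) / det
    return b0, b1

x = [float(i) for i in range(10)]
y = [2.0 + 3.0 * xi for xi in x]   # points on the line y = 2 + 3x ...
y[3] += 10.0                       # ... plus one outlier above
y[7] -= 10.0                       # ... and one below
b0, b1 = lad_fit(x, y)             # recovers roughly b0 = 2, b1 = 3
```

For quantiles other than the median, the weights become asymmetric (scaled by tau on positive residuals and 1 - tau on negative ones), which is what qreg generalizes.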

julia> using QuantileRegressions

julia> # Load data
       url = "http://vincentarelbundock.github.io/Rdatasets/csv/quantreg/engel.csv"
"http://vincentarelbundock.github.io/Rdatasets/csv/quantreg/engel.csv"

julia> df = readtable(download(url))
235×3 DataFrames.DataFrame
│ Row │ x   │ income  │ foodexp │
├─────┼─────┼─────────┼─────────┤
│ 1   │ 1   │ 420.158 │ 255.839 │
│ 2   │ 2   │ 541.412 │ 310.959 │
│ 3   │ 3   │ 901.157 │ 485.68  │
│ 4   │ 4   │ 639.08  │ 402.997 │
│ 5   │ 5   │ 750.876 │ 495.561 │
│ 6   │ 6   │ 945.799 │ 633.798 │
│ 7   │ 7   │ 829.398 │ 630.757 │
│ 8   │ 8   │ 979.165 │ 700.441 │
⋮
│ 227 │ 227 │ 776.596 │ 485.52  │
│ 228 │ 228 │ 1230.92 │ 772.761 │
│ 229 │ 229 │ 1807.95 │ 993.963 │
│ 230 │ 230 │ 415.441 │ 305.439 │
│ 231 │ 231 │ 440.517 │ 306.519 │
│ 232 │ 232 │ 541.201 │ 299.199 │
│ 233 │ 233 │ 581.36  │ 468.001 │
│ 234 │ 234 │ 743.077 │ 522.602 │
│ 235 │ 235 │ 1057.68 │ 750.32  │

julia> # Fit least absolute deviation model (quantile = .5)
       ResultQR = qreg(@formula(foodexp~income), df, .5)
StatsModels.TableRegressionModel{QuantileRegressions.QRegModel,Array{Float64,2}}

foodexp ~ 1 + income

Coefficients:
             Quantile Estimate Std.Error t value
(Intercept)       0.5  81.4822   14.6345 5.56783
income            0.5 0.560181 0.0131756 42.5164

The results look pretty close to Stata 12's qreg:

. insheet using engel.csv
. qreg foodexp income, vce(iid, kernel(epan2))
Median regression                                    Number of obs =       235
  Raw sum of deviations 46278.06 (about 582.54126)
  Min sum of deviations 17559.93                     Pseudo R2     =    0.6206

------------------------------------------------------------------------------
     foodexp |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
      income |   .5601805   .0131763    42.51   0.000     .5342206    .5861403
       _cons |   81.48233   14.63518     5.57   0.000     52.64815    110.3165
------------------------------------------------------------------------------

We can also compute and plot (using Julia's Winston plotting package) results for various quantiles. Full code to produce the figure is in the examples folder.

History

This package was originally created as a port of the reweighted least squares code by Vincent Arel-Bundock from the Python project statsmodels. All contributions can be seen via the contributors page.

Download Details:

Author: Pkofod
Source Code: https://github.com/pkofod/QuantileRegressions.jl 
License: View license

#julia #regression 

Isotonic.jl: Isotonic Regression in Julia

Isotonic

Isotonic Regression in Julia

This package implements several algorithms for isotonic regression in Julia.

Algorithms

  • Linear PAVA (fastest)
  • Pooled PAVA (slower)
  • Active Set (slowest)

Demonstration

See the IJulia notebook for a demonstration of usage (and some performance numbers).

julia> isotonic_regression([1.0, 2.0, 3.0, 4.0])
4-element Array{Float64,1}:
 1.0
 2.0
 3.0
 4.0
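
The linear PAVA idea, pooling adjacent violators into weighted blocks until the sequence is non-decreasing, can be sketched in Python (an illustrative reimplementation, not this package's code):

```python
def isotonic_regression(y, w=None):
    """Linear-time pool adjacent violators algorithm (PAVA).
    Returns the non-decreasing sequence closest to y in weighted
    least squares, with optional per-observation weights w."""
    if w is None:
        w = [1.0] * len(y)
    blocks = []  # each block is [weighted mean, total weight, count]
    for yi, wi in zip(y, w):
        blocks.append([yi, wi, 1])
        # merge backwards while the ordering constraint is violated
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, n2 = blocks.pop()
            m1, w1, n1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / wt, wt, n1 + n2])
    out = []
    for m, _, n in blocks:
        out.extend([m] * n)
    return out
```

An already-sorted input such as [1.0, 2.0, 3.0, 4.0] passes through unchanged, as in the demonstration above, while a decreasing input like [4.0, 3.0, 2.0, 1.0] is pooled into a single block at the overall mean.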

Download Details:

Author: Ajtulloch
Source Code: https://github.com/ajtulloch/Isotonic.jl 
License: View license

#julia #regression 

5 Regression algorithms: Explanation & Implementation in Python

Take your understanding of machine learning algorithms to the next level with this article. What is regression analysis, in simple terms? How is it applied in practice to real-world problems? And what Python code snippets can you use to implement regression algorithms for various objectives? Let's set aside the dry learning material and talk about the science and the way it works.

#linear-regression-python #linear-regression #multivariate-regression #regression #python-programming

Angela Dickens

Regression: Linear Regression

Machine learning algorithms are not the regular algorithms we may be used to, because they are often described by a combination of complex statistics and mathematics. Since it is very important to understand the background of any algorithm you want to implement, this can pose a challenge to people from a non-mathematical background: the maths can sap your motivation by slowing you down.

In this article we will discuss linear and logistic regression and some regression techniques, assuming we have all heard of, or even learnt about, the linear model in high-school mathematics. Hopefully, by the end of the article, the concepts will be clearer.

Regression analysis is a statistical process for estimating the relationships between a dependent variable (say Y) and one or more independent variables, or predictors (X). It explains changes in the dependent variable with respect to changes in selected predictors. Major uses of regression analysis include determining the strength of predictors, forecasting an effect, and trend forecasting. It finds the significant relationships between variables and the impact of predictors on the dependent variable. In regression, we fit a curve or line (the regression, or best-fit, line) to the data points such that the distances of the data points from the curve or line are minimized.
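
For the simplest case, a straight line fit by least squares, the best-fit coefficients have a closed form: the slope is the covariance of X and Y divided by the variance of X, and the intercept makes the line pass through the point of means. A quick hypothetical sketch:

```python
def ols_line(x, y):
    """Least-squares fit of y = b0 + b1*x via the closed-form solution."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # slope = cov(x, y) / var(x)
    b1 = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
          / sum((xi - mx) ** 2 for xi in x))
    # intercept makes the line pass through (mean x, mean y)
    b0 = my - b1 * mx
    return b0, b1

# points lying exactly on y = 1 + 2x are recovered exactly
b0, b1 = ols_line([1.0, 2.0, 3.0, 4.0], [3.0, 5.0, 7.0, 9.0])
```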

#regression #machine-learning #beginner #logistic-regression #linear-regression #deep learning