Package Polyline Implements A Google Maps Encoding Polyline Encoder

Go-polyline

Package polyline implements a Google Maps Encoding Polyline encoder and decoder.

Encoding example

func ExampleEncodeCoords() {
    coords := [][]float64{
        {38.5, -120.2},
        {40.7, -120.95},
        {43.252, -126.453},
    }
    fmt.Println(string(polyline.EncodeCoords(coords)))
    // Output: _p~iF~ps|U_ulLnnqC_mqNvxq`@
}

Decoding example

func ExampleDecodeCoords() {
    buf := []byte("_p~iF~ps|U_ulLnnqC_mqNvxq`@")
    coords, _, _ := polyline.DecodeCoords(buf)
    fmt.Println(coords)
    // Output: [[38.5 -120.2] [40.7 -120.95] [43.252 -126.453]]
}

Download Details:

Author: twpayne
Source Code: https://github.com/twpayne/go-polyline 
License: BSD-2-Clause license

#go #golang #encode #googlemaps #geospatial 

Package Geom Implements Efficient Geometry Types for Geospatial Apps

Go-geom

Package geom implements efficient geometry types for geospatial applications.

Key features

  • OpenGeo Consortium-style geometries.
  • Support for 2D and 3D geometries, measures (time and/or distance), and unlimited extra dimensions.
  • Encoding and decoding of common geometry formats (GeoJSON, KML, WKB, and others) including sql.Scanner and driver.Value interface implementations for easy database integration.
  • 2D and 3D topology functions.
  • Efficient, cache-friendly internal representation.
  • Optional protection against malicious or malformed inputs.

Examples

Detailed features

Geometry types

Encoding and decoding

Geometry functions

  • XY 2D geometry functions
  • XYZ 3D geometry functions

Protection against malicious or malformed inputs

The WKB and EWKB formats encode geometry sizes, and memory is allocated for those geometries. If the input is malicious or malformed, the memory allocation can be very large, leading to a memory starvation denial-of-service attack against the server. For example, a client might send a MultiPoint whose header indicates that it contains 2^32-1 points. Reading that geometry would cause the server to allocate 2 × sizeof(float64) × (2^32-1) = 64GB of memory to store those points. By default, protection against malicious or malformed input is disabled, but it can be enabled by setting positive values for wkbcommon.MaxGeometryElements.

Related libraries


Download Details:

Author: twpayne
Source Code: https://github.com/twpayne/go-geom 
License: BSD-2-Clause license

#go #golang #encoded #geojson #geospatial 

Raster-vision: An Open Source Library and Framework for Deep Learning

Raster vision

An open source library and framework for deep learning on satellite and aerial imagery.

Raster Vision is an open source Python library and framework for building computer vision models on satellite, aerial, and other large imagery sets (including oblique drone imagery).

It has built-in support for chip classification, object detection, and semantic segmentation with backends using PyTorch.

Examples of chip classification, object detection and semantic segmentation

As a library, Raster Vision provides a full suite of utilities for dealing with all aspects of a geospatial deep learning workflow: reading geo-referenced data, training models, making predictions, and writing out predictions in geo-referenced formats.

As a low-code framework, Raster Vision allows users (who don't need to be experts in deep learning!) to quickly and repeatably configure experiments that execute a machine learning pipeline including: analyzing training data, creating training chips, training models, creating predictions, evaluating models, and bundling the model files and configuration for easy deployment.

Overview of Raster Vision workflow

Raster Vision also has built-in support for running experiments in the cloud using AWS Batch.

See the documentation for more details.

Installation

For more details, see the Setup documentation.

Install via pip

You can install Raster Vision directly via pip.

pip install rastervision

Use Pre-built Docker Image

Alternatively, you may use a Docker image. Docker images are published to quay.io (see the tags tab).

We publish a new tag per merge into master, tagged with the first 7 characters of the commit hash. To use the latest version, pull the tag with the latest suffix, e.g. raster-vision:pytorch-latest. Git tags are also published, with the GitHub tag name as the Docker tag suffix.

Build Docker Image

You can also build a Docker image from scratch yourself. After cloning this repo, run docker/build, and then run the container using docker/run.

Usage Examples and Tutorials

Non-developers may find it easiest to use Raster Vision as a low-code framework where Raster Vision handles all the complexities and the user only has to configure a few parameters. The Quickstart guide is a good entry-point into this. More advanced examples can be found on the Examples page.

For developers and those looking to dive deeper or combine Raster Vision with their own code, the best starting point is Usage Overview, followed by Basic Concepts and Tutorials.

Contact and Support

You can ask questions and talk to developers (let us know what you're working on!) at:

Contributing

For more information, see the Contribution page.

We are happy to take contributions! It is best to get in touch with the maintainers about larger features or design changes before starting the work, as it will make the process of accepting changes smoother.

Everyone who contributes code to Raster Vision will be asked to sign the Azavea CLA, which is based off of the Apache CLA.

Download a copy of the Raster Vision Individual Contributor License Agreement or the Raster Vision Corporate Contributor License Agreement

Print out the CLAs and sign them, or use PDF software that allows placement of a signature image.

Send the CLAs to Azavea by one of:

  • Scanning and emailing the document to cla@azavea.com
  • Faxing a copy to +1-215-925-2600.
  • Mailing a hardcopy to: Azavea, 990 Spring Garden Street, 5th Floor, Philadelphia, PA 19107 USA

Download Details:

Author: Azavea
Source Code: https://github.com/azavea/raster-vision 
License: View license

#machinelearning #python #computervision #deeplearning #geospatial #pytorch 

Tutorial Geospatial Analytics using Presto and Hive

Introduction to Geospatial Analytics

Geospatial analytics relates to data used for locating anything on the globe; from an Uber driver to someone finding their way around a new neighbourhood, everybody uses this data in some way or other. The underlying technology involves GPS (global positioning systems), GIS (geographical information systems), and RS (remote sensing). In this blog we will explore the topic in depth, starting with the basics and then diving into the details.

Why is it important?

Geospatial data is needed for many things and is used daily for various reasons, from the commuting needs of an ordinary person to the missile guidance data of a nation's defence organization. It is extracted from various resources: every phone with an active internet connection contributes to geospatial data in some way, and satellites collect data daily. Because it is of such great use in everyday life, it deserves significant attention. It can be used to help respond to natural hazards and to understand disasters, global climate change, wildlife, natural resources, etc. It is also used for satellite imagery, whether for tactical purposes or for weather forecasting. Many tech giants, such as Uber, use it on a daily basis to ease everyday life. To stand out in the market, a company has to be efficient at extracting and using this data.

How to retrieve Geospatial Data?

Various methods can do this, but mainly Presto and Hive are used to extract and reshape the data, which runs to hundreds of petabytes, and to use it efficiently to make the lives of billions easier. This data is vital, as it touches the vast majority of people and is used every second. GIS helps with the collection, storage, manipulation, analysis, and presentation of spatial data. Whatever the situation at the local, regional or national level, whenever "where" is asked, geospatial data comes into play. And it would not be effective without visualization.

Geospatial Analytics Using Presto

Presto is an open-source distributed SQL query engine, used to answer questions of any size or type. It runs on Hadoop, and it supports many non-relational sources as well as Teradata. It can query data in its respective location, without moving the actual data to a separate system. Query execution runs in parallel over a pure memory-based architecture, with most results returning within seconds. Many tech giants use it; it is a popular choice for interactive queries over data ranging into hundreds of petabytes.

Geospatial Analytics Using Hive

Hive is a data warehouse infrastructure tool for processing structured data, developed on top of the Hadoop distributed file system. It resides on top of Hadoop to summarize Big Data, and it makes querying and analyzing any kind of data accessible.

What is the architecture of Hive?

It is an ETL and data warehousing tool built on top of Hadoop. It helps to perform many operations, such as:

  • Analysis of large data sets
  • Data encapsulation
  • Ad-hoc queries

What are its major components?

  1. Client
  2. Services
  3. Processing & Resource Management
  4. Distributed Storage

Hive Clients

It supports applications written in languages like Java, Python, and C++, using Thrift, JDBC, and ODBC drivers, so it is easy to write a Hive client application in any of these languages. Its clients fall into three types:

  • Thrift Clients: Apache Hive's server is based on Thrift, so it can serve requests from any language that supports Thrift
  • JDBC Clients: It allows Java apps to connect to it by using its JDBC driver
  • ODBC Clients: The ODBC driver enables applications that support the ODBC protocol to connect to it. It uses Thrift to communicate with the Hive server.

Hive Services

It provides various services:

  1. CLI (Command Line Interface) – The default shell provided by Hive, which can execute queries and commands directly.
  2. Web Interface – A web-based GUI provided by Hive for executing queries and commands.
  3. Server – Built on Apache Thrift and also known as the Thrift Server. It allows different clients to submit requests and retrieve the final result.
  4. Driver – Responsible for receiving the queries submitted by clients. It compiles, optimizes and executes the queries.

What is the architecture of Presto?

There are two central parts to it: the Coordinator and the Worker. It is an open-source distributed system that can run on multiple machines. Its distributed SQL query engine was built for fast analytic queries. A deployment includes one Coordinator and any number of Workers.

  • Coordinator – Used to submit queries; it manages parsing, planning, and scheduling of query processing.
  • Worker – Processes the queries; adding more workers gives faster query processing.

What are its key components?

The key components of presto are:

Coordinator

It is the brain of any installation; it manages all the worker nodes for all the work related to queries. It gets results from workers and returns the final output to the client. It connects with workers and clients via REST.

Worker

It helps to execute tasks and to process the data. Worker nodes share data amongst each other and get data from the Coordinator.

Catalogue

It contains information related to the data, such as where the data is located, where the schema is located, and the data source.

Tables and Schemas

These are similar to what they mean in a relational database: a table is a set of rows organized into named columns, and a schema is what you use to hold your tables.

Connector

It is used to help Presto integrate with external data sources.

Stage

To execute a query, Presto breaks it up into a hierarchy of stages.

Tasks

Stages are implemented as a series of tasks that may be distributed over the Workers.

Drivers and Operators

Tasks contain one or more parallel drivers, which operate on data in memory. An operator consumes, transforms and produces data.

What are the deployment strategies?

The deployment strategies for Hive are listed below:

AWS

Amazon EMR is used to deploy Hive's metastore. Users can opt for one of the three configurations that Amazon has to offer, namely Embedded, Local or Remote. There are two options for creating an external Hive metastore for EMR:

  1. By using AWS Glue data catalogue
  2. Use Amazon RDS / Amazon Aurora

Cloud Dataproc

Apache Hive on Cloud Dataproc provides an efficient and flexible way of working by storing Hive data in Cloud Storage and hosting the Hive metastore in a MySQL database on Cloud SQL. It offers advantages like flexibility and agility, letting users tailor the cluster configuration to specific workloads and scale the cluster according to need. It also helps in saving cost.

The deployment strategies for Presto

AWS

Amazon EMR allows you to quickly spin up a managed EMR cluster with the Presto query engine and run interactive analysis on data stored in Amazon S3. A Presto implementation can be built in the cloud on Amazon Web Services: both Amazon EMR and Amazon Athena provide ways to build and implement it.

Cloud Dataproc

A Cloud Dataproc cluster that includes the Presto component can be provisioned easily.

What are the various ways to optimise?

The various ways to optimise are described below:

Hive

  1. Tez Execution Engine – An application framework built on Hadoop YARN.
  2. Usage of a Suitable File Format – Using an appropriate file format for the data will drastically increase query performance; the ORC file format is best suited for this.
  3. Partitioning – By partitioning the entries into different datasets, only the required data is read during query execution, making performance more efficient and optimized.
  4. Bucketing – Bucketing helps divide datasets into more manageable parts; users can also set the size of these pieces, or buckets.
  5. Vectorization – Vectorized query execution gives more optimized performance by performing aggregation over batches of 1024 rows at once instead of a single row at a time.
  6. Cost-Based Optimization (CBO) – Performs optimization based on query cost. To use CBO, parameters must be set at the beginning of the query.
  7. Indexing – Indexing helps increase optimization: it speeds up query execution by reducing the time taken.

Presto

  1. File format – The ORC file format is best suited for optimizing query execution in Presto.
  2. Joins can be optimized automatically if the feature is enabled.
  3. The dynamic filtering feature optimizes the use of JOIN queries.
  4. A connector configuration has been added to skip corrupt records in input formats other than ORC, Parquet and RCFile.
  5. task.max-worker-threads in config.properties sets the number of worker threads per node; the default is the number of CPU cores times the hyper-threads per core on the worker node.
  6. Splits can be used for efficient and optimized execution of queries in Presto.

What are the advantages?

The advantages of Hive and Presto are:

Hive

  1. It is a stable query engine with a large and active community
  2. Its queries are similar to SQL, so they are easy to understand for RDBMS professionals
  3. It supports the ORC, TextFile, RCFile, Avro and Parquet file formats

Presto 

  1. It supports file formats like ORC, Parquet and RCFile formats, eliminating the need for data transformation.
  2. It works well with Amazon S3 queries and storage; it can query data in mere seconds even when the data is petabytes in size.
  3. It also has an active community.

Geospatial Analytics Using Presto and Hive

Modelling geospatial data has quite a few complexities. Well-Known Text (WKT) is used to model different locations on the map, with types like point and polygon shapes used for these purposes. The Spatial Library is used for spatial processing in Hive, with User-Defined Functions and SerDes. By enabling this library in Hive, queries may be written using Hive Query Language (HQL), which is quite close to SQL; you can therefore avoid complex MapReduce algorithms and stick to a more familiar workflow. The Hive plugin is running in production at Uber, where more than 90% of geospatial queries complete within 5 minutes. Compared with brute-force MapReduce execution, Uber's geospatial plugin is more than 50X faster, leading to greater efficiency.

Summing up

Presto has the edge over Hive, as it can also be used to process unstructured data, and query processing in Presto is faster than in Hive. Data is collected in humongous amounts daily, and it needs to be extracted efficiently and judiciously to build better software on top of it.

Original article source at: https://www.xenonstack.com/

#analytics #presto #hive #geospatial 


Terra: R Package for Spatial Data Handling

Terra

terra is an R package for spatial data analysis. There are tutorials at rspatial.org/terra.

Stack Overflow is the best place to ask questions if you get stuck. Make sure to include a simple reproducible example. But if you think you have found a bug, please file an issue.

terra replaces the raster package. The interfaces of terra and raster are similar, but terra is simpler, faster and can do more.

Installation

terra is available from CRAN, so you can use install.packages("terra") to get the current released version.

The easiest way to use the development version on Windows or macOS is to install it from the R-universe, like this:

install.packages('terra', repos='https://rspatial.r-universe.dev')

From source-code

To install from source-code, first install the Rcpp package that terra depends on:

install.packages("Rcpp")

And then continue based on the OS you are using.

Windows

On Windows, you need to first install Rtools to get a C++ compiler that R can use. You need a recent version of Rtools42 (rtools42-5355-5357).

Then, in R, install the package.

Sys.setenv("R_REMOTES_NO_ERRORS_FROM_WARNINGS" = "true")
remotes::install_github("rspatial/terra")

macOS

On macOS, first install gdal and proj with homebrew

brew install pkg-config
brew install gdal

Followed by (note the additional configuration argument needed for the current homebrew version of proj, 9.1.0):

remotes::install_github("rspatial/terra", configure.args = "--with-proj-lib=/opt/homebrew/Cellar/proj/9.1.0/lib/")

To install the CRAN version from source you would do

install.packages("terra", configure.args = "--with-proj-lib=/opt/homebrew/Cellar/proj/9.1.0/lib/")

Linux

The easy way to install terra on Linux is with r2u.

The harder way: C++11, GDAL (>= 2.2.3), GEOS (>= 3.4.0), PROJ (>= 4.9.3) and sqlite3 are required, but more recent versions are highly recommended.

To install these system requirements on Ubuntu you can do:

sudo add-apt-repository ppa:ubuntugis/ubuntugis-unstable
sudo apt-get update
sudo apt-get install libgdal-dev libgeos-dev libproj-dev 

And now, in R, install the package

remotes::install_github("rspatial/terra")

See the sf instructions for installation on other Linux systems --- and for possible updates/improvements on the above instructions.

Download Details:

Author: rspatial
Source Code: https://github.com/rspatial/terra 
License: GPL-3.0 license

#r #vector #geospatial #raster 

Shapefile.jl: Parsing .shp Files in Julia

Shapefile 

This library supports reading ESRI Shapefiles in pure Julia.

Quick Start

Basic example of reading a shapefile from test cases:

using Shapefile

path = joinpath(dirname(pathof(Shapefile)),"..","test","shapelib_testcases","test.shp")
table = Shapefile.Table(path)

# if you only want the geometries and not the metadata in the DBF file
geoms = Shapefile.shapes(table)

# whole columns can be retrieved by their name
table.Descriptio  # => Union{String, Missing}["Square with triangle missing", "Smaller triangle", missing]

# example function that iterates over the rows and gathers shapes that meet specific criteria
function selectshapes(table)
    geoms = empty(Shapefile.shapes(table))
    for row in table
        if !ismissing(row.TestDouble) && row.TestDouble < 2000.0
            push!(geoms, Shapefile.shape(row))
        end
    end
    return geoms
end

# the metadata can be converted to other Tables such as DataFrame
using DataFrames
df = DataFrame(table)

Shapefiles can contain multiple parts for each shape entity. Use GeoInterface.coordinates to fully decompose the shape data into parts.

# Example of converting the 1st shape of the file into parts (array of coordinates)
julia> GeoInterface.coordinates(Shapefile.shape(first(table)))
2-element Vector{Vector{Vector{Vector{Float64}}}}:
 [[[20.0, 20.0], [20.0, 30.0], [30.0, 30.0], [20.0, 20.0]]]
 [[[0.0, 0.0], [100.0, 0.0], [100.0, 100.0], [0.0, 100.0], [0.0, 0.0]]]

Alternative packages

If you want another lightweight pure Julia package for reading feature files, consider also GeoJSON.jl.

For much more fully featured support for reading and writing geospatial data, at the cost of a larger binary dependency, look at GDAL.jl or ArchGDAL.jl packages. The latter builds a higher level API on top of GDAL.jl.

Download Details:

Author: JuliaGeo
Source Code: https://github.com/JuliaGeo/Shapefile.jl 
License: View license

#julia #vector #geospatial 

NetCDF.jl: NetCDF Support for The Julia Programming Language

NetCDF.jl

NetCDF support for the Julia programming language. There is a high-level and a medium-level interface for writing and reading NetCDF files.

Installation

pkg> add NetCDF

Quickstart

First, load the library:

using NetCDF

The high-level interface is quite similar to the Matlab NetCDF interface; reading files is done by:

x = ncread("myfile.nc", "Radiation")

which will read the variable called "Radiation" from the file "myfile.nc". General information can be gained by using

ncinfo(filename)

which gives an overview of the dimensions, variables and attributes stored in the file.

filename = "myfile.nc"
varname  = "var1"
attribs  = Dict("units"   => "mm/d",
                "data_min" => 0.0,
                "data_max" => 87.0)

Creating variables and files is done by using the nccreate command:

nccreate(filename, varname, "x1", collect(11:20), "t", 20, Dict("units"=>"s"), atts=attribs)

This will create the variable called var1 in the file myfile.nc. The attributes defined in the Dict attribs are written to the file and are associated with the newly created variable. The dimensions "x1" and "t" of the variable are called "x1" and "t" in this example. If the dimensions do not exist yet in the file, they will be created. The dimension "x1" will be of length 10 and have the values 11..20, and the dimension "t" will have length 20 and the attribute "units" with the value "s".

Now we can write data to the file:

d = rand(10, 20)
ncwrite(d, filename, varname)

The full documentation can be found here

An alternative interface for reading NetCDF files can be found here: Alexander-Barth/NCDatasets.jl

DiskArray interface

As of version 0.9 we provide an interface to DiskArrays.jl, which lets you treat NetCDF variables as Julia AbstractArrays. Special methods for accessing, reading slices and reductions and broadcasting are implemented, so that working with the arrays should be efficient.

So the following returns a lazy AbstractArray container referencing the data in the variable.

v = NetCDF.open(filename, varname)
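Slices and reductions then read only what they need. A minimal sketch, reusing filename and varname from the examples above:

v = NetCDF.open(filename, varname)  # lazy handle; no bulk read happens yet
chunk = v[:, 1:5]                   # reads only the requested slice from disk
total = sum(v)                      # reduction streamed through DiskArrays.jl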

Credits

This package was originally started and is mostly maintained by Fabian Gans (fgans@bgc-jena.mpg.de). The automatic C wrapper generator was contributed by Martijn Visser (https://github.com/visr). Many thanks to several people who contributed bug fixes and enhancements.

Download Details:

Author: JuliaGeo
Source Code: https://github.com/JuliaGeo/NetCDF.jl 
License: View license

#julia #geospatial 

Geodesy.jl: Work with Points Defined in Various Coordinate Systems

Geodesy

Geodesy is a Julia package for working with points in various world and local coordinate systems. The primary feature of Geodesy is to define and perform coordinate transformations in a convenient and safe framework, leveraging the CoordinateTransformations package. Transformations are accurate and efficient and implemented in native Julia code (with many functions being ported from Charles Karney's GeographicLib C++ library), and some common geodetic datums are provided for convenience.

Quick start

Let's define a 3D point by its latitude, longitude and altitude (LLA):

x_lla = LLA(-27.468937, 153.023628, 0.0) # City Hall, Brisbane, Australia

This can be converted to a Cartesian Earth-Centered-Earth-Fixed (ECEF) coordinate simply by calling the constructor

x_ecef = ECEF(x_lla, wgs84)

Here we have used the WGS-84 ellipsoid to calculate the transformation, but other datums such as osgb36, nad27 and grs80 are provided. All transformations use the CoordinateTransformations' interface, and the above is short for

x_ecef = ECEFfromLLA(wgs84)(x_lla)

where ECEFfromLLA is a type inheriting from CoordinateTransformations' Transformation. (Similar names XfromY exist for each of the coordinate types.)

Often, points are measured or required in a local frame, such as the north-east-up coordinates with respect to a given origin. The ENU type represents points in this coordinate system and we may transform between ENU and globally referenced coordinates using ENUfromLLA, etc.

origin_lla = LLA(-27.468937, 153.023628, 0.0) # City Hall, Brisbane, Australia
point_lla = LLA(-27.465933, 153.025900, 0.0)  # Central Station, Brisbane, Australia

# Define the transformation and execute it
trans = ENUfromLLA(origin_lla, wgs84)
point_enu = trans(point_lla)

# Equivalently
point_enu = ENU(point_lla, origin_lla, wgs84)

Similarly, we could convert to UTM/UPS coordinates, and two types are provided for this - UTM stores 3D coordinates x, y, and z in an unspecified zone, while UTMZ includes the zone number and hemisphere bool (where true = northern, false = southern). To get the canonical zone for your coordinates, simply use:

x_utmz = UTMZ(x_lla, wgs84)

If you are transforming a large number of points to or from a given zone, it may be more effective to define the transformation explicitly and use the lighter UTM storage type.

points_lla::Vector{LLA{Float64}}
utm_from_lla = UTMfromLLA(56, false, wgs84) # Zone 56-South
points_utm = map(utm_from_lla, points_lla) # A new vector of UTM coordinates

Geodesy becomes particularly powerful when you chain together transformations. For example, you can define a single transformation from your data on disk in UTM coordinates to a local frame in ENU coordinates. Internally, this will perform UTM (+ zone) → LLA → ECEF → ENU by composing transformations with ∘ into a ComposedTransformation:

julia> origin = LLA(-27.468937, 153.023628, 0.0) # City Hall, Brisbane, Australia
LLA(lat=-27.468937°, lon=153.023628°, alt=0.0)

julia> trans = ENUfromUTMZ(origin, wgs84)
(ENUfromECEF(ECEF(-5.046925124630393e6, 2.5689157252069353e6, -2.924416653602336e6), lat=-27.468937°, lon=153.023628°) ∘ (ECEFfromLLA(wgs84) ∘ LLAfromUTMZ(wgs84)))

This transformation can then be composed with rotations and translations in CoordinateTransformations (or your own custom-defined AbstractTransformation) to define further reference frames. For example, in this way, a point measured by a scanner on a moving vehicle at a particular time may be globally georeferenced with a single call to the Transformation!

Finally, the Cartesian distance between world points can be calculated via automatic transformation to a Cartesian frame:

x_lla = LLA(-27.468937, 153.023628, 0.0) # City Hall, Brisbane, Australia
y_lla = LLA(-27.465933, 153.025900, 0.0) # Central Station, Brisbane, Australia
euclidean_distance(x_lla, y_lla)         # 401.54 meters

(assuming the wgs84 datum, which can be configured in euclidean_distance(x, y, datum)).

Basic Terminology

This section describes some terminology and concepts that are relevant to Geodesy.jl, attempting to define Geodesy-specific jargon where possible. For a longer, less technical discussion with more historical context, ICSM's Fundamentals of Mapping page is highly recommended.

Coordinate Reference Systems and Spatial Reference Identifiers

A position on the Earth can be given by some numerical coordinate values, but those don't mean much without more information. The extra information is called the Coordinate Reference System or CRS (also known as a Spatial Reference System or SRS). A CRS tells you two main things:

  • The measurement procedure: which real world objects were used to define the frame of reference or datum of the measurement?
  • The coordinate system: how do coordinate numerical values relate to the reference frame defined by the datum?

The full specification of a CRS can be complex, so a short label called a Spatial Reference IDentifier or SRID is usually used instead. For example, EPSG:4326 is one way to refer to the 2D WGS84 latitude and longitude you'd get from a mobile phone GPS device. An SRID is of the form AUTHORITY:CODE, where the code is a number and the authority is the name of an organization maintaining a list of codes with associated CRS information. There are services where you can look up a CRS, for example, http://epsg.io is a convenient interface to the SRIDs maintained by the European Petroleum Survey Group (EPSG) authority. Likewise, http://spatialreference.org is an open registry to which anyone can contribute.

When maintaining a spatial database, it's typical to define an internal list of SRIDs (effectively making your organization the authority), and a mapping from these to CRS information. A link back to a definitive SRID from an external authority should also be included where possible.

Datums

In spatial measurement and positioning, a datum is a set of reference objects with given coordinates, relative to which other objects may be positioned. For example, in traditional surveying a datum might comprise a pair of pegs in the ground, separated by a carefully measured distance. When surveying the position of an unknown but nearby point, the angle back to the original datum objects can be measured using a theodolite. After this, the relative position of the new point can be computed using simple triangulation. Repeating this trick with any of the now three known points, an entire triangulation network of surveyed objects can be extended outward. Any point surveyed relative to the network is said to be measured in the datum of the original objects. Datums are often named with an acronym, for example OSGB36 is the Ordnance Survey of Great Britain, 1936.

In the era of satellite geodesy, coordinates are determined for an object by timing signals from a satellite constellation (eg, the GPS satellites) and computing position relative to those satellites. Where is the datum here? At first glance the situation seems quite different from the traditional setup described above. However, the satellite positions as a function of time (ephemerides, in the jargon) must themselves be defined relative to some frame. This is done by continuously observing the satellites from a set of highly stable ground stations equipped with GPS receivers. It is the full set of these ground stations and their assigned coordinates which form the datum.

Let's inspect the flow of positional information in both cases:

  • For traditional surveying,
datum object positions -> triangulation network -> newly surveyed point
  • For satellite geodesy,
datum object positions -> satellite ephemerides -> newly surveyed point

We see that the basic nature of a datum is precisely the same regardless of whether we're doing a traditional survey or using a GPS receiver.

Terrestrial reference systems and frames

Coordinates for new points are measured by transferring coordinates from the datum objects, as described above. However, how do we decide on coordinates for the datum objects themselves? This is purely a matter of convention, consistency and measurement.

For example, the International Terrestrial Reference System (ITRS) is a reference system that rotates with the Earth so that the average velocity of the crust is zero. That is, in this reference system the only crust movement is geophysical. Roughly speaking, the defining conventions for the ITRS are:

  • Space is modeled as a three-dimensional Euclidean affine space.
  • The origin is at the center of mass of the Earth (it is geocentric).
  • The z-axis is the axis of rotation of the Earth.
  • The scale is set to 1 SI meter.
  • The x-axis is orthogonal to the z-axis and aligns with the international reference meridian through Greenwich.
  • The y-axis is set to the cross product of the z and x axes, forming a right handed coordinate frame.
  • Various rates of change of the above must also be specified, for example, the scale should stay constant in time.

The precise conventions are defined in chapter 4 of the IERS conventions published by the International Earth Rotation and Reference Service (IERS). These conventions define an ideal reference system, but they're useless without physical measurements that give coordinates for a set of real world datum objects. The process of measuring and computing coordinates for datum objects is called realizing the reference system and the result is called a reference frame. For example, the International Terrestrial Reference Frame of 2014 (ITRF2014) realizes the ITRS conventions using raw measurement data gathered in the 25 years prior to 2014.

To measure and compute coordinates, several space geodesy techniques are used to gather raw measurement data; currently the IERS includes VLBI (very long baseline interferometry) of distant astronomical radio sources, SLR (satellite laser ranging), GPS (global positioning system) and DORIS (gosh these acronyms are tiring). The raw data is not in the form of positions, but must be condensed down in a large scale fitting problem, ideally by requiring physical and statistical consistency of all measurements, tying measurements at different sites together with physical models.

Coordinate systems

In geometry, a coordinate system is a system which uses one or more numbers, or coordinates to uniquely determine the position of a point in a mathematical space such as Euclidean space. For example, in geodesy a point is commonly referred to using geodetic latitude, longitude and height relative to a given reference ellipsoid; this is called a geodetic coordinate system.

An ellipsoid is chosen because it's a reasonable model for the shape of the Earth and its gravitational field without being overly complex; it has only a few parameters, and a simple mathematical form. The term spheroid is also used because the ellipsoids in use today are rotationally symmetric around the pole. Note that there are several ways to define latitude on an ellipsoid. The most natural for geodesy is geodetic latitude, used by default because it's physically accessible in any location as a good approximation to the angle between the gravity vector and the equatorial plane. (This type of latitude is not an angle measured at the centre of the ellipsoid, which may be surprising if you're used to spherical coordinates!)
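To make the distinction concrete, here is an illustrative snippet (not part of Geodesy.jl's API) relating geodetic latitude ϕ to geocentric latitude ψ via tan(ψ) = (1 − e²)·tan(ϕ), where e² = f(2 − f) for flattening f:

# WGS-84 flattening and first eccentricity squared
f  = 1 / 298.257223563
e2 = f * (2 - f)

# geocentric latitude from geodetic latitude, both in degrees
geocentric(ϕ) = atand((1 - e2) * tand(ϕ))

geocentric(-27.468937)  # ≈ -27.30, noticeably different from the geodetic value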

There are usually several useful coordinate systems for the same space. As well as the geodetic coordinates mentioned above, it's common to see

  • The x,y,z components in an Earth-Centred Cartesian coordinate system rotating with the Earth. This is conventionally called an Earth-Centred Earth-Fixed (ECEF) coordinate system. This is a natural coordinate system in which to define coordinates for the datum objects defining a terrestrial reference frame.
  • The east,north and up ENU components of a Cartesian coordinate frame at a particular point on the ellipsoid. This coordinate system is useful as a local frame for navigation.
  • Easting,northing and vertical components of a projected coordinate system or map projection. There's an entire zoo of these, designed to represent the curved surface of an ellipsoid with a flat map.

Different coordinate systems provide different coordinates for the same point, so it's obviously important to specify exactly which coordinate system you're using. In particular, you should specify which ellipsoid parameters are in use if you deal with latitude and longitude, as in principle you could have more than one ellipsoid. This is a point of confusion, because a datum in geodesy also comes with a reference ellipsoid as a very strong matter of convention (thus being called a geodetic datum).

With its conventional ellipsoid, a geodetic datum also defines a conventional geodetic coordinate system, thus bringing together concepts which are interconnected but conceptually distinct. To emphasize:

  • A coordinate system is a mathematical abstraction allowing us to manipulate geometric quantities using numeric and algebraic techniques. By itself, mathematical geometry is pure abstraction without a connection to the physical world.
  • A datum is a set of physical objects with associated coordinates, thereby defining a reference frame in a way which is physically accessible. A datum is the bridge which connects physical reality to the abstract ideal of mathematical geometry, via the algebraic mechanism of a coordinate system.

The API

Coordinate types

Geodesy provides several in-built coordinate storage types for convenience and safety. The philosophy is to avoid carrying around raw data in generic containers like Vectors with no concept of what coordinate system it is in.

LLA{T} - latitude, longitude and altitude

The global LLA type stores data in a lat-lon-alt order, where latitude and longitude are expected in degrees (not radians). A keyword constructor, LLA(lat=x, lon=y, alt=z), is also provided to help with having to remember the storage order.
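For example, the following two constructions are equivalent:

LLA(-27.468937, 153.023628, 0.0)
LLA(lat=-27.468937, lon=153.023628, alt=0.0)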

LatLon{T} - latitude and longitude

The 2D LatLon type stores data in a lat-lon order, where latitude and longitude are expected in degrees (not radians). A keyword constructor, LatLon(lat=x, lon=y), is also provided. LatLon is currently the only supported 2D coordinate.

ECEF{T} - Earth-centered, Earth-fixed

The global ECEF type stores Cartesian coordinates x, y, z, according to the usual convention. Being a Cartesian frame, ECEF is a subtype of StaticArrays' StaticVector and they can be added and subtracted with themselves and other vectors.

UTM{T} - universal transverse-Mercator

The UTM type encodes the easting x, northing y and height z of a UTM coordinate in an unspecified zone. This data type is also used to encode universal polar-stereographic (UPS) coordinates (where the zone is 0).

UTMZ{T} - universal transverse-Mercator + zone

In addition to the easting x, northing y and height z, the global UTMZ type also encodes the UTM zone and hemisphere, where zone is a UInt8 and hemisphere is a Bool for compact storage. The northern hemisphere is denoted as true, and the southern as false. Zone 0 corresponds to the UPS projection about the corresponding pole, otherwise zone is an integer between 1 and 60.

ENU{T} - east-north-up

The ENU type is a local Cartesian coordinate that encodes a point's distance towards east e, towards north n and upwards u with respect to an unspecified origin. Like ECEF, ENU is also a subtype of StaticVector.

Geodetic Datums

Geodetic datums are modelled as subtypes of the abstract type Datum. The associated ellipsoid may be obtained by calling the ellipsoid() function, for example, ellipsoid(NAD83()).

There are several pre-defined datums. Worldwide datums include

  • WGS84 - standard GPS datum for moderate precision work (representing either the latest frame realization or, if a time is supplied, a discontinuous dynamic datum where the time looks up the frame implementation date in the broadcast ephemerides.)
  • WGS84{GpsWeek} - specific realizations of the WGS84 frame.
  • ITRF{Year} - Realizations of the International Terrestrial Reference System for high precision surveying.

National datums include

  • OSGB36 - Ordnance Survey of Great Britain of 1936.
  • NAD27, NAD83 - North American Datums of 1927 and 1983, respectively
  • GDA94 - Geocentric Datum of Australia, 1994.

Datums may also be passed to coordinate transformation constructors such as transverse-Mercator and polar-stereographic projections in which case the associated ellipsoid will be extracted to form the transformation. For datums without extra parameters (everything except ITRF and WGS84{Week}) there is a standard instance defined to reduce the amount of brackets you have to type. For example, LLAfromECEF(NAD83()) and LLAfromECEF(nad83) are equivalent.

Transformations and conversions

Geodesy provides two interfaces changing coordinate systems.

"Transformations" are based on CoordinateTransformations interface for defining AbstractTransformations and allow the user to apply them by calling them, invert them with inv() and compose them with compose() or . The transformations cache any possible pre-calculations for efficiency when the same transformation is applied to many points.

"Conversions" are based on type-constructors, obeying simple syntax like LLA(ecef, datum). The datum or other information is always necessary, as no assumptions are made by Geodesy for safety and consistency reasons. Similarly, Base.convert is not defined because, without assumptions, it would require additional information. The main drawback of this approach is that some calculations may not be pre-cached (for instance, the origin of an ENU transformation).

Between LLA and ECEF

The LLAfromECEF and ECEFfromLLA transformations require an ellipsoidal datum to perform the conversion. The exact transformation is performed in both directions, using a port of the ECEF → LLA transformation from GeographicLib.

Note that in some cases where points are very close to the centre of the ellipsoid, multiple equivalent LLA points are valid solutions to the transformation problem. Here, as in GeographicLib, the point with the greatest altitude is chosen.

Between LLA and UTM/UTMZ

The LLAfromUTM(Z) and UTM(Z)fromLLA transformations also require an ellipsoidal datum to perform the conversion. The transformation retains a cache of the parameters used in the transformation, which in the case of the transverse-Mercator projection leads to a significant saving.

In all cases zone 0 corresponds to the UPS coordinate system, and the polar-stereographic projection of GeographicLib has been ported to Julia to perform the transformation.

An approximate, 6th-order expansion is used by default for the transverse-Mercator projection and its inverse (though orders 4-8 are defined). The algorithm is a native Julia port of that used in GeographicLib, and is accurate to nanometers for up to several UTM zones away from the reference meridian. However, the series expansion diverges at ±90° from the reference meridian. While the UTMZ-methods will automatically choose the canonical zone and hemisphere for the input, extreme care must be taken to choose an appropriate zone for the UTM methods. (In the future, we may implement the exact UTM transformation as a fallback — contributions welcome!)

There are also UTMfromUTMZ and UTMZfromUTM transformations, which are helpful for converting between these two formats and putting data into the same UTM zone.

To and from local ENU frames

The ECEFfromENU and ENUfromECEF transformations define the transformation around a specific origin. Both the origin coordinates as an ECEF as well as its corresponding latitude and longitude are stored in the transformation for maximal efficiency when performing multiple transforms. The transformation can be inverted with inv to perform the reverse transformation with respect to the same origin.
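For example, a round trip through a local frame might look like this (a sketch, reusing origin_lla and x_ecef from the quick start):

trans  = ENUfromECEF(origin_lla, wgs84)  # origin stored once, reused per point
p_enu  = trans(x_ecef)
p_ecef = inv(trans)(p_enu)               # ECEFfromENU about the same origin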

Web Mercator support

We support the Web Mercator / Pseudo Mercator projection with the WebMercatorfromLLA and LLAfromWebMercator transformations for interoperability with many web mapping systems. The scaling of the northing and easting is defined to be meters at the Equator, the same as how proj handles this (see https://proj.org/operations/projections/webmerc.html ).

If you need to deal with web mapping tile coordinate systems (zoom levels and pixel coordinates, etc) these could be added by composing another transformation on top of the web mercator projection defined in this package.

Composed transformations

Many other methods are defined as convenience constructors for composed transformations, to go between any two of the coordinate types defined here. These include:

  • ECEFfromUTMZ(datum) = ECEFfromLLA(datum) ∘ LLAfromUTMZ(datum)
  • UTMZfromECEF(datum) = UTMZfromLLA(datum) ∘ LLAfromECEF(datum)
  • UTMfromECEF(zone, hemisphere, datum) = UTMfromLLA(zone, hemisphere, datum) ∘ LLAfromECEF(datum)
  • ECEFfromUTM(zone, hemisphere, datum) = ECEFfromLLA(datum) ∘ LLAfromUTM(zone, hemisphere, datum)
  • ENUfromLLA(origin, datum) = ENUfromECEF(origin, datum) ∘ ECEFfromLLA(datum)
  • LLAfromENU(origin, datum) = LLAfromECEF(datum) ∘ ECEFfromENU(origin, datum)
  • ENUfromUTMZ(origin, datum) = ENUfromLLA(origin, datum) ∘ LLAfromUTMZ(datum)
  • UTMZfromENU(origin, datum) = UTMZfromLLA(datum) ∘ LLAfromENU(origin, datum)
  • UTMfromENU(origin, zone, hemisphere, datum) = UTMfromLLA(zone, hemisphere, datum) ∘ LLAfromENU(origin, datum)
  • ENUfromUTM(origin, zone, hemisphere, datum) = ENUfromLLA(origin, datum) ∘ LLAfromUTM(zone, hemisphere, datum)

Constructor-based transforms for these are also provided, such as UTMZ(ecef, datum) which converts to LLA as an intermediary, as above. When converting multiple points to or from the same ENU reference frame, it is recommended to use the transformation-based approach for efficiency. However, the other constructor-based conversions should be similar in speed to their transformation counterparts.
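For instance, to convert many points with one cached transformation (ecef_points below is a hypothetical Vector{ECEF{Float64}}):

trans = UTMZfromECEF(wgs84)            # UTMZfromLLA(datum) ∘ LLAfromECEF(datum)
utmz_points = map(trans, ecef_points)  # parameters computed once, reused per point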

Distance

Currently, the only defined distance measure is the straight-line or Euclidean distance, euclidean_distance(x, y, [datum = wgs84]), which works for all combinations of types for x and y - except that the UTM zone and hemisphere must also be provided for UTM types, as in euclidean_distance(utm1, utm2, zone, hemisphere, [datum = wgs84]) (the Cartesian distance for UTM types is not approximated, but achieved via conversion to ECEF).

This is the only function currently in Geodesy which takes a default datum, and should be relatively accurate for close points where Euclidean distances are most important. Future work may focus on geodesics and related calculations (contributions welcome!).

Download Details:

Author: JuliaGeo

Source Code: https://github.com/JuliaGeo/Geodesy.jl 

License: View license

#julia #geospatial 

Thin Julia Wrapper for GDAL - Geospatial Data Abstraction Library

GDAL.jl

Julia wrapper for GDAL - Geospatial Data Abstraction Library. This package is a binding to the C API of GDAL/OGR. It provides only a C style usage, where resources must be closed manually, and datasets are pointers.

Other packages can build on top of this to provide a more Julian user experience. See for example ArchGDAL.jl.

Most users will want to use ArchGDAL.jl instead of using GDAL.jl directly.

Installation

This package is registered, so add it using Pkg. This will also download GDAL binaries created in Yggdrasil.

pkg> add GDAL

To check if it is installed correctly, you could run the test suite with:

pkg> test GDAL

Documentation

Docstrings are automatically inserted from the GDAL documentation. Note that these are written for the C API, so function names and argument type names will differ.

julia> using GDAL

help?> GDAL.ogr_g_creategeometry
  OGR_G_CreateGeometry(OGRwkbGeometryType eGeometryType) -> OGRGeometryH

  Create an empty geometry of desired type.

  Parameters
  ––––––––––––

    •    eGeometryType: the type code of the geometry to be created.

  Returns
  –––––––––

  handle to the newly create geometry or NULL on failure. Should be freed with OGRGDestroyGeometry() after use.

Further usage documentation is not yet available, but the files test/tutorial_raster.jl and test/tutorial_vector.jl should provide a good hint based on the API tutorials from GDAL.org.

The bulk of this package is generated automatically by the scripts under gen/.

Using the GDAL and OGR utilities

The provided GDAL installation also contains the commonly used utilities such as gdal_translate and ogr2ogr. They can be called from Julia like so:

using GDAL

# list information about a raster dataset
GDAL.gdalinfo_path() do gdalinfo
    run(`$gdalinfo path/to/raster-file`)
end

# convert raster data between different formats
GDAL.gdal_translate_path() do gdal_translate
    run(`$gdal_translate -of COG input.asc output.tif`)
end

# list information about an OGR-supported data source
GDAL.ogrinfo_path() do ogrinfo
    run(`$ogrinfo path/to/vector-file`)
end

# convert simple features data between file formats
GDAL.ogr2ogr_path() do ogr2ogr
    run(`$ogr2ogr -f FlatGeobuf output.fgb input.shp`)
end

The GDAL.<util>_path functions are defined in the GDAL_jll package. If you only wish to run the utilities, that package will have all you need. A list of the available utilities can be found here. Documentation for them is available on gdal.org/programs. Note that programs implemented in Python (ending in .py) are not available, since those would require a Python installation.

Since GDAL 2.1's RFC59.1, most utilities are also available as functions in the library; they are implemented here and tested here. If these are used, you can avoid the need to call the binaries.

If you want to use these utilities from outside Julia, note that this will not work unless you set two things:

  1. The environment variable GDAL_DATA must be set to the value returned in julia by GDAL.GDAL_DATA[].
  2. Julia's Sys.BINDIR must be in your path.

Inside of Julia, (2) is always the case, and (1) happens on loading the GDAL module, in its __init__ function.

Missing driver to support a format

If you get an error such as the one below:

GDALError (CE_Failure, code 6):
    The <...> driver needs to be compiled to support <...>

This means that the GDAL binaries you are using, which normally come from the Yggdrasil community build tree, are not compiled with support for the format or feature you need. GDAL is a large library with many optional dependencies which allow support for more formats. Currently the number of formats supported is still limited, but it will grow over time. Lists of available formats can be found here for rasters and here for vectors. If you need support for another format, consider opening an issue in this repository. Many formats need external libraries as added dependencies. This means a Yggdrasil build also needs to be available for that library and added as a dependency. See issue #65 for a discussion on which new drivers should be prioritized.

Download Details:

Author: JuliaGeo
Source Code: https://github.com/JuliaGeo/GDAL.jl 
License: View license

#julia #vector #geospatial 

A High Level API for GDAL - Geospatial Data Abstraction Library

ArchGDAL   

GDAL is a translator library for raster and vector geospatial data formats that is released under an X/MIT license by the Open Source Geospatial Foundation. As a library, it presents an abstract data model to drivers for various raster and vector formats.

This package aims to be a complete solution for working with GDAL in Julia, similar in scope to the SWIG bindings for Python and the user-friendliness of Fiona and Rasterio. It builds on top of GDAL.jl, and provides a high level API for GDAL, espousing the following principles.

Principles (The Arch Way)

(adapted from: https://wiki.archlinux.org/index.php/Arch_Linux#Principles)

  • simplicity: ArchGDAL tries to avoid unnecessary additions or modifications. It preserves the GDAL Data Model and requires minimal dependencies.
  • modernity: ArchGDAL strives to maintain the latest stable release versions of GDAL as long as systemic package breakage can be reasonably avoided.
  • pragmatism: The principles here are only useful guidelines. Ultimately, design decisions are made on a case-by-case basis through developer consensus. Evidence-based technical analysis and debate are what matter, not politics or popular opinion.
  • user-centrality: Whereas other libraries attempt to be more user-friendly, ArchGDAL shall be user-centric. It is intended to fill the needs of those contributing to it, rather than trying to appeal to as many users as possible.
  • versatility: ArchGDAL will strive to remain small in its assumptions about the range of user-needs, and to make it easy for users to build their own extensions/conveniences.
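As a small taste of that high-level API, here is a hedged sketch (the file name example.tif is hypothetical; see the package documentation for the full API):

using ArchGDAL

# open a raster dataset and read its first band into a Julia array;
# the do-block ensures the dataset is closed afterwards
ArchGDAL.read("example.tif") do dataset
    band = ArchGDAL.getband(dataset, 1)
    ArchGDAL.read(band)
end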

Installation

To install this package, run the following command in the Pkg REPL-mode,

pkg> add ArchGDAL

To test if it is installed correctly,

pkg> test ArchGDAL

Getting Involved

Community

This package would not be possible without JuliaLang, GDAL and GDAL.jl. They are maintained by https://julialang.org/community/, https://www.osgeo.org/ and https://juliageo.org/ respectively. In case of any contention for support and involvement, we encourage participation and contributions to those projects and communities over this package.

Style Guide

ArchGDAL.jl uses JuliaFormatter.jl as an autoformatting tool, and uses the options in .JuliaFormatter.toml.

If you wish to format code, cd to the ArchGDAL.jl directory, then run:

] add JuliaFormatter
using JuliaFormatter
format(".")

Dependencies

To manage the dependencies of this package, we work with environments:

Navigate to the directory corresponding to the package:

$ cd /Users/yeesian/.julia/dev/ArchGDAL
/Users/yeesian/.julia/dev/ArchGDAL

Start a session:

$ julia --project

Activate the environment corresponding to Project.toml:

(@v1.6) pkg> activate .
  Activating environment at `~/.julia/environments/v1.6/Project.toml`

Manage the dependencies using Pkg in https://pkgdocs.julialang.org/v1.6/managing-packages/, e.g.

(ArchGDAL) pkg> st
     Project ArchGDAL v0.6.0
      Status `~/.julia/dev/ArchGDAL/Project.toml`
  [3c3547ce] DiskArrays
  [add2ef01] GDAL
  [68eda718] GeoFormatTypes
  [cf35fbd7] GeoInterface
  [bd369af6] Tables
  [ade2ca70] Dates

(ArchGDAL) pkg> add CEnum
   Resolving package versions...
    Updating `~/.julia/dev/ArchGDAL/Project.toml`
  [fa961155] + CEnum v0.4.1
  [3c3547ce] + DiskArrays v0.2.7
  [add2ef01] + GDAL v1.2.1
  [68eda718] + GeoFormatTypes v0.3.0
  [cf35fbd7] + GeoInterface v0.5.5
  [bd369af6] + Tables v1.4.2

Update the [compat] section of Project.toml so that julia can resolve the versions, e.g.

[compat]
...
CEnum = "0.4"

Download Details:

Author: Yeesian
Source Code: https://github.com/yeesian/ArchGDAL.jl 
License: View license

#julia #vector #geospatial 

D3-geo-voronoi: Voronoi / Delaunay Tessellations on The Sphere

d3-geo-voronoi

This module adapts d3-delaunay for spherical data. Given a set of objects in spherical coordinates, it computes their Delaunay triangulation and its dual, the Voronoi diagram.

In addition, it offers convenience methods to extract the convex hull, the Urquhart graph, the circumcenters of the Delaunay triangles, and to find the cell that contains any given point on the sphere.

The module offers two APIs.

The GeoJSON API is the most convenient for drawing the results on a map. It follows as closely as possible the API of the d3-voronoi module. It is available with d3.geoVoronoi().

A lighter API is available with d3.geoDelaunay(). It offers the same contents, but with a different presentation, where every vertex, edge, polygon… is referenced by an id rather than by its coordinates. This allows a more compact representation in memory and eases topology computations.

Installing

If you use npm, npm install d3-geo-voronoi. You can also download the latest release on GitHub. For vanilla HTML in modern browsers, import d3-geo-voronoi from Skypack:

<script type="module">
import {geoDelaunay} from "https://cdn.skypack.dev/d3-geo-voronoi@2";
</script>

For legacy environments, you can load d3-geo-voronoi’s UMD bundle from an npm-based CDN such as jsDelivr; a d3 global is exported:

<script src="https://cdn.jsdelivr.net/npm/d3-geo-voronoi@2"></script>
<script>
d3.geoContour();
</script>

API Reference

Delaunay

This API is similar to d3-delaunay’s API. It provides information on the Delaunay triangulation (edges, triangles, neighbors, Voronoi cells, etc.) as indices into two arrays: the array of points, and the array of circumcenters. It facilitates topological computations. To draw the actual triangles, Voronoi cells etc., the Voronoi API described in the next section will often be easier to use.

# d3.geoDelaunay([data]) · Source

Creates a new spherical Delaunay layout. data must be passed as an array of [lon, lat] coordinates.
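
As a minimal sketch (the import URL matches the Skypack example above; the sites are hypothetical):

import {geoDelaunay} from "https://cdn.skypack.dev/d3-geo-voronoi@2";

// Hypothetical sites as [lon, lat] pairs in degrees.
const sites = [[2.35, 48.85], [-0.13, 51.5], [13.4, 52.52], [-3.7, 40.42]];
const delaunay = geoDelaunay(sites);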

# delaunay.find(lon, lat[, node])

Returns the closest point to [lon, lat]; optionally starting the search at node to boost the performance.
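
Reusing the sketch above; since this API works with indices, the result is the index of the closest site:

const i = delaunay.find(5, 45);        // index of the site closest to [5, 45]
const j = delaunay.find(5.1, 45.2, i); // starting at i speeds up nearby queries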

# delaunay.urquhart([distances])

Given a vector of distances (in the same order as the edges list), returns a vector of boolean values: true if the edge belongs to the Urquhart graph, false otherwise.
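
One way to supply the distances is d3-geo's geoDistance, one great-circle distance per entry of delaunay.edges (the choice of d3-geo here is an assumption; any consistent distance measure works):

import {geoDistance} from "https://cdn.skypack.dev/d3-geo@3";

// One spherical distance (in radians) per entry of delaunay.edges.
const distances = delaunay.edges.map(([a, b]) => geoDistance(sites[a], sites[b]));
const inUrquhart = delaunay.urquhart(distances); // one boolean per edge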


# delaunay.hull()

Returns an array of indices of points on the hull. The array is empty if the points cover more than a hemisphere.

# delaunay.edges

An array of edges as indices of points [from, to].


# delaunay.triangles

An array of the triangles, as indices of points [a, b, c]. The triangles are oriented in a clockwise manner; triangles that span more than a hemisphere are removed.

# delaunay.centers

The array of centers in spherical coordinates; the first t centers are the circumcenters of the t triangles. Additional centers may be listed in order to build the Voronoi diagram for small numbers of points (n ≤ 3).

# delaunay.neighbors

The array of neighbor indices for each vertex.


# delaunay.polygons

Array of Voronoi cells for each vertex. Each cell is an array of centers ordered in a clockwise manner.

# delaunay.mesh

An array containing all the edges of the Voronoi polygons.

Voronoi

This API is a wrapper around the Delaunay API, with inputs and outputs in GeoJSON, ready to draw on a map.

# d3.geoVoronoi([data]) · Source, Examples

Creates a new spherical Voronoi layout. data can be passed as an array of [lon, lat] coordinates, an array of GeoJSON features, or a GeoJSON FeatureCollection.
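
Continuing the hedged sketch from the Delaunay section (same hypothetical sites array):

import {geoVoronoi} from "https://cdn.skypack.dev/d3-geo-voronoi@2";

const voronoi = geoVoronoi(sites); // sites may also be GeoJSON features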

The following methods are similar to d3-voronoi's methods:

# voronoi.delaunay

The geoDelaunay object used to compute this diagram.

# voronoi.x([x])

Sets or returns the x accessor. The default x and y accessors are smart enough to recognize GeoJSON objects and return the geoCentroid of each feature.

# voronoi.y([y])

Sets or returns the y accessor.

# voronoi.polygons([data]) · Source, Examples

Returns the Voronoi tessellation of the data as a GeoJSON collection of polygons. (If there is only one data point, returns the Sphere). Each polygon exposes its datum in its properties.
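
To render the cells, each GeoJSON feature can be fed to a geo path generator. A sketch using d3-geo (the orthographic projection is an arbitrary choice here):

import {geoPath, geoOrthographic} from "https://cdn.skypack.dev/d3-geo@3";

const path = geoPath(geoOrthographic());
const cells = voronoi.polygons().features; // one Feature per input site
const d = path(cells[0]);                  // SVG path string for the first cell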

# voronoi.cellMesh([data]) · Source

Returns the Voronoi tessellation as a GeoJSON mesh (MultiLineString).

# voronoi.triangles([data]) · Source, Examples

Returns the spherical Delaunay triangulation of the data as a GeoJSON collection of polygons. Each triangle exposes in its properties the three sites, its spherical area (in steradians), and its circumcenter.

# voronoi.mesh([data]) · Source, Examples

Returns the Delaunay edges as a GeoJSON mesh (MultiLineString).

# voronoi.links([data]) · Source, Examples

Returns the Delaunay links of the data as a GeoJSON collection of lines. Each line exposes its source and target in its properties, but also its length (in radians), and a boolean flag for links that belong to the Urquhart graph.
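
A sketch of extracting the Urquhart subgraph from the links; note that the property name urquhart is an assumption based on the library's examples, not a documented guarantee:

const links = voronoi.links().features;
const urquhartLinks = links.filter((d) => d.properties.urquhart);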

voronoi.extent([extent]) and voronoi.size([size]) are not implemented.

Indeed, defining the “paper extent” of the geoVoronoi polygons can be quite tricky, as this block demonstrates.

# voronoi.find(x,y,[angle]) · Source, Examples

Finds the closest site to point x,y, i.e. the Voronoi polygon that contains it. Optionally, returns null if the distance between the point and the site is larger than angle degrees.
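
For example, reusing the voronoi sketch above:

voronoi.find(-30, 45);     // closest site to [-30, 45]
voronoi.find(-30, 45, 10); // null if that site is more than 10 degrees away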

# voronoi.hull(data) · Source, Examples

Returns the spherical convex hull of the data array, as a GeoJSON polygon. Returns null if the dataset spans more than a hemisphere. Equivalent to:

voronoi(data).hull();

Contours

Create spherical contours for non-gridded data.

The API of geoContour is similar to that of d3-contour and d3-tricontour:

# d3.geoContour() · Source, Examples

Constructs a new geocontour generator with the default settings.


# geocontour(data) · Examples

Returns an array of contours, one for each threshold. The contours are MultiPolygons in GeoJSON format that contain all the points with a value larger than the threshold. The value is indicated as geometry.value.

The data is passed as an array of points, by default with the format [lon, lat, value].
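
A minimal sketch with hypothetical [lon, lat, value] triples (here, made-up temperature readings):

import {geoContour} from "https://cdn.skypack.dev/d3-geo-voronoi@2";

const readings = [
  [2.35, 48.85, 15], [-0.13, 51.5, 12], [13.4, 52.52, 9], [-3.7, 40.42, 18]
];
const contours = geoContour()(readings); // GeoJSON MultiPolygons, each with .value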

# geocontour.contour(data[, threshold])

Returns a contour, as a MultiPolygon in GeoJSON format, containing all points with a value larger than or equal to threshold. The threshold is indicated as geometry.value.

# geocontour.contours(data)

Returns an iterable over the contours.


# geocontour.isobands(data)

Returns an iterable over the isobands: contours between pairs of consecutive threshold values v0 (inclusive) and v1 (exclusive). geometry.value is equal to v0, geometry.valueMax to v1.


# geocontour.x([x])

Sets the x (longitude) accessor. Defaults to `d => d[0]`. If x is not given, returns the current x accessor.

# geocontour.y([y])

Sets the y (latitude) accessor. Defaults to `d => d[1]`. If y is not given, returns the current y accessor.

# geocontour.value([value])

Sets the value accessor. Defaults to `d => d[2]`. Values must be defined and finite. If value is not given, returns the current value accessor.
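
If the data uses named fields instead of positional triples, the accessors can be redefined; a sketch with hypothetical records:

const records = [
  {lon: 2.35, lat: 48.85, temp: 15},
  {lon: -0.13, lat: 51.5, temp: 12},
];
const contour = geoContour()
    .x((d) => d.lon)
    .y((d) => d.lat)
    .value((d) => d.temp);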

Related examples: Blurry geoContours; geoContour and H3.

# geocontour.thresholds([thresholds])

Sets the thresholds, either explicitly as an array of values, or as a count that will be passed to d3.ticks. If no argument is given, returns the current thresholds.
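
Continuing the sketch above, both forms look like this:

contour.thresholds([0, 5, 10, 15]); // explicit threshold values
contour.thresholds(10);             // about 10 values chosen via d3.ticks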

Note: d3.geoContour uses the experimental API of d3-tricontour: triangulate, pointInterpolate and ringsort.

Other tools & projections

There is no reason to limit the display of Voronoi cells to the orthographic projection. The example below displays the Urquhart graph of top container ports on a Winkel tripel map.

Geo_triangulate converts GeoJSON to triangles for 3d rendering.

Comparison with planar Voronoi Diagrams

The Delaunay/Voronoi topology is quite different on the sphere and on the plane. This module deals with these differences by first projecting the points with a stereographic projection, then stitching the geometries that are near the singularity of the projection (the “infinite horizon” on the plane is one point on the sphere).

geoVoronoi returns GeoJSON objects, which are often FeatureCollections. As a consequence, you will have to change .data(voronoi.polygons()) to .data(geovoronoi.polygons().features), and so on.

geoVoronoi is built on d3-delaunay, whose spherical adaptation is also exposed as d3.geoDelaunay in this library. If you want the fastest results, you should try to use d3.geoDelaunay directly (see the examples).

geoVoronoi and geoDelaunay offer methods to compute the spherical convex hull and the Urquhart graph of the data set. These can be achieved with the planar Voronoi (hull, Urquhart), but are not part of d3-voronoi or d3-delaunay.

Author: Fil
Source Code: https://github.com/Fil/d3-geo-voronoi 
License: ISC license

#javascript #3d #geospatial 

D3-geo-voronoi: Voronoi / Delaunay Tessellations on The Sphere
Emilie  Okumu

Emilie Okumu

1624358940

Enhancing Data Visualizations with Maps

The dataset used to create the examples below contains all places of worship across the United States and its territories. The data was collected and curated in 2019 and downloaded from Data World: US Places of Worship. Containing over 49,000 records, the dataset includes locations of places of worship, as well as the religious types that each facility is used by. This includes Buddhist, Christian, Hindu, Islamic, Judaic, and Sikh places of worship.

Conventional bar graphs will be used to familiarize us with the information. The same data will then be presented using geospatial visualizations (data placed on maps). Using maps can provide more context around the information, allowing readers to make more connections and obtain a better understanding.

Places of Worship in the United States

It behooves us to first gain an understanding of the distribution of the most popular kinds of places of worship in the United States. Figure 1 provides an interesting spread, with Christian places of worship overwhelmingly the most common. The second most prevalent is Judaic, with Buddhist and Muslim places of worship following. Hindu and other religious places of worship are the least common in the United States.

Figure 2 provides a view of the total number of places of worship in each state and territory. Additionally, we can see the split of which types of places of worship are included in each state. California and Texas, being two of the three largest states, have the most places of worship.

By viewing the same data with a proportional symbol map in Figure 3, the average locations of the establishments within each state become evident. The size of each symbol represents how many places of worship of each religion type are in each state. While present in all states, Christian places of worship are concentrated in the eastern half of the country. The non-green circles show us where other religious places of worship have a larger presence at the state level.

#data-visualization #religion #geospatial #visualization

Enhancing Data Visualizations with Maps
Grace  Lesch

Grace Lesch

1622536376

What Can I Do With Geospatial Search? Let Me Count the Ways

Couchbase Full Text Search (FTS) is a great tool for indexing and querying geospatial data. In this article, I’ll present a geospatial search use case and demonstrate the various ways that we can perform a search of location data using the Couchbase Full Text Search service. I’ll be using Couchbase Server Enterprise Edition 6.6 (running locally in Docker) to create an FTS index on my sample geospatial dataset and then run geospatial queries against the index.

My family has always enjoyed visiting and exploring Great Smoky Mountains National Park (or GRSM, the National Park Service’s abbreviation), and one day we might be interested in relocating there. But you can’t live in the national park, so we need to consider the various cities and towns near the park and make a short list of the ones to evaluate and possibly visit.

#database #geospatial #geospatial search

What Can I Do With Geospatial Search? Let Me Count the Ways
Kasey  Turcotte

Kasey Turcotte

1620097860

Map Matching Done Right using Valhalla’s Meili

A production-ready solution for dealing with GPS inaccuracies

The problem of GPS being less than perfectly accurate is not new, and neither is the “snap-to-road” functionality that matches GPS traces to roads. Most projects facing this problem solve it either through the Google Maps Snap to Roads API or the OSRM Match Service API. But I’m going to show you the path I took, and I haven’t regretted it.

Problem

Primarily, my problem was to calculate the distances travelled by cars per day and per month. It was also important that I was building a production solution that could do this every day, not just a one-off piece of research.

Since I had multiple already-preprocessed GPS traces in GeoJSON, I created a Python script that calculates distances with the geopy.distance.geodesic function by looping over all of a trip’s geopoints. By my rough estimate, this approach sometimes had an error rate of more than 30%, which is horrible. Let me explain why.

As shown in the map below, sometimes GPS traces are not exactly accurate. Obviously, the driver was using the road path here, but the geopoints that we got from the GPS are placed near the road, not on it.

#pandas #gps #geospatial #valhalla #snap-to-road

Map Matching Done Right using Valhalla’s Meili
Jamison  Fisher

Jamison Fisher

1619539200

The Easiest Way to Plot Data From Pandas on A World Map

This guide is intended to be quick and easy, with as few words and as little code as possible, showing you how to plot data from a Pandas object on a world map using the Matplotlib and Geopandas libraries.

The Python libraries you need to install are pandas, geopandas and matplotlib. You can find the code for this tutorial on GitHub: https://github.com/udiy/blogposts/tree/main/Easy%20map

The data in the Pandas object needs to have location coordinates, which means latitude and longitude. For this article, I am using data about fires in Australia, which can be found here: https://www.kaggle.com/carlosparadis/fires-from-space-australia-and-new-zeland.

The data comes from the MODIS instrument aboard NASA satellites, which monitors fires from space; acknowledgments are at the end of this article. In the data, there’s a column called brightness, which is a measure of the temperature (in Kelvin) of the fire.

#gis #data-analysis #geospatial #python #pandas

The Easiest Way to Plot Data From Pandas on A World Map