Package polyline implements a Google Maps Encoding Polyline encoder and decoder.
func ExampleEncodeCoords() {
    coords := [][]float64{
        {38.5, -120.2},
        {40.7, -120.95},
        {43.252, -126.453},
    }
    fmt.Println(string(polyline.EncodeCoords(coords)))
    // Output: _p~iF~ps|U_ulLnnqC_mqNvxq`@
}

func ExampleDecodeCoords() {
    buf := []byte("_p~iF~ps|U_ulLnnqC_mqNvxq`@")
    coords, _, _ := polyline.DecodeCoords(buf)
    fmt.Println(coords)
    // Output: [[38.5 -120.2] [40.7 -120.95] [43.252 -126.453]]
}
Author: twpayne
Source Code: https://github.com/twpayne/go-polyline
License: BSD-2-Clause license
Package geom implements efficient geometry types for geospatial applications, including sql.Scanner and driver.Value interface implementations for easy database integration.
The WKB and EWKB formats encode geometry sizes, and memory is allocated for those geometries. If the input is malicious or malformed, the memory allocation can be very large, leading to a memory starvation denial-of-service attack against the server. For example, a client might send a MultiPoint with a header indicating that it contains 2^32-1 points. This will result in the server allocating 2 × sizeof(float64) × (2^32-1) = 64GB of memory to store those points. By default, malicious or malformed input protection is disabled, but it can be enabled by setting positive values for wkbcommon.MaxGeometryElements.
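As a brief, hedged sketch of decoding WKB with this package (the hex literal below is a hand-encoded little-endian POINT(1 2); for untrusted input you would additionally set positive wkbcommon.MaxGeometryElements limits as described above, whose exact field layout is documented in the wkbcommon package):
package main

import (
    "encoding/hex"
    "fmt"
    "log"

    "github.com/twpayne/go-geom/encoding/wkb"
)

func main() {
    // Little-endian WKB encoding of POINT(1 2).
    data, err := hex.DecodeString("0101000000000000000000f03f0000000000000040")
    if err != nil {
        log.Fatal(err)
    }
    // Decode the WKB into a geom.T geometry.
    g, err := wkb.Unmarshal(data)
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(g.FlatCoords()) // [1 2]
}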
Author: twpayne
Source Code: https://github.com/twpayne/go-geom
License: BSD-2-Clause license
An open source library and framework for deep learning on satellite and aerial imagery.
Raster Vision is an open source Python library and framework for building computer vision models on satellite, aerial, and other large imagery sets (including oblique drone imagery).
It has built-in support for chip classification, object detection, and semantic segmentation with backends using PyTorch.
As a library, Raster Vision provides a full suite of utilities for dealing with all aspects of a geospatial deep learning workflow: reading geo-referenced data, training models, making predictions, and writing out predictions in geo-referenced formats.
As a low-code framework, Raster Vision allows users (who don't need to be experts in deep learning!) to quickly and repeatably configure experiments that execute a machine learning pipeline including: analyzing training data, creating training chips, training models, creating predictions, evaluating models, and bundling the model files and configuration for easy deployment.
Raster Vision also has built-in support for running experiments in the cloud using AWS Batch.
See the documentation for more details.
For more details, see the Setup documentation.
You can install Raster Vision directly via pip:
pip install rastervision
Alternatively, you may use a Docker image. Docker images are published to quay.io (see the tags tab).
We publish a new tag per merge into master, tagged with the first 7 characters of the commit hash. To use the latest version, pull the image with the latest suffix, e.g. raster-vision:pytorch-latest. Git tags are also published, with the GitHub tag name as the Docker tag suffix.
You can also build a Docker image from scratch yourself. After cloning this repo, run docker/build, and then run the container using docker/run.
Non-developers may find it easiest to use Raster Vision as a low-code framework where Raster Vision handles all the complexities and the user only has to configure a few parameters. The Quickstart guide is a good entry-point into this. More advanced examples can be found on the Examples page.
For developers and those looking to dive deeper or combine Raster Vision with their own code, the best starting point is Usage Overview, followed by Basic Concepts and Tutorials.
You can ask questions and talk to developers (let us know what you're working on!) at:
For more information, see the Contribution page.
We are happy to take contributions! It is best to get in touch with the maintainers about larger features or design changes before starting the work, as it will make the process of accepting changes smoother.
Everyone who contributes code to Raster Vision will be asked to sign the Azavea CLA, which is based off of the Apache CLA.
Download a copy of the Raster Vision Individual Contributor License Agreement or the Raster Vision Corporate Contributor License Agreement
Print out the CLAs and sign them, or use PDF software that allows placement of a signature image.
Send the CLAs to Azavea by one of:
Author: Azavea
Source Code: https://github.com/azavea/raster-vision
License: View license
#machinelearning #python #computervision #deeplearning #geospatial #pytorch
Geospatial analytics is concerned with data used to locate anything on the globe; from an Uber driver to someone finding their way around a new neighbourhood, everybody uses this data in some way or other. Its technology involves GPS (global positioning systems), GIS (geographical information systems), and RS (remote sensing). In this blog we will explore the topic in depth, starting with the basics and then diving into the details.
Geospatial data is necessary for so many things and is used daily for various reasons, from an ordinary person's commute to the guidance data in the missiles of a country's defence organization. It is extracted from various sources: every phone with an active internet connection contributes to geospatial data in some way, and satellites collect data daily. It is of great use in everyday life, and so it deserves a significant amount of attention. It can be used to help respond to natural hazards and disasters, and to track global climate change, wildlife, natural resources, and so on. It is also used for satellite imagery, whether for tactical or weather-forecasting purposes. Many tech giants such as Uber use it on a daily basis to ease everyday life. To stand out in the market, a company has to extract this data efficiently and put it to good use.
Various methods can do this, but mainly Presto and Hive are used to extract and reshape data that runs into hundreds of petabytes and to use it efficiently, making the lives of billions easier. This data is vital because it touches the vast majority of people and is used every second. GIS helps in the collection, storage, manipulation, analysis, and presentation of spatial data: whatever the situation at the local, regional or national level, whenever "where?" is asked, it comes into play. And it would not be effective without visualization.
Presto is an open-source distributed SQL query engine, used to answer queries of any size or type. It runs on Hadoop and supports many non-relational sources as well as Teradata. It can query data in its original location, without moving the actual data to any separate system. Query execution runs in parallel over a purely memory-based architecture, with most results returning within seconds. Many tech giants use it, and it is a popular choice for interactive queries over data ranging into hundreds of petabytes.
Hive is a data warehouse infrastructure tool for processing structured data, developed on top of the Hadoop distributed file system. It resides on top of Hadoop to summarize big data and makes querying and analyzing any kind of data accessible.
It is an ETL and data warehousing tool built on top of Hadoop. It helps to perform many operations securely, such as:
Hive supports applications written in languages such as Java, Python and C++, using Thrift, JDBC and ODBC drivers, so it is easy to write a Hive client application in the desired language. Its clients are categorized into three types:
Hive provides various services, such as:
Presto has two central parts: the Coordinator and the Workers. It is an open-source distributed system that can run on multiple machines, and its distributed SQL query engine was built for fast analytic queries. A deployment includes one Coordinator and any number of Workers.
The key components of Presto are:
The Coordinator is the brain of any installation; it manages all the worker nodes for all query-related work. It gets results from the workers and returns the final output to the client. It connects with workers and clients via REST.
Workers help to execute tasks and to process the data. These nodes share data amongst each other and receive tasks from the Coordinator.
The Catalog contains information related to the data, such as where the data is located, where the schema is located, and the data source.
Tables and schemas are similar to what they mean in a relational database: a table is a set of rows organized into named columns, and a schema is what you use to hold your tables.
The Connector is used to help Presto integrate with external data sources.
To execute a query, Presto breaks it up into stages.
Stages are implemented as a series of tasks that may be distributed over the Workers.
Tasks contain one or more parallel drivers, which are pipelines of operators in memory. An operator consumes, transforms and produces data.
The deployment strategies for Hive are listed below:
Amazon EMR can be used to deploy the Hive metastore. Users can opt for one of three configurations that Amazon has to offer, namely Embedded, Local or Remote. There are two options for creating an external Hive metastore on EMR:
Apache Hive on Cloud Dataproc provides an efficient and flexible way of working by storing Hive data in Cloud Storage and hosting the Hive metastore in a MySQL database on Cloud SQL. It offers advantages such as flexibility and agility, letting users tailor the cluster configuration for specific workloads and scale the cluster according to need. It also helps in saving cost.
The deployment strategies for Presto are listed below:
Amazon EMR allows you to quickly spin up a managed EMR cluster with the Presto query engine and run interactive analysis on data stored in Amazon S3. It is used to run interactive queries. A Presto implementation can be built in the cloud on Amazon Web Services; Amazon EMR and Amazon Athena both provide ways to build and run it.
A cluster that includes the Presto component can be prepared easily.
The various ways to optimise are described below:
The advantages of Hive and Presto are:
Modelling geospatial data has quite a few complexities. Well-Known Text (WKT) is used to model different locations on the map, with types such as points and polygons used for these purposes. A spatial library is used for spatial processing in Hive, with User-Defined Functions and SerDes. By enabling this library in Hive, queries may be written using Hive Query Language (HQL), which is fairly close to SQL; you can therefore avoid complex MapReduce algorithms and stick to a more common workflow. Hive's geospatial plugin is running in production at Uber, where more than 90% of all geospatial traffic is completed within 5 minutes. Compared with a brute-force MapReduce execution, Uber's geospatial plugin is more than 50x faster, leading to greater efficiency.
Presto has an edge over Hive, as it can also be used to process unstructured data, and query processing in Presto is faster than in Hive. Geospatial data is collected in humongous amounts daily, and it needs to be extracted efficiently and judiciously to build better software that relies on it.
Original article source at: https://www.xenonstack.com/
terra is an R package for spatial data analysis. There are tutorials at rspatial.org/terra.
Stack Overflow is the best place to ask questions if you get stuck. Make sure to include a simple reproducible example. But if you think you have found a bug, please file an issue.
terra replaces the raster package. The interfaces of terra and raster are similar, but terra is simpler, faster and can do more.
terra is available from CRAN, so you can use install.packages("terra") to get the current released version.
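Once installed, a minimal sanity check might look like this (using the small elevation raster that ships with the package; see the tutorials for real workflows):
library(terra)
f <- system.file("ex/elev.tif", package = "terra")  # example raster shipped with terra
r <- rast(f)                                        # create a SpatRaster
r
plot(r)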
The easiest way to use the development version on Windows or macOS is to install it from the R-universe, like this:
install.packages('terra', repos='https://rspatial.r-universe.dev')
To install from source-code, first install the Rcpp package that terra depends on:
install.packages("Rcpp")
And then continue based on the OS you are using.
On Windows, you need to first install Rtools to get a C++ compiler that R can use. You need a recent version of Rtools42 (rtools42-5355-5357).
Then, in R, install the package.
Sys.setenv("R_REMOTES_NO_ERRORS_FROM_WARNINGS" = "true")
remotes::install_github("rspatial/terra")
On macOS, first install gdal and proj with homebrew
brew install pkg-config
brew install gdal
Followed by (note the additional configuration argument needed for the current homebrew version of proj, 9.1.0):
remotes::install_github("rspatial/terra", configure.args = "--with-proj-lib=/opt/homebrew/Cellar/proj/9.1.0/lib/")
To install the CRAN version from source you would do
install.packages("terra", configure.args = "--with-proj-lib=/opt/homebrew/Cellar/proj/9.1.0/lib/")
The easy way to install terra on Linux is with r2u.
The harder way: C++11, GDAL (>= 2.2.3), GEOS (>= 3.4.0), PROJ (>= 4.9.3) and sqlite3 are required, but more recent versions are highly recommended.
To install these system requirements on Ubuntu you can do:
sudo add-apt-repository ppa:ubuntugis/ubuntugis-unstable
sudo apt-get update
sudo apt-get install libgdal-dev libgeos-dev libproj-dev
And now, in R, install the package
remotes::install_github("rspatial/terra")
See the sf instructions for installation on other Linux systems, and for possible updates/improvements on the above instructions.
Author: rspatial
Source Code: https://github.com/rspatial/terra
License: GPL-3.0 license
This library supports reading ESRI Shapefiles in pure Julia.
Basic example of reading a shapefile from test cases:
using Shapefile
path = joinpath(dirname(pathof(Shapefile)),"..","test","shapelib_testcases","test.shp")
table = Shapefile.Table(path)
# if you only want the geometries and not the metadata in the DBF file
geoms = Shapefile.shapes(table)
# whole columns can be retrieved by their name
table.Descriptio # => Union{String, Missing}["Square with triangle missing", "Smaller triangle", missing]
# example function that iterates over the rows and gathers shapes that meet specific criteria
function selectshapes(table)
    geoms = empty(Shapefile.shapes(table))
    for row in table
        if !ismissing(row.TestDouble) && row.TestDouble < 2000.0
            push!(geoms, Shapefile.shape(row))
        end
    end
    return geoms
end
# the metadata can be converted to other Tables such as DataFrame
using DataFrames
df = DataFrame(table)
Shapefiles can contain multiple parts for each shape entity. Use GeoInterface.coordinates to fully decompose the shape data into parts.
# Example of converting the 1st shape of the file into parts (array of coordinates)
julia> GeoInterface.coordinates(Shapefile.shape(first(table)))
2-element Vector{Vector{Vector{Vector{Float64}}}}:
[[[20.0, 20.0], [20.0, 30.0], [30.0, 30.0], [20.0, 20.0]]]
[[[0.0, 0.0], [100.0, 0.0], [100.0, 100.0], [0.0, 100.0], [0.0, 0.0]]]
If you want another lightweight pure Julia package for reading feature files, consider also GeoJSON.jl.
For much more fully featured support for reading and writing geospatial data, at the cost of a larger binary dependency, look at GDAL.jl or ArchGDAL.jl packages. The latter builds a higher level API on top of GDAL.jl.
Author: JuliaGeo
Source Code: https://github.com/JuliaGeo/Shapefile.jl
License: View license
NetCDF support for the Julia programming language. There is a high-level and a medium-level interface for writing and reading NetCDF files.
Installation
pkg> add NetCDF
Quickstart
First, load the library:
using NetCDF
The high-level interface is quite similar to the Matlab NetCDF interface, reading files is done by:
x = ncread("myfile.nc", "Radiation")
which will read the variable called "Radiation" from the file "myfile.nc". General information can be gained by using
ncinfo(filename)
which gives an overview of the dimensions, variables and attributes stored in the file.
filename = "myfile.nc"
varname = "var1"
attribs = Dict("units" => "mm/d",
"data_min" => 0.0,
"data_max" => 87.0)
Creating variables and files is done by using the nccreate command:
nccreate(filename, varname, "x1", collect(11:20), "t", 20, Dict("units"=>"s"), atts=attribs)
This will create the variable called var1 in the file myfile.nc. The attributes defined in the Dict attribs are written to the file and are associated with the newly created variable. The variable has the dimensions "x1" and "t". If the dimensions do not exist yet in the file, they will be created. The dimension "x1" will be of length 10 and have the values 11..20, and the dimension "t" will have length 20 and the attribute "units" with the value "s".
Now we can write data to the file:
d = rand(10, 20)
ncwrite(d, filename, varname)
The full documentation can be found here
An alternative interface for reading NetCDF files can be found here: Alexander-Barth/NCDatasets.jl
DiskArray interface
As of version 0.9 we provide an interface to DiskArrays.jl, which lets you treat NetCDF variables as Julia AbstractArrays. Special methods for accessing, reading slices and reductions and broadcasting are implemented, so that working with the arrays should be efficient.
So the following returns a lazy AbstractArray container referencing the data in the variable.
v = NetCDF.open(filename, varname)
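Slicing this lazy array reads only the requested chunk from disk. For example, continuing from the variable created above (which has dimensions "x1" of length 10 and "t" of length 20):
first_step = v[:, 1]       # reads only the first time step from disk
println(size(first_step))  # (10,)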
This package was originally started and is mostly maintained by Fabian Gans (fgans@bgc-jena.mpg.de). The automatic C wrapper generator was contributed by Martijn Visser (https://github.com/visr). Many thanks to several people who contributed bug fixes and enhancements.
Author: JuliaGeo
Source Code: https://github.com/JuliaGeo/NetCDF.jl
License: View license
Geodesy is a Julia package for working with points in various world and local coordinate systems. The primary feature of Geodesy is to define and perform coordinate transformations in a convenient and safe framework, leveraging the CoordinateTransformations package. Transformations are accurate and efficient and implemented in native Julia code (with many functions being ported from Charles Karney's GeographicLib C++ library), and some common geodetic datums are provided for convenience.
Let's define a 3D point by its latitude, longitude and altitude (LLA):
x_lla = LLA(-27.468937, 153.023628, 0.0) # City Hall, Brisbane, Australia
This can be converted to a Cartesian Earth-Centered-Earth-Fixed (ECEF) coordinate simply by calling the constructor
x_ecef = ECEF(x_lla, wgs84)
Here we have used the WGS-84 ellipsoid to calculate the transformation, but other datums such as osgb36, nad27 and grs80 are provided. All transformations use the CoordinateTransformations' interface, and the above is short for
x_ecef = ECEFfromLLA(wgs84)(x_lla)
where ECEFfromLLA is a type inheriting from CoordinateTransformations' Transformation. (Similar names XfromY exist for each of the coordinate types.)
Often, points are measured or required in a local frame, such as the north-east-up coordinates with respect to a given origin. The ENU type represents points in this coordinate system and we may transform between ENU and globally referenced coordinates using ENUfromLLA, etc.
origin_lla = LLA(-27.468937, 153.023628, 0.0) # City Hall, Brisbane, Australia
point_lla = LLA(-27.465933, 153.025900, 0.0) # Central Station, Brisbane, Australia
# Define the transformation and execute it
trans = ENUfromLLA(origin_lla, wgs84)
point_enu = trans(point_lla)
# Equivalently
point_enu = ENU(point_lla, origin_lla, wgs84)
Similarly, we could convert to UTM/UPS coordinates, and two types are provided for this: UTM stores 3D coordinates x, y, and z in an unspecified zone, while UTMZ includes the zone number and hemisphere bool (where true = northern, false = southern). To get the canonical zone for your coordinates, simply use:
x_utmz = UTMZ(x_lla, wgs84)
If you are transforming a large number of points to or from a given zone, it may be more effective to define the transformation explicitly and use the lighter UTM storage type.
points_lla::Vector{LLA{Float64}}
utm_from_lla = UTMfromLLA(56, false, wgs84) # Zone 56-South
points_utm = map(utm_from_lla, points_lla) # A new vector of UTM coordinates
Geodesy becomes particularly powerful when you chain together transformations. For example, you can define a single transformation from your data on disk in UTM coordinates to a local frame in ENU coordinates. Internally, this will perform UTM (+ zone) → LLA → ECEF → ENU via composing transformations with ∘ into a ComposedTransformation:
julia> origin = LLA(-27.468937, 153.023628, 0.0) # City Hall, Brisbane, Australia
LLA(lat=-27.468937°, lon=153.023628°, alt=0.0)
julia> trans = ENUfromUTMZ(origin, wgs84)
(ENUfromECEF(ECEF(-5.046925124630393e6, 2.5689157252069353e6, -2.924416653602336e6), lat=-27.468937°, lon=153.023628°) ∘ (ECEFfromLLA(wgs84) ∘ LLAfromUTMZ(wgs84)))
This transformation can then be composed with rotations and translations in CoordinateTransformations (or your own custom-defined AbstractTransformation) to define further reference frames. For example, in this way, a point measured by a scanner on a moving vehicle at a particular time may be globally georeferenced with a single call to the Transformation!
Finally, the Cartesian distance between world points can be calculated via automatic transformation to a Cartesian frame:
x_lla = LLA(-27.468937, 153.023628, 0.0) # City Hall, Brisbane, Australia
y_lla = LLA(-27.465933, 153.025900, 0.0) # Central Station, Brisbane, Australia
distance(x_lla, y_lla) # 401.54 meters
(assuming the wgs84 datum, which can be configured in distance(x, y, datum)).
This section describes some terminology and concepts that are relevant to Geodesy.jl, attempting to define Geodesy-specific jargon where possible. For a longer, less technical discussion with more historical context, ICSM's Fundamentals of Mapping page is highly recommended.
A position on the Earth can be given by some numerical coordinate values, but those don't mean much without more information. The extra information is called the Coordinate Reference System or CRS (also known as a Spatial Reference System or SRS). A CRS tells you two main things:
The full specification of a CRS can be complex, so a short label called a Spatial Reference IDentifier or SRID is usually used instead. For example, EPSG:4326 is one way to refer to the 2D WGS84 latitude and longitude you'd get from a mobile phone GPS device. An SRID is of the form AUTHORITY:CODE, where the code is a number and the authority is the name of an organization maintaining a list of codes with associated CRS information. There are services where you can look up a CRS, for example, http://epsg.io is a convenient interface to the SRIDs maintained by the European Petroleum Survey Group (EPSG) authority. Likewise, http://spatialreference.org is an open registry to which anyone can contribute.
When maintaining a spatial database, it's typical to define an internal list of SRIDs (effectively making your organization the authority), and a mapping from these to CRS information. A link back to a definitive SRID from an external authority should also be included where possible.
In spatial measurement and positioning, a datum is a set of reference objects with given coordinates, relative to which other objects may be positioned. For example, in traditional surveying a datum might comprise a pair of pegs in the ground, separated by a carefully measured distance. When surveying the position of an unknown but nearby point, the angle back to the original datum objects can be measured using a theodolite. After this, the relative position of the new point can be computed using simple triangulation. Repeating this trick with any of the now three known points, an entire triangulation network of surveyed objects can be extended outward. Any point surveyed relative to the network is said to be measured in the datum of the original objects. Datums are often named with an acronym, for example OSGB36 is the Ordnance Survey of Great Britain, 1936.
In the era of satellite geodesy, coordinates are determined for an object by timing signals from a satellite constellation (eg, the GPS satellites) and computing position relative to those satellites. Where is the datum here? At first glance the situation seems quite different from the traditional setup described above. However, the satellite positions as a function of time (ephemerides, in the jargon) must themselves be defined relative to some frame. This is done by continuously observing the satellites from a set of highly stable ground stations equipped with GPS receivers. It is the full set of these ground stations and their assigned coordinates which form the datum.
Let's inspect the flow of positional information in both cases:
datum object positions -> triangulation network -> newly surveyed point
datum object positions -> satellite ephemerides -> newly surveyed point
We see that the basic nature of a datum is precisely the same regardless of whether we're doing a traditional survey or using a GPS receiver.
Coordinates for new points are measured by transferring coordinates from the datum objects, as described above. However, how do we decide on coordinates for the datum objects themselves? This is purely a matter of convention, consistency and measurement.
For example, the International Terrestrial Reference System (ITRS) is a reference system that rotates with the Earth so that the average velocity of the crust is zero. That is, in this reference system the only crust movement is geophysical. Roughly speaking, the defining conventions for the ITRS are:
The precise conventions are defined in chapter 4 of the IERS conventions published by the International Earth Rotation and Reference Service (IERS). These conventions define an ideal reference system, but they're useless without physical measurements that give coordinates for a set of real world datum objects. The process of measuring and computing coordinates for datum objects is called realizing the reference system and the result is called a reference frame. For example, the International Terrestrial Reference Frame of 2014 (ITRF2014) realizes the ITRS conventions using raw measurement data gathered in the 25 years prior to 2014.
To measure and compute coordinates, several space geodesy techniques are used to gather raw measurement data; currently the IERS includes VLBI (very long baseline interferometry) of distant astronomical radio sources, SLR (satellite laser ranging), GPS (global positioning system) and DORIS (gosh these acronyms are tiring). The raw data is not in the form of positions, but must be condensed down in a large scale fitting problem, ideally by requiring physical and statistical consistency of all measurements, tying measurements at different sites together with physical models.
In geometry, a coordinate system is a system which uses one or more numbers, or coordinates to uniquely determine the position of a point in a mathematical space such as Euclidean space. For example, in geodesy a point is commonly referred to using geodetic latitude, longitude and height relative to a given reference ellipsoid; this is called a geodetic coordinate system.
An ellipsoid is chosen because it's a reasonable model for the shape of the Earth and its gravitational field without being overly complex; it has only a few parameters and a simple mathematical form. The term spheroid is also used because the ellipsoids in use today are rotationally symmetric around the pole. Note that there are several ways to define latitude on an ellipsoid. The most natural for geodesy is geodetic latitude, used by default because it's physically accessible in any location as a good approximation to the angle between the gravity vector and the equatorial plane. (This type of latitude is not an angle measured at the centre of the ellipsoid, which may be surprising if you're used to spherical coordinates!)
There are usually several useful coordinate systems for the same space. As well as the geodetic coordinates mentioned above, it's common to see
Different coordinate systems provide different coordinates for the same point, so it's obviously important to specify exactly which coordinate system you're using. In particular, you should specify which ellipsoid parameters are in use if you deal with latitude and longitude, as in principle you could have more than one ellipsoid. This is a point of confusion, because a datum in geodesy also comes with a reference ellipsoid as a very strong matter of convention (thus being called a geodetic datum).
With its conventional ellipsoid, a geodetic datum also defines a conventional geodetic coordinate system, thus bringing together concepts which are interconnected but conceptually distinct. To emphasize:
Geodesy provides several in-built coordinate storage types for convenience and safety. The philosophy is to avoid carrying around raw data in generic containers like Vectors with no concept of what coordinate system it is in.
LLA{T} - latitude, longitude and altitude
The global LLA type stores data in a lat-lon-alt order, where latitude and longitude are expected in degrees (not radians). A keyword constructor, LLA(lat=x, lon=y, alt=z), is also provided to help with having to remember the storage order.
LatLon{T} - latitude and longitude
The 2D LatLon type stores data in a lat-lon order, where latitude and longitude are expected in degrees (not radians). A keyword constructor, LatLon(lat=x, lon=y), is also provided. LatLon is currently the only supported 2D coordinate.
ECEF{T} - Earth-centered, Earth-fixed
The global ECEF type stores Cartesian coordinates x, y, z, according to the usual convention. Being a Cartesian frame, ECEF is a subtype of StaticArrays' StaticVector, and ECEF points can be added and subtracted with themselves and other vectors.
UTM{T} - universal transverse-Mercator
The UTM type encodes the easting x, northing y and height z of a UTM coordinate in an unspecified zone. This data type is also used to encode universal polar-stereographic (UPS) coordinates (where the zone is 0).
UTMZ{T} - universal transverse-Mercator + zone
In addition to the easting x, northing y and height z, the global UTMZ type also encodes the UTM zone and hemisphere, where zone is a UInt8 and hemisphere is a Bool for compact storage. The northern hemisphere is denoted as true, and the southern as false. Zone 0 corresponds to the UPS projection about the corresponding pole; otherwise zone is an integer between 1 and 60.
ENU{T} - east-north-up
The ENU type is a local Cartesian coordinate that encodes a point's distance towards east e, towards north n and upwards u with respect to an unspecified origin. Like ECEF, ENU is also a subtype of StaticVector.
Geodetic datums are modelled as subtypes of the abstract type Datum. The associated ellipsoid may be obtained by calling the ellipsoid() function, for example, ellipsoid(NAD83()).
There are several pre-defined datums. Worldwide datums include:
WGS84 - standard GPS datum for moderate precision work (representing both the latest frame realization, or, if a time is supplied, a discontinuous dynamic datum where the time looks up the frame implementation date in the broadcast ephemerides).
WGS84{GpsWeek} - specific realizations of the WGS84 frame.
ITRF{Year} - realizations of the International Terrestrial Reference System for high precision surveying.
National datums include:
OSGB36 - Ordnance Survey of Great Britain of 1936.
NAD27, NAD83 - North American Datums of 1927 and 1983, respectively.
GDA94 - Geocentric Datum of Australia, 1994.
Datums may also be passed to coordinate transformation constructors such as transverse-Mercator and polar-stereographic projections, in which case the associated ellipsoid will be extracted to form the transformation. For datums without extra parameters (everything except ITRF and WGS84{GpsWeek}) there is a standard instance defined to reduce the amount of brackets you have to type. For example, LLAfromECEF(NAD83()) and LLAfromECEF(nad83) are equivalent.
Geodesy provides two interfaces for changing coordinate systems.
"Transformations" are based on the CoordinateTransformations interface for defining AbstractTransformations, and allow the user to apply them by calling them, invert them with inv() and compose them with compose() or ∘. The transformations cache any possible pre-calculations for efficiency when the same transformation is applied to many points.
"Conversions" are based on type constructors, obeying simple syntax like LLA(ecef, datum). The datum or other information is always necessary, as no assumptions are made by Geodesy for safety and consistency reasons. Similarly, Base.convert is not defined because, without assumptions, it would require additional information. The main drawback of this approach is that some calculations may not be pre-cached (for instance, the origin of an ENU transformation).
LLA and ECEF
The LLAfromECEF and ECEFfromLLA transformations require an ellipsoidal datum to perform the conversion. The exact transformation is performed in both directions, using a port of the ECEF → LLA transformation from GeographicLib.
Note that in some cases where points are very close to the centre of the ellipsoid, multiple equivalent LLA points are valid solutions to the transformation problem. Here, as in GeographicLib, the point with the greatest altitude is chosen.
LLA and UTM/UTMZ
The LLAfromUTM(Z) and UTM(Z)fromLLA transformations also require an ellipsoidal datum to perform the conversion. The transformation retains a cache of the parameters used in the transformation, which in the case of the transverse-Mercator projection leads to a significant saving.
In all cases zone 0 corresponds to the UPS coordinate system, and the polar-stereographic projection of GeographicLib has been ported to Julia to perform the transformation.
An approximate, 6th-order expansion is used by default for the transverse-Mercator projection and its inverse (though orders 4-8 are defined). The algorithm is a native Julia port of that used in GeographicLib, and is accurate to nanometers for up to several UTM zones away from the reference meridian. However, the series expansion diverges at ±90° from the reference meridian. While the UTMZ methods will automatically choose the canonical zone and hemisphere for the input, extreme care must be taken to choose an appropriate zone for the UTM methods. (In the future, the exact UTM transformation may be implemented as a fallback; contributions welcome!)
There are also UTMfromUTMZ and UTMZfromUTM transformations that are helpful for converting between these two formats and putting data into the same UTM zone.
ENU frames
The ECEFfromENU and ENUfromECEF transformations define the transformation around a specific origin. Both the origin coordinates as an ECEF as well as its corresponding latitude and longitude are stored in the transformation for maximal efficiency when performing multiple transforms. The transformation can be inverted with inv to perform the reverse transformation with respect to the same origin.
We support the Web Mercator / Pseudo Mercator projection with the WebMercatorfromLLA and LLAfromWebMercator transformations for interoperability with many web mapping systems. The scaling of the northing and easting is defined to be meters at the Equator, the same as how proj handles this (see https://proj.org/operations/projections/webmerc.html).
If you need to deal with web mapping tile coordinate systems (zoom levels and pixel coordinates, etc) these could be added by composing another transformation on top of the web mercator projection defined in this package.
Many other methods are defined as convenience constructors for composed transformations, to go between any two of the coordinate types defined here. These include:
ECEFfromUTMZ(datum) = ECEFfromLLA(datum) ∘ LLAfromUTMZ(datum)
UTMZfromECEF(datum) = UTMZfromLLA(datum) ∘ LLAfromECEF(datum)
UTMfromECEF(zone, hemisphere, datum) = UTMfromLLA(zone, hemisphere, datum) ∘ LLAfromECEF(datum)
ECEFfromUTM(zone, hemisphere, datum) = ECEFfromLLA(datum) ∘ LLAfromUTM(zone, hemisphere, datum)
ENUfromLLA(origin, datum) = ENUfromECEF(origin, datum) ∘ ECEFfromLLA(datum)
LLAfromENU(origin, datum) = LLAfromECEF(datum) ∘ ECEFfromENU(origin, datum)
ENUfromUTMZ(origin, datum) = ENUfromLLA(origin, datum) ∘ LLAfromUTMZ(datum)
UTMZfromENU(origin, datum) = UTMZfromLLA(datum) ∘ LLAfromENU(origin, datum)
UTMfromENU(origin, zone, hemisphere, datum) = UTMfromLLA(zone, hemisphere, datum) ∘ LLAfromENU(origin, datum)
ENUfromUTM(origin, zone, hemisphere, datum) = ENUfromLLA(origin, datum) ∘ LLAfromUTM(zone, hemisphere, datum)
Constructor-based transforms for these are also provided, such as UTMZ(ecef, datum), which converts to LLA as an intermediary, as above. When converting multiple points to or from the same ENU reference frame, it is recommended to use the transformation-based approach for efficiency. However, the other constructor-based conversions should be similar in speed to their transformation counterparts.
Currently, the only defined distance measure is the straight-line or Euclidean distance, euclidean_distance(x, y, [datum = wgs84]), which works for all combinations of types for x and y, except that the UTM zone and hemisphere must also be provided for UTM types, as in euclidean_distance(utm1, utm2, zone, hemisphere, [datum = wgs84]) (the Cartesian distance for UTM types is not approximated, but achieved via conversion to ECEF).
This is the only function currently in Geodesy which takes a default datum, and should be relatively accurate for close points where Euclidean distances are most important. Future work may focus on geodesics and related calculations (contributions welcome!).
Author: JuliaGeo
Source Code: https://github.com/JuliaGeo/Geodesy.jl
License: View license
Julia wrapper for GDAL - Geospatial Data Abstraction Library. This package is a binding to the C API of GDAL/OGR. It provides only a C style usage, where resources must be closed manually, and datasets are pointers.
Other packages can build on top of this to provide a more Julian user experience. See for example ArchGDAL.jl.
Most users will want to use ArchGDAL.jl instead of using GDAL.jl directly.
This package is registered, so add it using Pkg. This will also download GDAL binaries created in Yggdrasil.
pkg> add GDAL
To check if it is installed correctly, you could run the test suite with:
pkg> test GDAL
Docstrings are automatically inserted from the GDAL documentation. Note that these are written for the C API, so function names and argument type names will differ.
julia> using GDAL
help?> GDAL.ogr_g_creategeometry
OGR_G_CreateGeometry(OGRwkbGeometryType eGeometryType) -> OGRGeometryH
Create an empty geometry of desired type.
Parameters
––––––––––––
• eGeometryType: the type code of the geometry to be created.
Returns
–––––––––
handle to the newly created geometry or NULL on failure. Should be freed with OGR_G_DestroyGeometry() after use.
Further usage documentation is not yet available, but the files test/tutorial_raster.jl and test/tutorial_vector.jl should provide a good hint based on the API tutorials from GDAL.org.
The bulk of this package is generated automatically by the scripts under gen/.
The provided GDAL installation also contains the commonly used utilities such as gdal_translate and ogr2ogr. They can be called from Julia like so:
using GDAL
# list information about a raster dataset
GDAL.gdalinfo_path() do gdalinfo
    run(`$gdalinfo path/to/raster-file`)
end
# convert raster data between different formats
GDAL.gdal_translate_path() do gdal_translate
    run(`$gdal_translate -of COG input.asc output.tif`)
end
# list information about an OGR-supported data source
GDAL.ogrinfo_path() do ogrinfo
    run(`$ogrinfo path/to/vector-file`)
end
# convert simple features data between file formats
GDAL.ogr2ogr_path() do ogr2ogr
    run(`$ogr2ogr -f FlatGeobuf output.fgb input.shp`)
end
The GDAL.<util>_path functions are defined in the GDAL_jll package. If you only wish to run the utilities, that package will have all you need. A list of the available utilities can be found here. Documentation for them is available on gdal.org/programs. Note that programs implemented in Python (ending in .py) are not available, since those would require a Python installation.
Since GDAL 2.1's RFC59.1, most utilities are also available as functions in the library; they are implemented here and tested here. If these are used, you can avoid the need for calling the binaries.
If you want to use these utilities from outside julia, note that this will not work unless you set two things:
1. GDAL_DATA must be set to the value returned in julia by GDAL.GDAL_DATA[].
2. Sys.BINDIR must be in your path.
Inside of julia (2) is always the case, and (1) happens on loading the GDAL module, in its __init__ function.
If you get an error such as the one below:
GDALError (CE_Failure, code 6):
The <...> driver needs to be compiled to support <...>
This means that the GDAL binaries you are using, which normally come from the Yggdrasil community build tree, are not compiled with support for the format or feature you need. GDAL is a large library with many optional dependencies which allow support for more formats. Currently the number of formats supported is still limited, but it will grow over time. Lists of available formats can be found here for rasters and here for vectors. If you need support for another format, consider making an issue in this repository. Many formats need external libraries as added dependencies. This means a Yggdrasil build also needs to be available for that library, and added as a dependency. See issue #65 for a discussion on which new drivers should be prioritized.
Author: JuliaGeo
Source Code: https://github.com/JuliaGeo/GDAL.jl
License: View license
GDAL is a translator library for raster and vector geospatial data formats that is released under an X/MIT license by the Open Source Geospatial Foundation. As a library, it presents an abstract data model to drivers for various raster and vector formats.
This package aims to be a complete solution for working with GDAL in Julia, similar in scope to the SWIG bindings for Python and the user-friendliness of Fiona and Rasterio. It builds on top of GDAL.jl, and provides a high level API for GDAL, espousing the following principles.
(adapted from: https://wiki.archlinux.org/index.php/Arch_Linux#Principles)
To install this package, run the following command in the Pkg REPL-mode,
pkg> add ArchGDAL
To test if it is installed correctly,
pkg> test ArchGDAL
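Once installed, reading a raster might look like the following minimal sketch (the file name example.tif is hypothetical; ArchGDAL.read opens the dataset and closes it when the do-block ends):
using ArchGDAL

ArchGDAL.read("example.tif") do dataset
    println((ArchGDAL.width(dataset), ArchGDAL.height(dataset)))  # raster size in pixels
    band = ArchGDAL.getband(dataset, 1)                           # first raster band
    values = ArchGDAL.read(band)                                  # read the band into a Julia array
    println(extrema(values))                                      # min and max cell values
end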
This package will not be possible without JuliaLang, GDAL and GDAL.jl. They are maintained by https://julialang.org/community/, https://www.osgeo.org/ and https://juliageo.org/ respectively. In case of any contention for support and involvement, we encourage participation and contributions to those projects and communities over this package.
ArchGDAL.jl uses JuliaFormatter.jl as an autoformatting tool, and uses the options in .JuliaFormatter.toml.
If you wish to format code, cd to the ArchGDAL.jl directory, then run:
] add JuliaFormatter
using JuliaFormatter
format(".")
To manage the dependencies of this package, we work with environments:
Navigate to the directory corresponding to the package:
$ cd /Users/yeesian/.julia/dev/ArchGDAL
/Users/yeesian/.julia/dev/ArchGDAL
Start a session:
$ julia --project
Activate the environment (corresponding to Project.toml):
(@v1.6) pkg> activate .
Activating environment at `~/.julia/environments/v1.6/Project.toml`
Manage the dependencies using Pkg in https://pkgdocs.julialang.org/v1.6/managing-packages/, e.g.
(ArchGDAL) pkg> st
Project ArchGDAL v0.6.0
Status `~/.julia/dev/ArchGDAL/Project.toml`
[3c3547ce] DiskArrays
[add2ef01] GDAL
[68eda718] GeoFormatTypes
[cf35fbd7] GeoInterface
[bd369af6] Tables
[ade2ca70] Dates
(ArchGDAL) pkg> add CEnum
Resolving package versions...
Updating `~/.julia/dev/ArchGDAL/Project.toml`
[fa961155] + CEnum v0.4.1
[3c3547ce] + DiskArrays v0.2.7
[add2ef01] + GDAL v1.2.1
[68eda718] + GeoFormatTypes v0.3.0
[cf35fbd7] + GeoInterface v0.5.5
[bd369af6] + Tables v1.4.2
Update the [compat] section of Project.toml so that julia can resolve the versions, e.g.
[compat]
...
CEnum = "0.4"
Author: Yeesian
Source Code: https://github.com/yeesian/ArchGDAL.jl
License: View license
This module adapts d3-delaunay for spherical data. Given a set of objects in spherical coordinates, it computes their Delaunay triangulation and its dual, the Voronoi diagram.
In addition, it offers convenience methods to extract the convex hull, the Urquhart graph, the circumcenters of the Delaunay triangles, and to find the cell that contains any given point on the sphere.
The module offers two APIs.
The GeoJSON API is the most convenient for drawing the results on a map. It follows as closely as possible the API of the d3-voronoi module. It is available with d3.geoVoronoi().
A lighter API is available with d3.geoDelaunay(). It offers the same contents, but with a different presentation, where every vertex, edge, polygon… is referenced by an id rather than by its coordinates. This allows a more compact representation in memory and eases topology computations.
If you use npm, npm install d3-geo-voronoi
. You can also download the latest release on GitHub. For vanilla HTML in modern browsers, import d3-geo-voronoi from Skypack:
<script type="module">
import {geoDelaunay} from "https://cdn.skypack.dev/d3-geo-voronoi@2";
</script>
For legacy environments, you can load d3-geo-voronoi’s UMD bundle from an npm-based CDN such as jsDelivr; a d3 global is exported:
<script src="https://cdn.jsdelivr.net/npm/d3-geo-voronoi@2"></script>
<script>
d3.geoContour();
</script>
This API is a similar to d3-delaunay’s API. It provides information on the Delaunay triangulation (edges, triangles, neighbors, Voronoi cells, etc) as indices in two arrays — the array of points, and the array of circumcenters. It facilitates topological computations. To draw the actual triangles, Voronoi cells etc, the Voronoi API described in the next section will often be easier to use.
# d3.geoDelaunay([data]) · Source
Creates a new spherical Delaunay layout. data must be passed as an array of [lon, lat] coordinates.
# delaunay.find(lon, lat[, node])
Returns the closest point to [lon, lat]; optionally starting the search at node to boost the performance.
# delaunay.urquhart([distances])
Given a vector of distances (in the same order as the edges list), returns a vector of boolean values: true if the edge belongs to the Urquhart graph, false otherwise.
# delaunay.hull()
Returns an array of indices of points on the hull. The array is empty if the points cover more than a hemisphere.
# delaunay.edges
An array of edges as indices of points [from, to].
# delaunay.triangles
An array of the triangles, as indices of points [a, b, c]. The triangles are orientated in a clockwise manner, triangles that span more than the hemisphere are removed.
# delaunay.centers
The array of centers in spherical coordinates; the first t centers are the t triangles' circumcenters. More centers might be listed in order to build the Voronoi diagram for smaller numbers of points (n≤3).
# delaunay.neighbors
The array of neighbors indices for each vertex.
# delaunay.polygons
Array of Voronoi cells for each vertex. Each cell is an array of centers ordered in a clockwise manner.
# delaunay.mesh
An array containing all the edges of the Voronoi polygons.
This API is a wrapper around the Delaunay API, with inputs and outputs in GeoJSON, ready to draw on a map.
# d3.geoVoronoi([data]) · Source, Examples
Creates a new spherical Voronoi layout. data can be passed as an array of [lon, lat] coordinates, an array of GeoJSON features, or a GeoJSON FeatureCollection.
The following methods are similar to d3-voronoi's methods:
# voronoi.delaunay
The geoDelaunay object used to compute this diagram.
# voronoi.x([x])
Sets or returns the x accessor. The default x and y accessors are smart enough to recognize GeoJSON objects and return the geoCentroid of each feature.
# voronoi.y([y])
Sets or returns the y accessor.
# voronoi.polygons([data]) · Source, Examples
Returns the Voronoi tessellation of the data as a GeoJSON collection of polygons. (If there is only one data point, returns the Sphere). Each polygon exposes its datum in its properties.
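For example, a minimal sketch (the three [lon, lat] sites are made up):
<script type="module">
import {geoVoronoi} from "https://cdn.skypack.dev/d3-geo-voronoi@2";

const sites = [[0, 0], [10, 0], [0, 10]];    // three [lon, lat] points
const cells = geoVoronoi(sites).polygons();  // GeoJSON FeatureCollection of spherical cells
console.log(cells.features.length);          // 3: one cell per site
</script>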
# voronoi.cellMesh([data]) · Source
Returns the Voronoi tessellation as a GeoJSON mesh (MultiLineString).
# voronoi.triangles([data]) · Source, Examples
Returns the Delaunay triangulation of the data as a GeoJSON collection of polygons. Each triangle exposes in its properties the three sites, its spherical area (in steradians), and its circumcenter.
# voronoi.mesh([data]) · Source, Examples
Returns the Delaunay edges as a GeoJSON mesh (MultiLineString).
# voronoi.links([data]) · Source, Examples
Returns the Delaunay links of the data as a GeoJSON collection of lines. Each line exposes its source and target in its properties, but also its length (in radians), and a boolean flag for links that belong to the Urquhart graph.
voronoi.extent([extent]) and voronoi.size([size]) are not implemented.
Indeed, defining the “paper extent” of the geoVoronoi polygons can be quite tricky, as this block demonstrates.
# voronoi.find(x,y,[angle]) · Source, Examples
Finds the closest site to point x,y, i.e. the Voronoi polygon that contains it. Optionally, return null if the distance between the point and the site is larger than angle degrees.
# voronoi.hull(data) · Source, Examples
Returns the spherical convex hull of the data array, as a GeoJSON polygon. Returns null if the dataset spans more than a hemisphere. Equivalent to:
voronoi(data).hull();
Create spherical contours for non-gridded data.
The API of geoContour is similar to that of d3-contour and d3-tricontour:
# d3.geoContour() · Source, Examples
Constructs a new geocontour generator with the default settings.
Returns an array of contours, one for each threshold. The contours are MultiPolygons in GeoJSON format, that contain all the points with a value larger than the threshold. The value is indicated as geometry.value.
The data is passed as an array of points, by default with the format [lon, lat, value].
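For instance, a minimal sketch (the [lon, lat, value] triples are made up):
<script type="module">
import {geoContour} from "https://cdn.skypack.dev/d3-geo-voronoi@2";

const data = [[0, 0, 1], [10, 0, 3], [0, 10, 2], [10, 10, 5]];  // [lon, lat, value]
const contours = geoContour()(data);                            // one MultiPolygon per threshold
console.log(contours.map(c => c.value));                        // threshold value of each contour
</script>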
# geocontour.contour(data[, threshold])
Returns a contour, as a MultiPolygon in GeoJSON format, containing all points with a value larger or equal to threshold. The threshold is indicated as geometry.value
# geocontour.contours(data)
Returns an iterable over the contours.
# geocontour.isobands(data)
Returns an iterable over the isobands: contours between pairs of consecutive threshold values v0 (inclusive) and v1 (exclusive). geometry.value is equal to v0, geometry.valueMax to v1.
# geocontour.x([x])
Sets the x (longitude) accessor. Defaults to `d => d[0]`. If x is not given, returns the current x accessor.
# geocontour.y([y])
Sets the y (latitude) accessor. Defaults to `d => d[1]`. If y is not given, returns the current y accessor.
# geocontour.value([value])
Sets the value accessor. Defaults to `d => d[2]`. Values must be defined and finite. If value is not given, returns the current value accessor.
# geocontour.thresholds([thresholds])
Sets the thresholds, either explicitly as an array of values, or as a count that will be passed to d3.ticks. If empty, returns the current thresholds.
Note: d3.geoContour uses the experimental API of d3-tricontour: triangulate, pointInterpolate and ringsort.
There is no reason to limit the display of Voronoi cells to the orthographic projection. The example below displays the Urquhart graph of top container ports on a Winkel tripel map.
Geo_triangulate converts GeoJSON to triangles for 3d rendering.
The Delaunay/Voronoi topology is quite different on the sphere and on the plane. This module deals with these differences by first projecting the points with a stereographic projection, then stitching the geometries that are near the singularity of the projection (the “infinite horizon” on the plane is one point on the sphere).
geoVoronoi returns GeoJSON objects, which are often FeatureCollections. As a consequence, you will have to change .data(voronoi.polygons()) to .data(geovoronoi.polygons().features), and so on.
geoVoronoi is built on d3-delaunay, which is also exposed as d3.geoDelaunay in this library. If you want to have the fastest results, you should try to use d3.geoDelaunay directly (see the examples).
geoVoronoi and geoDelaunay offer methods to compute the spherical convex hull and the Urquhart graph of the data set. These can be achieved with the planar Voronoi (hull, Urquhart), but are not part of d3-voronoi or d3-delaunay.
Author: Fil
Source Code: https://github.com/Fil/d3-geo-voronoi
License: ISC license
The dataset used to create the examples below contains all places of worship across the United States and its territories. The data was collected and curated in 2019 and downloaded from Data World: US Places of Worship. Containing over 49,000 records, the dataset includes locations of places of worship, as well as the religious types that each facility is used by. This includes Buddhist, Christian, Hindu, Islamic, Judaic, and Sikh places of worship.
Conventional bar graphs will be used to familiarize us with the information. The same data will then be presented using geospatial visualizations (data placed on maps). Using maps can provide more context around the information, allowing readers to make more connections and obtain a better understanding.
It behooves us to first gain an understanding of the distribution of the most popular kinds of places of worship in the United States. Figure 1 provides an interesting spread, with Christian places of worship overwhelmingly the most common. The second most prevalent is Judaic, with Buddhist and Muslim places of worship following. Hindu and other religious places of worship are the least common in the United States.
Figure 2 provides a view of the total number of places of worship in each state and territory. Additionally, we can see the split of which types of places of worship are included in each state. California and Texas, being two of the three largest states, have the most places of worship.
By viewing the same data with a proportional symbol map in Figure 3, the average locations of the establishments within each state are evident. The size of each symbol represents how many of each religion type are represented in each state by the number of worship buildings. While present in all states, there is a concentration of Christian places of worship in the eastern half of the country. The non-green circles show us where other religious places of worship have a larger presence at the state level.
#data-visualization #religion #geospatial #visualization
Couchbase Full Text Search (FTS) is a great tool for indexing and querying geospatial data. In this article, I’ll present a geospatial search use case and demonstrate the various ways that we can perform a search of location data using the Couchbase Full Text Search service. I’ll be using Couchbase Server Enterprise Edition 6.6 (running locally in Docker) to create an FTS index on my sample geospatial dataset and then run geospatial queries against the index.
My family has always enjoyed visiting and exploring Great Smoky Mountains National Park (or GRSM, the National Park Service’s abbreviation), and one day we might be interested in relocating there. But you can’t live in the national park, so we need to consider the various cities and towns near the park and make a short list of the ones to evaluate and possibly visit.
#database #geospatial #geospatial search
A production-ready solution for dealing with GPS inaccuracies
The problem of GPS not being ideally accurate is not new, and neither is the “snap-to-road” functionality that matches GPS traces to roads. Most of the projects that have this problem solve it either through the Google Maps Snap to Roads API or the OSRM Match Service API. But I’m going to show you the path I took, and I didn’t regret anything.
Primarily, my problem was to calculate the distances travelled per day and per month by cars. What’s also important is that I was trying to build a production solution that could do this every day, so it was not only a research exercise.
Since I had multiple already-preprocessed GPS traces in GeoJSON, I created a Python script that calculates distances using the geopy.distance.geodesic function, looping over the whole trip’s geopoints (a minimal sketch of this approach follows the next paragraph). By my rough estimation, this approach sometimes had an error rate of more than 30%, which is actually horrible. Let me explain why.
As shown in the map below, sometimes GPS traces are not exactly accurate. Obviously, the driver was using the road path here, but the geopoints that we got from the GPS are placed near the road, not on it.
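To make that naive approach concrete, here is a minimal sketch (assuming points is a list of (lat, lon) tuples extracted from a GeoJSON trace; the coordinates are made up):
from geopy.distance import geodesic

points = [(50.4501, 30.5234), (50.4512, 30.5301), (50.4523, 30.5377)]  # (lat, lon) GPS fixes

# Sum the geodesic distance between consecutive fixes to get the trip length.
total_km = sum(
    geodesic(points[i], points[i + 1]).km
    for i in range(len(points) - 1)
)
print(round(total_km, 3))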
#pandas #gps #geospatial #valhalla #snap-to-road
This guide is intended to be quick and easy, with the least amount of words and least amount of code, to show you how to plot data from a Pandas object on a world map using Matplotlib and Geopandas libraries.
The python libraries you need to install are pandas, geopandas and matplotlib. You can find the code for this tutorial on github: https://github.com/udiy/blogposts/tree/main/Easy%20map
The data in the Pandas object needs to have location coordinates, which means latitude and longitude. For this article, I am using data about fires in Australia, which can be found here: https://www.kaggle.com/carlosparadis/fires-from-space-australia-and-new-zeland.
The data comes from the MODIS instrument on NASA satellites, which monitors the fires from space; find the acknowledgments at the end of this article. In the data, there’s a column called brightness, which is a measure of the temperature (in Kelvin) of the fire.
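Putting this together, a minimal sketch of the plot might look like the following (the CSV file name and column names are assumptions based on the fires dataset described above; gpd.datasets.get_path is deprecated in newer geopandas releases, so any world-boundaries file can be substituted):
import pandas as pd
import geopandas as gpd
import matplotlib.pyplot as plt

df = pd.read_csv("fires.csv")  # assumed columns: latitude, longitude, brightness
gdf = gpd.GeoDataFrame(
    df,
    geometry=gpd.points_from_xy(df.longitude, df.latitude),
    crs="EPSG:4326",
)

world = gpd.read_file(gpd.datasets.get_path("naturalearth_lowres"))  # world basemap
ax = world.plot(color="lightgrey", figsize=(12, 6))
gdf.plot(ax=ax, column="brightness", cmap="hot", markersize=1, legend=True)
plt.show()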
#gis #data-analysis #geospatial #python #pandas