DBInterface.jl: Database interface definitions for Julia

DBInterface.jl  

Purpose

DBInterface.jl provides interface definitions to allow common database operations to be implemented consistently across various database packages.

For Users

To use DBInterface.jl, select an implementing database package, then utilize the consistent DBInterface.jl interface methods:

conn = DBInterface.connect(T, args...; kw...) # create a connection to a specific database T; required parameters are database-specific

stmt = DBInterface.prepare(conn, sql) # prepare a sql statement against the connection; returns a statement object

results = DBInterface.execute(stmt) # execute a prepared statement; returns an iterator of rows (property-accessible & indexable)

rowid = DBInterface.lastrowid(results) # get the last row id of an INSERT statement, as supported by the database

# example of using a query resultset
for row in results
    @show propertynames(row) # see possible column names of row results
    row.col1 # access the value of a column named `col1`
    row[1] # access the first column in the row results
end

# results also implicitly satisfy the Tables.jl `Tables.rows` interface, so any compatible sink can ingest results
df = DataFrame(results)
CSV.write("results.csv", results)

results = DBInterface.execute(conn, sql) # convenience method if statement preparation/re-use isn't needed

stmt = DBInterface.prepare(conn, "INSERT INTO test_table VALUES(?, ?)") # prepare a statement with positional parameters

DBInterface.execute(stmt, [1, 3.14]) # execute the prepared INSERT statement, passing 1 and 3.14 as positional parameters

stmt = DBInterface.prepare(conn, "INSERT INTO test_table VALUES(:col1, :col2)") # prepare a statement with named parameters

DBInterface.execute(stmt, (col1=1, col2=3.14)) # execute the prepared INSERT statement, with 1 and 3.14 as named parameters

DBInterface.executemany(stmt, (col1=[1,2,3,4,5], col2=[3.14, 1.23, 2.34, 3.45, 4.56])) # execute the prepared statement multiple times for each set of named parameters; each named parameter must be an indexable collection

results = DBInterface.executemultiple(conn, sql) # where sql is a query that returns multiple resultsets

# first iterate through resultsets
for result in results
    # for each resultset, we can iterate through resultset rows
    for row in result
        @show propertynames(row)
        row.col1
        row[1]
    end
end

DBInterface.close!(stmt) # close the prepared statement
DBInterface.close!(conn) # close connection
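
Putting it together, here is a minimal end-to-end sketch. It assumes the SQLite.jl package (one DBInterface.jl implementation) and an in-memory database; the table and column names are made up for illustration:

using SQLite, DBInterface

conn = DBInterface.connect(SQLite.DB, ":memory:") # open an in-memory SQLite database
DBInterface.execute(conn, "CREATE TABLE test_table (col1 INTEGER, col2 REAL)") # hypothetical table

stmt = DBInterface.prepare(conn, "INSERT INTO test_table VALUES(:col1, :col2)")
DBInterface.executemany(stmt, (col1=[1, 2, 3], col2=[3.14, 1.23, 2.34]))
DBInterface.close!(stmt)

for row in DBInterface.execute(conn, "SELECT * FROM test_table")
    @show row.col1, row.col2 # print each row's column values
end

DBInterface.close!(conn)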

For Database Package Developers

See the documentation for expanded details on required interface methods.
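
For orientation, here is a rough, non-authoritative sketch of the shape of an implementing package. The concrete type names (MyConnection, MyStatement, MyCursor) are hypothetical; the documentation lists the full set of methods to overload:

import DBInterface

struct MyConnection <: DBInterface.Connection # hypothetical connection type
    # handle to the underlying database, connection options, etc.
end

struct MyStatement <: DBInterface.Statement # hypothetical prepared-statement type
    conn::MyConnection
    sql::String
end

struct MyCursor <: DBInterface.Cursor # hypothetical resultset type
    # query results; should satisfy the Tables.jl `Tables.rows` interface
end

DBInterface.connect(::Type{MyConnection}, args...; kw...) = MyConnection() # open the underlying database
DBInterface.prepare(conn::MyConnection, sql::AbstractString) = MyStatement(conn, sql) # prepare the statement
DBInterface.execute(stmt::MyStatement, params) = MyCursor() # run the statement, returning a row iterator
DBInterface.close!(stmt::MyStatement) = nothing # release statement resources
DBInterface.close!(conn::MyConnection) = nothing # release the connection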

Download Details:

Author: JuliaDatabases 
Source Code: https://github.com/JuliaDatabases/DBInterface.jl 
License: View license

#julia #database #interface 

LevelDB.jl: Julia interface to Google's LevelDB Key Value Database

LevelDB 

LevelDB is Google's open source on-disk key-value storage library that provides an ordered mapping from string keys to binary values. In many applications where only key-based access is needed, it tends to be a faster alternative to a full database. LevelDB is written in C++ with a C calling API included. This module provides a Julia interface to LevelDB using Julia's ccall mechanism.

Install LevelDB

You can build LevelDB from its source code at https://github.com/google/leveldb. Please install the final dynamic library into a system directory such as /usr/lib or make sure libleveldb.so is in one of your LD_LIBRARY_PATH directories. If libleveldb.so is not installed, Julia will try to download and build it automatically.

Run Testing Code

(v1.1) pkg> test LevelDB

This will exercise batched and non-batched writes and reads for string and float array values.

Create/Open/Close a LevelDB database

julia> db = LevelDB.DB(file_path; create_if_missing = false, error_if_exists = false)

Here file_path is the full path to a directory that hosts a LevelDB database. create_if_missing is a boolean flag; when true, the database will be created if it does not exist. error_if_exists is a boolean flag; when true, an error is thrown if the database already exists. The return value is a database object for passing to read/write calls.

julia> close(db)

Close a database, db is the object returned from a LevelDB.DB call. A directory can only be opened by a single LevelDB.DB at a time.
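
For example, a minimal open/close round trip against a scratch directory (the path here is purely illustrative):

julia> db = LevelDB.DB(joinpath(tempdir(), "level_db_example"); create_if_missing = true)

julia> close(db)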

Read and Write Operations

julia> db[key] = value

key and value are Array{UInt8}.

julia> db[key]

The return value is an Array{UInt8}; one can use the reinterpret function to cast it into the right array type (see the test code).

julia> delete!(db, key)

Delete a key from db.
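
For example, a small sketch (the key name is made up) that stores a Float64 array and reads it back:

julia> key = Vector{UInt8}("my_float_array")

julia> db[key] = collect(reinterpret(UInt8, [3.14, 1.59, 2.65])) # values must be Array{UInt8}

julia> reinterpret(Float64, db[key]) # recover the original Float64 values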

Batched Write

LevelDB supports grouping a number of put operations into a write batch; the batch either succeeds as a whole or fails altogether, behaving like an atomic update.

julia> db[keys] = values

keys and values must behave like iterators returning Array{UInt8}. This creates a write batch internally, which is then committed to db.
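
For example, a sketch (key names are illustrative) that writes three entries in one atomic batch:

julia> ks = [Vector{UInt8}("key$i") for i in 1:3]

julia> vs = [collect(reinterpret(UInt8, [Float64(i)])) for i in 1:3]

julia> db[ks] = vs # all three puts are committed as a single write batch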

Iterate

julia> for (key, value) in db
           #do something with the key value pair
       end

Iterate over all key => value pairs in a LevelDB.DB.

julia> for (key, value) in LevelDB.RangeView(db, key1, key2)
           #do something with the key value pair
       end

Iterate over a range between key1 and key2 (inclusive).

Authors

  • Jerry Zhenlei Cai ( jpenguin at gmail dot com )
  • Guido Kraemer

Additional contributions by:

  • @huwenshuo
  • @tmlbl

Download Details:

Author: jerryzhenleicai
Source Code: https://github.com/jerryzhenleicai/LevelDB.jl 
License: View license

#julia #leveldb #interface #database 

Ruth Nabimanya

System Databases in SQL Server

Introduction

In SSMS, many of us have noticed the System Databases folder under the Databases node, but how many of us know their purpose? In this article, let's discuss the system databases in SQL Server.

System Databases

Fig. 1 System Databases

There are five system databases; they are created when SQL Server is installed.

  • Master
  • Model
  • MSDB
  • Tempdb
  • Resource

Master

  • This database contains all the system-level information for SQL Server, stored in the form of metadata.
  • It is because of the master database that we are able to access SQL Server (on-premise SQL Server).

Model

  • This database is used as a template for new databases.
  • Whenever a new database is created, it starts as a copy of the model database.

MSDB

  • This database is where a service called SQL Server Agent stores its data.
  • SQL Server Agent is in charge of automation, which includes entities such as jobs, schedules, and alerts.

TempDB

  • Tempdb is where SQL Server stores temporary data such as work tables, sort space, and row versioning information.
  • Users can create their own temporary tables, and those are stored in tempdb.
  • This database is destroyed and recreated every time the SQL Server instance is restarted.

Resource

  • The Resource database is a hidden, read-only database that holds the definitions of all system objects.
  • When we query system objects in a database, they appear to reside in the sys schema of the local database, but their definitions actually reside in the Resource database.

#sql server #master system database #model system database #msdb system database #sql server system databases #ssms #system database #system databases in sql server #tempdb system database

Django-allauth: A simple Boilerplate to Setup Authentication

Django-Authentication 

A simple boilerplate to set up authentication using django-allauth, with custom templates for login and registration using django-crispy-forms.

Getting Started

Prerequisites

  • Python 3.8.6 or higher

Project setup

# clone the repo
$ git clone https://github.com/yezz123/Django-Authentication

# move to the project folder
$ cd Django-Authentication

Creating virtual environment

  • Create a virtual environment for this project:
# creating a virtual environment for python 3
$ virtualenv venv

# activating the virtual environment
$ cd venv/bin # on Windows, activate from the Scripts folder instead

# activate the environment
$ source ./activate

Configure the Environment

Environment variables

SECRET_KEY = #random string
DEBUG = #True or False
ALLOWED_HOSTS = #localhost
DATABASE_NAME = #database name (You can just use the default if you want to use SQLite)
DATABASE_USER = #database user for postgres
DATABASE_PASSWORD = #database password for postgres
DATABASE_HOST = #database host for postgres
DATABASE_PORT = #database port for postgres
ACCOUNT_EMAIL_VERIFICATION = #mandatory or optional
EMAIL_BACKEND = #email backend
EMAIL_HOST = #email host
EMAIL_HOST_PASSWORD = #email host password
EMAIL_USE_TLS = # if your email use tls
EMAIL_PORT = #email port

Change all the environment variables in .env.sample and don't forget to rename it to .env.

Run the project

After setting up the environment, you can run the project using the Makefile provided in the project folder.

help:
 @echo "Targets:"
 @echo "    make install" #install requirements
 @echo "    make makemigrations" #prepare migrations
 @echo "    make migrations" #migrate database
 @echo "    make createsuperuser" #create superuser
 @echo "    make run_server" #run the server
 @echo "    make lint" #lint the code using black
 @echo "    make test" #run the tests using Pytest

Preconfigured Packages

The boilerplate includes preconfigured packages to kick-start Django-Authentication; you only need to set the appropriate configuration.

Package — Usage

  • django-allauth — Integrated set of Django applications addressing authentication, registration, account management as well as 3rd party (social) account authentication.
  • django-crispy-forms — Provides you with a crispy filter and {% crispy %} tag that will let you control the rendering behavior of your Django forms in a very elegant and DRY way.

Contributing

  • Django-Authentication is a simple project, so you can contribute to it by just adding your code to the project to improve it.
  • If you have any questions, please feel free to open an issue or create a pull request.

Download Details:
Author: yezz123
Source Code: https://github.com/yezz123/Django-Authentication
License: MIT License

#django #python 

iOS App Dev

SingleStore: The One Stop Shop For Everything Data

  • SingleStore works toward helping businesses embrace digital innovation by operationalising “all data through one platform for all the moments that matter”

The pandemic has brought a period of transformation across businesses globally, pushing data and analytics to the forefront of decision making. Starting from enabling advanced data-driven operations to creating intelligent workflows, enterprise leaders have been looking to transform every part of their organisation.

SingleStore is one of the leading companies in the world, offering a unified database to facilitate fast analytics for organisations looking to embrace diverse data and accelerate their innovations. It provides an SQL platform to help companies aggregate, manage, and use the vast trove of data distributed across silos in multiple clouds and on-premise environments.

#featured #data analytics #data warehouse augmentation #database #database management #fast analytics #memsql #modern database #modernising data platforms #one stop shop for data #singlestore #singlestore data analytics #singlestore database #singlestore one stop shop for data #singlestore unified database #sql #sql database