dbbench
dbbench is a simple tool to benchmark or stress test databases. You can use the simple built-in benchmarks or run your own queries.
Attention: This tool comes with no warranty. Don't run it on production databases.
$ dbbench postgres --user postgres --pass example --iter 100000
inserts 6.199670776s 61996 ns/op
updates 7.74049898s 77404 ns/op
selects 2.911541197s 29115 ns/op
deletes 5.999572479s 59995 ns/op
total: 22.85141994s
Binaries are available for all major platforms. See the releases page. Unfortunately, cgo
is disabled for these builds, which means there is no SQLite support (#1).
Using the Homebrew package manager for macOS:
brew install sj14/tap/dbbench
It's also possible to install the current development snapshot with go get
(not recommended):
go get -u github.com/sj14/dbbench/cmd/dbbench
Database | Driver |
---|---|
Cassandra and compatible databases (e.g. ScyllaDB) | github.com/gocql/gocql |
MS SQL and compatible databases (no built-in benchmarks yet) | github.com/denisenkom/go-mssqldb |
MySQL and compatible databases (e.g. MariaDB and TiDB) | github.com/go-sql-driver/mysql |
PostgreSQL and compatible databases (e.g. CockroachDB) | github.com/lib/pq |
SQLite3 and compatible databases | github.com/mattn/go-sqlite3 |
Available subcommands:
cassandra|cockroach|mssql|mysql|postgres|sqlite
Use 'dbbench <subcommand> --help' to see all flags of the specified command.
Generic flags for all subcommands:
--clean only clean up benchmark data, e.g. after a crash
--iter int how many iterations should be run (default 1000)
--noclean keep benchmark data
--noinit do not initialize database and tables, e.g. when only running own script
--run string only run the specified benchmarks, e.g. "inserts deletes" (default "all")
--script string custom sql file to execute
--sleep duration how long to pause after each single benchmark (valid units: ns, us, ms, s, m, h)
--threads int max. number of green threads (iter >= threads > 0) (default 25)
--version print version information
You can run your own SQL statements with the --script flag. You can use the auto-generated tables. Beware of the file size, as the script will be loaded into memory completely.
The script must contain valid SQL statements for your database.
There are some built-in variables and functions which can be used in the script. It uses the Go template engine, with {{ and }} as delimiters. Functions are executed with the call command, and arguments are passed after the function name.
A new benchmark is created with the \benchmark keyword, followed by either once or loop. Optional parameters can be added afterwards on the same line.
See the usage description and the example subsection for more information.
Usage | Description |
---|---|
\benchmark once | Execute the following statements (lines) only once (e.g. to create and delete tables). |
\benchmark loop | Default mode. Execute the following statements (lines) in a loop. Executes them one after another and then starts a new iteration. Add another \benchmark loop to start another benchmark of statements. |
\name insert | Set a custom name for the DB statement(s), which will be output instead of the line numbers (insert is an exemplary name). |
Usage | Description |
---|---|
{{.Iter}} | The iteration counter. Returns 1 when used with \benchmark once. |
{{call .Seed 42}} | godoc (42 is an exemplary seed) |
{{call .RandInt63}} | godoc |
{{call .RandInt63n 9999}} | godoc (9999 is an exemplary upper limit) |
{{call .RandFloat32}} | godoc |
{{call .RandFloat64}} | godoc |
{{call .RandExpFloat64}} | godoc |
{{call .RandNormFloat64}} | godoc |
Exemplary sqlite_bench.sql file:
-- Create table
\benchmark once \name init
CREATE TABLE dbbench_simple (id INT PRIMARY KEY, balance DECIMAL);
-- How long do an insert and a delete take?
\benchmark loop \name single
INSERT INTO dbbench_simple (id, balance) VALUES({{.Iter}}, {{call .RandInt63}});
DELETE FROM dbbench_simple WHERE id = {{.Iter}};
-- How long does it take in a single transaction?
\benchmark loop \name batch
BEGIN TRANSACTION;
INSERT INTO dbbench_simple (id, balance) VALUES({{.Iter}}, {{call .RandInt63}});
DELETE FROM dbbench_simple WHERE id = {{.Iter}};
COMMIT;
-- Delete table
\benchmark once \name clean
DROP TABLE dbbench_simple;
In this script, we create and delete the table manually, so we pass the --noinit and --noclean flags; otherwise, dbbench would create (and clean up) this default table for us:
dbbench sqlite --script scripts/sqlite_bench.sql --iter 5000 --noinit --noclean
output:
(once) init: 3.404784ms 3404784 ns/op
(loop) single: 10.568390874s 2113678 ns/op
(loop) batch: 5.739021596s 1147804 ns/op
(once) clean: 1.065703ms 1065703 ns/op
total: 16.312319959s
Error message
failed to insert: UNIQUE constraint failed: dbbench_simple.id
Description
The previous data wasn't removed (e.g. because the benchmark was canceled). Run the same command again with the --clean flag attached, which will remove the old data. Then run the original command again.
Error message
failed to create table: Binary was compiled with 'CGO_ENABLED=0', go-sqlite3 requires cgo to work. This is a stub
Description
Currently, the released binary builds don't contain SQLite support. You have to compile dbbench manually, either from the particular release source code (recommended) or from the current master branch (not recommended).
Below are some examples of how to run different databases and the equivalent dbbench call for testing/developing.
docker run --name dbbench-cassandra -p 9042:9042 -d cassandra:latest
dbbench cassandra
# port 8080 is the webinterface (optional)
docker run --name dbbench-cockroach -d -p 26257:26257 -p 8080:8080 cockroachdb/cockroach:latest start --insecure
dbbench cockroach
docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=yourStrong(!)Password' -p 1433:1433 -d microsoft/mssql-server-linux
dbbench mssql -user sa -pass 'yourStrong(!)Password'
docker run --name dbbench-mariadb -p 3306:3306 -d -e MYSQL_ROOT_PASSWORD=root mariadb
dbbench mariadb
docker run --name dbbench-mysql -p 3306:3306 -d -e MYSQL_ROOT_PASSWORD=root mysql
dbbench mysql
docker run --name dbbench-postgres -p 5432:5432 -d postgres
dbbench postgres --user postgres --pass example
docker run --name dbbench-scylla -p 9042:9042 -d scylladb/scylla
dbbench scylla
dbbench sqlite
git clone https://github.com/pingcap/tidb-docker-compose.git
cd tidb-docker-compose && docker-compose pull
docker-compose up -d
dbbench tidb --pass '' --port 4000
Thanks to the authors of Go and those of the directly and indirectly used libraries, especially the driver developers. It wouldn't be possible without all your work.
This tool was highly inspired by the snippet from user Fale and the tool pgbench. Later, also inspired by MemSQL's dbbench which had the name and a similar idea before.
Author: SJ14
Source Code: https://github.com/sj14/dbbench
License: MIT License
The demand for delivering quality software faster — or “Quality at Speed” — requires organizations to search for solutions in Agile, continuous integration (CI), and DevOps methodologies. Test automation is an essential part of these approaches. The World Quality Report 2018–2019 suggests that test automation is the biggest bottleneck to delivering “Quality at Speed,” as it is an enabler of successful Agile and DevOps adoption.
Test automation cannot be realized without good tools, as they determine how automation is performed and whether its benefits can be delivered. Test automation tools are a crucial component of the DevOps toolchain. Current test automation trends increasingly apply artificial intelligence and machine learning (AI/ML) to offer advanced capabilities for test optimization and intelligent test generation, execution, and reporting. It is worthwhile to understand which tools are best poised to take advantage of these trends.
Challenge for brands: how to offer a seamless, fast, and user-friendly mobile experience?
App users have a low tolerance for slowness, with a reported 43% of users unhappy if they have to wait longer than three seconds for an app to load (App Samurai).
It’s not enough to ensure that your mobile app functions properly; you also need to test how it behaves on different devices, under heavy user load, over different network connections, and so on. It’s equally important to test different metrics on the client side as well as the server side. This is where finding the right tool or set of tools for mobile performance testing is essential.
After extensive research, I’ve put together a list of top-rated mobile performance testing tools and provided an overview of each below.
In the software development cycle, testing is one of the most important activities. There are many tools available in this space, such as JUnit and JMeter, covering manual, automation, and performance testing. Some of these are third-party tools with heavy license costs for a company to manage; for small start-up companies, these costs can be unbearable. Here we analyze a tool to make the process easier and more cost-effective.
The tool can have two parts: a main interface web page, where developers/testers fill in the details and start testing, and an onboarding template page, where the team can onboard new applications, templates, and stacks so that they appear on the main interface page.
The shift towards microservices and modular applications makes testing more important and more challenging at the same time. You have to make sure that the microservices running in containers perform well and as intended, but you can no longer rely on conventional testing strategies to get the job done.
This is where new testing approaches are needed. Testing your microservices applications require the right approach, a suitable set of tools, and immense attention to details. This article will guide you through the process of testing your microservices and talk about the challenges you will have to overcome along the way. Let’s get started, shall we?
Traditionally, testing a monolith application meant configuring a test environment and setting up all of the application components in a way that matched the production environment. It took time to set up the testing environment, and there were a lot of complexities around the process.
Testing also requires the application to run in full. It is not possible to test monolith apps on a per-component basis, mainly because there is usually a base code that ties everything together, and the app is designed to run as a complete app to work properly.
Microservices running in containers offer one particular advantage: universal compatibility. You don’t have to match the testing environment with the deployment architecture exactly, and you can get away with testing individual components rather than the full app in some situations.
Of course, you will have to embrace the new cloud-native approach across the pipeline. Rather than creating critical dependencies between microservices, you need to treat each one as a semi-independent module.
The only monolith or centralized portion of the application is the database, but this too is an easy challenge to overcome. As long as you have a persistent database running on your test environment, you can perform tests at any time.
Keep in mind that there are additional things to focus on when testing microservices.
Test containers are the method of choice for many developers. Unlike monolith apps, which let you use stubs and mocks for testing, microservices need to be tested in test containers. Many CI/CD pipelines actually integrate production microservices as part of the testing process.
As mentioned before, there are many ways to test microservices effectively, but the one approach that developers now use reliably is contract testing. Loosely coupled microservices can be tested in an effective and efficient way using contract testing, mainly because this testing approach focuses on contracts; in other words, it focuses on how components or microservices communicate with each other.
Syntax and semantics construct how components communicate with each other. By defining syntax and semantics in a standardized way and testing microservices based on their ability to generate the right message formats and meet behavioral expectations, you can rest assured knowing that the microservices will behave as intended when deployed.
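As a toy illustration of the idea, a consumer-side check might pin down the field names and JSON types it expects from a provider's response. Everything below (the contract map, the field names, the verify helper) is an invented sketch, not the API of a real contract-testing framework such as Pact:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// contract lists the fields the consumer expects in the provider's
// JSON response, and their expected JSON types (illustrative only).
var contract = map[string]string{
	"id":    "number",
	"email": "string",
}

// jsonType maps a decoded JSON value to a coarse type name.
func jsonType(v interface{}) string {
	switch v.(type) {
	case float64:
		return "number"
	case string:
		return "string"
	case bool:
		return "bool"
	default:
		return "other"
	}
}

// verify checks a provider response payload against the contract.
func verify(payload []byte) error {
	var m map[string]interface{}
	if err := json.Unmarshal(payload, &m); err != nil {
		return err
	}
	for field, want := range contract {
		got, ok := m[field]
		if !ok {
			return fmt.Errorf("missing field %q", field)
		}
		if t := jsonType(got); t != want {
			return fmt.Errorf("field %q: want %s, got %s", field, want, t)
		}
	}
	return nil
}

func main() {
	fmt.Println(verify([]byte(`{"id": 1, "email": "a@b.c"}`))) // <nil>
}
```

A provider that renames a field or changes its type fails this check immediately, without spinning up the consumer at all — which is the appeal of contract testing for loosely coupled services.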
It is easy to fall into the trap of making testing microservices complicated, but there are ways to avoid this problem. Testing microservices doesn’t have to be complicated at all when you have the right strategy in place.
There are several ways to test microservices too, including:
What’s important to note is that these testing approaches allow for asynchronous testing. After all, asynchronous development is what makes developing microservices so appealing in the first place. By allowing for asynchronous testing, you can also make sure that components or microservices can be updated independently of one another.
A simple boilerplate to set up authentication using django-allauth, with a custom template for login and registration using django-crispy-forms.
# clone the repo
$ git clone https://github.com/yezz123/Django-Authentication
# move to the project folder
$ cd Django-Authentication
Create a virtual environment for this project:
# creating a virtualenv environment for python 3
$ virtualenv venv
# activating the virtualenv environment
$ cd venv/bin # on a Windows environment, activate from the Scripts folder
# if you have multiple python 3 versions installed:
$ source ./activate
SECRET_KEY = #random string
DEBUG = #True or False
ALLOWED_HOSTS = #localhost
DATABASE_NAME = #database name (You can just use the default if you want to use SQLite)
DATABASE_USER = #database user for postgres
DATABASE_PASSWORD = #database password for postgres
DATABASE_HOST = #database host for postgres
DATABASE_PORT = #database port for postgres
ACCOUNT_EMAIL_VERIFICATION = #mandatory or optional
EMAIL_BACKEND = #email backend
EMAIL_HOST = #email host
EMAIL_HOST_PASSWORD = #email host password
EMAIL_USE_TLS = # if your email use tls
EMAIL_PORT = #email port
Change all the environment variables in .env.sample and don't forget to rename it to .env.
After setting up the environment, you can run the project using the Makefile provided in the project folder.
help:
@echo "Targets:"
@echo " make install" #install requirements
@echo " make makemigrations" #prepare migrations
@echo " make migrations" #migrate database
@echo " make createsuperuser" #create superuser
@echo " make run_server" #run the server
@echo " make lint" #lint the code using black
@echo " make test" #run the tests using Pytest
Includes preconfigured packages to kick-start Django-Authentication by just setting the appropriate configuration.
Package | Usage |
---|---|
django-allauth | Integrated set of Django applications addressing authentication, registration, account management as well as 3rd party (social) account authentication. |
django-crispy-forms | django-crispy-forms provides you with a crispy filter and {% crispy %} tag that will let you control the rendering behavior of your Django forms in a very elegant and DRY way. |
Download Details:
Author: yezz123
Source Code: https://github.com/yezz123/Django-Authentication
License: MIT License