You may think database maintenance is none of your business. But if you design your models proactively, you get databases that make life easier for those who have to maintain them.
A good database design requires proactivity, a well-regarded quality in any work environment. In case you are unfamiliar with the term, proactivity is the ability to anticipate problems and have solutions ready when problems occur – or better yet, plan and act so that problems don’t occur in the first place.
Employers understand that proactivity in their employees or contractors equals cost savings. That’s why they value it and why they encourage people to practice it.
In your role as a data modeler, the best way to demonstrate proactivity is to design models that anticipate and avoid problems that routinely plague database maintenance. Or, at least, that substantially simplify the solution to those problems.
Even if you are not responsible for database maintenance, modeling for easy database maintenance reaps many benefits. For example, it keeps you from being called at any time to solve data emergencies that take away valuable time you could be spending on the design or modeling tasks you enjoy so much!
When designing our databases, we need to think beyond the delivery of an ERD and the generation of update scripts. Once a database goes into production, maintenance engineers have to deal with all sorts of potential problems, and part of our task as database modelers is to minimize the chances that those problems occur.
Let’s start by looking at what it means to create a good database design and how that activity relates to regular database maintenance tasks.
Data modeling is the task of creating an abstract, usually graphical, representation of an information repository. The goal of data modeling is to expose the attributes of, and the relationships between, the entities whose data is stored in the repository.
Data models are built around the needs of a business problem. Rules and requirements are defined in advance through input from business experts so that they can be incorporated into the design of a new data repository or adapted in the iteration of an existing one.
Ideally, data models are living documents that evolve with changing business needs. They play an important role in supporting business decisions and in planning systems architecture and strategy. The data models must be kept in sync with the databases they represent so that they are useful to the maintenance routines of those databases.
Maintaining a database requires constant monitoring, automated or otherwise, to ensure it does not lose its virtues. Database maintenance best practices ensure databases always keep their integrity, information quality, performance, and security.
Many data modeling tips are available to help you create a good database design every time. The ones discussed below aim specifically at ensuring or facilitating the maintenance of the database qualities mentioned above.
A fundamental goal of database maintenance best practices is to ensure the information in the database keeps its integrity. This is critical to the users keeping their faith in the information.
There are two types of integrity: physical integrity and logical integrity.
Maintaining the physical integrity of a database is done by protecting the information from external factors such as hardware or power failures. The most common and widely accepted approach is through an adequate backup strategy that allows the recovery of a database in a reasonable time if a catastrophe destroys it.
For DBAs and server administrators who manage database storage, it is useful to know if databases can be partitioned into sections with different update frequencies. This allows them to optimize storage usage and backup plans.
Data models can reflect that partitioning by identifying areas of different data “temperature” and by grouping entities into those areas. “Temperature” refers to the frequency with which tables receive new information. Tables that are updated very frequently are the “hottest”; those that are never or rarely updated are the “coldest.”
Data model of an e-commerce system differentiating hot, warm, and cold data.
A DBA or system administrator can use this logical grouping to partition the database files and create different backup plans for each partition.
Maintaining the logical integrity of a database is essential for the reliability and usefulness of the information it delivers. If a database lacks logical integrity, the applications that use it reveal inconsistencies in the data sooner or later. Faced with these inconsistencies, users distrust the information and simply look for more reliable data sources.
Among the database maintenance tasks, maintaining the logical integrity of the information is an extension of the database modeling task, except that it begins after the database is put into production and continues throughout its lifetime. The most critical part of this area of maintenance is adapting to changes.
Changes in business rules or requirements are a constant threat to the logical integrity of databases. You may feel happy with the data model you have built, knowing that it is perfectly adapted to the business, that it responds with the right information to any query, and that it is free of insertion, update, and deletion anomalies. Enjoy this moment of satisfaction, because it is short-lived!
Maintenance of a database involves facing the need to make changes in the model daily. It forces you to add new objects or alter the existing ones, modify the cardinality of the relationships, redefine primary keys, change data types, and do other things that make us modelers shiver.
Changes happen all the time. It may be that a requirement was explained incorrectly from the beginning, that new requirements have surfaced, or that you have unintentionally introduced a flaw into your model (after all, we data modelers are only human).
Your models must be easy to modify when a need for changes arises. It is critical to use a database design tool for modeling that allows you to version your models, generate scripts to migrate a database from one version to another, and properly document every design decision.
Without these tools, every change you make to your design creates integrity risks that come to light at the most inopportune times. Vertabelo gives you all this functionality and takes care of maintaining the version history of a model without you even having to think about it.
The automatic versioning built into Vertabelo is a tremendous help in maintaining changes to a data model.
Change management and version control are also crucial factors in embedding data modeling activities into the software development lifecycle.
When you apply changes to a database in use, you need to be 100% sure that no information is lost and that its integrity is unaffected as a consequence of the changes. To do this, you can use refactoring techniques. They are normally applied when you want to improve a design without affecting its semantics, but they can also be used to correct design errors or adapt a model to new requirements.
There are a large number of refactoring techniques. They are usually employed to give new life to legacy databases, and there are textbook procedures that ensure the changes do not harm the existing information. Entire books have been written about it; I recommend you read them.
But to summarize, we can group refactoring techniques into the following categories: data quality, structural, referential integrity, architectural, and method refactorings.
Techniques that modify the semantics of the model, as well as those that do not alter the data model in any way, are not considered refactoring techniques. These include inserting rows into a table, adding a new column, creating a new table or view, and updating the data in a table.
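To make this concrete, here is a minimal sketch of what such a textbook procedure looks like (the table and column names are hypothetical): the classic "rename column" refactoring, done in phases so applications can migrate gradually.

-- Phase 1: introduce the new column alongside the old one.
ALTER TABLE customer ADD COLUMN phone_number varchar(20);

-- Phase 2: backfill the new column; during the transition period,
-- a synchronization trigger (omitted here) keeps both columns aligned.
UPDATE customer SET phone_number = phone;

-- Phase 3: once every application reads phone_number, drop the old column.
ALTER TABLE customer DROP COLUMN phone;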
The information quality in a database is the degree to which the data meets the organization’s expectations for accuracy, validity, completeness, and consistency. Maintaining data quality throughout the life cycle of a database is vital if its users are to make correct and informed decisions with the data in it.
Your responsibility as a data modeler is to ensure your models keep their information quality at the highest possible level. To do this:
Another major challenge in maintaining a database is preventing its growth from reaching the storage capacity limit unexpectedly. To help with storage space management, you can apply the same principle used in backup procedures: group the tables in your model according to the rate at which they grow.
A division into two areas is usually sufficient. Place the tables with frequent row additions in one area, those to which rows are rarely inserted in another. Having the model sectored this way allows storage administrators to partition the database files according to the growth rate of each area. They can distribute the partitions among different storage media with different capacities or growth possibilities.
A grouping of tables by their growth rate helps determine the storage requirements and manage its growth.
We create a data model expecting it to provide the information as it is at the time of the query. However, we tend to overlook the need for a database to remember everything that has happened in the past unless users specifically require it.
Part of maintaining a database is knowing how, when, why, and by whom a particular piece of data was altered. This may be for things such as finding out when a product price changed or reviewing changes in the medical record of a patient in a hospital. Logging can be used even to correct user or application errors since it allows you to roll back the state of information to a point in the past without the need to resort to complicated backup restoration procedures.
Again, even if users do not need it explicitly, considering the need for proactive logging is a very valuable means of facilitating database maintenance and demonstrating your ability to anticipate problems. Having logging data allows immediate responses when someone needs to review historical information.
There are different strategies for a database model to support logging, all of which add complexity to the model. One approach is called in-place logging, which adds columns to each table to record version information. This is a simple option that does not involve creating separate schemas or logging-specific tables. However, it does impact the model design because the original primary keys of the tables are no longer valid as primary keys – their values are repeated in rows that represent different versions of the same data.
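As a minimal sketch (table and column names are hypothetical), in-place logging might turn a simple product table into a versioned one. Note how the primary key has to be extended with version information to remain unique:

CREATE TABLE product (
  product_id  int           NOT NULL,
  version_no  int           NOT NULL,           -- increases with every change
  valid_from  timestamp     NOT NULL,
  valid_to    timestamp     NULL,               -- NULL marks the current version
  name        varchar(100)  NOT NULL,
  price       decimal(10,2) NOT NULL,
  changed_by  varchar(50)   NOT NULL,
  PRIMARY KEY (product_id, version_no)          -- product_id alone is no longer unique
);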
Another option to keep log information is to use shadow tables. Shadow tables are replicas of the model tables with the addition of columns to record log trail data. This strategy does not require modifying the tables in the original model, but you need to remember to update the corresponding shadow tables when you change your data model.
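Here is a sketch of the shadow-table variant for the same hypothetical product table; a trigger (not shown) would copy each affected row here on every insert, update, or delete:

CREATE TABLE product_shadow (
  product_id  int           NOT NULL,
  name        varchar(100)  NOT NULL,
  price       decimal(10,2) NOT NULL,
  log_action  char(1)       NOT NULL,           -- 'I', 'U', or 'D'
  log_time    timestamp     NOT NULL,
  log_user    varchar(50)   NOT NULL,
  PRIMARY KEY (product_id, log_time)
);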
Yet another strategy is to employ a subschema of generic tables that record every insertion, deletion, or modification to any other table.
Generic tables to keep an audit trail of a database.
This strategy has the advantage that it does not require modifications to the model for recording an audit trail. However, because it uses generic columns of the varchar type, it limits the types of data that can be recorded in the log trail.
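As an illustration (all names here are hypothetical), such a generic subschema might consist of a header table plus a detail table holding one row per changed column; the generic varchar value columns are exactly what limits the data types that can be logged:

CREATE TABLE audit_log (
  audit_id    bigint        NOT NULL PRIMARY KEY,
  table_name  varchar(128)  NOT NULL,
  row_key     varchar(128)  NOT NULL,           -- primary key of the affected row, as text
  operation   char(1)       NOT NULL,           -- 'I', 'U', or 'D'
  changed_at  timestamp     NOT NULL,
  changed_by  varchar(50)   NOT NULL
);

CREATE TABLE audit_log_detail (
  audit_id     bigint        NOT NULL REFERENCES audit_log (audit_id),
  column_name  varchar(128)  NOT NULL,
  old_value    varchar(4000) NULL,              -- non-text values must be cast to varchar
  new_value    varchar(4000) NULL,
  PRIMARY KEY (audit_id, column_name)
);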
Practically any database has good performance when it is just starting to be used and its tables contain only a few rows. But as soon as applications start to populate it with data, performance may degrade very quickly if precautions are not taken in designing the model. When this happens, DBAs and system administrators call on you to help them solve performance problems.
The automatic creation/suggestion of indexes on production databases is a useful tool for solving performance problems “in the heat of the moment.” Database engines can analyze database activities to see which operations take the longest and where there are opportunities to speed up by creating indexes.
However, it is much better to be proactive and anticipate the situation by defining indexes as part of the data model. This greatly reduces maintenance efforts for improving database performance. If you are not familiar with the benefits of database indexes, I suggest reading all about indexes, starting with the very basics.
There are practical rules that provide enough guidance for creating the most important indexes for efficient queries. The first is to generate indexes for the primary key of each table. Practically every RDBMS generates an index for each primary key automatically, so you can forget about this rule.
Another rule is to generate indexes for alternative keys of a table, particularly in tables for which a surrogate key is created. If a table has a natural key that is not used as a primary key, queries to join that table with others very likely do so with the natural key, not the surrogate. Those queries do not perform well unless you create an index on the natural key.
The next rule of thumb for indexes is to generate them for all fields that are foreign keys. These fields are great candidates for establishing joins with other tables. If they are included in indexes, they are used by query parsers to speed up execution and improve database performance.
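As a short sketch of the last two rules on hypothetical tables: a unique index on the natural key of a table that uses a surrogate primary key, plus indexes on foreign-key columns:

-- Natural (alternative) key: joins often arrive through the tax number, not the surrogate id.
CREATE UNIQUE INDEX ix_employee_tax_number ON employee (tax_number);

-- Foreign keys: most RDBMSs do not index these automatically.
CREATE INDEX ix_order_customer_id ON customer_order (customer_id);
CREATE INDEX ix_order_item_order_id ON order_item (order_id);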
Finally, it is a good idea to use a profiling tool on a staging or QA database during performance tests to detect any index creation opportunities that are not obvious. Incorporating the indexes suggested by the profiling tools into the data model is extremely helpful in achieving and maintaining the performance of the database once it is in production.
In your role as a data modeler, you can help maintain database security by providing a solid and secure base in which to store data for user authentication. Keep in mind this information is highly sensitive and must not be exposed to cyber-attacks.
For your design to simplify the maintenance of database security, follow the best practices for storing authentication data, chief among which is not to store passwords in the database, even in encrypted form. Storing only a hash of each user’s password allows an application to authenticate a login without creating any password exposure risk.
A complete schema for user authentication that includes columns for storing password hashes.
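As a minimal sketch of that idea (column names and sizes are assumptions, not a prescription), the application stores and compares only the hash:

CREATE TABLE app_user (
  user_id        int           NOT NULL PRIMARY KEY,
  login_name     varchar(100)  NOT NULL UNIQUE,
  password_hash  varchar(255)  NOT NULL,        -- output of a slow, salted hash such as bcrypt
  password_salt  varchar(64)   NULL,            -- only needed if the hash format does not embed the salt
  failed_logins  int           NOT NULL DEFAULT 0,
  locked_until   timestamp     NULL
);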
So, create your models for easy database maintenance by taking into account the tips given above. With more maintainable data models, your work looks better, and you gain the appreciation of DBAs, maintenance engineers, and system administrators.
You also invest in peace of mind. Creating easily maintainable databases means you can spend your working hours designing new data models, rather than running around patching databases that fail to deliver correct information on time.
Original article source at: https://www.vertabelo.com/
In this Postgres article, let's learn about Optimization: 10 Popular PostgreSQL Optimization Libraries
PostgreSQL is a powerful, open source object-relational database system that uses and extends the SQL language, combined with many features that safely store and scale the most complicated data workloads. The origins of PostgreSQL date back to 1986 as part of the POSTGRES project at the University of California, Berkeley, and the core platform has seen more than 30 years of active development.
PostgreSQL has earned a strong reputation for its proven architecture, reliability, data integrity, robust feature set, extensibility, and the dedication of the open source community behind the software to consistently deliver performant and innovative solutions. PostgreSQL runs on all major operating systems, has been ACID-compliant since 2001, and has powerful add-ons such as the popular PostGIS geospatial database extender. It is no surprise that PostgreSQL has become the open source relational database of choice for many people and organisations.
Just like any advanced relational database, PostgreSQL uses a cost-based query optimizer that tries to turn your SQL queries into something efficient that executes in as little time as possible.
A flamegraph generator for Postgres EXPLAIN ANALYZE output.
Installation
You can install via Homebrew with the following command:
$ brew install mgartner/tap/pg_flame
Download one of the compiled binaries in the releases tab. Once downloaded, move pg_flame into your $PATH.
Alternatively, if you'd like to use Docker to build the program, you can.
$ docker pull mgartner/pg_flame
If you'd like to build a binary from the source code, run the following commands. Note that compiling requires Go version 1.13+.
$ git clone https://github.com/mgartner/pg_flame.git
$ cd pg_flame
$ go build
A pg_flame binary will be created that you can place in your $PATH.
A performance dashboard for Postgres
Documentation
PgHero is available as a Docker image, Linux package, and Rails engine.
pgtune takes the wimpy default postgresql.conf and expands the database server to be as powerful as the hardware it's being deployed on.
There is no need to build/compile pgtune; it is a Python script. Extracting the tarball to a convenient location is sufficient. Note that you will also need the multiple pg_settings-<version>_<architecture> files included with the program; pgtune can't work without those.
The RPM package installs:
- The pgtune binary under /usr/bin
- Documents in /usr/share/doc/pgtune-$version
- Setting files in /usr/share/pgtune
pgtune works by taking an existing postgresql.conf file as input, making changes to it based on the amount of RAM in your server and the suggested workload, and outputting a new file.
Here's a sample usage:
pgtune -i $PGDATA/postgresql.conf -o $PGDATA/postgresql.conf.pgtune
pgtune --help will give you additional usage information. These are the current parameters:
- -i or --input-config : Specifies the current postgresql.conf file.
- -o or --output-config : Specifies the file name for the new postgresql.conf file.
- -M or --memory: Use this parameter to specify total system memory. If not specified, pgtune will attempt to detect memory size.
- -T or --type : Specifies database type. Valid options are: DW, OLTP, Web, Mixed, Desktop
- -P or --platform : Specifies platform, defaults to the platform running the program. Valid options are Windows, Linux, and Darwin (Mac OS X).
- -c or --connections: Specifies number of maximum connections expected. If not specified, it depends on database type.
- -D or --debug : Enables debugging mode.
- -S or --settings: Directory where the settings data files are located. Defaults to the directory where the script is being run from. The RPM package includes a patch to use the correct location these files were installed into.
Pgtune - tuning PostgreSQL config by your hardware
Tuning PostgreSQL config by your hardware. Based on original pgtune. Illustration by Kate.
Web app built on top of Middleman. To start it in development mode, you need to install Ruby and Node.js, then run in a terminal:
$ bundle # get all ruby deps
$ yarn # get all node.js deps
$ middleman server # start server on 4567 port
Web Based PostgreSQL configuration tool
You can try powa at demo-powa.anayrat.info. Just click "Login" and try its features! Note that in order to get interesting metrics, resources have been limited on this server (2 vCPU, 384MB of RAM and 150iops for the disks). Please be patient when using it.
Thanks to Adrien Nayrat for providing it.
PoWA (PostgreSQL Workload Analyzer) is a performance tool for PostgreSQL 9.4 and newer that allows you to collect, aggregate, and purge statistics on multiple PostgreSQL instances from various stat extensions.
Depending on your needs, you can either use the provided background worker (requires a PostgreSQL restart, better suited for single-instance setups) or the powa-collector daemon (does not require a PostgreSQL restart and can gather performance metrics from multiple instances, including standbys).
Web UI to view pg_stat_statements
To run it standalone:
- Install the pg_stat_statements extension and execute CREATE EXTENSION pg_stat_statements inside the database you want to inspect. (Hint: there is an awesome article about pg_stat_statements in Russian.)
- Update config.yml.example with your credentials and save it as config.yml.
- Run rake server (or run rake console to have a command line).

To use it inside a Rails application, add this line to your application's Gemfile:
gem 'pg_web_stats', require: 'pg_web_stats_app'
Or, if the gem is not released yet:
gem 'pg_web_stats', git: 'https://github.com/shhavel/pg_web_stats', require: 'pg_web_stats_app'
And then execute:
$ bundle
Create file config/initializers/pg_web_stats.rb
# Configure database connection
config_hash = YAML.load_file(Rails.root.join('config', 'database.yml'))[Rails.env]
PG_WEB_STATS = PgWebStats.new(config_hash)
# Restrict access to pg_web_stats with Basic Authentication
# (or use any other authentication system).
PgWebStatsApp.use(Rack::Auth::Basic) do |user, password|
password == "secret"
end
Add to routes.rb
mount PgWebStatsApp, at: '/pg_stats'
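Under the hood, dashboards like this read from the pg_stat_statements view. A minimal sketch of the kind of query involved (column names assume PostgreSQL 13+; older versions use total_time instead of total_exec_time):

SELECT query,
       calls,
       total_exec_time,   -- total milliseconds spent in this statement
       mean_exec_time,    -- average milliseconds per call
       rows
FROM pg_stat_statements
ORDER BY total_exec_time DESC
LIMIT 10;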
timescaledb-tune is a program for tuning a TimescaleDB database to perform its best based on the host's resources, such as memory and number of CPUs. It parses the existing postgresql.conf file to ensure that the TimescaleDB extension is appropriately installed and provides recommendations for memory, parallelism, WAL, and other settings.
You need the Go runtime (1.12+) installed, then simply go install this repo:
$ go install github.com/timescale/timescaledb-tune/cmd/timescaledb-tune@main
It is also available as a binary package on a variety of systems using Homebrew, yum, or apt. Search for timescaledb-tools.
By default, timescaledb-tune attempts to locate your postgresql.conf file for parsing by using heuristics based on the operating system, so the simplest invocation would be:
$ timescaledb-tune
You'll then be given a series of prompts that require minimal user input to make sure your config file is up to date:
Using postgresql.conf at this path:
/usr/local/var/postgres/postgresql.conf
Is this correct? [(y)es/(n)o]: y
Writing backup to:
/var/folders/cr/zpgdkv194vz1g5smxl_5tggm0000gn/T/timescaledb_tune.backup201901071520
shared_preload_libraries needs to be updated
Current:
#shared_preload_libraries = 'timescaledb'
Recommended:
shared_preload_libraries = 'timescaledb'
Is this okay? [(y)es/(n)o]: y
success: shared_preload_libraries will be updated
Tune memory/parallelism/WAL and other settings? [(y)es/(n)o]: y
Recommendations based on 8.00 GB of available memory and 4 CPUs for PostgreSQL 11
Memory settings recommendations
Current:
shared_buffers = 128MB
#effective_cache_size = 4GB
#maintenance_work_mem = 64MB
#work_mem = 4MB
Recommended:
shared_buffers = 2GB
effective_cache_size = 6GB
maintenance_work_mem = 1GB
work_mem = 26214kB
Is this okay? [(y)es/(s)kip/(q)uit]:
4 Ways To Optimise PostgreSQL Database With Millions of Data
The goal of database performance tuning is to minimize the response time of your queries by making the best use of your system resources. The best use of these resources involves minimizing network traffic, disk I/O, and CPU time.
Some of the tricks we used to speed up SELECT-s in PostgreSQL: LEFT JOIN with redundant conditions, VALUES, extended statistics, primary key type conversion, CLUSTER, pg_hint_plan + bonus.
As commercial database vendors are bragging about their capabilities we decided to push PostgreSQL to the next level and exceed 1 billion rows per second to show what we can do with Open Source. To those who need even more: 1 billion rows is by far not the limit - a lot more is possible.
PostgreSQL Query Optimization Techniques
Effective Web Solutions takes pride in its capacity to assess results when it comes to digital marketing. They not only measure their own performance, but they also compare it to that of their competitors. As the world of automation and artificial intelligence takes off, the importance of digital marketing, as well as the necessity for successful online marketing organizations, is expected to expand. But how do they assess success? Continue reading to find out. Here are a few of the most important ways they do it.
Social media is one of the fastest growing areas of online marketing. More individuals than ever before are interacting internationally and expressing their opinions and views about products, businesses, and services. With the use of social media, ESP Inspire can help your company break into new social circles. Creating an effective social media marketing plan needs knowledge and consistency. Here are some things to think about if you want to improve your social media presence. You may start constructing a social media marketing plan once you've determined your target audience.
Planning and implementing the appropriate message is one of the most critical phases in social media marketing. The main distinction between social media marketing and traditional forms of marketing is that social media channels allow the customer more influence over the message. You have control over the message and the substance in other kinds of marketing, but not on social media. The purpose of social media marketing is to get your company's name out there and in front of the right individuals. It's important to realize that your company isn't only communicating with its target demographic, regardless of the medium. They're going to make a big deal out of you!
Social media is becoming more widely used and has established itself as a vital route of communication between consumers and businesses. It links companies with their current and potential consumers. The company may use social media to interact with new and returning consumers. You may communicate directly with your consumers on social media and build a long-lasting brand. Social media is more accessible than ever before, and it's a very cost-effective way to communicate. To make your business stand out, Effective Web Solutions can create a social media marketing plan.
Offsite optimization is a crucial component of search engine optimization. To drive visitors and boost site authority, this method entails link building and smart placement. Offsite SEO has several advantages and may be quite valuable to a company. Creating and maintaining high-quality links is the greatest approach to ensure that your website ranks highly for the keywords that are most relevant to your business. Offsite optimization should be a priority alongside on-site optimization in any effective web solution.
Off-site optimization for a website entails a number of steps. It's a time-consuming operation that necessitates high-quality backlinks. Fortunately, there are a plethora of off-site optimization services available to assist. Continue reading to find out more about off-site SEO. You'll be pleased you took the time to do so. Some of the advantages of off-site SEO are as follows:
Increased website traffic is an important aspect of search engine optimization. Websites that are well-optimized generate more revenue and customers. Users who learn about companies and goods through search engines are more inclined to tell their friends about them. Offsite SEO methods are equally as vital as on-site SEO strategies. You can communicate to search engines that your material is valuable and useful by optimizing your pages for them.
Building links to your website is an important part of offsite SEO tactics. Link building is a crucial component of any digital marketing plan and an important aspect of SEO. It entails spreading the word about your articles and other stuff. This aids in the development of your backlink profile and the enhancement of your organic SEO rankings. Backlinks should be created for you by a competent website optimization business. This may be accomplished by producing a newsletter or a blog. It may also be done with the help of a spreadsheet application in some circumstances.
The first step in achieving exposure in search engine result pages is to create an SEO-friendly website. Web crawlers are used by search engines to index your sites and deliver relevant results. These crawlers are designed to operate with SEO web solutions, making your website SEO-friendly. Not every technology is designed with SEO in mind. Here are some suggestions for making your website more visible. Continue reading to find out more about SEO web solutions. Let's have a look at the various approaches.
When it comes to merging SEO with web design, there are several aspects to consider. The design must take into account the site's overall optimization. Content must be allocated by the designer, and marketing calls to action must be accommodated by the programmer. Finally, the SEO professional must engage with the design team to ensure that everything runs properly. The SEO professional will work with the entire team to seamlessly execute the overall look and feel of the site once the design and development teams have agreed on it.
ESP Inspire is a digital marketing firm that combines strategy, design, and technology in its work. Their strategy is built on the demands of their clients and the goals of their companies. ESP Inspire, a web development agency situated in California, is a strategic digital marketing firm. They focus on e-commerce, social media marketing, and B2B businesses with limited resources. Stealth Dicing, Makr Furniture, ISS, Remote Face, GDSI, and The Splash Lab are among the clients. The firm also collaborates with non-profit groups on initiatives.
timescaledb-tune is a program for tuning a TimescaleDB database to perform its best based on the host's resources, such as memory and number of CPUs. It parses the existing postgresql.conf file to ensure that the TimescaleDB extension is appropriately installed and provides recommendations for memory, parallelism, WAL, and other settings.
You need the Go runtime (1.12+) installed, then simply go install this repo:
$ go install github.com/timescale/timescaledb-tune/cmd/timescaledb-tune@main
It is also available as a binary package on a variety of systems using Homebrew, yum, or apt. Search for timescaledb-tools.
By default, timescaledb-tune attempts to locate your postgresql.conf file for parsing by using heuristics based on the operating system, so the simplest invocation would be:
$ timescaledb-tune
You'll then be given a series of prompts that require minimal user input to make sure your config file is up to date:
Using postgresql.conf at this path:
/usr/local/var/postgres/postgresql.conf
Is this correct? [(y)es/(n)o]: y
Writing backup to:
/var/folders/cr/zpgdkv194vz1g5smxl_5tggm0000gn/T/timescaledb_tune.backup201901071520
shared_preload_libraries needs to be updated
Current:
#shared_preload_libraries = 'timescaledb'
Recommended:
shared_preload_libraries = 'timescaledb'
Is this okay? [(y)es/(n)o]: y
success: shared_preload_libraries will be updated
Tune memory/parallelism/WAL and other settings? [(y)es/(n)o]: y
Recommendations based on 8.00 GB of available memory and 4 CPUs for PostgreSQL 11
Memory settings recommendations
Current:
shared_buffers = 128MB
#effective_cache_size = 4GB
#maintenance_work_mem = 64MB
#work_mem = 4MB
Recommended:
shared_buffers = 2GB
effective_cache_size = 6GB
maintenance_work_mem = 1GB
work_mem = 26214kB
Is this okay? [(y)es/(s)kip/(q)uit]:
If you have moved the configuration file to a different location, or auto-detection fails (file an issue please!), you can provide the location with the --conf-path flag:
$ timescaledb-tune --conf-path=/path/to/postgresql.conf
At the end, your postgresql.conf will be overwritten with the changes that you accepted from the prompts.
If you want recommendations for a specific amount of memory and/or CPUs:
$ timescaledb-tune --memory="4GB" --cpus=2
If you want to set a specific number of background workers (timescaledb.max_background_workers):
$ timescaledb-tune --max-bg-workers=16
If you have a dedicated disk for WAL, or want to specify how much of a shared disk should be used for WAL:
$ timescaledb-tune --wal-disk-size="10GB"
If you want to accept all recommendations, you can use --yes:
$ timescaledb-tune --yes
If you just want to see the recommendations without writing:
$ timescaledb-tune --dry-run
If there are too many prompts:
$ timescaledb-tune --quiet
And if you want to skip all prompts and get quiet output:
$ timescaledb-tune --quiet --yes
And if you want to append the recommendations to the end of your conf file instead of in-place replacement:
$ timescaledb-tune --quiet --yes --dry-run >> /path/to/postgresql.conf
timescaledb-tune makes a backup of your postgresql.conf file each time it runs (without the --dry-run flag) in your temp directory. If you find that the configuration given is not working well, you can restore a backup by using the --restore flag:
$ timescaledb-tune --restore
Using postgresql.conf at this path:
/usr/local/var/postgres/postgresql.conf
Is this correct? [(y)es/(n)o]: y
Available backups (most recent first):
1) timescaledb_tune.backup201901222056 (14 hours ago)
2) timescaledb_tune.backup201901221640 (18 hours ago)
3) timescaledb_tune.backup201901221050 (24 hours ago)
4) timescaledb_tune.backup201901211817 (41 hours ago)
Use which backup? Number or (q)uit: 1
Restoring 'timescaledb_tune.backup201901222056'...
success: restored successfully
We welcome contributions to this utility, which like TimescaleDB is released under the Apache2 Open Source License. The same Contributors Agreement applies; please sign the Contributor License Agreement (CLA) if you're a new contributor.
Author: timescale
Source Code: https://github.com/timescale/timescaledb-tune
License: Apache-2.0 License
In today’s video, we’ll check out the 5 best WordPress maintenance and support services.
Azure SQL Database and SQL Managed Instance are Microsoft's PaaS offerings for SQL Server in cloud infrastructure. With a traditional on-premises SQL Server, at certain times we require database system downtime to perform specific operations such as hardware upgrades and OS and SQL Server patching. In a critical production database system, it is challenging to get downtime and schedule these activities.
Like the on-premises infrastructure, Azure also performs planned maintenance for SQL Databases and SQL Managed Instances. Although Azure carries out the patching or maintenance, you might think of possible downtime so that you can plan your application availability.
Azure SQL Database and SQL Managed Instance offer the following guaranteed availability.
Therefore, to maintain this commitment and SLA, Azure uses a modern, robust service architecture to provide non-impactful and fully transparent service availability. For example, it uses hot patching (dynamic patching, or live update) to apply updates without restarting the services. A few updates do require a service restart, but those are handled within the defined SLA.
To configure the maintenance window during the Azure SQL database deployment, navigate to the Additional Settings tab on Create SQL database.
By default, it uses the system default maintenance window (5 PM to 8 AM); however, you can configure one of the two additional maintenance window options, as shown below.
You can also modify the maintenance window for the existing Azure SQL Database and SQL Managed Instance. In the Azure portal, navigate to Settings and Maintenance.
Currently, you do not get an option to configure the maintenance window in the following cases.
We can use Azure PowerShell to create a new Azure SQL Database with a specified maintenance window from the options shown earlier.
With machine learning, we can reduce maintenance efforts and improve the quality of products. It can be used in various stages of the software testing life-cycle, including bug management, which is an important part of the chain. We can analyze large amounts of data for classifying, triaging, and prioritizing bugs in a more efficient way by means of machine learning algorithms.
Mesut Durukal, a test automation engineer at Rapyuta Robotics, spoke at Aginext 2021 about using machine learning in testing.
Durukal uses machine learning to classify and cluster bugs. Bugs can be classified according to severity levels or the responsible team or person. Severity assignment is called triage and is important for prioritization, while assigning bugs to the correct team or person prevents wasted time. Clustering bugs helps to see whether they heap together on specific features.
Exploring the available data on bugs with machine learning algorithms gave him more insight into the health of their products and the effectiveness of the processes that were used.
According to Durukal, machine learning can also be used to automate code reviews and for the self-healing of broken test cases after the code has been updated.
InfoQ interviewed Durukal about how he applied machine learning in his daily work as a tester.
InfoQ: What are the challenges that testers are facing nowadays?
Mesut Durukal: Nowadays, we have smartphones in our pockets. It would have sounded crazy to think about this some time ago, yet we have already normalized it. The point is, we have lots of smart solutions in our daily life. We can control the temperature in our room by vocal commands over mobile phones. We can connect them to the navigation panel in our cars as well.
Now let’s check the reflection of this conjecture onto use cases. As applications and platforms are connected to various others, there are lots of integration interfaces. The same application can be installed on various platforms: Mobile, PC, and IoT. This leads to a wide scope to be verified on various platforms with numerous integrations.
Since we can use smart solutions anywhere, there are lots of use cases in many domains like automotive, industry, robotics, and healthcare. Hence, domain knowledge is required to test successfully; learning never ends.
Using smart solutions in our daily life this much, we are generating a huge amount of data. Data management is difficult. We as testers need to monitor activities to be able to fully trace progress.
WordPress is not static software, and WordPress websites are usually combinations of various plugins put together by different developers. As the WordPress core updates, plugins need to be updated too — either to keep them in sync with the WordPress core or to close vulnerabilities discovered by users.
WordPress does a good job of making these updates user-driven and intuitive through the control panel. However, increasingly site owners are turning to WordPress Maintenance agencies like Fixed.net to look after their WordPress sites.
If you have ever searched for information about app development, you will have found a lot of material about app development frameworks, app development steps, app store optimization tips, and so on. After the launch of an app, there is one thing that has its own importance: mobile app maintenance. But there are just a few resources that can guide you regarding the maintenance of a mobile app.
Have a look at the complete guide at:
https://solaceinfotech.com/blog/an-ultimate-guide-to-mobile-app-maintenance/
You can hire iOS developers from the Solace team for effective iOS app development and maintenance at:
https://solaceinfotech.com/hire-developer/mobile/hire-ios-developer.php
Data consistency errors are the nightmares of database administrators (DBAs), and when we notice the “Could not continue scan with NOLOCK due to data movement” explanation in any error message, we know we are in trouble. In this article, we will discuss the details of this data consistency problem.
Protecting the data is the main role of database administrators. However, due to some issues, DBAs can experience data consistency errors. The logical and physical consistency of the data can be corrupted in the following situations:
In this section, we will corrupt the consistency of the AdventureWorks2017 database so that we can reproduce this issue: “Error 601: Could not continue scan with NOLOCK due to data movement”.
In order to corrupt the AdventureWorks database, we need a hex editor to edit the data file (MDF). XVI32 is free and does not require any installation, so it is a good option for editing hex codes. First, we will bring the database status to OFFLINE so that we can modify the MDF file.
ALTER DATABASE AdventureWorks2017 SET OFFLINE
In this step, we will launch XVI32 with administrative rights, then click File -> Open and select the data file of the AdventureWorks2017 database.
We will press Ctrl+F and search for the hex string 54 00 31 00 38 00 59 00 2D 00 35 00 30 in the editor.
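After saving the modified bytes, the usual follow-up (a sketch, assuming the same database name) is to bring the database back online and run a consistency check, which should now report the corruption:

ALTER DATABASE AdventureWorks2017 SET ONLINE;

DBCC CHECKDB ('AdventureWorks2017') WITH NO_INFOMSGS, ALL_ERRORMSGS;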
Recently, we received a strange request from one of our customers. They wanted us to set up a scheduled backup job that generates a backup of a SQL database and compresses the backup file into multiple compressed archive files (WinRAR files). We tried to explain that SQL Server native backups are capable of compressing the backup file and can split a large, compressed backup into multiple backup files, but they insisted that we use the WinRAR software to compress and split the backup.
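For comparison, here is a sketch of the native approach we proposed (the file paths are hypothetical): a single statement that produces a compressed backup split across multiple files:

BACKUP DATABASE AdventureWorks2017
TO DISK = N'\\BackupShare\SQLBackup\AdventureWorks2017_1.bak',
   DISK = N'\\BackupShare\SQLBackup\AdventureWorks2017_2.bak',
   DISK = N'\\BackupShare\SQLBackup\AdventureWorks2017_3.bak'
WITH COMPRESSION, INIT, STATS = 10;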
The IT team of the company has set up a network drive to save the backup file. To accomplish the task, we took the following approach:
For the demonstration, I have installed the WinRAR software and restored a SQL database named AdventureWorks2017 on my workstation.
To set the environment variable, open Control Panel and click on System. See the following image:
The System dialog box opens (Screen 1). On the dialog box, click on Advanced System Properties. On the System Properties dialog box (Screen 2), click on the Advanced tab. In the Advanced tab, click on Environment Variables. See the following image:
The Environment Variables dialog box opens. From the User Variables list box, select PATH and click on Edit. See the following image:
A dialog box named Edit environment variable opens. On the dialog box, click on New and add the location of the Winrar.exe file. Click on OK to close the Environment Variables dialog box. See the following image:
Click OK to close the environment variable dialog box and click OK to close the System Properties dialog box.
We will use a SQL Server stored procedure to generate the backup. The logic of the stored procedure is as follows:
The stored procedure accepts the following input parameters:
Monitoring the growth of the SQL Database is one of the essential tasks of the SQL Server DBA. In this article, I am going to explain how we can monitor the growth of the SQL database using the default trace. First, let me explain the default trace in SQL Server.
SQL Server default trace was added as a feature in SQL Server 2005. It is a lightweight trace, and it contains five trace files. The default trace captures the following events:
It captures the following database events:
It captures the following object events:
It captures the following warnings and errors:
It also captures other SQL database events, and you can see the entire list by executing the following query:
SELECT * FROM sys.trace_events ORDER BY category_id ASC;
The following is the output:
If the default trace is running, then you can view the schema change report from SQL Server Management Studio (SSMS). To do that, launch SQL Server Management Studio -> connect to the database engine -> right-click on the desired database -> hover on Reports -> hover on Standard Reports -> select "Schema Changes History". See the following image:
The report contains a list of objects that have been created, altered, or deleted. See the following image:
As mentioned, the default trace is lightweight, but if you want to disable it, you can do it by executing the following queries.
EXEC sp_configure 'default trace enabled', 0;
GO
RECONFIGURE;
GO
You can view the location of the trace (*.trc) file by executing the following query.
SELECT * FROM ::fn_trace_getinfo(DEFAULT);
The following is the output:
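With the trace file path from that output (the path below is hypothetical), a sketch of a query that reads the default trace and filters for database file growth events:

SELECT te.name AS event_name,
       t.DatabaseName,
       t.FileName,
       t.StartTime,
       t.IntegerData * 8 / 1024 AS growth_mb   -- IntegerData is the growth in 8-KB pages
FROM ::fn_trace_gettable(N'C:\MSSQL\Log\log.trc', DEFAULT) AS t
INNER JOIN sys.trace_events AS te ON t.EventClass = te.trace_event_id
WHERE te.name IN ('Data File Auto Grow', 'Log File Auto Grow')
ORDER BY t.StartTime DESC;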
In this article, we are going to learn how we can back up the SQL database to Azure using a database maintenance plan. To demonstrate the process, I have restored the AdventureWorks2017 database on my workstation. I have created an Azure container named sqlbackups in my storage account.
To view the storage account, log in to the Azure portal -> Click on Storage accounts -> on Storage Accounts screen, you can view the list of the storage accounts that have been created. See the following image:
Microsoft does not support backup to URL with a SAS-token credential when using the WITH CREDENTIAL syntax. If you try to create the backup using a SAS token, you will receive the following error:
Msg 3225, Level 16, State 1, Line 5
Use of WITH CREDENTIAL syntax is not valid for credentials containing a Shared Access Signature.
Msg 3013, Level 16, State 1, Line 5
BACKUP DATABASE is terminating abnormally.
To fix the issue, we must create a SQL Server credential using the access keys of the Azure storage account. To copy the access keys, log in to the Azure Portal -> navigate to the Storage Account -> click on Access Keys -> copy the storage key specified in the key1 textbox. See the following image:
Now, let us create a SQL Server credentials using the access keys. To do that, execute the below script:
CREATE CREDENTIAL [Credentials To Connect Azure Storage]
WITH IDENTITY = 'sqlbkpstorageaccount',
SECRET = 'mQm1/TtieAhD/hHvY6V2e**********************SBJVvvLrUVbLwiA==';
In the script,
Once credentials are created, we are going to use them to connect to the Azure storage account.
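With the credential in place, the equivalent T-SQL looks like this (a sketch; the storage account and container names are the ones used in this walkthrough):

BACKUP DATABASE AdventureWorks2017
TO URL = N'https://sqlbkpstorageaccount.blob.core.windows.net/sqlbackups/AdventureWorks2017.bak'
WITH CREDENTIAL = 'Credentials To Connect Azure Storage',
COMPRESSION, STATS = 10;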
To create a maintenance plan, Open SQL Server Management Studio -> Connect to the database engine -> Expand Management -> Right-click on the Maintenance plan. See the following image:
Drag the Back Up Database Task from the toolbar and drop it on the maintenance plan designer. See the following image:
Double-click on Back Up Database Task. A dialog box opens to configure the settings of the maintenance plan. As mentioned, we want to generate the backup of the AdventureWorks2017 SQL database, so on the dialog box, click on Database(s), select the AdventureWorks2017 database from the list, and click OK. See the following image:
Select the database from the component section. To back up the SQL database to Azure, we must provide the URL of the Azure storage container instead of a disk location. To do that, select URL from the "Back up to" drop-down box. See the following image:
To configure the backup destination, click on the Destination tab of the dialog box. From the SQL credentials drop-down box, choose the [Credentials To Connect Azure Storage]. When you select it, the name of the Azure storage container, URL prefix, and backup extension will be populated automatically. See the following image:
From the Options tab, you can specify the following details:
We do not want to change any other configuration, so click OK to save the maintenance plan and close the window.
Once the backup job is created, let us configure the notification operator. We want to create a notification for success and failure, so we must add two notification operator tasks from the toolbox. To do that, drag the notify operator task from the toolbox and drop on the maintenance plan designer. See the following image:
The app development industry has skyrocketed, and so has the demand for high-quality mobile app development companies. While developing an app, developers must strive to achieve a healthy balance between product ideas and project constraints. But before you hop on the app development bandwagon and scout for an app developer, it is essential that you understand a few important things about app maintenance. This is an aspect that most businesses either fail to understand or have not fully appreciated.
There are many ways in which bugs creep into an app. Events such as the release of a new OS, the release of a new device, or changes in the design of the app inadvertently invite bugs.
To ensure that the app does not lose its fan base due to the occurrence of these bugs, app maintenance is necessary.
App maintenance is the oil that will keep the engine of your app running smoothly.
Why should you go for app maintenance?
Let’s understand the importance of maintenance for a mobile app in brief;
Updating according to store policy
Two of the world’s most loved operating systems Android and iOS constantly keep on updating themselves. Due to these regular advancements, it is common that many apps develop bugs as they are not designed to support these latest versions. By conducting proactive maintenance, you can be ready for these updates. For instance, whenever the beta version of these operating systems is made available, you can gear up to release the new version of your app. This way, when your competitors are scrambling to update their apps you will gain a pole position and release the new version of your app earlier, thus attracting more customers to your app.
Securing Your Mobile App from Cyber Threats
We at Prismetric have been providing Mobile App Services: From ideation to app maintenance for more than a decade now. We know that not maintaining an app regularly can have grave consequences from a security perspective. Today the customers are using mobile apps to conduct banking transactions as well as for availing various e-commerce services.
This has opened the floodgates for hackers who are exploiting the security loopholes in mobile apps to gain access to the victim’s smartphone. In this fast moving technology world, where tomorrow’s technology is outdated today no matter how good you have kept your security protocols, with time they tend to get outdated. By opting for app maintenance and support service, you can ensure that the app is always updated with the latest security protocols.
To keep competition at bay
To gain the advantage over competition the app developers need to constantly monitor the efficiency of an app. By ensuring that the app is bug-free, the app developers can make sure that the user experience is seamless. Thus, the users who are satisfied with the app have no reason to switch over.
Hence we see that app maintenance is an important step in the various steps involved in building an app. You can ignore app maintenance at your own peril.
Benefits of having a Maintenance Plan for your app
Avoiding Downtimes to Avoid Revenue Losses
Even big brands like Amazon, Flipkart, Bank of America, and BlackBerry have faced downtime in their websites as well as their apps. Downtime can cause serious long-term damage to the reputation of an app, apart from the short-term revenue lost during the downtime itself. A startup or a small company might not recover from such a loss. This is where the team of app developers takes responsibility for handling such situations and makes the necessary provisions to reduce downtime.
Optimizing for new hardware
New processing chips and hardware are launched in smartphones at regular intervals. These changes in hardware have the ability to change the way an app functions. Also, these hardware changes bring with them new opportunities which can add new capabilities to an app. By constantly investing time and effort in the maintenance of the mobile app, the developers ensure that the app is always ahead of the curve by making itself compatible with these hardware changes.
Integrating emerging technologies
The mobile app development world is constantly evolving and new technologies are replacing the old ones at a breakneck speed. Technologies like artificial intelligence (AI), Internet of Things (IoT), AR, VR, and Blockchain are enhancing the limits of smartphones. If the business doesn’t avail the maintenance plan and fails to update the app according to the changing requirements of emerging technologies, then there is a high probability that the app will lose its place from the user’s smartphones.
Minimizing uninstalls
Contrary to popular belief, the journey of a mobile app does not end when the users download the app. Once the user starts using the app, then the real test begins. If the app fails to live up to the standards set by the users, then they won’t waste their time and uninstall it immediately. Worse, the users will rate the app poorly on their respective app stores, affecting future app installs in a negative manner. Through continuous monitoring and pre-emptive maintenance policies, the app development team can ensure that the app is updated and upgraded as a result of which the app uninstalls can be minimized.
Types of Maintenance Support
There are a few basic types of app maintenance, that as an app owner you should be aware of.
Corrective Maintenance
While hiring mobile app developers, ensure that the company provides corrective maintenance. This maintenance is done to remove any residual bugs or little errors that have crept in during the app development process. Normally good companies will offer you 1-3 months of corrective maintenance support.
Adaptive Maintenance
Adaptive maintenance is done so that the app attains good adaptability towards the ever changing app store guidelines and other external parameters like launch of new hardware and integration of new technologies. The essence of adaptive maintenance is to ensure that the app continues to work seamlessly in ever changing and upgrading environment.
Emergency Maintenance
Many times the need for emergency app maintenance arises. During this time it is essential that the team of developers are ready and competent enough to solve the issue. Ask your App developers regarding their emergency maintenance policy, so that if an emergency arises, then you are not left in the middle of the sea with no oars.
Preventive Maintenance
To ensure that everything works smoothly in the app, preventive maintenance is done. It is an important aspect of app development and should not be ignored. Discuss with the app development company regarding their preventive maintenance policy to make sure that everything goes on smoothly later.
Knowing a bit about app maintenance will help you in discussing about it with your app development company. Understand that app maintenance is one of the core fundamentals of the app development process.
Factors affecting the maintenance cost of an app
While considering the cost of app development, many times businesses tend to ignore the cost of maintaining the app. Have a look at the various costs involved in maintaining an app so that you do not make this grave mistake and take a wise decision for your business.
Hosting
Hosting is an important cost component: for your app to actually work, you will have to pay for maintaining the database of your app on a server. The cost of hosting has come down drastically in recent years due to the availability of cloud hosting services like AWS (Amazon Web Services) and GCP (Google Cloud Platform). Make sure that you get an estimate of your app maintenance cost from a trusted app development company, and do include the hosting costs in that estimate.
Read more at: https://www.prismetric.com/important-factors-to-know-mobile-app-maintenance/
Mobile apps have gained importance for businesses due to their innovative features, user-friendly interfaces, and marketing campaigns. If you have ever searched for information about app development, you will have found a lot of material about app development frameworks, app development steps, app store optimization tips, and so on. After the launch of an app, there is one thing that has its own importance: mobile app maintenance. But there are just a few resources that can guide you regarding the maintenance of a mobile app. You already know the importance of mobile app development, but do you know the importance of app maintenance?
Here we will discuss all about the need, types and the engagement models of maintenance so that you can plan out your mobile app’s maintenance cost.
1. Hardware-
Nearly every 6 months, new hardware is launched and adopted by each new phone that comes out. Hence, your app should be optimized to work with modern hardware. This is when you need a maintenance team to make changes and release new versions.
2. Operating system-
Each year, there are new iOS and Android versions, and apps should be updated accordingly for proper functioning.
3. Update app with technological trends-
Consider the example of dark mode, which changes the complete theme of your phone to dark in order to lessen eye strain in low lighting. Now suppose a user has a phone with a dark theme; he would expect the same from your app too. If you don't adapt the app as needed, the user may experience eye strain, search for a similar app that provides dark mode, and uninstall yours.
4. Minimizing uninstalls-
The app journey does not end with the launch of the app. Continuous monitoring and maintenance are the best ways to achieve success, and they can only happen once the app is being used. It is beneficial and necessary to consider the feedback and analytics you receive. They will help you analyze user behavior and reduce uninstalls.
5. Integrating emerging technologies-
New technologies are taking over the market, and you cannot predict much about them. The only option is to integrate them after a stable release. Let us see why maintenance is necessary if your app revolves around emerging technologies.
• AI -
Artificial intelligence needs data regularly, and hiring a maintenance team is cheaper than building a panel to control how data is fetched. AI algorithms can be anything, but whatever they fetch has to go through the app directly. In this way, maintenance can solve your problems by changing the code.
• IoT -
IoT allows you to operate the devices of daily chores through an app, and hence maintenance is necessary.
• AR -
Aligning the images displayed on your phone is an ongoing task, and it can be carried out smoothly through maintenance. Augmented reality always needs more objects as your customer base increases; while it is better to hire permanent designers to create them, a maintenance team is cheaper for deploying them and updating the app accordingly.
6. Security-
Applications may be affected by security holes, and you have to update the app to fix the issues.
Apart from all these points, you should frequently update your app to keep it top on the app store.
1. Corrective-
Corrective maintenance includes removing faults and residual errors in daily app functions. Residual errors are errors in design, logic, and coding.
2. Adaptive-
Know more at- https://solaceinfotech.com/blog/an-ultimate-guide-to-mobile-app-maintenance/