Monty Boehm


Model for Easy Database Maintenance

You may think database maintenance is none of your business. But if you design your models proactively, you get databases that make life easier for those who have to maintain them.

A good database design requires proactivity, a well-regarded quality in any work environment. In case you are unfamiliar with the term, proactivity is the ability to anticipate problems and have solutions ready when problems occur – or better yet, plan and act so that problems don’t occur in the first place.

Employers understand that proactivity in their employees or contractors equals cost savings. That’s why they value it and why they encourage people to practice it.

In your role as a data modeler, the best way to demonstrate proactivity is to design models that anticipate and avoid problems that routinely plague database maintenance. Or, at least, that substantially simplify the solution to those problems.

Even if you are not responsible for database maintenance, modeling for easy database maintenance reaps many benefits. For example, it keeps you from being called at any time to solve data emergencies that take away valuable time you could be spending on the design or modeling tasks you enjoy so much!

Making Life Easier for the IT Guys

When designing our databases, we need to think beyond the delivery of an entity-relationship diagram (ERD) and the generation of update scripts. Once a database goes into production, maintenance engineers have to deal with all sorts of potential problems, and part of our task as database modelers is to minimize the chances that those problems occur.

Let’s start by looking at what it means to create a good database design and how that activity relates to regular database maintenance tasks.

What Is Data Modeling?

Data modeling is the task of creating an abstract, usually graphical, representation of an information repository. The goal of data modeling is to expose the attributes of, and the relationships between, the entities whose data is stored in the repository.

Data models are built around the needs of a business problem. Rules and requirements are defined in advance through input from business experts so that they can be incorporated into the design of a new data repository or adapted in the iteration of an existing one.

Ideally, data models are living documents that evolve with changing business needs. They play an important role in supporting business decisions and in planning systems architecture and strategy. The data models must be kept in sync with the databases they represent so that they are useful to the maintenance routines of those databases.

Common Database Maintenance Challenges

Maintaining a database requires constant monitoring, automated or otherwise, to ensure it does not lose its desirable qualities. Database maintenance best practices ensure databases always keep their:

  • Integrity and quality of information
  • Performance
  • Availability
  • Scalability
  • Adaptability to changes
  • Traceability
  • Security

Many data modeling tips are available to help you create a good database design every time. The ones discussed below aim specifically at ensuring or facilitating the maintenance of the database qualities mentioned above.

Integrity and Information Quality

A fundamental goal of database maintenance best practices is to ensure the information in the database keeps its integrity. This is critical if users are to keep their faith in the information.

There are two types of integrity: physical integrity and logical integrity.

Physical Integrity

Maintaining the physical integrity of a database is done by protecting the information from external factors such as hardware or power failures. The most common and widely accepted approach is through an adequate backup strategy that allows the recovery of a database in a reasonable time if a catastrophe destroys it.

For DBAs and server administrators who manage database storage, it is useful to know if databases can be partitioned into sections with different update frequencies. This allows them to optimize storage usage and backup plans.

Data models can reflect that partitioning by identifying areas of different data “temperature” and by grouping entities into those areas. “Temperature” refers to the frequency with which tables receive new information. Tables that are updated very frequently are the “hottest”; those that are never or rarely updated are the “coldest.”
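As a minimal sketch of this grouping (all table names and backup frequencies are hypothetical, chosen only for illustration), the temperature labels can be expressed as data that feeds a backup planner:

```python
# Tag each table with a data "temperature" and derive a backup frequency
# per group. Table names and frequencies are hypothetical.
TABLE_TEMPERATURE = {
    "orders": "hot",         # receives new rows constantly
    "order_items": "hot",
    "customers": "warm",     # updated occasionally
    "products": "warm",
    "countries": "cold",     # practically never changes
    "currencies": "cold",
}

BACKUP_FREQUENCY = {"hot": "hourly", "warm": "daily", "cold": "weekly"}

def backup_plan(temperatures):
    """Group tables into backup schedules according to their temperature."""
    plan = {}
    for table, temperature in temperatures.items():
        plan.setdefault(BACKUP_FREQUENCY[temperature], []).append(table)
    return plan

print(backup_plan(TABLE_TEMPERATURE))
# → {'hourly': ['orders', 'order_items'], 'daily': ['customers', 'products'],
#    'weekly': ['countries', 'currencies']}
```

The same grouping, documented in the model, tells a storage administrator which file groups deserve hourly backups and which can wait for the weekly run.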

Data model of an e-commerce system differentiating hot, warm, and cold data.

A DBA or system administrator can use this logical grouping to partition the database files and create different backup plans for each partition.

Logical Integrity

Maintaining the logical integrity of a database is essential for the reliability and usefulness of the information it delivers. If a database lacks logical integrity, the applications that use it reveal inconsistencies in the data sooner or later. Faced with these inconsistencies, users distrust the information and simply look for more reliable data sources.

Among the database maintenance tasks, maintaining the logical integrity of the information is an extension of the database modeling task, except that it begins after the database is put into production and continues throughout its lifetime. The most critical part of this area of maintenance is adapting to changes.

Change Management

Changes in business rules or requirements are a constant threat to the logical integrity of databases. You may feel happy with the data model you have built, knowing that it is perfectly adapted to the business, that it responds with the right information to any query, and that it leaves out any insertion, update, or deletion anomalies. Enjoy this moment of satisfaction, because it is short-lived!

Maintenance of a database involves facing the need to make changes in the model daily. It forces you to add new objects or alter the existing ones, modify the cardinality of the relationships, redefine primary keys, change data types, and do other things that make us modelers shiver.

Changes happen all the time. It may be that some requirement was explained incorrectly from the beginning, that new requirements have surfaced, or that you unintentionally introduced a flaw in your model (after all, we data modelers are only human).

Your models must be easy to modify when a need for changes arises. It is critical to use a database design tool for modeling that allows you to version your models, generate scripts to migrate a database from one version to another, and properly document every design decision.

Without these tools, every change you make to your design creates integrity risks that come to light at the most inopportune times. Vertabelo gives you all this functionality and takes care of maintaining the version history of a model without you even having to think about it.

The automatic versioning built into Vertabelo is a tremendous help in maintaining changes to a data model.

Change management and version control are also crucial factors in embedding data modeling activities into the software development lifecycle.


When you apply changes to a database in use, you need to be 100% sure that no information is lost and that its integrity is unaffected as a consequence of the changes. To do this, you can use refactoring techniques. They are normally applied when you want to improve a design without affecting its semantics, but they can also be used to correct design errors or adapt a model to new requirements.

There are a large number of refactoring techniques. They are usually employed to give new life to legacy databases, and there are textbook procedures that ensure the changes do not harm the existing information. Entire books have been written about it; I recommend you read them.

But to summarize, we can group refactoring techniques into the following categories:

  • Data quality: Making changes that ensure data consistency and coherence. Examples include adding a lookup table and migrating to it data repeated in another table and adding a constraint on a column.
  • Structural: Making changes to table structures that do not alter the semantics of the model. Examples include combining two columns into one, adding a substitute key, and splitting a column into two.
  • Referential integrity: Applying changes to ensure that a referenced row exists within a related table or that an unreferenced row can be deleted. Examples include adding a foreign key constraint on a column and adding a non-null value constraint to a column.
  • Architectural: Making changes aimed at improving the interaction of applications with the database. Examples include creating an index, making a table read-only, and encapsulating one or more tables in a view.

Techniques that modify the semantics of the model, as well as those that do not alter the data model in any way, are not considered refactoring techniques. These include inserting rows into a table, adding a new column, creating a new table or view, and updating the data in a table.

Maintaining Information Quality

The information quality in a database is the degree to which the data meets the organization’s expectations for accuracy, validity, completeness, and consistency. Maintaining data quality throughout the life cycle of a database is vital if its users are to make correct and informed decisions with the data in it.

Your responsibility as a data modeler is to ensure your models keep their information quality at the highest possible level. To do this:

  • The design must follow at least the 3rd normal form so that insertion, update, or deletion anomalies do not occur. This consideration applies mainly to databases for transactional use, where data is added, updated, and deleted regularly. It does not strictly apply in databases for analytical use (i.e., data warehouses), since data update and deletion are rarely performed, if ever.
  • The data types of each field in each table must be appropriate to the attribute they represent in the logical model. This goes beyond properly defining whether a field is of a numeric, date, or alphanumeric data type. It is also important to correctly define the range and the precision of values supported by each field. An example: an attribute of type Date implemented in a database as a Date/Time field may cause problems in queries, since a value stored with its time part other than zero may fall outside the scope of a query that uses a date range.
  • The dimensions and facts that define the structure of a data warehouse must align with the needs of the business. When designing a data warehouse, the dimensions and facts of the model must be defined correctly from the very beginning. Making modifications once the database is operational comes with a very high maintenance cost.
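The Date vs. Date/Time pitfall from the second point can be demonstrated in a few lines. SQLite stands in for any RDBMS here, and the table is invented for illustration.

```python
import sqlite3

# A value stored with a non-zero time part falls outside a naive
# date-range query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sale (id INTEGER PRIMARY KEY, sold_at TEXT)")
conn.execute("""INSERT INTO sale VALUES
                (1, '2024-03-15 00:00:00'),
                (2, '2024-03-15 17:42:00')""")

# Both rows compare greater than the upper bound '2024-03-15' because of
# their time parts, so a naive range query finds neither sale:
naive = conn.execute("""SELECT id FROM sale
                        WHERE sold_at BETWEEN '2024-03-01' AND '2024-03-15'
                        ORDER BY id""").fetchall()
print(naive)  # → []

# Comparing on the date part alone returns both rows:
correct = conn.execute("""SELECT id FROM sale
                          WHERE date(sold_at)
                                BETWEEN '2024-03-01' AND '2024-03-15'
                          ORDER BY id""").fetchall()
print(correct)  # → [(1,), (2,)]
```

Choosing a plain Date type in the model, when the attribute really is a date, avoids the problem entirely.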

Managing Growth

Another major challenge in maintaining a database is preventing its growth from reaching the storage capacity limit unexpectedly. To help with storage space management, you can apply the same principle used in backup procedures: group the tables in your model according to the rate at which they grow.

A division into two areas is usually sufficient. Place the tables with frequent row additions in one area, those to which rows are rarely inserted in another. Having the model sectored this way allows storage administrators to partition the database files according to the growth rate of each area. They can distribute the partitions among different storage media with different capacities or growth possibilities.

A grouping of tables by their growth rate helps determine the storage requirements and manage its growth.


Traceability and Logging

We create a data model expecting it to provide the information as it is at the time of the query. However, we tend to overlook the need for a database to remember everything that has happened in the past unless users specifically require it.

Part of maintaining a database is knowing how, when, why, and by whom a particular piece of data was altered. This may be for things such as finding out when a product price changed or reviewing changes in the medical record of a patient in a hospital. Logging can be used even to correct user or application errors since it allows you to roll back the state of information to a point in the past without the need to resort to complicated backup restoration procedures.

Again, even if users do not need it explicitly, considering the need for proactive logging is a very valuable means of facilitating database maintenance and demonstrating your ability to anticipate problems. Having logging data allows immediate responses when someone needs to review historical information.

There are different strategies for a database model to support logging, all of which add complexity to the model. One approach is called in-place logging, which adds columns to each table to record version information. This is a simple option that does not involve creating separate schemas or logging-specific tables. However, it does impact the model design because the original primary keys of the tables are no longer valid as primary keys – their values are repeated in rows that represent different versions of the same data.

Another option to keep log information is to use shadow tables. Shadow tables are replicas of the model tables with the addition of columns to record log trail data. This strategy does not require modifying the tables in the original model, but you need to remember to update the corresponding shadow tables when you change your data model.

Yet another strategy is to employ a subschema of generic tables that record every insertion, deletion, or modification to any other table.
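As an illustrative sketch of the shadow-table strategy (table names invented, SQLite standing in for any RDBMS), a trigger can copy the pre-update state of each row into the shadow table:

```python
import sqlite3

# product_log mirrors product, adds audit columns, and a trigger preserves
# the old state of every updated row.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE product (id INTEGER PRIMARY KEY, name TEXT, price REAL);

    CREATE TABLE product_log (
        id INTEGER, name TEXT, price REAL,          -- mirrored columns
        changed_at TEXT DEFAULT CURRENT_TIMESTAMP,  -- audit columns
        operation TEXT
    );

    CREATE TRIGGER product_update_log AFTER UPDATE ON product
    BEGIN
        INSERT INTO product_log (id, name, price, operation)
        VALUES (OLD.id, OLD.name, OLD.price, 'UPDATE');
    END;
""")

conn.execute("INSERT INTO product VALUES (1, 'Lamp', 10.0)")
conn.execute("UPDATE product SET price = 12.5 WHERE id = 1")

history = conn.execute(
    "SELECT id, price, operation FROM product_log").fetchall()
print(history)  # → [(1, 10.0, 'UPDATE')] — the old price is preserved
```

The original table keeps its primary key untouched, which is the main advantage of this strategy over in-place logging.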

Generic tables to keep an audit trail of a database.

This strategy has the advantage that it does not require modifications to the model for recording an audit trail. However, because it uses generic columns of the varchar type, it limits the types of data that can be recorded in the log trail.

Performance Maintenance and Index Creation

Practically any database has good performance when it is just starting to be used and its tables contain only a few rows. But as soon as applications start to populate it with data, performance may degrade very quickly if precautions are not taken in designing the model. When this happens, DBAs and system administrators call on you to help them solve performance problems.

The automatic creation/suggestion of indexes on production databases is a useful tool for solving performance problems “in the heat of the moment.” Database engines can analyze database activities to see which operations take the longest and where there are opportunities to speed up by creating indexes.

However, it is much better to be proactive and anticipate the situation by defining indexes as part of the data model. This greatly reduces maintenance efforts for improving database performance. If you are not familiar with the benefits of database indexes, I suggest reading all about indexes, starting with the very basics.

There are practical rules that provide enough guidance for creating the most important indexes for efficient queries. The first is to generate indexes for the primary key of each table. Practically every RDBMS generates an index for each primary key automatically, so you can forget about this rule.

Another rule is to generate indexes for alternative keys of a table, particularly in tables for which a surrogate key is created. If a table has a natural key that is not used as a primary key, queries to join that table with others very likely do so with the natural key, not the surrogate. Those queries do not perform well unless you create an index on the natural key.

The next rule of thumb for indexes is to generate them for all fields that are foreign keys. These fields are great candidates for establishing joins with other tables. If they are included in indexes, they are used by query parsers to speed up execution and improve database performance.

Finally, it is a good idea to use a profiling tool on a staging or QA database during performance tests to detect any index creation opportunities that are not obvious. Incorporating the indexes suggested by the profiling tools into the data model is extremely helpful in achieving and maintaining the performance of the database once it is in production.
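A small SQLite sketch (tables invented for illustration) shows the foreign-key rule in action: before the index exists, the planner scans the whole table; afterwards it searches the index.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customer(id)
    );
""")

query = "SELECT id FROM orders WHERE customer_id = 42"

# Without an index, the planner has to scan the whole table.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# Rule of thumb: create an index on every foreign-key column.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(before[0][-1])  # e.g. "SCAN orders"
print(after[0][-1])   # the plan now mentions idx_orders_customer
```

Defining `idx_orders_customer` in the model itself means the index ships with the first deployment instead of being added under pressure later.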


Security

In your role as a data modeler, you can help maintain database security by providing a solid and secure base in which to store data for user authentication. Keep in mind this information is highly sensitive and must not be exposed to cyber-attacks.

For your design to simplify the maintenance of database security, follow the best practices for storing authentication data, chief among them: do not store passwords in the database, even in encrypted form. Storing only a hash of each user’s password allows an application to authenticate a login without creating any password exposure risk.
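A minimal sketch of that practice follows, using PBKDF2 from Python’s standard library as a stand-in for whatever key-derivation function your stack provides; the iteration count is illustrative.

```python
import hashlib
import hmac
import os

# Store only a random salt and a salted hash; the password is discarded.
def hash_password(password, salt=None):
    """Return (salt, digest); only these two values go in the database."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password, salt, stored_digest):
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return hmac.compare_digest(candidate, stored_digest)

salt, digest = hash_password("s3cret")
print(verify_password("s3cret", salt, digest))  # True
print(verify_password("wrong", salt, digest))   # False
```

In the model, this means the user table carries a salt column and a hash column, and no password column at all.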

A complete schema for user authentication that includes columns for storing password hashes.

Vision for the Future

So, create your models for easy database maintenance by applying the design tips given above. With more maintainable data models, your work looks better, and you gain the appreciation of DBAs, maintenance engineers, and system administrators.

You also invest in peace of mind. Creating easily maintainable databases means you can spend your working hours designing new data models, rather than running around patching databases that fail to deliver correct information on time.


#database #maintenance 

Ruth Nabimanya


System Databases in SQL Server


In SSMS, many of us may have noticed the System Databases node under the Databases folder. But how many of us know its purpose? In this article, let's discuss the system databases in SQL Server.

System Database

Fig. 1 System Databases

There are five system databases; they are created when SQL Server is installed.

  • Master
  • Model
  • MSDB
  • Tempdb
  • Resource

Master

  • This database contains all the system-level information in SQL Server, stored in the form of metadata.
  • It is because of the master database that we are able to access the SQL Server instance (on-premise SQL Server).

Model

  • This database is used as a template for new databases.
  • Whenever a new database is created, it initially starts as a copy of the model database.

MSDB

  • This database is where a service called SQL Server Agent stores its data.
  • SQL Server Agent is in charge of automation, which includes entities such as jobs, schedules, and alerts.

Tempdb

  • Tempdb is where SQL Server stores temporary data such as work tables, sort space, row versioning information, and so on.
  • Users can create their own temporary tables, and those are stored in Tempdb.
  • This database is destroyed and recreated every time the SQL Server instance is restarted.

Resource

  • The Resource database is a hidden, read-only database that holds the definitions of all system objects.
  • When we query a system object in a database, it appears to reside in the sys schema of the local database, but its definition actually resides in the Resource database.

#sql server #master system database #model system database #msdb system database #sql server system databases #ssms #system database #system databases in sql server #tempdb system database

Django-allauth: A simple Boilerplate to Setup Authentication


A simple Boilerplate to Setup Authentication using Django-allauth, with a custom template for login and registration using django-crispy-forms.

Getting Started


  • Python 3.8.6 or higher

Project setup

# clone the repo
$ git clone

# move to the project folder
$ cd Django-Authentication

Creating virtual environment

  • Create a virtual environment for this project:
# creating a virtual environment for python 3
$ virtualenv venv

# activating the virtual environment
$ source venv/bin/activate

# on Windows, activate from the Scripts folder instead
> venv\Scripts\activate

Configure Environment

Environment variables

SECRET_KEY = #random string
DEBUG = #True or False
ALLOWED_HOSTS = #localhost
DATABASE_NAME = #database name (You can just use the default if you want to use SQLite)
DATABASE_USER = #database user for postgres
DATABASE_PASSWORD = #database password for postgres
DATABASE_HOST = #database host for postgres
DATABASE_PORT = #database port for postgres
ACCOUNT_EMAIL_VERIFICATION = #mandatory or optional
EMAIL_BACKEND = #email backend
EMAIL_HOST = #email host
EMAIL_HOST_PASSWORD = #email host password
EMAIL_USE_TLS = # if your email use tls
EMAIL_PORT = #email port

change all the environment variables in the .env.sample and don't forget to rename it to .env.

Run the project

After Setup the environment, you can run the project using the Makefile provided in the project folder.

 @echo "Targets:"
 @echo "    make install" #install requirements
 @echo "    make makemigrations" #prepare migrations
 @echo "    make migrations" #migrate database
 @echo "    make createsuperuser" #create superuser
 @echo "    make run_server" #run the server
 @echo "    make lint" #lint the code using black
 @echo "    make test" #run the tests using Pytest

Preconfigured Packages

Includes preconfigured packages to kick start Django-Authentication by just setting appropriate configuration.

django-allauth: Integrated set of Django applications addressing authentication, registration, account management as well as 3rd-party (social) account authentication.
django-crispy-forms: Provides you with a crispy filter and {% crispy %} tag that let you control the rendering behavior of your Django forms in a very elegant and DRY way.


  • Django-Authentication is a simple project, so you can contribute to it by just adding your code to the project to improve it.
  • If you have any questions, please feel free to open an issue or create a pull request.

Download Details:
Author: yezz123
Source Code:
License: MIT License

#django #python 

Rahim Makhani


Website Support & Maintenance Service | Web Maintenance Company In India

A website’s performance leaves a major impact on its customers’ experience. We all know the importance of a website to a business and the impression it creates on the customers who visit it.

Web performance is important for accessibility and for any website that serves the goals of an organization or business. The good or bad performance of a website relates directly to the user experience as well as the overall performance of your business. This is why website maintenance is a major thing to consider.

For website maintenance services, you can contact Nevina Infotech, a website maintenance company whose dedicated developers will remove the bugs from your website and improve its performance.

#website maintenance services #website support and maintenance #website maintenance support #website maintenance packages #website maintenance company #website maintenance plans

iOS App Dev


SingleStore: The One Stop Shop For Everything Data

  • SingleStore works toward helping businesses embrace digital innovation by operationalising “all data through one platform for all the moments that matter”

The pandemic has brought a period of transformation across businesses globally, pushing data and analytics to the forefront of decision making. From enabling advanced data-driven operations to creating intelligent workflows, enterprise leaders have been looking to transform every part of their organisation.

SingleStore is one of the leading companies in the world, offering a unified database to facilitate fast analytics for organisations looking to embrace diverse data and accelerate their innovations. It provides an SQL platform to help companies aggregate, manage, and use the vast trove of data distributed across silos in multiple clouds and on-premise environments.


#featured #data analytics #data warehouse augmentation #database #database management #fast analytics #memsql #modern database #modernising data platforms #one stop shop for data #singlestore #singlestore data analytics #singlestore database #singlestore one stop shop for data #singlestore unified database #sql #sql database

Rahim Makhani


Get a Bug Free and smooth website with website Maintenance

Having a website for your firm or business is very important, as it can benefit you in many ways: your users get 24/7 service from your company, you can exchange information, and it can help you expand your business in the market. You must also maintain your website to keep it bug-free and updated.

Your website should be bug-free, because any bug will slow down its performance and may stop it from working properly. If this happens, there is a chance you may lose customers.

Are you searching for a company that can provide website support and maintenance? Nevina Infotech can help you maintain your website, with enthusiastic web app developers who handle maintenance and support.

#website maintenance services #website support and maintenance #website maintenance support #website maintenance packages #website maintenance company #website maintenance plans