Normalization in DBMS: Anomalies, Advantages, Disadvantages: At a basic level, normalization means simplifying a bulk quantity to an optimal value. In the digital world, normalization usually refers to database normalization, the process of organizing the columns (attributes) and tables (relations) of a relational database to minimize data repetition. During database creation, normalization involves organizing data into optimal tables so that the results obtained are always unambiguous and clear in concept.

Though database normalization cannot remove every trace of duplication (key values are repeated across tables to link them), it eliminates redundant copies of the data itself. This process can be considered a refinement step after the initial identification of the data objects to be included in the database. It involves identifying the relationships between the data objects and defining the tables required and the columns to be added within each table.
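The refinement described above can be sketched with a small, hypothetical example. The table and column names below (`orders_flat`, `customers`, `orders`) are illustrative assumptions, not from any real schema; SQLite is used here only because it ships with Python.

```python
import sqlite3

# A minimal sketch of normalizing a hypothetical orders table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Denormalized: customer details repeat on every order row.
cur.execute("""CREATE TABLE orders_flat (
    order_id INTEGER PRIMARY KEY,
    customer_name TEXT,
    customer_city TEXT,
    item TEXT)""")
cur.executemany(
    "INSERT INTO orders_flat VALUES (?, ?, ?, ?)",
    [(1, "Alice", "Delhi", "Pen"),
     (2, "Alice", "Delhi", "Notebook"),
     (3, "Bob", "Mumbai", "Pencil")])

# Normalized: customer details stored once, referenced by key.
cur.execute("""CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name TEXT,
    city TEXT)""")
cur.execute("""CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(customer_id),
    item TEXT)""")
cur.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                [(1, "Alice", "Delhi"), (2, "Bob", "Mumbai")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 1, "Pen"), (2, 1, "Notebook"), (3, 2, "Pencil")])

# "Alice"/"Delhi" now appears once instead of on two order rows.
count = cur.execute(
    "SELECT COUNT(*) FROM customers WHERE name = 'Alice'").fetchone()[0]
print(count)  # 1
```

In the flat table, changing Alice's city means updating two rows; in the normalized schema it is a single update to `customers`, with every order seeing the change through the key.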

This article is all about Normalization in DBMS: anomalies, advantages, and disadvantages.

Normalization in DBMS: Anomalies, Advantages, Disadvantages

If a database design is not done properly, it may cause several anomalies to occur in it. Normalization is essential for removing various anomalies like:

Anomalies in Database

1) Update Anomalies: When several instances of the same data are scattered across the database without a proper relationship or link, a situation can arise where some of the instances are updated with new values while others are not. This leaves the database in an inconsistent state.

2) Deletion Anomalies: An incomplete deletion of a particular section of data can leave residual instances behind. The database creator may remain unaware of such unwanted data because it is stored at a different location.

3) Insertion Anomalies: These occur when an attempt is made to insert data into a record that does not exist.

Paying attention to these anomalies can help to maintain a consistent database.
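The update anomaly described in point 1 is easy to reproduce in a denormalized table. The `enrolment` schema and names below are hypothetical, chosen only for illustration:

```python
import sqlite3

# Sketch of an update anomaly: the instructor's name is stored
# redundantly on every enrolment row instead of in its own table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE enrolment (student TEXT, course TEXT, instructor TEXT)")
cur.executemany("INSERT INTO enrolment VALUES (?, ?, ?)",
                [("Ram", "DBMS", "Dr. Rao"),
                 ("Sita", "DBMS", "Dr. Rao")])

# A careless update touches only one copy of the instructor's name...
cur.execute("UPDATE enrolment SET instructor = 'Dr. Iyer' "
            "WHERE course = 'DBMS' AND student = 'Ram'")

# ...leaving the table contradicting itself about who teaches DBMS.
instructors = {row[0] for row in cur.execute(
    "SELECT DISTINCT instructor FROM enrolment WHERE course = 'DBMS'")}
print(instructors)  # two different instructors recorded for one course
```

With the instructor stored once in a separate `course` table, the same `UPDATE` would have only one row to touch and the inconsistency could not arise.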


Advantages of Normalization

Here we can see why normalization is an attractive prospect in RDBMS concepts.

1) A smaller database can be maintained, as normalization eliminates duplicate data. The overall size of the database is reduced as a result.

2) Better performance is ensured, which is linked to the point above. As the database becomes smaller, passes through the data become faster and shorter, thereby improving response time and speed.

3) Narrower tables are possible, as normalized tables are fine-tuned and have fewer columns, which allows more data records per page.

4) Fewer indexes per table ensure faster maintenance tasks such as index rebuilds.

5) Normalization also makes it possible to join only the tables that are actually needed.
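Point 5 can be sketched with the same hypothetical customers/orders split used earlier (the schema and names are illustrative assumptions): a query that needs only order data never touches the wider customer table, and the join is paid only when customer details are actually required.

```python
import sqlite3

# Illustration: in a normalized schema, each query joins only what it needs.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customers "
            "(customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
cur.execute("CREATE TABLE orders "
            "(order_id INTEGER PRIMARY KEY, customer_id INTEGER, item TEXT)")
cur.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                [(1, "Alice", "Delhi"), (2, "Bob", "Mumbai")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 1, "Pen"), (2, 2, "Pencil")])

# Counting orders scans only the narrow orders table...
n_orders = cur.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

# ...and the customers table is joined in only when its columns are needed.
pair = cur.execute("""SELECT c.name, o.item
                      FROM orders o
                      JOIN customers c ON c.customer_id = o.customer_id
                      WHERE o.order_id = 1""").fetchone()
print(n_orders, pair)  # 2 ('Alice', 'Pen')
```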


Disadvantages of Normalization

1) More tables to join: by spreading data out into more tables, the need to join tables increases and the task becomes more tedious. The database also becomes harder to comprehend.

2) Tables will contain codes rather than real data, as repeated data is stored as coded values (keys) rather than the true data. Therefore, there is always a need to refer to the lookup table.

3) The data model becomes difficult to query against, as it is optimized for applications rather than for ad hoc querying. (An ad hoc query is one that cannot be determined before it is issued; it consists of SQL that is constructed dynamically, usually by desktop-friendly query tools.) Hence it is hard to model the database without knowing what the customer requires.

4) As the normal form type progresses, the performance becomes slower and slower.

5) Proper knowledge of the various normal forms is required to execute the normalization process efficiently. Careless use may lead to a terrible design filled with major anomalies and data inconsistency.
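Disadvantage 2 above (codes instead of real data) is worth seeing concretely. In this hedged sketch, the `status_lookup` and `shipments` names are hypothetical; the point is that reading the fact table alone yields an opaque code, and a join is required to recover the true value:

```python
import sqlite3

# Sketch of the lookup-table cost: rows hold codes, not readable values.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE status_lookup (code INTEGER PRIMARY KEY, label TEXT)")
cur.executemany("INSERT INTO status_lookup VALUES (?, ?)",
                [(1, "Pending"), (2, "Shipped")])
cur.execute("CREATE TABLE shipments "
            "(shipment_id INTEGER PRIMARY KEY, status_code INTEGER)")
cur.execute("INSERT INTO shipments VALUES (10, 2)")

# Reading the shipments table alone yields only the opaque code...
code = cur.execute(
    "SELECT status_code FROM shipments WHERE shipment_id = 10").fetchone()[0]

# ...so every readable report must join back to the lookup table.
label = cur.execute("""SELECT l.label
                       FROM shipments s
                       JOIN status_lookup l ON l.code = s.status_code
                       WHERE s.shipment_id = 10""").fetchone()[0]
print(code, label)  # 2 Shipped
```

This is exactly the trade-off of the disadvantages listed above: the redundancy is gone, but the price is an extra join on nearly every human-facing query.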

So this was all about Normalization in DBMS: anomalies, advantages, and disadvantages. If you want to ask any questions, please comment below.