Algorithms and Complexities

Algorithm

An algorithm is a step-by-step process or set of well-defined rules to solve a problem or perform a specific task. It is the cornerstone of computing and programming.

Characteristics of an Algorithm

  • Well-defined: Each step of an algorithm is stated explicitly and unambiguously, leaving no room for multiple interpretations. Every stage must be distinct and easy to grasp.
  • Input: Algorithms accept zero or more inputs, which are values or variables upon which the algorithm acts to produce the desired result.
  • Output: Algorithms generate at least one output, which can be the result, a solution to a problem, or data.
  • Finiteness: Algorithms must have a finite number of steps. They eventually terminate after executing a defined sequence of operations.
  • Determinism: Every step of an algorithm must be deterministic, which means that given the same input and beginning conditions, the algorithm must yield the same result each time it is executed.
  • Feasibility: Algorithms use simple and feasible operations that are practical to execute, whether by a computer or a human.
  • Correctness: Algorithms must produce the correct output for all valid inputs. The output should meet the specified requirements and solve the problem as intended.
  • Ease of Implementation: Algorithms should be designed in a way that allows them to be translated into a specific programming language or implemented in hardware with relative ease.
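
A short example makes these properties concrete. The sketch below (Python; the name find_max is illustrative) is well-defined, accepts one input, produces one output, terminates after a finite number of steps, and is deterministic:

```python
def find_max(values):
    # Scan the list once, keeping the largest element seen so far.
    # Terminates after exactly len(values) - 1 comparisons.
    largest = values[0]
    for v in values[1:]:
        if v > largest:
            largest = v
    return largest

print(find_max([3, 41, 26, 7]))  # 41
```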

Complexity

The amount of resources, such as time and memory, that an algorithm requires to solve a problem, expressed as a function of input size, is called its complexity. Complexity analysis is critical for evaluating and comparing the efficiency of different algorithms as the input size grows.

There are two kinds of complexity:

  • Time Complexity: This metric quantifies how long an algorithm takes to finish executing as a function of input size. It illustrates how the execution time of an algorithm increases as the amount of input data increases. Time complexity is commonly expressed using "big O" notation (e.g., O(n), O(n log n), O(1), and so on), which gives an upper bound on the growth rate of an algorithm's runtime with respect to input size.
  • Space Complexity: The amount of memory or storage space an algorithm uses to solve a problem as a function of input size. As with time complexity, big O notation is widely used to describe an upper bound on the growth rate of memory consumption.
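
To see the difference in practice, compare two ways of summing the first n integers (a minimal Python sketch with illustrative names): a loop performs one addition per value, so it runs in O(n) time, while the closed-form formula runs in O(1) regardless of n; both use O(1) extra space.

```python
def sum_loop(n):
    # O(n) time: one addition per value from 1 to n.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_formula(n):
    # O(1) time: the closed form n(n+1)/2, independent of input size.
    return n * (n + 1) // 2

print(sum_loop(1000) == sum_formula(1000))  # True
```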

Types of Algorithms

Sorting Algorithms: Sorting algorithms rearrange elements in a list or an array into a specific order, such as numerical or lexicographic. Examples include Bubble Sort, Merge Sort, Quicksort, and Insertion Sort.

1. Bubble Sort: Steps over the list repeatedly, comparing neighboring elements and swapping them if they are out of order.

  • Time Complexity: O(n^2) (worst and average case), O(n) (best case)
  • Space Complexity: O(1)
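
A minimal Python sketch (names illustrative); the early-exit flag is what produces the O(n) best case on an already sorted list:

```python
def bubble_sort(items):
    a = list(items)                  # work on a copy
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):   # the tail is already sorted
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:              # no swaps: list is sorted, stop early
            break
    return a

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```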

2. Insertion Sort: Builds the sorted array one item at a time by inserting each new element into its correct position in the array's already sorted segment.

  • Time Complexity: O(n^2) (worst and average case), O(n) (best case)
  • Space Complexity: O(1)

3. Selection Sort: Divides the input array into sorted and unsorted subarrays. It repeatedly selects the smallest element from the unsorted portion and moves it to the end of the sorted segment.

  • Time Complexity: O(n^2) (worst, average, and best cases)
  • Space Complexity: O(1)

4. Merge Sort: Divides the unsorted list into n sublists containing one element, then repeatedly merges sub-lists to produce new sorted sub-lists.

  • Time Complexity: O(n log n) (worst, average, and best cases)
  • Space Complexity: O(n)
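
A minimal recursive sketch in Python (names illustrative); the O(n) auxiliary space shows up as the merged list built during each merge step:

```python
def merge_sort(a):
    if len(a) <= 1:
        return a                     # a one-element list is sorted
    mid = len(a) // 2
    left = merge_sort(a[:mid])       # sort each half recursively
    right = merge_sort(a[mid:])
    # Merge the two sorted halves in O(n) time using O(n) extra space.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```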

5. Quick Sort: Selects a "pivot" element and divides the array into two sub-arrays, with elements smaller than the pivot to the left and elements greater than the pivot to the right. The sub-arrays are then sorted recursively.

  • Time Complexity: O(n^2) (worst case), O(n log n) (average and best cases)
  • Space Complexity: O(log n) (average and best cases), O(n) (worst case)
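
The Python sketch below (names illustrative) uses the simple non-in-place formulation for clarity; note that unlike the in-place partitioning scheme described above, it allocates O(n) extra space per recursion level:

```python
def quick_sort(a):
    if len(a) <= 1:
        return a
    pivot = a[len(a) // 2]                     # middle element as pivot
    smaller = [x for x in a if x < pivot]      # left partition
    equal = [x for x in a if x == pivot]       # pivot duplicates
    larger = [x for x in a if x > pivot]       # right partition
    return quick_sort(smaller) + equal + quick_sort(larger)
```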

6. Heap Sort: Builds a binary heap from the input array, then repeatedly extracts the largest element from the heap and places it in the sorted portion of the array.

  • Time Complexity: O(n log n) (worst, average, and best cases)
  • Space Complexity: O(1)

7. Counting Sort: Works well when the range of input values is known and not extremely large. It counts the occurrences of each element and uses these counts to place the elements in the correct order.

  • Time Complexity: O(n + k) (where n denotes the number of items and k is the input value range).
  • Space Complexity: O(k)
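
A minimal sketch assuming the inputs are non-negative integers in the range 0..k (Python; names illustrative):

```python
def counting_sort(a, k):
    counts = [0] * (k + 1)                # O(k) extra space
    for x in a:                           # O(n): tally each value
        counts[x] += 1
    out = []
    for value, c in enumerate(counts):    # O(k): emit values in order
        out.extend([value] * c)
    return out
```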

8. Radix Sort: Sorts numbers digit by digit, beginning with the least significant digit and progressing to the most significant digit.

  • Time Complexity: O(n * k) (where n represents the number of elements and k represents the maximum number of digits).
  • Space Complexity: O(n + k)

9. Searching Algorithms: Searching algorithms find the location of a specific item in a collection of data. Binary search and linear search are the most widely used.

10. Binary Search: Efficient for sorted arrays; each comparison divides the search space in half.

  • Time Complexity: O(log n)
  • Space Complexity: O(1)
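
The halving step can be sketched as follows (Python; names illustrative). The function returns the index of the target, or -1 if it is absent:

```python
def binary_search(sorted_a, target):
    lo, hi = 0, len(sorted_a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # each comparison halves the range
        if sorted_a[mid] == target:
            return mid
        if sorted_a[mid] < target:
            lo = mid + 1              # target lies in the upper half
        else:
            hi = mid - 1              # target lies in the lower half
    return -1                         # not found
```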

11. Linear Search: Simplest searching algorithm; checks each element sequentially.

  • Time Complexity: O(n)
  • Space Complexity: O(1)

12. Hashing: A hash function maps keys to indices in a hash table, allowing for rapid retrieval of values.

  • Time Complexity: O(1) in the average case, O(n) in the worst case.
  • Space Complexity: O(n)
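
A minimal sketch of a chained hash table (Python; all names are illustrative). The hash function maps each key to a bucket index in O(1); chaining handles collisions, which are also what cause the O(n) worst case when many keys land in the same bucket:

```python
TABLE_SIZE = 8
buckets = [[] for _ in range(TABLE_SIZE)]   # one chain per slot

def bucket_index(key):
    # Map a key to a slot index in O(1).
    return hash(key) % TABLE_SIZE

def put(key, value):
    buckets[bucket_index(key)].append((key, value))

def get(key):
    # Scan only the single chain the key hashes to: O(1) on average.
    for k, v in buckets[bucket_index(key)]:
        if k == key:
            return v
    return None
```

Python's built-in dict uses the same idea with a more sophisticated collision strategy.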

13. Interpolation Search: Works well for uniformly distributed values; estimates position based on values.

  • Time Complexity: O(log log n) on average, O(n) in worst case
  • Space Complexity: O(1)

14. Exponential Search: Used when the array is unbounded or its size is unknown; doubles the search range until a value larger than the target is found, then binary-searches within that range.

  • Time Complexity: O(log n)
  • Space Complexity: O(1)

15. Fibonacci Search: Similar to binary search, it divides the array into segments based on Fibonacci numbers.

  • Time Complexity: O(log n)
  • Space Complexity: O(1)

16. Indexed Search (using B-trees): Uses a balanced tree structure to index data, providing efficient search and insertion.

  • Time Complexity: Depends on B-tree height (O(log n))
  • Space Complexity: Depends on the number of nodes in the B-tree

17. Graph Algorithms: Graph algorithms address problems involving networks, connectivity, and shortest paths. Depth-First Search (DFS), Breadth-First Search (BFS), and Dijkstra's algorithm are a few examples.

18. Breadth-First Search (BFS): BFS explores the graph level by level, starting from a source vertex. It's often used to find shortest paths in unweighted graphs and to visit vertices in order of increasing distance from the source.

  • Time Complexity: O(V + E) (V: number of vertices, E: number of edges)
  • Space Complexity: O(V)
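
A sketch using an adjacency-list dictionary (Python; names illustrative); the FIFO queue is what drives the level-by-level order:

```python
from collections import deque

def bfs_order(graph, source):
    # graph: {vertex: [neighbors]}; returns vertices in level order.
    visited = {source}
    order = []
    queue = deque([source])
    while queue:
        v = queue.popleft()          # FIFO: nearest vertices come out first
        order.append(v)
        for w in graph[v]:
            if w not in visited:     # mark on enqueue to avoid duplicates
                visited.add(w)
                queue.append(w)
    return order
```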

19. Depth-First Search (DFS): DFS explores the graph as deeply as possible along each branch before backtracking. It's often used for topological sorting, cycle detection, and connected component analysis.

  • Time Complexity: O(V + E)
  • Space Complexity: O(V) (recursion stack or an explicit stack)
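
A recursive sketch (Python; names illustrative); the shared visited set is what prevents revisiting vertices:

```python
def dfs_order(graph, source, visited=None):
    # graph: {vertex: [neighbors]}; returns vertices in visit order.
    if visited is None:
        visited = set()
    visited.add(source)
    order = [source]
    for w in graph[source]:
        if w not in visited:         # descend before trying siblings
            order.extend(dfs_order(graph, w, visited))
    return order
```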

20. Dijkstra's Algorithm: Dijkstra calculates the shortest paths from an initial node to all nodes in a weighted network. It's beneficial for finding the shortest path in non-negative weighted graphs.

  • Time Complexity: O((V + E) log V) with a binary heap; O(E + V log V) with a Fibonacci heap
  • Space Complexity: O(V + E)
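
A binary-heap sketch in Python (names illustrative); instead of decreasing keys in place, stale heap entries are simply skipped when popped:

```python
import heapq

def dijkstra(graph, source):
    # graph: {vertex: [(neighbor, weight), ...]} with non-negative weights.
    dist = {source: 0}
    heap = [(0, source)]                       # (distance, vertex) pairs
    while heap:
        d, v = heapq.heappop(heap)
        if d > dist.get(v, float("inf")):
            continue                           # stale entry: already improved
        for w, weight in graph[v]:
            nd = d + weight
            if nd < dist.get(w, float("inf")):
                dist[w] = nd                   # relax the edge (v, w)
                heapq.heappush(heap, (nd, w))
    return dist                                # distances to reachable vertices
```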

21. Bellman-Ford Algorithm: Bellman-Ford finds the shortest paths from a starting node to all other vertices in a weighted graph, even when negative weight edges are present. It can also detect negative weight cycles.

  • Time Complexity: O(V * E)
  • Space Complexity: O(V)

22. Kruskal's Algorithm (Minimum Spanning Tree): Kruskal's algorithm computes a minimum spanning tree of a connected, undirected graph with weighted edges: the minimum-weight set of edges that connects all vertices without forming cycles.

  • Time Complexity: O(E log E)
  • Space Complexity: O(V + E)
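
A Python sketch using a simple union-find structure (names illustrative); sorting the edges is what dominates at O(E log E):

```python
def kruskal_mst(num_vertices, edges):
    # edges: list of (weight, u, v); vertices are numbered 0..num_vertices-1.
    parent = list(range(num_vertices))       # union-find forest

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]    # path halving keeps trees flat
            x = parent[x]
        return x

    mst, total = [], 0
    for weight, u, v in sorted(edges):       # cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                         # edge joins two components: keep it
            parent[ru] = rv
            mst.append((u, v, weight))
            total += weight
    return total, mst
```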

23. Prim's Algorithm (Minimum Spanning Tree): Prim's algorithm also finds the minimum spanning tree of a connected, undirected graph with weighted edges. It starts with a single vertex and grows the tree by adding the shortest edge that connects a vertex to a vertex outside the tree.

  • Time Complexity: O(E + V log V)
  • Space Complexity: O(V + E)

24. Topological Sorting: Topological sorting organizes the vertices of a directed acyclic graph (DAG) in a linear order such that vertex u appears before vertex v in the ordering for every directed edge (u, v).

  • Time Complexity: O(V + E)
  • Space Complexity: O(V)
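
One common approach is Kahn's algorithm, sketched here in Python (names illustrative); it repeatedly removes vertices with no remaining incoming edges:

```python
from collections import deque

def topological_order(graph):
    # graph: {vertex: [successors]}; every vertex must appear as a key.
    indegree = {v: 0 for v in graph}
    for v in graph:
        for w in graph[v]:
            indegree[w] += 1                 # count incoming edges
    queue = deque(v for v in graph if indegree[v] == 0)
    order = []
    while queue:
        v = queue.popleft()                  # v has no unprocessed predecessors
        order.append(v)
        for w in graph[v]:
            indegree[w] -= 1
            if indegree[w] == 0:
                queue.append(w)
    return order  # shorter than len(graph) if the graph contains a cycle
```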

25. Dynamic Programming: Dynamic programming algorithms solve problems by breaking them into overlapping subproblems and efficiently reusing previously computed results to find the final solution.

  • Time and Space Complexity: Depend on the specific problem, typically on the number of subproblems and the size of the table used to store their results

26. Fibonacci Sequence (Memoization and Tabulation): Calculates Fibonacci numbers using either memoization (top-down approach) or tabulation (bottom-up approach) to avoid redundant calculations.

  • Time Complexity: O(n) for both memoization and tabulation
  • Space Complexity: O(n) for memoization; O(1) for tabulation when only the last two values are kept
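
Both approaches can be sketched in a few lines of Python (names illustrative):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n):
    # Top-down: each subresult is cached and computed only once.
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

def fib_tab(n):
    # Bottom-up: keep only the last two values, so O(1) extra space.
    prev, curr = 0, 1
    for _ in range(n):
        prev, curr = curr, prev + curr
    return prev

print(fib_memo(10), fib_tab(10))  # 55 55
```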

27. Longest Common Subsequence (LCS): Finds the length of the longest subsequence (not necessarily contiguous) shared by two sequences.

  • Time Complexity: O(m * n), where m and n are the input sequence lengths.
  • Space Complexity: O(m * n)
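
A standard table-filling sketch in Python (names illustrative); each cell combines previously computed subproblem results:

```python
def lcs_length(x, y):
    m, n = len(x), len(y)
    # dp[i][j] = LCS length of x[:i] and y[:j]; an O(m*n) table.
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1   # extend the common subsequence
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]
```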

28. Knapsack Problem (0/1 Knapsack): Determines the maximum total value that can be obtained by picking items from a set without exceeding a given weight limit.

  • Time Complexity: O(n * W), where n is the number of items and W is the knapsack capacity
  • Space Complexity: O(n * W)
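
The Python sketch below (names illustrative) uses the common one-dimensional space optimization, so it needs only O(W) space instead of the full O(n * W) table; iterating the weights in reverse is what keeps each item usable at most once:

```python
def knapsack(values, weights, capacity):
    # dp[w] = best value achievable with total weight at most w.
    dp = [0] * (capacity + 1)
    for value, weight in zip(values, weights):
        # Reverse order so each item is counted at most once (0/1, not unbounded).
        for w in range(capacity, weight - 1, -1):
            dp[w] = max(dp[w], dp[w - weight] + value)
    return dp[capacity]
```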

29. Matrix Chain Multiplication: Finds the best way to multiply a series of matrices while minimizing the overall number of scalar multiplications.

  • Time Complexity: O(n^3)
  • Space Complexity: O(n^2)