Python Binary Tree

A binary tree is a hierarchical data structure composed of nodes, where each node has at most two children, called the left child and the right child. Each node in a binary tree can therefore have zero, one, or two child nodes.

In a binary tree: 

  • Each node contains a value or data element. 
  • In a binary search tree (an ordered binary tree), a node's left child holds a value less than or equal to its parent node, and its right child holds a value greater than its parent node. 
  • The left and right subtrees of a node in a binary tree are themselves binary trees.

Binary trees can represent various hierarchical systems and relationships: file structures, expression trees, decision trees, and more. They are especially useful in searching and sorting algorithms because of their ordered nature and efficient lookup capabilities. It is important to note that a binary tree does not necessarily need to be balanced or complete; it can have varying shapes and heights, with nodes distributed unevenly throughout the tree. However, balanced binary trees such as AVL trees or red-black trees provide more predictable performance.

Operations 

  • Insertion: Insertion involves adding a new node to the binary tree. In a binary search tree, the node is placed in a position that maintains the ordering property, in which values in the left subtree are smaller than the parent node and values in the right subtree are larger. 
  • Traversal: Traversal involves visiting each node of the binary tree in a specific order. Common traversal strategies are in-order traversal, pre-order traversal, and post-order traversal.
  • Search: Searching involves locating a specific value or determining whether it exists in the binary tree.
  • Deletion: Deletion involves removing a specific node from the binary tree while preserving the tree's properties. 

To delete a node, there are three cases to consider: 

  • Case 1: The node has no children. In this case, simply remove the node from the tree. 
  • Case 2: The node has one child. Replace the node with its child, preserving the ordering property. 
  • Case 3: The node has two children. Find the node's in-order predecessor or successor (the maximum value in the left subtree or the minimum value in the right subtree), replace the node's value with the predecessor/successor value, and then recursively delete the predecessor/successor node. 

Types

There are several kinds of binary trees, each with particular properties and characteristics. Here are a few commonly recognized types: 

  • Full Binary Tree: In a full binary tree, every node has either zero or two children; no node has exactly one child (a small property-check sketch follows this list of types). Example: 
          1
         / \
        2   3
       / \ / \
      4  5 6  7
  • Complete Binary Tree: In a complete binary tree, all levels are filled except possibly the last level, which is filled from left to right. It is a compact data structure that can be implemented using an array. Example: 
          1
         / \
        2   3
       / \ /
      4  5 6
  • Perfect Binary Tree: In a perfect binary tree, all internal nodes have two children, and all leaf nodes are at the same level. The number of nodes in a perfect binary tree is 2^(h+1) - 1, where h is the height of the tree. Example: 
          1
         / \
        2   3
       / \ / \
      4  5 6  7
  • Balanced Binary Tree: In a balanced binary tree, the difference in height between any node's left and right subtrees is at most 1. This keeps the tree relatively flat, which leads to efficient search and insertion operations. Examples include AVL trees and red-black trees; see the property-check sketch after this list. 
  • Binary Search Tree (BST): A binary search tree is a binary tree that follows the ordering property, in which for any node, the values in the left subtree are less than the node's value, and the values in the right subtree are greater. It allows for efficient searching, insertion, and deletion operations. Example:
          4
         / \
        2   6
       / \ / \
      1  3 5  7
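
The structural properties above can be checked programmatically. The sketch below is a minimal illustration, assuming a `Node` class with `left` and `right` attributes like the one defined later in this article; the helper names `height`, `is_full_tree`, and `is_balanced` are illustrative, not part of any standard library.

def height(node):
    # Height of an empty subtree is -1; a single node has height 0
    if node is None:
        return -1
    return 1 + max(height(node.left), height(node.right))

def is_full_tree(node):
    # Every node must have either zero or two children
    if node is None:
        return True
    if (node.left is None) != (node.right is None):
        return False
    return is_full_tree(node.left) and is_full_tree(node.right)

def is_balanced(node):
    # Left and right subtree heights may differ by at most 1 at every node
    if node is None:
        return True
    if abs(height(node.left) - height(node.right)) > 1:
        return False
    return is_balanced(node.left) and is_balanced(node.right)

For a perfect binary tree, both checks return True, and the node count equals 2^(height(root) + 1) - 1.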

Applications

  • Binary Search Trees (BSTs): Binary trees are commonly used to implement efficient searching and sorting algorithms. BSTs provide fast average-case search, insertion, and deletion operations. 
  • Expression Trees: Binary trees can represent arithmetic expressions, in which internal nodes represent operators and leaf nodes represent operands. Expression trees help evaluate and manipulate mathematical expressions (a small evaluation sketch follows this list). 
  • Huffman Coding: Binary trees are used in compression algorithms such as Huffman coding to encode and decode data efficiently. 
  • Hierarchical Data Structures: Binary trees represent hierarchical structures such as file systems, organization charts, and decision trees. 
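
To make the expression-tree application concrete, the following is a minimal sketch that evaluates the expression (3 + 4) * 2; the `ExprNode` class and `evaluate` function are illustrative names, not part of any library, and operator nodes are assumed to store the operator as a string while leaves store numbers.

class ExprNode:
    def __init__(self, value, left=None, right=None):
        self.value = value          # an operator string or a number
        self.left = left
        self.right = right

def evaluate(node):
    # Leaf nodes hold operands
    if node.left is None and node.right is None:
        return node.value
    left_val = evaluate(node.left)
    right_val = evaluate(node.right)
    if node.value == '+':
        return left_val + right_val
    if node.value == '-':
        return left_val - right_val
    if node.value == '*':
        return left_val * right_val
    return left_val / right_val     # assume '/' otherwise

# (3 + 4) * 2
expr = ExprNode('*', ExprNode('+', ExprNode(3), ExprNode(4)), ExprNode(2))
print(evaluate(expr))               # 14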

Advantages 

  • Efficient Search: Binary search trees provide an efficient way to search for elements, as each comparison eliminates half of the remaining search space. 
  • Ordered Data: Binary search trees keep data in sorted order, making it easier to perform operations such as finding minimum and maximum values or iterating over the data in a particular order. 
  • Fast Insertion and Deletion: When balanced, binary search trees offer fast insertion and deletion operations, typically with O(log n) time complexity in the average and best cases. 

Disadvantages 

  • Imbalanced Trees: If a binary tree becomes highly imbalanced (e.g., skewed or degenerate), performance degrades and operations can take O(n) time in the worst case instead of the expected O(log n).
  • Limited Sorting Order: Binary trees are inherently limited to a single sorting order. For instance, in a binary search tree the sorting order is based on comparing elements by their keys. 
  • Overhead: Binary trees require additional memory to store the tree structure and the pointers for left and right children, which can consume more memory than arrays or linked lists. 

Implementation of Binary Tree

Binary Tree Creation

Binary tree creation involves carefully placing nodes in the correct positions based on the ordering property, guaranteeing that the resulting tree keeps the desired shape and properties.

Algorithm Steps  

  • Define a class `Node` to represent a single binary tree node. It should have attributes for the data, the left child, and the right child. 
  • Create an instance of the `Node` class for the root node, passing the root's value. 
  • For every subsequent node, create a new instance of `Node` and set it as its parent node's left or right child. 

Code

class Node:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None

# Create the binary tree manually
root = Node(1)
root.left = Node(2)
root.right = Node(3)
root.left.left = Node(4)
root.left.right = Node(5)
root.right.left = Node(6)
root.right.right = Node(7)

Complexity Analysis: The complexity of creating a binary tree is O(1) per node, since it only involves instantiating a new object and setting the appropriate attributes.
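
As an alternative to wiring the nodes together by hand, a complete binary tree can be built from a Python list using the array representation, in which the children of the node at index i sit at indices 2*i + 1 and 2*i + 2. This is a minimal sketch; `build_tree` is an illustrative helper name, and it reuses the `Node` class defined above.

def build_tree(values, index=0):
    # Recursively build a complete binary tree from a list of values
    if index >= len(values) or values[index] is None:
        return None
    node = Node(values[index])
    node.left = build_tree(values, 2 * index + 1)
    node.right = build_tree(values, 2 * index + 2)
    return node

root = build_tree([1, 2, 3, 4, 5, 6, 7])   # the same tree as above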

Insertion into the Binary Tree

The insertion shown below adds a new node at the first available position, filling the tree level by level from left to right so that the tree stays complete. For a binary search tree, insertion would instead compare values so that keys in the left subtree stay smaller than the parent node and keys in the right subtree stay greater than or equal to it; an ordered-insertion sketch follows the complexity analysis below.

Algorithm Steps  

  • Start at the root node. 
  • If the current node has no left child, create a new node with the given data and set it as the left child of the current node. 
  • If the current node has no right child, create a new node with the given data and set it as the right child of the current node. 
  • If the current node has both left and right children, continue traversing the tree level by level to find the next available position for insertion. 

Code

class Node:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None

    # Print the tree with an in-order traversal
    def printTree(self):
        if self.left:
            self.left.printTree()
        print(self.data)
        if self.right:
            self.right.printTree()

def insert_node(root, data):
    if root is None:
        return Node(data)
    queue = [root]
    while queue:
        current = queue.pop(0)
        if current.left is None:
            current.left = Node(data)
            break
        else:
            queue.append(current.left)
        if current.right is None:
            current.right = Node(data)
            break
        else:
            queue.append(current.right)
    return root

# Create the binary tree
root = Node(1)
root.left = Node(2)
root.right = Node(3)
root.left.left = Node(4)
root.left.right = Node(5)

# Insert a new node
root = insert_node(root, 6)
root.printTree()

Output

4
2
5
1
6
3

Complexity Analysis: The complexity of binary tree insertion depends on the tree's structure. In the worst case, when the tree is highly imbalanced, the time complexity can be O(N), since it may require traversing all nodes. 
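
The insertion above fills positions level by level. For a binary search tree, insertion instead compares the new value with each node and descends left or right so that the ordering property described earlier is preserved. The following is a minimal sketch under that assumption; `bst_insert` is an illustrative name, and it reuses the `Node` class defined above.

def bst_insert(root, data):
    # An empty spot was found: place the new node here
    if root is None:
        return Node(data)
    # Smaller or equal values go to the left subtree, larger values to the right
    if data <= root.data:
        root.left = bst_insert(root.left, data)
    else:
        root.right = bst_insert(root.right, data)
    return root

bst_root = None
for value in [4, 2, 6, 1, 3, 5, 7]:
    bst_root = bst_insert(bst_root, value)
# An in-order traversal of bst_root now yields 1 2 3 4 5 6 7

When the tree stays balanced, each insertion descends one level per comparison, giving O(log n) time on average; in a skewed tree it degrades to O(n).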

Search in Binary Tree  

Searching involves locating a specific value or determining whether it exists in the binary tree. Search in a binary search tree benefits from the tree's ordering property, in which values in the left subtree are less than the parent node and values in the right subtree are greater than or equal to the parent node. This property makes it possible to narrow down the search space efficiently.

Algorithm Steps  

  • Start at the root node. 
  • If the current node is `None` or its data matches the target value, return the current node. 
  • If the target value is less than the current node's data, recursively search the left subtree. 
  • If the target value is greater than the current node's data, recursively search the right subtree. 

Code

class Node:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None

def search_node(root, target):
    if root is None or root.data == target:
        return root
    if target < root.data:
        return search_node(root.left, target)
    else:
        return search_node(root.right, target)

# Create the binary tree
root = Node(4)
root.left = Node(2)
root.right = Node(6)
root.left.left = Node(1)
root.left.right = Node(3)
root.right.left = Node(5)
root.right.right = Node(7)

# Search for a node
target = 5
result = search_node(root, target)

if result is not None:
    print(f"Node with data {target} found in the binary tree.")
else:
    print(f"Node with data {target} not found in the binary tree.")

Output

Node with data 5 found in the binary tree.

Complexity Analysis: The complexity of binary tree search depends on the tree's shape. In the worst case, when the tree is highly imbalanced, the time complexity can be O(N), since it may require traversing all nodes; in a balanced binary search tree it is O(log N). 
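
Because each comparison moves to a single child, the same lookup can also be written iteratively, which avoids holding O(H) recursive calls on the call stack. This is a minimal sketch; `search_node_iterative` is an illustrative name, and it assumes the same `Node` class and the BST built in the example above.

def search_node_iterative(root, target):
    current = root
    while current is not None and current.data != target:
        # Descend left for smaller targets, right for larger ones
        if target < current.data:
            current = current.left
        else:
            current = current.right
    return current

print(search_node_iterative(root, 5) is not None)   # True for the tree above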

Tree Traversals

Tree traversal refers to visiting and processing every node in a tree data structure. Three common forms of tree traversal exist: in-order, pre-order, and post-order. These traversals determine the order in which the nodes are visited. 

Here's an explanation of each traversal approach 

In-Order Traversal

In in-order traversal, the tree nodes are visited in the order: left subtree, current node, right subtree. In a binary search tree (BST), in-order traversal visits the nodes in ascending order of their values. In general binary trees, in-order traversal visits the nodes in an order determined by the tree's shape. In-order traversal is commonly used in binary trees for tasks such as retrieving the nodes in sorted order, validating the order of the nodes, and evaluating expressions in expression trees.

Algorithm  

  • Start at the root node. 
  • Recursively traverse the left subtree. 
  • Print the data of the current node. 
  • Recursively traverse the right subtree. 

Code  

class Node:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None

def inorder_traversal(node):
    if node:
        inorder_traversal(node.left)
        print(node.data, end=" ")
        inorder_traversal(node.right)

# Create the binary tree
root = Node(1)
root.left = Node(2)
root.right = Node(3)
root.left.left = Node(4)
root.left.right = Node(5)

# Perform in-order traversal
print("In-Order Traversal:")
inorder_traversal(root)

Output

In-Order Traversal:

4 2 5 1 3

Complexity Analysis

  • Time Complexity: The in-order traversal visits every node exactly once, so the time complexity is O(N), where N is the number of nodes in the binary tree. 
  • Space Complexity: The space complexity is O(H), where H is the height of the binary tree. This is due to the recursive function calls placed on the call stack during the traversal. In the worst case, when the tree is skewed, the space complexity can be O(N); in a balanced binary tree, it is typically O(log N). An iterative version that makes this stack explicit is sketched below.
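
The O(H) space cost comes from the recursion stack. The traversal can also be written iteratively with an explicit stack, which makes that cost visible. This is a minimal sketch; `inorder_iterative` is an illustrative name, and it reuses the `Node` class and the tree built in the example above.

def inorder_iterative(root):
    stack = []
    current = root
    while stack or current:
        # Walk as far left as possible, stacking nodes along the way
        while current:
            stack.append(current)
            current = current.left
        current = stack.pop()
        print(current.data, end=" ")
        current = current.right

inorder_iterative(root)   # prints 4 2 5 1 3 for the tree above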

Pre-Order Traversal

In pre-order traversal, the tree nodes are visited in the order: current node, left subtree, right subtree. Pre-order traversal is useful when the parent node must be visited before its children. It is commonly used in binary trees for tasks such as creating a copy of the tree, building prefix expressions, and cloning the tree's structure; a small copy-tree sketch follows the complexity analysis below.

Algorithm  

  • Start at the root node. 
  • Print the information of the current node. 
  • Recursively traverse the left subtree.
  • Recursively traverse the right subtree. 

Code  

class Node:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None

def preorder_traversal(node):
    if node:
        print(node.data, end=" ")
        preorder_traversal(node.left)
        preorder_traversal(node.right)

# Create the binary tree
root = Node(1)
root.left = Node(2)
root.right = Node(3)
root.left.left = Node(4)
root.left.right = Node(5)

# Perform pre-order traversal
print("Pre-Order Traversal:")
preorder_traversal(root)

Output

Pre-Order Traversal:

1 2 4 5 3

Complexity Analysis

  • Time Complexity: The pre-order traversal visits every node exactly once, so the time complexity is O(N), where N is the number of nodes in the binary tree.
  • Space Complexity: The space complexity is O(H), where H is the height of the binary tree. This is due to the recursive function calls placed on the call stack during the traversal. In the worst case, when the tree is skewed, the space complexity can be O(N), but in a balanced binary tree it is typically O(log N). 
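
Pre-order traversal visits a node before its children, which is exactly the order needed to copy a tree: create the copy of the current node first, then copy its subtrees. The following is a minimal sketch; `copy_tree` is an illustrative name, and it reuses the `Node` class and the tree built in the example above.

def copy_tree(node):
    # Copy the current node first (pre-order), then its subtrees
    if node is None:
        return None
    new_node = Node(node.data)
    new_node.left = copy_tree(node.left)
    new_node.right = copy_tree(node.right)
    return new_node

clone = copy_tree(root)
preorder_traversal(clone)   # prints the same sequence: 1 2 4 5 3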

Post-Order Traversal

In post-order traversal, the tree nodes are visited in the order: left subtree, right subtree, current node. Post-order traversal is useful when the children must be visited before the parent node. It is commonly used in binary trees for tasks such as deleting the nodes of a tree, evaluating postfix expressions, and freeing memory allocated to the tree.

Algorithm  

  • Start at the root node. 
  • Recursively traverse the left subtree. 
  • Recursively traverse the right subtree. 
  • Print the data of the current node. 

Code  

class Node:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None

def postorder_traversal(node):
    if node:
        postorder_traversal(node.left)
        postorder_traversal(node.right)
        print(node.data, end=" ")

# Create the binary tree
root = Node(1)
root.left = Node(2)
root.right = Node(3)
root.left.left = Node(4)
root.left.right = Node(5)

# Perform post-order traversal
print("Post-Order Traversal:")
postorder_traversal(root)

Output

Post-Order Traversal:

4 5 2 3 1

Complexity Analysis

  • Time Complexity: The post-order traversal visits every node exactly once, so the time complexity is O(N), where N is the number of nodes in the binary tree. 
  • Space Complexity: The space complexity is O(H), where H is the height of the binary tree. This is due to the recursive function calls placed on the call stack during the traversal. In the worst case, when the tree is skewed, the space complexity can be O(N); in a balanced binary tree, it is typically O(log N). 

Deletion of nodes in Binary Tree

Deletion of nodes in a binary tree involves removing a particular node from the tree while preserving the structure and properties of the binary tree. Deleting in a binary tree involves traversing the tree to find the node to be deleted and handling the three cases below. It is essential to adjust the parent-child relationships correctly after deletion to keep the integrity of the binary tree structure. 

Case 1: Deleting a Leaf Node  

When deleting a leaf node in a binary tree, simply remove the node from the tree. A leaf node is a node without children. 

Algorithm  

  • Start at the root node and traverse the tree to locate the node to be deleted. 
  • If the node to be deleted is found: 
    • Check whether it is a leaf node (has no left or right child). 
    • If it is a leaf node, remove it from the tree. 
    • Adjust the parent-child relationships accordingly after deletion. 

Code

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def delete_leaf_node(root, key):
    if root is None:
        return root
    # Remove the matching child only if it is actually a leaf
    if root.left is not None and root.left.key == key and root.left.left is None and root.left.right is None:
        root.left = None
    elif root.right is not None and root.right.key == key and root.right.left is None and root.right.right is None:
        root.right = None
    else:
        delete_leaf_node(root.left, key)
        delete_leaf_node(root.right, key)
    return root

# Create the binary tree
root = Node(1)
root.left = Node(2)
root.right = Node(3)
root.left.left = Node(4)
root.left.right = Node(5)

# Delete a leaf node
key = 5
root = delete_leaf_node(root, key)

# In-order traversal of the modified tree
def inorder_traversal(node):
    if node:
        inorder_traversal(node.left)
        print(node.key, end=" ")
        inorder_traversal(node.right)

print("In-order Traversal of the Modified Tree:")
inorder_traversal(root)

Output

In-order Traversal of the Modified Tree:

4 2 1 3

Complexity Analysis

  • Time Complexity: O(N), where N is the number of nodes in the binary tree, since we may need to traverse the tree to find the node to be deleted. 
  • Space Complexity: O(H), where H is the height of the binary tree. The space complexity is due to the recursive calls placed on the call stack during the deletion process. 

Case 2: Deleting a Node with One Child  

When deleting a node with one child in a binary tree, promote the child node to take the place of the deleted node. 

Algorithm  

  • Start at the root node and traverse the tree to locate the node to be deleted. 
  • If the node to be deleted is found: 
    • Check whether it has exactly one child (left or right). 
    • Promote the child node to take the place of the deleted node. 
    • Adjust the parent-child relationships accordingly after deletion. 

Code

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def delete_node_with_one_child(root, key):
    if root is None:
        return root
    if key < root.key:
        root.left = delete_node_with_one_child(root.left, key)
    elif key > root.key:
        root.right = delete_node_with_one_child(root.right, key)
    else:
        # Case 2: Node with One Child
        if root.left is None:
            temp = root.right
            root = None
            return temp
        elif root.right is None:
            temp = root.left
            root = None
            return temp
    return root

# Create the binary tree
root = Node(8)
root.left = Node(3)
root.right = Node(10)
root.left.left = Node(1)
root.left.right = Node(6)
root.right.right = Node(14)
root.left.right.left = Node(4)
root.left.right.right = Node(7)
root.right.right.left = Node(13)

# Delete a node with one child (14 has only the child 13)
key = 14
root = delete_node_with_one_child(root, key)

# In-order traversal of the modified tree
def inorder_traversal(node):
    if node:
        inorder_traversal(node.left)
        print(node.key, end=" ")
        inorder_traversal(node.right)

print("In-order Traversal of the Modified Tree:")
inorder_traversal(root)

Output

In-order Traversal of the Modified Tree:

1 3 4 6 7 8 10 13

Complexity Analysis

  • Time Complexity: O(N), where N is the number of nodes in the binary tree, since we may need to traverse the tree to find the node to be deleted. 
  • Space Complexity: O(H), where H is the height of the binary tree. The space complexity is due to the recursive calls placed on the call stack during the deletion process. 

Case 3: Deleting a Node with Two Children  

To delete a node with two children, first identify its in-order successor (the smallest node in its right subtree) or in-order predecessor (the largest node in its left subtree). Copy the successor's or predecessor's value to the node being deleted, and then recursively delete the successor/predecessor node.

Algorithm    

  • Start at the root node and traverse the tree to locate the node to be deleted. 
  • If the node to be deleted is found: 
    • Check whether it has two children. 
    • Find its in-order successor or in-order predecessor. 
    • Copy the value of the successor/predecessor to the node to be deleted. 
    • Recursively delete the successor/predecessor node. 
    • Adjust the parent-child relationships accordingly after deletion. 

Code  

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def inorder_successor(node):
    current = node
    while current.left:
        current = current.left
    return current

def delete_node_with_two_children(root, key):
    if root is None:
        return root
    if key < root.key:
        root.left = delete_node_with_two_children(root.left, key)
    elif key > root.key:
        root.right = delete_node_with_two_children(root.right, key)
    else:
        # Case 3: Node with Two Children
        if root.left is None:
            return root.right
        elif root.right is None:
            return root.left
        successor = inorder_successor(root.right)
        root.key = successor.key
        root.right = delete_node_with_two_children(root.right, successor.key)
    return root

# Create the binary tree
root = Node(8)
root.left = Node(3)
root.right = Node(10)
root.left.left = Node(1)
root.left.right = Node(6)
root.right.right = Node(14)
root.left.right.left = Node(4)
root.left.right.right = Node(7)
root.right.right.left = Node(13)

# Delete a node with two children
key = 8
root = delete_node_with_two_children(root, key)

# In-order traversal of the modified tree
def inorder_traversal(node):
    if node:
        inorder_traversal(node.left)
        print(node.key, end=" ")
        inorder_traversal(node.right)

print("In-order Traversal of the Modified Tree:")
inorder_traversal(root)

Output

In-order Traversal of the Modified Tree:

1 3 4 6 7 10 13 14

Complexity Analysis

  • Time Complexity: O(N), where N is the number of nodes in the binary tree, since we may need to traverse the tree to find the node to be deleted. 
  • Space Complexity: O(H), where H is the height of the binary tree. The space complexity is due to the recursive calls placed on the call stack during the deletion process. 
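
In practice, the three cases are usually combined into a single delete routine for a binary search tree: leaves are removed directly, nodes with one child are replaced by that child, and nodes with two children have their value overwritten by the in-order successor before the successor itself is deleted. The following is a minimal sketch; `delete_node` is an illustrative name, and it reuses the `Node` class and the `inorder_successor` helper from the code above.

def delete_node(root, key):
    if root is None:
        return root
    if key < root.key:
        root.left = delete_node(root.left, key)
    elif key > root.key:
        root.right = delete_node(root.right, key)
    else:
        # Cases 1 and 2: zero or one child
        if root.left is None:
            return root.right
        if root.right is None:
            return root.left
        # Case 3: two children - copy the in-order successor, then delete it
        successor = inorder_successor(root.right)
        root.key = successor.key
        root.right = delete_node(root.right, successor.key)
    return root

Because every recursive call descends one level, the running time is proportional to the height of the tree: O(log N) when the tree is balanced and O(N) in the worst case.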