Data Structures and Algorithms (Notes & Guidelines)
# Definition, attributes & manipulation of linked lists
from typing import Optional

# Defining a linked list
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val    # value stored in this node
        self.next = next  # reference to the next node (None if this is the tail)

# (1) -> (2) -> (3) -> (4) -> (5)
#  |       |
#  |       self.next
#  1 = val
# head = [1, 2, 3, 4, 5]
## Reversing a linked list
def reverseList(self, head: Optional[ListNode]) -> Optional[ListNode]:
    if not head:
        return None  # return None if the linked list is empty
    prev, curr = None, head
    #                  curr.next.next
    #                       |
    # Null ->(1)->(2)->(3)->(4)->(5)
    #   |     |    |
    #  prev curr curr.next
    #         |    |
    #        nxt  prev
    while curr:
        nxt = curr.next   # save the next node before the pointer is rewired
        curr.next = prev  # reverse the pointer of the current node
        prev = curr       # move prev one step forward
        curr = nxt        # move curr one step forward
    return prev  # prev is the new head of the reversed linked list
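
# A quick usage sketch (added for illustration, not part of the original notes):
# a standalone variant of the same in-place reversal, plus two hypothetical
# helpers (build_list / to_pylist) for constructing and printing a list.
def build_list(values):
    head = None
    for v in reversed(values):
        head = ListNode(v, head)  # prepend each value so the original order is kept
    return head

def to_pylist(head):
    out = []
    while head:
        out.append(head.val)
        head = head.next
    return out

def reverse_list(head):
    prev, curr = None, head
    while curr:
        curr.next, prev, curr = prev, curr, curr.next  # rewire one node per step
    return prev

# to_pylist(reverse_list(build_list([1, 2, 3, 4, 5])))  -> [5, 4, 3, 2, 1]
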
# Definition, manipulation & attributes of binary trees
# Defining a binary tree
class TreeNode:
    def __init__(self, val=0, left=None, right=None):
        self.val = val      # value stored in this node
        self.left = left    # left child (l in the diagram below)
        self.right = right  # right child (r in the diagram below)

# r = right child (self.right)
# l = left child (self.left)
#            (1)          root node
#           /   \
#        l(2)    (3)r
#        /  \    /  \
#     l(4) (5)r l(6) (7)r
# Depth of a binary tree
def maxDepth(self, root: TreeNode) -> int:
    if not root:
        return 0  # an empty tree has depth 0
    # depth of a node = 1 + the depth of its deeper subtree
    return 1 + max(self.maxDepth(root.left), self.maxDepth(root.right))
# Inverting a binary tree
def invertTree(self, root: Optional[TreeNode]) -> Optional[TreeNode]:
    #        (1)
    #       /   \
    #     (3)    (2)
    #    /  \    /  \
    #  (7)  (6)(5)  (4)   expected output
    if not root:
        return None  # nothing to invert in an empty subtree
    tmp = root.left              # temporarily store the left subtree
    root.left = root.right       # swap the left and right subtrees
    root.right = tmp
    self.invertTree(root.right)  # recursively invert the right subtree
    self.invertTree(root.left)   # recursively invert the left subtree
    return root  # return the root of the inverted tree
# Merging two binary trees
def mergeTrees(self, root1: Optional[TreeNode], root2: Optional[TreeNode]) -> Optional[TreeNode]:
    if not root1:
        return root2  # if root1 is empty, return root2
    if not root2:
        return root1  # if root2 is empty, return root1
    # both nodes exist: the merged node holds the sum of the two values
    root3 = TreeNode(root1.val + root2.val)
    root3.left = self.mergeTrees(root1.left, root2.left)
    root3.right = self.mergeTrees(root1.right, root2.right)
    return root3
# Checking whether a binary tree S is a subtree of another tree T
def areIdentical(self, T, S):
    # helper function to check whether two binary trees are exactly the same
    # base cases
    if T is None and S is None:
        return True
    if T is None or S is None:
        return False
    return (T.val == S.val
            and self.areIdentical(T.left, S.left)
            and self.areIdentical(T.right, S.right))

def isSubTree(self, T: Optional[TreeNode], S: Optional[TreeNode]) -> bool:
    if S is None:
        return True   # an empty tree is a subtree of any tree
    if T is None:
        return False  # a non-empty tree cannot be a subtree of an empty tree
    # check the tree rooted at the current node
    if self.areIdentical(T, S):
        return True
    # if the tree with root as the current node doesn't match,
    # then try the left and right subtrees one by one
    return self.isSubTree(T.left, S) or self.isSubTree(T.right, S)
# Binary Search
### Binary search is an efficient algorithm for finding an item in a sorted list of items. It works by repeatedly dividing in half the portion of the list that could contain the item, until you've narrowed down the possible locations to just one.
### We basically ignore half of the elements after just one iteration.
### i) Compare x with the middle element.
### ii) If x matches the middle element, we return the mid index.
### iii) If x is greater than the middle element, then x can only lie in the right half subarray after the middle element, so we only recur in the right half.
### iv) If x is smaller than the middle element, we recur in the left half.
### * Each time, we shift the l or r pointer towards the location of x.
### * If r becomes smaller than l, the element is not in the array.
def binarySearch(arr, l, r, x):
    # initial call: binarySearch(arr, 0, len(arr) - 1, x)
    if r >= l:
        mid = l + (r - l) // 2
        # if the element is present at the middle itself
        if arr[mid] == x:
            return mid
        # if the element is smaller than the middle element,
        # it can only be present in the left subarray
        elif arr[mid] > x:
            return binarySearch(arr, l, mid - 1, x)
        # else the element can only be present in the right subarray
        else:
            return binarySearch(arr, mid + 1, r, x)
    else:
        # the element is not present in the array
        return -1
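As a quick illustration (added to these notes, not part of the original gist), searching the sorted 10-element array used in the complexity analysis below:

```python
arr = [2, 5, 8, 12, 16, 23, 38, 56, 72, 91]
print(binarySearch(arr, 0, len(arr) - 1, 23))  # 5  -> 23 is at index 5
print(binarySearch(arr, 0, len(arr) - 1, 40))  # -1 -> 40 is not in the array
```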
### Complexity analysis for Binary Search
`Sorted Array of 10 elements: 2, 5, 8, 12, 16, 23, 38, 56, 72, 91`
`Let us say we want to search for 23.`
#### Finding the given element:
#### Now, to find 23, there will be several iterations, each following the steps described above:
### Iteration 1:
`Array: 2, 5, 8, 12, 16, 23, 38, 56, 72, 91`
`Select the middle element. (here 16)`
#### Since 23 is greater than 16, we divide the array into two halves and consider the sub-array after the element 16.
#### This sub-array with the elements after 16 is taken into the next iteration.
### Iteration 2:
`Array: 23, 38, 56, 72, 91`
`Select the middle element. (now 56)`
#### Since 23 is smaller than 56, we divide the array into two halves and consider the sub-array before the element 56.
#### This sub-array with the elements before 56 is taken into the next iteration.
### Iteration 3:
`Array: 23, 38`
`Select the middle element. (now 23)`
#### Since 23 is the middle element, the iterations now stop.
#### Let's say binary search terminates after k iterations. In the above example it terminates after 3 iterations, so here k = 3.
#### At each iteration, the array is divided in half. So let's say the length of the array at any iteration is n.
#### At Iteration 1,
`Length of array = n`
#### At Iteration 2,
`Length of array = n/2`
#### At Iteration 3,
`Length of array = (n/2)/2 = n/2^2`
#### Therefore, after Iteration k,
`Length of array = n/2^k`
#### After k iterations, the length of the array becomes 1.
#### Therefore
`Length of array = n/2^k = 1`
`=> n = 2^k`
#### Applying the log function on both sides:
`=> log2(n) = log2(2^k)`
`=> log2(n) = k * log2(2)`
`(since log_a(a) = 1)`
#### Therefore,
`=> k = log2(n)`
#### Hence binary search is an algorithm whose complexity is O(log n).
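To make the bound concrete, here is a small iterative sketch (an addition to these notes; the name `binary_search_count` is hypothetical) that counts how many times the search range is halved. For the 10-element example array and target 23, it reports 3 iterations, matching the walkthrough above:

```python
def binary_search_count(arr, x):
    # iterative binary search that also reports how many iterations were needed
    l, r = 0, len(arr) - 1
    iterations = 0
    while l <= r:
        iterations += 1
        mid = l + (r - l) // 2
        if arr[mid] == x:
            return mid, iterations
        elif arr[mid] < x:
            l = mid + 1  # x can only lie in the right half
        else:
            r = mid - 1  # x can only lie in the left half
    return -1, iterations

arr = [2, 5, 8, 12, 16, 23, 38, 56, 72, 91]
print(binary_search_count(arr, 23))  # (5, 3): found at index 5 after 3 iterations
```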
# Depth First Search
### DFS is a technique for traversing a tree or a graph. Backtracking is used during the traversal: the deepest node is visited first, and the search then backtracks to its parent when no unvisited sibling of that node exists.
## DFS Traversal of a Graph vs Tree
##### In a graph, there may be cycles and disconnected components. Unlike a graph, a tree contains no cycles and is always connected, so DFS of a tree is relatively easier. We can simply begin from a node and traverse its adjacent nodes (or children) without worrying about cycles. And if we begin from a single node (the root) and traverse this way, we are guaranteed to traverse the whole tree, since there is no disconnectivity.
## Example
            (1)
           /   \
         (2)   (3)
        /   \
      (4)   (5)
#### Therefore, the depth-first traversals of this tree are (a runnable sketch of all three follows below):
##### a) Inorder (left, root, right): 4 2 5 1 3
##### b) Preorder (root, left, right): 1 2 4 5 3
##### c) Postorder (left, right, root): 4 5 2 3 1
#### Time complexity: O(n)
#### Auxiliary space: O(1) if we don’t consider the size of the recursion stack for the function calls; otherwise O(n).
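A minimal sketch of the three traversals (added to these notes, reusing the TreeNode class defined earlier; the function names are illustrative), run on the example tree above:

```python
def inorder(node, out):
    # left, root, right
    if node:
        inorder(node.left, out)
        out.append(node.val)
        inorder(node.right, out)
    return out

def preorder(node, out):
    # root, left, right
    if node:
        out.append(node.val)
        preorder(node.left, out)
        preorder(node.right, out)
    return out

def postorder(node, out):
    # left, right, root
    if node:
        postorder(node.left, out)
        postorder(node.right, out)
        out.append(node.val)
    return out

# build the example tree: 1 has children 2 and 3; 2 has children 4 and 5
root = TreeNode(1, TreeNode(2, TreeNode(4), TreeNode(5)), TreeNode(3))
print(inorder(root, []))    # [4, 2, 5, 1, 3]
print(preorder(root, []))   # [1, 2, 4, 5, 3]
print(postorder(root, []))  # [4, 5, 2, 3, 1]
```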