- A O(1)
- B O(log n)
- C O(n)
- D O(n log n)
Binary heaps are complete binary trees commonly used to implement priority queues. In the worst case, when inserting an element into a binary heap, the new element must be compared with its parent and swapped to maintain the heap property, repeating up the tree toward the root. Since the height of a binary heap is logarithmic in the number of elements (n), the worst-case insertion has a time complexity of O(log n).
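As a minimal sketch of that sift-up step (the function name and the choice of a min-heap stored in a list are illustrative assumptions, not part of the question):

```python
def heap_insert(heap, value):
    """Insert value into a min-heap stored as a list, sifting up in O(log n)."""
    heap.append(value)          # place at the first free leaf
    i = len(heap) - 1
    while i > 0:
        parent = (i - 1) // 2   # index of the parent node
        if heap[i] < heap[parent]:
            heap[i], heap[parent] = heap[parent], heap[i]  # swap upward
            i = parent
        else:
            break               # heap property restored
    return heap
```

Each loop iteration moves up one level, so at most ⌈log₂ n⌉ comparisons are made.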
Depth-first search (DFS) is a graph traversal algorithm that explores as far as possible along each branch before backtracking. In the worst case, DFS visits each vertex once and each edge once, resulting in a time complexity proportional to the sum of the number of vertices (V) and the number of edges (E). Therefore, the time complexity of DFS is O(V + E).
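An iterative DFS sketch over an adjacency-list graph (the dict-of-lists representation is an assumption for illustration); each vertex is pushed and visited at most once, and each edge is examined once, giving O(V + E):

```python
def dfs(graph, start):
    """Iterative depth-first traversal; returns vertices in visit order."""
    visited, order, stack = set(), [], [start]
    while stack:
        v = stack.pop()
        if v in visited:
            continue
        visited.add(v)
        order.append(v)
        # reversed() so neighbors are explored in their listed order
        for w in reversed(graph.get(v, [])):
            if w not in visited:
                stack.append(w)
    return order
```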
In a stack implemented using a singly linked list, the pop operation removes the top element from the stack. Since the linked list maintains a reference to the top node, the pop operation can be performed in constant time by updating the reference to the next node. Thus, the time complexity of the pop operation is O(1).
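A sketch of such a stack (class and method names are illustrative); pop touches only the top reference, so it is O(1) regardless of stack size:

```python
class Node:
    def __init__(self, value, next=None):
        self.value, self.next = value, next

class LinkedStack:
    """Stack backed by a singly linked list; push and pop are both O(1)."""
    def __init__(self):
        self.top = None
    def push(self, value):
        self.top = Node(value, self.top)   # new node points at old top
    def pop(self):
        if self.top is None:
            raise IndexError("pop from empty stack")
        value = self.top.value
        self.top = self.top.next           # single pointer update: O(1)
        return value
```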
In a breadth-first search (BFS) algorithm on a tree, each node is visited exactly once, and each edge is traversed once. As a tree has N-1 edges for N nodes, the time complexity of BFS on a tree is proportional to the number of nodes, denoted as O(N).
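A level-order sketch for a tree given as a child-list mapping (that representation is an assumption); each node is enqueued and dequeued exactly once, so the traversal is O(N):

```python
from collections import deque

def bfs_tree(tree, root):
    """Level-order traversal of a tree {node: [children]} in O(N)."""
    order, queue = [], deque([root])
    while queue:
        node = queue.popleft()     # deque gives O(1) dequeue
        order.append(node)
        queue.extend(tree.get(node, []))
    return order
```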
In a linear search algorithm for finding the maximum element in an unsorted array, each element is checked sequentially to determine the maximum. In the worst case, the maximum element is located at the end of the array, requiring n comparisons. Therefore, the time complexity of the linear search for finding the maximum element is linear, denoted as O(n).
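The scan can be sketched as below (function name is illustrative); the loop performs exactly n-1 comparisons, so the work is linear even in the worst case:

```python
def find_max(arr):
    """Return the maximum of an unsorted array with a single O(n) pass."""
    if not arr:
        raise ValueError("empty array")
    best = arr[0]
    for x in arr[1:]:     # one comparison per remaining element
        if x > best:
            best = x
    return best
```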
In the worst case, a hash table insertion may trigger rehashing or resizing of the underlying array, which involves copying every stored element. As a result, the worst-case time complexity of a single insertion is linear in the number of elements (n), denoted as O(n), even though insertions are O(1) amortized.
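A toy chained hash set illustrating that resize (class name, load-factor threshold of 1, and doubling policy are all illustrative assumptions): the one insertion that crosses the threshold pays an O(n) rehash of every element.

```python
class ToyHashSet:
    """Minimal chained hash set; add() is O(1) amortized, O(n) on a resize."""
    def __init__(self, capacity=4):
        self.buckets = [[] for _ in range(capacity)]
        self.size = 0

    def add(self, key):
        i = hash(key) % len(self.buckets)
        if key in self.buckets[i]:
            return
        self.buckets[i].append(key)
        self.size += 1
        if self.size > len(self.buckets):   # load factor exceeds 1
            self._resize()

    def _resize(self):
        old = self.buckets
        self.buckets = [[] for _ in range(2 * len(old))]
        for bucket in old:                  # rehash every element: O(n)
            for key in bucket:
                self.buckets[hash(key) % len(self.buckets)].append(key)

    def __contains__(self, key):
        return key in self.buckets[hash(key) % len(self.buckets)]
```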
In a binary heap, the delete operation removes the root element, which is the minimum or maximum depending on the heap's type. After removing the root, the last element is moved into the root position and percolated down, comparing and swapping with its children at each level, to restore the heap property. Since the height of a binary heap is logarithmic in the number of elements (n), the worst-case time complexity of a delete operation is O(log n).
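A sketch of that sift-down step for a min-heap stored in a list (function name and min-heap choice are illustrative assumptions):

```python
def heap_pop(heap):
    """Remove and return the minimum of a list-based min-heap in O(log n)."""
    last = heap.pop()            # take the final leaf
    if not heap:
        return last              # it was the only element
    root, heap[0] = heap[0], last  # move the last leaf to the root
    i, n = 0, len(heap)
    while True:
        left, right, smallest = 2 * i + 1, 2 * i + 2, i
        if left < n and heap[left] < heap[smallest]:
            smallest = left
        if right < n and heap[right] < heap[smallest]:
            smallest = right
        if smallest == i:
            break                # heap property restored
        heap[i], heap[smallest] = heap[smallest], heap[i]
        i = smallest             # descend one level per iteration
    return root
```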
Quicksort is a comparison-based sorting algorithm that partitions the array into two subarrays around a pivot element, recursively sorts the subarrays, and combines them. In the worst case, if the pivot selection consistently produces unbalanced partitions, quicksort degrades to quadratic time. This occurs when the pivot is always the minimum or maximum element, leading to n levels of recursion over subarrays that shrink by only one element each, for a total time complexity of O(n^2).
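A sketch using a first-element pivot (an illustrative choice that makes the worst case easy to trigger): on an already-sorted input the "left" partition is always empty, so there are n recursive levels, each scanning the remainder.

```python
def quicksort(arr):
    """Quicksort with a first-element pivot; sorted input hits the O(n^2) case."""
    if len(arr) <= 1:
        return arr
    pivot, rest = arr[0], arr[1:]
    left = [x for x in rest if x < pivot]    # empty when input is sorted
    right = [x for x in rest if x >= pivot]
    return quicksort(left) + [pivot] + quicksort(right)
```

Randomized or median-of-three pivot selection avoids this degenerate case in practice.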
Merge sort is a divide-and-conquer sorting algorithm that divides the array into smaller subarrays, recursively sorts them, and merges them to obtain the sorted array. In each recursion, the array is divided in half, resulting in a logarithmic number of levels. At each level, the merge operation combines two sorted subarrays, which takes linear time. Therefore, the time complexity of merge sort in the worst case is O(n log n).
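A compact sketch of both halves of the argument (function name is illustrative): the recursion halves the array, giving log n levels, and the merge loop does linear work per level.

```python
def merge_sort(arr):
    """Divide-and-conquer sort: log n levels of halving, O(n) merge per level."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left, right = merge_sort(arr[:mid]), merge_sort(arr[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # linear two-pointer merge
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]      # append the leftover tail
```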
In a breadth-first search (BFS) algorithm on a directed acyclic graph (DAG), each vertex is visited once, and each edge is traversed once. The time complexity of BFS on a DAG is proportional to the sum of the number of vertices (V) and the number of edges (E), denoted as O(V + E).
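The general-graph version can be sketched as follows (adjacency-list dict is an assumed representation); marking vertices when enqueued ensures each vertex is processed once and each edge examined once, for O(V + E):

```python
from collections import deque

def bfs(graph, start):
    """Breadth-first traversal of a directed graph in O(V + E)."""
    visited, order = {start}, []
    queue = deque([start])
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in graph.get(v, []):   # each edge inspected exactly once
            if w not in visited:
                visited.add(w)       # mark on enqueue to avoid duplicates
                queue.append(w)
    return order
```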