Algorithms & Programming Fundamentals
If an algorithm has a Big O notation of , what happens to the execution time as the number of inputs (n) increases?
The execution time decreases inversely with n.
The execution time remains constant regardless of n.
The execution time increases quadratically.
The execution time increases linearly with n.
Considering parallel processing capabilities, which model enables maximum concurrency while avoiding data races when applied correctly?
Multithreaded execution within a single process context
Shared-memory model using mutex locks
Event-driven programming relying on I/O callbacks
Actor model coordinating via message passing
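The actor model (the message-passing option above) can be sketched in a few lines of Python. This is a minimal illustration, not a full actor framework; the class and message names are illustrative. The key point is that the counter's state is touched by exactly one thread, so no locks are needed on it:

```python
import threading
import queue

class CounterActor:
    """A tiny actor: private state plus a mailbox, processed by one thread."""
    def __init__(self):
        self.mailbox = queue.Queue()   # thread-safe FIFO of messages
        self.count = 0                 # private state; only the actor thread mutates it
        self.thread = threading.Thread(target=self._run)
        self.thread.start()

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg == "stop":
                break
            if msg == "increment":
                self.count += 1        # no data race: single consumer thread

    def send(self, msg):
        # Other threads interact only by sending messages, never by
        # touching self.count directly.
        self.mailbox.put(msg)

actor = CounterActor()
for _ in range(1000):
    actor.send("increment")
actor.send("stop")
actor.thread.join()
print(actor.count)  # 1000
```

Because all mutation funnels through one mailbox, correctness does not depend on lock discipline, which is what the question means by avoiding data races "when applied correctly".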
Why would one prefer a linear search over binary search for small data sets?
As linear searches can only operate on unsorted data, removing the need for initial sorting.
Because linear search uses less memory during operation due to the absence of recursive calls.
Since all elements must be checked anyway, linear searches find targets quicker if they tend to appear early in the list.
Because overhead costs make binary search less effective on small lists despite its lower Big O notation value.
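The overhead argument in the last option can be made concrete: linear search does trivial work per step and needs no sorted input, while binary search requires sorting plus extra bookkeeping per step. A minimal sketch (helper names are illustrative):

```python
import bisect

def linear_search(items, target):
    # Scans left to right; no sorting required, minimal per-step overhead.
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

def bisect_search(sorted_items, target):
    # Requires sorted input; halves the range each step (O(log n)),
    # but each step carries index arithmetic and bounds checks.
    i = bisect.bisect_left(sorted_items, target)
    if i < len(sorted_items) and sorted_items[i] == target:
        return i
    return -1

data = [7, 3, 9, 1]                    # small, unsorted
print(linear_search(data, 9))          # 2
print(bisect_search(sorted(data), 9))  # index in the *sorted* copy: 3
```

For a handful of elements, the sorting step and per-iteration overhead can easily outweigh binary search's asymptotic advantage.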
What kind of search is more efficient for finding an item in a sorted list?
Random search
Binary search
Linear search
Unordered search
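Binary search's halving strategy on a sorted list can be sketched as follows (a standard iterative implementation, not taken from the source):

```python
def binary_search(sorted_list, target):
    # Repeatedly halve the candidate range; O(log n) comparisons.
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_list[mid] == target:
            return mid
        if sorted_list[mid] < target:
            lo = mid + 1     # target must be in the right half
        else:
            hi = mid - 1     # target must be in the left half
    return -1                # not present

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```

Each comparison discards half of the remaining candidates, which is why sortedness is the precondition that makes this efficiency possible.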
Given a scenario where an application requires real-time data transmission with minimal delay, which Internet protocol ensures the highest efficiency while maintaining reliability?
HTTP (Hypertext Transfer Protocol)
RTP (Real-time Transport Protocol)
FTP (File Transfer Protocol)
SMTP (Simple Mail Transfer Protocol)
Which term best describes an algorithm designed to sort items in ascending order that compares each pair of adjacent items and swaps them if they are in the wrong order?
Merge Sort
Binary Search
Quick Sort
Bubble Sort
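Bubble sort, the adjacent-compare-and-swap algorithm the question describes, can be sketched as:

```python
def bubble_sort(items):
    # Compare each adjacent pair and swap when out of order; each pass
    # "bubbles" the largest remaining value to the end of the list.
    items = list(items)  # sort a copy, leave the input untouched
    n = len(items)
    for end in range(n - 1, 0, -1):
        swapped = False
        for i in range(end):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        if not swapped:      # no swaps means already sorted: early exit
            break
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```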
Given four algorithms with respective Big-O notations as follows: O(N), O(√N), O(2^N), and O(N log N). Which would exhibit the slowest growth rate for large values of N when comparing worst-case scenarios?
O(N)
O(√N)
O(2^N)
O(N log N)
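The four growth rates can be compared numerically; a small sketch (the table of functions is illustrative) shows O(√N) growing slowest while O(2^N) explodes:

```python
import math

# Evaluate each growth function at increasing N and print the values.
growth = {
    "O(sqrt N)":  lambda n: math.sqrt(n),
    "O(N)":       lambda n: n,
    "O(N log N)": lambda n: n * math.log2(n),
    "O(2^N)":     lambda n: 2 ** n,
}
for n in (10, 20, 30):
    print(n, {name: round(f(n), 1) for name, f in growth.items()})
# At every N shown: sqrt(N) < N < N log N < 2^N
```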

What is the expected running-time complexity of deleting all items from a doubly linked list, without access to the previous pointer, by iterating through the list?
Singly linked lists require traversal of the entire data structure.
Performance degrades due to comparisons needed to find the right placement; hence worst case becomes quadratic.
Even though accessing the next node is a simple operation, it still involves visiting each element.
Removing the front or back does not affect the remaining nodes; thus it is constant.
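Deleting every node by walking forward does O(1) work per node and therefore O(n) overall, even with no previous pointer. A minimal sketch (class and helper names are illustrative):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

def delete_all(head):
    # Walk the list once, unlinking each node as we go: O(1) per node,
    # O(n) total. The previous pointer is never needed.
    count = 0
    while head is not None:
        nxt = head.next
        head.next = None   # detach so the node can be reclaimed
        head = nxt
        count += 1
    return count           # number of nodes removed

# Build 1 -> 2 -> 3, then delete everything.
head = Node(1); head.next = Node(2); head.next.next = Node(3)
print(delete_all(head))  # 3
```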
Which of the following best explains why abstraction is important for managing complexity in software development?
It helps developers focus on high-level concepts without worrying about low-level details.
It ensures that all possible cases are handled explicitly, making programs longer but easier to understand.
It allows programmers to write code that can be understood only by experts in a given field.
It requires developers to use more complex syntax and structures to improve program efficiency.
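The first option can be illustrated with a tiny example: the caller works in terms of a high-level operation and never sees the low-level mechanics (the function name is illustrative):

```python
def average(numbers):
    # Low-level details (summation, counting) are hidden behind one name.
    return sum(numbers) / len(numbers)

# High-level code reads in terms of the problem, not the mechanics.
print(average([80, 90, 100]))  # 90.0
```

Swapping in a different internal computation would not change any calling code, which is the complexity-management benefit the question is after.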
What advantage does an abstract data type (ADT) provide when determining the efficiency of algorithms?
It allows programmers to consider operations without needing specific implementation details, facilitating comparisons between different algorithms' efficiencies.
By requiring detailed knowledge of how data is represented internally, leading to better-optimized code.
ADTs mandate adding extra layers of complexity.
The explicit definition associated with ADTs forces algorithm designers into tighter constraints, decreasing efficiency potential.
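The first option's point can be sketched with a stack ADT: the operations (push, pop, is_empty) are fixed, so two representations can be compared for efficiency without the client code caring which is underneath. Class names here are illustrative:

```python
class ListStack:
    """Stack backed by a dynamic array."""
    def __init__(self):
        self._items = []
    def push(self, x):
        self._items.append(x)       # amortized O(1), occasional resize
    def pop(self):
        return self._items.pop()    # O(1)
    def is_empty(self):
        return not self._items

class LinkedStack:
    """Stack backed by linked (value, rest) pairs."""
    def __init__(self):
        self._head = None
    def push(self, x):
        self._head = (x, self._head)  # O(1), never resizes
    def pop(self):
        x, self._head = self._head    # O(1)
        return x
    def is_empty(self):
        return self._head is None

# Client code depends only on the ADT's operations, not the representation.
for stack in (ListStack(), LinkedStack()):
    stack.push(1); stack.push(2)
    print(stack.pop(), stack.pop())  # 2 1
```

Because both classes honor the same interface, an algorithm using a stack can be analyzed once in terms of push/pop costs and then paired with whichever implementation is more efficient for the workload.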