Algorithms and Problem Solving
Subject: Computer Science
Topic: 6
Cambridge Code: 0478
Algorithm Concept
Algorithm - Step-by-step procedure to solve a problem
Algorithm Properties
- Finiteness: Must end
- Definiteness: Each step clear
- Input: Receives data
- Output: Produces result
- Effectiveness: Achievable
Searching Algorithms
Linear Search
Linear search - Check each element sequentially
def linear_search(items, target):
    for i in range(len(items)):
        if items[i] == target:
            return i
    return -1  # Not found
Characteristics:
- Works on unsorted lists
- Time: O(n) - checks all elements
- Simple but slow for large lists
Example: [3, 1, 4, 1, 5], search for 4
- Check 3: No
- Check 1: No
- Check 4: Yes → Return position 2 (0-based)
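The trace above can be sketched directly in Python (variable names are illustrative):

```python
# Trace a linear search for 4 in [3, 1, 4, 1, 5].
data = [3, 1, 4, 1, 5]
target = 4
position = -1
checks = []                   # values compared, in order
for i, value in enumerate(data):
    checks.append(value)
    if value == target:
        position = i          # found: remember the index
        break
# Three comparisons (3, 1, 4) before the match at index 2
```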
Binary Search
Binary search - Divide and conquer approach
Requirement: List must be sorted
def binary_search(items, target):
    left = 0
    right = len(items) - 1
    while left <= right:
        mid = (left + right) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            left = mid + 1  # Search right half
        else:
            right = mid - 1  # Search left half
    return -1  # Not found
Characteristics:
- Only works on sorted lists
- Time: O(log n) - much faster
- Example: [1, 3, 4, 5, 8], search for 4
- Check middle (4): Found → Return
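The example can be traced with a small sketch that records which indices are probed (names are illustrative):

```python
# Trace the binary search example: [1, 3, 4, 5, 8], target 4.
data = [1, 3, 4, 5, 8]
target = 4
left, right = 0, len(data) - 1
visited = []                  # indices examined, in order
found = -1
while left <= right:
    mid = (left + right) // 2
    visited.append(mid)
    if data[mid] == target:
        found = mid
        break
    elif data[mid] < target:
        left = mid + 1
    else:
        right = mid - 1
# First probe: mid = (0 + 4) // 2 = 2, and data[2] == 4, so one check suffices
```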
Comparison
| Feature | Linear | Binary |
|---|---|---|
| Sorted required | No | Yes |
| Time complexity | O(n) | O(log n) |
| Data structure | Any sequence | Sorted array (random access) |
| Implementation | Simple | More complex |
| Speed | Slow | Fast |
Sorting Algorithms
Bubble Sort
Bubble sort - Compare adjacent elements, swap if needed
def bubble_sort(items):
    n = len(items)
    for i in range(n):
        for j in range(0, n - i - 1):
            if items[j] > items[j + 1]:
                # Swap adjacent elements
                items[j], items[j + 1] = items[j + 1], items[j]
    return items
Process:
- Compare adjacent pairs
- Swap if out of order
- Repeat until sorted
- Each pass bubbles largest to end
Example: [3, 1, 4, 2]
- Pass 1: [1, 3, 2, 4]
- Pass 2: [1, 2, 3, 4]
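The pass-by-pass states can be checked with a sketch that records the list after every full pass (the helper name is illustrative):

```python
def bubble_passes(items):
    # Return the list state after each full pass of bubble sort.
    items = list(items)
    states = []
    n = len(items)
    for i in range(n - 1):
        for j in range(n - i - 1):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
        states.append(list(items))
    return states
```

For [3, 1, 4, 2] the first pass gives [1, 3, 2, 4] and the second gives [1, 2, 3, 4], matching the example.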
Characteristics:
- Time: O(n²)
- Space: O(1)
- Simple but inefficient
- Stable sort
Insertion Sort
Insertion sort - Build sorted array one element at a time
def insertion_sort(items):
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        while j >= 0 and items[j] > key:
            items[j + 1] = items[j]  # Shift larger elements right
            j -= 1
        items[j + 1] = key  # Insert key in its place
    return items
Process:
- Start with second element
- Compare with sorted portion
- Insert in correct position
- Move to next element
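The process can be illustrated with a sketch that records the list after each insertion (the helper name is illustrative):

```python
def insertion_states(items):
    # Record the list after each element is inserted into the sorted portion.
    items = list(items)
    states = []
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        while j >= 0 and items[j] > key:
            items[j + 1] = items[j]   # shift larger elements right
            j -= 1
        items[j + 1] = key            # insert key in its place
        states.append(list(items))
    return states
```

For [3, 1, 4, 2]: inserting 1 gives [1, 3, 4, 2], 4 is already in place, and inserting 2 gives [1, 2, 3, 4].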
Characteristics:
- Time: O(n²) worst case
- Space: O(1)
- Efficient for small lists
- Stable sort
Quick Sort
Quick sort - Divide and conquer
def quick_sort(items):
    if len(items) <= 1:
        return items
    pivot = items[len(items) // 2]
    left = [x for x in items if x < pivot]
    middle = [x for x in items if x == pivot]
    right = [x for x in items if x > pivot]
    return quick_sort(left) + middle + quick_sort(right)
Process:
- Choose pivot
- Partition into smaller/larger
- Recursively sort partitions
- Combine
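A single partition step can be shown on a small example (data values are illustrative):

```python
# One partition step on [3, 1, 4, 2, 5] with the middle element as pivot.
data = [3, 1, 4, 2, 5]
pivot = data[len(data) // 2]              # pivot = 4
left = [x for x in data if x < pivot]     # elements below the pivot
middle = [x for x in data if x == pivot]  # the pivot (and any duplicates)
right = [x for x in data if x > pivot]    # elements above the pivot
```

Here left is [3, 1, 2], middle is [4], and right is [5]; each half is then sorted recursively.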
Characteristics:
- Time: O(n log n) average, O(n²) worst case
- Space: O(log n) recursion depth for an in-place version (the list-building version above uses O(n) extra space)
- Very efficient for large lists
- Not stable (in the usual in-place implementations)
Merge Sort
Merge sort - Divide and merge
def merge_sort(items):
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])   # append any leftovers
    result.extend(right[j:])
    return result
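The merge step itself can be traced on two small sorted halves (values are illustrative):

```python
# Merging two sorted halves, as the merge step does.
left, right = [1, 4], [2, 3]
result = []
i = j = 0
while i < len(left) and j < len(right):
    if left[i] <= right[j]:
        result.append(left[i])
        i += 1
    else:
        result.append(right[j])
        j += 1
result.extend(left[i:])   # append any leftovers
result.extend(right[j:])
# Takes 1, then 2, then 3, then the leftover 4
```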
Characteristics:
- Time: O(n log n) always
- Space: O(n)
- Reliable, predictable
- Stable sort
Time Complexity (Big O Notation)
Big O - Measure of algorithm efficiency relative to input size
Common Complexities
| Notation | Name | Operations for n=1000 |
|---|---|---|
| O(1) | Constant | 1 |
| O(log n) | Logarithmic | ~10 |
| O(n) | Linear | 1,000 |
| O(n log n) | Linearithmic | ~10,000 |
| O(n²) | Quadratic | 1,000,000 |
| O(n³) | Cubic | 10⁹ |
| O(2ⁿ) | Exponential | ~10³⁰¹ |
| O(n!) | Factorial | Far larger still |
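The smaller entries in the table can be checked directly (constant factors are ignored, as Big O does):

```python
import math

# Rough operation counts for n = 1000.
n = 1000
ops_log = round(math.log2(n))        # O(log n): about 10
ops_nlogn = round(n * math.log2(n))  # O(n log n): about 10,000
ops_quadratic = n ** 2               # O(n²): 1,000,000
```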
Analysis
- Worst case: Maximum operations
- Best case: Minimum operations
- Average case: Typical scenario
Example (Bubble sort):
- Worst case: O(n²)
- Best case: O(n) if already sorted (requires an early-exit check that stops when a pass makes no swaps)
- Average: O(n²)
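A sketch of bubble sort with an early-exit flag; a full pass with no swaps proves the list is sorted, which is what gives the O(n) best case (the function name is illustrative):

```python
def bubble_sort_optimized(items):
    # Bubble sort with an early-exit flag.
    items = list(items)
    n = len(items)
    for i in range(n - 1):
        swapped = False
        for j in range(n - i - 1):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:
            break  # no swaps in a full pass: already sorted
    return items
```

On an already-sorted list the first pass makes no swaps, so the loop exits after n - 1 comparisons.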
Choosing an Algorithm
Consider:
- Speed: Time complexity
- Memory: Space complexity
- Implementation: Coding complexity
- Stability: Order preservation
- Input data: Size, and whether it is sorted, nearly sorted, or random
Guidelines:
- Small lists: Bubble or insertion sort
- Large lists: Quick or merge sort
- Nearly sorted: Insertion sort
- Stability needed: Merge or insertion
- Unknown data: Quick sort (good average)
Recursion
Recursion - Function calling itself
def factorial(n):
    if n == 0 or n == 1:
        return 1  # Base case
    else:
        return n * factorial(n - 1)  # Recursive case
Elements:
- Base case: Stops recursion
- Recursive case: Calls itself
- Termination: Must eventually reach base
Example: factorial(5)
- 5 × factorial(4)
- 5 × 4 × factorial(3)
- 5 × 4 × 3 × factorial(2)
- 5 × 4 × 3 × 2 × factorial(1)
- 5 × 4 × 3 × 2 × 1 = 120
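The chain of calls can be made visible by instrumenting the function to record each argument (an illustrative sketch, not part of the standard definition):

```python
# Record the argument of each recursive call.
calls = []

def factorial(n):
    calls.append(n)     # log this call
    if n <= 1:          # base case
        return 1
    return n * factorial(n - 1)

result = factorial(5)
# calls is [5, 4, 3, 2, 1]; result is 120
```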
Key Points
- Linear search: Simple, O(n), unsorted data
- Binary search: Fast O(log n), requires sorted data
- Bubble sort: O(n²), simple, inefficient
- Insertion sort: O(n²), good for small lists
- Quick sort: O(n log n) average, very efficient
- Merge sort: O(n log n) guaranteed, stable
- Big O notation measures complexity
- Choose algorithm based on data and requirements
Practice Questions
- Implement linear search
- Trace binary search
- Sort by hand (bubble, insertion)
- Analyze time complexity
- Choose appropriate algorithm
- Optimize algorithm
- Trace recursive function
Revision Tips
- Know sorting algorithms thoroughly
- Practice tracing algorithms
- Understand Big O notation
- Know algorithm trade-offs
- Compare search methods
- Recognize ideal use cases
- Practice implementations