Keyword Argument:
A programming language feature, best known from Python, where an argument to a function is made optional by giving it a default value.
def fibonacci(iterations=10):
    lst = []
    a, b = 1, 1
    for _ in range(iterations):
        lst.append(a)
        a, b = b, a + b
    return lst
fibonacci() # list of length 10.
fibonacci(iterations=20) # list of length 20.
Binary Search:
One of the classic algorithms students tend to learn in school and higher education. Rather than searching a sorted list for an item element-by-element...
def linear_search(lst, target):
    for index, item in enumerate(lst):
        if item == target:
            return index
    else:
        return -1  # Not found.
Fun fact: for/while loops can have an else clause, which runs once all iterations are done but is skipped if a break occurs. Totally pointless here, but it can be useful!
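To see the else clause in action, here’s a tiny standalone sketch (the function and the numbers are just made up for illustration):
def first_negative(numbers):
    # The else clause only runs if the loop finishes without hitting break.
    for n in numbers:
        if n < 0:
            print(f'Found a negative number: {n}')
            break
    else:
        print('No negative numbers; the loop ran to completion.')

first_negative([3, 1, 4])   # No break, so the else message prints.
first_negative([3, -1, 4])  # break fires, so the else is skipped.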
...we instead check the mid-point of the list, then readjust our search based on whether the element is higher or lower than the target.
def binary_search(lst, target):
    low = 0
    high = len(lst) - 1
    while low <= high:
        # Divide and round down to the nearest whole number.
        mid = (high + low) // 2
        if lst[mid] == target:
            return mid
        elif lst[mid] > target:
            # Middle element is too big, so the target must be in the lower half.
            high = mid - 1
        else:
            low = mid + 1
    else:
        return -1  # Not found.
I’m not going to teach time complexity here, but this is crazy efficient: every comparison rules out half of the remaining list. The downside is that the list must be sorted, and sorting tends to be pretty slow, so probably only do this if the list is sorted anyway. Life is all about balance.
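To make “crazy efficient” a little more concrete, here’s a rough sketch (the counter functions are purely illustrative) that tallies how many elements each approach has to look at on the same sorted list:
def count_linear_comparisons(lst, target):
    # Count how many elements linear search looks at before stopping.
    comparisons = 0
    for item in lst:
        comparisons += 1
        if item == target:
            break
    return comparisons

def count_binary_comparisons(lst, target):
    # Count how many mid-points binary search inspects before stopping.
    comparisons = 0
    low, high = 0, len(lst) - 1
    while low <= high:
        comparisons += 1
        mid = (high + low) // 2
        if lst[mid] == target:
            break
        elif lst[mid] > target:
            high = mid - 1
        else:
            low = mid + 1
    return comparisons

sorted_list = list(range(1_000_000))
print(count_linear_comparisons(sorted_list, 999_999))  # 1000000
print(count_binary_comparisons(sorted_list, 999_999))  # Around 20.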
Amortising:
A property of some algorithms where an up-front computational cost is balanced out because later operations can take advantage of that one-time investment.
Let’s redefine binary search using keyword arguments:
def binary_search(lst, target, low=0, high=len(lst)-1):
    while low <= high:
        mid = (high + low) // 2
        if lst[mid] == target:
            return mid
        elif lst[mid] > target:
            high = mid - 1
        else:
            low = mid + 1
    return -1
We need to do some fiddling around, since the default for high doesn’t work: default values are evaluated when the function is defined, and lst doesn’t exist at that point, so Python raises a NameError.
def binary_search(lst, target, low=0, high=None):
    if high is None:
        high = len(lst) - 1
    # ...
If you wanted to shoehorn recursion into your program, you could use this definition to do that (see the sketch below), but IMO this isn’t the place for it. Technically the loop is already doing what recursion would do, repeatedly shrinking the same problem, just not with the “function-calling-itself” recursion we are used to.
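For completeness, here’s roughly what that shoehorned recursive version could look like, as a sketch built on the keyword-argument definition above:
def binary_search_recursive(lst, target, low=0, high=None):
    # Same logic as above, but the loop is replaced by a call to ourselves
    # with a narrower low/high window.
    if high is None:
        high = len(lst) - 1
    if low > high:
        return -1  # Not found.
    mid = (high + low) // 2
    if lst[mid] == target:
        return mid
    elif lst[mid] > target:
        return binary_search_recursive(lst, target, low, mid - 1)
    else:
        return binary_search_recursive(lst, target, mid + 1, high)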
The real benefit is that you can start your binary search at any point. Imagine you need to find two elements in a list, and you know one element is bigger than the other:
assert a < b
You could find the first element, then use this index as the low for the second:
idx_of_a = binary_search(lst, a)
if idx_of_a == -1:
    raise ValueError(f'`{a}` not found in list.')
idx_of_b = binary_search(lst, b, low=idx_of_a)
This gives a ‘rebate’ on the initial computation spent finding idx_of_a. In this case there likely wouldn’t be a noticeable difference, but every search you do on this list can potentially be sped up by using information you’ve had to discover anyway.
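As a sketch of that idea (the helper below is my own illustration, assuming the targets are known to be in ascending order and all present in the list), each search can start where the previous one left off:
def search_sorted_targets(lst, targets):
    # `targets` must be in ascending order, so every hit is at or after
    # the index of the previous hit; reuse that index as the new `low`.
    indices = []
    low = 0
    for target in targets:
        idx = binary_search(lst, target, low=low)
        if idx == -1:
            raise ValueError(f'`{target}` not found in list.')
        indices.append(idx)
        low = idx  # Amortise: the next search skips everything before this point.
    return indices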
The takeaway shouldn’t be that binary search can be amortised using keyword arguments, but rather that what you’ve just done can speed up what you’re about to do.
The next challenge should be to use this technique on something slower than binary search. I hate to admit it, but the ultra-efficient nature of binary search almost makes this entire article useless.