1. Advanced Data Structures
Collections Module
Core concept: collections gives smarter containers than normal list/dict.
Why useful: Faster and cleaner for counting, default values, and queue operations.
Remember: Counter = count, defaultdict = no missing-key error, deque = fast left/right operations.
from collections import Counter, defaultdict, deque
# 1) Counter: count frequency
print(Counter([1, 1, 2, 3])) # Counter({1: 2, 2: 1, 3: 1})
# 2) defaultdict: missing key gets default value automatically
d = defaultdict(int)
d["a"] += 1
print(d["a"]) # 1
# 3) deque: fast insert/remove on both sides
dq = deque([1,2,3])
dq.appendleft(0)
print(dq.pop()) # 3
Code breakdown: First line imports tools. Counter creates value-frequency map. defaultdict(int) gives 0 for missing key, so d["a"] += 1 works safely. deque allows appendleft and pop efficiently.
Output summary: Counter shows counts, defaultdict avoids key error, deque returns last element 3.
Real interview use: Most frequent element -> Counter(arr).most_common(1)[0][0] (most_common(1) returns a list like [(value, count)], so index into it for the value).
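A quick sketch of that pattern (arr is a made-up input list):

```python
from collections import Counter

arr = [3, 1, 3, 2, 3, 1]           # hypothetical input
top = Counter(arr).most_common(1)  # list of (value, count) pairs
print(top)                         # [(3, 3)]
most_frequent = top[0][0]
print(most_frequent)               # 3
```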
2. Itertools (Powerful for Interviews)
Core concept: itertools helps generate patterns (all orders, pairs, cartesian products).
Where to use: Backtracking, combinations, and brute-force problems.
Remember: permutation = order matters, combination = order does not matter.
import itertools
print(list(itertools.permutations([1,2,3], 2)))
# [(1,2), (1,3), (2,1), (2,3), (3,1), (3,2)]
print(list(itertools.combinations([1,2,3], 2)))
# [(1,2), (1,3), (2,3)]
print(list(itertools.product([1,2], [3,4])))
# [(1,3), (1,4), (2,3), (2,4)]
Code breakdown: permutations builds ordered selections, combinations builds unordered selections, and product builds all pairings from two lists.
Output summary: You get ordered pairs, unordered pairs, and cross-product pairs.
3. Generators
Core concept: Generator produces values one-by-one using yield.
Why useful: Saves memory for large data.
Think: list = store all now, generator = create next only when needed.
def count_up(n):
    for i in range(n):
        yield i
gen = count_up(5)
print(next(gen)) # 0
print(next(gen)) # 1
Code breakdown: Function does not return full list. It pauses at each yield. next(gen) resumes function from last pause and gives next value.
Output summary: first next() gives 0, second gives 1.
4. Decorators
What it means: A decorator adds extra behavior around a function (before/after call).
When to use: Logging, timing, authorization, validation.
Simple view: Decorator is a wrapper around your original function.
def decorator(func):
    def wrapper():
        print("Before")
        func()
        print("After")
    return wrapper

@decorator
def say_hello():
    print("Hello")
say_hello()
# Before
# Hello
# After
Code breakdown: decorator receives original function, returns wrapper. @decorator means say_hello = decorator(say_hello).
How this code works: @decorator replaces say_hello with wrapper. So when you call it, Python runs Before -> original function -> After.
Output summary: It prints three lines in order: Before, Hello, After.
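In practice, decorators usually forward arguments and keep the original function's metadata with functools.wraps. A minimal sketch (logged and add are made-up names):

```python
import functools

def logged(func):
    @functools.wraps(func)              # keep original name/docstring
    def wrapper(*args, **kwargs):
        print(f"calling {func.__name__}")
        result = func(*args, **kwargs)  # forward any arguments
        return result
    return wrapper

@logged
def add(a, b):
    return a + b

print(add(2, 3))      # logs the call, then prints 5
print(add.__name__)   # add (preserved by functools.wraps)
```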
5. *args and **kwargs
What it means: *args takes extra positional values, **kwargs takes extra named values.
When to use: Flexible functions where number of inputs can vary.
Remember: *args -> tuple, **kwargs -> dictionary.
def func(*args, **kwargs):
    print(args)
    print(kwargs)
func(1, 2, 3, name="Ram")
# (1, 2, 3)
# {'name': 'Ram'}
Code breakdown: Python packs all unnamed values into tuple args and all named values into dict kwargs.
How this code works: *args stores normal values into one tuple, and **kwargs stores named values into one dictionary.
Output summary: first line is tuple of positional arguments, second line is dictionary of named arguments.
6. Map, Filter, Reduce
Core concept: 3 tools for list processing in one line.
Use: map changes values, filter keeps matching values, reduce combines to one value.
Shortcut: map = change each item, filter = keep some items, reduce = one final value.
from functools import reduce
print(list(map(lambda x: x*2, [1,2,3]))) # [2, 4, 6]
print(list(filter(lambda x: x%2==0, [1,2,3,4]))) # [2, 4]
print(reduce(lambda a,b: a+b, [1,2,3,4])) # 10
Code breakdown: map applies function to each element, filter keeps only True results, reduce repeatedly combines values into one.
Output summary: transformed list, filtered list, and final sum.
7. Advanced OOP
Inheritance
What it means: Child class reuses parent behavior and can override methods.
Why useful: write common code once in parent, customize in child.
class Animal:
    def speak(self):
        print("Animal sound")

class Dog(Animal):
    def speak(self):
        print("Bark")
Dog().speak() # Bark
Code breakdown: Dog(Animal) inherits parent methods, but its own speak overrides parent behavior.
How this code works: Dog inherits from Animal but overrides speak(), so Dog's method runs instead of parent method.
Output summary: prints Bark.
Dunder Methods
What it means: Special methods that define object behavior in Python.
Most asked: __init__ for setup, __str__ for print, __eq__ for object comparison.
class Person:
    def __init__(self, name):
        self.name = name
    def __str__(self):
        return self.name
print(Person("Ram")) # Ram
Code breakdown: object is created with name. On print(obj), Python calls obj.__str__().
How this code works: __init__ stores value when object is created. __str__ controls what is shown when you print object.
Output summary: printing object shows Ram instead of memory address.
Important:
- __init__
- __str__
- __repr__
- __len__
- __eq__
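The other dunders on the list work the same way. A minimal sketch (Team is a made-up class):

```python
class Team:
    def __init__(self, members):
        self.members = members
    def __repr__(self):        # unambiguous debug string
        return f"Team({self.members!r})"
    def __len__(self):         # enables len(team)
        return len(self.members)
    def __eq__(self, other):   # compare by value, not identity
        return isinstance(other, Team) and self.members == other.members

t = Team(["Ram", "Sita"])
print(repr(t))                     # Team(['Ram', 'Sita'])
print(len(t))                      # 2
print(t == Team(["Ram", "Sita"]))  # True
```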
8. Class vs Static Methods
Class method: Works with class (cls). Good for alternate constructors.
Static method: Utility method inside class, no self or cls.
Rule: Need class data? classmethod. Just helper logic? staticmethod.
class MyClass:
    @classmethod
    def cls_method(cls):
        return cls.__name__
    @staticmethod
    def static_method():
        return "helper"
print(MyClass.cls_method()) # MyClass
print(MyClass.static_method()) # helper
Code breakdown: cls_method can read class details (cls.__name__). static_method behaves like normal helper function inside class namespace.
How this code works: classmethod receives class (cls) automatically, staticmethod receives nothing automatically.
Output summary: first call returns class name, second returns helper string.
9. Context Managers
What it means: Handles setup/cleanup automatically with with.
When to use: Files, database connections, locks, resources.
Benefit: even on error, cleanup still runs.
class FileHandler:
    def __enter__(self):
        print("Start")
        return self
    def __exit__(self, exc_type, exc_val, exc_tb):
        print("End")

with FileHandler():
    print("Inside")
# Start
# Inside
# End
Code breakdown: with starts resource with __enter__, runs main block, then always calls __exit__ for cleanup.
How this code works: entering with calls __enter__, leaving block calls __exit__, even if error happens.
Output summary: start prints first, inside block prints second, cleanup prints at end.
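The same enter/exit behavior can also be written with contextlib's @contextmanager decorator. A sketch (managed is a made-up name; events are collected in a list so the order is easy to check):

```python
from contextlib import contextmanager

events = []

@contextmanager
def managed(name):
    events.append("start")      # setup (runs on entering 'with')
    try:
        yield name              # value bound by 'as'
    finally:
        events.append("end")    # cleanup runs even if an error occurs

with managed("res") as r:
    events.append(f"inside:{r}")

print(events)  # ['start', 'inside:res', 'end']
```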
10. Multithreading vs Multiprocessing
Threading: Best for I/O tasks (API calls, file/network wait).
Multiprocessing: Best for CPU-heavy tasks (math, image processing).
Fast choice in interview: waiting task -> threads, heavy computation -> processes.
import threading
import multiprocessing
# rule:
# Threading -> I/O bound
# Multiprocessing -> CPU bound
Code breakdown: threads let one task run while another waits on I/O, so waiting operations overlap; processes sidestep the GIL and use multiple CPU cores for heavy calculations.
How this works in practice: use threading when the program mostly waits (network/file). use multiprocessing when CPU calculations are heavy.
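A minimal threading sketch for the I/O-bound case (the sleep stands in for a network/file wait; fetch is a made-up name). Five waits of 0.1s overlap, so total time stays near 0.1s instead of 0.5s:

```python
import threading
import time

results = []

def fetch(i):
    time.sleep(0.1)        # stands in for an I/O wait (API/file)
    results.append(i)

start = time.perf_counter()
threads = [threading.Thread(target=fetch, args=(i,)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()               # wait for all threads to finish
elapsed = time.perf_counter() - start

print(sorted(results))     # [0, 1, 2, 3, 4]
# elapsed is roughly 0.1s, not 5 * 0.1s, because the waits overlap
```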
11. Async Programming
Core concept: async/await runs many waiting tasks without blocking program flow.
When to use: APIs, scraping, bots, chat systems.
Important: Async is great for many waiting operations, not raw CPU speed.
import asyncio
async def main():
    print("Hello")
    await asyncio.sleep(1)
    print("World")
asyncio.run(main())
# Hello
# (wait 1 second)
# World
Code breakdown: async def defines coroutine, await pauses current task without blocking loop, and asyncio.run starts event loop.
Output summary: prints Hello, waits 1 second, then prints World.
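To show the "many waiting tasks" point, asyncio.gather runs several coroutines concurrently. A sketch (task is a made-up coroutine; both 0.1s waits overlap):

```python
import asyncio
import time

async def task(name, delay):
    await asyncio.sleep(delay)   # non-blocking wait
    return name

async def main():
    # both waits run concurrently instead of back to back
    return await asyncio.gather(task("a", 0.1), task("b", 0.1))

start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start

print(results)   # ['a', 'b']
# elapsed is roughly 0.1s, not 0.2s, because the waits overlap
```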
12. Memory Optimization
Goal: Use less memory for faster and scalable programs.
Simple strategy: avoid creating big temporary lists when you can stream values.
import sys
print(sys.getsizeof([1,2,3])) # memory used by list object
Code breakdown: getsizeof checks object memory footprint. Use it to compare memory cost of list, tuple, dict, etc.
How this code works: getsizeof shows bytes used by object in memory. helpful for comparing list vs generator choices.
Output summary: prints a number (bytes). exact value may differ by Python version/system.
Tips:
- Use generators instead of full lists for large loops.
- Use __slots__ in classes to reduce object memory.
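Both tips can be checked with getsizeof. A sketch (Point is a made-up class; exact byte counts vary by Python version):

```python
import sys

big_list = [i for i in range(100_000)]  # all values stored up front
big_gen = (i for i in range(100_000))   # values produced on demand
print(sys.getsizeof(big_list))  # hundreds of kilobytes
print(sys.getsizeof(big_gen))   # a small constant, regardless of range size

class Point:
    __slots__ = ("x", "y")      # drops the per-instance __dict__
    def __init__(self, x, y):
        self.x, self.y = x, y

p = Point(1, 2)
print(p.x, p.y)                 # 1 2
# p has no __dict__, so each instance is smaller
```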
13. Time Complexity Must Know
Why this matters: Helps choose fast data structures in coding rounds.
Interview tip: if repeated lookup is needed, prefer set/dict over list search.
| Operation | Complexity |
| --- | --- |
| List access | O(1) |
| List search | O(n) |
| Dict access | O(1) |
| Set lookup | O(1) |
| Sort | O(n log n) |
How to use this table: if operation repeats many times, choose data structure with lower complexity (for lookup, prefer dict/set over list).
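A quick way to feel the table with timeit (sizes and repeat counts are arbitrary):

```python
import timeit

data = list(range(10_000))
data_set = set(data)

# repeated membership test: O(n) per lookup on a list, O(1) on a set
list_time = timeit.timeit(lambda: 9_999 in data, number=1_000)
set_time = timeit.timeit(lambda: 9_999 in data_set, number=1_000)

print(set_time < list_time)  # set wins on repeated lookups
```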