Tags: Python, Functions, Generators, Iterators, Functional Programming, Memory Optimization, Best Practices

Python Functions, Iterators & Generators

Amit Divekar

I've been writing Python for a while now, and the stuff that genuinely changed how I think about code wasn't the syntax or the standard library. It was learning to use functions properly. Not just defining them, but understanding lambdas, generators, custom iterators, and when each one actually makes sense. This post walks through all of that with real examples, including a few places where I got it wrong before I got it right.

Lambda Functions

Lambdas are one of those things I avoided early on because they looked weird. I kept writing tiny named functions for single-use transforms until I realized I was just cluttering everything. Lambdas are anonymous single-line functions, and they shine when you're passing logic around as a value rather than calling it by name.

Basic Lambda Operations

```python
# Mathematical operations as lambdas
add = lambda x, y: x + y
subtract = lambda x, y: x - y
multiply = lambda x, y: x * y
cube = lambda x: x ** 3

print(add(10, 5))       # 15
print(multiply(10, 5))  # 50
print(cube(3))          # 27
```

Lambda with map()

Pairing lambdas with map() is where they really earn their place. Instead of writing a loop or a throwaway function just to square every number in a list, you just do this:

```python
numbers = [1, 2, 3, 4, 5]
squared = list(map(lambda x: x ** 2, numbers))
print(squared)  # [1, 4, 9, 16, 25]
```

That said, I'd keep lambdas to simple one-liners. If you find yourself nesting logic inside one, just write a proper function. Readability wins every time.
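Another place a short lambda earns its keep is as a sort key. A minimal sketch with hypothetical product data (the tuples and names here are just for illustration):

```python
# Sorting with a lambda as the key function (hypothetical product data)
products = [("Pen", 50), ("Book", 150), ("Eraser", 10)]

# Sort by the second element (price) instead of default tuple ordering
by_price = sorted(products, key=lambda item: item[1])
print(by_price)  # [('Eraser', 10), ('Pen', 50), ('Book', 150)]
```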

Default Arguments

I learned the value of default arguments after writing the same boilerplate check at the top of functions over and over. Something like if price is None: price = 100. That's just noise. Default arguments handle it cleanly:

```python
def productInfo(name, price=100):
    if price < 100:
        print(f"{name} - Rs.{price}")

productInfo("Pen", 50)    # Prints: Pen - Rs.50
productInfo("Book", 150)  # No output (price >= 100)
productInfo("Eraser")     # Uses default price of 100 (no output)
```

This is useful for optional config values, pagination defaults, fallback thresholds. Basically anywhere you want a sensible default without forcing the caller to always spell it out.

Variable Arguments: *args and **kwargs

This is the feature I didn't fully appreciate until I started writing wrapper functions and hit the wall of "but what if the caller passes something I didn't expect?" *args and **kwargs are the answer to that.

Using **kwargs for Flexible APIs

```python
def employee_details(name, **info):
    print(f"Employee Name: {name}")
    for key, value in info.items():
        print(f"{key}: {value}")

employee_details("Amit", department="IT", salary=50000, experience=5)
# Output:
# Employee Name: Amit
# department: IT
# salary: 50000
# experience: 5
```

Combining *args and **kwargs

```python
def mixed_arguments(*args, **kwargs):
    print("Positional arguments (*args):")
    for arg in args:
        print(arg)
    print("\nKeyword arguments (**kwargs):")
    for key, value in kwargs.items():
        print(f"{key}: {value}")

mixed_arguments(1, 2, 3, "Hello", name="Amit", age=21, city="Mumbai")
```

This pattern shows up constantly in decorator code and any function that wraps something else. Once you get used to it, you'll start spotting places where you were previously hardcoding argument signatures that didn't need to be rigid.
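The decorator case is worth seeing concretely. A minimal sketch of a logging decorator (the `log_calls` name and log format are my own, not a standard library API): because the wrapper takes `*args, **kwargs`, it can sit in front of any function signature without knowing it in advance.

```python
import functools

# A minimal logging decorator: *args/**kwargs let it wrap any signature
def log_calls(func):
    @functools.wraps(func)  # preserve the wrapped function's name and docstring
    def wrapper(*args, **kwargs):
        print(f"Calling {func.__name__} with {args} {kwargs}")
        return func(*args, **kwargs)
    return wrapper

@log_calls
def add(x, y=0):
    return x + y

print(add(2, y=3))  # logs the call, then prints 5
```

`functools.wraps` is the detail that keeps the wrapped function transparent to callers: without it, `add.__name__` would report `wrapper`.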

Duck Typing: Python's Take on Interfaces

Python doesn't have interfaces the way Java does. Instead it has duck typing: if the object has the method you're trying to call, it works. If it doesn't, it blows up at runtime.

```python
def process_data(data):
    print(f"Length of data: {len(data)}")
    print(f"Data: {data}")

process_data("Hello")                      # Works with strings
process_data([1, 2, 3, 4, 5])              # Works with lists
process_data((10, 20, 30))                 # Works with tuples
process_data({"name": "Amit", "age": 21})  # Works with dictionaries
```

This function doesn't care what type you pass in. As long as the object supports __len__(), it'll work. It's a genuinely different mental model from statically-typed languages, and once it clicks, you start writing much more reusable code.
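The contract works for your own classes too. A sketch with a hypothetical `Playlist` class: implement `__len__` and the same function accepts it, no inheritance or interface declaration required.

```python
# Any object that implements __len__ satisfies the duck-typing contract,
# including classes we define ourselves.
class Playlist:
    def __init__(self, songs):
        self.songs = songs

    def __len__(self):
        return len(self.songs)

def process_data(data):
    print(f"Length of data: {len(data)}")

process_data(Playlist(["a", "b", "c"]))  # Length of data: 3
```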

Recursive Functions: Useful, But Handle With Care

Recursion is clean when it matches the shape of the problem. I actually got bitten by the recursion limit the first time I tried this on a large input. Passed in a number with many digits, and Python threw a RecursionError because I hadn't thought through the depth at all. The fix was adding the base case early and keeping the recursion shallow by reducing the number at each step:

```python
def sum_of_digits(n):
    if n < 10:
        return n
    digit_sum = 0
    while n > 0:
        digit_sum += n % 10
        n = n // 10
    return sum_of_digits(digit_sum)

print(sum_of_digits(457))   # 4+5+7=16, 1+6=7
print(sum_of_digits(9875))  # 9+8+7+5=29, 2+9=11, 1+1=2
```

Python's default recursion limit is 1000 frames. For most problems that's fine, but if you're recursing over deep structures or large inputs, you'll want to either use an iterative approach or bump the limit with sys.setrecursionlimit(). I'd lean iterative unless the recursive version is genuinely clearer.
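If you want to see where you stand, a quick sketch: check the current limit, and compare against a fully iterative version of the digit-sum function (`sum_of_digits_iterative` is my naming, not from the original) that has no recursion depth to worry about at all.

```python
import sys

print(sys.getrecursionlimit())  # typically 1000

# Fully iterative digit-sum: no call stack growth, regardless of input size
def sum_of_digits_iterative(n):
    while n >= 10:
        n = sum(int(d) for d in str(n))
    return n

print(sum_of_digits_iterative(9875))  # 2
```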

Generators: The Feature I Should Have Learned Earlier

Generators were the thing I put off learning for way too long. I kept just building lists and wondering why my script was eating 2GB of RAM. Generators produce values on demand instead of materializing everything in memory at once. The syntax is the same as a regular function except you use yield instead of return.

Basic Generator

```python
def even_numbers():
    for i in range(0, 11, 2):
        yield i

for num in even_numbers():
    print(num)  # 0, 2, 4, 6, 8, 10
```

Generator Pipelines for Data Processing

This is where generators get genuinely powerful. You can chain them together and each stage only processes one item at a time. No intermediate lists, no unnecessary allocations. I used this pattern when processing a stream of sensor readings and it made a huge difference compared to my first attempt, which built a full filtered list before converting units:

```python
def sensor_data_generator():
    data = [25.5, 30.2, 22.8, 35.6, 28.3, 31.9, 26.7, 33.4]
    for value in data:
        yield value

def filter_temperature(data):
    for value in data:
        if 25 < value < 35:
            yield value

def convert_to_fahrenheit(data):
    for value in data:
        yield (value * 9/5) + 32

# Chain generators together
pipeline = convert_to_fahrenheit(filter_temperature(sensor_data_generator()))

for temp in pipeline:
    print(f"{temp:.2f}°F")
```

How Much Memory Do You Actually Save?

I was skeptical of the "generators are memory-efficient" claim until I ran the numbers myself:

```python
import sys

def generator_numbers(n):
    for i in range(n):
        yield i

def list_numbers(n):
    return [i for i in range(n)]

n = 10000
gen = generator_numbers(n)
lst = list_numbers(n)

print(f"Generator: {sys.getsizeof(gen)} bytes")  # ~200 bytes
print(f"List: {sys.getsizeof(lst)} bytes")       # ~85,176 bytes
print(f"Memory saved: {sys.getsizeof(lst) - sys.getsizeof(gen)} bytes")
```

The generator object sits at roughly 200 bytes regardless of n. The list grows proportionally. At 10,000 elements you're already at ~85KB for the list, and it keeps scaling. For anything large, generators aren't optional, they're the obvious choice.
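The flat memory footprint also means generators can represent sequences that couldn't exist as lists at all. A sketch of an infinite generator (`naturals` is an illustrative name), consumed lazily with `itertools.islice`:

```python
from itertools import islice

# An infinite generator: impossible to materialize as a list
def naturals():
    n = 0
    while True:
        yield n
        n += 1

# islice lazily pulls just the first five values
print(list(islice(naturals(), 5)))  # [0, 1, 2, 3, 4]
```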

Custom Iterators: When Generators Aren't Enough

Generators cover most cases, but sometimes you need state you can reset, or you want to expose iteration as part of a class interface. That's when you implement __iter__() and __next__() directly on a class.

Fibonacci Iterator

```python
class Fibonacci:
    def __init__(self, limit=100):
        self.limit = limit
        self.a = 0
        self.b = 1

    def __iter__(self):
        return self

    def __next__(self):
        if self.a > self.limit:
            raise StopIteration
        current = self.a
        self.a, self.b = self.b, self.a + self.b
        return current

fib = Fibonacci(100)
for num in fib:
    print(num, end=" ")  # 0 1 1 2 3 5 8 13 21 34 55 89
```

The first time I wrote this, I forgot to raise StopIteration and the loop ran forever. That was fun to debug. Always make sure your __next__() has a way out.
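To make the "way out" concrete, here's a minimal sketch (the `Countdown` class is my own example, not from the original post). The `raise StopIteration` is what lets a `for` loop terminate cleanly, and `next()` with a default argument is a handy way to probe an iterator without triggering the exception yourself:

```python
class Countdown:
    def __init__(self, start):
        self.current = start

    def __iter__(self):
        return self

    def __next__(self):
        if self.current <= 0:
            raise StopIteration  # the way out -- without this, loops never end
        self.current -= 1
        return self.current + 1

c = Countdown(3)
print(list(c))                # [3, 2, 1]
print(next(iter(c), "done"))  # next() with a default avoids the exception
```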

String Permutations Iterator

```python
from itertools import permutations

class StringPermutations:
    def __init__(self, string):
        self.string = string
        self.perms = permutations(string)

    def __iter__(self):
        return self

    def __next__(self):
        perm = next(self.perms)
        return ''.join(perm)

perms = StringPermutations("ABC")
for perm in perms:
    print(perm)  # Output: ABC, ACB, BAC, BCA, CAB, CBA
```

This one delegates the heavy lifting to itertools.permutations and just wraps it in a class so it fits into the iterator protocol. It's a good pattern when you need iterator behavior but don't want to reinvent algorithms that already exist in the standard library.
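Worth noting: when you don't actually need the class interface, the same wrapper collapses to a few lines of generator. A sketch of the equivalent (my own simplification, not from the original post):

```python
from itertools import permutations

# Generator equivalent of the StringPermutations class
def string_permutations(string):
    for perm in permutations(string):
        yield ''.join(perm)

print(list(string_permutations("AB")))  # ['AB', 'BA']
```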

List Comprehensions

List comprehensions are one of those Python features that look weird at first and then become so natural you start missing them in other languages. They're also faster than equivalent for loops in most cases because of how Python's bytecode handles them internally.

```python
# Finding prime numbers
def is_prime(n):
    if n < 2:
        return False
    for i in range(2, int(n ** 0.5) + 1):
        if n % i == 0:
            return False
    return True

primes = [num for num in range(1, 51) if is_prime(num)]
print(primes)  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47]
```

Benchmarks typically show 30-40% faster execution compared to building the same list with a plain for loop. The tradeoff is that deeply nested comprehensions get hard to read fast. If you need more than one condition or a nested loop, consider whether a regular loop would be clearer.
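You can measure this yourself rather than take the number on faith; exact results vary by machine and Python version. A rough sketch with `timeit`:

```python
import timeit

# Compare a plain loop against a comprehension; numbers vary by machine
loop_stmt = """
result = []
for i in range(1000):
    result.append(i * 2)
"""
comp_stmt = "result = [i * 2 for i in range(1000)]"

print("loop:         ", timeit.timeit(loop_stmt, number=1000))
print("comprehension:", timeit.timeit(comp_stmt, number=1000))
```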

Practical Examples

Analyzing a String

```python
def analyze_string(s):
    vowels = consonants = digits = 0
    for char in s:
        if char.isalpha():
            if char.lower() in 'aeiou':
                vowels += 1
            else:
                consonants += 1
        elif char.isdigit():
            digits += 1
    return vowels, consonants, digits

vowels, consonants, digits = analyze_string("Hello World 123")
print(f"Vowels: {vowels}, Consonants: {consonants}, Digits: {digits}")
# Output: Vowels: 3, Consonants: 7, Digits: 3
```

A clean way to walk through a string and bucket characters by type. I used something like this when validating password complexity rules and it saved me from writing three separate loops.

Returning Multiple Values

```python
def min_max(numbers):
    return min(numbers), max(numbers)

numbers = [12, 45, 7, 89, 23, 56, 3]
min_val, max_val = min_max(numbers)
print(f"Min: {min_val}, Max: {max_val}")  # Min: 3, Max: 89
```

Python returns a tuple here and unpacks it automatically on the left side. Simple, and no need to create a custom return object just to carry two values.
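Once a function returns three or four values, bare tuples get fragile: the caller has to remember the order. A sketch of one step up, using `typing.NamedTuple` (the `MinMax` class here is my own illustration) so the fields get names while still unpacking like a plain tuple:

```python
from typing import NamedTuple

class MinMax(NamedTuple):
    min: int
    max: int

def min_max(numbers):
    return MinMax(min(numbers), max(numbers))

result = min_max([12, 45, 7, 89])
print(result.min, result.max)  # 7 89

# Still unpacks like a plain tuple:
lo, hi = result
```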

What I'd Tell Myself Earlier

A few things I wish I'd internalized sooner:

  • Use generators whenever you're iterating over something large. Don't build the full list unless you actually need random access or you're going to iterate it more than once.
  • Lambdas are for short, obvious operations. If you're squinting at a lambda to understand it, write a named function instead.
  • Type hints are worth adding even on personal projects. Coming back to untyped code three months later is painful.
  • Default arguments are great for optional configs but watch out for the mutable default argument trap (def f(lst=[]) is a classic Python footgun).
  • Duck typing is a feature, not a compromise. Write functions that accept any object with the right methods and you'll end up with much more reusable code.
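That mutable default trap deserves a quick demonstration, since it bites silently. A sketch (`append_bad`/`append_good` are my names): the default list is created once, at function definition time, and then shared across every call.

```python
# The classic mutable-default footgun: the list is created once,
# at definition time, and shared across calls.
def append_bad(item, items=[]):
    items.append(item)
    return items

print(append_bad(1))  # [1]
print(append_bad(2))  # [1, 2]  <- surprise: the previous call leaked in

# The standard fix: default to None and create a fresh list inside
def append_good(item, items=None):
    if items is None:
        items = []
    items.append(item)
    return items

print(append_good(1))  # [1]
print(append_good(2))  # [2]
```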

When to Use What

| Use Case | Tool |
| --- | --- |
| Simple inline operation | Lambda function |
| Large dataset iteration | Generator |
| Complex iteration logic | Custom iterator |
| Variable function arguments | *args, **kwargs |
| List transformation | List comprehension |
| Streaming data processing | Generator pipeline |

Wrapping Up

These features aren't just academic. I've used generators to process log files that didn't fit in memory, used **kwargs to build decorator chains that stay transparent to callers, and used duck typing to write utilities that work across different data structures without any changes. The patterns here are the ones that actually show up in real code.

The best way to internalize this is to pick one concept, find a place in something you're already building, and apply it. That's how it stuck for me.


Connect With Me

If this was useful, I'd love to hear from you. I post more about Python, software development, and things I'm currently building or breaking.

  • GitHub: @amitdevx - Check out my projects and code
  • LinkedIn: Amit Divekar - Let's connect professionally

Feel free to open an issue, send a message, or just say hi.