Problem with code counting leading zeros


def beginning_zeros(x):
    count = 0
    for letters in x:
        if letters == '0':
            count += 1
        else:
            return count

I wrote this code to count the zeros at the beginning of a string. When I executed it with '0001' it gave the right answer (3), but when I ran it with '0000' it gives me 1. Can someone explain why, and what should I do to fix it?
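For reference, note that in Python a function that falls off the end of a loop without hitting a `return` statement returns `None`; with an all-zero input like '0000' the `else` branch never runs, so the function never reaches `return count`. A likely fix (a sketch, not the only way) is to `break` out of the loop at the first non-zero character and return the count after the loop:

```python
def beginning_zeros(x):
    """Count the '0' characters at the start of the string x."""
    count = 0
    for letter in x:
        if letter == '0':
            count += 1
        else:
            break          # stop at the first non-zero character
    return count           # also reached when x is all zeros

print(beginning_zeros('0001'))  # → 3
print(beginning_zeros('0000'))  # → 4
```

This way the `return` executes whether the loop ends early or runs to completion.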