def beginning_zeros(x):
    count = 0
    for letters in x:
        if letters == '0':
            count += 1
        else:
            return count
I wrote this code to count the zeros at the beginning of a string. When I executed it with '0001' it gave the right answer (3), but when I ran it with '0000' it gave me 1. Can someone explain why? And what should I do to fix this?
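For reference, here is a minimal sketch of one possible fix, assuming the goal is to count the run of leading '0' characters: when the string is all zeros, the loop finishes without ever reaching the else branch, so the function ends without an explicit return; adding a return after the loop covers that case.

def beginning_zeros(x):
    # Count consecutive '0' characters from the start of the string.
    count = 0
    for letter in x:
        if letter == '0':
            count += 1
        else:
            return count  # first non-zero character ends the run
    return count  # reached only when the whole string is zeros

With this version, beginning_zeros('0001') returns 3 and beginning_zeros('0000') returns 4.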