Division by variable error in Indicator
I have an interesting problem.

I wrote an indicator that uses the standard deviation as the denominator in an equation. When I plot the value on its own, without dividing anything by it, it works fine, which implies it is non-zero. However, as soon as I use it as a denominator, all of the lines disappear, not just the ones related to the standard deviation. From experimenting, it seems the initial value is being read as zero. I have tried pre-declaring the variable and assigning it a non-zero value outside the loop, but that does not help. Has anyone seen this before?
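This is the classic warm-up problem: a rolling standard deviation is undefined (and typically reported as zero) for the bars before its lookback window fills, and any division by those bars blows up the whole plot. Below is a minimal sketch of the symptom and the fix in Python, since the original platform and code are not shown; `rolling_stddev` and `safe_ratio` are hypothetical helper names, not platform functions. The key idea is to start the division at the first bar where the window is full, and to also skip any flat window where the deviation is exactly zero.

```python
import math

def rolling_stddev(prices, period):
    """Population standard deviation over a trailing window.
    Bars before the window fills are left at 0.0 -- mimicking the
    platform-style behavior that makes early denominators zero."""
    out = [0.0] * len(prices)
    for i in range(period - 1, len(prices)):
        window = prices[i - period + 1 : i + 1]
        mean = sum(window) / period
        var = sum((p - mean) ** 2 for p in window) / period
        out[i] = math.sqrt(var)
    return out

def safe_ratio(numer, stddev, start):
    """Divide only where the denominator is meaningful: skip the
    warm-up bars and any flat window where stddev == 0."""
    out = [0.0] * len(numer)
    for i in range(start, len(numer)):
        if stddev[i] > 0.0:
            out[i] = numer[i] / stddev[i]
    return out

prices = [10.0, 11.0, 12.0, 11.0, 13.0, 14.0, 13.0, 15.0]
period = 3
sd = rolling_stddev(prices, period)
# Starting the loop at period - 1 avoids dividing by the zero warm-up values.
ratio = safe_ratio(prices, sd, start=period - 1)
```

Assigning a non-zero value to the variable outside the loop does not help, because the indicator overwrites it with the (zero) warm-up value on the first iterations; the loop itself has to begin after the lookback period.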

 
Please paste code.
 
Also, if you index beyond Bars-1 you'll get zeros. (Use the SRC button to post code rather than pasting it inline.)
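To illustrate the point about indexing beyond Bars-1: some platforms silently return zero for an out-of-range bar index instead of raising an error, so an off-by-one in the loop bounds quietly injects zeros into the denominator. A tiny sketch of that behavior, where `price_at` is a hypothetical accessor imitating it:

```python
def price_at(prices, bar):
    """Hypothetical platform-style accessor: an out-of-range bar
    index silently returns 0.0 instead of raising an error."""
    if 0 <= bar < len(prices):
        return prices[bar]
    return 0.0

prices = [10.0, 11.0, 12.0]
price_at(prices, 2)   # last valid bar (Bars - 1) -> 12.0
price_at(prices, 3)   # beyond Bars - 1 -> 0.0, a silent zero denominator
```

If the equation reads a bar offset such as `bar + 1` near the end of the series, that lookup returns zero and the division wipes out the plot, which matches the symptom described above.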