How can this be possible?

 

I have a problem with an init() function. When I test it on EUR/USD it acts as if 123 < 123 were true, but when I test it on USD/JPY it behaves normally.

Here is the code:


#property copyright ""
#property link      ""

extern double levelcount = 30;
extern int    p          = 3;

double level[20];
double init_level = 1.4000;
int    n;

int init()
{
   Alert("star of init()");

   double vect_init_level;
   double ccc;
   double final_level;
   double bbb;
   int    j = 0;

   if (Digits > 3) bbb = 10000; else bbb = 100;
   vect_init_level = init_level;
   final_level = init_level + (levelcount / bbb);

   while (vect_init_level < final_level)
   {
      level[j] = vect_init_level;

      j++;
      ccc = 5;
      vect_init_level = vect_init_level + (ccc / bbb);
   }

   n = j;
   Alert("final_level=", DoubleToStr(final_level, 5));
   Alert("arraysize", n);
   for (j = 0; j < n; j++) Alert("level[j]=", DoubleToStr(level[j], 5));

   Alert("end of init()");
   return(0);
}


And the results for EURUSD are these:

2009.09.22 13:50:17 2009.09.01 00:00 test EURUSD,M5: Alert: end of init()
2009.09.22 13:50:17 2009.09.01 00:00 test EURUSD,M5: Alert: level[j]=1.40300
2009.09.22 13:50:17 2009.09.01 00:00 test EURUSD,M5: Alert: level[j]=1.40250
2009.09.22 13:50:17 2009.09.01 00:00 test EURUSD,M5: Alert: level[j]=1.40200
2009.09.22 13:50:17 2009.09.01 00:00 test EURUSD,M5: Alert: level[j]=1.40150
2009.09.22 13:50:17 2009.09.01 00:00 test EURUSD,M5: Alert: level[j]=1.40100
2009.09.22 13:50:17 2009.09.01 00:00 test EURUSD,M5: Alert: level[j]=1.40050
2009.09.22 13:50:17 2009.09.01 00:00 test EURUSD,M5: Alert: level[j]=1.40000
2009.09.22 13:50:17 2009.09.01 00:00 test EURUSD,M5: Alert: arraysize7
2009.09.22 13:50:17 2009.09.01 00:00 test EURUSD,M5: Alert: final_level=1.40300
2009.09.22 13:50:17 2009.09.01 00:00 test EURUSD,M5: Alert: star of init()
2009.09.22 13:50:17 test inputs: levelcount=30; p=3;


Why does the last value in the level array take the value of final_level? The while loop in the code above has the condition (vect_init_level < final_level), yet it acts like <=.

Why is this happening?


On the other hand, when I test it on USD/JPY the results are OK:

2009.09.22 13:53:41 2009.09.01 00:00 test USDJPY,M5: Alert: end of init()
2009.09.22 13:53:41 2009.09.01 00:00 test USDJPY,M5: Alert: level[j]=1.65000
2009.09.22 13:53:41 2009.09.01 00:00 test USDJPY,M5: Alert: level[j]=1.60000
2009.09.22 13:53:41 2009.09.01 00:00 test USDJPY,M5: Alert: level[j]=1.55000
2009.09.22 13:53:41 2009.09.01 00:00 test USDJPY,M5: Alert: level[j]=1.50000
2009.09.22 13:53:41 2009.09.01 00:00 test USDJPY,M5: Alert: level[j]=1.45000
2009.09.22 13:53:41 2009.09.01 00:00 test USDJPY,M5: Alert: level[j]=1.40000
2009.09.22 13:53:41 2009.09.01 00:00 test USDJPY,M5: Alert: arraysize6
2009.09.22 13:53:41 2009.09.01 00:00 test USDJPY,M5: Alert: final_level=1.70000
2009.09.22 13:53:41 2009.09.01 00:00 test USDJPY,M5: Alert: star of init()
2009.09.22 13:53:40 test inputs: levelcount=30; p=3;


It stops before final_level and all is OK.


Can anyone please explain this to me? Where is the mistake in my logic?

 
usermql4:

I have a problem with an init() function. When I test it on EUR/USD it acts as if 123 < 123 were true, but when I test it on USD/JPY it behaves normally.

Here is the code:


...
while (vect_init_level<final_level)
{
level[j]=vect_init_level;

...


It stops before final_level and all is OK.


Can anyone please explain this to me? Where is the mistake in my logic?

Change that while so that you've normalized the double:

while (NormalizeDouble(vect_init_level, Digits) < NormalizeDouble(final_level, Digits)) {


It's good practice to normalize your doubles when doing things like comparison.

Reason: most decimal fractions (0.0005, 0.003, even 1.4) have no exact binary representation, so the running sum vect_init_level and the directly computed final_level can land on slightly different doubles. Here the accumulated value ends up one ulp below final_level, so the < comparison is still true even though both values print as 1.40300.