How can I use a double value to grade uchar color channels properly?


Preface: I know this is 100% something I'm doing wrong, but I can't see it unfortunately, so thanks and sorry in advance.

I'm using a union to quickly turn an ARGB uint into 4 byte channels and vice versa:

union ucolor{
uint col;
uchar argb[4];
};

Now, on init, I'm creating 2 global ucolor objects and assigning them 2 colors, black and white:

//global declaration
ucolor _is,_isnot;

void OnInit(){
   _is.col=0xFFFFFFFF;//white : the color for "is"
   _isnot.col=0xFF000000;//black : the color for "is not"
   //...
   }

By playing around - and by the display becoming transparent - I figured out that argb[0] is the blue channel, argb[1] is the green channel, argb[2] is the red channel, and argb[3] is the alpha channel.

So far so good (I assume).

Now I create a pixel array sized as a square via the input displaySize:

input int displaySize=300;//display size

so this initializes at 300x300.

Then I create a bmp label of the same size at an offset position, and everything displays.

Now to the point: this display is a sort of "heatmap" of a neural net's output.

The net has 2 features, and I'm sending in the x and y of the display as the 2 features, each divided by (displaySize-1) so that the inputs run strictly from 0.0 to 1.0.

The color calculation is this (and probably where my error is):

         //prepare a color object
         ucolor painter;
         painter.argb[3]=255;//set the alpha to 255
         for(int y=0;y<displaySize;y++){//loop over y
            for(int x=0;x<displaySize;x++){//loop over x
               //the hypothetical input [0] is the x divided by (display size - 1)
               features[0]=((double)x)/((double)(displaySize-1));//map the x axis of the display to an input from 0.0 to 1.0
               features[1]=((double)y)/((double)(displaySize-1));//same as above for the y axis
               //pass through and receive a value from 0.0 to 1.0
               double output=net.get_output(0);//reference to an output node, it responds with values from 0.0 to 1.0
               //channels 0,1,2 are B,G,R : loop over them
               for(int c=0;c<=2;c++){
                  double channel=((double)_isnot.argb[c])*(1.0-output)+((double)_is.argb[c])*output;//blend the two colors
                  painter.argb[c]=(uchar)MathFloor(channel);//set the color channel
                  }
               //set the pixel color from painter.col
               }
            }
         //update the resource

So what am I doing there?

If the output is 1.0, I want the pixel to light up: to take more from the ucolor object _is and less from the ucolor object _isnot.

At the end I create a resource which is tethered to the bmp label and updates the display.

Now here is the catch.

Sometimes, when a value is near 0 and a pixel is supposed to come out "black", it comes out white instead.

For example here:

the 0.5x, 0.5y pixel has output 0.032, and 1-0.032 is indeed 0.968, so you'd expect a 96.8% black pixel, but it's white (in the middle of the display).

Other times this displays correctly, like here for example:

the 0.5x, 0.5y pixel has output 0.073, and 1-0.073 is indeed 0.927, so you'd expect a 92.7% black pixel, and it is so.

I'm attaching the full code since this is a public blog anyway.

Thank you if you read this far.



Actually, it turns out there are no errors.

The networks are okay, and the display is okay too.

Here is what is actually going on.

The network receives 4 samples, which you can think of as the 4 corners of the display, and it never sees the "world" in the middle as a sample.

To put it in everyday terms :

  • Take a square piece of paper.
  • Your goal is to bring the top-right and bottom-left corners together without permanently altering the shape of the sheet (meaning when you let go it springs back to its initial shape).
  • There are 2 ways it can fold: with the middle below your fingers, or with the middle above your fingers.

And that is what happened here. Just as in the paper example, where I did not specify anything about the middle and you did not do anything to it, the network saw no "middle" activity during training, so it does not care which way it folds there.


Thanks anyway