Char vs Short vs Int vs Long - how much benefit?


This is a difficult question to answer quantifiably, but here goes:

I understand the difference between char / short / int / long (nominally 8-bit, 16-bit, 32-bit and 64-bit), but what sort of real-world difference does this actually make?

Suppose I'm using a counter that is never realistically going to be more than 10-20.

Leaving aside the argument "why not do it properly?", does it really make any difference if I use char/uchar rather than short, int or long?

Or is the gain so negligible it is unnoticeable?

I appreciate that the effect will be cumulative, but I'm interested to know whether the distinction is really necessary.
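
For concreteness, here's a minimal sketch of what I mean (a generic C program, not tied to any particular platform; the printed widths are just whatever the local compiler happens to use, since the standard only guarantees minimum ranges):

```c
#include <stdio.h>

int main(void)
{
    /* Widths are implementation-defined: the C standard only guarantees
       minimum ranges (char >= 8 bits, short/int >= 16 bits, long >= 32 bits),
       so these numbers are simply what this platform happens to use. */
    printf("char : %zu byte(s)\n", sizeof(char));
    printf("short: %zu byte(s)\n", sizeof(short));
    printf("int  : %zu byte(s)\n", sizeof(int));
    printf("long : %zu byte(s)\n", sizeof(long));

    /* A counter that never exceeds 10-20 fits in any of them; for a single
       local variable the compiler typically keeps it in a register, so the
       generated loop is essentially the same whichever type is chosen. */
    unsigned char small_counter;
    for (small_counter = 0; small_counter < 20; ++small_counter) {
        /* ... do the work ... */
    }
    printf("counter finished at %u\n", (unsigned)small_counter);

    return 0;
}
```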

 
For large arrays there are obvious space savings; for performance, see here.
Thanks! I'd missed that thread, and it's now answered a lot of my questions.
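
To put rough numbers on the "large arrays" point above, here's a minimal sketch (the array size of one million is purely an illustrative assumption) comparing the footprint of the same counters stored at different widths:

```c
#include <stdio.h>
#include <stdint.h>

#define NUM_COUNTERS 1000000  /* hypothetical size, purely for illustration */

/* The same million small counters stored at three different widths. */
static uint8_t  counters_8[NUM_COUNTERS];
static uint32_t counters_32[NUM_COUNTERS];
static uint64_t counters_64[NUM_COUNTERS];

int main(void)
{
    /* For one variable the width is noise; across a large array it becomes
       a real memory (and cache) footprint: roughly 1 MB vs 4 MB vs 8 MB here. */
    printf("uint8_t  array: %zu bytes\n", sizeof counters_8);
    printf("uint32_t array: %zu bytes\n", sizeof counters_32);
    printf("uint64_t array: %zu bytes\n", sizeof counters_64);
    return 0;
}
```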