Axis Automatic scale incorrectly interprets NULL as 0 (zero)
Posted: Thu Jul 13, 2006 11:39 pm
When a line series has null values, the series correctly plots a gap in the line (instead of interpolating a zero value).
However, the automatic scale of the value axis interprets the null as zero. So, if there is a NULL value, the axis minimum value will be zero, even if all of the non-null values are way above zero (hundreds, thousands, etc.).
Also, the series.YValues.Item(x) returns zero instead of null.
It would be helpful if the NULL values were ignored when automatically scaling the value axis.
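To illustrate the requested behavior, here is a minimal sketch (plain Python, not the charting component's actual API) contrasting a scale minimum that skips nulls with the buggy one that treats them as zero:

```python
def auto_scale_min(values):
    """Axis minimum as requested: null (None) points are skipped
    rather than being counted as 0."""
    non_null = [v for v in values if v is not None]
    return min(non_null) if non_null else 0.0

def buggy_scale_min(values):
    """Current behavior: a null point is treated as 0,
    dragging the axis minimum down to zero."""
    return min(0.0 if v is None else v for v in values)

# Series with a gap; all real values are in the hundreds.
series = [820.0, None, 750.0, 910.0]
print(auto_scale_min(series))   # 750.0 -- null ignored, as requested
print(buggy_scale_min(series))  # 0.0   -- null interpreted as zero
```

The gap in the plotted line and the axis scaling should agree: if the point is skipped when drawing, it should also be skipped when computing the automatic minimum.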