Nevron Forum

Plotting massive data sets

https://www.nevron.com/Forum/Topic2624.aspx

By Janosch Peters - Friday, September 11, 2009

Hi,

I'm trying to get Nevron to plot about 1.4M points. The chart is fed 5 data points every 250 ms. The display range is 2 h and I have 10 series. Unfortunately, CPU usage hits 100% once there are about 17,000 points in the whole chart.

It seems that Nevron does not have a method to eliminate points which are overlapping. In my test case, for instance, there are about 144 data points per pixel. It appears to me that Nevron tries to plot them all.

Is there a built-in method in Nevron Chart that decimates (unnecessary) points?

If not, is there some way to hook a self-developed point-elimination method into Nevron?

 

Regards,
Janosch

By Blagovest Milanov 1 - Monday, September 14, 2009

Hi Janosch,

The control does not have built-in functionality for data point sampling (although we plan to add such features in one of the next releases of the control). One of the simplest ways to achieve this is to sample the data at fixed intervals (say, every 10 or 20 data points) and keep the min and max value of each sample. This ensures that you don't lose data peaks in the sample. I can also send you a sample application showing this approach with a standard area chart, if you want.
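In code, the idea looks roughly like this (a Python sketch of the min/max bucketing approach described above, not a Nevron API; the `minmax_decimate` helper and the `(x, y)` tuple format are my own):

```python
def minmax_decimate(points, bucket_size):
    """Down-sample a list of (x, y) points by keeping only the points
    with the minimum and maximum y value in each fixed-size bucket.
    Peaks and troughs survive, so the plotted shape is preserved."""
    out = []
    for i in range(0, len(points), bucket_size):
        bucket = points[i:i + bucket_size]
        lo = min(bucket, key=lambda p: p[1])
        hi = max(bucket, key=lambda p: p[1])
        # Emit the extremes in x order; a set drops the duplicate
        # when the bucket's min and max are the same point.
        out.extend(sorted({lo, hi}, key=lambda p: p[0]))
    return out
```

With a bucket of 20 this cuts 144 raw points per pixel down to at most about 14 plotted points per pixel while keeping every peak, and the decimated list can then be fed to the chart's data series instead of the raw data.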

Hope this helps - let me know if you have any questions or comments.

Best regards,
Bob

By Teddy Lambropoulos - Thursday, June 17, 2010

Would it be possible for you to send me that sample project?

I am trying to use an NGridSurface series with over 150,000 points. In my project I refresh the plot frequently. Needless to say, calling NChartControl.Refresh on a plot with over 150,000 points takes more time than I would like. I would like to down-sample the plotted data while still preserving local maxima and minima. Have the developers added a built-in way to do this, or do I still need to do it myself? If I must do it myself, does anyone have techniques to suggest?

 

Thank you,

Teddy