Optimizing Serialization for Large Data Sets


Author
Message
Pedram .
Forum Newbie (7 reputation)
Group: Forum Members
Posts: 7, Visits: 1
We are trying to serialize large ChartControl files to disk by *clearing* all data points first and then saving the state
using the following code:

**********************
// Remove all data points from every series before saving
foreach (NSeries series in Chart.Series)
{
    series.ClearDataPoints();
}

ChartControl.Document.Calculate();
ChartControl.Refresh();

ChartControl.Serializer.SaveControlStateToFile(FileName, PersistencyFormat.Binary, new NDataSerializationFilter());
**********************

However, even after clearing, the resulting file size still varies with the number of data points. These files can be quite large (10 MB to 30 MB), and loading them again takes 30 seconds to 2 minutes, even though there should be no data points left in the file. We want to keep all the series in the ChartControl but remove their points before saving, to reduce the file size. Any thoughts on how we can do this?
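One way to narrow this down is to log the saved file size after clearing the points. The sketch below reuses only the Nevron calls already shown in the code above (ClearDataPoints, Calculate, Refresh, SaveControlStateToFile); the size-logging around them is our own illustration, not part of the Nevron API, and the snippet assumes the Nevron Chart assemblies are referenced. If the logged size still scales with the original point count, something other than the visible data points is being persisted.

```csharp
using System;
using System.IO;

// ... build and populate the chart as usual, then:

// Drop the visible data points from every series
foreach (NSeries series in Chart.Series)
{
    series.ClearDataPoints();
}

ChartControl.Document.Calculate();
ChartControl.Refresh();

ChartControl.Serializer.SaveControlStateToFile(
    FileName, PersistencyFormat.Binary, new NDataSerializationFilter());

// Diagnostic: if this number grows with the original point count,
// the points (or something derived from them) are still in the state file.
Console.WriteLine($"Saved state: {new FileInfo(FileName).Length / 1024} KB");
```

Saving the same state once with PersistencyFormat.Binary and once with PersistencyFormat.CustomXML and comparing the two sizes can also help, since the XML form can be inspected in a text editor to see what is actually being written.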

Nevron Support
Supreme Being (4.5K reputation)
Group: Administrators
Posts: 3.1K, Visits: 4.2K

Hi Pedram,

We could not replicate this issue. Can you send a zipped state file serialized to PersistencyFormat.CustomXML to support@nevron.com for review? By the way, does XML serialization exhibit this problem as well?



Best Regards,
Nevron Support Team

