Hi,
I am having a problem with a DataSet that in some cases takes up to 15 minutes to complete an AcceptChanges call, with very high CPU usage.
I have tested a few things to see if I can make it go faster, and I did make it much faster, but I am not sure whether the change has side effects that would make the DataSet unreliable.
In my DataSet there is one table that will always have changes; let's call it Solutions.
The implementation looks like this:
internal static void AcceptChanges(DataSet dataSet)
{
    if (dataSet.HasChanges())
    {
        // Accept the Solutions table first, since it always has changes.
        if (dataSet.Solutions.GetChanges() != null)
        {
            dataSet.Solutions.AcceptChanges();
        }

        // Then walk the remaining tables, stopping early once nothing is left to accept.
        for (int i = 0; dataSet.HasChanges() && i < dataSet.Tables.Count; i++)
        {
            if (dataSet.Tables[i].GetChanges() != null)
            {
                dataSet.Tables[i].AcceptChanges();
            }
        }
    }
}
The reason for doing it this way is that when AcceptChanges is called on a DataTable, it copies all rows, changed or not, into an array, and I am guessing that this could be my problem. The cascading updates/deletes might also have an effect on the issue (but I am not sure about this).
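For comparison, a per-row variant that only calls AcceptChanges on rows whose RowState is not Unchanged would look roughly like the sketch below. The DataSetHelper class and AcceptChangesPerRow name are just placeholders, and I have not checked how this interacts with the cascading updates/deletes.

using System.Data;

// Placeholder class/method names, just to illustrate the per-row idea.
internal static class DataSetHelper
{
    internal static void AcceptChangesPerRow(DataSet dataSet)
    {
        if (!dataSet.HasChanges())
        {
            return;
        }

        foreach (DataTable table in dataSet.Tables)
        {
            // Snapshot the rows first: AcceptChanges on a Deleted row
            // removes it from the Rows collection, so we must not
            // enumerate the collection directly while accepting.
            DataRow[] rows = new DataRow[table.Rows.Count];
            table.Rows.CopyTo(rows, 0);

            foreach (DataRow row in rows)
            {
                // Only touch rows that actually have pending changes.
                if (row.RowState != DataRowState.Unchanged)
                {
                    row.AcceptChanges();
                }
            }
        }
    }
}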
So my question is: will this approach work, or could it leave the DataSet in a state I cannot rely on?