Monday, October 6, 2008

Large data volume. Who has made it to the final?

To sum up our testing of data-driven web applications' ability to handle large datasets, let's take a look at what we reviewed.

I couldn't verify the viability of Dabble DB because the system failed to import 20,000 records, but Luke Andrews, their representative, assured me he would find out why:

"Hi Jane, I'll take a look at the specific data you're trying to import and see if we can figure out what is causing the bottleneck. We'll get back to you soon."
Let me just say I am still waiting, but as soon as these guys drop me a word, the results of my Dabble DB test won't keep you waiting.

Even though I did not single out one clear winner, my task was accomplished: 20,000 records don't cause any performance problems. Each of the vendors is capable of processing large data volumes; just pay attention to the implementation peculiarities.
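For reference, a test file of that size is easy to put together yourself. Below is a minimal sketch that generates a 20,000-record CSV; the field names and values are made up for illustration, not the actual data I imported.

import csv
import random

# Sketch: generate a 20,000-record CSV file for import testing.
# Fields and values are hypothetical; any realistic dataset will do.
FIELDS = ["id", "name", "email", "city", "amount"]
CITIES = ["Austin", "Boston", "Chicago", "Denver", "Seattle"]

with open("import_test.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    for i in range(1, 20001):
        writer.writerow({
            "id": i,
            "name": "Customer %d" % i,
            "email": "customer%d@example.com" % i,
            "city": random.choice(CITIES),
            "amount": round(random.uniform(10, 5000), 2),
        })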

Trying Coghead, I'd like to note their asynchronous way of importing large data sets and the opportunity to work with the app from outside (for example, to synchronize your app with another system on a daily basis).
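To give an idea of what such a daily synchronization could look like, here is a rough sketch of a script you might run from a scheduler once a day. The URL, token, and payload shape are hypothetical placeholders, not Coghead's actual API; it only illustrates the scheduled-push idea.

import json
import urllib.request

API_URL = "https://example.com/api/records"   # hypothetical endpoint
API_TOKEN = "your-api-token"                  # hypothetical credential

def push_records(records):
    """POST a batch of records as JSON and return the HTTP status code."""
    body = json.dumps({"records": records}).encode("utf-8")
    req = urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + API_TOKEN,
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    # In a real sync this batch would come from your other system.
    sample = [{"id": 1, "name": "Customer 1"}, {"id": 2, "name": "Customer 2"}]
    print(push_records(sample))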

Based on my testing, Quickbase and TeamDesk offer the best compromise between data import simplicity and flexibility. Both products take a similar approach to data import: you can easily create a detailed table with columns directly through import or copy/paste, and at the same time you can import a large data volume from a file if needed. It should be noted that Quickbase's data analysis is a bit more advanced and suggests the needed column types right away.

I hope my trial will help you make a reasonable choice.
