Tuesday, September 30, 2008

Viability Check. Dabble DB.

In order to find the best way to roll out the app with a pretty large data volume, let's kick off with Dabble DB.

Let's check how fast it will see the light of day.

Registering and creating a new account is simple and fast.
This is how Dabble DB offers to start:


But… you can import only 15,000 records this way; for more, contact Support. Ok then, I cut my data down to 15,000 as required.
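Trimming a data file down to the first 15,000 rows is easy to script rather than doing it by hand. A minimal sketch in Python (the file names, delimiter, and helper are my own assumptions, not anything Dabble DB provides):

```python
import csv

MAX_ROWS = 15000  # Dabble DB's copy/paste import limit


def trim_csv(src_path, dst_path, limit=MAX_ROWS):
    """Copy the header plus the first `limit` data rows of src into dst."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        writer.writerow(next(reader))  # keep the header row
        for i, row in enumerate(reader):
            if i >= limit:
                break
            writer.writerow(row)
```

The trimmed file can then be opened and its contents pasted into the import text area.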

Let’s see…

As I tried to handle that, the first paste alone took me 12 minutes (or my patience just gave out).

It seems it's not a Dabble DB problem; I guess IE just couldn't handle that large a data set in a text area. I didn't expect blazing speed, but a few words of warning wouldn't have been redundant.

My point is the developers should have provided a way to load the data not only through copy/paste but directly from a file, or at least kept the user advised.

Anyway, too early to give up.

But after pressing Continue the system processed all the data pretty fast and showed the result with possible field types.

This is it! I just defined the category, pressed Finish, and the import progress started. Everything looked pretty good until…

Until this:

No reason to torment the system with this anymore (and me as well).
Back to the wizard at the very start. Let's just import it gradually: 100 records at the first sitting, then add all the remaining data through data import.

It seemed like quite a way out, considering my first 100 went in easily.

The look was promising:

As importing all my data in one sitting would probably fail again, I decided to do it 5,000 records at a time. The first import attempt was successful.
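Splitting the remaining records into 5,000-record batches can also be scripted instead of counted out by hand. A rough sketch, assuming the data is already a plain list of rows (the function and batch size are mine, not part of Dabble DB):

```python
def chunk(rows, size=5000):
    """Yield successive batches of at most `size` rows."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]


# Each batch would then be pasted (or imported) into Dabble DB separately.
```

With 12,345 rows, for example, this yields batches of 5,000, 5,000, and 2,345.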

My second shot at adding the next 5,000 records failed. The error message popped up at 91% progress.


Undoubtedly, you'll have to ask the Dabble DB support team for help to get through.
I will get in touch with them and show you what they suggest as soon as I get any feedback.

Personally, I couldn't find the cause of the errors. Either the system is constantly overloaded or my 5,000 records just bring it down.

First round conclusion:

In my opinion, the copy/paste scheme of data import is not suitable for handling large data sets. Other options should be offered.


Monday, September 29, 2008

Large volume of data - feasible. But how?

Almost every vendor out there claims to handle any data set. The question is just "how".

Obviously, the ways differ. To make sure it really works as expected, let's check how the data is imported and build a couple of reports. What do the biggest players offer?

I think 20,000 records is heavy enough to test the ability to handle such data volumes, so the customer can sleep well knowing that even if the data volume grows like an avalanche, my app is (and will be) ok with it.

I will focus the spotlight on:


Test application viability through real life problems

Let's get real. The only true way to prove something wrong or right is to test it!

Reviews of on-demand web-based applications are way too superficial to give a clear picture.
Meaning: how, for God's sake, can you find out whether it really meets your expectations, given your specifics, except through an actual app deployment? Ponder that for a while…

The truth is, all the pitfalls stay hidden until the complete app rollout.

So here I will try to make it at least a bit clearer and show you how a specific problem is solved with a particular web-based solution.

The good news is you can check whether the app you are going to use solves your given task, or find a better, less complicated way to do it.

Tell me the business problem you want to fix, and here you can check out how it can best be implemented.

The bottom line of all this is to make your choice of solution wiser and grounded in your real-life concerns, that's it.