[TIP] Using a DB in tests without an ORM

marrese at gmail.com
Fri Jul 22 08:15:03 PDT 2011


On Fri, Jul 22, 2011 at 12:01, Mark Sienkiewicz <sienkiew at stsci.edu> wrote:
> marrese at gmail.com wrote:
>
>> Today we decided to look for a better approach.
>> The first thing I thought about was to use sqlite3 and do all the
>> testing there, but I can't find a decent way to migrate data
>> between mysql and sqlite (and I think I will have to do this on a
>> regular basis).
>>
>
> DBAPI is pretty portable, but not so portable that I would count on testing
> exclusively in a different database than the one I want to support.  I do
> divide my tests into those that are more or less likely to be affected by
> database details.
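
One way to make that split concrete (just a rough sketch, untested; the
TEST_DB environment variable and the test names are made up for illustration):

    import os
    import unittest

    # Hypothetical convention: the test runner exports TEST_DB=mysql or
    # TEST_DB=sqlite before starting the suite.
    RUNNING_ON_MYSQL = os.environ.get("TEST_DB", "sqlite") == "mysql"

    def needs_real_mysql(test):
        """Mark tests that depend on MySQL-specific behaviour."""
        return unittest.skipUnless(
            RUNNING_ON_MYSQL, "needs a real MySQL server")(test)

    class OrderTests(unittest.TestCase):

        def test_basic_select(self):
            # Plain DBAPI select/insert -- safe to run against sqlite for speed.
            pass

        @needs_real_mysql
        def test_concurrent_updates(self):
            # Relies on InnoDB row locking, so only meaningful on MySQL.
            pass
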
>
> mysql databases are pretty easy to create.  For your scenario, I would
> consider just making a bunch more of them for testing.  Use the conventional
> database dump/restore tools to initialize a particular database before using
> it for a test run.  It may be interesting to know that the default mysql
> configuration allows anybody to create/drop a database with a name that
> starts with "test_".
>
> I've often considered a tool that could dump/restore between a database
> table and a CSV file.  Say the first line of the CSV has the column names,
> and the rest are data.  If you had such a tool, you could dump your
> reference data to CSV and then load it into a blank test database
> immediately before your test run.  I don't know of any such tool, but it is
> so easy to write with DBAPI.  You'll want the "row factory" that returns the
> column names with the query results.  Both MySQLdb and sqlite3 have a
> suitable row factory.
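
A rough sketch of such a dump/load pair, written for sqlite3 (untested;
cursor.description is used here to get the column names, which does the same
job as a row factory, and MySQLdb would mainly need its %s parameter style
instead of ?):

    import csv
    import sqlite3

    def dump_table(conn, table, csv_path):
        """Write one table to CSV: a header row of column names, then the data."""
        cur = conn.execute("SELECT * FROM %s" % table)
        with open(csv_path, "w") as out:
            writer = csv.writer(out)
            writer.writerow([col[0] for col in cur.description])
            writer.writerows(cur)

    def load_table(conn, table, csv_path):
        """Load a CSV produced by dump_table into an empty table of the same shape."""
        with open(csv_path) as src:
            rows = csv.reader(src)
            columns = next(rows)
            placeholders = ", ".join("?" * len(columns))
            conn.executemany(
                "INSERT INTO %s (%s) VALUES (%s)"
                % (table, ", ".join(columns), placeholders),
                rows)
        conn.commit()
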
>
> As a bonus, this tool makes your test data into a plain text file that you
> can store in your configuration management system.
>
> Testing with sqlite does have one advantage:  speed.  With a small enough
> database on a fast enough CPU, you can create a database and run a test
> faster than you can connect to a mysql server; it's the network latency.
>  But sqlite has only very coarse locking, so you might structure your tests
> so that you never run two tests against the same database at the same time.
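
For the fast path, a per-test in-memory database avoids both the server
round-trip and the shared-lock problem (sketch only; the table is invented):

    import sqlite3
    import unittest

    SCHEMA = "CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);"

    class CustomerTests(unittest.TestCase):

        def setUp(self):
            # A brand-new in-memory database for every test: no server to
            # connect to, and no two tests ever touch the same database.
            self.db = sqlite3.connect(":memory:")
            self.db.executescript(SCHEMA)

        def tearDown(self):
            self.db.close()

        def test_insert_and_select(self):
            self.db.execute("INSERT INTO customer (name) VALUES (?)", ("Ada",))
            rows = self.db.execute("SELECT name FROM customer").fetchall()
            self.assertEqual(rows, [("Ada",)])
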
>
> If all you are doing is basic select/insert, mysql and sqlite look very
> similar through DBAPI.  I have a separate set of CREATE TABLE / CREATE INDEX
> commands for each database.  For example, sqlite doesn't know what "CREATE
> TABLE ... ENGINE=InnoDB" means.  That means keeping the two copies in sync, but
> how often do you change the structure of a table?
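
Keeping the per-backend DDL side by side is usually enough (a sketch; the
table is made up):

    # One DDL string per backend, picked at connection time.
    DDL = {
        "mysql": """
            CREATE TABLE session (
                id   INTEGER NOT NULL AUTO_INCREMENT PRIMARY KEY,
                data TEXT
            ) ENGINE=InnoDB
        """,
        "sqlite": """
            CREATE TABLE session (
                id   INTEGER PRIMARY KEY AUTOINCREMENT,
                data TEXT
            )
        """,
    }

    def create_schema(conn, backend):
        conn.cursor().execute(DDL[backend])
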
>
> Mark S.
>
>

My particular problem is that I'm working with a legacy structure on a
new project, and because of that the test database is a complete mess.

What I did yesterday (after sending the email) was reorganize that
database and create several test databases, trying to minimize what
I have to create and dump on each test run.

Creating a script that queries the tables is not so simple with the
current structure. We are not using any framework, but I decided to
try some tests with the Django ORM. So far, no luck: the structure is
so messed up that dumpdata fails constantly.

I'll try to minimize the amount of data stored in the test DB and work
from there. I guess that with less data the database will be easier to
recreate.

Thanks for the answers ...

-- 
Martín

http://www.kyheo.com.ar
GTalk: marrese at gmail.com


