Wednesday, March 28, 2012

import large table from mysql

I'm trying to migrate a mysql database to SQL Server 2000 (then I'm
going to upgrade that to 2005). I downloaded myODBC and was able to
import most of the tables; however, when I try to import the largest
table (about 18 GB) it fails by running out of memory. The machine I'm
using has 4 gigs of memory in it, so I don't think it's the machine;
however, SQL Server 2000 Standard can only use up to 2 gigs of memory,
which I think is the problem. How can I work around this? I'm
thinking of trying to use mysqldump to dump pieces of the table at a
time and import them that way, but I think that will take an
unreasonably long time.
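Something like this is what I have in mind (just a sketch - the database
name bigdb, table name bigtable, and integer primary key id are made up):

  mysqldump -u me -p --no-create-info --where="id >= 0 AND id < 1000000" bigdb bigtable > chunk01.sql
  mysqldump -u me -p --no-create-info --where="id >= 1000000 AND id < 2000000" bigdb bigtable > chunk02.sql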
Thanks for any help!

What method are you using to load the data from the mysql database to the
MS SQL database?
--
Denny
MCSA (2003) / MCDBA (SQL 2000)
MCTS (SQL 2005 / Microsoft Windows SharePoint Services 3.0: Configuration /
Microsoft Office SharePoint Server 2007: Configuration)
MCITP (dbadmin, dbdev)
"dylan" wrote:

> I'm trying to migrate a mysql database to sql server 2000 (then I'm
> going to upgrade that to 2005). I downloaded myODBC and was able to
> import most of the tables, however when I try to import the largest
> table (about 18GB) it fails by running out of memory. The machine I'm
> using has 4 gigs of memory in it, so I don't think it's the machine,
> however SQL Server 2000 Standard can only take up to 2 gigs of memory
> which I think is the problem. How can I work around this? I'm
> thinking of trying to use mysqldump to dump pieces of the table at a
> time and import it that way, but i think that will take an
> unreasonably long time.
> Thanks for any help!

On May 30, 3:06 pm, mrdenny <mrde...@discussions.microsoft.com> wrote:
> What method are you using to load the data from the mysql database to
> the MS SQL database?
Hi,
Thanks for the response.
I'm using myODBC to connect to the mysql server through an ODBC
connection and Data Transformation Services to import the data.
Dylan

You can set the transformation to batch the load, which might relieve the
memory pressure.
I would probably export the data using a mysql tool, then BCP the data into
SQL Server.
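For example (the server, database, table, and file names here are made up -
adjust to suit):

  bcp MyDb.dbo.BigTable in C:\load\bigtable.csv -S MYSERVER -U sa -P secret -c -t, -b 10000

The -b switch commits a batch every 10000 rows instead of loading all 18 GB
in one transaction. If you stay with DTS, the equivalent setting (if I
remember right) is "Insert batch size" on the Options tab of the Transform
Data task.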
--
Denny
MCSA (2003) / MCDBA (SQL 2000)
MCTS (SQL 2005 / Microsoft Windows SharePoint Services 3.0: Configuration /
Microsoft Office SharePoint Server 2007: Configuration)
MCITP (dbadmin, dbdev)
"dylan" wrote:

> On May 30, 3:06 pm, mrdenny <mrde...@.discussions.microsoft.com> wrote:
> Hi,
> Thanks for the response.
> I'm using myODBC to connect to the mysql server through an ODBC
> connection and Data Transformation Services to import the data.
> Dylan

On May 30, 3:40 pm, mrdenny <mrde...@discussions.microsoft.com> wrote:
> You can set the transformation to batch the load, which might relieve
> the memory pressure.
Ok, thanks. I'll look for the batching option, and if that doesn't
work I'll look at bcp. I've never used that, but it looks easy enough.

Hello,
Use the steps below:
1. On the mysql side, extract the data from the table to a comma-separated
text file.
2. Copy the file to the SQL Server machine.
3. Create the table structure without indexes and triggers.
4. Load the table using BCP IN or BULK INSERT with batch commit options (see
the sketch below).
5. Create the indexes and triggers.
6. If your database is in FULL recovery mode, make sure you back up the
transaction log every 2 minutes or so while loading the data.
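A sketch of steps 1 and 4 (database, table, and file names are made up, and
this assumes the data itself contains no commas or newlines):

  -- mysql side (step 1): write the table out as a comma separated file
  SELECT * INTO OUTFILE '/tmp/bigtable.csv'
      FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
  FROM bigtable;

  -- SQL Server side (step 4): load it, committing every 10000 rows
  BULK INSERT mydb.dbo.bigtable
  FROM 'C:\load\bigtable.csv'
  WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n',
        BATCHSIZE = 10000, TABLOCK)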
Thanks
Hari
"dylan" <dylan.roehrig@.gmail.com> wrote in message
news:1180554251.701728.181170@.k79g2000hse.googlegroups.com...
> On May 30, 3:40 pm, mrdenny <mrde...@.discussions.microsoft.com> wrote:
> Ok, thanks. I'll look for the batching option, and if that doesn't
> work I'll look at bcp. I've never used that but it looks easy enough.

On May 30, 10:03 pm, "Hari Prasad" <hari_prasa...@hotmail.com> wrote:
> Use the steps below: [...]
Thanks Hari, I'm running into a problem dumping the data into a
comma-separated text file, though. Is it possible to do that when the
data in the source table is a blob?

On May 31, 9:19 am, dylan <dylan.roeh...@gmail.com> wrote:
> [...] Is it possible to do that when the data in the source table is a
> blob?
Well, we ended up having to write a small app to import the data - but
it works.
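(For anyone else who hits the blob problem: one possible workaround - just a
sketch, using a made-up table docs with an integer key id and a blob column
body - is to hex-encode the blob on the way out:

  SELECT id, HEX(body) INTO OUTFILE '/tmp/docs.csv'
      FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
  FROM docs;

SQL Server 2000 has no built-in way to turn that hex string back into
varbinary in T-SQL, so the decode side still needs a little custom code,
which may be why a small app was needed here.)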
