
Monday, March 19, 2012

Import DBF data to a MySQL Table

Hi
I'm trying to sync a DBF file stored on my customer's server with the
MySQL table on my server.
I'm still looking for an automatic way to do it; however, for now I'm
trying to find a way to create a small app that will help me do it
faster than doing it manually.
My question is: how do I import the data from my DBF file into my MySQL
server?
I already have the code to select data from the DBF, but I can't seem
to find a fast way to import it. I tried inserting each row one at a
time, but it's really slow since I have around 10,000 rows.
Any idea that would make the process faster?
Thanks!
Eric|||I think you bought a ticket to the wrong show. You should probably search
for a MySQL newsgroup...
<ericpoirier@.gmail.com> wrote in message
news:1149186767.242174.200480@.c74g2000cwc.googlegroups.com...
|||Yeah I know, new here and don't know how to delete posts.
Mike C# wrote:
> I think you bought a ticket to the wrong show. You should probably search
> for a MySQL newsgroup...
|||MySQL can import a comma-separated value text file in a very short time. My
experience was 1.2 million records in about 10 seconds. If you can export the
MS SQL data as a comma-separated value file, then have a VBScript or program
import the file into the MySQL database. There are several 3rd-party
applications that will convert data from one database to another.
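The CSV route suggested above can be sketched in Python. This is a minimal, hypothetical sketch (the table and file names are made up): it writes the rows selected from the DBF to a CSV file, then composes a MySQL LOAD DATA LOCAL INFILE statement so the server bulk-loads the file in one shot instead of 10,000 single-row INSERTs.

```python
import csv

def export_rows_to_csv(rows, path):
    """Write rows (sequences of values) to a CSV file for bulk loading."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerows(rows)

def load_data_statement(table, path):
    """Compose a MySQL bulk-load statement for the exported file."""
    return (
        f"LOAD DATA LOCAL INFILE '{path}' "
        f"INTO TABLE {table} "
        "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' "
        "LINES TERMINATED BY '\\n'"
    )

if __name__ == "__main__":
    # Hypothetical rows selected from the DBF file.
    rows = [(1, "Eric", "Montreal"), (2, "Mike", "Boston")]
    export_rows_to_csv(rows, "customers.csv")
    print(load_data_statement("customers", "customers.csv"))
```

The statement would then be executed once through whatever MySQL client library the app already uses; LOCAL INFILE must be enabled on the connection.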
"ericpoirier@.gmail.com" wrote:


Import database from access

Hi
I am developing a project with .NET as a Windows application. My data source is
currently an Access database. I want to work with WMSDE, so I first need to import
the Access database into WMSDE. Is it possible?
If yes, how can I do this?
Thanks to whoever responds to my question.
Doesn't Access have an upsizing wizard in it?
"amos hchmon" <amoshchmon@.discussions.microsoft.com> wrote in message
news:91853F12-5194-4D31-891F-0C5EAFC16E94@.microsoft.com...


Monday, March 12, 2012

Import data from text to MS SQL Server. ! Need Help!

Hi
I need to know how to import data from a .txt file into MS SQL Server. It is really important and I have only 1 hour; any help is appreciated.
Thanks|||Use DTS to import the data.
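Besides DTS, a quick command-line option is the bcp utility that ships with SQL Server. The sketch below is hypothetical (the file, table, and server names are placeholders) and just builds the bcp command line: `-c` means character-mode data, `-t` sets the field terminator, `-S` names the server, and `-T` uses a trusted Windows connection.

```python
import subprocess

def bcp_import_command(table, data_file, server, field_sep="\t"):
    """Build the bcp command line to bulk-load a text file into a table."""
    return [
        "bcp", table, "in", data_file,
        "-c",              # character-mode data file
        "-t", field_sep,   # field terminator
        "-S", server,      # target server
        "-T",              # trusted (Windows) authentication
    ]

if __name__ == "__main__":
    cmd = bcp_import_command("MyDb.dbo.MyTable", "data.txt", "MYSERVER")
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)  # run only on a machine with bcp installed
```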

Import data from excel into tables

hi
I want to import data from Excel into a table in SQL Server.
While executing this statement:
select *
into SQLServerTable FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;Database=D:\testing.xls;HDR=YES',
'SELECT * FROM [Sheet1$]')
I got this error:
[Microsoft][ODBC SQL Server Driver][SQL Server]OLE DB provider
'Microsoft.Jet.OLEDB.4.0' reported an error. ERROR [01000]
[Microsoft][ODBC SQL Server Driver][SQL Server][OLE/DB provider returned
message:
Could not find installable ISAM.]
Can anyone find me a solution and tell me where it went wrong?|||Hi
It is better practice to create the table first and then use INSERT...SELECT
rather than SELECT...INTO.
I think you should be using Data Source and not Database, therefore (this is
untested!) try:
SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Data Source=D:\testing.xls;Extended Properties="Excel 8.0;HDR=YES"', [Sheet1$])
John
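If the statement is being built from application code, John's corrected form can be composed programmatically. The helper below is a hypothetical sketch (path and sheet name are placeholders): it keeps the Extended Properties string quoted as a unit, which is what the "Could not find installable ISAM" error usually objects to.

```python
def openrowset_excel_query(xls_path, sheet="Sheet1$", hdr=True):
    """Compose an OPENROWSET statement that reads an Excel sheet via Jet."""
    # Extended Properties must travel as one quoted token inside the
    # connection string, otherwise Jet misparses the provider options.
    props = f'Excel 8.0;HDR={"YES" if hdr else "NO"}'
    return (
        "SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', "
        f"'Data Source={xls_path};Extended Properties=\"{props}\"', "
        f"[{sheet}])"
    )

if __name__ == "__main__":
    print(openrowset_excel_query(r"D:\testing.xls"))
```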
"Chinnappa" <Chinnappa@.discussions.microsoft.com> wrote in message
news:99E608C6-BBBF-43A4-B7AE-DF2D0318C79D@.microsoft.com...


Friday, March 9, 2012

import data from Excel

Hi:

I imported data from Excel to a table on SQL Server through Data Transformation Services.

All data with an underscore (like 1_5, 7_5) were replaced by NULLs. Is there any comment

about that?

Thanks a lot!

Regards,

Kevin Jin|||What data type did you have on the column that 1_5 was going into?|||You can change the field in the SQL Server table;
the data will remain the same.|||Slynch:

Thank you for your response. I did use nvarchar.

Regards,

Kevin Jin|||Mateenmohd:

Thank you for your response. Could you please let me know the details of your idea? I'm interested in that.

Regards,

Kevin Jin|||Importing from Excel is a pain in the neck. As far as I can tell, Excel decides what type your data is and that's the final deal. In Excel, save it as a CSV text file, then import the text file. I've tried importing from Excel a million different ways and it always deletes data for some wacky reason. I knew a guy who knew a guy who was on the Excel team, and basically: just export from Excel to text and forget about trying to use Excel directly. In my opinion, they tried to make Excel work like a database so accountants wouldn't have to use Access or learn anything about databases. Unfortunately, Excel is a rotten database.|||PDraigh:

Thank you for your suggestion.

Regards,

Kevin Jin
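The Excel-to-CSV route PDraigh describes can be sketched as a small import loop. The example below is hypothetical, using Python's standard library with sqlite3 standing in for SQL Server: it reads the CSV that Excel saved and inserts the rows in batches with executemany, keeping values like 1_5 as text so nothing gets silently retyped along the way.

```python
import csv
import io
import sqlite3

def import_csv(conn, csv_file, table, batch_size=500):
    """Insert CSV rows (after a header line) into `table` in batches.

    Returns the number of rows inserted."""
    reader = csv.reader(csv_file)
    header = next(reader)                       # skip the header row
    placeholders = ",".join("?" for _ in header)
    sql = f"INSERT INTO {table} VALUES ({placeholders})"
    total, batch = 0, []
    for row in reader:
        batch.append(row)
        if len(batch) >= batch_size:            # flush a full batch
            conn.executemany(sql, batch)
            total += len(batch)
            batch = []
    if batch:                                   # flush the remainder
        conn.executemany(sql, batch)
        total += len(batch)
    conn.commit()
    return total

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE parts (code TEXT, qty TEXT)")
    data = io.StringIO("code,qty\n1_5,10\n7_5,3\n")
    print(import_csv(conn, data, "parts"))
```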

Wednesday, March 7, 2012

Import and Export Wizard: transferring multiple tables from SQL Server 2005 to SQL Server 2000

Hi!

I just used the SSIS Import and Export Wizard to copy 50+ tables from SS05 to SS2K.

I found that the wizard created a package that I could not figure out how to edit, e.g., to change whether or not it had to CREATE a table, or just use an existing one. (I created some problems by manually editing the receiving table names to be ones that already existed -- but the original names it had did not exist, so it knew it had to create them. What I should have done, and eventually ended up doing, was scroll through my list of tables in the "receiving" box; I just figured editing the name would be faster, not realizing what problems I would create for myself.)

Anyhow, now that I see the complex package that the wizard creates, with a LOOP over the 50+ tables, I would like to know how/where in the package it is storing the information about the tables to copy.

Basically the wizard creates the following Control Flow tab entries (in processing sequence order):

- an Execute SQL Task: NonTransactableSql
- an Execute SQL Task: START TRANSACTION
- a Sequence Container: Transaction Scoping Sequence, which contains
  - an Execute SQL Task: AllowedToFailPrologueSql
  - an Execute SQL Task: PrologueSql
  - a Foreach Loop Container, which contains a Transfer Task with an icon I did not notice in the Toolbox
  - an Execute Package Task: Execute Inner Package
  - an Execute SQL Task: EpilogueSql
  - an "on success" arrow to an Execute SQL Task: COMMIT TRANSACTION
- an Execute SQL Task: PostTransaction Sql
- an "on failure" arrow to an Execute SQL Task: ROLLBACK TRANSACTION
- an Execute SQL Task: CompensatingSql

Where, and how, can I look within this package to see the details about the tables I am transferring? I see that one of the Connection Managers is "TableSchema.XML" -- but it points to a temporary file on my hard drive, that I presume is populated by the package. Where does it get its information?

This is certainly much more complex than the package I would have written, based on my limited knowledge of SSIS. I would have been inclined to create 50+ Data Flow tasks, one for each table.

So now I'm trying to understand why the Wizard created this more-complex package.

Any help will be appreciated, including references to non-Microsoft books/websites/etc.

Thanks in advance.

Dan

Hi Dan,

you can also have a package with 50 parallel data flows built if you uncheck the "Optimize for Many Tables" checkbox. That might be easier for you to edit. This solution does not scale well with "really" many tables, so we had to employ the complex package you are referring to. The metadata is stored in the XML file you mentioned, and the transfer task goes through that file, generates simple data flows on the fly, and executes them one by one.

HTH.

|||

Bob,

Thanks.

I also now know how to get the "50 parallel data flows" version that is easier to edit.

I don't understand, though, how the metadata in the XML would "tag along" if I were to save the SSIS package to the File System and copy it to a network drive, where others could use it.

Dan

|||

You would need to copy all the associated files (like the XML with metadata) and tweak the connections to point to the new location.

Thanks.

|||

Bob,

Thanks again.

So it seems the Wizard creates an SSIS package, and also whatever files are needed to support that package, e.g., the XML file with the table names and structures (if a CREATE is necessary). The user of the Wizard must be smart enough to realize that anything in the Connection Manager must accompany the SSIS package -- and that such "temporary folder" files are essential to the task (so they should not be deleted by any "housekeeping" effort on the PC). That's good to know.

I would have expected such files to end up in someplace like the BIN folder found in the same folder where the SSIS package is created. If it isn't a long explanation, maybe you might share why the development team did not use the BIN folder for such files. (I am not wanting to be "nasty" -- just curious about how such decisions are made by the development team.)

Dan

|||

Hi Dan,

the transfer tables task was not initially designed to be used by the wizard. It was built by the SMO team to allow copying tables using their APIs. When used that way, the internal package is invisible and it makes sense to put the additional files in the temp folders. Later, we realized it might be suitable for our purpose. We weren't sure how often our users would actually want to preserve this package.

We realized it is a problem now and found some additional issues with the table transfer provider task, so we are looking into simplifying the wizard generated packages for Katmai.

Thanks,

-Bob

|||

Bob,

Thanks for the response. I truly appreciate the help the Wizard has given me on many occasions.

I have learned a lot from the Wizard, using it to convert my SQL Server 2000 DTS Packages to SSIS, and learning from the packages it created.

Since you mentioned some new features, is there any current plan to enhance the Execute Package Utility so that it can accommodate the movement of data from MS Access 2003 SP2 to SQL Server 2005? I can do that when I use Visual Studio 2005, but when I try to execute the package with the Execute Package Utility I get many errors of the form

Error: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "Data Conversion 1" (49).

The only data conversion I perform is double-byte characters to single-byte characters.

(Maybe I should ask this in a separate thread?)

Dan

|||

Are you trying to run the package on the same machine where it worked from the designer? What edition of the product do you have installed?

It does not seem like new features are needed for this. You just need a proper edition of the product on the machine where you run the package.

Thanks.

|||

Hi Bob,

I'm having a similar issue to Dan's, with two notable differences: first, I have over 300 tables to transfer, and second, most of the tables have identity columns.

When I went through the wizard with "Optimize for Many Tables" checked, it ignored the adjustments to accept the identity values and created new values. Is there a way to adjust this script to accept the identity values without setting up 300+ parallel data flows? The components are identical to what Dan described in the first posting.

If this can't be done, other suggestions are welcome.

Thanks - Gary

|||

Hi Gary,

unfortunately you have hit another issue with the transfer tables task that I was referring to in the previous post. Currently, it is not possible to pass the identity column settings when the "Optimize for many tables" option is selected.

The best workaround I can offer is to copy your tables in multiple batches (50 tables each should work; it might take up to 100 depending on your hardware, but you would need to test it) with the "Optimize for many tables" option unchecked.

Thanks.

|||

Bob,

Thanks for your reply.

Yes, I am trying to run the package on the same machine where it worked from the designer. But I am trying to run a "file system" copy of the package that I placed on the network drive. I am out of the office at the moment, so I cannot try running the exact copy on my PC hard drive, but I will be back in the office in a few days and can try doing so at that time.

We upgraded to SP2 a month or two ago, for SQL Server 2005. Did you need more "edition" information? If so, I will provide it on Friday, or so.

I just figured that this task (transfer from Access, converting the Access tables' single-byte characters to double-byte characters, as seems to be usual for SSIS input, then back to single-byte characters for placement in the SQL Server 2005 tables) was a capability that went beyond the intent of the "standalone" package runner (outside Visual Studio), the Execute Package Utility.

I am pleased to learn that I may not need to use Visual Studio 2005 to perform this movement of data from Access to SQL Server 2005.

Dan

|||

Bob,

I am back at my desk, where I tried to run the SSIS package that moves approx. 50 tables from MS Access to SQL Server 2005.

The Access version is 2003 (11.6566.8132) SP2.

The SQL Server version is 9.0.3042

The only version information I see in the About box for "About DTExecUI" is "Version: 1.0". Is there some other place I should be seeking version information for this product?

Dan

|||

Dan,

I was asking about the edition of your SQL Server installation; is it Developer, Standard or Enterprise edition?

Thanks.

|||

Bob,

We have the Enterprise edition of SQL Server 2005 in the environment where I am trying to perform the task.

Dan

|||

Have you installed the entire SSIS module on all of those machines as well?

Thanks,

Bob

Import and Export foreign characters (non-DBCS only)

Hi
I have a table which stores the customer information...There is a Customer
Address column defined as nvarchar(50)...Data for the table is imported
using DTS Bulk Insert from a Customer.dat file...
My question: -
1. What are the requirements for me to view the foreign characters (Arabic,
Turkish, German, Italian etc...) using an editor. What setup needs to be
done on the server where the files reside.
2. How do I make the .dat file as a unicode file ?
3. Can I see the foreign characters by a simple select in query analyzer?
4. What are the requirements to export the file to contain all those foreign
characters?
Ps: - Focus is not on DBCS characters (Kanji characters etc...)
Any help would be much appreciated....
Thanks in advance...
Imtiaz|||See below.
Steve Kass
Drew University
Imtiaz wrote:

>Hi
>I have a table which stores the customer information...There is a Customer
>Address column defined as nvarchar(50)...Data for the table is imported
>using DTS Bulk Insert from a Customer.dat file...
>My question: -
>1. What are the requirements for me to view the foreign characters (Arabic,
>Turkish, German, Italian etc...) using an editor. What setup needs to be
>done on the server where the files reside.
>
As long as the data is stored as Unicode, and the client is Unicode
compatible, all you need is Unicode fonts in your editor.

>2. How do I make the .dat file as a unicode file ?
>
I don't know about DTS, but I think that bcp will leave out the
two-byte Unicode byte-order marker at the beginning of the file.
You may need to insert the two-byte sequence FFFE yourself
so that other applications will handle it correctly. For importing
back to SQL Server, this might be the wrong thing to do, though.

>3. Can I see the foreign characters by a simple select in query analyzer?
>
Yes. Just make sure the font in your results window is a Unicode font
(Arial MS Unicode, for example).

>4. What are the requirements to export the file to contain all those foreign
>characters?
>
Just export as Unicode and it should be fine.

>Ps: - Focus is not on DBCS characters (Kanji characters etc...)
>Any help would be much appreciated....
>Thanks in advance...
>Imtiaz
>
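Steve's point about the two-byte FF FE byte-order marker can be sketched like this. The snippet below is a hypothetical Python sketch (the file name and address lines are made up): it writes a .dat file as little-endian UTF-16 with the marker prepended, so other applications recognize the file as Unicode.

```python
import codecs

def write_unicode_dat(path, lines):
    """Write lines to a UTF-16LE file, prefixed with the FF FE byte-order mark."""
    with open(path, "wb") as f:
        f.write(codecs.BOM_UTF16_LE)               # the FF FE marker
        for line in lines:
            f.write((line + "\r\n").encode("utf-16-le"))

if __name__ == "__main__":
    # Hypothetical customer addresses with non-ASCII characters.
    write_unicode_dat("customers.dat", ["Müllerstraße 5", "İstanbul Cd. 12"])
    with open("customers.dat", "rb") as f:
        print(f.read(2).hex())  # the first two bytes are the marker
```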

Friday, February 24, 2012

import 1 mln records = QA is hung up

Hi
I encountered a problem when trying to import data from a flat table into
several tables of the same database.
There are about 1 million records inside the flat table, called XX_ZRODLO. The
stored procedure for the import seems to work and imports data, but after about
15,000 records it stops and it seems as though QA is hung up.
After restarting SQL and shutting down QA, the procedure can import about
another 30,000 records, and then the same thing happens again: QA is dead...
Where could the problem be?
This is the procedure to import data (1 million records) from table XX_ZRODLO into
several tables of the database according to its structure.
CREATE procedure XX_IMPORT
as
--##### SEKCJA DEKLARACJI (declaration section)
declare @nazwa nvarchar(255)
declare @id int
declare @id_woj int
declare @id_miasto int
declare @ulica nvarchar(255)
declare @telefon nvarchar(255)
declare @id_branza int
declare @www nvarchar(255)
declare @mail nvarchar(255)
--nowo dodane (newly added)
declare @kod nvarchar(255)
declare @numer_pos nvarchar(255)
declare @kierunkowy nvarchar(255)
declare @tb_firma_identity int
declare @tb_adres_identity int
--koniec (end)
declare @id_firma int
declare @id_adres int
set @nazwa = ''
set @id = 0
set @id_woj = 0
set @id_miasto = 0
set @ulica = ''
set @telefon = ''
set @kierunkowy = ''
set @id_branza = 0
set @www = ''
set @mail = ''
--nowo dodane (newly added)
set @kod = ''
set @numer_pos = ''
set @kierunkowy = ''
--koniec (end)
-- ####### KONIEC SEKCJI DEKLARACJI (end of declaration section)
--######## POCZATEK (start)
begin
declare tb cursor for (select
nazwa, id, id_woj, id_miasto, ulica, telefon, id_branza, www, mail, kod, numer_pos, kierunkowy
from xx_zrodlo where id_portal is null)
open tb
fetch next from tb into
@nazwa, @id, @id_woj, @id_miasto, @ulica, @telefon, @id_branza, @www, @mail, @kod, @numer_pos, @kierunkowy
while @@fetch_status = 0
begin
IF @id_miasto <> 0 AND @id_miasto is not NULL AND @id_woj <> 0 AND @id_woj is
not NULL AND @id_branza <> 0 AND @id_branza is not NULL
BEGIN
INSERT INTO TB_FIRMA (nazwa_firma, import) VALUES (@nazwa, @id)
select @@Identity
set @tb_firma_identity = @@Identity
--select @id_firma = max(idTB_FIRMA) FROM TB_FIRMA
UPDATE XX_ZRODLO set id_portal = 1 where id = @id
INSERT INTO TB_FIRMA_BRANZA (idTB_FIRMA, idTB_BRANZA, HIERARCHIA, import)
VALUES (@tb_firma_identity, @id_branza, 30000, 1)
INSERT INTO TB_ADRES (ulica, nr_adresowy, kod, idTB_MIEJSOWOSC) VALUES
(@ulica, @numer_pos, @kod, @id_miasto)
select @@Identity
set @tb_adres_identity = @@Identity
--select @id_adres = max(idTB_ADRES) from TB_ADRES
INSERT INTO TB_ADRES_FIRMA (idTB_FIRMA, idTB_ADRES, import) VALUES
(@tb_firma_identity, @tb_adres_identity, 1)
--przypadek telefonu komórkowego (mobile phone case)
--IF substring(@telefon,1,1) = 0
IF @kierunkowy = 0
BEGIN
INSERT INTO TB_KONTAKT (WARTOSC, idTB_NAZWA_KONTAKT, idTB_ADRES, import) VALUES
(@telefon, 5, @tb_adres_identity, 1)
END
--przypadek telefonu stacjonarnego (landline phone case)
--IF substring(@telefon,1,1) != 0
IF @kierunkowy <> 0 AND @kierunkowy is not null
BEGIN
INSERT INTO TB_KONTAKT (WARTOSC, idTB_NAZWA_KONTAKT, idTB_ADRES, import) VALUES
(@telefon, 2, @tb_adres_identity, 1)
END
--przypadek adresu www (web address case)
IF len(@www) > 1
BEGIN
INSERT INTO TB_KONTAKT (WARTOSC, idTB_NAZWA_KONTAKT, idTB_ADRES, import) VALUES
(@www, 3, @tb_adres_identity, 1)
END
--przypadek adresu email (e-mail address case)
IF len(@mail) > 1
BEGIN
INSERT INTO TB_KONTAKT (WARTOSC, idTB_NAZWA_KONTAKT, idTB_ADRES, import) VALUES
(@mail, 4, @tb_adres_identity, 1)
END
END
fetch next from tb into
@nazwa, @id, @id_woj, @id_miasto, @ulica, @telefon, @id_branza, @www, @mail, @kod, @numer_pos, @kierunkowy
END
end
close tb
deallocate tb
GO
|||Hi
It seems that the id column is a natural key for this, and therefore you
could do this using set-based operations and not a cursor. This would speed
up your process.
John
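John's set-based suggestion amounts to replacing the row-by-row cursor with one INSERT ... SELECT per target table. The sketch below is hypothetical (sqlite3 stands in for SQL Server, the schema is heavily simplified, and the identity bookkeeping is omitted), but it shows the shape: one statement moves every qualifying row at once instead of fetching them one by one.

```python
import sqlite3

def set_based_import(conn):
    """Move all unimported source rows into the target table in one statement."""
    qualifying = "id_portal IS NULL AND id_miasto <> 0 AND id_branza <> 0"
    # One set-based insert replaces the per-row cursor loop.
    conn.execute(
        "INSERT INTO tb_firma (nazwa_firma, import) "
        f"SELECT nazwa, id FROM xx_zrodlo WHERE {qualifying}"
    )
    # Mark the processed source rows, mirroring the cursor's per-row UPDATE.
    conn.execute(f"UPDATE xx_zrodlo SET id_portal = 1 WHERE {qualifying}")
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE xx_zrodlo (id INTEGER, nazwa TEXT, "
                 "id_miasto INTEGER, id_branza INTEGER, id_portal INTEGER)")
    conn.execute("CREATE TABLE tb_firma (nazwa_firma TEXT, import INTEGER)")
    conn.executemany(
        "INSERT INTO xx_zrodlo VALUES (?, ?, ?, ?, NULL)",
        [(1, "Firma A", 10, 20), (2, "Firma B", 0, 20), (3, "Firma C", 11, 21)],
    )
    set_based_import(conn)
    print(conn.execute("SELECT COUNT(*) FROM tb_firma").fetchone()[0])
```

For the dependent tables (TB_ADRES, TB_KONTAKT, etc.), the same idea applies by joining back on the natural key instead of carrying @@IDENTITY through a loop.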
"Dariusz Tomon" <d.tomon@.mazars.pl> wrote in message
news:%23Qw81E1zGHA.3512@.TK2MSFTNGP04.phx.gbl...
|||
Launch another QA session and look at the sysprocesses table - is there
blocking? Does the spid that your import is using show a waittype? Are
you seeing a delay due to autogrow?
Tracy McKibben
MCDBA
http://www.realsqlguy.com