
Friday, March 30, 2012

import rpt file to SQL Server 2005

Hi,

I know how to import a .rpt file into an Access database using the import feature.

Can somebody tell me how to import one into a SQL Server 2005 database, both a front-end solution (using some utility or interface) and a back-end solution (directly accessing the server)?

Thanks,

Fahim.

hi,

I'm not sure I understood your question...

.rpt could stand for "report file", but SQL Server itself has nothing to do with client-side concerns like reports.. so, the question is, where do you want to import those .rpt files? Do you want to store them in a database table? And if so, what do you want to do with those files, given that SQL Server cannot use them to render reports?..

a "branch" of SQL Server, available in SQLExpress edition as well, is devoted to reporting, using Reporting Services, an additional "engine" responsible for the purpose..

regards

|||

In addition to Andrea's reply: if you want to convert your .rpt file into a Reporting Services report, you will have to do manual work, as the queries within the report will have to be translated to SQL Server syntax. In addition, the migration assistant for reports won't transfer the underlying tables to SQL Server; you will have to do that on your own.
Jens K. Suessmeyer

http://www.sqlserver2005.de
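
As an illustration of the kind of syntax translation involved (a hypothetical example, not from this thread): Access expressions such as IIf() have no direct equivalent in SQL Server 2005 and become CASE expressions.

-- Access: SELECT IIf(Amount > 0, "credit", "debit") AS EntryType FROM Ledger
-- T-SQL equivalent:
SELECT CASE WHEN Amount > 0 THEN 'credit' ELSE 'debit' END AS EntryType
FROM dbo.Ledger;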

|||

Thanks for your reply,

I am currently doing it like this:

-- take a comma-delimited file

-- convert it into an .rpt file using Excel

-- manually insert it into an Access database table

Now I want to upgrade this process to:

-- SQL Server 2005 (I know how to upgrade the database to this engine)

-- insert the .rpt file into a SQL table (from the front end)

I hope my question is clear.

thanks for your help.

|||

hi,

FahimatMicrosoftForum wrote:

Thanks for your reply,

I am currently doing it like this:

-- take a comma-delimited file

-- convert it into an .rpt file using Excel

-- manually insert it into an Access database table

Now I want to upgrade this process to:

-- SQL Server 2005 (I know how to upgrade the database to this engine)

-- insert the .rpt file into a SQL table (from the front end)

if you want to insert "files" into a database table, you can perhaps bulk insert them... or you can handle the task via client code, reading each file into memory and passing the resulting byte array to the Transact-SQL INSERT INTO command..
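
As a minimal sketch of that idea (the table name and file path are assumptions, not from this thread), SQL Server 2005 can pull a whole file into a varbinary(max) column server-side with OPENROWSET ... SINGLE_BLOB:

CREATE TABLE dbo.ReportFiles
(FileName varchar(260) NOT NULL,
FileContent varbinary(max) NOT NULL);

-- read the entire .rpt file as one binary value and store it
INSERT INTO dbo.ReportFiles (FileName, FileContent)
SELECT 'monthly.rpt', src.BulkColumn
FROM OPENROWSET(BULK 'C:\reports\monthly.rpt', SINGLE_BLOB) AS src;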

I hope my question is clear.

absolutely not... at least not to me

regards

|||

FahimatMicrosoftForum wrote:

Hi,

I know how to import the .rpt file into Access database using import.

can some body tell about how to import to SQL Server 2005 database, both front end (using some utility or interface) and back end (directly accessing the server) solution.

Thanks,

Fahim.

Hi,

I am also looking for a solution. I am importing Excel, txt, and pdf files into SQL Server 2005. Have you found a solution?

|||

Search Books Online for the bcp utility; it should allow you to import various types of files.

Mike

|||

Excel and text files are supported by bcp; bcp does not support importing PDF files, as their file structure is not bulk-importable.
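
Since the data in this thread starts out as a comma-delimited file, here is a minimal sketch (the table layout and path are assumptions) of loading it straight into a SQL Server 2005 table with BULK INSERT, the in-database counterpart of bcp, skipping the Excel/.rpt detour entirely:

CREATE TABLE dbo.MonthlyData
(Col1 varchar(100),
Col2 varchar(100));

BULK INSERT dbo.MonthlyData
FROM 'C:\data\monthly.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');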

Jens K. Suessmeyer


http://www.sqlserver2005.de

import question

I have an Excel comma-separated file with 2 columns.

Column A contains names and Column B contains zips.

The problem is that Column A is not consistent in its structure. Some rows contain only one name, while others contain two last names separated by a space or a slash.

For example

BOWERING WILKENSON
GEERS/DOODS

I'm trying to clean the data before importing it into a SQL table. Where Column A has two names, I want to separate the record and move the second name to a new row. Is there a method to do this during the import? Or within Excel prior to the import, using replace? I want to find instances where there is a space or slash and move the name after the space or slash to a new row.

Thanks,
-D-

|||

I think in Excel you can do a Text to Columns and put "/" or space as the delimiter.

|||

Instead of cleaning it up first, just import the first column into a temporary table and clean it by using SUBSTRING and CHARINDEX.
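
A minimal sketch of that temporary-table approach (the table and column names are assumptions): normalize the slash to a space so a single delimiter remains, then emit a second row for each two-name value.

CREATE TABLE #Names (FullName varchar(100), Zip varchar(10));

-- normalize '/' to ' ' so one delimiter handles both cases
UPDATE #Names SET FullName = REPLACE(FullName, '/', ' ');

-- first (or only) name of each row
SELECT CASE WHEN CHARINDEX(' ', FullName) > 0
THEN LEFT(FullName, CHARINDEX(' ', FullName) - 1)
ELSE FullName END AS Name, Zip
FROM #Names
UNION ALL
-- second name of each two-name row, moved to its own row
SELECT SUBSTRING(FullName, CHARINDEX(' ', FullName) + 1, 100), Zip
FROM #Names
WHERE CHARINDEX(' ', FullName) > 0;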

Import Prs

Hey guys,

I was curious how other people create several hundred SQL objects on a
new database. Example: I have each procedure in its own text file (i.e.,
100 .sql files), so I just check them out of source control, then run
a Perl script that just dumps them into one txt file. Then I paste
that mess into Query Analyzer.

So to reiterate my question: how do other people get SQL objects
into a database? Obviously I'd do an object copy if they resided in
some other database.

While my solution works, there is always a better way. Thanks for
your suggestions.

- Cpt

|||

I don't know Perl but I assume you could execute OSQL from your script.
Another method is to invoke a Windows FOR command in a command prompt
window to execute OSQL. For example:

CD C:\SQLScripts
REM run every .sql file via OSQL with Windows authentication; inside a batch file use %%v instead of %v
FOR %v in (*.sql) DO OSQL -i "%v" -o "%v.out" -E

--
Hope this helps.

Dan Guzman
SQL Server MVP

--------
SQL FAQ links (courtesy Neil Pike):

http://www.ntfaq.com/Articles/Index...epartmentID=800
http://www.sqlserverfaq.com
http://www.mssqlserver.com/faq
--------

"CptVorpal" <cptvorpal@.hotmail.com> wrote in message
news:76cffbe0.0310081530.6c7a034b@.posting.google.c om...
> Hey guys,
> I was curious how other people create several 100 SQL objects on a
> new database. Example: I have each procedure in its own text file (IE
> 100 .sql files), so I just check them out of source control, then run
> a perl script that just dumps them into one txt file. Then I paste
> that mess into Query Analyzer.
> So to reiterate my question, how do other people get sql objects
> into a database. Obviously I'd do an object copy if they resided in
> some other database.
> While my solution works, there is always a better way. Thanks for
> your suggestions.
> - Cpt

|||

cptvorpal@.hotmail.com (CptVorpal) wrote in message news:<76cffbe0.0310081530.6c7a034b@.posting.google.com>...
> Hey guys,
> I was curious how other people create several 100 SQL objects on a
> new database. Example: I have each procedure in its own text file (IE
> 100 .sql files), so I just check them out of source control, then run
> a perl script that just dumps them into one txt file. Then I paste
> that mess into Query Analyzer.
> So to reiterate my question, how do other people get sql objects
> into a database. Obviously I'd do an object copy if they resided in
> some other database.
> While my solution works, there is always a better way. Thanks for
> your suggestions.
> - Cpt

You could start by looking at using OSQL.EXE to run your .sql scripts
from the command line - it should be straightforward to wrap that in a
script of some sort which gets the list of scripts, then executes them
one by one. You can trap the output from OSQL and use it for error
handling as well.

Simon

|||

[posted and mailed]

CptVorpal (cptvorpal@.hotmail.com) writes:
> I was curious how other people create several 100 SQL objects on a
> new database. Example: I have each procedure in its own text file (IE
> 100 .sql files), so I just check them out of source control, then run
> a perl script that just dumps them into one txt file. Then I paste
> that mess into Query Analyzer.
> So to reiterate my question, how do other people get sql objects
> into a database. Obviously I'd do an object copy if they resided in
> some other database.
> While my solution works, there is always a better way. Thanks for
> your suggestions.

The suggestion from Dan and Simon to use OSQL is a good one. I'll
add that you can invoke OSQL for each file. This can help to
track any errors.

For faster execution you could look into connecting to the database
from Perl using some interface. (There is a short overview on my
web site at: http://www.algonet.se/~sommar/mssql...ternatives.html.)

And if you want a ton of bells and whistles, you can look at
http://www.abaris.se/abaperls/. This is the load tool that we use
in our shop, and as the name indicates it's all Perl. You could
say that I started where you are now, and this is what I have seven
years later. :-)

--
Erland Sommarskog, SQL Server MVP, sommar@.algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

Import problem with varchar(max) field

I'm trying to import some Assessor data from a text file into a table and the field for Legal Description (column 2 in the source text file) in the table is of data type varchar(max) because some of the data goes over the 8K size. I get an error on the first row of importing that refers to column 2 (see 'Initial errors' below). I read the related post and changed the size of input column 2 to 8000 and got this error. Finally I set the size of the of input column 2 to 4000 and it ran. So I'm thinking there is a limit on the size of varchar data that can be imported. Just want to clarify what that limit is and how I might go about importing this data.

Thanks, John

Error with input column 2 set to size of 8000:

Setting Destination Connection (Error)

Messages

Error 0xc0204016: DTS.Pipeline: The "output column "Column 2" (388)" has a length that is not valid. The length must be between 0 and 4000.
(SQL Server Import and Export Wizard)

Exception from HRESULT: 0xC0204016 (Microsoft.SqlServer.DTSPipelineWrap)

Initial errors:

Executing (Error)

Messages

Error 0xc02020a1: Data Flow Task: Data conversion failed. The data conversion for column "Column 2" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".
(SQL Server Import and Export Wizard)

Error 0xc020902a: Data Flow Task: The "output column "Column 2" (18)" failed because truncation occurred, and the truncation row disposition on "output column "Column 2" (18)" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
(SQL Server Import and Export Wizard)

Error 0xc0202092: Data Flow Task: An error occurred while processing file "\\Scux00\assrdumps\SQLServerDB\exportsql.txt" on data row 1.
(SQL Server Import and Export Wizard)

Error 0xc0047038: Data Flow Task: The PrimeOutput method on component "Source - exportsql_txt" (1) returned error code 0xC0202092. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
(SQL Server Import and Export Wizard)

Error 0xc0047021: Data Flow Task: Thread "SourceThread0" has exited with error code 0xC0047038.
(SQL Server Import and Export Wizard)

Error 0xc0047039: Data Flow Task: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
(SQL Server Import and Export Wizard)

Error 0xc0047021: Data Flow Task: Thread "WorkThread0" has exited with error code 0xC0047039.
(SQL Server Import and Export Wizard)

There is a limit of 4000 on the length of Unicode character strings; non-Unicode character strings have a limit of 8000. Ensure that you don't have type conversions between varchar and nvarchar.

|||

I'm trying the SQL Server Import and Export Wizard. I've set the DataType of most of the fields to 'string [DT_STR]' and the Unicode box is unchecked, but when I look at the column mappings after clicking 'Edit Mappings', all the columns say they're type nvarchar on the screen, and it doesn't appear to be something I can change. If I highlight a column on that screen, it shows the setting that I set in the Advanced screen for the text file, i.e. Column 2 string [DT_STR](4000). Since I'm getting the limit of 4K it must be because of this. I'm not sure how to modify it. I'll save the package as a file and open it up in BIDS ... john
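
One way around the wizard's 4000-character column limit (a minimal sketch, not from the original thread; the table layout and the tab delimiter are assumptions) is to bypass SSIS and load with BULK INSERT, since the engine itself has no such restriction on varchar(max):

CREATE TABLE dbo.AssessorImport
(ParcelID varchar(50),
LegalDescription varchar(max));

BULK INSERT dbo.AssessorImport
FROM '\\Scux00\assrdumps\SQLServerDB\exportsql.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');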

Import Performance file Run Command

I have Windows 2000 Advanced Server with SQL Server 2000
Enterprise with SP3A. I'm trying to set up a job to import
Performance Monitor logs daily into a SQL Server table. How
would I set up Performance Monitor to stop at 7 AM, close
the present file, and create a new file at 7:02 AM to start
collecting statistics? Then I will DTS the file into a SQL
Server table.
My Performance Monitor job name is SAVE_STATISTICS. What
run command should I use to start my SAVE_STATISTICS at 7
AM and stop it daily at 7:02 AM?
It would be a great help to resolve this issue.
Thank You,
Dan
you could potentially use the native LOGMAN command from a remote XP or Windows
2003 computer to start and stop the logs at the specified time each day - i.e.
a daily scheduled task (Control Panel / Scheduled Tasks) to do it.
And then, as part of the definition of the Perfmon log schedule, add the
"Run this command" option to have a batch file that calls DTSRUN and executes your
DTS package to import the data.
a bit messy I admit
regards
Andy.
"Dan" <anonymous@.discussions.microsoft.com> wrote in message
news:26ad201c461ce$4a260760$a301280a@.phx.gbl...
>
> I have Windows 2000 Advanced Server with SQL Server 2000
> Enterprise with SP3A. I trying to set up a job to import
> Performance Monitors daily into SQL Server table. How
> would I set up Performance Monitor to stop at 7AM close
> the present file, create a new file at 7:02AM start
> collecting statistics. Then I will DTS file into a SQL
> Server table.
> My performance monitor job name is SAVE_STATISTICS what
> run command should I use to start my SAVE_STATISTICS at 7
> AM and stop it daily 7:02 AM?
> This would be a very help to resolve this issue.
> Thank You,
> Dan
>
|||

Hi
I replied to your previous post on this subject! Perfmon has its own method
of scheduling, through the Scheduling tab on the Properties dialog. You can
tell Perfmon to stop the performance monitor and start a new log after a
given period. You can also specify that it runs a command when this happens.
This could be your DTS job. To change the name of the input file, check out:
http://www.sqldts.com/default.aspx?200
John
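
Once the closed log is available as a CSV file, a minimal sketch (the table layout and path are assumptions; the column set depends on the counters chosen, and Perfmon CSV logs quote their values, so a real load needs more cleanup) of pulling it into a table server-side:

CREATE TABLE dbo.PerfStats
(SampleTime varchar(30),
Counter1 varchar(30),
Counter2 varchar(30));

BULK INSERT dbo.PerfStats
FROM 'C:\PerfLogs\SAVE_STATISTICS.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2); -- skip the header row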
"Dan" <anonymous@.discussions.microsoft.com> wrote in message
news:26ad201c461ce$4a260760$a301280a@.phx.gbl...
>
> I have Windows 2000 Advanced Server with SQL Server 2000
> Enterprise with SP3A. I trying to set up a job to import
> Performance Monitors daily into SQL Server table. How
> would I set up Performance Monitor to stop at 7AM close
> the present file, create a new file at 7:02AM start
> collecting statistics. Then I will DTS file into a SQL
> Server table.
> My performance monitor job name is SAVE_STATISTICS what
> run command should I use to start my SAVE_STATISTICS at 7
> AM and stop it daily 7:02 AM?
> This would be a very help to resolve this issue.
> Thank You,
> Dan
>

Import Performance file into SQL Server table

I have Windows 2000 Advanced Server with SQL Server 2000
Enterprise with SP3A. I'm trying to set up a job to import
Performance Monitor logs daily into a SQL Server table. How
would I set up Performance Monitor to stop at 7 AM, close
the present file, and create a new file at 7:01 AM to start
collecting statistics? Then I will DTS the file into a SQL
Server table.
It would be a great help to resolve this issue.
Thank You,
Dan
Hi
You can schedule the start/stop of performance monitor from within Perfmon
itself. This also allows you to run a script when you close it.
John
"Dan" <anonymous@.discussions.microsoft.com> wrote in message
news:265b501c46109$08226bd0$a301280a@.phx.gbl...
> I have Windows 2000 Advanced Server with SQL Server 2000
> Enterprise with SP3A. I trying to set up a job to import
> Performance Monitors daily into SQL Server table. How
> would I set up Performance Monitor to stop at 7AM close
> the present file, create a new file at 7:01AM start
> collecting statistics. Then I will DTS file into a SQL
> Server table.
> This would be a very help to resolve this issue.
> Thank You,
> Dan

Import Password Protected File Via DTS

I am trying to import a password-protected Excel file into SQL Server
using DTS. I am getting an error that it can't decrypt the file. Does
anyone know how I can supply the password for the file during the DTS
execution? Please help.

Thanks,
Michelle

|||

The Excel connection object has a Password property - you can set it
using a Dynamic Properties task or an ActiveX Script task.

Simon

import of data in text file How ?

Hello group,
Each month we want to import data from an ASCII file.
The records are of a fixed length (some spare space is added to fill them
up), and there are 12 different types of records (variants).
Each type has a different (but within the type always the same) layout.
Each record starts with a type indication, then the fields follow.
One type can have 2 fields; another type can have 15 fields.
Records following each other can have relations, so a type 2 record
belongs to the type 1 record that precedes it.
Example with 3 types:
1 AAAAAABB
2 CCCDDDDDDEEEEFFGGHH
2 CCCDDDDDDEEEEFFGGHH
3 JJJJJJKKKKKKKKKLLMMMMMMNOPPPPQQQQ
1 AAAAAABB
2 CCCDDDDDDEEEEFFGGHH
The letters are indications for the different fields.
Any suggestions how to solve this?
In our proprietary system this was solved by reading the record
into a Pascal record with all twelve variants and then writing it to the
proprietary database. We could reuse the technique of the program,
but cannot reuse the program itself.
And we still have to split up each record into fields.
Our proprietary database just accepted the 'binary' data as is.
ben brugman

|||

Ben,
You can reimplement your method of reading a record at a time and inserting
the row into the proper destination by using a DTS package and doing some
scripting for each row.
How I tend to do things like this is:
BULK INSERT the file to a work table
CREATE TABLE BulkInsertWork
(RowID INT IDENTITY,
RowText NVARCHAR(1000))
TRUNCATE TABLE BulkInsertWork
BULK INSERT MyDB.dbo.BulkInsertWork FROM 'c:\MyMonthlyFile.txt' ...
Then I write a stored procedure that parses out the specific rows, such as:
INSERT INTO Table1 (ColumnA, ColumnB)
SELECT SUBSTRING(RowText,3,6), SUBSTRING(RowText,9,2)
FROM BulkInsertWork
etc. for each type
FWIW
Russell Fields
http://www.sqlpass.org/events/seattle03
2003 PASS Community Summit - Seattle
- The largest user-event dedicated to SQL Server! Register TODAY!
"ben brugman" <ben@.niethier.nl> wrote in message
news:bob7k7$4dj$1@.reader11.wxs.nl...
> Hello group,
> Each month we want to import data from an ascii file,
> the records are of a fixed length (some spare space is added to fill them
> up)
> there are 12 different types of records (variants)
> Each type has a different (but within the type always the same) build.
> Each record starts with a type indication. Then the records follows.
> One type can have 2 fields the other type can have 15 fields.
> Records following eachother can have relations, so a type two record
> belongs to the type 1 record before the type 2 record.
> Example with 3 types
> 1 AAAAAABB
> 2 CCCDDDDDDEEEEFFGGHH
> 2 CCCDDDDDDEEEEFFGGHH
> 3 JJJJJJKKKKKKKKKLLMMMMMMNOPPPPQQQQ
> 1 AAAAAABB
> 2 CCCDDDDDDEEEEFFGGHH
> The letters are indications for the different fields.
> Any suggestions how to solve this ?
> In our propriety system this was solved by reading the record
> into a pascal record with all twelf variants and then writing it to the
> propriety database. We could reuse the technique of the program,
> but can not reuse the program.
> And we still have to split up each record into fields.
> Our propriety database just accepted the 'binairy' data as is.
> ben brugman
>

|||

I immediately went ahead with your suggested method.
I created a .txt file from the example in the mail.
I changed the nvarchar in the table to varchar.
When inserting from the .txt file I get the following errors :
Server: Msg 4832, Level 16, State 1, Line 1
Bulk Insert: Unexpected end-of-file (EOF) encountered in data file.
Server: Msg 7399, Level 16, State 1, Line 1
OLE DB provider 'STREAM' reported an error. The provider did not give any
information about the error.
The statement has been terminated.
Added :
WITH (
DATAFILETYPE = 'char',
ROWTERMINATOR = '\n'
)
But still the same errors.
(Lastrow and rowsperbatch did not bring a solution either).
Please advise.
I also played around a bit with DTS, but the problem is that I can do an
import and then a selection, or I can do a split-up of the line into different fields,
but I cannot do them both.
(Or I would have to split up every line in every possible way and, after selection,
throw away the undesired results; or make a table for each type, insert every row into
every table, and then throw away the types which do not belong in the table. I think this
is pretty ugly.)
Ben brugman
"Russell Fields" <RussellFields@.NoMailPlease.Com> wrote in message
news:uf4ZQm7oDHA.1072@.TK2MSFTNGP09.phx.gbl...
> Ben,
> You can reimplement your method of reading a record at a time and
inserting
> the row into the proper destination by using a DTS package and doing some
> scripting for each row.
> How I tend to do things like this is:
> BULK INSERT the file to a work table
> CREATE TABLE BulkInsertWork
> (RowID INT IDENTITY,
> RowText NVARCHAR(1000))
> TRUNCATE TABLE BulkInsertWork
> BULK INSERT MyDB.dbo.BulkInsertWork FROM 'c:\MyMonthlyFile.txt' ...
> Then I write a stored procedure that parses out the specific rows, such
as:
> INSERT INTO Table1 (ColumnA, ColumnB)
> SELECT SUBSTRING(RowText,3,6), SUBSTRING(9,2)
> FROM BulkInsertWork
> etc. for each type
> FWIW
> Russell Fields
> http:/www.sqlpass.org/events/seattle03
> 2003 PASS Community Summit - Seattle
> - The largest user-even dedicated to SQL Server! Register TODAY!
>
> "ben brugman" <ben@.niethier.nl> wrote in message
> news:bob7k7$4dj$1@.reader11.wxs.nl...
> > Hello group,
> >
> > Each month we want to import data from an ascii file,
> > the records are of a fixed length (some spare space is added to fill
them
> > up)
> > there are 12 different types of records (variants)
> > Each type has a different (but within the type always the same) build.
> > Each record starts with a type indication. Then the records follows.
> > One type can have 2 fields the other type can have 15 fields.
> >
> > Records following eachother can have relations, so a type two record
> > belongs to the type 1 record before the type 2 record.
> >
> > Example with 3 types
> >
> > 1 AAAAAABB
> > 2 CCCDDDDDDEEEEFFGGHH
> > 2 CCCDDDDDDEEEEFFGGHH
> > 3 JJJJJJKKKKKKKKKLLMMMMMMNOPPPPQQQQ
> > 1 AAAAAABB
> > 2 CCCDDDDDDEEEEFFGGHH
> >
> > The letters are indications for the different fields.
> >
> > Any suggestions how to solve this ?
> >
> > In our propriety system this was solved by reading the record
> > into a pascal record with all twelf variants and then writing it to the
> > propriety database. We could reuse the technique of the program,
> > but can not reuse the program.
> > And we still have to split up each record into fields.
> > Our propriety database just accepted the 'binairy' data as is.
> >
> > ben brugman
> >
> >
>|||Ben,
A guess: The last row of the text file does not have the '\n'. If that is
so, append the character.
Less likely and more obscure: Are you on SQL Server 2000? If you are on
7.0, KB article 324122 has obscure problems that it is unlikely apply,
but...
http://support.microsoft.com/default.aspx?scid=kb;en-us;324122#appliesto
http://support.microsoft.com/default.aspx?scid=kb;EN-US;272292
http://support.microsoft.com/default.aspx?scid=kb;EN-US;197043
All error numbers that are in the range of 7300 to 7399 indicate a problem
with the provider, in this case STREAM.
Russell Fields
http://www.sqlpass.org/events/seattle03
2003 PASS Community Summit - Seattle
- The largest user-event dedicated to SQL Server! Register TODAY!
"ben brugman" <ben@.niethier.nl> wrote in message
news:eumzJHGpDHA.2188@.TK2MSFTNGP11.phx.gbl...
> I immeadiatly went ahead with your suggested method.
> I created a .txt file from the example in the mail.
> I changed the nvarchar in the table to varchar.
> When inserting from the .txt file I get the following errors :
> Server: Msg 4832, Level 16, State 1, Line 1
> Bulk Insert: Unexpected end-of-file (EOF) encountered in data file.
> Server: Msg 7399, Level 16, State 1, Line 1
> OLE DB provider 'STREAM' reported an error. The provider did not give any
> information about the error.
> The statement has been terminated.
>
> Added :
> WITH (
> DATAFILETYPE = 'char',
> ROWTERMINATOR = '\n'
> )
> But stil the same errors.
> (Lastrow and rowsperbatch did not bring a solution either).
> Please advise.
> Also played around a bit with DTS, but the problem is that I can do an
> import and then
> a selection. Or that I can do a split up of the line into different
fields.
> But I can not do them both.
> (Or I have to split up every line in every possible way and after
selection
> throwing away
> the not desired results. Make a table for each type and insert every row
in
> every table and
> then trowing away the types which do not belong in the table. I think this
> is pretty ugly).
> Ben brugman
>
>
>
> "Russell Fields" <RussellFields@.NoMailPlease.Com> wrote in message
> news:uf4ZQm7oDHA.1072@.TK2MSFTNGP09.phx.gbl...
> > Ben,
> >
> > You can reimplement your method of reading a record at a time and
> inserting
> > the row into the proper destination by using a DTS package and doing
some
> > scripting for each row.
> >
> > How I tend to do things like this is:
> > BULK INSERT the file to a work table
> >
> > CREATE TABLE BulkInsertWork
> > (RowID INT IDENTITY,
> > RowText NVARCHAR(1000))
> >
> > TRUNCATE TABLE BulkInsertWork
> >
> > BULK INSERT MyDB.dbo.BulkInsertWork FROM 'c:\MyMonthlyFile.txt' ...
> >
> > Then I write a stored procedure that parses out the specific rows, such
> as:
> >
> > INSERT INTO Table1 (ColumnA, ColumnB)
> > SELECT SUBSTRING(RowText,3,6), SUBSTRING(9,2)
> > FROM BulkInsertWork
> >
> > etc. for each type
> >
> > FWIW
> > Russell Fields
> > http:/www.sqlpass.org/events/seattle03
> > 2003 PASS Community Summit - Seattle
> > - The largest user-even dedicated to SQL Server! Register TODAY!
> >
> >
> >
> > "ben brugman" <ben@.niethier.nl> wrote in message
> > news:bob7k7$4dj$1@.reader11.wxs.nl...
> > > Hello group,
> > >
> > > Each month we want to import data from an ascii file,
> > > the records are of a fixed length (some spare space is added to fill
> them
> > > up)
> > > there are 12 different types of records (variants)
> > > Each type has a different (but within the type always the same) build.
> > > Each record starts with a type indication. Then the records follows.
> > > One type can have 2 fields the other type can have 15 fields.
> > >
> > > Records following eachother can have relations, so a type two record
> > > belongs to the type 1 record before the type 2 record.
> > >
> > > Example with 3 types
> > >
> > > 1 AAAAAABB
> > > 2 CCCDDDDDDEEEEFFGGHH
> > > 2 CCCDDDDDDEEEEFFGGHH
> > > 3 JJJJJJKKKKKKKKKLLMMMMMMNOPPPPQQQQ
> > > 1 AAAAAABB
> > > 2 CCCDDDDDDEEEEFFGGHH
> > >
> > > The letters are indications for the different fields.
> > >
> > > Any suggestions how to solve this ?
> > >
> > > In our propriety system this was solved by reading the record
> > > into a pascal record with all twelf variants and then writing it to
the
> > > propriety database. We could reuse the technique of the program,
> > > but can not reuse the program.
> > > And we still have to split up each record into fields.
> > > Our propriety database just accepted the 'binairy' data as is.
> > >
> > > ben brugman
> > >
> > >
> >
> >
After loads of trying and reading the KB articles, I'm still getting the same error
message.
In the end we deleted the identity column from "BulkInsertWork" and now the
insert works, but of course we have lost the order of the table.
I also tried some DTS work, but DTS is not very flexible when it comes to
variant records in a text file.
Thanks for your time,
and if you have suggestions to bring the identity column back,
I would appreciate that.
ben brugman
"Russell Fields" <RussellFields@.NoMailPlease.Com> wrote in message
news:uqOboZIpDHA.2404@.TK2MSFTNGP12.phx.gbl...
> Ben,
> A guess: The last row of the text file does not have the '\n'. If that
is
> so, append the character.
> Less likely and more obscure: Are you on SQL Server 2000? If you are on
> 7.0, KB article 324122 has obscure problems that it is unlikely apply,
> but...
> http://support.microsoft.com/default.aspx?scid=kb;en-us;324122#appliesto
> http://support.microsoft.com/default.aspx?scid=kb;EN-US;272292
> http://support.microsoft.com/default.aspx?scid=kb;EN-US;197043
> All error numbers that are in the range of 7300 to 7399 indicate a problem
> with the provider, in this case STREAM.
> Russell Fields
> http://www.sqlpass.org/events/seattle03
> 2003 PASS Community Summit - Seattle
> - The largest user-event dedicated to SQL Server! Register TODAY!
> "ben brugman" <ben@.niethier.nl> wrote in message
> news:eumzJHGpDHA.2188@.TK2MSFTNGP11.phx.gbl...
> > I immeadiatly went ahead with your suggested method.
> > I created a .txt file from the example in the mail.
> > I changed the nvarchar in the table to varchar.
> >
> > When inserting from the .txt file I get the following errors :
> >
> > Server: Msg 4832, Level 16, State 1, Line 1
> > Bulk Insert: Unexpected end-of-file (EOF) encountered in data file.
> > Server: Msg 7399, Level 16, State 1, Line 1
> > OLE DB provider 'STREAM' reported an error. The provider did not give
any
> > information about the error.
> > The statement has been terminated.
> >
> >
> > Added :
> > WITH (
> > DATAFILETYPE = 'char',
> > ROWTERMINATOR = '\n'
> > )
> >
> > But stil the same errors.
> > (Lastrow and rowsperbatch did not bring a solution either).
> >
> > Please advise.
> >
> > Also played around a bit with DTS, but the problem is that I can do an
> > import and then
> > a selection. Or that I can do a split up of the line into different
> fields.
> > But I can not do them both.
> > (Or I have to split up every line in every possible way and after
> selection
> > throwing away
> > the not desired results. Make a table for each type and insert every row
> in
> > every table and
> > then trowing away the types which do not belong in the table. I think
this
> > is pretty ugly).
> >
> > Ben brugman
> >
> >
> >
> >
> >
> >
> >
> > "Russell Fields" <RussellFields@.NoMailPlease.Com> wrote in message
> > news:uf4ZQm7oDHA.1072@.TK2MSFTNGP09.phx.gbl...
> > > Ben,
> > >
> > > You can reimplement your method of reading a record at a time and
> > inserting
> > > the row into the proper destination by using a DTS package and doing
> some
> > > scripting for each row.
> > >
> > > How I tend to do things like this is:
> > > BULK INSERT the file to a work table
> > >
> > > CREATE TABLE BulkInsertWork
> > > (RowID INT IDENTITY,
> > > RowText NVARCHAR(1000))
> > >
> > > TRUNCATE TABLE BulkInsertWork
> > >
> > > BULK INSERT MyDB.dbo.BulkInsertWork FROM 'c:\MyMonthlyFile.txt' ...
> > >
> > > Then I write a stored procedure that parses out the specific rows,
such
> > as:
> > >
> > > INSERT INTO Table1 (ColumnA, ColumnB)
> > > SELECT SUBSTRING(RowText,3,6), SUBSTRING(9,2)
> > > FROM BulkInsertWork
> > >
> > > etc. for each type
> > >
> > > FWIW
> > > Russell Fields
> > > http:/www.sqlpass.org/events/seattle03
> > > 2003 PASS Community Summit - Seattle
> > > - The largest user-even dedicated to SQL Server! Register TODAY!
> > >
> > >
> > >
> > > "ben brugman" <ben@.niethier.nl> wrote in message
> > > news:bob7k7$4dj$1@.reader11.wxs.nl...
> > > > Hello group,
> > > >
> > > > Each month we want to import data from an ascii file,
> > > > the records are of a fixed length (some spare space is added to fill
> > them
> > > > up)
> > > > there are 12 different types of records (variants)
> > > > Each type has a different (but within the type always the same)
build.
> > > > Each record starts with a type indication. Then the records follows.
> > > > One type can have 2 fields the other type can have 15 fields.
> > > >
> > > > Records following eachother can have relations, so a type two record
> > > > belongs to the type 1 record before the type 2 record.
> > > >
> > > > Example with 3 types
> > > >
> > > > 1 AAAAAABB
> > > > 2 CCCDDDDDDEEEEFFGGHH
> > > > 2 CCCDDDDDDEEEEFFGGHH
> > > > 3 JJJJJJKKKKKKKKKLLMMMMMMNOPPPPQQQQ
> > > > 1 AAAAAABB
> > > > 2 CCCDDDDDDEEEEFFGGHH
> > > >
> > > > The letters are indications for the different fields.
> > > >
> > > > Any suggestions how to solve this ?
> > > >
> > > > In our propriety system this was solved by reading the record
> > > > into a pascal record with all twelf variants and then writing it to
> the
> > > > propriety database. We could reuse the technique of the program,
> > > > but can not reuse the program.
> > > > And we still have to split up each record into fields.
> > > > Our propriety database just accepted the 'binairy' data as is.
> > > >
> > > > ben brugman
> > > >
> > > >
> > >
> > >
> >
> >
>|||Ben,
Yes, I am sorry that I glossed over this in my earlier quick explanation. (I
thought about it during the night, don't you know.)
You need a format file to tell bulk insert to skip over the Identity column.
Something like:
8.0
2
1 SQLCHAR 0 0 "" 1 ID SQL_Latin1_General_Cp437_BIN
2 SQLCHAR 0 1000 "\r\n" 2 Text SQL_Latin1_General_Cp437_BIN
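A minimal sketch of wiring that up (the format-file path is an assumption; the table and data file are the ones from earlier in the thread):

BULK INSERT MyDB.dbo.BulkInsertWork
FROM 'c:\MyMonthlyFile.txt'
WITH (FORMATFILE = 'c:\BulkInsertWork.fmt');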
Russell Fields
http://www.sqlpass.org/events/seattle03
2003 PASS Community Summit - Seattle
- The largest user-event dedicated to SQL Server! Register TODAY!
"ben brugman" <ben@.niethier.nl> wrote in message
news:OiAr$FTpDHA.3844@.tk2msftngp13.phx.gbl...
> After loads of trying, reading the KB file, still getting the same Error
> message.
> In the end we deleted the identity column from "BulkInsertWork" and now
the
> insert works, but offcourse we have lost the order of the table.
> Also did try some DTS work, but DTS is not very flexible when it comes to
> variant records in a text file.
> Thanks for your time,
> and if you have suggestions to bring the identity column back,
> I would appreciate that.
> ben brugman
>
> "Russell Fields" <RussellFields@.NoMailPlease.Com> wrote in message
> news:uqOboZIpDHA.2404@.TK2MSFTNGP12.phx.gbl...
> > Ben,
> >
> > A guess: The last row of the text file does not have the '\n'. If that
> is
> > so, append the character.
> >
> > Less likely and more obscure: Are you on SQL Server 2000? If you are
on
> > 7.0, KB article 324122 has obscure problems that it is unlikely apply,
> > but...
> > http://support.microsoft.com/default.aspx?scid=kb;en-us;324122#appliesto
> > http://support.microsoft.com/default.aspx?scid=kb;EN-US;272292
> > http://support.microsoft.com/default.aspx?scid=kb;EN-US;197043
> >
> > All error numbers that are in the range of 7300 to 7399 indicate a
problem
> > with the provider, in this case STREAM.
> >
> > Russell Fields
> > http://www.sqlpass.org/events/seattle03
> > 2003 PASS Community Summit - Seattle
> > - The largest user-event dedicated to SQL Server! Register TODAY!
> >
> > "ben brugman" <ben@.niethier.nl> wrote in message
> > news:eumzJHGpDHA.2188@.TK2MSFTNGP11.phx.gbl...
> > > I immeadiatly went ahead with your suggested method.
> > > I created a .txt file from the example in the mail.
> > > I changed the nvarchar in the table to varchar.
> > >
> > > When inserting from the .txt file I get the following errors :
> > >
> > > Server: Msg 4832, Level 16, State 1, Line 1
> > > Bulk Insert: Unexpected end-of-file (EOF) encountered in data file.
> > > Server: Msg 7399, Level 16, State 1, Line 1
> > > OLE DB provider 'STREAM' reported an error. The provider did not give
> any
> > > information about the error.
> > > The statement has been terminated.
> > >
> > >
> > > Added :
> > > WITH (
> > > DATAFILETYPE = 'char',
> > > ROWTERMINATOR = '\n'
> > > )
> > >
> > > But stil the same errors.
> > > (Lastrow and rowsperbatch did not bring a solution either).
> > >
> > > Please advise.
> > >
> > > Also played around a bit with DTS, but the problem is that I can do an
> > > import and then
> > > a selection. Or that I can do a split up of the line into different
> > fields.
> > > But I can not do them both.
> > > (Or I have to split up every line in every possible way and after
> > selection
> > > throwing away
> > > the not desired results. Make a table for each type and insert every
row
> > in
> > > every table and
> > > then trowing away the types which do not belong in the table. I think
> this
> > > is pretty ugly).
> > >
> > > Ben brugman
> > >
> > >
> > >
> > >
> > >
> > >
> > >
> > > "Russell Fields" <RussellFields@.NoMailPlease.Com> wrote in message
> > > news:uf4ZQm7oDHA.1072@.TK2MSFTNGP09.phx.gbl...
> > > > Ben,
> > > >
> > > > You can reimplement your method of reading a record at a time and
> > > inserting
> > > > the row into the proper destination by using a DTS package and doing
> > some
> > > > scripting for each row.
> > > >
> > > > How I tend to do things like this is:
> > > > BULK INSERT the file to a work table
> > > >
> > > > CREATE TABLE BulkInsertWork
> > > > (RowID INT IDENTITY,
> > > > RowText NVARCHAR(1000))
> > > >
> > > > TRUNCATE TABLE BulkInsertWork
> > > >
> > > > BULK INSERT MyDB.dbo.BulkInsertWork FROM 'c:\MyMonthlyFile.txt' ...
> > > >
> > > > Then I write a stored procedure that parses out the specific rows,
> such
> > > as:
> > > >
> > > > INSERT INTO Table1 (ColumnA, ColumnB)
> > > > SELECT SUBSTRING(RowText,3,6), SUBSTRING(9,2)
> > > > FROM BulkInsertWork
> > > >
> > > > etc. for each type
> > > >
> > > > FWIW
> > > > Russell Fields
> > > > http:/www.sqlpass.org/events/seattle03
> > > > 2003 PASS Community Summit - Seattle
> > > > - The largest user-even dedicated to SQL Server! Register TODAY!
> > > >
> > > >
> > > >
> > > > "ben brugman" <ben@.niethier.nl> wrote in message
> > > > news:bob7k7$4dj$1@.reader11.wxs.nl...
> > > > > Hello group,
> > > > >
> > > > > Each month we want to import data from an ascii file,
> > > > > the records are of a fixed length (some spare space is added to
fill
> > > them
> > > > > up)
> > > > > there are 12 different types of records (variants)
> > > > > Each type has a different (but within the type always the same)
> build.
> > > > > Each record starts with a type indication. Then the records
follows.
> > > > > One type can have 2 fields the other type can have 15 fields.
> > > > >
> > > > > Records following eachother can have relations, so a type two
record
> > > > > belongs to the type 1 record before the type 2 record.
> > > > >
> > > > > Example with 3 types
> > > > >
> > > > > 1 AAAAAABB
> > > > > 2 CCCDDDDDDEEEEFFGGHH
> > > > > 2 CCCDDDDDDEEEEFFGGHH
> > > > > 3 JJJJJJKKKKKKKKKLLMMMMMMNOPPPPQQQQ
> > > > > 1 AAAAAABB
> > > > > 2 CCCDDDDDDEEEEFFGGHH
> > > > >
> > > > > The letters are indications for the different fields.
> > > > >
> > > > > Any suggestions how to solve this ?
> > > > >
> > > > > In our propriety system this was solved by reading the record
> > > > > into a pascal record with all twelf variants and then writing it
to
> > the
> > > > > propriety database. We could reuse the technique of the program,
> > > > > but can not reuse the program.
> > > > > And we still have to split up each record into fields.
> > > > > Our propriety database just accepted the 'binairy' data as is.
> > > > >
> > > > > ben brugman
> > > > >
> > > > >
> > > >
> > > >
> > >
> > >
> >
> >
I'll try that on Monday.
At the moment I have done away with the identity column.
I am not sure if I will need it, because it seems that the only order dependency
between the records is the dependency on the very first record, which
only exists once.
I am using your method of
"SUBSTRING(RowText,3,6), SUBSTRING(RowText,9,2)"
But because there are 153 columns in total, I do not want to type this out,
so I am in the process of writing scripts to generate this code.
Up to now I have managed to get three tables done, but the script
still needs some 'hand editing' for each table.
Using constructs like:
SELECT 'INSERT INTO ' + @tablename + ' ('
SELECT 'SUBSTRING(RowText,' + CONVERT(varchar(3), offset) + ',' + CONVERT(varchar(3), length) + '),'
FROM metatable WHERE table_name = @tablename AND length <> 0
SELECT 'FROM BulkInsertWork'
Or something similar; then I cut and paste this into another QA window,
clean it up, and execute it.
Some things still to be solved are:
The extra comma behind the last SUBSTRING line.
The extra lines generated (with the number of rows affected and headers).
So I might go and do this in a string variable and try to get the last comma out.
I haven't decided yet if I am going to deliver the end script or enough tools to build
the script.
Still some problems with referential constraints and spaces which have to be
converted to NULLs (after or during inserting?).
Chopping all the problems up and going from intermediate result to intermediate
result makes the code less complex.
And some problems with the content of the columns coming from other lines.
(The first line contains a sort of number which should be included in almost
all records.)
Probably I'll use temporary tables to resolve all problems and then copy the content
to the production tables.
Thanks for your help.
ben brugman
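
A sketch of the string-variable approach to the trailing-comma problem (the metatable layout is the one described above; the other names are assumptions):

DECLARE @sql varchar(8000), @tablename sysname
SET @tablename = 'Table1'
SET @sql = ''

-- concatenate one SUBSTRING(...) per metatable row
SELECT @sql = @sql + 'SUBSTRING(RowText,'
+ CONVERT(varchar(3), offset) + ','
+ CONVERT(varchar(3), length) + '),'
FROM metatable
WHERE table_name = @tablename AND length <> 0

-- drop the trailing comma, then wrap with INSERT ... SELECT ... FROM
SET @sql = 'INSERT INTO ' + @tablename + ' SELECT '
+ LEFT(@sql, LEN(@sql) - 1) + ' FROM BulkInsertWork'

PRINT @sql -- paste the output into QA, or run it with EXEC(@sql)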
"Russell Fields" <RussellFields@.NoMailPlease.Com> wrote in message
news:ucvlkTTpDHA.1632@.TK2MSFTNGP10.phx.gbl...
> Ben,
> Yes, I am sorry that I glossed this in my earlier quick explanation. (I
> thought about it during the night, don't you know.)
> You need a format file to tell bulk insert to skip over the Identity
column.
> Something like:
> 8.0
> 2
> 1 SQLCHAR 0 0 "" 1 ID SQL_Latin1_General_Cp437_BIN
> 2 SQLCHAR 0 1000 "\r\n" 2 Text SQL_Latin1_General_Cp437_BIN
>
> Russell Fields
> http://www.sqlpass.org/events/seattle03
> 2003 PASS Community Summit - Seattle
> - The largest user-event dedicated to SQL Server! Register TODAY!
> "ben brugman" <ben@.niethier.nl> wrote in message
> news:OiAr$FTpDHA.3844@.tk2msftngp13.phx.gbl...
> > After loads of trying, reading the KB file, still getting the same Error
> > message.
> >
> > In the end we deleted the identity column from "BulkInsertWork" and now
> the
> > insert works, but offcourse we have lost the order of the table.
> >
> > Also did try some DTS work, but DTS is not very flexible when it comes
to
> > variant records in a text file.
> >
> > Thanks for your time,
> > and if you have suggestions to bring the identity column back,
> > I would appreciate that.
> >
> > ben brugman
> >
> >
> > "Russell Fields" <RussellFields@.NoMailPlease.Com> wrote in message
> > news:uqOboZIpDHA.2404@.TK2MSFTNGP12.phx.gbl...
> > > Ben,
> > >
> > > A guess: The last row of the text file does not have the '\n'. If
that
> > is
> > > so, append the character.
> > >
> > > Less likely and more obscure: Are you on SQL Server 2000? If you are
> on
> > > 7.0, KB article 324122 has obscure problems that it is unlikely apply,
> > > but...
> > >
http://support.microsoft.com/default.aspx?scid=kb;en-us;324122#appliesto
> > > http://support.microsoft.com/default.aspx?scid=kb;EN-US;272292
> > > http://support.microsoft.com/default.aspx?scid=kb;EN-US;197043
> > >
> > > All error numbers that are in the range of 7300 to 7399 indicate a
> problem
> > > with the provider, in this case STREAM.
> > >
> > > Russell Fields
> > > http://www.sqlpass.org/events/seattle03
> > > 2003 PASS Community Summit - Seattle
> > > - The largest user-event dedicated to SQL Server! Register TODAY!
> > >
> > > "ben brugman" <ben@.niethier.nl> wrote in message
> > > news:eumzJHGpDHA.2188@.TK2MSFTNGP11.phx.gbl...
> > > > I immeadiatly went ahead with your suggested method.
> > > > I created a .txt file from the example in the mail.
> > > > I changed the nvarchar in the table to varchar.
> > > >
> > > > When inserting from the .txt file I get the following errors :
> > > >
> > > > Server: Msg 4832, Level 16, State 1, Line 1
> > > > Bulk Insert: Unexpected end-of-file (EOF) encountered in data file.
> > > > Server: Msg 7399, Level 16, State 1, Line 1
> > > > OLE DB provider 'STREAM' reported an error. The provider did not
give
> > any
> > > > information about the error.
> > > > The statement has been terminated.
> > > >
> > > >
> > > > Added :
> > > > WITH (
> > > > DATAFILETYPE = 'char',
> > > > ROWTERMINATOR = '\n'
> > > > )
> > > >
> > > > But stil the same errors.
> > > > (Lastrow and rowsperbatch did not bring a solution either).
> > > >
> > > > Please advise.
> > > >
> > > > Also played around a bit with DTS, but the problem is that I can do
an
> > > > import and then
> > > > a selection. Or that I can do a split up of the line into different
> > > fields.
> > > > But I can not do them both.
> > > > (Or I have to split up every line in every possible way and after
> > > selection
> > > > throwing away
> > > > the not desired results. Make a table for each type and insert every
> row
> > > in
> > > > every table and
> > > > then trowing away the types which do not belong in the table. I
think
> > this
> > > > is pretty ugly).
> > > >
> > > > Ben brugman
> > > >
> > > >
> > > >
> > > >
> > > >
> > > >
> > > >
> > > > "Russell Fields" <RussellFields@.NoMailPlease.Com> wrote in message
> > > > news:uf4ZQm7oDHA.1072@.TK2MSFTNGP09.phx.gbl...
> > > > > Ben,
> > > > >
> > > > > You can reimplement your method of reading a record at a time and
> > > > > inserting the row into the proper destination by using a DTS package
> > > > > and doing some scripting for each row.
> > > > >
> > > > > How I tend to do things like this is:
> > > > > BULK INSERT the file to a work table
> > > > >
> > > > > CREATE TABLE BulkInsertWork
> > > > > (RowID INT IDENTITY,
> > > > > RowText NVARCHAR(1000))
> > > > >
> > > > > TRUNCATE TABLE BulkInsertWork
> > > > >
> > > > > BULK INSERT MyDB.dbo.BulkInsertWork FROM 'c:\MyMonthlyFile.txt' ...
> > > > >
> > > > > Then I write a stored procedure that parses out the specific rows,
> > > > > such as:
> > > > >
> > > > > INSERT INTO Table1 (ColumnA, ColumnB)
> > > > > SELECT SUBSTRING(RowText,3,6), SUBSTRING(RowText,9,2)
> > > > > FROM BulkInsertWork
> > > > >
> > > > > etc. for each type
> > > > >
> > > > > FWIW
> > > > > Russell Fields
> > > > > http://www.sqlpass.org/events/seattle03
> > > > > 2003 PASS Community Summit - Seattle
> > > > > - The largest user-event dedicated to SQL Server! Register TODAY!
> > > > >
> > > > >
> > > > >
> > > > > "ben brugman" <ben@.niethier.nl> wrote in message
> > > > > news:bob7k7$4dj$1@.reader11.wxs.nl...
> > > > > > Hello group,
> > > > > >
> > > > > > Each month we want to import data from an ASCII file;
> > > > > > the records are of a fixed length (some spare space is added to
> > > > > > fill them up).
> > > > > > there are 12 different types of records (variants)
> > > > > > Each type has a different (but within the type always the same)
> > > > > > build. Each record starts with a type indication, then the rest
> > > > > > of the record follows.
> > > > > > One type can have 2 fields the other type can have 15 fields.
> > > > > >
> > > > > > Records following each other can have relations, so a type 2
> > > > > > record belongs to the type 1 record that precedes it.
> > > > > >
> > > > > > Example with 3 types
> > > > > >
> > > > > > 1 AAAAAABB
> > > > > > 2 CCCDDDDDDEEEEFFGGHH
> > > > > > 2 CCCDDDDDDEEEEFFGGHH
> > > > > > 3 JJJJJJKKKKKKKKKLLMMMMMMNOPPPPQQQQ
> > > > > > 1 AAAAAABB
> > > > > > 2 CCCDDDDDDEEEEFFGGHH
> > > > > >
> > > > > > The letters are indications for the different fields.
> > > > > >
> > > > > > Any suggestions how to solve this ?
> > > > > >
> > > > > > In our proprietary system this was solved by reading the record
> > > > > > into a Pascal record with all twelve variants and then writing it
> > > > > > to the proprietary database. We could reuse the technique of the
> > > > > > program, but cannot reuse the program itself.
> > > > > > And we still have to split up each record into fields.
> > > > > > Our proprietary database just accepted the 'binary' data as is.
> > > > > >
> > > > > > ben brugman
> > > > > >
> > > > > >
> > > > >
> > > > >
> > > >
> > > >
> > >
> > >
> >
> >
>
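
A closing note on the identity question left open above: BULK INSERT can
skip an identity column if a format file maps the single field in the text
file to the second column of the work table, letting the server generate
RowID in load order. A minimal sketch, assuming SQL Server 2000's 8.0
format-file layout; the file names and the 1000-character field width are
illustrative:

8.0
1
1 SQLCHAR 0 1000 "\r\n" 2 RowText SQL_Latin1_General_CP437_BIN

BULK INSERT MyDB.dbo.BulkInsertWork
FROM 'c:\MyMonthlyFile.txt'
WITH (FORMATFILE = 'c:\BulkInsertWork.fmt')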

Wednesday, March 28, 2012

import mysql dump file in sql server

does anyone have a good idea how i can easily import a mysql dump file in a
sql server 2000 enterprise edition
The dump file is 7 gig big however my MS SQL server is a DELL poweredge 4600
with 2 x 2.8 GHZ hyperthreaded XEON processors 800 gig harddisk space
and 6 gig of memory ( running on Windows 2000 advanced with PAE and AWE
extensions enabled )
hope someone has experience with this
Michel Posseth [MCP]

M. Posseth wrote:
> does anyone have a good idea how i can easily import a mysql dump
> file in a sql server 2000 enterprise edition
> The dump file is 7 gig big however my MS SQL server is a DELL
> poweredge 4600 with 2 x 2.8 GHZ hyperthreaded XEON processors
> 800 gig harddisk space and 6 gig of memory ( running on Windows 2000
> advanced with PAE and AWE extensions enabled )
> hope someone has experience with this
> Michel Posseth [MCP ]
I don't know what a dump file is, but if it's just extracted data from a
MySQL table, you can use the DTS Import/Export Wizard to import the
results into SQL Server.
David Gugick
Imceda Software
www.imceda.com

|||

A MySQL dump file is proprietary and SQL Server wouldn't know what to do with
it. You will need to load the dump file into a MySQL somewhere and then
follow this whitepaper to transfer your data.
http://www.microsoft.com/technet/pr...ploy/mysql.mspx
-oj
"M. Posseth" <michelp@.nohausystems.nl> wrote in message
news:ctr4r3$di0$1@.reader13.wxs.nl...
> does anyone have a good idea how i can easily import a mysql dump file in
> a
> sql server 2000 enterprise edition
> The dump file is 7 gig big however my MS SQL server is a DELL poweredge
> 4600
> with 2 x 2.8 GHZ hyperthreaded XEON processors 800 gig harddisk space
> and 6 gig of memory ( running on Windows 2000 advanced with PAE and AWE
> extensions enabled )
> hope someone has experience with this
> Michel Posseth [MCP ]
>

|||

David Gugick wrote:
> M. Posseth wrote:
>
>
> I don't know what a dump file is, but if it's just extracted data from a
> MySQL table, you can use the DTS Import/Export Wizard to import the
> results into SQL Server.
>
a SQL dump file is just a SQL script with DDL and data (the same as
you can make in SQL Server), however with its own SQL dialect;
and instead of containing data from one table, it contains all the data
from a database
i know that it is possible to import the sql dump after changing the
dialect parameters to the corresponding mssql dialect
however we are talking about a 7 gig SQL text file
so i was just wondering if someone knows how i can unleash the power of
my MSSQL server on this mysql dump file so i can import it.
well i guess i am pioneering again,
how the #$#$$% am i gonna loop through a 7 gig text file and find and
replace the mysql datatypes with the corresponding MSSQL datatypes...

Michel
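
For the record, the whitepaper approach boils down to restoring the dump
into a MySQL instance and then pulling the data across, for instance over
a linked server. A minimal sketch, assuming the MySQL ODBC driver is
installed; the server name, connection string and table names are
illustrative:

EXEC sp_addlinkedserver
    @server = 'MYSQL_SRC',
    @srvproduct = 'MySQL',
    @provider = 'MSDASQL',
    @provstr = 'DRIVER={MySQL ODBC 3.51 Driver};SERVER=myhost;DATABASE=mydb;UID=myuser;PWD=mypwd;'

-- Copy one table across; repeat (or loop) for each table in the dump
SELECT * INTO dbo.customers
FROM OPENQUERY(MYSQL_SRC, 'SELECT * FROM customers')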

Import Multi-Row Record Text File with DTS

I have a text file I need to import into a SQL Server table. Each record spans several lines. Does anyone have a VBScript routine that will go through this text file and put each record on one row? Once the records are on one row, DTS will easily handle the import. Or is there a better way?

Sample multi-row records:

WRLDWYXCDS1 ALT101 APR04 21:30:24 6879 FAIL ALT
HOST 00 0 08 00 DN 3073477171 1st CYCLE
TEST TYPE CKTTST DIAGNOSTIC RESULT BIC/EBS LC TRBL:PTRN 000E S= 1 R= 0
ACTION REQUIRED Replace Card CARD TYPE 6X21AC

WRLDWYXCDS1 ALT101 APR04 22:31:37 7672 FAIL ALT
HOST 00 0 08 00 DN 3073477171 1st CYCLE
TEST TYPE CKTTST DIAGNOSTIC RESULT BIC/EBS LC TRBL:PTRN 000E S= 1 R= 0
ACTION REQUIRED Replace Card CARD TYPE 6X21AC

|||

DTS questions don't get a lot of answers on this forum. Perhaps most developers feel as I do, that DTS is a kluge solution of several different poorly integrated technologies. If you are lucky, my posting this reply will shoot your post back to the top of the list queue and somebody proficient in DTS will give you an answer.

Otherwise, I use DTS only for what it is best at: transferring and importing raw data. I avoid using DTS for transforming data, preferring instead to load the data into staging tables in my database and then run stored procedures to transfer it to the main database schema. I think one of the big advantages of this is that you separate the import process from the transform process, allowing you to flag and retain records that fail your business rules.

|||

Originally posted by eoffshore
I have a text file I need to import into a SQL Server table. Each record spans several lines. Does anyone have a vbscript routine that wil go thru this text file and put each record on one row? Once the records are one row, DTS will easily handle the import. Or is there a better way?

Sample multi-row records:

WRLDWYXCDS1 ALT101 APR04 21:30:24 6879 FAIL ALT
HOST 00 0 08 00 DN 3073477171 1st CYCLE
TEST TYPE CKTTST DIAGNOSTIC RESULT BIC/EBS LC TRBL:PTRN 000E S= 1 R= 0
ACTION REQUIRED Replace Card CARD TYPE 6X21AC

WRLDWYXCDS1 ALT101 APR04 22:31:37 7672 FAIL ALT
HOST 00 0 08 00 DN 3073477171 1st CYCLE
TEST TYPE CKTTST DIAGNOSTIC RESULT BIC/EBS LC TRBL:PTRN 000E S= 1 R= 0
ACTION REQUIRED Replace Card CARD TYPE 6X21AC

You don't need VB for this. It would be faster to create a staging table with one column. You want the column to be a char so you don't lose your fixed length. Have a select statement push it into a second staging table with char, but big enough for all four rows.

Then, you can either use a final parsing statement to divide it up, or export to a fixed-length text file. It should be quicker to do it this way in sets than to parse through in VB, where you'd have to parse through each line and length section.

|||

In your case it looks like you have records with five elements, where each element is a separate line in your raw data.

One solution to this is to import each record into a staging table as a single string of characters. A possible layout for your staging table would be:

RecordID (Identity)
GroupID (Int)
RecordType (Int)
RecordString (Varchar(500))
ImportErrors (Varchar(100))

As the data is imported into the staging table a RecordID is created to maintain the order.

Next you run a query(s) against your table to identify the RecordType for each record. For instance, maybe all the type 2 records start with 'HOST'

Lastly, group the records like this:

UPDATE StagingTable
SET GroupID = SubTable.GroupID
FROM StagingTable
INNER JOIN
    (SELECT StagingTable.RecordID,
            MAX(GroupRecords.RecordID) AS GroupID  -- nearest preceding type 1 record
     FROM StagingTable
     INNER JOIN StagingTable GroupRecords
         ON StagingTable.RecordID >= GroupRecords.RecordID
         AND GroupRecords.RecordType = 1
     GROUP BY StagingTable.RecordID) SubTable
    ON StagingTable.RecordID = SubTable.RecordID

Now you have identified your imported records and their relationships to each other, and you can load them into your schema. It's not neat or pretty, but it works. For each step you can flag problems by appending an error message to the ImportErrors field.
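
Once the GroupID values are set, one way to get each record onto a single
row is a CASE-based pivot over the group. A minimal sketch, assuming the
four line types identified above; the output column names are illustrative:

SELECT GroupID,
       MAX(CASE WHEN RecordType = 1 THEN RecordString END) AS HeaderLine,
       MAX(CASE WHEN RecordType = 2 THEN RecordString END) AS HostLine,
       MAX(CASE WHEN RecordType = 3 THEN RecordString END) AS TestLine,
       MAX(CASE WHEN RecordType = 4 THEN RecordString END) AS ActionLine
FROM StagingTable
GROUP BY GroupID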

import multiple XML files at once

Hi,

I have about 300-400 XML files I want to load in my SQL database (2005). The following code will load one (1) file. How do i do a mulitple collections?

INSERT INTO MEL (DATA)
SELECT * FROM OPENROWSET(BULK 'C:\Temp\CHAPTER1.xml', SINGLE_BLOB) AS TEMP

Thanks,

http://www.sqlservercentral.com/columnists/smoore/importingxmlfilesintosqlserver.asp

See this article... I think your schema should be the same for each file.

|||

Thank you for the help. That script runs in VBS; how do I do this in VB.NET or from within SQL (a stored procedure)? Also, I am looking to import the XML as raw XML. My XML files are large and complex, and I want to store them in a table column of type XML.

Thanks Again,

Bones

|||

An example that uses the stored procedure below:

-- Listing 2

CREATE TABLE #Files (MyFile varchar(200))

CREATE TABLE #Lines (MyLine varchar(8000))

DECLARE @MyFile varchar(200), @SQL varchar(2000), @Path varchar(400)

SET @Path = 'C:\Program Files\Microsoft SQL Server\MSSQL\'

EXECUTE sp_ListFiles @Path, '#Files', '%.txt', NULL, 0

SELECT @MyFile = MyFile FROM #Files WHERE MyFile LIKE 'README%'

SET @SQL = 'BULK INSERT #Lines FROM ' + CHAR(39) + @Path + @MyFile + CHAR(39)

EXECUTE (@SQL)

SELECT * FROM #Lines

DROP TABLE #Files

DROP TABLE #Lines

---------------------
-- Stored Procedure: sp_ListFiles
---------------------

USE master
GO
CREATE PROCEDURE dbo.sp_ListFiles
    @PCWrite varchar(2000),
    @DBTable varchar(100) = NULL,
    @PCIntra varchar(100) = NULL,
    @PCExtra varchar(100) = NULL,
    @DBUltra bit = 0
AS

SET NOCOUNT ON

DECLARE @Return int
DECLARE @Retain int
DECLARE @Status int

SET @Status = 0

DECLARE @Task varchar(2000)
DECLARE @Work varchar(2000)
DECLARE @Wish varchar(2000)

SET @Work = 'DIR ' + '"' + @PCWrite + '"'

CREATE TABLE #DBAZ (Name varchar(400), Work int IDENTITY(1,1))

INSERT #DBAZ EXECUTE @Return = master.dbo.xp_cmdshell @Work

SET @Retain = @@ERROR

IF @Status = 0 SET @Status = @Retain
IF @Status = 0 SET @Status = @Return

IF (SELECT COUNT(*) FROM #DBAZ) < 4
BEGIN
    SELECT @Wish = Name FROM #DBAZ WHERE Work = 1

    IF @Wish IS NULL
    BEGIN
        RAISERROR ('General error [%d]',16,1,@Status)
    END
    ELSE
    BEGIN
        RAISERROR (@Wish,16,1)
    END
END
ELSE
BEGIN
    DELETE #DBAZ
    WHERE ISDATE(SUBSTRING(Name,1,10)) = 0
       OR SUBSTRING(Name,40,1) = '.'
       OR Name LIKE '%.lnk'

    IF @DBTable IS NULL
    BEGIN
        SELECT SUBSTRING(Name,40,100) AS Files
        FROM #DBAZ
        WHERE 0 = 0
          AND (@DBUltra = 0 OR Name LIKE '%<DIR>%')
          AND (@DBUltra != 0 OR Name NOT LIKE '%<DIR>%')
          AND (@PCIntra IS NULL OR SUBSTRING(Name,40,100) LIKE @PCIntra)
          AND (@PCExtra IS NULL OR SUBSTRING(Name,40,100) NOT LIKE @PCExtra)
        ORDER BY 1
    END
    ELSE
    BEGIN
        SET @Task = ' INSERT ' + REPLACE(@DBTable,CHAR(32),CHAR(95))
            + ' SELECT SUBSTRING(Name,40,100) AS Files'
            + ' FROM #DBAZ'
            + ' WHERE 0 = 0'
            + CASE WHEN @DBUltra = 0 THEN '' ELSE ' AND Name LIKE ' + CHAR(39) + '%<DIR>%' + CHAR(39) END
            + CASE WHEN @DBUltra != 0 THEN '' ELSE ' AND Name NOT LIKE ' + CHAR(39) + '%<DIR>%' + CHAR(39) END
            + CASE WHEN @PCIntra IS NULL THEN '' ELSE ' AND SUBSTRING(Name,40,100) LIKE ' + CHAR(39) + @PCIntra + CHAR(39) END
            + CASE WHEN @PCExtra IS NULL THEN '' ELSE ' AND SUBSTRING(Name,40,100) NOT LIKE ' + CHAR(39) + @PCExtra + CHAR(39) END
            + ' ORDER BY 1'

        IF @Status = 0 EXECUTE (@Task)
        SET @Return = @@ERROR

        IF @Status = 0 SET @Status = @Return
    END
END

DROP TABLE #DBAZ

SET NOCOUNT OFF

RETURN (@Status)
GO

The above proc will return the list of files in a folder. Use the result set of this proc and then run a cursor or loop to execute your insert for each file.
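
For example, a minimal sketch of such a loop against the MEL table from the
original post, assuming sp_ListFiles has filled #Files with the '%.xml'
file names; the folder path is illustrative:

DECLARE @FileName varchar(200), @SQL nvarchar(2000), @Path varchar(400)

SET @Path = 'C:\Temp\'

DECLARE FileCursor CURSOR FOR
    SELECT MyFile FROM #Files

OPEN FileCursor
FETCH NEXT FROM FileCursor INTO @FileName

WHILE @@FETCH_STATUS = 0
BEGIN
    -- OPENROWSET(BULK ...) needs a literal path, hence the dynamic SQL
    SET @SQL = N'INSERT INTO MEL (DATA) ' +
               N'SELECT * FROM OPENROWSET(BULK ''' + @Path + @FileName +
               N''', SINGLE_BLOB) AS TEMP'
    EXECUTE (@SQL)
    FETCH NEXT FROM FileCursor INTO @FileName
END

CLOSE FileCursor
DEALLOCATE FileCursor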

|||

Thanks Satya

You've been very helpful.

Bones

import multiple text files?

Hello,

I am kind of new to Sql Server 2005.

I figured out how to use the import data wizard to import a delimited text file.

But I need to find a way to import many delimited text files at once.

does anybody know if this can be done in Sql Server 2005? and how?
thanks in advance,

Hi,

you should probably use Integration Services for that. SSIS has a special task for this which will loop through a directory and import all text files that match the filter specified.

HTH, Jens Suessmeyer.

http://www.sqlserver2005.de

Import multiple text files into Single table

How do I import multiple text files (residing in a single folder) into a SQL Server table? I know how to import a single file but am not sure how multiple files could be loaded. Please guide.

Thanks,

HShah

I am assuming that the files all have the same structure.

There are a number of options available to you. See here:

Processing data from multiple files all at once
(http://blogs.conchango.com/jamiethomson/archive/2006/10/14/SSIS_3A00_-Processing-data-from-multiple-files-all-at-once.aspx)

-Jamie


Import MS-Excel to SQL-Server

Greetings All,

I have an Excel file which was originally a SQL Server table that was
exported as an Excel file. I have added more data to this Excel file
and now want to import it again into its original table, i.e., it will
overwrite the current data in the table but with no change in the schema.
How should I handle the issue of PKs in the current table that will be
overwritten? I know SQL Server does not adjust PKs when data is
overwritten, as in my case.

MTIA,
Grawsha

<grawsha2000@.yahoo.com> wrote in message
news:1108999543.155717.259460@.o13g2000cwo.googlegr oups.com...
> Greetings All,
> I have a excel file which is originally a sqlserver table that was
> exported as a excel file. I have added more data to this excel file
> and now want to import it again to its original table,i.e, it will
> overwrite current data in the table but with no change in the schema.
> How should I handle the issue of PKs in the current table that will be
> over-written. I know sqlserver dose not adjust PKs when data is
> over-written, like my case.
> MTIA,
> Grawsha

I'm not sure what you mean by "overwritten" - do you mean you want to delete
all rows from the table, then load the .xls? If so, just use DELETE or
TRUNCATE TABLE to empty the table before loading the file. If you mean
something else, then you should give some more details, preferably including
the CREATE TABLE statement for your table, so we can see what your key
actually is.

Simon

|||

Simon

Yes, I want to delete all the rows from the table (employees). But the
problem is there is a PK (Person ID) that I use to relate it to a FK
in another table (sales). I would assume Enterprise Manager would
give an error if I try to delete the rows.

My question is, should I remove the relation first, delete the rows,
load the file and then create the relation again?

Grawsha

|||

If your target table is referenced by a foreign key from another table,
you'll get an error if you delete from it, unless you created the key
with ON DELETE CASCADE (assuming MSSQL 2000 - you didn't mention which
version you have), in which case the referencing rows will be deleted
also.

As you suggest, you can load the data by dropping the foreign key then
recreating it, but this will leave orphaned rows in the referencing
table, unless your .xls contains data for all the current rows in the
target table. It isn't clear from your description if your .xls
contains updated rows for existing PK values, or if it contains
entirely new PK values and rows, or both. You might want to consider
another approach, which is to create a staging table with the same
structure as your target table, load the .xls into it, then INSERT and
UPDATE the data in the target table - this may be closer to what you
really need.
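
To get the .xls into the staging table in the first place, one option is
OPENROWSET with the Jet provider; a minimal sketch, assuming a header row
on Sheet1, with an illustrative path and sheet name:

SELECT *
INTO dbo.Staging
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'Excel 8.0;Database=C:\employees.xls;HDR=YES',
    'SELECT * FROM [Sheet1$]')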

/* Add new rows */
insert into dbo.Target (col1, col2, ...)
select s.col1, s.col2, ...
from dbo.Staging s
where not exists (select * from dbo.Target t
                  where s.PK = t.PK)

/* Update values for existing rows */
update dbo.Target
set col1 = s.col1, col2 = s.col2, ...
from dbo.Staging s
join dbo.Target t
on s.PK = t.PK

If this doesn't help, I suggest that you post CREATE TABLE statements
for your tables, INSERT for some sample data, and then a few rows of
data from your .xls, to show what your data looks like and what you
expect to happen - descriptions by themselves are usually unclear.

http://www.aspfaq.com/etiquette.asp?id=5006

Simon

Import MDF SQL 2000 file into SQL 2005 Express

Is it possible?
I only have the MDF from my SQL 2000 DB, and I need to import it to SQL
Server 2005...
Best Regards
Fabio Cavassini

You might try CREATE DATABASE...FOR ATTACH_REBUILD_LOG. This should work if
the database was properly detached from the SQL 2000 instance. See the SQL
2005 Books Online for more information.
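
A minimal sketch of that syntax, with an illustrative database name and file
path:

CREATE DATABASE MyDB
    ON (FILENAME = 'C:\Data\MyDB.mdf')
    FOR ATTACH_REBUILD_LOG
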
Hope this helps.
Dan Guzman
SQL Server MVP
"Fabio Cavassini" <cavassinif@.gmail.com> wrote in message
news:1138068510.765889.178750@.g44g2000cwa.googlegroups.com...
> Is it possible?
> I only have the MDF from my SQL 2000 DB, and I need to import it to SQL
> Server 2005...
> Best Regards
> Fabio Cavassini
>

|||

Thanks Dan
I tried to attach it with sp_attach_db and it works...it converts the
format to the new version.
After that I got the following error when I wanted to create a diagram:
"Database diagram support objects cannot be installed because this
database does not have a valid owner. To continue, first use the Files
page of the Database Properties dialog box or the ALTER AUTHORIZATION
statement to set the database owner to a valid login, then add the
database diagram support objects."
which this code fixes:
EXEC sp_dbcmptlevel 'yourDB', '90';
go
ALTER AUTHORIZATION ON DATABASE::yourDB TO "yourLogin"
go
use [yourDB]
go
EXECUTE AS USER = N'dbo'
REVERT
go
Best Regards
Fabio Cavassini

|||

> I tried to attach it with sp_attach_db and it works...it converts the
> format to the new version.
Just like the SQL 2000 version, sp_attach_db is basically just a wrapper for
CREATE DATABASE...FOR ATTACH. I don't recommend it in SQL 2005 because it
will be discontinued in a future version, so you might as well get used to
the CREATE DATABASE syntax (or use a GUI that does this for you). From the
SQL Server 2005 Books Online:
<Excerpt href="ms-help://MS.SQLCC.v9/MS.SQLSVR.v9.en/tsqlref9/html/59bc993e-7913-4091-89cb-d2871cffda95.htm">
Important:
This feature will be removed in a future version of Microsoft SQL Server.
Avoid using this feature in new development work, and plan to modify
applications that currently use this feature. We recommend that you use
CREATE DATABASE database_name FOR ATTACH instead. For more information, see
CREATE DATABASE (Transact-SQL).
</Excerpt>
Hope this helps.
Dan Guzman
SQL Server MVP
"Fabio Cavassini" <cavassinif@.gmail.com> wrote in message
news:1138070344.185793.284520@.f14g2000cwb.googlegroups.com...
> Thanks Dan
> I tried to attach it with sp_attach_db and it works...it converts the
> format to the new version.
> After that I got the following error when I want to create a Diagram:
> "Database diagram support objects cannot be installed because this
> database does not have a valid owner. To continue, first use the Files
> page of the Database Properties dialog box or the ALTER AUTHORIZATION
> statement to set the database owner to a valid login, then add the
> database diagram support objects."
> that this code fix...
> EXEC sp_dbcmptlevel 'yourDB', '90';
> go
> ALTER AUTHORIZATION ON DATABASE::yourDB TO "yourLogin"
> go
> use [yourDB]
> go
> EXECUTE AS USER = N'dbo' REVERT
> go
> Best Regards
> Fabio Cavassini
>

|||

Thanks for the info Dan, I'll consider it.
Best Regards
Fabio Cavassini