
Friday, March 30, 2012

Import problem with varchar(max) field

I'm trying to import some Assessor data from a text file into a table. The field for Legal Description (column 2 in the source text file) is of data type varchar(max) because some of the data goes over the 8K size. I get an error on the first row of the import that refers to column 2 (see 'Initial errors' below). I read the related post and changed the size of input column 2 to 8000, and got the error shown below under 'Error with input column 2 set to size of 8000'. Finally I set the size of input column 2 to 4000 and it ran. So I'm thinking there is a limit on the size of varchar data that can be imported. I just want to clarify what that limit is and how I might go about importing this data.

Thanks, John

Error with input column 2 set to size of 8000:

Setting Destination Connection (Error)

Messages

Error 0xc0204016: DTS.Pipeline: The "output column "Column 2" (388)" has a length that is not valid. The length must be between 0 and 4000.
(SQL Server Import and Export Wizard)

Exception from HRESULT: 0xC0204016 (Microsoft.SqlServer.DTSPipelineWrap)

Initial errors:

Executing (Error)

Messages

Error 0xc02020a1: Data Flow Task: Data conversion failed. The data conversion for column "Column 2" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".
(SQL Server Import and Export Wizard)

Error 0xc020902a: Data Flow Task: The "output column "Column 2" (18)" failed because truncation occurred, and the truncation row disposition on "output column "Column 2" (18)" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
(SQL Server Import and Export Wizard)

Error 0xc0202092: Data Flow Task: An error occurred while processing file "\\Scux00\assrdumps\SQLServerDB\exportsql.txt" on data row 1.
(SQL Server Import and Export Wizard)

Error 0xc0047038: Data Flow Task: The PrimeOutput method on component "Source - exportsql_txt" (1) returned error code 0xC0202092. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
(SQL Server Import and Export Wizard)

Error 0xc0047021: Data Flow Task: Thread "SourceThread0" has exited with error code 0xC0047038.
(SQL Server Import and Export Wizard)

Error 0xc0047039: Data Flow Task: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
(SQL Server Import and Export Wizard)

Error 0xc0047021: Data Flow Task: Thread "WorkThread0" has exited with error code 0xC0047039.
(SQL Server Import and Export Wizard)

There is a limit of 4000 on the size of Unicode character strings (nvarchar / DT_WSTR). Non-Unicode character strings (varchar / DT_STR) have a limit of 8000. Ensure that you don't have type conversions between varchar and nvarchar.|||

I'm trying the SQL Server Import and Export Wizard. I've set the DataType for most of the fields to 'string [DT_STR]' and the Unicode box is unchecked, but when I look at the column mappings after clicking 'Edit Mappings', all the columns say they're type nvarchar on that screen and it doesn't appear to be something I can change. If I highlight a column on that screen it shows the setting I set on the Advanced screen for the text file, i.e. Column 2 string [DT_STR](4000). Since I'm hitting the 4K limit it must be because of this. I'm not sure how to modify it. I'll save the package as a file and open it up in BI Dev Studio ... john
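One thing worth trying in the wizard itself: on the flat file source's Advanced page, switching the long column's DataType to text stream [DT_TEXT] usually lets it map onto a varchar(max) destination without the 4000/8000 cap. Alternatively, here is a rough T-SQL sketch that bypasses the wizard with BULK INSERT; the column names and delimiter below are assumptions, not taken from the actual file:

CREATE TABLE dbo.AssessorImport (
    parcel_id  varchar(50),
    legal_desc varchar(max)   -- some legal descriptions exceed 8,000 characters
);

BULK INSERT dbo.AssessorImport
FROM '\\Scux00\assrdumps\SQLServerDB\exportsql.txt'
WITH (
    FIELDTERMINATOR = '\t',   -- adjust to the file's actual column delimiter
    ROWTERMINATOR   = '\n'
);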

Wednesday, March 28, 2012

import ODBC data to SQL Server 2005

I can't find the ODBC selection in the Data source list of the Import Wizard, so how can I import data from an ODBC data source to SQL Server 2005?

Have you found a solution to this yet? If so, please share. Thank you.

Import objects from SSAS to Business Intelligence Development Studio

Hi all,

Could anyone tell me how I can use Business Intelligence Development Studio to browse or even modify data source views, cubes, etc. on SSAS created by means other than the Studio?

Thanks,

hz

Two ways:

1. BI Dev Studio online mode:
Start BI Dev Studio, then File -> Open -> Analysis Services Database.

2. Create a new project based on the live version of the database:
Start BI Dev Studio, then File -> New Project and choose the project type "Import Analysis Services 9.0 Database".

Edward.
--
This posting is provided "AS IS" with no warranties, and confers no rights.

|||

Hi Edward, your answer is very helpful. Thanks a lot!

If you don't mind, I have one more question regarding this.

If I modify SSAS objects by means other than BI Studio, can BI Studio detect these changes and modify its projects accordingly? This would be sort of the reverse of project deployment.

hz

|||

Unfortunately the answer is: there is no easy way to compare an existing project to the live version of the database.

This is one of the things the next version of the product could be addressing.

For now there are several workarounds I can think of:

1. Export the live version of the database into a new project and then use some sort of XML comparison tool to compare the project files: compare .dim files for dimensions and .cube files for cube definitions.

2. Create a script of the live version of the database by using right-click -> Script Database as ... in SQL Server Management Studio.
Then compare that script to the script you get by running the Deployment Wizard utility for your project.
Again, you should be able to use an XML compare tool on both scripts.

These are not perfect approaches, but they should give you some idea of the extent of the changes made to your live database.

Hope that helps.
Edward.
--
This posting is provided "AS IS" with no warranties, and confers no rights.

|||

Thanks again, Edward! You have offered all I need to know regarding this.

I use AMO to create and modify objects such as cubes and data mining structures. Without BI Studio it would be very difficult to check whether my code has created what I need. Now I'll just import the objects into the Studio each time I want to check them.

hz

Monday, March 26, 2012

Import from sql 2000 to sql 2005

We keep running out of space due to the temp file getting so large on the
source server. Is there any way to prevent this?
Thanks

What exactly are you doing?
--
Tom
----
Thomas A. Moreau, BSc, PhD, MCSE, MCDBA
SQL Server MVP
Toronto, ON Canada
"Ray Brown" <RayBrown@.discussions.microsoft.com> wrote in message
news:A52B9428-5F2C-4DC9-8447-F6B6B440D5F1@.microsoft.com...
We keep running out of space due to the temp file getting so large on the
source server. Is there any way to prevent this?
Thanks

Import Data from Microsoft Access

In SQL Server 2005, I'm trying to append data to tables which already exist. I'm importing the data through the Import Wizard.

The source is Microsoft Access with no username and password.

The destination is SQL Server 2005 using the OLE DB Provider for SQL Server, with the login information of the schema I wish to use.

I click through and the tables appear in the source. When I select all, they appear in the destination, but they appear with the dbo. prefix, which would treat them as new tables since the tables don't exist under that schema. I can click on the first destination table drop-down box and see all the tables under the schema they're supposed to be under, but it's not the default. There are a lot of tables and I don't feel like using the drop-down box hundreds of times. Is there a solution to this problem?

It worked in SQL Server 2000.

Thanks

Scott

use dts or ssis|||

But why is it defaulting to dbo. when the table doesn't even exist and I can drop down and see the proper table? I'm even connected to the destination database as the user I want, and the user is a db_owner. Creating a package won't work because we're constantly adding tables; DTS may work, but the preferred method is just to be able to import the data into the proper schema.

|||

dbo is the default schema.

How about qualifying the destination table with schemaname.tablename in the import process?
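To make "qualify with schemaname.tablename" concrete: if the wizard insists on creating [dbo] copies, one workaround sketch (the schema name 'app' and the table name are examples only) is to let it load staging tables under dbo and then append into the real tables with plain INSERT ... SELECT:

-- Append the wizard-loaded dbo staging copy into the existing table in the real schema.
INSERT INTO app.DP_ROUTE_JURISDICTION
SELECT *
FROM dbo.DP_ROUTE_JURISDICTION;

-- Clean up the staging copy afterwards.
DROP TABLE dbo.DP_ROUTE_JURISDICTION;

-- If the table does not yet exist under the target schema, you could instead just move it:
-- ALTER SCHEMA app TRANSFER dbo.DP_ROUTE_JURISDICTION;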

|||

But if I'm logging in as another user, I would figure it would default to that user. When you say qualify in the import process, do you mean changing the [dbo]. prefix to the [proper schema owner]. prefix?

I tried to create a package, then took the file and did a cut-and-paste to replace dbo with the proper name; however, since the table never existed in the package, it tries to create the table, and because it already exists I get an error when I run it. When I use the drop-down box and change the table to the proper table with the proper owner, it changes the option to append, which is correct.

I'm kind of at a loss, as I feel there is nothing I can do but hit the drop-down box for 200+ tables every time.

What I really need is a way for the Import/Export Wizard to show the destination tables under the proper schema owner.

Scott

|||

For my sake and everyone else's, I'm not crazy. In SQL Server 2005 SP1 Microsoft has fixed this issue and now allows you to choose a destination schema. woooohoooo!!!!

However ... I am now getting the following error on appending data. The table structure exists and I'm trying to append all the data from the Access tables into the SQL Server tables. Not all Access tables have data. If I do one individual table, it works. If I do 200, I get the following error:

- Prepare for Execute (Error)

Messages

Error 0xc0202009: {8DD4F4CE-2DD7-4856-A251-71D4206EC6DC}: An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "Unspecified error".
(SQL Server Import and Export Wizard)

Error 0xc020801c: Data Flow Task: The AcquireConnection method call to the connection manager "SourceConnectionOLEDB" failed with error code 0xC0202009.
(SQL Server Import and Export Wizard)

Error 0xc004701a: Data Flow Task: component "Source 64 - DP_ROUTE_JURISDICTION" (6998)failed the pre-execute phase and returned error code 0xC020801C.
(SQL Server Import and Export Wizard)

Does anyone have a solution, or can someone point me in the right direction? Does it have something to do with the Access buffer size? I've seen some posts about this error but no solid solutions. Any help would be greatly appreciated.

Thanks

Scott

|||

Can you publish the Access database anywhere so that we can reproduce the problem?

Paul A. Mestemaker II
Program Manager
Microsoft SQL Server Manageability
http://blogs.msdn.com/sqlrem/

Import from an ODBC data source into SQL Server

Hi,

I am trying to import tables from an ODBC data source into a SQL Server 2005 database. I presume that one way to achieve this is to create an Integration Services package via Business Intelligence Development Studio (I have already used DTS in SQL Server 2000, but not Integration Services)?

Or is there a simpler way? For instance, is there an import wizard that would include an ODBC data source?

Thanks in advance.

http://groups.google.de/group/microsoft.public.sqlserver.dts/browse_frm/thread/2d0b1220a73e2894/f9adfb6af01a8306?hl=de#f9adfb6af01a8306
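If the wizard in your build really doesn't offer an ODBC entry in its source list, another hedged workaround besides an SSIS package is to expose the ODBC DSN as a linked server through the MSDASQL provider and pull the data with plain T-SQL. The DSN, linked server and table names below are placeholders:

-- Create a linked server over an existing system ODBC DSN defined on the SQL Server machine.
EXEC sp_addlinkedserver
     @server     = N'ODBC_SRC',
     @srvproduct = N'',
     @provider   = N'MSDASQL',
     @datasrc    = N'MyOdbcDsn';

-- Copy the remote table into a new local table.
SELECT *
INTO   dbo.ImportedFromOdbc
FROM   OPENQUERY(ODBC_SRC, 'SELECT * FROM SourceTable');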

Friday, March 23, 2012

Import export failed : Data conversion failed

[Source - chn_employee_vew_test_txt [1]] Error: Data conversion failed. The data conversion for column "Column 42" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".

[Source - chn_employee_vew_test_txt [1]] Error: The "output column "Column 42" (136)" failed because truncation occurred, and the truncation row disposition on "output column "Column 42" (136)" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.

I am using the locale People's Republic of China and code page 936 (Simplified Chinese GBK), with header row delimiter {CR}{LF}.

I am using the flat file import method.

Whenever the server processes Column 42 with a value like "11,Nanjing Rd.W, China", which contains a comma or '.', it hits the import error above. When I manually change the column value to one without a comma or '.' (11 Nanjing Rd W China) in the flat file, it is OK.

I am using SQL Server 2005.

Please advise what needs to be done to avoid this error.

Thanks in advance; any idea or suggestion is very much appreciated, as I have tried to solve this issue for over a week but am still not able to find an answer.

Please help.

regards,

kong

Check the Quoted_Identifier and text_qualifier values in SSIS. If you are running from a workstation, make sure the SSIS server component is installed.

Monday, March 19, 2012

Import database from access

Hi
I am developing a project with .NET as a Windows application. My source database is in Access. I want to work with WMSDE, so I need to import the Access database into a WMSDE database first. Is it possible?
If yes, how can I do this?
Thanks to whoever replies to my question.

Doesn't Access have an upsizing wizard in it?
"amos hchmon" <amoshchmon@.discussions.microsoft.com> wrote in message
news:91853F12-5194-4D31-891F-0C5EAFC16E94@.microsoft.com...
> Hi
> I am develop a project with .net in windows application.my source of the
> database set in access I want to working with WMSDE and I need to import
> the
> database under the access to database in WMSDE first Is it passible?
> If yes, how can I do this.
> Thanks for who refer to my question

|||

You can import the Access database via SQL Server's import function in the EMC.
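A rough T-SQL alternative to the upsizing wizard (the file path and table name are just examples, and ad hoc OPENROWSET queries must be allowed on the instance): pull a table straight out of the .mdb with the Jet OLE DB provider.

-- Copy an Access table into a new table in the WMSDE/SQL Server database.
SELECT *
INTO   dbo.Customers
FROM   OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                  'C:\Data\MyAccessDb.mdb';
                  'Admin';
                  '',
                  Customers);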

Monday, March 12, 2012

IMPORT DATA FROM XML SOURCE TO SQL

Hello,

I'm trying to import data from an XML file (several tables inside) into SQL tables.

- In the XML source, I choose the XML and XSD files and I see all the tables perfectly.

- I drag the XML source output to the SQL destination input, choose one table to import, and create the SQL table.

- I execute the task and it completes OK.

- In the Execution Results window there are warnings for the fields of each of the other tables (saying these fields are not used in the process and it would be better to remove them to increase performance), but no errors, and the task succeeds. The problem is that no data is imported into the SQL destination (wrote 0 rows).

Process window:

Information: 0x4004300A at OPP, DTS.Pipeline: Validation phase is beginning.

Warning: 0x80047076 at OPP, DTS.Pipeline: The output column "field_1" (66374) on output "TABLA1" (50424) and component "XML Source" (27281) is not subsequently used in the Data Flow task. Removing this unused output column can increase Data Flow task performance.

Information: 0x40043006 at OPP, DTS.Pipeline: Prepare for Execute phase is beginning.

Information: 0x40043007 at OPP, DTS.Pipeline: Pre-Execute phase is beginning.

Information: 0x4004300C at OPP, DTS.Pipeline: Execute phase is beginning.

Information: 0x40043008 at OPP, DTS.Pipeline: Post Execute phase is beginning.

Information: 0x40043009 at OPP, DTS.Pipeline: Cleanup phase is beginning.

Information: 0x4004300B at OPP, DTS.Pipeline: "component "SQL Server Destination" (20427)" wrote 0 rows.

SSIS package "package.dtsx" finished: Success.

The program '[408] package.dtsx: DTS' has exited with code 0 (0x0).

If I import the data from the XML file using Access, I don't have any problem. All the tables are imported.

I don't know what the cause can be. Thanks a lot.

Gema

Use a data viewer to see if there's any data in the pipeline.

-Jamie

|||

There's no data in the data viewer... What should I do?

Gema

|||

It sounds as though it's more likely that there is no data coming out of your source file, rather than rows failing to get inserted into SQL.

I have no idea as to why that may be though. For starters, work on the assumption that there is a fault in the way you have configured it.

-Jamie

|||

I also think it's a problem with the XML source configuration (not the SQL destination), but I don't know whether it is a problem with the XML/XSD files (they were not produced by me) or whether some property of the data flow task and/or its elements must be changed...

Thanks a lot, Jamie

Gema
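If the XML source keeps writing 0 rows and the XSD is suspect, a rough fallback sketch is to load the whole file with OPENROWSET ... SINGLE_BLOB and shred one table out of it with nodes()/value(). The element, column and file names here are invented and would have to match the real schema:

DECLARE @doc xml;

-- Read the entire XML file into an xml variable.
SELECT @doc = BulkColumn
FROM OPENROWSET(BULK 'C:\Data\source.xml', SINGLE_BLOB) AS x;

-- Shred the rows of one "table" into a SQL table.
INSERT INTO dbo.TABLA1 (field_1, field_2)
SELECT n.value('(field_1)[1]', 'nvarchar(100)'),
       n.value('(field_2)[1]', 'nvarchar(100)')
FROM @doc.nodes('/root/TABLA1/row') AS t(n);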

import data from ODBC to SQL2005

Hi there,
I want to import data from ODBC. I created a DataReader Source which uses the .NET Provider\Odbc Data Provider and connected successfully. My destination is an OLE DB Destination that points to SQL 2005. I set the SQL command to "SELECT * from ....".
I also had a problem creating the new table in SQL 2005 using the SSIS Import and Export Wizard; it doesn't know the source table schema (two date-type columns). So I created the new table manually and ran the package, and got this error:

SSIS package "Package1.dtsx" starting.
Information: 0x4004300A at Data Flow Task, DTS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at Data Flow Task, DTS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Data Flow Task, DTS.Pipeline: Pre-Execute phase is beginning.
Information: 0x4004300C at Data Flow Task, DTS.Pipeline: Execute phase is beginning.
Error: 0xC02090F5 at Data Flow Task, Source - Query [1]: The component "Source - Query" (1) was unable to process the data.
Error: 0xC0047038 at Data Flow Task, DTS.Pipeline: The PrimeOutput method on component "Source - Query" (1) returned error code 0xC02090F5. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "SourceThread0" has exited with error code 0xC0047038.
Error: 0xC0047039 at Data Flow Task, DTS.Pipeline: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "WorkThread0" has exited with error code 0xC0047039.
Information: 0x40043008 at Data Flow Task, DTS.Pipeline: Post Execute phase is beginning.
Information: 0x402090DF at Data Flow Task, Destination - PITest CMF [169]: The final commit for the data insertion has started.
Information: 0x402090E0 at Data Flow Task, Destination - PITest CMF [169]: The final commit for the data insertion has ended.
Information: 0x40043009 at Data Flow Task, DTS.Pipeline: Cleanup phase is beginning.
Information: 0x4004300B at Data Flow Task, DTS.Pipeline: "component "Destination - PITest CMF" (169)" wrote 0 rows.
Task failed: Data Flow Task
SSIS package "Package1.dtsx" finished: Success.

Does anyone know what's wrong here?

And it is quite a pain that you have to specify the table name every time you want to create a new table in SQL 2005. It was so easy in SQL 2000!

Thanks.

Did you verify that your source query worked and that the column mapping metadata is correct? That's where I would start looking.
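If the failure turns out to be the two date columns (source-side conversion problems can surface as exactly this kind of PrimeOutput error), one hedged workaround is to land them as plain text first and convert on the SQL Server side. All table and column names below are made up for illustration:

-- Staging table: the date columns arrive as raw text from the ODBC source.
CREATE TABLE dbo.PITest_CMF_stage (
    id          int         NOT NULL,
    created_dt  varchar(30) NULL,
    updated_dt  varchar(30) NULL
);

-- After the SSIS load, convert only values SQL Server recognises as dates.
INSERT INTO dbo.PITest_CMF (id, created_dt, updated_dt)
SELECT id,
       CASE WHEN ISDATE(created_dt) = 1 THEN CONVERT(datetime, created_dt) END,
       CASE WHEN ISDATE(updated_dt) = 1 THEN CONVERT(datetime, updated_dt) END
FROM dbo.PITest_CMF_stage;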

Wednesday, March 7, 2012

Import CSV File with FTP as Source

I want to use an FTP Task to obtain a file on a remote server, then
transfer it into a table.
I'm sure it can be done, but there aren't many tutorials explaining
how it works, only thin acknowledgments.

Hi
"JGiotta" wrote:
> I want to use a FTP Task to obtain a file on a remote server, then
> transfer into a table.
> I'm sure it can be done, but there aren't many tutorials explaining
> how it works only thin acknowledgments.
>
You can use DTS for SQL 2000 or SSIS for SQL 2005, especially if there are
multiple files. For DTS check out http://www.sqldts.com/302.aspx and
http://www.sqldts.com/246.aspx
You will need to do this in two stages: get the FTP files and then import
them.
John
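For the second stage John describes, a minimal sketch, assuming a simple comma-delimited file (the table name, path and delimiters are placeholders): once the FTP task has pulled the file down locally, load it with BULK INSERT.

CREATE TABLE dbo.ImportedCsv (
    col1 varchar(100),
    col2 varchar(100),
    col3 varchar(100)
);

BULK INSERT dbo.ImportedCsv
FROM 'C:\ftp_drop\remote_file.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2     -- skip a header row, if the file has one
);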

Import and Export SQL 2005 Maintenance Plan

I had created a maintenance plan and configured and scheduled it to run. I would like to save this package as a file into source control, so I used SSIS to export the package under Stored Packages -> MSDB -> Maintenance Plans. After that, I wanted to test my import process, so I deleted the package under SSIS -> Stored Packages -> MSDB -> Maintenance Plans and used the import to add the package back from my previously exported .dtsx file.

So the problem is, after I imported my package, I lost the configured schedule and the job runs without doing anything. When I try to go in and make changes to the package by adding a new schedule under SQL Server -> Management -> Maintenance Plans, I receive an odd error message and it doesn't allow me to save the package. The error message I got is: "GUID should contain 32 digits with 4 dashes (xxxxx-xxx-xx...)"

So my questions are: 1. Why did the re-import lose the originally configured job schedule? 2. Why doesn't the re-imported package work by backing up the database as it was first set up? 3. Why can't I re-edit this package, and why does saving it error out?

Thank you for reading and for your help! --Jon

I hope this helps. I have the same problem and am working on it.

http://support.microsoft.com/default.aspx/kb/922651

http://sqlug.be/blogs/drivenbysql/archive/2006/10/21/374.aspx

|||I had similar issues. SSIS wasn't installed at first, so I installed it and then started getting that GUID error. It was a simple fix actually. In Management Studio, go to SQL Server Agent -> Jobs, and it had jobs for all my deleted plans. I cleared out the jobs, then created a brand-new plan and it worked fine. I guess the errors created when I tried to create a plan prior to the SSIS installation caused it not to clear the jobs.

Though, this may or may not be what happened to you.
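A hedged T-SQL sketch of the same cleanup (the category name is the one maintenance plans normally use, and the job name is an example only): list Agent jobs in the maintenance-plan category that no longer have a matching subplan, then drop the leftovers.

-- List Agent jobs in the maintenance-plan category that no longer have a subplan.
SELECT j.name
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.syscategories AS c
  ON c.category_id = j.category_id
WHERE c.name = 'Database Maintenance'
  AND NOT EXISTS (SELECT 1
                  FROM msdb.dbo.sysmaintplan_subplans AS sp
                  WHERE sp.job_id = j.job_id);

-- After confirming a job really is an orphan, drop it by name.
EXEC msdb.dbo.sp_delete_job @job_name = N'MaintenancePlan1.Subplan_1';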
