
Friday, March 30, 2012

Migrating from 2000 to 2005

Hello guys, I have a very basic question. I'm migrating from 2000 to 2005. My question is: do we have to recode all packages, or is just migrating the old 2000 DTS packages fine? If so, then I can only see a small icon for the DTS package in 2005, which does not show the details of the DTS package. Kindly explain what I really need to do: migrate the old packages or re-create them in 2005. Thanks in advance.

There is some information in this forum and on the web:

http://forums.microsoft.com/MSDN/Search/Search.aspx?words=DTS+migration&localechoice=9&SiteID=1&searchscope=forumscope&ForumID=80

http://www.google.com/search?hl=en&q=migration+ssis+to+dts

Migrating DTS packages Question: Dynamic Properties

Hi guys,

After the DTS migration was "successful", I opened up the Dynamic Properties step, but I saw the code was all commented out inside the public class ScriptMain. Since I am not familiar with Microsoft.SqlServer.Dts.Runtime, I'm not sure what methods I should use, as there aren't many examples available online. How do I do the following the SSIS way?

' Add your code here

' Source Type = 2

' Global variable = glvTrade

' Destination = 'Connections';'Trade';'OLEDBProperties';'Data Source';'Properties';'Value'

Regards

Hi,
I have the same problem too.
' Source(Type = 2)
' Global variable = SPLastDate
' Destination = 'Tasks';'DTSTask_DTSExecuteSQLTask_7';'Properties';'SQLStatement'
How can I convert this code for SSIS?
Does anyone have a solution?

Thanks
Yasemin
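Neither question above got a direct answer in the thread. As a rough sketch of the SSIS-era equivalent: what the DTS Dynamic Properties step did (push a global variable into a connection's Data Source, or into an Execute SQL Task's SQLStatement) is usually handled in SSIS with property expressions or with a Script Task like the one below. The variable name glvTrade and the connection manager name Trade come from the question; the connection string layout and the surrounding class skeleton are illustrative assumptions, not the code the Migration Wizard generated.

' Sketch of an SSIS Script Task replacing the DTS Dynamic Properties step above.
' Assumes a package variable named glvTrade (listed in the task's ReadOnlyVariables)
' and an OLE DB connection manager named Trade; the connection string layout is illustrative.
Imports System
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain

    Public Sub Main()
        ' Value that the DTS global variable used to supply.
        Dim dataSource As String = CStr(Dts.Variables("glvTrade").Value)

        ' Rebuild the connection string so its Data Source points at the new value.
        Dts.Connections("Trade").ConnectionString = _
            "Provider=SQLOLEDB.1;Integrated Security=SSPI;" & _
            "Initial Catalog=Trade;Data Source=" & dataSource & ";"

        Dts.TaskResult = Dts.Results.Success
    End Sub

End Class

The declarative alternative is a property expression on the connection manager's ConnectionString (or, for the SQLStatement case above, on the Execute SQL Task's SqlStatementSource property), which achieves the same thing without any script.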


Migrating DTS packages on instances

I have a problem and don't know how to resolve it.

I'm running multiple instances (developpement, acceptation, production) on one server, with the same databases on each one of them.

Now I'm trying to find a way to transfer DTS packages from one instance to another (I know it is straightforward if you don't have to change anything), but in my case I want to change the connection properties within the DTS package, e.g.:

On my server\developpement instance I've created a DTS package whose connection properties use the server name server\developpement.

Now I want to transfer it to my server\acceptation instance, and there I want the connection properties' server name to be server\acceptation.

I've tried doing it with a Dynamic Properties task, but I can't seem to find the instance name where the package is saved. I can only find the server name (environment variable).

This of course is no problem if you have but a few packages, but if you have over 50 to transfer...

Any help would be more than welcome.

Rosko

PS: I know it is possible between two different servers using (local) as the server name, but that's not an option for my boss.

I have a very similar situation. I currently have global variables. When I migrate, I am changing this from Package Properties.

If I do not find a solution, I plan to have a flat file and read the information from that file using ActiveScript.

Please let me know if you have a better solution.|||See

'LoadFromSQLServer' and 'SaveToSQLServer'
in SQL Server Books Online or on the web.
These can be called from VB and will repopulate the packages on the other server.

For more advanced changes, the attached structure can be populated and manipulated in VB to create new DTS packages.|||Correct me if I am wrong. I think this still does not address the issue.

The issue is to set the data source property of the connections within the package(s) when the package is migrated from one instance to another.

LoadFromSQLServer and SaveToSQLServer seem to help in migrating from one instance to another.|||Unfortunately jayaramanc,

The connections held in the Connections object must also be updated with the new server name, as must any ADODB connections in ActiveX scripts in the Steps object.
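To make the LoadFromSQLServer/SaveToSQLServer suggestion concrete, here is a minimal sketch in VB.NET using the late-bound DTS COM object model, so no specific interop reference is assumed. The server and package names are placeholders, and 256 is the trusted-connection storage flag. As noted above, this repoints only the Connections collection; ADODB connections inside ActiveX Script tasks still have to be edited by hand.

' Minimal sketch only: load a DTS package from one instance, repoint its
' connections, and save it on another instance. Names below are placeholders.
Option Strict Off    ' late binding against the DTS.Package COM object

Module RepointDtsPackage
    Sub Main()
        Dim pkg As Object = CreateObject("DTS.Package")

        ' Load the package from the source instance (256 = use trusted connection).
        pkg.LoadFromSQLServer("SERVER\developpement", "", "", 256, "", "", "", "MyPackage")

        ' Repoint every connection to the target instance.
        For Each cn As Object In pkg.Connections
            cn.DataSource = "SERVER\acceptation"
        Next

        ' Save the modified package on the target instance and release it.
        pkg.SaveToSQLServer("SERVER\acceptation", "", "", 256)
        pkg.UnInitialize()
    End Sub
End Module

Looping this over a list of package names (for example, read from sysdtspackages) turns it into a bulk transfer for the 50-plus packages mentioned above.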

Migrating DTS packages from SQL 2000 to SQL 2005

Hello All,

I am working on a project to migrate my database (let's name it DB1) from SQL 2000 to SQL 2005. I have some DTS packages in my SQL 2000 database which also need to be migrated to SSIS. These DTS packages currently interact with another database (let's name it DB2), which is also on SQL 2000 and which handles the data transmission. The real issue is that the other database (DB2) also has some DTS packages which communicate with DB1 (the DB to be upgraded), so when I migrate DB1 to SQL 2005 I will have to change my DB2 packages as well (although the change will be minor). Considering that in future I might also migrate DB2 to SQL 2005, I wanted to know what the right approach would be. One option is to modify the DTS packages of DB2 to accommodate the change in connection; the other is to migrate the DTS packages of DB2 to SSIS as well.

Can anyone help me with this one?

Thanks in Advance

Mitesh

Hi Mitesh,

Something to consider: You can migrate a SQL Server 2000 database (DB1) to SQL Server 2005 and continue to run it at a SQL Server 2000 Compatibility Level. The implication is you may be able to continue connecting to DB1 from the DB2 DTS packages as you do now - it will just be on a different instance of SQL Server. I'm not sure what your DB2 DTS packages do exactly, so it's difficult to make that statement with more confidence.

There's also an Execute DTS 2000 Package Task available in SSIS. You can execute your existing DTS packages from SSIS. I've used this (and the aforementioned Compatibility Level) before as part of a migration plan from SQL Server 2000 to SQL Server 2005. It's nice because I don't have to get everything converted before starting to use the new database engine.

Regarding migrating between DTS and SSIS: that's a tough one. I teach ETL with SSIS for Solid Quality Learning. One of the things I share with students is the fact that SSIS is not the new version of DTS - it's a replacement. Microsoft did not start with DTS and modify it - they rewrote the entire application. This has some bearing on how easy it is to migrate from DTS. I've seen very simple DTS packages migrate easily, but that's about it.

There's so much new under the hood, you will most likely want to rebuild - and perhaps redesign - your packages in SSIS to take advantage of some of the stellar performance gains.

Hope this helps,

Andy

|||

Hey Andy,

Thanks for the reply; I will surely consider your suggestion. Just one more thing: when I use the Execute DTS 2000 Package task in SSIS, I will have to change my connection, as my database is migrated to SQL 2005 on another server. Correct me if I am wrong.

|||

Hi Mitesh,

Yes sir - you will have to change your connection to target another database, server, or both.

Hope this helps,

Andy

Migrating DTS packages from SQL2000 to SQL2005

Hi,
My initial tests show that this feature will only see DTS Local Packages.
How can we migrate the DTS Meta Data Services Packages?
Thanks
Chris
The requirements for migrating DTS packages saved to the
repository are explained in Books Online help topic:
Using the Package Migration Wizard
http://msdn2.microsoft.com/en-us/library/ms143496.aspx
-Sue
On Tue, 19 Sep 2006 08:04:56 -0600, "Chris Wood"
<anonymous@.discussions.microsoft.com> wrote:

>Hi,
>My initial tests show that this feature will only see DTS Local Packages.
>How can we migrate the DTS Meta Data Services Packages?
>Thanks
>Chris
>
|||Sue,
This mentions Repository packages only if SQL2000, SQL2000 tools or the
Repository files are present on the local computer, the server that the
Migration Wizard is being run on I assume? As we have msdb as our Repository
we would have to migrate the SQL2000 msdb to the SQL2005 to find the Meta
Data Services packages?
Thanks
Chris
"Sue Hoegemeier" <Sue_H@.nomail.please> wrote in message
news:7a20h2pq7uggrgp7iuvggghvdpu343mms8@.4ax.com...
> The requirements for migrating DTS packages saved to the
> repository are explained in Books Online help topic:
> Using the Package Migration Wizard
> http://msdn2.microsoft.com/en-us/library/ms143496.aspx
> -Sue
> On Tue, 19 Sep 2006 08:04:56 -0600, "Chris Wood"
> <anonymous@.discussions.microsoft.com> wrote:
>
|||Yes...on the server where the migration wizard is being run.
The wizard needs the repository files to read the packages.
You just need the repository redistributable files on that
server.
-Sue
On Tue, 19 Sep 2006 10:07:43 -0600, "Chris Wood"
<anonymous@.discussions.microsoft.com> wrote:

>Sue,
>This mentions Repository packages only if SQL2000, SQL2000 tools or the
>Repository files are present on the local computer, the server that the
>Migration Wizard is being run on I assume? As we have msdb as our Repository
>we would have to migrate the SQL2000 msdb to the SQL2005 to find the Meta
>Data Services packages?
>Thanks
>Chris
>"Sue Hoegemeier" <Sue_H@.nomail.please> wrote in message
>news:7a20h2pq7uggrgp7iuvggghvdpu343mms8@.4ax.com.. .
>
|||Thanks Sue.
The analysts changed the packages to Local packages so we could use the
Migration Wizard.
Now that we have migrated them, we had to install the SQL2000 DTS Designer
Components from the Feature Pack, but we still cannot see the migrated
packages. I remember an article by Brian Moran in SQL Mag on how to fix this
problem, but I don't have the article on hand.
Can someone tell me the fix please? I think it had to do with the registry.
Thanks
Chris
"Sue Hoegemeier" <Sue_H@.nomail.please> wrote in message
news:2t81h2trkbmjlfsve86v8egsgs9lc01cci@.4ax.com...
> Yes...on the server where the migration wizard is being run.
> The wizard needs the repository files to read the packages.
> You just need the repository redistributable files on that
> server.
> -Sue
> On Tue, 19 Sep 2006 10:07:43 -0600, "Chris Wood"
> <anonymous@.discussions.microsoft.com> wrote:
>


Migrating DTS packages from one machine to another

I am building a new machine (SQL Server 2000 SP3a on Windows 2003 R2) to replace an
existing production machine (SQL Server 2000 SP3a on Windows 2000) with 27
user databases. I can move databases, logins, and scheduled jobs. I have a lot of
DTS packages and Maintenance Plans. Is there an easy way to recreate these
DTS packages and Maintenance Plans on the new server without creating each
one of them manually?
--
RK

You can open the package on one server, using DTS Designer, and then using
[Package], [Save As], save it to a different server.
Arnie Rowland
Most good judgment comes from experience.
Most experience comes from bad judgment.
- Anonymous
"RK73" <RK73@.discussions.microsoft.com> wrote in message
news:F5FA74AD-DFFC-4C10-AC6D-CED5F8DE6203@.microsoft.com...
>I am building a new machine (SQL server 2000 SP3a on Windows 2003 R2) for
>an
> exisitng production michine (SQL server 2000 SP3a on Windows 2000) with 27
> user databases. I can move databases, logins, scheduled jobs. I have lot
> of
> DTS packages and Maintenance Plans. Is there an easy way to recreate these
> DTS packages and Maintenance plans on the new server with out creating
> each
> one of them manually?
> --
> RK|||Arnie Rowland wrote:
> You can open the package on one server, using DTS Designer, and then using
> [Package], [Save As], save it to a different server.
> --
> Arnie Rowland
> Most good judgment comes from experience.
> Most experience comes from bad judgment.
> - Anonymous
>
> "RK73" <RK73@.discussions.microsoft.com> wrote in message
> news:F5FA74AD-DFFC-4C10-AC6D-CED5F8DE6203@.microsoft.com...
http://www.sqldts.com/?204
Regards
Amish Shah

Wednesday, March 28, 2012


Migrating DTS packages and SQL Jobs

Hello All,
We are in the process of migrating our SQL Server 2000 production server to
better hardware and Windows 2003. We have a lot of DTS packages and SQL jobs
that need to be migrated as well. I can script the jobs and run the script
on the new server, and similarly I can save the DTS packages on the new server.
Alternatively, can I back up the msdb database and restore it on the new server?
I was wondering which method would be best.
Thanks,
rgn

Restoring MSDB is certainly an option, but if you have anything that
references the original server name directly you may have to fix that.
These links may help:
http://vyaskn.tripod.com/moving_sql_server.htm Moving DBs
http://msdn2.microsoft.com/en-us/library/ms345408(en-US,SQL.90).aspx Moving
system dbs 2005
http://www.databasejournal.com/features/mssql/article.php/3379901 Moving
system DB's 2000
http://www.support.microsoft.com/?id=314546 Moving DB's between Servers
http://www.support.microsoft.com/?id=224071 Moving SQL Server Databases
to a New Location with Detach/Attach
http://www.support.microsoft.com/?id=221465 Using WITH MOVE in a Restore
http://www.sqlservercentral.com/columnists/cBunch/movingyouruserswiththeirdatabases.asp
Moving Users
http://www.support.microsoft.com/?id=246133 How To Transfer Logins and
Passwords Between SQL Servers
http://www.support.microsoft.com/?id=298897 Mapping Logins & SIDs after
a Restore
http://www.dbmaint.com/SyncSqlLogins.asp Utility to map logins to
users
http://www.support.microsoft.com/?id=168001 User Logon and/or Permission
Errors After Restoring Dump
http://www.support.microsoft.com/?id=240872 How to Resolve Permission
Issues When a Database Is Moved Between SQL Servers
http://www.sqlservercentral.com/scripts/scriptdetails.asp?scriptid=599
Restoring a .mdf
http://www.support.microsoft.com/?id=307775 Disaster Recovery Articles
for SQL Server
http://www.support.microsoft.com/?id=320125 Moving a Diagram
http://www.support.microsoft.com/?id=274463 Copy DB Wizard issues 2000
http://www.sqlservercentral.com/scripts/contributions/1598.asp Script
Roles and Permissions
--
Andrew J. Kelly SQL MVP
"rgn" <rgn@.discussions.microsoft.com> wrote in message
news:798481FC-6AD6-461C-A554-AF99EBD5B3E7@.microsoft.com...
> Hello All,
> We are in the process of migrating our SQLServer2000 production server to
> a
> better hardware and Windows 2003. We have lot of DTS packages and SQL Jobs
> that needs to be migrated as well. I can script the Jobs and run the
> script
> on the new server and similarly I can save the DTS package on the new
> server.
> Instead, Can I backup the msdb database and restore it on the new server.
> I
> was wondering which method would be the best.
> Thanks,
> rgn


Migrating DTS packages

Is there any way to copy all the DTS packages and jobs in one step from one server to another? I usually copy one DTS package at a time. As our package count has grown, it has become a painful process.
If anyone knows a way to copy these packages and jobs in one step, it would be very helpful.

Thanks

I don't think you can for DTS. Steps I have used in the past:

Save the DTS package to the hard drive:
Save the DTS package as a file on disk, then copy this file to the target SQL Server and load the DTS package on the target machine. Don't forget to save it in SQL Server.

Or...

Run the DTS Export Wizard and transfer the rows returned by this select statement:

SELECT * FROM sysdtspackages WHERE name = 'DTSPackageName'

from the msdb database on the source server to the msdb database on the destination server to transfer the DTSPackageName package.

You may be able to add multiple package names; I have never played around with the code. (Maybe someone else will have an idea.)

And...

As for jobs, you can create a DTS package and, under Task, select 15 - Transfer Jobs Task and follow the steps. You can do all of your jobs at once.

Hope it works out for you.

Thanks

Lystra|||As for the DTS packages, go to http://www.sqldts.com/default.aspx?242 - this is a free tool to migrate the DTS packages in one step.
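For the sysdtspackages route suggested above (and mentioned again further down the page), here is a minimal VB.NET sketch that copies every saved version of one package from the source msdb to the destination msdb. The connection strings and the package name are placeholders, and it blindly appends rows, so remove any existing copy of the package on the target first.

' Minimal sketch of the "move rows between msdb.dbo.sysdtspackages" idea above.
' Connection strings and the package name below are placeholders.
Imports System.Data.SqlClient

Module CopyDtsPackageRows
    Sub Main()
        Dim sourceCs As String = "Data Source=SourceServer;Initial Catalog=msdb;Integrated Security=SSPI"
        Dim targetCs As String = "Data Source=TargetServer;Initial Catalog=msdb;Integrated Security=SSPI"
        Dim packageName As String = "DTSPackageName"

        Using src As New SqlConnection(sourceCs)
            Using dst As New SqlConnection(targetCs)
                src.Open()
                dst.Open()

                ' Read every saved version of the package from the source msdb.
                Dim cmd As New SqlCommand("SELECT * FROM dbo.sysdtspackages WHERE name = @name", src)
                cmd.Parameters.AddWithValue("@name", packageName)

                Using reader As SqlDataReader = cmd.ExecuteReader()
                    ' Bulk-copy the rows into the destination msdb (same schema on both sides).
                    Using bulk As New SqlBulkCopy(dst)
                        bulk.DestinationTableName = "dbo.sysdtspackages"
                        bulk.WriteToServer(reader)
                    End Using
                End Using
            End Using
        End Using
    End Sub
End Module

Jobs are a separate matter; the Transfer Jobs task mentioned above (or scripting the jobs out of msdb) still covers those.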

migrating DTS packages

I need to migrate DTS packages on one server to structured storage files and then restore them all to another server on a different network.

I got the restore method down fine, but all of the backup methods I used yield an incorrect DTS package. The only way I can get the package to look right is if I manually open and save it.

I would like an automated way if possible.

I tried tools from sqldts.com, wrote my own, etc... none of them work right.

It is SQL Server 2000.

thanks

What's the code you are using for your backup method? If you do a quick search on here, you should find a post that I recently wrote on a script that backs up all DTS packages for a selected instance. I have restored from one of these with no problem (other than perhaps needing to reset a few connection properties).

Regards,

hmscott|||I cannot find your recent post regarding backing up DTS packages; could you provide me with the title of the post, or re-post the code?

Here is one of the methods I used:

-- Build a DTSRUN command for each package in msdb and run it via xp_cmdshell,
-- saving each package as a structured storage (.dts) file in @TARGETDIR.
DECLARE @COMMAND varchar(1000)
DECLARE @TARGETDIR varchar(500)
SET @TARGETDIR = 'D:\DTS\'

DECLARE c1 CURSOR FOR

SELECT DISTINCT
'DTSRUN.EXE /S '
+ CONVERT(varchar(200), SERVERPROPERTY('servername'))
+ ' /E '
+ ' /N '
+ '"' + name + '"'
+ ' /F '
+ '"' + @TARGETDIR + replace(name,' ','_') + '.dts"'
+ ' /!X'
FROM msdb.dbo.sysdtspackages P

OPEN c1

FETCH NEXT FROM c1
INTO @COMMAND

WHILE @@FETCH_STATUS = 0
BEGIN

exec xp_cmdshell @COMMAND, no_output

FETCH NEXT FROM c1
INTO @COMMAND
END

CLOSE c1
DEALLOCATE c1|||Search appears to be FUBAR. I found it by scrolling through. Here it is...

http://www.dbforums.com/showthread.php?t=1607904

Regards,

hmscott|||nevermind.....|||didn't work

I made the update to this thread over here:

http://www.dbforums.com/showthread.php?p=6233426#post6233426|||I can not find my script but I swear I have done this by just moving records between msdb.dbo.sysdtspackages tables|||didn't work

I made the update to this thread over here:

http://www.dbforums.com/showthread.php?p=6233426#post6233426

It's bad form, I realize, but I posted an update to your update on the other thread. Let's agree to move the discussion there for now. Perhaps we can convince a mod to merge the threads?

Regards,

hmscott

Friday, March 23, 2012

migrate sql2k dts packages to sql2k5

New to sql2k5.
I have some sql2k DTS packages I migrated to sql2k5 using the wizard, and
they showed up under Integration Services, Stored Packages, MSDB, but
I can't open or edit any of the packages.
For the same packages, if I saved them as external files in sql2k and then
imported them into sql2k5, they show up in SSMS under Management, Legacy, in the
DTS folder, and since I downloaded the Microsoft SQL Server 2000 DTS
Designer Components, I can open and edit those packages.
This is all very confusing to me.
First, why do the DTS packages show up in different places in sql2k5 (what's the
difference between import and migrate)?
Second, how do I edit the migrated DTS packages?
Third, in sql2k, DTS packages are part of msdb; as long as I back up msdb, I
can restore all the packages. How about in sql2k5 - where are the DTS packages
stored, and are they still part of msdb?
Thank you.

=== Steve L === wrote:
> new to sql2k5.
> i have some sql2k dts packages i migrated to sql2k5 using wizard and
> they showed up under intergration services, stored packages, msdb, but
> I can't open or edit any of the package.
> For the same packages, if I saved them as external file in sql2k, then
> import them in sql2k5, they will show up in SSMS, management, legacy,
> DTS folder, and if had downloaded the Microsoft SQL Server 2000 DTS
> Designer Components , I can open and edit the packages.
> This is all very confusing to me.
> first, why the dts showed up at different place in sql2k5 (what's the
> difference between import and migrate?)
> 2nd, how to edit the migrated dts packages?
> 3rd, in sql2k, dts package is part of msdb, as long as I backup msdb, I
> can restore all the pckages, how about in sql2k5, where are dts being
> stored and are they still part of msdb?
> thank you.
>
DTS doesn't exist in SQL 2005, it has been replaced with Integration
Services...|||DTS is no longer in sql2k5, but you can still import or
migrate sql2k DTS packages to it. I think you are missing what I asked in my
posting.
Tracy McKibben wrote:
> === Steve L === wrote:
> DTS doesn't exist in SQL 2005, it has been replaced with Integration
> Services...

Friday, March 9, 2012

Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException

Hi

I have an SSIS project that has one parent package and three child packages. When I run the project on my development machine in debug mode, it works fine. Also, if I run the packages using dtexec on my development machine, it still works fine. However, the problem comes when I try to run the project using dtexec on the staging server; I get the following error:

Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.

Does anyone have any idea how to fix this, please?

thanks

G

Probably the data on the staging server is different from the data on the development machine?

You are trying to put data into the buffer that is bigger than the buffer column size allocated for this data. E.g. a script component builds some string, gets a string of length 60, and then tries to put it into a DT_STR column of length 50 - you'll get this exception as a result. You need to adjust the column size to match the expected data, or clean up the input data to avoid the overflow.

You probably don't see it on the development machine, because your test data is different and this condition never occurs.

http://msdn2.microsoft.com/en-us/microsoft.sqlserver.dts.pipeline.doesnotfitbufferexception.aspx
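As an illustration of the point above, here is a minimal Script Component sketch in VB.NET that avoids the exception by truncating before the assignment. The column names NameIn/NameOut and the 50-character limit are made up; only the truncation pattern matters, and widening the output column is the other fix.

' Sketch of a transformation-type Script Component. It assumes the input has a
' string column NameIn and Output 0 has a DT_STR column NameOut of length 50;
' both names and the length are illustrative. Null handling is omitted.
Public Class ScriptMain
    Inherits UserComponent

    Private Const OutputLength As Integer = 50   ' must match the declared DT_STR length

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        Dim value As String = Row.NameIn

        ' Assigning a string longer than the declared column length is what raises
        ' DoesNotFitBufferException, so truncate (or widen the output column instead).
        If value IsNot Nothing AndAlso value.Length > OutputLength Then
            value = value.Substring(0, OutputLength)
        End If

        Row.NameOut = value
    End Sub
End Class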

|||Thanks for the response, but I am not sure that is the problem, since I am using the exact same data and the exact same packages on both machines.|||

Hi

After some testing I have found that when I run the child packages in series, as opposed to in parallel, it all works fine. This will do for the meantime, but if you have any idea why this is, could you please let me know?

Thanks for the help.

Gary

|||

Hi again

I have a second package that has the same setup but manipulates a different data set and I am getting this same error but this time I can't find a way around it.

The data and the packages are exactly the same on both servers. On the development server the packages run with no problem, but on the staging server I get buffer memory issues.

Anyone know why this is maybe?

It seems that the column width is too wide for the buffer so the buffer is overflowing into a second buffer which is being used by the other child package.

Any ideas?

thanks

G

|||

Pipeline Buffers are not shared between child and parent packages.

Could you describe what the packages are doing? What component reports this error? Do both dev and staging machine have the same architecture (32-bit, or 64-bit)?

|||

Hi

In the parent package, data is extracted from multiple flat files and bulk inserted into multiple database tables. Following this, 4 child packages are initiated using the Execute Package task.

Within each child package a data flow is started. In the data flow, information is extracted from a table (populated in the parent package). The data is transformed using a script component. A row count is taken and then the data is inserted into a second database table.

After some investigation. I have discovered:

-on the development server the entire package works perfectly with no errors

-on the staging server each child package works when it is the only one executed from the parent package.

-on the staging server when I run two child packages it seems the first child package works fine but in the second package to be executed the script component does not get executed (I inserted a MessageBox and ComponentMetaData.FireInformation and there was no output).

This is the error message in the log when running all 4 child packages together:

--

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:12:57 PM,2006/06/29 02:12:57 PM,0,0x,The value is too large to fit in the column data area of the buffer. Record ID - Merchant Number (ODS: QMAKBACCTNBR) : 00000000000000000002063 MerchantNumber (ODS: QMAKBACCTNBR)OnInformation,,,Transform Data Into MerchantStatementsMonthly,,,2006/06/29 02:15:07 PM,2006/06/29 02:15:07 PM,1074016264,0x,Post Execute phase is beginning.

OnInformation,,,ODS To DWH MerchantStatementsMonthly,,,2006/06/29 02:15:07 PM,2006/06/29 02:15:07 PM,1074016264,0x,Post Execute phase is beginning.

OnInformation,,,TrOnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:03 PM,2006/06/29 02:23:03 PM,-1073450910,0x,Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.HandleUserException(Exception e)
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)
OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:03 PM,2006/06/29 02:23:03 PM,-1073450910,0x,Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.HandleUserException(Exception e)
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)
OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:06 PM,2006/06/29 02:23:06 PM,-1073450974,0x,The ProcessInput method on component "Script Component" (10641) failed with error code 0x80131600. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:06 PM,2006/06/29 02:23:06 PM,-1073450974,0x,The ProcessInput method on component "Script Component" (10641) failed with error code 0x80131600. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1071636284,0x,The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1073450975,0x,Thread "WorkThread0" has exited with error code 0x80131600.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1071636284,0x,The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1073450975,0x,Thread "WorkThread0" has exited with error code 0x80131600.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:09 PM,2006/06/29 02:23:09 PM,-1073450952,0x,The PrimeOutput method on component "Q19400D" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:09 PM,2006/06/29 02:23:09 PM,-1073450952,0x,The PrimeOutput method on component "Q19400D" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

-

The development server is 32-bit and the staging server is 64-bit, but they are both running 32-bit operating systems.

Any help would be appreciated.

thanks

G


|||

Hi

We had somewhat the same problems as described above. We are implementing a data warehouse solution using sql2005 and a 64-bit Windows 2003 server. We use workflow packages that execute child packages simultaneously. Errors concerning memory and buffers popped up at random in different child packages.

We installed SP1 and the cumulative hotfix package (build 2153). This helped a little but didn't solve all the problems. We have been testing for two days now, and it seems that changing the SSIS package property MaxConcurrentExecutables from -1 to 1 in the workflow package solves the remaining memory and buffer problems. Setting this property prohibits concurrent execution of child packages.

It seems buffer management on 64-bit servers is still a bit buggy.

Hope this helps in your case too.

Jan

|||

Hi

Thanks for the help. I narrowed the problem down to the script components in the child packages so I deleted them and recreated them and it seems to work fine.

Not sure why this is. Anyone know?

|||

Strange.

Did you create them in a beta version?

Do you have a copy of the old package? Maybe check the differences in the XML in the .dtsx file.

-Jamie

|||

hi,

I also have the same problem, and I also recreated my task... but it did not solve my problem.

I'm using a Script Component to do a custom transformation.
I added 12 output columns to Output0.
5 of these columns are DT_STR of length 4, 3 are DT_STR of length 2, 3 are
DT_STR of length 12, and 1 is DT_I4.

When executing the package, I get the exception below.

The value is too large to fit in the column data area of the buffer.
at Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.SetString(Int32
columnIndex, String value)
at Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.set_Item(Int32
columnIndex, Object value)
at Microsoft.SqlServer.Dts.Pipeline.ScriptBuffer.set_Item(Int32
ColumnIndex, Object value)
at
ScriptComponent_04dc3685b8d744898f09d88716141989.Output0Buffer.set_isjsst(String
Value)
at
ScriptComponent_04dc3685b8d744898f09d88716141989.ScriptMain.Input0_ProcessInputRow(Input0Buffer
Row)
at
ScriptComponent_04dc3685b8d744898f09d88716141989.UserComponent.Input0_ProcessInput(Input0Buffer
Buffer)
at
ScriptComponent_04dc3685b8d744898f09d88716141989.UserComponent.ProcessInput(Int32
InputID, PipelineBuffer Buffer)
at
Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32
inputID, PipelineBuffer buffer)


I debugged and found the exception comes from the Output0Buffer set value.
I also did some testing:
1. Created a simple package with the same Script Component and found no
exception.
2. Removed some Output0 columns and found no exception.
It's very confusing to me...

Any help is very much appreciated.

|||I had the same error and exception. I deleted and recreated the script component and the destination component that the output from the script component was going to. It works for me now.|||I was getting the same error - I found the resolution was to make sure the output column created in the design script was set to the same size as the source column.

In my case I was trying to set the destination column to 35, since that is the size of the ultimate destination column in the destination database. Even though I was making sure the string inside the script was truncated to 35 characters, the script component would still throw the error.

I instead created a Derived Column component immediately following the script component, and used that one to do the truncation to the destination column size...|||In my case it was a custom component which did not update the OutputBuffer field lengths after a change in a source component.

Thanks, team, for pointing me in the right direction!

Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException

Hi

I have a SSIS project that has one parent package and three child packages. When I run the project on my development machine in debug mode it works fine. Also if i run the packages using dtexec on my development machine it still works fine. However the problem comes in when I try and run the project using dtexec on the staging server i get the following error:

Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.

does anyone have any idea how to fix this please?

thanks

G

Probably the data on staging server is different from the data on development machine?

You are trying to put data into buffer that is bigger than the buffer column size allocated for this data. E.g. script component builds some string, gets a string of length 60, and then tries to put it into STR column of length 50 - you'll get this exception as a result. You need to adjust the column size to match expected data, or cleanup the input data to avoid the overflow.

You probably don't see it on development machine, because your test data is different and this condition never occurs.

http://msdn2.microsoft.com/en-us/microsoft.sqlserver.dts.pipeline.doesnotfitbufferexception.aspx

|||thanks for the response but I am not sure that that is the problem since I am using the exact same data and exact same packages on both machines?|||

Hi

After some testing I have found that when I run the child packages in series as apposed to in parallel it all works fine. This should do for the mean time but if you have any idea why this is could u please let me know.

Thanks for the help.

Gary

|||

Hi again

I have a second package that has the same setup but manipulates a different data set and I am getting this same error but this time I can't find a way around it.

The data and the packages are exactly the same on both servers.On the development server the packages run with no problem but on the staging server I get buffer memory issues.

Anyone know why this is maybe?

It seems that the column width is too wide for the buffer so the buffer is overflowing into a second buffer which is being used by the other child package.

Any ideas?

thanks

G

|||

Pipeline Buffers are not shared between child and parent packages.

Could you describe what the packages are doing? What component reports this error? Do both dev and staging machine have the same architecture (32-bit, or 64-bit)?

|||

Hi

In the parent package data is extracted from multiple flat files and bulk inserted into multiple database table. Following this 4 child packages are initiated using the execute package task.

within each child package a data flow is started. In the data flow information is extracted from a table (populated in the parent package). The data is transformed using a script component. A row count is taken and then the data is inserted into a second database table.

After some investigation. I have discovered:

-on the development server the entire package works perfectly with no errors

-on the staging server each child package works when it is the only one executed from the parent package.

-on the staging server when I run two child packages it seems the first child package works fine but in the second package to be executed the script component does not get executed (I inserted a MessageBox and ComponentMetaData.FireInformation and there was no output).

This is the error message in the log when running all 4 child packages together:

--

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:12:57 PM,2006/06/29 02:12:57 PM,0,0x,The value is too large to fit in the column data area of the buffer. Record ID - Merchant Number (ODS: QMAKBACCTNBR) : 00000000000000000002063 MerchantNumber (ODS: QMAKBACCTNBR)OnInformation,,,Transform Data Into MerchantStatementsMonthly,,,2006/06/29 02:15:07 PM,2006/06/29 02:15:07 PM,1074016264,0x,Post Execute phase is beginning.

OnInformation,,,ODS To DWH MerchantStatementsMonthly,,,2006/06/29 02:15:07 PM,2006/06/29 02:15:07 PM,1074016264,0x,Post Execute phase is beginning.

OnInformation,,,TrOnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:03 PM,2006/06/29 02:23:03 PM,-1073450910,0x,Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.HandleUserException(Exception e)
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)
OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:03 PM,2006/06/29 02:23:03 PM,-1073450910,0x,Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.HandleUserException(Exception e)
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)
OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:06 PM,2006/06/29 02:23:06 PM,-1073450974,0x,The ProcessInput method on component "Script Component" (10641) failed with error code 0x80131600. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:06 PM,2006/06/29 02:23:06 PM,-1073450974,0x,The ProcessInput method on component "Script Component" (10641) failed with error code 0x80131600. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1071636284,0x,The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1073450975,0x,Thread "WorkThread0" has exited with error code 0x80131600.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1071636284,0x,The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1073450975,0x,Thread "WorkThread0" has exited with error code 0x80131600.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:09 PM,2006/06/29 02:23:09 PM,-1073450952,0x,The PrimeOutput method on component "Q19400D" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:09 PM,2006/06/29 02:23:09 PM,-1073450952,0x,The PrimeOutput method on component "Q19400D" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

-

The development server is 32 bit and the staging server is 64bit but they are both runnning 32 bit operating systems.

Any help would be appreciated.

thanks

G


|||

Hi

We had somewhat the some problems as described above. We are implementing a datawarehouse-solution using sql2005 and an 64bit windows 2003 server. We use workflow packages that executes childpackages simultaniously. Errors concerning memory and buffers popped up at random in different childpackages.

We installed SP1 and the Cumulative hotfix package (build 2153). This helped a little but didn't solve all the problems. We are testing for two days now and it seems that changing the SSISpackage-property MaxConcurrentExecutables from -1 to 1 in the workflow package solves the remaining memory and buffer problems. Setting this property prohibits concurrent execution of childpackages.

It seems buffermanagement on 64bit servers is still a bit buggy.

Hope this helps in your case too.

Jan

|||

Hi

Thanks for the help. I narrowed the problem down to the script components in the child packages, so I deleted and recreated them, and it now seems to work fine.

Not sure why this is. Anyone know?

|||

Strange.

Did you create them in a beta version?

Do you have a copy of the old package? Maybe check the differences in the XML in the .dtsx file.

-Jamie

|||

hi,

I also have the same problem and also recreated my task, but that did not solve it.

I'm using a Script Component task to do a custom transformation.
I added 12 output columns to Output0: 5 of these columns are DT_STR with length 4, 3 are DT_STR 2, 3 are DT_STR 12, and 1 is DT_I4.

When executing the package, I get the exception below.

The value is too large to fit in the column data area of the buffer.
   at Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.SetString(Int32 columnIndex, String value)
   at Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.set_Item(Int32 columnIndex, Object value)
   at Microsoft.SqlServer.Dts.Pipeline.ScriptBuffer.set_Item(Int32 ColumnIndex, Object value)
   at ScriptComponent_04dc3685b8d744898f09d88716141989.Output0Buffer.set_isjsst(String Value)
   at ScriptComponent_04dc3685b8d744898f09d88716141989.ScriptMain.Input0_ProcessInputRow(Input0Buffer Row)
   at ScriptComponent_04dc3685b8d744898f09d88716141989.UserComponent.Input0_ProcessInput(Input0Buffer Buffer)
   at ScriptComponent_04dc3685b8d744898f09d88716141989.UserComponent.ProcessInput(Int32 InputID, PipelineBuffer Buffer)
   at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)


Debugging shows the exception comes from setting a value on Output0Buffer.
I also did some testing:
1. Created a simple package with the same Script Component task and found no exception.
2. Removed some Output0 columns and found no exception.
It's very confusing...

Any help is very much appreciated.
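
For what it's worth, this exception is normally raised when the string assigned to an output column is longer than the length declared for that column on the script component's Inputs and Outputs page. Here is a minimal sketch of a defensive guard inside the script (VB.NET, assuming an asynchronous Output0 populated via Output0Buffer.AddRow as in the stack trace above; the column names InCode/OutCode and the length 4 are illustrative assumptions, not taken from the post):

Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        Output0Buffer.AddRow()

        If Row.InCode_IsNull Then
            Output0Buffer.OutCode_IsNull = True
        Else
            Dim value As String = Row.InCode
            ' OutCode is assumed to be declared as DT_STR(4) on Output0; anything
            ' longer raises DoesNotFitBufferException, so truncate before assigning.
            If value.Length > 4 Then
                value = value.Substring(0, 4)
            End If
            Output0Buffer.OutCode = value
        End If
    End Sub
End Class

If the extra characters are actually needed, the alternative is to widen the output column's declared length on the Inputs and Outputs page instead of truncating.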

|||I had the same error and exception. I deleted and recreated the script component and the destination component that the output from the script component was going to. It works for me now.|||I was getting the same error - I found the resolution was to make sure the output column created in the design script was set to the same size as the source column.

In my case I was trying to set the destination column to 35, since that is the size of the ultimate destination column in the destination database. Even though I was making sure the string inside the script was truncated to 35 characters, the script component would still throw the error.

I instead created a derived column component immediately following the script component, and used that one to do the truncation to the destination column size...|||In my case it was a custom component which did not update the OutputBuffer field lengths after a change in a source component

Thanks team for pointing the right direction!
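
On the custom-component point: output column widths are part of the component's metadata, so when an upstream column gets wider the component has to re-declare its matching output column or the pipeline keeps allocating the old, narrower buffer cell. A rough sketch of doing that in ReinitializeMetaData (VB.NET; the column index, the DT_STR length of 35, and code page 1252 are illustrative assumptions, not details from the post above):

Imports Microsoft.SqlServer.Dts.Pipeline
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

Public Class ResyncOutputColumn
    Inherits PipelineComponent

    Public Overrides Sub ReinitializeMetaData()
        ' First output, first column - purely for illustration.
        Dim outCol As IDTSOutputColumn90 = _
            ComponentMetaData.OutputCollection(0).OutputColumnCollection(0)
        ' Re-declare the column as DT_STR(35) in code page 1252 so the pipeline
        ' allocates a buffer cell wide enough for the widened source data.
        outCol.SetDataTypeProperties(DataType.DT_STR, 35, 0, 0, 1252)
        MyBase.ReinitializeMetaData()
    End Sub
End Class

Without something like this, the stale length travels into the buffer layout and shows up later as the DoesNotFitBufferException discussed above.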
