Showing posts with label project. Show all posts

Friday, March 30, 2012

Migrating From Development to Production Environment in MSAS 2005

Hi,

I have built a project in Analysis Services 2005 and would like to migrate the contents of the project (dimensions, measures, cubes, etc.) to a new server running Analysis Services 2005. What is the best approach? Does Microsoft provide any best-practices document? I searched the forum, and the topics on migration all seem to cover Analysis Services 2000 to Analysis Services 2005. Can someone guide me on migrating between two Analysis Services 2005 servers?

Thanks

A backup and restore (as in AS 2000) is certainly possible, and it is the only method if you need to move the cube database in a processed state (with data). Alternatively, you could use the new synchronization feature, but that would probably be using it for something it was not quite intended for.

The recommended method is the new AS Deployment Wizard. Make a "build" of your database in BI Development Studio and then run the AS Deployment Wizard. It lets you specify different settings for your cube database when it is deployed to the production server. For more info on how to use the Deployment Wizard, see http://msdn2.microsoft.com/en-US/library/ms176121.aspx

Migrating from Access to SQL Server with a web front end

Afternoon all,

Apologies for cross-posting but as my query covers both Access and SQL
Server I thought I'd send it both!

I have inherited a project to migrate a fairly complex series of
Access databases into a single proper SQL database with a web front
end.

It's quite a nasty job, as people are working on a variety of data sets
at several universities around the world and the data has got very
messy; hence the requirement to put it all on one live web-enabled
database server and provide a web-based front end (particularly as
some users insist on using Macs, so can't run Access as a front end
anyway).

If anyone could give me hints on how to perform such a migration or if
anyone knows of any good books or other documents on this I'd be
grateful for assistance.

Many thanks

Rich May
Museum of London

First off, congratulations - this is a great move to make.

As far as converting the tables goes, use DTS (Data Transformation Services) in
SQL Server. It is pretty much wizard-driven and works really well. As for
the queries, you would do best to forget you even have them in Access and
rewrite everything using stored procedures. My favorite book for "stepping"
between Access and SQL Server is "Microsoft Access Developer's Guide to SQL
Server" by Chipman and Baron. It's wonderful.

As for your web front end... that is where the headache starts. You can
pretty much choose two paths if you want to stay on the Microsoft path. You
can do it using ASP or ASP.NET. Both will run on IIS (.net needs framework
installed).

If you are serving to Macs, you may deliver your app better using ASP
because you will be able to better control the rendered output (very good
for Netscape). I use Visual Basic to build "ActiveX DLL" components to
alleviate the complexity of the ASP pages so you would want to be proficient
in Visual Basic and instantiating COM components. Next, you need to take
ADO (ActiveX Data Objects) head on and learn it well, that is what you will
use to connect to sql, execute queries, and work with data.
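The connect/execute/work-with-data cycle described here is the same in any data-access API. As a runnable illustration only (sketched with Python's built-in sqlite3 as a stand-in; with ADO you would open an ADODB.Connection against SQL Server instead, and the table and values here are hypothetical):

```python
import sqlite3

# Connect (ADO equivalent: Server.CreateObject("ADODB.Connection") + conn.Open)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE datasets (id INTEGER PRIMARY KEY, university TEXT)")
conn.executemany(
    "INSERT INTO datasets (university) VALUES (?)",
    [("London",), ("Berlin",)],
)

# Execute a query and work with the rows
# (ADO equivalent: conn.Execute + a Recordset rs.MoveNext loop)
rows = conn.execute("SELECT id, university FROM datasets ORDER BY id").fetchall()
for row_id, university in rows:
    print(row_id, university)
```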

If you are using IE on all clients and have control over them, you will find
that Visual Studio.NET does a great job at deploying web applications in
ease and speed, once you pass the high learning curve. I VERY highly
recommend "Programming Microsoft Visual Basic .Net" by Francesco Balena
(1600 pages).

If it all sounds complex, don't get me wrong... it is. If you need help,
let me know, we do this for a living.

--
Jerry Boone
Analytical Technologies, Inc.
http://www.antech.biz

Several choices, depending on what you're comfy with ...
1) SQL Server, IIS & ASP
2) MySQL, Apache, PHP

here's a RAD tool that I found the other day to create web pages for
either option (I haven't tried it yet)
http://www.citybusinesslogic.com/apwab.php

And somewhere I've got VBA code to convert MS Access tables to MySQL
format.

Hi Rich,

I think there are 3 options: a) Access's Upsizing Wizard, b) SQL Server's DTS, and c) setting Access up as a linked server and importing the data programmatically. I believe the main problem you'll run into is that Access's datatypes are not 100% compatible with SQL Server's. - Louis

Another problem I have seen with the upsize wizard is that it makes all
columns nvarchar, i.e. Unicode, which doubles the space
requirements.
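The doubling comes from nvarchar storing text as UTF-16 (2 bytes per character for Latin text) where varchar stores 1 byte per character. A quick Python sketch of the arithmetic (the sample value is arbitrary):

```python
# Rough storage comparison for a text value stored as varchar vs nvarchar.
# SQL Server varchar uses 1 byte per character for Latin text;
# nvarchar uses UTF-16, i.e. 2 bytes per character.
value = "Museum of London"

varchar_bytes = len(value.encode("latin-1"))     # 1 byte per character
nvarchar_bytes = len(value.encode("utf-16-le"))  # 2 bytes per character

print(varchar_bytes, nvarchar_bytes)  # 16 32
```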

Duncan

First, I don't think that is true -- all _text_ columns, perhaps, but
certainly not numeric and date/time columns.

Secondly, were you under the impression that Access does NOT use unicode? It
has since Access 2000. There is an option for "unicode compression" to
reduce the impact; I don't know if there is such an option in SQL Server,
but would think that it would have a similar feature.

Larry Linson
Microsoft Access MVP

Is Cold Fusion a viable alternative to ASP in this kind of situation?

*** Sent via Developersdex http://www.developersdex.com ***
Don't just participate in USENET...get rewarded for it!

In MS SQL...

Char is fixed length and pads with spaces
NChar is fixed length and pads with spaces, plus is unicode
VarChar is variable character (with max length spec)
NVarChar is variable (maxlength), plus is unicode

Since I don't deploy to any other language and don't plan to, I never use
the N datatypes.

Also in SQL, the "SmallDateTime" type is 4 bytes and the "DateTime" type is
8 bytes. If it's 4 bytes in Access, then the equivalent is "SmallDateTime".

Among the SQL numeric datatypes there are tinyint, smallint, int, real, bigint, float,
money, smallmoney, and decimal - each with its own range and storage size.
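That rundown amounts to a lookup table. Sketched below in Python; the byte sizes are the SQL Server sizes listed above, while the Access-to-SQL pairings are illustrative assumptions, not a definitive upsizing map:

```python
# Storage sizes (bytes) for common SQL Server types, per the discussion above.
SQL_TYPE_BYTES = {
    "tinyint": 1, "smallint": 2, "int": 4, "bigint": 8,
    "real": 4, "float": 8,
    "smallmoney": 4, "money": 8,
    "smalldatetime": 4, "datetime": 8,
}

# A hypothetical Access (Jet) -> SQL Server mapping one might use when upsizing.
ACCESS_TO_SQL = {
    "Byte": "tinyint",
    "Integer": "smallint",       # Access Integer is 2 bytes
    "Long Integer": "int",
    "Single": "real",
    "Double": "float",
    "Currency": "money",
    "Date/Time": "datetime",
    "Text(50)": "varchar(50)",   # the upsize wizard would pick nvarchar(50)
}

def upsized_type(access_type: str) -> str:
    """Return the SQL Server type for a given Access type (sketch only)."""
    return ACCESS_TO_SQL[access_type]

print(upsized_type("Currency"), SQL_TYPE_BYTES["money"])  # money 8
```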

--
Jerry Boone
Analytical Technologies, Inc.
http://www.antech.biz

Sure, I looked into it some time ago when I was a beginner, but then I
discovered the unimaginable amount of documentation and books from multiple
authors in Microsoft products and I didn't even give it a second thought.
To me, no product is satisfactory if I have to spend days looking for
answers, and especially when they aren't there to be found.

--
Jerry Boone
Analytical Technologies, Inc.
http://www.antech.biz

In my experience, the data conversion part of a project like this is almost
trivial compared to the front-end application challenge.

Access forms and reports can be incredibly complex, with subforms, simple
data binding, no-code dynamic behavior (record navigation, filtering,
sorting, finding, column-reordering, all those are built in to Access, not
at all in asp), and the list goes on and on. If your Access front-end
application is complex, it could be impossibly time-consuming to do in asp,
slightly less so in asp.net.

You would need some other tool for reporting (Crystal or others), I've made
crude html reports in asp and it is nearly impossible to do even the easy
stuff Access reports can do.

As an example, I wrote a full maintenance application for a 30-table database
(with lots of RI between tables), with multiple subforms, dropdown
navigation, and lots of other features, all in one day of work. ASP would
have taken weeks to achieve even close to the same functionality.

So overall, my suggestion is: if the Access forms are complex, plan for 1000
to 2000% more development time than with Access forms. You will definitely need a
better reporting tool than HTML/ASP. Hope this helps,

Steve Nyberg


Migrating DTS packages from SQL 2000 to SQL 2005

Hello All,

I am working on a project to migrate my database (let's call it DB1) from SQL 2000 to SQL 2005. I have some DTS packages in my SQL 2000 database which also need to be migrated to SSIS. These DTS packages currently interact with another database (let's call it DB2), which is also on SQL 2000 and which handles the data transmission. The real issue is that DB2 also has some DTS packages which communicate with DB1 (the database to be upgraded), so when I migrate DB1 to SQL 2005 I will have to change my DB2 packages as well (although the change will be minor). Considering that in the future I might also migrate DB2 to SQL 2005, I wanted to know the right approach to follow. One option is to modify the DTS packages of DB2 to accommodate the change in connection; the other is to migrate the DTS packages of DB2 to SSIS as well.

Can anyone help me with this one?

Thanks in Advance

Mitesh

Hi Mitesh,

Something to consider: You can migrate a SQL Server 2000 database (DB1) to SQL Server 2005 and continue to run it at a SQL Server 2000 Compatibility Level. The implication is you may be able to continue connecting to DB1 from the DB2 DTS packages as you do now - it will just be on a different instance of SQL Server. I'm not sure what your DB2 DTS packages do exactly, so it's difficult to make that statement with more confidence.

There's also an Execute DTS 2000 Package Task available in SSIS. You can execute your existing DTS packages from SSIS. I've used this (and the aforementioned Compatibility Level) before as part of a migration plan from SQL Server 2000 to SQL Server 2005. It's nice because I don't have to get everything converted before starting to use the new database engine.

Regarding migrating between DTS and SSIS: that's a tough one. I teach ETL with SSIS for Solid Quality Learning. One of the things I share with students is the fact that SSIS is not the new version of DTS - it's a replacement. Microsoft did not start with DTS and modify it - they rewrote the entire application. This has some bearing on how easy it is to migrate from DTS. I've seen very simple DTS packages migrate easily, but that's about it.

There's so much new under the hood, you will most likely want to rebuild - and perhaps redesign - your packages in SSIS to take advantage of some of the stellar performance gains.

Hope this helps,

Andy


Hey Andy,

Thanks for the reply - I will surely consider your suggestion. Just one more thing: when I use the Execute DTS 2000 Package task of SSIS, I will have to change my connection, as my database is migrated to SQL 2005 on another server. Correct me if I am wrong.


Hi Mitesh,

Yes sir - you will have to change your connection to target another database, server, or both.

Hope this helps,

Andy

Monday, March 26, 2012

migrating access to SQL

I am doing a small ASP project using an Access database.
I want to migrate from Access to SQL - what do I need to do?
It is very difficult. I studied SQL very well, but I don't know what software I need to download or how to make the connection.

Quote:

Originally Posted by subashini Thiyagarajan

Hi there,

Make use of the Import and Export Wizard - hope it helps. Good luck & take care.

How can I do that?

What changes should I make in my ASP code?

In order to export to SQL, what software do I need to download?

Please reply.



Hi there,

Not many changes are required as long as you provide the source & destination database names correctly - that's more than enough, I guess. Good luck & take care.

Here is an MS article about doing this, although I'm not sure what version of SQL Server you are using.

HTH,
Aric

Thanks for your reply.
Can I get SQL Server as a free download from the Net - not a trial version?


Hi there,

The answer will be NO & NOTHING BUT NO! Good luck & take care.

It is confirmed "NO".

What did you mean by the source and destination paths needing to be correct?

Which source should I concentrate on for the path?
I downloaded the Express edition - what do I need to do in order to import my Access file? Please give a clear view.

Wednesday, March 21, 2012

migrate msjet to mssql server

Hi, I already have a project in VB6 / MS Jet 3.5 using DAO methods (.Index, .Seek, ...). I want to convert it to SQL - is there a fast way? (Converting .Index/.Seek to SELECT...WHERE clauses is a time-consuming issue.) Please reply, since the package is already running and I am really desperate about the conversion.

many thanks and appreciation
amatouri

I'm afraid you'll have to go through the pain of taking all this DAO s$%t out. And while you're at it, do yourself and those who'll come later a favor - take out .Add/.Edit/.Update as well. Very nasty stuff: while being a very lazy way to code, it yields a very painful and weekendless life for your DBA...
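The pattern being converted - set an index, then .Seek on a key - corresponds one-for-one to an indexed SELECT ... WHERE. A runnable sketch using Python's built-in sqlite3 as a stand-in for the server (table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer TEXT)")
conn.execute("INSERT INTO orders VALUES (1, 'Museum'), (2, 'University')")

# DAO equivalent:
#   rs.Index = "PrimaryKey"
#   rs.Seek "=", 2
# becomes a parameterized lookup on the indexed column:
row = conn.execute(
    "SELECT order_id, customer FROM orders WHERE order_id = ?", (2,)
).fetchone()

print(row)  # (2, 'University')
```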

Monday, March 19, 2012

Migrate from ingres to SqlServer

I was given a project by my supervisor and I must complete it by Tuesday. I must transfer our Ingres database into a SQL Server database. So far I have done all the work with the tables, primary keys, views and data. I don't know how to go on with the procedures and rules that I have to transfer. Has anyone faced the same situation? What should I do?

Maria

I'm not aware of any easy way to do this; the SQL Server migration tools don't cover Ingres (see http://www.microsoft.com/sql/solutions/migration/default.mspx for what they do cover), so it may be that you just have to rewrite it all. Maybe someone else knows more? It's quite a few years since I last used Ingres and things may have changed.

Monday, March 12, 2012

migrate adp projects from SQL 7 to SQL 2000

We currently have an ADP project linked with a SQL Server 7 running NT4.
We need to replace the old server with a Windows Server 2003 box running SQL 2000 SP4.
When I restore the complete DB from the old server to the new server, all objects
have a (dbo) suffix, and when we test the ADP - by running a saved
report - the report fails because it is looking for a table without the
(dbo) suffix.
Any help would be appreciated.

Not sure if I understand the question, but ...
Are these linked tables? Just rename the links and that will work.
For example if the new link is dbo_Customers just rename it to Customers.
Please let us know if this was the issue or something else.
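The rename is mechanical, so it can be scripted rather than done by hand. A sketch of the logic in Python (table names are hypothetical; in Access itself you would loop over the TableDefs collection):

```python
# Compute the renames needed to strip the dbo_ prefix Access adds to linked tables.
linked_tables = ["dbo_Customers", "dbo_Orders", "LocalLookup"]

renames = {
    name: name[len("dbo_"):]        # dbo_Customers -> Customers
    for name in linked_tables
    if name.startswith("dbo_")      # leave local (non-linked) tables alone
}

print(renames)  # {'dbo_Customers': 'Customers', 'dbo_Orders': 'Orders'}
```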
Ben Nevarez, MCDBA, OCP
Database Administrator
Yes, SQL tables are linked to the project. When viewing the tables, views
and stored procedures from the project, they all appear with (dbo)
after the object.
It may be something with SQL 2000 SP4. I restored the exact same .bak
file to a Windows 2000 box with SQL 2000 SP3 and it worked fine - no (dbo)
after every object.
Thanks for your time.



Migrate Access2000 to SQL2000 database

I am working on a new project migrating an Access database and its ASP.NET
application to a SQL 2000 database. Please give me guidance and a strategy for how
to do it from beginning to end.
Thank you very much and Happy New Year.
Kevin,
Microsoft suggests that you use the Upsizing Wizard in Access 2000 to
accomplish this. I have posted a URL for you below that links to a very
detailed white paper explaining how to use this wizard.
I hope this helps.
http://support.microsoft.com/default.aspx?kbid=241743&product=acc2000
That request is beyond the scope of a simple newsgroup question.
Migrating the data shouldn't be too difficult - for best results,
create the schema and database objects in SQL Server and migrate the data
using DTS or the Access Upsizing Wizard. As far as ASP.NET goes, take
a look at the ASP.NET security best-practices whitepaper,
http://www.microsoft.com/downloads/release.asp?ReleaseID=44047. That
will get you started thinking about your overall application and
security architecture. Also visit the ASP.NET and SQLS developer
centers at http://msdn.microsoft.com/asp.net/ and
http://msdn.microsoft.com/sql/, respectively.
--Mary
On Wed, 29 Dec 2004 07:50:25 -0800, "Kevin"
<Kevin@.discussions.microsoft.com> wrote:

>I am working on new project that migration Access database and its ASP.NET
>application to SQL2000 database. Please give me the guide and strategy how to
> do it from beginning to the end.
>Thank you very much and Happy New Year.
Thanks Brian.
Do you know which is better, the Upsizing Wizard or the DTS approach?
Thanks Mary.


Friday, March 9, 2012

Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException

Hi

I have an SSIS project that has one parent package and three child packages. When I run the project on my development machine in debug mode it works fine. Also, if I run the packages using dtexec on my development machine it still works fine. However, the problem comes in when I try to run the project using dtexec on the staging server - I get the following error:

Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.

Does anyone have any idea how to fix this, please?

thanks

G

Probably the data on the staging server is different from the data on the development machine?

You are trying to put data into the buffer that is bigger than the buffer column size allocated for it. E.g., a script component builds some string of length 60 and then tries to put it into a STR column of length 50 - you'll get this exception as a result. You need to adjust the column size to match the expected data, or clean up the input data to avoid the overflow.

You probably don't see it on development machine, because your test data is different and this condition never occurs.

http://msdn2.microsoft.com/en-us/microsoft.sqlserver.dts.pipeline.doesnotfitbufferexception.aspx
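The overflow described above can be modelled outside SSIS. The sketch below is illustrative Python only (SSIS 2005 script components are actually VB.NET, and the `FixedWidthColumn` class is invented for the example): a fixed-width column rejects oversized strings roughly the way `PipelineBuffer.SetString` does, and truncating defensively before assignment avoids the exception.

```python
class FixedWidthColumn:
    """Toy model of a DT_STR pipeline column with a declared maximum length."""

    def __init__(self, name, width):
        self.name = name
        self.width = width
        self.value = None

    def set(self, text):
        # Mirrors the buffer behaviour: reject values wider than the column.
        if len(text) > self.width:
            raise ValueError(
                f"The value is too large to fit in column '{self.name}' "
                f"(length {len(text)} > width {self.width})")
        self.value = text


col = FixedWidthColumn("MerchantNumber", 50)

overflowed = False
try:
    col.set("x" * 60)            # a 60-char string into a 50-char column fails
except ValueError:
    overflowed = True

col.set(("x" * 60)[:col.width])  # defensive truncation before assignment works
```

The same guard (truncate or validate before writing to the output buffer) is what a script component needs when its input can exceed the declared output column width.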

|||Thanks for the response, but I am not sure that is the problem, since I am using the exact same data and the exact same packages on both machines?|||

Hi

After some testing I have found that when I run the child packages in series, as opposed to in parallel, it all works fine. This will do for the meantime, but if you have any idea why this is, could you please let me know.

Thanks for the help.

Gary

|||

Hi again

I have a second package that has the same setup but manipulates a different data set, and I am getting the same error, but this time I can't find a way around it.

The data and the packages are exactly the same on both servers. On the development server the packages run with no problem, but on the staging server I get buffer memory issues.

Anyone know why this might be?

It seems that the column width is too wide for the buffer, so the buffer is overflowing into a second buffer which is being used by the other child package.

Any ideas?

thanks

G

|||

Pipeline Buffers are not shared between child and parent packages.

Could you describe what the packages are doing? What component reports this error? Do both dev and staging machine have the same architecture (32-bit, or 64-bit)?

|||

Hi

In the parent package, data is extracted from multiple flat files and bulk inserted into multiple database tables. Following this, 4 child packages are initiated using the Execute Package task.

Within each child package a data flow is started. In the data flow, information is extracted from a table (populated in the parent package). The data is transformed using a script component, a row count is taken, and then the data is inserted into a second database table.

After some investigation, I have discovered:

-on the development server the entire package works perfectly with no errors

-on the staging server each child package works when it is the only one executed from the parent package.

-on the staging server when I run two child packages it seems the first child package works fine but in the second package to be executed the script component does not get executed (I inserted a MessageBox and ComponentMetaData.FireInformation and there was no output).

This is the error message in the log when running all 4 child packages together:

--

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:12:57 PM,2006/06/29 02:12:57 PM,0,0x,The value is too large to fit in the column data area of the buffer. Record ID - Merchant Number (ODS: QMAKBACCTNBR) : 00000000000000000002063 MerchantNumber (ODS: QMAKBACCTNBR)OnInformation,,,Transform Data Into MerchantStatementsMonthly,,,2006/06/29 02:15:07 PM,2006/06/29 02:15:07 PM,1074016264,0x,Post Execute phase is beginning.

OnInformation,,,ODS To DWH MerchantStatementsMonthly,,,2006/06/29 02:15:07 PM,2006/06/29 02:15:07 PM,1074016264,0x,Post Execute phase is beginning.

OnInformation,,,TrOnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:03 PM,2006/06/29 02:23:03 PM,-1073450910,0x,Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.HandleUserException(Exception e)
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)
OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:03 PM,2006/06/29 02:23:03 PM,-1073450910,0x,Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.HandleUserException(Exception e)
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)
OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:06 PM,2006/06/29 02:23:06 PM,-1073450974,0x,The ProcessInput method on component "Script Component" (10641) failed with error code 0x80131600. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:06 PM,2006/06/29 02:23:06 PM,-1073450974,0x,The ProcessInput method on component "Script Component" (10641) failed with error code 0x80131600. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1071636284,0x,The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1073450975,0x,Thread "WorkThread0" has exited with error code 0x80131600.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1071636284,0x,The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1073450975,0x,Thread "WorkThread0" has exited with error code 0x80131600.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:09 PM,2006/06/29 02:23:09 PM,-1073450952,0x,The PrimeOutput method on component "Q19400D" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:09 PM,2006/06/29 02:23:09 PM,-1073450952,0x,The PrimeOutput method on component "Q19400D" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

-

The development server is 32-bit and the staging server is 64-bit, but they are both running 32-bit operating systems.

Any help would be appreciated.

thanks

G


|||

Hi

We had somewhat the same problems as described above. We are implementing a data warehouse solution using SQL 2005 and a 64-bit Windows 2003 server. We use workflow packages that execute child packages simultaneously. Errors concerning memory and buffers popped up at random in different child packages.

We installed SP1 and the cumulative hotfix package (build 2153). This helped a little but didn't solve all the problems. We have been testing for two days now, and it seems that changing the SSIS package property MaxConcurrentExecutables from -1 to 1 in the workflow package solves the remaining memory and buffer problems. Setting this property prohibits concurrent execution of the child packages.

It seems buffer management on 64-bit servers is still a bit buggy.
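For reference, MaxConcurrentExecutables is an ordinary package property, so the same change can be made directly in the package XML rather than through the designer. The fragment below is a sketch of how the property appears in a pre-2012 .dtsx file (verify the exact element names against your own package file):

```xml
<!-- Sketch of an excerpt from the workflow package's .dtsx file:
     -1 (the default) means "number of logical processors + 2";
     1 forces the Execute Package tasks to run one at a time. -->
<DTS:Property DTS:Name="MaxConcurrentExecutables">1</DTS:Property>
```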

Hope this helps in your case too.

Jan

|||

Hi

Thanks for the help. I narrowed the problem down to the script components in the child packages, so I deleted and recreated them, and it seems to work fine.

Not sure why this is. Anyone know?

|||

Strange.

Did you create them in a beta version?

Do you have a copy of the old package? Maybe check the differences in the XML in the .dtsx file.

-Jamie

|||

hi,

I also have the same problem, and I also recreated my task... but it did not solve my problem.

I'm using a Script Component to do a custom transformation. I added 12 output columns to Output0: 5 of these columns are DT_STR with length 4, 3 are DT_STR length 2, 3 are DT_STR length 12, and 1 is DT_I4.

When executing the package, I get the exception below.

The value is too large to fit in the column data area of the buffer.
at Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.SetString(Int32 columnIndex, String value)
at Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.set_Item(Int32 columnIndex, Object value)
at Microsoft.SqlServer.Dts.Pipeline.ScriptBuffer.set_Item(Int32 ColumnIndex, Object value)
at ScriptComponent_04dc3685b8d744898f09d88716141989.Output0Buffer.set_isjsst(String Value)
at ScriptComponent_04dc3685b8d744898f09d88716141989.ScriptMain.Input0_ProcessInputRow(Input0Buffer Row)
at ScriptComponent_04dc3685b8d744898f09d88716141989.UserComponent.Input0_ProcessInput(Input0Buffer Buffer)
at ScriptComponent_04dc3685b8d744898f09d88716141989.UserComponent.ProcessInput(Int32 InputID, PipelineBuffer Buffer)
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)


I debugged and found the exception comes from the Output0Buffer set value.
I also did some testing:
1. Created a simple package with the same Script Component and found no exception.
2. Removed some of the Output0 columns and found no exception.
It's very confusing...

Any help is very appreciated.

|||I had the same error and exception. I deleted and recreated the script component and the destination component that the output from the script component was going to. It works for me now.|||I was getting the same error. I found the resolution was to make sure the output column created in the design script was set to the same size as the source column.

In my case I was trying to set the destination column to 35, since that is the size of the ultimate destination column in the destination database. Even though I was making sure the string inside the script was truncated to 35 characters, the script component would still throw the error.

I instead created a derived column component immediately following the script component, and used that one to do the truncation to the destination column size...|||In my case it was a custom component which did not update the OutputBuffer field lengths after a change in a source component.

Thanks team for pointing the right direction!

Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException

Hi

I have a SSIS project that has one parent package and three child packages. When I run the project on my development machine in debug mode it works fine. Also if i run the packages using dtexec on my development machine it still works fine. However the problem comes in when I try and run the project using dtexec on the staging server i get the following error:

Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.

does anyone have any idea how to fix this please?

thanks

G

Probably the data on staging server is different from the data on development machine?

You are trying to put data into buffer that is bigger than the buffer column size allocated for this data. E.g. script component builds some string, gets a string of length 60, and then tries to put it into STR column of length 50 - you'll get this exception as a result. You need to adjust the column size to match expected data, or cleanup the input data to avoid the overflow.

You probably don't see it on development machine, because your test data is different and this condition never occurs.

http://msdn2.microsoft.com/en-us/microsoft.sqlserver.dts.pipeline.doesnotfitbufferexception.aspx

|||thanks for the response but I am not sure that that is the problem since I am using the exact same data and exact same packages on both machines?|||

Hi

After some testing I have found that when I run the child packages in series as apposed to in parallel it all works fine. This should do for the mean time but if you have any idea why this is could u please let me know.

Thanks for the help.

Gary

|||

Hi again

I have a second package that has the same setup but manipulates a different data set and I am getting this same error but this time I can't find a way around it.

The data and the packages are exactly the same on both servers.On the development server the packages run with no problem but on the staging server I get buffer memory issues.

Anyone know why this is maybe?

It seems that the column width is too wide for the buffer so the buffer is overflowing into a second buffer which is being used by the other child package.

Any ideas?

thanks

G

|||

Pipeline Buffers are not shared between child and parent packages.

Could you describe what the packages are doing? What component reports this error? Do both dev and staging machine have the same architecture (32-bit, or 64-bit)?

|||

Hi

In the parent package data is extracted from multiple flat files and bulk inserted into multiple database table. Following this 4 child packages are initiated using the execute package task.

within each child package a data flow is started. In the data flow information is extracted from a table (populated in the parent package). The data is transformed using a script component. A row count is taken and then the data is inserted into a second database table.

After some investigation. I have discovered:

-on the development server the entire package works perfectly with no errors

-on the staging server each child package works when it is the only one executed from the parent package.

-on the staging server when I run two child packages it seems the first child package works fine but in the second package to be executed the script component does not get executed (I inserted a MessageBox and ComponentMetaData.FireInformation and there was no output).

This is the error message in the log when running all 4 child packages together:

--

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:12:57 PM,2006/06/29 02:12:57 PM,0,0x,The value is too large to fit in the column data area of the buffer. Record ID - Merchant Number (ODS: QMAKBACCTNBR) : 00000000000000000002063 MerchantNumber (ODS: QMAKBACCTNBR)OnInformation,,,Transform Data Into MerchantStatementsMonthly,,,2006/06/29 02:15:07 PM,2006/06/29 02:15:07 PM,1074016264,0x,Post Execute phase is beginning.

OnInformation,,,ODS To DWH MerchantStatementsMonthly,,,2006/06/29 02:15:07 PM,2006/06/29 02:15:07 PM,1074016264,0x,Post Execute phase is beginning.

OnInformation,,,TrOnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:03 PM,2006/06/29 02:23:03 PM,-1073450910,0x,Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.HandleUserException(Exception e)
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)
OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:03 PM,2006/06/29 02:23:03 PM,-1073450910,0x,Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.HandleUserException(Exception e)
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)
OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:06 PM,2006/06/29 02:23:06 PM,-1073450974,0x,The ProcessInput method on component "Script Component" (10641) failed with error code 0x80131600. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:06 PM,2006/06/29 02:23:06 PM,-1073450974,0x,The ProcessInput method on component "Script Component" (10641) failed with error code 0x80131600. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1071636284,0x,The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1073450975,0x,Thread "WorkThread0" has exited with error code 0x80131600.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1071636284,0x,The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1073450975,0x,Thread "WorkThread0" has exited with error code 0x80131600.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:09 PM,2006/06/29 02:23:09 PM,-1073450952,0x,The PrimeOutput method on component "Q19400D" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:09 PM,2006/06/29 02:23:09 PM,-1073450952,0x,The PrimeOutput method on component "Q19400D" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

-

The development server is 32 bit and the staging server is 64bit but they are both runnning 32 bit operating systems.

Any help would be appreciated.

thanks

G


|||

Hi

We had somewhat the some problems as described above. We are implementing a datawarehouse-solution using sql2005 and an 64bit windows 2003 server. We use workflow packages that executes childpackages simultaniously. Errors concerning memory and buffers popped up at random in different childpackages.

We installed SP1 and the Cumulative hotfix package (build 2153). This helped a little but didn't solve all the problems. We are testing for two days now and it seems that changing the SSISpackage-property MaxConcurrentExecutables from -1 to 1 in the workflow package solves the remaining memory and buffer problems. Setting this property prohibits concurrent execution of childpackages.

It seems buffermanagement on 64bit servers is still a bit buggy.

Hope this helps in your case too.

Jan

|||

Hi

Thanks for the help. I narrowed the problem down to the script components in the child packages so I deleted them and recreated them and it seems to work fine.

No sure why this is. Any one know?

|||

Strange.

Did you create them in a beta version?

Do you have a copy of the old package? Maybe check the differences in the XML in the .dtsx file.

-Jamie

|||

hi,

I also have same problem and also recreate my Task....but not solve my problem

I'm using Script Component Task to do custom transformation.
I add 12 output columns in output0.
5 of these columns is DT_STR and their length is 4, 3 is DT_STR 2, 3 is
DT_STR 12 and 1 is DT_I4.

When executing the package ,I get an exception as below.

The value is too large to fit in the column data area of the buffer.
at Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.SetString(Int32
columnIndex, String value)
at Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.set_Item(Int32
columnIndex, Object value)
at Microsoft.SqlServer.Dts.Pipeline.ScriptBuffer.set_Item(Int32
ColumnIndex, Object value)
at
ScriptComponent_04dc3685b8d744898f09d88716141989.Output0Buffer.set_isjsst(String
Value)
at
ScriptComponent_04dc3685b8d744898f09d88716141989.ScriptMain.Input0_ProcessInputRow(Input0Buffer
Row)
at
ScriptComponent_04dc3685b8d744898f09d88716141989.UserComponent.Input0_ProcessInput(Input0Buffer
Buffer)
at
ScriptComponent_04dc3685b8d744898f09d88716141989.UserComponent.ProcessInput(Int32
InputID, PipelineBuffer Buffer)
at
Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32
inputID, PipelineBuffer buffer)


I debug and find the execption comes form output0Buffer set value.
I alse do some testing,
1.Create a simple package wiht same Script Component Task and find no
exception.
2.Remove some output0 columns and find no exception.
It's very confusing me...

Any help is very appreciated.

|||I had the same error and exception. I deleted and recreated the script component and the destination component that the output from the script component was going to. It works for me now.|||was getting the same error - found the resolution was to make sure the output column created in the design script was set to the same size as the source column .

In my case I was trying to set the desitination column to 35, since that is the size of the ultimate destination column in the destination database. Even though I was making sure the string inside the script was truncating to 35 characters, the design script component would still throw the error.

I instead create a derived column component immediately following the script component, and use that one to do the trucation to the destination column size...|||In my case it was Custom Component which did not update OutputBuffer field lengths after a change in a Source component

Thanks team for pointing the right direction!

Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException

Hi

I have a SSIS project that has one parent package and three child packages. When I run the project on my development machine in debug mode it works fine. Also if i run the packages using dtexec on my development machine it still works fine. However the problem comes in when I try and run the project using dtexec on the staging server i get the following error:

Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.

does anyone have any idea how to fix this please?

thanks

G

Probably the data on staging server is different from the data on development machine?

You are trying to put data into buffer that is bigger than the buffer column size allocated for this data. E.g. script component builds some string, gets a string of length 60, and then tries to put it into STR column of length 50 - you'll get this exception as a result. You need to adjust the column size to match expected data, or cleanup the input data to avoid the overflow.

You probably don't see it on development machine, because your test data is different and this condition never occurs.

http://msdn2.microsoft.com/en-us/microsoft.sqlserver.dts.pipeline.doesnotfitbufferexception.aspx

|||thanks for the response but I am not sure that that is the problem since I am using the exact same data and exact same packages on both machines?|||

Hi

After some testing I have found that when I run the child packages in series as apposed to in parallel it all works fine. This should do for the mean time but if you have any idea why this is could u please let me know.

Thanks for the help.

Gary

|||

Hi again

I have a second package that has the same setup but manipulates a different data set and I am getting this same error but this time I can't find a way around it.

The data and the packages are exactly the same on both servers.On the development server the packages run with no problem but on the staging server I get buffer memory issues.

Anyone know why this is maybe?

It seems that the column width is too wide for the buffer so the buffer is overflowing into a second buffer which is being used by the other child package.

Any ideas?

thanks

G

|||

Pipeline Buffers are not shared between child and parent packages.

Could you describe what the packages are doing? What component reports this error? Do both dev and staging machine have the same architecture (32-bit, or 64-bit)?

|||

Hi

In the parent package data is extracted from multiple flat files and bulk inserted into multiple database table. Following this 4 child packages are initiated using the execute package task.

within each child package a data flow is started. In the data flow information is extracted from a table (populated in the parent package). The data is transformed using a script component. A row count is taken and then the data is inserted into a second database table.

After some investigation. I have discovered:

-on the development server the entire package works perfectly with no errors

-on the staging server each child package works when it is the only one executed from the parent package.

-on the staging server when I run two child packages it seems the first child package works fine but in the second package to be executed the script component does not get executed (I inserted a MessageBox and ComponentMetaData.FireInformation and there was no output).

This is the error message in the log when running all 4 child packages together:

--

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:12:57 PM,2006/06/29 02:12:57 PM,0,0x,The value is too large to fit in the column data area of the buffer. Record ID - Merchant Number (ODS: QMAKBACCTNBR) : 00000000000000000002063 MerchantNumber (ODS: QMAKBACCTNBR)OnInformation,,,Transform Data Into MerchantStatementsMonthly,,,2006/06/29 02:15:07 PM,2006/06/29 02:15:07 PM,1074016264,0x,Post Execute phase is beginning.

OnInformation,,,ODS To DWH MerchantStatementsMonthly,,,2006/06/29 02:15:07 PM,2006/06/29 02:15:07 PM,1074016264,0x,Post Execute phase is beginning.

OnInformation,,,TrOnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:03 PM,2006/06/29 02:23:03 PM,-1073450910,0x,Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.HandleUserException(Exception e)
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)
OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:03 PM,2006/06/29 02:23:03 PM,-1073450910,0x,Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.HandleUserException(Exception e)
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)
OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:06 PM,2006/06/29 02:23:06 PM,-1073450974,0x,The ProcessInput method on component "Script Component" (10641) failed with error code 0x80131600. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:06 PM,2006/06/29 02:23:06 PM,-1073450974,0x,The ProcessInput method on component "Script Component" (10641) failed with error code 0x80131600. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1071636284,0x,The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1073450975,0x,Thread "WorkThread0" has exited with error code 0x80131600.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1071636284,0x,The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1073450975,0x,Thread "WorkThread0" has exited with error code 0x80131600.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:09 PM,2006/06/29 02:23:09 PM,-1073450952,0x,The PrimeOutput method on component "Q19400D" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:09 PM,2006/06/29 02:23:09 PM,-1073450952,0x,The PrimeOutput method on component "Q19400D" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

-

The development server is 32 bit and the staging server is 64bit but they are both runnning 32 bit operating systems.

Any help would be appreciated.

thanks

G


|||

Hi


Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException

Hi

I have an SSIS project that has one parent package and three child packages. When I run the project on my development machine in debug mode it works fine, and if I run the packages using dtexec on my development machine it also works fine. However, the problem comes in when I try to run the project using dtexec on the staging server; I get the following error:

Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.

Does anyone have any idea how to fix this, please?

thanks

G

Probably the data on the staging server is different from the data on the development machine?

You are trying to put data into the buffer that is bigger than the buffer column size allocated for that data. E.g. the script component builds some string, ends up with a string of length 60, and then tries to put it into a STR column of length 50 - you'll get this exception as a result. You need to adjust the column size to match the expected data, or clean up the input data to avoid the overflow.

You probably don't see it on the development machine because your test data is different and this condition never occurs.

http://msdn2.microsoft.com/en-us/microsoft.sqlserver.dts.pipeline.doesnotfitbufferexception.aspx
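To illustrate the failure mode just described (a minimal sketch in plain Python, not the SSIS API - the class and names here are hypothetical): each buffer column has a fixed width decided at design time, and assigning a longer value fails, regardless of what the destination table could hold.

```python
# Hypothetical model of a fixed-width pipeline buffer column (not SSIS code).
class DoesNotFitBufferError(Exception):
    pass

class BufferColumn:
    def __init__(self, width):
        self.width = width  # declared column length, fixed at design time
        self.value = None

    def set_string(self, value):
        # Mimics PipelineBuffer.SetString: the value must fit the declared width.
        if len(value) > self.width:
            raise DoesNotFitBufferError(
                f"value of length {len(value)} does not fit in column of width {self.width}")
        self.value = value

col = BufferColumn(width=50)
col.set_string("x" * 50)        # fits exactly

try:
    col.set_string("y" * 60)    # the 60-into-50 case described above
except DoesNotFitBufferError:
    print("too large for the column data area")
```

The fix, correspondingly, is either to widen the declared column or to shorten the data before assigning it.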

|||Thanks for the response, but I am not sure that is the problem, since I am using the exact same data and the exact same packages on both machines.|||

Hi

After some testing I have found that when I run the child packages in series, as opposed to in parallel, it all works fine. This will do for the meantime, but if you have any idea why this is, could you please let me know.

Thanks for the help.

Gary

|||

Hi again

I have a second package that has the same setup but manipulates a different data set and I am getting this same error but this time I can't find a way around it.

The data and the packages are exactly the same on both servers. On the development server the packages run with no problem, but on the staging server I get buffer memory issues.

Anyone know why this is maybe?

It seems that the column width is too wide for the buffer so the buffer is overflowing into a second buffer which is being used by the other child package.

Any ideas?

thanks

G

|||

Pipeline Buffers are not shared between child and parent packages.

Could you describe what the packages are doing? What component reports this error? Do both dev and staging machine have the same architecture (32-bit, or 64-bit)?

|||

Hi

In the parent package, data is extracted from multiple flat files and bulk inserted into multiple database tables. Following this, 4 child packages are initiated using the Execute Package task.

Within each child package a data flow is started. In the data flow, information is extracted from a table (populated in the parent package), the data is transformed using a script component, a row count is taken, and then the data is inserted into a second database table.

After some investigation, I have discovered:

-on the development server the entire package works perfectly with no errors

-on the staging server each child package works when it is the only one executed from the parent package.

-on the staging server when I run two child packages it seems the first child package works fine but in the second package to be executed the script component does not get executed (I inserted a MessageBox and ComponentMetaData.FireInformation and there was no output).

This is the error message in the log when running all 4 child packages together:

--

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:12:57 PM,2006/06/29 02:12:57 PM,0,0x,The value is too large to fit in the column data area of the buffer. Record ID - Merchant Number (ODS: QMAKBACCTNBR) : 00000000000000000002063 MerchantNumber (ODS: QMAKBACCTNBR)OnInformation,,,Transform Data Into MerchantStatementsMonthly,,,2006/06/29 02:15:07 PM,2006/06/29 02:15:07 PM,1074016264,0x,Post Execute phase is beginning.

OnInformation,,,ODS To DWH MerchantStatementsMonthly,,,2006/06/29 02:15:07 PM,2006/06/29 02:15:07 PM,1074016264,0x,Post Execute phase is beginning.

OnInformation,,,TrOnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:03 PM,2006/06/29 02:23:03 PM,-1073450910,0x,Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.HandleUserException(Exception e)
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)
OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:03 PM,2006/06/29 02:23:03 PM,-1073450910,0x,Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.HandleUserException(Exception e)
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)
OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:06 PM,2006/06/29 02:23:06 PM,-1073450974,0x,The ProcessInput method on component "Script Component" (10641) failed with error code 0x80131600. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:06 PM,2006/06/29 02:23:06 PM,-1073450974,0x,The ProcessInput method on component "Script Component" (10641) failed with error code 0x80131600. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1071636284,0x,The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1073450975,0x,Thread "WorkThread0" has exited with error code 0x80131600.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1071636284,0x,The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:07 PM,2006/06/29 02:23:07 PM,-1073450975,0x,Thread "WorkThread0" has exited with error code 0x80131600.

OnError,,,Transform Data Into MerchantMasterMonthly,,,2006/06/29 02:23:09 PM,2006/06/29 02:23:09 PM,-1073450952,0x,The PrimeOutput method on component "Q19400D" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

OnError,,,ODS To DWH MerchantMasterMonthly,,,2006/06/29 02:23:09 PM,2006/06/29 02:23:09 PM,-1073450952,0x,The PrimeOutput method on component "Q19400D" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

-

The development server is 32-bit and the staging server is 64-bit, but they are both running 32-bit operating systems.

Any help would be appreciated.

thanks

G


|||

Hi

We had somewhat the same problems as described above. We are implementing a data warehouse solution using SQL 2005 and a 64-bit Windows 2003 server. We use workflow packages that execute child packages simultaneously. Errors concerning memory and buffers popped up at random in different child packages.

We installed SP1 and the cumulative hotfix package (build 2153). This helped a little but didn't solve all the problems. We have been testing for two days now, and it seems that changing the SSIS package property MaxConcurrentExecutables from -1 to 1 in the workflow package solves the remaining memory and buffer problems. Setting this property prohibits concurrent execution of the child packages.

It seems buffer management on 64-bit servers is still a bit buggy.
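As a rough analogy (plain Python with threads, not SSIS - the names are hypothetical), MaxConcurrentExecutables = 1 behaves like running the child tasks under a semaphore of size one, so only one executes at a time and they stop competing for the same memory:

```python
# Hypothetical analogy: serializing "child packages" with a semaphore of 1.
import threading

limit = threading.Semaphore(1)   # analogous to MaxConcurrentExecutables = 1
order = []

def child(name):
    with limit:                  # only one child "executes" at a time
        order.append(f"{name} start")
        order.append(f"{name} end")

threads = [threading.Thread(target=child, args=(f"child{i}",)) for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# With the semaphore held across the whole body, each child's start/end
# pair is never interleaved with another child's.
for i in range(0, len(order), 2):
    assert order[i].split()[0] == order[i + 1].split()[0]
```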

Hope this helps in your case too.

Jan

|||

Hi

Thanks for the help. I narrowed the problem down to the script components in the child packages, so I deleted and recreated them and it seems to work fine.

Not sure why this is. Anyone know?

|||

Strange.

Did you create them in a beta version?

Do you have a copy of the old package? Maybe check the differences in the XML in the .dtsx file.

-Jamie

|||

hi,

I also have the same problem, and I also recreated my task... but it did not solve my problem.

I'm using the Script Component to do a custom transformation. I added 12 output columns in Output0: 5 of these columns are DT_STR of length 4, 3 are DT_STR 2, 3 are DT_STR 12, and 1 is DT_I4.

When executing the package, I get an exception as below.

The value is too large to fit in the column data area of the buffer.
at Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.SetString(Int32 columnIndex, String value)
at Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.set_Item(Int32 columnIndex, Object value)
at Microsoft.SqlServer.Dts.Pipeline.ScriptBuffer.set_Item(Int32 ColumnIndex, Object value)
at ScriptComponent_04dc3685b8d744898f09d88716141989.Output0Buffer.set_isjsst(String Value)
at ScriptComponent_04dc3685b8d744898f09d88716141989.ScriptMain.Input0_ProcessInputRow(Input0Buffer Row)
at ScriptComponent_04dc3685b8d744898f09d88716141989.UserComponent.Input0_ProcessInput(Input0Buffer Buffer)
at ScriptComponent_04dc3685b8d744898f09d88716141989.UserComponent.ProcessInput(Int32 InputID, PipelineBuffer Buffer)
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)


I debugged and found that the exception comes from the Output0Buffer set value.
I also did some testing:
1. Created a simple package with the same Script Component and found no exception.
2. Removed some Output0 columns and found no exception.
It's very confusing...

Any help is very appreciated.

|||I had the same error and exception. I deleted and recreated the script component and the destination component that the output from the script component was going to. It works for me now.|||I was getting the same error - I found the resolution was to make sure the output column created in the design script was set to the same size as the source column.

In my case I was trying to set the destination column to 35, since that is the size of the ultimate destination column in the destination database. Even though I was making sure the string inside the script was truncated to 35 characters, the script component would still throw the error.
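A minimal sketch of this pitfall (plain Python, not SSIS code; the widths are hypothetical): what matters is the declared width of the output column, so truncating to the destination table's size still overflows a narrower declared column.

```python
# Hypothetical widths: the script's declared output column vs. the final
# destination column. Truncating to 35 does not help if the declared
# output column is narrower than that.
DECLARED_WIDTH = 30   # width set on the script's output column (hypothetical)
TARGET_WIDTH = 35     # width of the ultimate destination column

def fits(value: str, width: int) -> bool:
    """True if the string fits a fixed-width DT_STR-style column."""
    return len(value) <= width

raw = "A" * 40
assert not fits(raw[:TARGET_WIDTH], DECLARED_WIDTH)  # truncated to 35, still too big for 30
assert fits(raw[:DECLARED_WIDTH], DECLARED_WIDTH)    # truncate to the declared width instead
```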

I instead created a derived column component immediately following the script component, and used that one to do the truncation to the destination column size...|||In my case it was a custom component which did not update the OutputBuffer field lengths after a change in a source component.

Thanks team for pointing the right direction!


Microsoft.ReportingServices.Designer

Hello all,

After reinstalling MSSQL 2005 Express and SP1 several times, when I try to create a project with BIDS I get an error:
Could not load file or assembly
Microsoft.ReportingServices.Designer
Version=9.0.242.0
..
or one of its dependencies. The system cannot find the file.

But it exists!!! What can I do?

Many thx!

I should add that it looks like this file is already on my machine, and with the same version.

I tried to uninstall all SQL Server instances, .NET Framework 2.0 and all Visual Studio versions (even 2003), restarted my machine and reinstalled everything. The order was:

Vstudio for web developers Express

SQLExpress2005 Advanced

SQLExpress 2005 SP1

The installation went fine, but BIDS is causing me problems.

Suggestions?

|||

Finally solved it myself!

The order in which I did it is:

Uninstall all VStudio + SQLServer
Install .NET Framework 2.0
Install SQL Server 2005 Advanced in the default path (C:\)
Install SQL Server 2005 SP1 in the default path (C:\)
Install VStudio 2005 WebDeveloper in the default path (C:\)

Hope this helps to someone

|||

Hi:

Well, I guess the problem is the installation path...

As most of us seem to be running out of space on the C: drive, we tend to change the default installation path...

During installation, if one sets all paths to the same drive, then the problem should disappear...

HTH,

|||

Hi, I have encountered this same error message. I'm using VS 2005 Professional; it is a mature installation with several plugins and additional components, and everything is customised the way I want it.

I cannot accept that I must uninstall VS2005 just to get the BI Dev Studio plugin working. That would represent a total failure of the VS integration, in my opinion. All my SQL 2005 components are installed on C:, so that solution wouldn't be relevant for me.

Is there a response from MS on why this error message happens, and a proper way to resolve it?

Hoping for an informative response :)
tim.

|||

Hmmm. I have the same issue as you. I cannot uninstall my SQL Server as it has production data running on it. I DID go through the painful process of uninstalling and reinstalling VS 2005. It did not change a single thing; I am right back at the same error message.

Hope someone figures this out soon. Nothing on MS sites so far that I can find.

Brian King

|||

I had the same problem. I did my install on the D: drive, but surprisingly it seems Reporting Services was installed on the C: drive.

I solved the problem by copying the contents of C:\Program Files\Microsoft Visual Studio 8\Common7\IDE to the same directory on the D: drive. Seems to work fine.

|||I've had the same problem and re-installed VS2005 from scratch and tried to repair .NET 2.0, all in vain. I can confirm that the above copy solution worked for me. Thanks for the info, Guy.|||The copying worked for me as well, thanks Guy.|||

Microsoft documents its solution here: http://support.microsoft.com/?id=823245, but it didn't work for me. After repairing, I copied
C:\Program Files\Microsoft Visual Studio 8\Common7\IDE to
D:\Program Files\Microsoft Visual Studio 8\Common7\IDE and it worked a treat!

Thanks!!
