
Tuesday, March 20, 2012

BOL Package Example Errors

Hi There

OK, I realize this may not be exactly the correct place to post this, but I seem to get good advice here.

I am trying to follow the SMO Tables DBCC Package Sample in BOL.

I have copied the Microsoft.SqlServer.Smo.dll and Microsoft.SqlServer.SmoEnum.dll to my latest .NET Framework folder, as specified in the BOL example.

The problem is that when I try to run the package, I get the following error:

An error occurred while compiling the script for the Script Task

Error 30466: Namespace or type specified in the Imports 'Microsoft.SqlServer.Management.Common' cannot be found. Make sure the namespace or the type is defined and it doesn't contain other aliases.
Line 9 Column 9 through 45

Imports Microsoft.SqlServer.Management.Common

Now I am guessing it has something to do with the SMO DLLs, but I am not sure what to check, as I have copied them to my .NET Framework folder.

Someone suggested I should go to my .NET Framework 2.0 Configuration and confirm that it has picked up the new SMO DLLs, but since installing the beta .NET Framework 2.0 from the June CTP, I get the following error when I try to open the configuration:

Snap-in failed to initialize
.NET Framework 2.0 Configuration

I have tried reinstalling the .NET Framework and I still get the same error; I have not found any appropriate solutions on the net.

Are these issues related? Any help would be greatly appreciated.

Thanx
I'm not familiar with that sample. Where are you trying to use the SMO assemblies? In the script task?|||Hi Kirk

That is correct.

If you have installed the sample packages in the default directory, you can open the project from the following path:

C:\Program Files\Microsoft SQL Server\90\Samples\Integration Services\Package Samples\SmoTablesDBCC\SmoTablesDBCC\SmoTablesDBCC.dtproj

If you have not installed the sample packages, I can copy the script and post it here if you like.

Thanx|||Sounds like you haven't moved over the assembly.
I describe how to do that here:
http://sqljunkies.com/WebLog/knight_reign/archive/2005/07/07/16018.aspx

Thanks,|||Hi Kirk

OK, please bear with me, but I have a few questions.

Firstly here are the assembly imports as they are in the script.

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.SqlServer.Management.Common ***
Imports Microsoft.SqlServer.Management.Smo

The one marked *** is the one with the compilation error.
Now, for the example package I copied the Microsoft.SqlServer.SmoEnum.dll and Microsoft.SqlServer.Smo.dll assemblies to %windir%\Microsoft.net\framework\v2.0.xxxxx\ as specified. Those cover the import following the one giving me the error. So, question number 1: what is the correct assembly for Imports Microsoft.SqlServer.Management.Common? I can find no Management.Common assembly in the CTP assemblies folder; in other words, I am not sure which assembly goes with that import.
Secondly, I cannot find the Project - Add References option in BI Studio, either on the Project menu or anywhere in the Script Task editor. I have a feeling I am being pretty stupid here, but I cannot find it; maybe it was in another CTP release and it has another name in the June release?

Thanx Again Kirk|||Apologies, I assumed that the assemblies were correctly listed. The correct assembly for that namespace is actually Microsoft.SqlServer.ConnectionInfo.dll.
HTH|||Hi Kirk

Thank you, it works now. I wonder why there is no mention of that assembly in the package examples in BOL, as it does not work without it.
Lastly, if you don't mind: why did I not have to add a reference to it in BI, as stipulated in your link? As I mentioned above, I cannot find this option in BI Studio.

Thanx Again|||Sean,
The documentation team are encouraging people to use the "Send Feedback" link in BOL if you've any problems with it. Drop them a line on that and it'll get actioned.

-Jamie|||

Lastly if you dont mind why did i not have to add a reference to it in BI?
Not clear on the question but will venture an answer.
You did add it in the VSA environment, right? The VSA environment is a sub-environment that is agnostic to and ignorant of the fact that it's running inside the BI environment. So, the VSA script can reference the assembly without the BI environment even knowing about it. I think that's the answer to the question...

|||

The information in the Readme for this sample explains some of the issues discussed in this thread.

The Script task project already contains the necessary References for all the managed namespaces that are also imported by using Imports statements. However, certain DLLs need to be copied to the .NET directory, as described in the Readme, to be "visible" to the Script task.

You need to make the Project Explorer window visible to view and add References.

After copying the DLLs, you may need to close and reopen the script for the blue squigglies under the imported namespaces to disappear. If that doesn't work, try setting Precompile to False temporarily.

-Doug

|||Hi Kirk

That does clear things up. I found it in the VSA environment; it was a bit unclear in the link, so I was trying to find it in BI. Thanx

Monday, March 19, 2012

Bogus 'out of disk space' error message

When running SQLEVAL.EXE, I get an error message once the installation wizard has finished 'reading contents of the package'. It reads:

There is not enough space on C:\ to extract this package.

Having checked the requirements (http://www.microsoft.com/sql/editions/developer/sysreqs.mspx), the free disk space on my C:\ is 45.42 GB.

Any ideas?

See: http://blogs.msdn.com/nikop/archive/2005/03/13/394797.aspx


Sunday, March 11, 2012

Blocking Transaction

I have 3 simple packages that get called from a master package. The first package populates the time dimension (stored proc), the second populates the transaction type dimension (stored proc), and the final one populates the date dimension.

I set TransactionOption = Required in the master package; everything else (packages & components) has TransactionOption = Supported.

I have been testing transactions: basically, I made the time and transaction type dimensions fail, and the package did roll back. I am stuck on the date dimension.

Within the date dimension I have a stored procedure which populates calendar date attributes into the Wrk_Date table. Then I have a data flow task which reads from that working table, but it can't access it. I tried running the SP_WHO2 command, and the status is SUSPENDED, blocked by SPID -2.
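(A quick way to confirm what is holding the lock, sketched below; SPID -2 is what SQL Server reports for an orphaned distributed (DTC) transaction, and the database name here is a placeholder:)

-- Show who is blocked and by which SPID (see the BlkBy column).
EXEC sp_who2;

-- Report the oldest active transaction in the database; for a
-- distributed transaction this includes its unit-of-work (UOW) GUID.
DBCC OPENTRAN ('StageDB');

-- If the transaction is truly orphaned, it can be killed by that GUID:
-- KILL 'D5499C66-E398-45CA-BF7B-DC9F62B6C0E4'  -- hypothetical UOW value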

I saw someone had a similar problem and I did read Ash's comments. I tried changing the isolation level; it didn't help.
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=74372&SiteID=1

Any thoughts please?

Thanks

Sutha

Sutha,

I had enormous problems with this and eventually just dropped the use of distributed transactions altogether. In some cases it simply wouldn't work because of the way DTC holds onto resources. Since then I have used checkpoint files and it works just fine.

-Jamie

|||

Jamie

Thanks for the info, but it is annoying. Do you know if this is going to be fixed at any point?

The client is not happy having a checkpoint. They want to roll back the whole dimension load. For example, if we have got 6 dimensions and 5 of them loaded fine and the 6th one failed, then they want to roll back the whole lot. What would be the solution in that scenario, please, if we can't use TransactionOption correctly?

Thanks

Sutha

|||

I don't think it's a case of it being fixed; it's just "the way it is". DTC isn't doing anything wrong; it's doing exactly what you've told it to, i.e. maintaining a lock on a table because of a running transaction.

It's a difficult one to broach with the client, I can see that. Maybe you could restructure your package so that the locks don't occur; one option is sketched below.
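For what it's worth, here is a sketch of one DTC-free restructuring (an assumption, not from the thread: it relies on a single OLE DB connection manager with RetainSameConnection = True so every task shares one session, and the transaction is managed by Execute SQL Tasks instead of TransactionOption):

-- Execute SQL Task at the start of the master package:
BEGIN TRANSACTION;

-- ...the dimension loads all run here, on the same connection...

-- Execute SQL Task on the success path:
COMMIT TRANSACTION;

-- Execute SQL Task wired to the failure path:
ROLLBACK TRANSACTION;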

-Jamie

|||

Jamie

Thanks. I was going through it step by step and identified the reason for the locks. During step 2, if the date table exists in the stage database it TRUNCATEs the table; otherwise it creates the table. The TRUNCATE statement seems to hold the lock and does not release it further down the line. I have managed to get the rollback to work after removing the TRUNCATE.
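(That fits: TRUNCATE TABLE takes a schema-modification (Sch-M) lock that is held until the surrounding transaction commits, so inside a distributed transaction it can block every later reader of the table. If the table still needs emptying, one option is a plain DELETE, which takes ordinary data locks instead. A minimal sketch, reusing the working table named earlier with a placeholder column definition:)

-- Step 2 rewritten to avoid TRUNCATE's Sch-M lock (sketch):
IF OBJECT_ID('dbo.Wrk_Date', 'U') IS NULL
    CREATE TABLE dbo.Wrk_Date (CalendarDate DATETIME NOT NULL);  -- placeholder definition
ELSE
    DELETE FROM dbo.Wrk_Date;  -- fully logged, but no schema lock held until commit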

Thanks

Sutha

Blocking issue in DTS package running on 2005

I'm testing an existing 2000 DTS package in 2005. A Data Transformation Task selects from a view and loads a table that was just truncated. The view definition contains a subselect selecting data off the table:

View: select ... from a, (select ... from b) b2 ON ...

Table to insert into: b

There is blocking going on during the execution of the package. The select * from view (Source) and the insert bulk into the table (Destination) are blocking each other. This does not happen in 2000.

Is there a command that I can put in the view definition's subselect that will allow me to just grab the data that it can without worrying about blocking? Would READCOMMITTED or READUNCOMMITTED work without impact on the view? Other recommendations?

Has anyone ever run into this?

I found that when you uncheck the Table Lock box of the Data Transformation Task, this blocking does not occur. Does anyone know why this is required in 2005 but works fine in 2000 with the box checked?
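(A plausible explanation: the Table Lock option requests a bulk-load table lock on the destination, which is incompatible with the shared locks the view's subselect takes on that same table. If dirty reads are acceptable, a READUNCOMMITTED hint on the subselect is one way to keep the box checked. A sketch, with placeholder names since the real view definition was elided above:)

-- Hypothetical reconstruction of the view, with a hint on the subselect.
-- READUNCOMMITTED permits dirty reads: rows in flight from the bulk
-- insert into b may or may not appear in the result.
CREATE VIEW dbo.v_Load
AS
SELECT a.KeyCol, b2.Amount
FROM dbo.a AS a
JOIN (SELECT KeyCol, Amount
      FROM dbo.b WITH (READUNCOMMITTED)) AS b2
    ON a.KeyCol = b2.KeyCol;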

Saturday, February 25, 2012

BLOB or VARBINARY(MAX) as SSIS source.

Hi all,

Can an SSIS package treat a file stored in a table (VARBINARY(MAX) or BLOB) as a source for migration?

Thanks in advance,

DBAnalyst

Sure. The varbinary(max) SQL 2005 data type maps to the SSIS type DT_IMAGE, which is just the bunch of bytes the file consists of.

So what you have at that point is a byte stream, or byte array.

To use those bytes as a file source, you could write them to a file in one dataflow and read them in a second dataflow.

Another way to use varbinary(max) data as a "file source" would be to use a script or custom source component (it's not that painful) to wrap a StreamReader object around the SQL byte stream. Using this approach, the varbinary(max) never touches down to disk as an intermediate step.
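(Either way, the dataflow's source query just selects the blob column, and SSIS surfaces it as DT_IMAGE automatically. A minimal sketch, with placeholder table and column names:)

-- Placeholder storage table: one row per file.
CREATE TABLE dbo.StoredFiles
(
    FileId   INT IDENTITY(1,1) PRIMARY KEY,
    FileName NVARCHAR(260)  NOT NULL,
    FileData VARBINARY(MAX) NOT NULL   -- surfaces in the dataflow as DT_IMAGE
);

-- Source query for the dataflow; each FileData value is the raw byte
-- stream that the downstream component (or file write) consumes.
SELECT FileId, FileName, FileData
FROM dbo.StoredFiles;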

There are certainly other techniques, but those come to mind.|||Thanks a lot Jaegd.

Tuesday, February 14, 2012

Bizarre SSIS Configuration behavior

I have an SSIS package, (actually many packages at this point because I can duplicate this problem each and every time) with an FTP Task and three connection managers (one for FTP, one for a file share, one OLEDB).

The package in its simplest form, grabs a file from an FTP site and transfers it to the file share. When there are no files found on the FTP site, the FTP task fails as expected.

Here is where it gets strange... I turned on a package configuration, with all variables/properties saved to a SQL table; all except for the OLE DB connection, which I am leaving hardcoded in the package for the time being.

The problems I am seeing as soon as I activate the package configuration are twofold. The first occurs with the property "FileUsageType" defined on the file share connection. I have it set to "existing folder" in the SSIS tool, and the SQL config table also has a row for this item with a value of 2. I have even tried manually changing this value in the SQL config table, and the result is always the same error when I run the package:

Error: 0xC002F314 at FTP Task, FTP Task: File usage type of connection "Z:\" should be "FolderExists" for operation "Receive".

If I remove this entry totally from the SQL config table and leave that property hardcoded in the package, then I am able to continue execution in the package and that particular error goes away. But then I come to weird problem #2: I stated above that when there are no files in the FTP directory to retrieve, the FTP task fails, as I would expect. However, as soon as I activate the package configuration, even when there are no files at the FTP site, the FTP task doesn't fail anymore; it marks itself as successful instead. This is wreaking havoc on the precedence constraints that follow the FTP task. As soon as I deactivate the package configuration, the FTP task behaves normally again.

The problems only seem to happen with configurations stored in SQL. I also tested this with an XML file configuration instead, and everything behaved as it should. Unfortunately, I need to store these configs in SQL, not XML.
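(For reference, when the Package Configurations wizard targets SQL Server it generates a table like the one below; the sample row mirrors the FileUsageType entry described above, though the filter name and exact PackagePath are guesses, since neither was quoted in the post:)

-- Default schema generated by the SQL Server configuration type.
CREATE TABLE [dbo].[SSIS Configurations]
(
    ConfigurationFilter NVARCHAR(255) NOT NULL,
    ConfiguredValue     NVARCHAR(255) NULL,
    PackagePath         NVARCHAR(255) NOT NULL,
    ConfiguredValueType NVARCHAR(20)  NOT NULL
);

-- Hypothetical row for the file share connection's FileUsageType
-- (2 = existing folder, per the post above).
INSERT INTO [dbo].[SSIS Configurations]
VALUES (N'FtpPackageConfig', N'2',
        N'\Package.Connections[Z:\].Properties[FileUsageType]', N'Int32');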

Does anyone have any ideas on what I need to look at to fix these weird issues?

When you say you have "all variables / properties" saved to a SQL table, what do you mean? Do you really have ALL of them? I'd suggest only storing what you need, if that's the case.|||

Phil,

I have tried it both ways: storing all variables/properties in the SQL table, and storing only the ones I need (i.e. connection strings, file paths, etc.). In both cases the outcome is the same as far as the issue with the FTP Task reporting success when it shouldn't. Of course, when I store only what I need in the table, the first problem with the "FileUsageType" property goes away; I can live with keeping that property setting hardcoded in the package. But the success reporting on a task that should fail, ignoring all my precedence constraints, is one I can't so easily work around.

|||

The only thing I can suggest is starting with a configuration-free package that works as desired, and then start configuring one property at a time, testing each step of the way. When you can pinpoint "taking this specific step makes it stop working in this specific way" you'll likely see what's going wrong.

If not, this will make it a lot easier for the folks here to help troubleshoot...