Showing posts with label errors. Show all posts

Tuesday, March 20, 2012

BOL Package example Errors

Hi There

OK, I realize this may not be exactly the correct place to post this, but I seem to get good advice here.

I am trying to follow the SMO Tables DBCC Package Sample in BOL.

I have copied the Microsoft.SqlServer.Smo.dll and Microsoft.SqlServer.SmoEnum.dll to my latest .NET Framework folder, as specified in the BOL example.

The problem is that when I try to run the package I get the following error:

An error occured while compiling the script for the Script Task

Error 30466: Namespace or type specified in the Imports 'Microsoft.SqlServer.Management.Common' cannot be found. Make sure the name space or the type is defined and it doesn't contain other aliases.
Line 9 Column 9 through 45

Imports Microsoft.SqlServer.Management.Common

Now I am guessing it has something to do with the SMO DLLs, but I am not sure what to check, as I have copied them to my .NET Framework folder.

Someone suggested I should go to my .NET Framework 2.0 Configuration and confirm that it has picked up the new SMO DLLs, but since installing the Beta .NET Framework 2.0 from the June CTP, I get the following error when I try to open the configuration:

Snap-in failed to initialize
.NET Framework 2.0 Configuration

I have tried reinstalling the .NET Framework and I still get the same error; I have not found any appropriate solutions on the net.

Are these issues related? Any help would be greatly appreciated.

Thanx
I'm not familiar with that sample. Where are you trying to use the SMO assemblies? In the script task?|||Hi Kirk

That is correct.

If you have installed the sample packages in the default directory, you can open the project from the following path:

C:\Program Files\Microsoft SQL Server\90\Samples\Integration Services\Package Samples\SmoTablesDBCC\SmoTablesDBCC\SmoTablesDBCC.dtproj

If you have not installed the sample packages, I can copy the script and post it here if you like.

Thanx|||Sounds like you haven't moved over the assembly.
I describe how to do that here:
http://sqljunkies.com/WebLog/knight_reign/archive/2005/07/07/16018.aspx

Thanks,|||Hi Kirk

OK, please bear with me, but I have a few questions.

Firstly, here are the assembly imports as they are in the script.

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.SqlServer.Management.Common ***
Imports Microsoft.SqlServer.Management.Smo

The one marked *** is the one with the compilation error.
Now for the example package i copied the Microsoft.SqlServer.SmoEnum.dll and Microsoft.SqlServer.Smo.dll assemblies to %windir%\Microsoft.net\framework\v2.0.xxxxx\ as specified. Now these refer to the import following the one giving me the error.So question number 1 what is the correct assembly for Imports Microsoft.SqlServer.Management.Common? As i can find no Management.Common assembly on the CTP assemblies folder? In other words i am not sure which assembly goes with that import.
Secondly, I cannot find the Project - Add References option in BI Studio from the project menu, or an option like that in the Script Task editor. I have a feeling I am being pretty stupid here, but I cannot find it; maybe it was in another CTP release and has another name in the June release?

Thanx Again Kirk|||Apologies, I assumed that the assemblies were correctly listed. The correct assembly for that namespace is actually: Microsoft.SQLServer.ConnectionInfo.dll
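To summarize the fix as a sketch (all file names and the target folder are the ones quoted in this thread; verify them against your own CTP build): each SMO-related Imports line in the script resolves against a specific DLL that must be visible to the Script Task:

```vb
' Namespace-to-assembly mapping for the Script Task imports (per this thread):
Imports Microsoft.SqlServer.Management.Common ' Microsoft.SqlServer.ConnectionInfo.dll
Imports Microsoft.SqlServer.Management.Smo    ' Microsoft.SqlServer.Smo.dll (+ Microsoft.SqlServer.SmoEnum.dll)
' All three DLLs are copied into %windir%\Microsoft.net\framework\v2.0.xxxxx\
```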
HTH|||Hi Kirk

Thank you, it works now. I wonder why there is no mention of that assembly in the package examples in BOL, as it does not work without it.
Lastly, if you don't mind: why did I not have to add a reference to it in BI, as stipulated in your link? As I mentioned above, I cannot find this option in BI Studio.

Thanx Again|||Sean,
The documentation team are encouraging people to use the "Send Feedback" link in BOL if you've any problems with it. Drop them a line on that and it'll get actioned.

-Jamie|||

Lastly if you dont mind why did i not have to add a reference to it in BI?
Not clear on the question but will venture an answer.
You did add it in the VSA environment, right? The VSA environment is a sub-environment that is agnostic to and ignorant of the fact that it's running inside the BI environment. So, the VSA script can reference the assembly without the BI environment even knowing about it. I think that's the answer to the question...

|||

The information in the Readme for this sample explains some of the issues discussed in this thread.

The Script Task project already contains the necessary references for all the managed namespaces that are also imported by using Imports statements. However, certain DLLs need to be copied to the .NET directory, as described in the Readme, to be "visible" to the Script Task.

You need to make the Project Explorer window visible to view and add References.

After copying the DLLs, you may need to close and reopen the script for the blue squigglies under the imported namespaces to disappear. If that doesn't work, try setting Precompile to False temporarily.

-Doug

|||Hi Kirk

That does clear things up. I found it in the VSA environment; it was a bit unclear in the link, as I was trying to find it in BI. Thanx

Friday, February 24, 2012

Blanks in columns

Quick question, I hope.

Does having a lot of blanks in a column cause errors?

In 2 or 3 packages where I get 95% of the data in the error column, I can find nothing wrong with the data at all; it all either seems perfect, or there is no data in the column in question. Usually I would have maybe 800 rows where data would have been inserted, but in the other 40,000 rows the column is blank.

Using the "Retain null values from the source as null values in the destination" option doesn't seem to make a difference.

This is the error description

The data value cannot be converted for reasons other than sign mismatch or data overflow.

Anyone know of a solution / reason why this keeps happening?

Thanks

Without seeing the package, no. It would appear that there is an implicit conversion going on somewhere which is failing. You have the error files, so it shouldn't be difficult to work out what is happening. Do you know which column is failing?

-Jamie

It would also help for us to know what type of source you are using. I would guess that you're using a Flat File Source.

Please give us a little more info about the structure of the package and we'll be in a better position to offer suggestions.

Larry Pope|||

Will do, Thanks for the help

Package Details are as follows

Import data from a flat file source in csv format with semicolons as the delimiters

Change the dates using a Derived Column with an expression like:
(DT_DBTIMESTAMP)[Column 21]

Then on to a SQL Server Destination, where the table definition for that column reads [Column 2] DATETIME.

I have also thrown in an ErrorDescription package (which is great, thanks Jamie :D) and also a package that tells me which column is failing (which I got from Simon).

From these I can work out which column seems to be failing, although when I switch that column to Ignore errors, another date column usually fails.

Is it REALLY bad practice to have all columns on "Ignore errors"? When I look over the data in the table afterwards it seems to be fine; I can't find anything wrong with it.

If there was a character such as '#' or a random / or \, surely that would just fail one row and not all the rows.

Are you trying to cast NULLs to a DT_DBTIMESTAMP? I think there may be problems with this. Jamie might be able to confirm this.

You might want to change your derived column to:

ISNULL([Column 21]) ? NULL(DT_DBTIMESTAMP) : (DT_DBTIMESTAMP)[Column 21]

And be aware that trying to cast invalid dates could also raise errors (for example, '0000-00-00').

Have you put a data viewer between the source and the derived column to see how the Flat File Source is interpreting the file?

Larry Pope|||
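Since the thread is about blank columns, it is worth noting that a blank field in a flat file often arrives as an empty string rather than a NULL, so an ISNULL() check alone may not catch it. A derived-column sketch that guards both cases (using the example [Column 21] from this thread, assuming it is a string column) might be:

```
ISNULL([Column 21]) || TRIM([Column 21]) == "" ? NULL(DT_DBTIMESTAMP) : (DT_DBTIMESTAMP)[Column 21]
```

If the column is not a string type, the TRIM comparison would not apply and the ISNULL() test alone is the relevant part.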

I've definitely seen cases where an attempt to put a NULL into a DT_DBTIMESTAMP column resulted in a value of "0000-00-00 00:00:00" - which is just useless. This happened from inside a Script Task when setting the column to 'Nothing', but I guess the same symptoms could occur elsewhere as well!

-Jamie

|||

Thanks guys, I'll try that.

Just realised as well that one column is in the format MM/DD/YYYY while the other is DD/MM/YYYY.

Should be fun trying to sort them out
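For what it's worth, a derived-column sketch for normalizing the DD/MM/YYYY column before the cast (assuming it is a zero-padded, '/'-delimited string; [Column 22] is a hypothetical column name) could rearrange the parts into MM/DD/YYYY:

```
(DT_DBTIMESTAMP)(SUBSTRING([Column 22],4,2) + "/" + SUBSTRING([Column 22],1,2) + "/" + SUBSTRING([Column 22],7,4))
```

SSIS SUBSTRING is 1-based, so positions 1-2, 4-5, and 7-10 pick out the day, month, and year respectively.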

|||

Can someone please elaborate on this occurrence. I believe we are experiencing a flavor of this.

We have a package that checks an amount field for values greater than 0.

When the value is greater than 0 then we populate with the processing date, else we place a Null Timestamp value.

We are having failed inserts because our target column is SmallDateTime and somehow the time value of "0000-00-00 00:00:00" is being created.

According to the logic, there should only be a valid date or a null date....no zeros. Does anyone have any information on this issue?

ChargeOffAmount > 0 ? (DT_DBTIMESTAMP)ReportingDate : NULL(DT_DBTIMESTAMP)

Thanks in advance...

Jamie Thomson wrote:

I've definitely seen cases where an attempt to put a NULL into a DT_DBTIMESTAMP column resulted in a value of "0000-00-00 00:00:00" - which is just useless. This happened from inside a Script Task when setting the column to 'Nothing', but I guess the same symptoms could occur elsewhere as well!

-Jamie

|||

Romeo,

Is there a default value specified in the column definition in the database?

|||

Phil, no, there is no default value specified for this column.

I also failed to mention that these values are not consistently produced, meaning that when we try to recreate the occurrence, the date will be generated correctly and load. That makes debugging not so easy and causes me to doubt my sanity... well, that's never been in doubt; I get opinions on both sides.