Friday, February 24, 2012

Blanks in columns

Quick question, I hope.

Does having a lot of blanks in a column cause errors?

In 2 or 3 packages where I get 95% of the data in the error column, I can find nothing wrong with the data at all. It all either seems perfect or there is no data in the column in question. Usually I would have maybe 800 rows where data would have been inserted, but in the other 40,000 rows the column is blank.

Using the "Retain null values from the source as null values in the destination" doesnt seem to make a differance.

This is the error description

The data value cannot be converted for reasons other than sign mismatch or data overflow.

Anyone know of a solution / reason why this keeps happening?

Thanks

Without seeing the package, no. It would appear that there is an implicit conversion going on somewhere which is failing. You have the error files so it shouldn't be difficult to work out what is happening. Do you know which column is failing?

-Jamie

|||It would also help for us to know what type of source you are using. I would guess that you're using a Flat File Source.

Please give us a little more info about the structure of the package and we'll be in a better position to offer suggestions.

Larry Pope|||

Will do, Thanks for the help

Package Details are as follows

Import data from a flat file source in csv format with semicolons as the delimiters

Convert the dates using a derived column with an expression like
(DT_DBTIMESTAMP)[Column 21]

Then on to a SQL Server Destination, where the table definition for that column would read [Column 2] DATETIME

I have also thrown in an ErrorDescription package (which is great, thanks Jamie :D) and also a package that tells me which column is failing (which I got from Simon).

From these I can work out which column seems to be failing, although when I switch that to Ignore errors another date column usually fails.

Is it REALLY bad practice to have all columns on "Ignore errors"... when the data seems to be fine? When I look it over afterwards in the table I can't find anything wrong with it.

If there was a character such as '#' or a random / or \ surely that would just fail one row and not all the rows.
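One thing I might try, just to rule that out, is a quick Conditional Split or Derived Column check along the lines of the expression below ([Column 21] is only a stand-in name) to see whether any rows actually contain a stray character:

FINDSTRING([Column 21], "#", 1) > 0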

|||Are you trying to cast nulls to a DT_DBTIMESTAMP? I think there may be problems with this. Jamie might be able to confirm this.

You might want to change your derived column to

ISNULL([Column 21]) ? NULL(DT_DBTIMESTAMP) : (DT_DBTIMESTAMP)[Column 21]

And be aware that trying to cast invalid dates could also raise errors (example '0000-00-00').
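
Also bear in mind that a blank field from a Flat File Source can come through as an empty string rather than a NULL (depending on whether the retain-nulls option is actually taking effect), so you may need to guard against both. A rough, untested sketch, reusing your [Column 21] name:

ISNULL([Column 21]) || LTRIM(RTRIM([Column 21])) == "" ? NULL(DT_DBTIMESTAMP) : (DT_DBTIMESTAMP)[Column 21]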

Have you put a data viewer between the source and the derived column to see how the Flat File Source is interpreting the file?

Larry Pope|||

I've definitely seen cases where an attempt to put a NULL into a DT_DBTIMESTAMP column resulted in a value of "0000-00-00 00:00:00" - which is just useless. This happened from inside a script task when setting the column to 'Nothing', but I guess the same symptoms could occur elsewhere as well!!

-Jamie

|||

Thanks guys, I'll try that.

Just realised as well that one column is in the format MM/DD/YYYY while the other is DD/MM/YYYY

Should be fun trying to sort them out
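
One rough idea (untested, and assuming the day and month are always two digits so the slashes sit in fixed positions) is to rebuild the DD/MM/YYYY column into an ISO-style string before the cast, something like:

(DT_DBTIMESTAMP)(SUBSTRING([Column 21],7,4) + "-" + SUBSTRING([Column 21],4,2) + "-" + SUBSTRING([Column 21],1,2))

with [Column 21] just standing in for whichever column holds the DD/MM/YYYY dates; explicit string casts may be needed depending on the column's data type.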

|||

Can someone please elaborate on this occurrence? I believe we are experiencing a flavor of this.

We have a package that checks an amount field for values greater than 0.

When the value is greater than 0 then we populate with the processing date, else we place a Null Timestamp value.

We are having failed inserts because our target column is SmallDateTime and somehow a value of "0000-00-00 00:00:00" is being created.

According to the logic, there should only be a valid date or a null date....no zeros. Does anyone have any information on this issue?

ChargeOffAmount > 0 ? (DT_DBTIMESTAMP)ReportingDate : NULL(DT_DBTIMESTAMP)
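
One variant I might try, purely as a sketch (it assumes ReportingDate can itself arrive NULL, which I haven't confirmed), is to guard the date before the cast:

ChargeOffAmount > 0 && !ISNULL(ReportingDate) ? (DT_DBTIMESTAMP)ReportingDate : NULL(DT_DBTIMESTAMP)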

Thanks in advance...

Jamie Thomson wrote:

I've definitely seen cases where an attempt to put a NULL into a DT_DBTIMESTAMP column resulted in a value of "0000-00-00 00:00:00" - which is just useless. This happened from inside a script task when setting the column to 'Nothing', but I guess the same symptoms could occur elsewhere as well!!

-Jamie

|||

Romeo,

Is there a default value specified in the column definition in the database?

|||

Phil, no, there is no default value specified for this column.

I also failed to mention that these values are not always consistently produced...meaning that we can try to recreate the occurrence and then the date will be generated correctly and load. Makes debugging not so easy and causes me to doubt my sanity.....well, that's never been in doubt, I get opinions on both sides....
