
Sunday, March 11, 2012

Blocking Transaction

I have 3 simple packages that get called from a master package. The first package populates the time dimension (stored proc), the second populates the transaction type dimension (stored proc) and the final one populates the date dimension.

I set TransactionOption = Required on the master package; everything else (packages & components) is set to TransactionOption = Supported.

I have been testing the transactions: I forced the time and transaction type dimensions to fail and the package did roll back. I am stuck on the date dimension.

Within the date dimension package I have a stored procedure which populates calendar date attributes into a Wrk_Date table. Then I have a data flow task which reads from that working table, but it can't access it. I tried running the SP_WHO2 command, and the status is SUSPENDED, blocked by session id -2.
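
For anyone hitting the same symptom, here is a minimal T-SQL sketch of how the blocker can be identified (sp_who2 is what was run above; the DMV query is an alternative). A blocking session id of -2 indicates a distributed (DTC) transaction rather than a normal SPID:

-- What was run above: the BlkBy column shows -2 for the suspended session
EXEC sp_who2;

-- Alternative via DMVs: blocking_session_id = -2 means the blocker is a
-- distributed (DTC) transaction, not another user session
SELECT session_id,
       status,
       blocking_session_id,
       wait_type,
       wait_resource
FROM   sys.dm_exec_requests
WHERE  blocking_session_id <> 0;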

I saw someone had a similar problem and I did read Ash's comments. I tried changing the isolation level, but that didn't help.
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=74372&SiteID=1

Any thoughts please?

Thanks

Sutha

Sutha,

I had enormous problems with this and eventually just dropped the use of distributed transactions altogether. In some cases it simply wouldn't work because of the way DTC holds onto resources. Since then I have used checkpoint files and it works just fine.

-Jamie

|||

Jamie

Thanks for the info, but it is annoying. Do you know if this is going to be fixed at any point?

The client is not happy with checkpoints. They want to roll back the whole dimension load. For example, if we have 6 dimensions and 5 of them load fine but the 6th one fails, then they want to roll back the whole lot. What would be the solution in that scenario, please, if we can't use TransactionOption correctly?

Thanks

Sutha

|||

I don't think it's a case of it being fixed - it's just "the way it is". DTC isn't doing anything wrong - it's doing exactly what you've told it to, i.e. maintaining a lock on a table because of a running transaction.

It's a difficult one to broach with the client, I can see that. Maybe you could restructure your package so that the locks don't occur.

-Jamie

|||

Jamie

Thanks. I went through it step by step and identified the reason for the locks. During step 2, if the date table exists in the stage database the package TRUNCATES the table, otherwise it creates the table. The TRUNCATE statement seems to hold its lock and not release it further down the line. I have managed to get the rollback to work after removing the TRUNCATE.
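
For illustration, a rough T-SQL sketch of the step 2 logic described above (table and column names are assumptions). TRUNCATE TABLE takes a schema modification (Sch-M) lock that the distributed transaction holds until it completes, which is what blocks the data flow reading the table later; a plain DELETE is one common substitute because it only takes ordinary row/table locks:

-- Hypothetical version of the step 2 logic in the staging database
IF OBJECT_ID('dbo.Wrk_Date', 'U') IS NULL
BEGIN
    CREATE TABLE dbo.Wrk_Date
    (
        DateKey      INT      NOT NULL,
        CalendarDate DATETIME NOT NULL
    );
END
ELSE
BEGIN
    -- The Sch-M lock taken here is held until the DTC transaction
    -- commits or rolls back, blocking the downstream data flow
    TRUNCATE TABLE dbo.Wrk_Date;
END

-- Possible substitute that avoids the schema modification lock:
-- DELETE FROM dbo.Wrk_Date;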

Thanks

Sutha

Friday, February 24, 2012

Blanks in columns

Quick question, I hope.

Does having a lot of blanks in a column cause errors?

In 2 or 3 packages I get 95% of the data in the error column, yet I can find nothing wrong with the data at all. It either seems perfect or there is no data in the column in question. Usually I would have maybe 800 rows where data would have been inserted, but in the other 40,000 rows the column is blank.

Using the "Retain null values from the source as null values in the destination" doesnt seem to make a differance.

This is the error description

The data value cannot be converted for reasons other than sign mismatch or data overflow.

Anyone know of a solution / reason why this keeps happening?

Thanks

Without seeing the package, no. It would appear that there is an implicit conversion going on somewhere which is failing. You have the error files so it shouldn't be difficult to work out what is happening. Do you know which column is failing?

-Jamie

|||It would also help for us to know what type of source you are using. I would guess that you're using a Flat File Source.

Please give us a little more info about the structure of the package and we'll be in a better position to offer suggestions.

Larry Pope|||

Will do, Thanks for the help

Package Details are as follows

Import data from a flat file source in CSV format with semicolons as the delimiters

Change the dates using a derived column with an expression like
(DT_DBTIMESTAMP)[Column 21]

Then to a SQL Server Destination, where the table definition for that column would read [Column 2] DATETIME.
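
(For context, a minimal sketch of the kind of destination table this implies - the table name and the extra identity column are made up, only the DATETIME column comes from the post:)

-- Hypothetical destination table for the flat file import
CREATE TABLE dbo.FlatFileImport
(
    ImportId   INT IDENTITY(1, 1) NOT NULL,
    [Column 2] DATETIME           NULL   -- receives the DT_DBTIMESTAMP value
);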

I have also thrown in an ErrorDescription package (which is great, thanks Jamie :D) and also a package that tells me which column is failing (which I got from Simon).

From these I can work out which column seems to be failing, although when I switch that column to Ignore errors another date column usually fails.

Is it REALLY bad practice to have all columns set to "Ignore errors"... when the data seems to be fine? When I look it over afterwards in the table I can't find anything wrong with it.

If there were a stray character such as '#' or a random / or \ in the data, surely that would fail just one row and not all the rows.

|||Are you trying to cast nulls to a DT_DBTIMESTAMP? I think there may be problems with this. Jamie might be able to confirm this.

You might want to change your derived column to:

ISNULL([Column 21]) ? NULL(DT_DBTIMESTAMP) : (DT_DBTIMESTAMP)[Column 21]

And be aware that trying to cast invalid dates could also raise errors (for example '0000-00-00').

Have you put a data viewer between the source and the derived column to see how the Flat File Source is interpreting the file?

Larry Pope|||

I've definitely seen cases where an attempt to put a NULL into a DT_DBTIMESTAMP column resulted in a value of "0000-00-00 00:00:00" - which is just useless. This happened from inside a script task when setting the column to 'Nothing', but I guess the same symptoms could occur elsewhere as well!!

-Jamie

|||

Thanks guys, I'll try that.

Just realised as well that one column is in the format MM/DD/YYYY while the other is DD/MM/YYYY.

Should be fun trying to sort them out
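
If it helps, one way to untangle the two formats is to bring both columns in as strings and convert them in T-SQL with explicit styles (a sketch only, using literal values; the real column names would come from the file):

-- Style 101 = MM/DD/YYYY (US), style 103 = DD/MM/YYYY (British/French)
SELECT CONVERT(DATETIME, '02/24/2012', 101) AS UsFormatDate,
       CONVERT(DATETIME, '24/02/2012', 103) AS UkFormatDate;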

|||

Can someone please elaborate on this occurrence. I believe we are experiencing a flavor of this.

We have a package that checks an amount field for values greater than 0.

When the value is greater than 0 then we populate with the processing date, else we place a Null Timestamp value.

We are having failed inserts because our target column is SmallDateTime and somehow the time value of "0000-00-00 00:00:00" is being created.

According to the logic, there should only be a valid date or a null date....no zeros. Does anyone have any information on this issue?

ChargeOffAmount > 0 ? (DT_DBTIMESTAMP)ReportingDate : NULL(DT_DBTIMESTAMP)
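
(For context, SMALLDATETIME only accepts dates from 1900-01-01 through 2079-06-06, so if anything upstream turns the NULL branch into a zero date the insert will fail; a minimal T-SQL illustration, with a made-up table name:)

-- Hypothetical target column
CREATE TABLE #ChargeOffTest (ChargeOffDate SMALLDATETIME NULL);

-- A NULL inserts fine
INSERT INTO #ChargeOffTest (ChargeOffDate) VALUES (NULL);

-- Anything below the SMALLDATETIME minimum of 1900-01-01 (such as a
-- zero date produced upstream) fails with an out-of-range error
INSERT INTO #ChargeOffTest (ChargeOffDate) VALUES ('1899-12-31');

DROP TABLE #ChargeOffTest;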

Thanks in advance...

Jamie Thomson wrote:

I've definitely seen cases where an attempt to put a NULL into a DT_DBTIMESTAMP column resulted in a value of "0000-00-00 00:00:00" - which is just useless. This happened from inside a script task when setting the column to 'Nothing', but I guess the same symptoms could occur elsewhere as well!!

-Jamie

|||

Romeo,

Is there a default value specified in the column definition in the database?

|||

Phil, no, there is no default value specified for this column.

I also failed to mention that these values are not always consistently produced...meaning that we can try to recreate the occurrence and then the date will be generated correctly and load. Makes debugging not so easy and causes me to doubt my sanity.....well, that's never been in doubt, I get opinions on both sides....

Tuesday, February 14, 2012

Bizzarre SSIS Configuration behavior

I have an SSIS package, (actually many packages at this point because I can duplicate this problem each and every time) with an FTP Task and three connection managers (one for FTP, one for a file share, one OLEDB).

The package in its simplest form, grabs a file from an FTP site and transfers it to the file share. When there are no files found on the FTP site, the FTP task fails as expected.

Here is where it gets strange...I turned on a package configuration, with all variables / properties saved to a SQL table; all except for the OLEDB connection which I am leaving hardcoded in the package for the time being.

The problems I am seeing as soon as I activate the package configuration are twofold. The first occurs with the "FileUsageType" property defined for the file share connection. I have it set to "existing folder" in the SSIS tool, and the SQL config table also has a row for this item with a value of 2. I have even tried manually changing this value in the SQL config table, and the result is always the same error when I run the package:

Error: 0xC002F314 at FTP Task, FTP Task: File usage type of connection "Z:\" should be "FolderExists" for operation "Receive".
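
For reference, a sketch of what that row looks like in the default table generated by the SQL Server configuration wizard (the table and column names below are the wizard defaults and may differ in your setup), along with the kind of manual update described above:

-- Inspect the configuration row for the FileUsageType property
SELECT ConfigurationFilter,
       PackagePath,
       ConfiguredValue,
       ConfiguredValueType
FROM   dbo.[SSIS Configurations]
WHERE  PackagePath LIKE '%FileUsageType%';

-- Manual change attempted above: 2 = existing folder ("FolderExists")
UPDATE dbo.[SSIS Configurations]
SET    ConfiguredValue = '2'
WHERE  PackagePath LIKE '%FileUsageType%';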

If I remove this entry totally from the SQL config table and leave that property hardcoded in the package, then I am able to continue execution and that particular error goes away. But then I come to weird problem #2: I stated above that when there are no files in the FTP directory to retrieve, the FTP task fails, as I would expect. However, as soon as I activate the package configuration, even when there are no files at the FTP site, the FTP task doesn't fail anymore; it marks itself as successful instead. This is, naturally, wreaking havoc on the precedence constraints that follow the FTP task. As soon as I deactivate the package configuration, the FTP task behaves normally again.

The problems only seem to happen with configurations stored in SQL. I tested this with an XML file configuration instead and everything behaved as it should. Unfortunately, I need to store these configurations in SQL, not XML.

Does anyone have any ideas on what I need to look at to fix these weird issues?

When you say you have "all variables / properties" saved to a SQL table, what do you mean? Do you really have ALL of them? I'd suggest only storing what you need, if that's the case.|||

Phil,

I have tried it both ways: storing all variables / properties in the SQL table, and storing only the ones I need (i.e. connection strings, file paths, etc.). In both cases the outcome is the same as far as the FTP Task reporting success when it shouldn't. Of course, when I store only what I need in the table, the first problem with the "FileUsageType" property goes away; I can live with keeping that property setting hardcoded in the package. But a task reporting success when it should fail, and thereby ignoring all my precedence constraints, is one I can't so easily work around.

|||

The only thing I can suggest is starting with a configuration-free package that works as desired, and then configuring one property at a time, testing each step of the way. When you can pinpoint "taking this specific step makes it stop working in this specific way", you'll likely see what's going wrong.

If not, this will make it a lot easier for the folks here to help troubleshoot...