.NET Tutorials, Forums, Interview Questions And Answers

Cannot transfer file using FTP Task

Posted Date: December 04, 2010    Points: 0    Category: Sql Server


I am trying to upload a file using the FTP Task in SSIS 2008. I have to upload the file to 3 FTP servers; 2 of them are OK, but with the last one I get the following error: "200 Type set to I. 200 PORT command successful. 550 /myDirectory/myFile.xml: Access is denied."

The 3 FTP servers have the same configuration (same login/password), and my tasks were set up in the same way...

I tried to copy/paste the file from the local directory to the FTP server and it worked! I really don't understand why the package fails (either in a job or from BIDS).

Thank you for your help (and sorry for the mistakes, English is not my mother tongue...)



More Related Resource Links

Transfer SQL Server Objects Task

I'm trying to use the Transfer SQL Server Objects Task to copy database users and database roles from one database to another. The problem is that some of the users already exist in the destination database. Is there a setting, expression, or error handler that will allow me to copy only the objects that don't already exist? I can ignore the failure, but then I won't know whether it's a real copy failure or just a duplicate. I read the roll-your-own blog referenced in a similar post (http://blogs.msdn.com/b/mattm/archive/2007/04/18/roll-your-own-transfer-sql-server-objects-task.aspx), but I don't know if a property exists for the transfer object that will allow me to indicate that I want to copy only users that aren't already in the destination. Has anyone successfully done this? It seems like it would be a simple task.

Task scheduler and System.IO.File.WriteAllText() not working properly

I have a simple console application with this in Main:

Console.WriteLine(System.String.Format("the current directory is: {0}", System.IO.Directory.GetCurrentDirectory()));
System.IO.File.WriteAllText(@"test.txt", "new contents");
Console.ReadLine();

I compile this in VS 2010 and copy the EXE to a directory named "c:\test". When I run this code on two different OSs via "Task Scheduler" and "Scheduled Task", I get different results. On Windows Server 2003 R2 ("Scheduled Task"), this code creates the file I would expect, c:\test\test.txt, with the proper contents written to it. When I run the same code on Windows Server 2008 R2 (64-bit) via "Task Scheduler", it writes the expected contents to an unexpected location, C:\Windows\SysWOW64\test.txt. In both environments, if I run the EXE from a command line, it writes to c:\test\test.txt as I would expect.

As I understand it, if you don't fully qualify the file name in the first parameter of the WriteAllText method, it will write the contents to a file in the directory of the executable; however, this seems to break down when running in the Windows 2008 Task Scheduler. I know I can fully qualify the file name or put the expected path in a
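A common workaround for this class of problem (a sketch, not taken from this thread) is to anchor the path to the executable's own location rather than relying on whatever working directory the scheduler starts the process with:

```csharp
using System;
using System.IO;

class Program
{
    static void Main()
    {
        // Task Scheduler on Windows Server 2008 may start the process with
        // C:\Windows\SysWOW64 (or System32) as the current directory, so a
        // bare relative path like "test.txt" lands there. Building the path
        // from the executable's base directory avoids that dependency.
        string dir = AppDomain.CurrentDomain.BaseDirectory;
        string path = Path.Combine(dir, "test.txt");
        File.WriteAllText(path, "new contents");
        Console.WriteLine("wrote: " + path);
    }
}
```

Alternatively, the scheduled task's "Start in" folder can be set to c:\test in the task's properties, which fixes the working directory without a code change.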

Web Services Task Editor: The input Web Services Description Language (WSDL) file is not valid

I am trying to prove I can use SSIS to connect to a web service. The WS I am trying to connect to was developed by a vendor and is covered by an NDA, but I was able to reproduce the issue with a public WS. Here are the steps to reproduce the issue:

1. In the Web Services Connection Manager, I entered http://office.microsoft.com/Research/Providers/MoneyCentral.asmx?wsdl in the URL window. I am able to successfully "test" the connection.
2. I pasted the above link into IE and saved the resulting XML as a .wsdl file on my local machine.
3. In the Web Services Task Editor, General tab, I specified the path to the .wsdl file and clicked the "Download WSDL" button. No issues.
4. When I click on "Input" and select "MoneyCentralRemote" from the drop-down for Service, I receive an error message saying "This version of the Web Services Description Language (WSDL) is not supported".

So the questions are: Did I perform the above steps correctly? What WSDL versions are supported in SSIS? How can I tell what WSDL version was used to create the .wsdl I am trying to access? If the WSDL is an unsupported version, is there a work-around to fix the issue?

URL location of the WSDL File for Web Services File Task

I am finding that, in order to have the Web Services Task work successfully, the WSDL file has to be on a local drive of the machine SSIS is executing on. Is this the intended behavior? In my SSIS task I use a URL path to store information extracted from the Web Service. The information is stored on a different server than the one SSIS is running on. This works properly without error, and I have confirmed that SSIS has the appropriate permissions to read/write to that directory on that server. But when I attempt to reference the WSDL file (located in the same URL directory where I am saving the information), I get a web services error: "The Web Services Name is empty. Verify that a valid web service name is available." When I update the Web Service Task attribute to point to the WSDL file located on a local drive, it works correctly. I have confirmed that both WSDL documents are exactly the same. The behavior seems a little strange... so I must be missing something subtle. ...cordell...
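If the task does require a local WSDL, one workaround (sketched here as an assumption, not something confirmed in the thread; the URL is a placeholder) is a small Script Task step that copies the WSDL from the remote location to a local temp file before the Web Service Task runs:

```csharp
using System;
using System.IO;
using System.Net;

class WsdlFetcher
{
    static void Main()
    {
        // Hypothetical URL; substitute the real WSDL location.
        string url = "http://example.com/service.asmx?wsdl";
        string localPath = Path.Combine(Path.GetTempPath(), "service.wsdl");

        using (var client = new WebClient())
        {
            // Copy the remote WSDL to a local file the task can read.
            client.DownloadFile(url, localPath);
        }
        Console.WriteLine("WSDL saved to: " + localPath);
    }
}
```

The Web Service Task's WSDL-file property can then point at the local copy, which sidesteps the remote-path restriction entirely.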

WSDL file for Web Services task

I'm trying to use a Web Service for the first time in SSIS. I have the HTTP connection specified, and the test succeeded. I have put the code below in a text file and renamed the file with a .wsdl extension. I'm getting the error below. What is wrong with my WSDL file? Thanks!

Could not read the Web Services Description Language (WSDL) file. The input WSDL file is not valid. The following error occurred while reading the file. There is an error in XML document (3,3)...

Code snippet:

<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <helloworld xmlns="http://lesl.com/" />
  </soap:Body>
</soap:Envelope>

Populate a Data Flow Task's variables with values from a .dtsConfig file?

I have an SSIS project to import exchange rates from an XML file into SQL. The project works when I have certain values hard-coded into variables, such as the URL of the XML file, proxy username and password, etc. I decided to add a .dtsConfig file and have it update the values of the variables in the data flow task.

One of the SSIS packages in this solution is called ECBDailyRates.dtsx. It contains one control flow, which is a data flow task. The data flow task is a script task which does some XPath on an XML file, and then provides outputs to an OLE DB Destination, which in turn puts the data into a table in SQL. That much worked before I put in the dtsConfig file.

There are a few variables declared inside the data flow task, such as PricesXMLUriDaily (String), ProxyAddress (String), ProxyAddressPort (Int32), etc. These are now being updated from the dtsConfig, which is below. It was my understanding that SSIS would run through the config, update any of the variables as required, and then run my data tasks. However, when SSIS runs my script task, none of the variables have been populated, and so it falls over... as it were :)

<?xml version="1.0"?> <DTSConfiguration> <DTSConfigurationHeading> <DTSConfigurationFileInfo GeneratedBy="...." GeneratedFromPackageName="dim_Institution" GeneratedFromPackageID="

SQL Server 2005 (X64) Maintenance Clean Up Task Using Hours Option for File Age

I'm running into an issue with SQL Server 2005 64-bit and the Maintenance Cleanup task. I have it set up to delete any transaction logs or backups older than 12 hours. It seems there could be some sort of bug because, according to the log files, SQL is running the cleanup for 12 days instead. I've created a workaround by using Execute T-SQL Statement tasks instead, but I was wondering if there's a problem with my version of SQL Server, like an SP or patch that I've overlooked. Here's my version information:

Microsoft SQL Server 2005 - 9.00.1399.06 (X64)  Oct 14 2005 00:35:21  Copyright (c) 1988-2005 Microsoft Corporation  Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2)

Does Transfer SQL Server Objects task transfer objects created in the source AFTER the package has been created?

I created an SSIS package which contains a Transfer SQL Server Objects Task. I configured this task to copy table objects, stored procedures, and object permissions to the destination. Between the time I created my SSIS package and the time it was run, someone created a new table object in the source and changed permissions on a stored proc in the source. My question is this: at the time the SSIS package is created, does SSIS build a list of objects to transfer behind the scenes? I had hoped that it builds the list of specific objects (of the pre-defined types) to transfer at runtime, so that whatever changes were made to the source database would be included at runtime.

Is there any way to get the Transfer SQL Server Objects Task to not throw an error if an object already exists?


I've asked this before but never got an answer. Is there a way to configure the Transfer SQL Server Objects Task so that it will only transfer objects that don't already exist in the destination? Or to skip over objects that already exist?

I do not want to "roll my own". I want to use the task in order to save time.

Calling remote batch file using Execute Process Task



The requirement is to create an Excel file. I cannot do this using an SSIS script task because that requires the Microsoft.Office.Interop DLL, which cannot be installed on the dev/prod server. Hence, using the SQL task, the data is exported to Excel.

But now the Excel file needs to be updated to merge some cells. The approach we are considering is to keep a VBS file with the required code on a UNC path, keep a BAT file that runs the VBS file via the CSCRIPT command in the same location, and call the BAT file from the SSIS package using an Execute Process Task. This approach was tested on a local system and also works on the dev server. But somehow the sample code used to create the Excel file is not creating it at the UNC path.

The VBS code is below (completed so it actually saves a workbook; the original snippet never created or saved one, which would explain why no file appears):

Const xlSaveChanges = 1
Set objExcel = CreateObject("Excel.Application")
objExcel.Visible = False
Set objWorkbook = objExcel.Workbooks.Add()
objExcel.Cells(1, 1).Value = "Test value"
' Save to the UNC path and quit (the file name here is illustrative)
objWorkbook.SaveAs "\\server.com\folder$\Demo\ExcelFile.xls"
objExcel.Quit

The BAT file used to call the above VBS file is given below:

cscript \\server.com\folder$\Demo\Excelfile.vbs

The above UNC path is used in the Execute Process Task, and the package runs fine, but the file is never created.


Huge tempdb and msdb data files after Cleanup Task


I've run the History Cleanup Task as a maintenance job.

After the job ran successfully, I noticed that the data files of msdb and tempdb are over 2 GB in size. I've tried to shrink both databases but didn't gain much. Is it normal for tempdb to be over 3 GB after running the History Cleanup Task, even though that was not the size of tempdb before the task ran?

Best regards

transfer database task error



I have created an SSIS package that does nothing more than loop through all databases and copy the user DBs to another server. However, I keep getting an error after the task has created the database, during its execution of "CREATE ROLE" statements. Here is the error:

Error: The Execute method on the task returned error code 0x80131500 (ERROR : errorCode=-1073548784 description=Executing the query "CREATE ROLE [aspnet_WebEvent_FullAccess] " failed with the following error: "User, group, or role 'aspnet_WebEvent_FullAccess' already exists in the current database.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.  helpFile= helpContext=0 idofInterfaceWithError={8BDFE893-E9D8-4D23-9739-DA807BCDC2AC}). The Execute method must succeed, and indicate the result using an "out" parameter.


Now it appears to me that the Transfer Database task keeps using master as the current database even after it has created the new DB. Why would it do this, when at the source the database role belongs to the user DB?
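One way to sidestep the duplicate-role failure (a sketch under assumptions, not from the thread: the connection string and role name below are illustrative) is to guard role creation with DATABASE_PRINCIPAL_ID, which returns NULL when the principal does not exist, and to run it on a connection whose Database is the target user DB rather than master:

```csharp
using System;
using System.Data.SqlClient;

class CreateRoleIfMissing
{
    static void Main()
    {
        // Hypothetical connection string; Database= ensures the role is
        // created in the user DB, not in master.
        const string connStr =
            "Server=myServer;Database=myUserDb;Integrated Security=true";
        const string sql = @"
            IF DATABASE_PRINCIPAL_ID('aspnet_WebEvent_FullAccess') IS NULL
                CREATE ROLE [aspnet_WebEvent_FullAccess];";

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            // No-op if the role already exists, so reruns are safe.
            cmd.ExecuteNonQuery();
        }
        Console.WriteLine("role ensured");
    }
}
```

The same guarded statement could be run from an Execute SQL Task after the transfer, instead of relying on the Transfer Database task's own CREATE ROLE step.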



Stale security timestamp error encountered in the middle of a large file transfer


I have two WCF services configured as Windows Services for both client and server that perform file transfers for DR replication. I'm using chunked streaming with a wsHttpBinding.

One of the files I am testing with is 58 MByte, and after the transfer has been going for 5 minutes it throws a stale-message exception:

System.ServiceModel.Security.MessageSecurityException: The security timestamp is stale because its expiration time ('2010-10-07T20:36:25.322Z') is in the past. Current time is '2010-10-07T20:44:34.155Z' and allowed clock skew is '00:05:00'.

The problem is not clock skew, because the two servers are within a minute of each other, and the transfer proceeds just fine until it reaches the 5-minute mark. It seems to be comparing the time at the beginning of the transfer to the current time. I've got the receiveTimeout set to 59 minutes.

I've seen posts about creating a custom binding to increase the clockSkew tolerance, but this seems to be more of a timeout issue.
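That custom-binding approach would look roughly like this (a sketch, under the assumption that it is the message-security timestamp that is expiring; the 30-minute tolerance is an arbitrary example, and both client and server bindings need the same setting):

```csharp
using System;
using System.ServiceModel;
using System.ServiceModel.Channels;

static class SkewTolerantBinding
{
    public static Binding Build()
    {
        // Start from the existing wsHttpBinding with message security.
        var ws = new WSHttpBinding(SecurityMode.Message);

        // Rebuild it as a CustomBinding so the security element is editable.
        BindingElementCollection elements = ws.CreateBindingElements();
        var security = elements.Find<SecurityBindingElement>();

        // Allow timestamps up to 30 minutes old (the default is 5 minutes).
        security.LocalClientSettings.MaxClockSkew = TimeSpan.FromMinutes(30);
        security.LocalServiceSettings.MaxClockSkew = TimeSpan.FromMinutes(30);

        var custom = new CustomBinding(elements);
        custom.ReceiveTimeout = TimeSpan.FromMinutes(59);
        custom.SendTimeout = TimeSpan.FromMinutes(59);
        return custom;
    }
}
```

Because the stale-timestamp check is part of message security, another option for a pure file-transfer channel is to stream over a transport-security binding (e.g. basicHttpBinding with TransportWithMessageCredential), which avoids per-message timestamps entirely.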

Any ideas?



// get the stream from the remote server, copy it to a local file stream.
stream1 = client.Get


What is the SSIS Transfer Task?


I have just been using the Import Data wizard in SSMS and when going through the wizard I checked the "Optimize for many tables" checkbox.

The resultant package contains a task called "Transfer Task". I've never heard of this before. It's not listed in the toolbox and it's not documented in BOL.

It does some rather strange things as well.

  • From what I can determine it uses an XML manifest file (stored in c:\documents and settings\<user>\Local settings\temp) to construct a SSIS package on the fly which is then executed by an Execute Package Task.
  • Its SourceDB property is always "smo_Pubs". Its DestinationDB property is always "smo_Pubs_xfred"
  • There's no UI for it

All very peculiar.


What is it?

Where has it come from?

Did it arrive with SP1 or have I just never noticed it before?

Why is it preferable to a package with lots of data-flows?

Why is it not documented?

When should I use it/not use it?


Questions questions questions...





Why does BI "Transfer SQL Server Objects Task" error occur?


I'm using SSIS to copy all tables and their data from server1 to server2. The database names are the same on both the source and destination servers. dbo.MyTable definitely exists in the source, so I don't understand this error message:

 [Transfer SQL Server Objects Task] Error: Execution failed with the following error: "ERROR : errorCode=-1071636471 description=SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80040E37. An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 10.0"  Hresult: 0x80040E37  Description: "Invalid object name 'dbo.MyTable'.".  helpFile=dtsmsg100.rll helpContext=0 idofInterfaceWithError={C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}".

 There's nothing fancy about MyTable:

CREATE TABLE [dbo].[MyTable](
    [MyId] [
