.NET Tutorials, Forums, Interview Questions And Answers

Help resolve an issue for importing data

Posted Date: September 10, 2010    Points: 0    Category: ASP.Net
Hello All, what is the best way to get file information from a directory on a machine and import it into SQL? I basically need the file path and file size for every file in a certain directory. I tried using the command prompt and printing to a text file, but I can't find a way to get JUST the file path and file size. I'm looking into some free software online like "Treesize", but I'm not sure that will do it. I know I can do this programmatically, but I'm curious whether there is a faster way, because I am dealing with large directories. Any suggestions would be appreciated. Thanks!
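For reference, a minimal programmatic sketch of the idea (not the poster's code): enumerate the directory, collect full path and size, and bulk-load them into a staging table with SqlBulkCopy. The directory, table name and connection string below are assumptions for illustration only.

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    class FileInventoryLoader
    {
        static void Main()
        {
            var table = new DataTable();
            table.Columns.Add("FilePath", typeof(string));
            table.Columns.Add("FileSizeBytes", typeof(long));

            // EnumerateFiles (.NET 4.0+) streams results, which helps with very large directories.
            foreach (var file in Directory.EnumerateFiles(@"D:\SomeDirectory", "*", SearchOption.AllDirectories))
            {
                var info = new FileInfo(file);
                table.Rows.Add(info.FullName, info.Length);
            }

            using (var connection = new SqlConnection("Server=.;Database=Staging;Integrated Security=true"))
            {
                connection.Open();
                using (var bulkCopy = new SqlBulkCopy(connection))
                {
                    bulkCopy.DestinationTableName = "dbo.FileInventory"; // assumed staging table
                    bulkCopy.WriteToServer(table);
                }
            }
        }
    }

For truly huge directories, the rows could be written to the server in batches instead of building one DataTable in memory.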


More Related Resource Links

Importing xml data with bcp issue

I am trying to transfer one table's data from one server to another. The table structure on both servers is identical:

CREATE TABLE [dbo].[_CachedQueriesArchive](
    [id] [int] NULL,
    [statement_text] [varchar](max) NULL,
    [execution_count] [bigint] NULL,
    [avg_logical_reads] [bigint] NULL,
    [last_logical_reads] [bigint] NULL,
    [min_logical_reads] [bigint] NULL,
    [max_logical_reads] [bigint] NULL,
    [plan_handle] [varbinary](64) NULL,
    [query_plan] [xml] NULL,
    [cursor_type] [varchar](max) NULL
)

I am using the statement bcp DBNAME.._CachedQueriesArchive out e:\temp\cq.dat -N -T to export the data, and bcp DBNAME.._CachedQueriesArchive in e:\temp\cq.dat -N -T to import it. All records were imported successfully.

The issue is that the column "query_plan" (of "xml" type) is filled in all 22 records in the source DB, but in the target DB it is filled in only 2 (two!) records. The other columns were imported perfectly.

Things I have tried:
- bcp with the -e option: the error file was empty.
- BULK INSERT [_CachedQueriesArchive] FROM 'e:\temp\cq.dat' WITH (DATAFILETYPE='widenative') for the import: same result.
- bcp with the -w option instead of -N: no success.
- Importing the file back on the source server: everything was fine (all 22 records were imported with their "query_plan" data).

The problem occurs only on the target server. The servers have different versions: source 9.0.3257, target 9.0.4053.
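A small diagnostic sketch (not from the original post) that makes the symptom easy to compare on both sides: count the rows with a non-NULL query_plan on each server after the import. The server names and connection strings are placeholders.

    using System;
    using System.Data.SqlClient;

    class QueryPlanCheck
    {
        static int CountNonNullPlans(string connectionString)
        {
            const string sql =
                "SELECT COUNT(*) FROM dbo._CachedQueriesArchive WHERE query_plan IS NOT NULL";
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                connection.Open();
                return (int)command.ExecuteScalar();
            }
        }

        static void Main()
        {
            Console.WriteLine("Source: " + CountNonNullPlans("Server=SourceServer;Database=DBNAME;Integrated Security=true"));
            Console.WriteLine("Target: " + CountNonNullPlans("Server=TargetServer;Database=DBNAME;Integrated Security=true"));
        }
    }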

Data Truncation issue with Enterprise Library Logging WriteLog stored Proc


Hi,

I'm using the Enterprise Library Logging feature for logging. The issue I am facing is that when the log message is too large (more than 65,534 characters), the complete data is not logged in the Formatted Message column, which is of data type ntext.

I am able to insert the complete data if I try inserting it with a SQL INSERT query from SQL Server Management Studio. Do I need to add any attributes to the database listener, or do I need to change the stored procedure?

Is there any way to increase the WriteLog stored procedure parameter size in the EnterpriseLibrary.Logging config file? Please let me know.


Thanks In Advance.
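As a diagnostic, a small check like the sketch below shows the longest value actually stored. An ntext column can hold far more than 65,534 characters, so if the maximum stored length sits exactly at that boundary, the cap is almost certainly on the WriteLog call (parameter size) rather than in the column. The table name dbo.[Log], column FormattedMessage and connection string are assumptions based on the post, not verified against your schema.

    using System;
    using System.Data.SqlClient;

    class FormattedMessageLengthCheck
    {
        static void Main()
        {
            // DATALENGTH returns bytes; ntext stores 2 bytes per character, so a value
            // truncated at 65,534 characters would show up as 131,068 bytes here.
            const string sql = "SELECT MAX(DATALENGTH(FormattedMessage)) FROM dbo.[Log]";

            using (var connection = new SqlConnection("Server=.;Database=Logging;Integrated Security=true"))
            using (var command = new SqlCommand(sql, connection))
            {
                connection.Open();
                Console.WriteLine("Longest stored FormattedMessage: " + command.ExecuteScalar() + " bytes");
            }
        }
    }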

Importing SharePoint List Data into Project Server 2007 Custom Fields

Learn how to use the programmability features of Project Server 2007 and Windows SharePoint Services 3.0 to import SharePoint list data into an enterprise custom field.

Error message when importing data

Good day, I get this error message when trying to import data from xls format. Any suggestions? Thanks, Rob

TITLE: SQL Server Import and Export Wizard
Could not retrieve table list.

ADDITIONAL INFORMATION:
Cannot generate SSPI context
SQL Server Network Interfaces: The Local Security Authority cannot be contacted (Microsoft SQL Server Native Client 10.0)

BUTTONS: OK

Convert Data from TXT file and face a length issue! Need help!

Hi All, I am trying to convert some TXT files into a SQL Server database using SSIS. The problem I am facing right now is that some of the TXT files are fixed-width with a line length of 286 characters, while some other TXT files have 296 characters per line. This happened because someone added one more field before generating those TXT files. Instead of creating two SSIS packages for converting those TXT files, can I find another way to get around this?
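One possible workaround, sketched below (not the only option): normalize the 286-character lines to 296 characters before the flat-file source reads them, so a single SSIS package and file layout can handle both formats. Where the extra 10-character field belongs is an assumption here; the insert position would need to match the real layout.

    using System.IO;

    class FixedWidthNormalizer
    {
        static void Normalize(string inputPath, string outputPath)
        {
            using (var reader = new StreamReader(inputPath))
            using (var writer = new StreamWriter(outputPath))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    if (line.Length == 286)
                    {
                        // Insert 10 blank characters where the new field was added (position assumed).
                        line = line.Insert(0, new string(' ', 10));
                    }
                    writer.WriteLine(line);
                }
            }
        }

        static void Main()
        {
            Normalize(@"C:\input\old_format.txt", @"C:\input\normalized.txt");
        }
    }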

Data Reader Destination issue

Hi, I'm new to BI. I created a package to test the DataReader destination; the dtsx executes well and everything is green, but what happens after that? The question is how to use this DataReader destination. As a .NET programmer I know there is a class named DataReader that can be used to retrieve data in connected mode programmatically, using a connection and a command object, but in the case of the DataReader destination, how do I use it? Or am I confusing the DataReader I know from ADO.NET with this one used in BIDS? Should I add a script task after the DataReader destination, or should I consume this DataReader within my own separate code, in a DLL or an EXE that I create as a .NET project? For instance, two properties look notable to me; they are located within the custom properties: DataReader and UserComponentDataType. The complexity resides in the simplicity
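The intended pattern is the second one: an external .NET application runs the package and reads the destination like any other data reader. The sketch below follows my recollection of the documented DtsClient sample that ships with SSIS; the package path and the destination name ("DataReaderDest") are placeholders, and the exact API should be verified against the SSIS documentation for your version.

    using System;
    using System.Data;
    using Microsoft.SqlServer.Dts.DtsClient;   // reference Microsoft.SqlServer.Dts.DtsClient.dll

    class DataReaderDestinationConsumer
    {
        static void Main()
        {
            using (var connection = new DtsConnection())
            {
                // dtexec-style arguments identifying the package to execute.
                connection.ConnectionString = @"/FILE C:\Packages\MyPackage.dtsx";
                connection.Open();

                using (var command = new DtsCommand(connection))
                {
                    // Name of the DataReader destination component inside the package.
                    command.CommandText = "DataReaderDest";

                    IDataReader reader = command.ExecuteReader(CommandBehavior.Default);
                    while (reader.Read())
                    {
                        Console.WriteLine(reader[0]);
                    }
                }
            }
        }
    }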

Data query task issue

Hi, can someone help me understand how to implement a data mining query in an SSIS project, step by step? Even a small package for testing this task would help. Thanks. The complexity resides in the simplicity

OLE DB Destination issue in ForEach Data Flow

Has anyone encountered the following problem? I have a ForEach container with a single task in it - a Data Flow. The Data Flow uses a Source Script transformation to read each file and writes out to several output streams, each of which is connected to an OLE DB Destination. The ForEach executes for each file (in my example 3 times), but only the data from the last execution is in the DB tables.

This is from my output window:

SSIS package "PIF.dtsx" starting.
Information: 0x4004300A at ETL Audit, DTS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at Load V6 Policy, DTS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at Load V6 Policy, DTS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Load V6 Policy, DTS.Pipeline: Pre-Execute phase is beginning.
Information: 0x4004300C at Load V6 Policy, DTS.Pipeline: Execute phase is beginning.
Information: 0x402090DF at Load V6 Policy, stg_RiskLocation [5449]: The final commit for the data insertion has started.
Information: 0x402090DF at Load V6 Policy, stg_PDMRFeed [5277]: The final commit for the data insertion has started.
Information: 0x402090DF at Load V6 Policy, stg_Policy [5208]: The final commit for the data insertion has started.
Information: 0x402090E0 at Load V6 Policy, stg_RiskLocation [5449]: The final commit for the data insertion has ended.
Information: 0x402090DF at Load V6 Pol

Issue with Data Access Layer DLL

Hi, I am working on a project in which I have made changes to LINQ to SQL code in the data access layer project. Initially the code was using the Single() extension method; I changed it to SingleOrDefault(), because the earlier method was throwing an exception if no records were found in the DB. After making the changes I ran the project, but during debugging VS 2008 showed the "file has changed, do you want to reload the file" dialog. When I selected the file, the debugger was still executing the commented-out code (the Single() method). I tried cleaning the entire solution and rebuilding it, but it did not help. All the projects (Presentation, BL, DAL) are under a single solution. Can anyone please help me solve the issue? Thanks in advance.
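For clarity, a minimal illustration (not the project's actual code) of the behavioral difference the post describes: Single() throws when no element matches, while SingleOrDefault() returns the default value (null for reference and nullable types) so the caller can handle "not found" itself.

    using System;
    using System.Linq;

    class SingleVsSingleOrDefault
    {
        static void Main()
        {
            var ids = new int[] { 1, 2, 3 };

            int? found = ids.Cast<int?>().SingleOrDefault(id => id == 99); // no match: returns null
            Console.WriteLine(found.HasValue ? found.ToString() : "no record found");

            try
            {
                int mustExist = ids.Single(id => id == 99);                // no match: throws
            }
            catch (InvalidOperationException ex)
            {
                Console.WriteLine("Single() threw: " + ex.Message);
            }
        }
    }

As for the debugger still hitting the old code, a common first check in this situation is whether the bin folder of the web project still holds a stale copy of the DAL assembly.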

Issue with data conversion

Hi Guys,

Input data format: 9261.00001e+012
Output required in NVARCHAR.

I am using the expression below in a derived column, but it is still not converting:

(DT_WSTR,50)new_master_plu

Any help much appreciated. Thanks.
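A sketch of the conversion being attempted, done in plain C# (for example inside a script component) rather than a single cast: the source value arrives as a float in scientific notation, so casting it straight to a string keeps the exponent. Parsing it as a double and formatting without an exponent yields the plain digits. The column name is a placeholder.

    using System;
    using System.Globalization;

    class PluConversion
    {
        static void Main()
        {
            string raw = "9261.00001e+012";

            double value = double.Parse(raw, NumberStyles.Float, CultureInfo.InvariantCulture);

            // "F0" forces fixed-point notation with no decimals: 9261000010000000
            string asText = value.ToString("F0", CultureInfo.InvariantCulture);

            Console.WriteLine(asText); // fits comfortably in NVARCHAR(50)
        }
    }

Inside a derived column, the equivalent idea would be to cast to a numeric type before casting to a string, e.g. (DT_WSTR,50)(DT_NUMERIC,25,0)new_master_plu, rather than casting the float directly to DT_WSTR.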

Importing data from recursive XML elements

The requirement is to import data (into SQL Server 2005) from a recursive XML schema. The problem is that the elements are nested recursively: a CASE element can contain another CASE element, and so on, and the depth of the nesting is not known in advance. This data has to be imported and stored in a flattened form. Here is the sample XML:

<ROOT id="1">
  <TIME>2010-02-24 22:28:20</TIME>
  <REGION code="SW">South West</REGION>
  <DOMAIN code="LEGL" seq="1">LEGAL</DOMAIN>
  <YEAR>2009</YEAR>
  <NOTICE id="1">
    <SECTION lo="3" hi="3" seq="680" code="3">SECTION 3</SECTION>
    <CASE id="79524D7A-D55D-11DD-BF1B-36609DFF4B22" level="1" placeholder="N" linkable="N">Level 1 Section 3 case 1.
      <CASE id="79538DA2-D55D-11DD-BF1B-36609DFF4B22" parent_id="79524D7A-D55D-11DD-BF1B-36609DFF4B22" level="2" placeholder="N" linkable="N">Level 2 Section 3 case 1.
        <CASE id="7957BA94-D55D-11DD-BF1B-36609DFF4B22" parent_id="79538DA2-D55D-11DD-BF1B-36609DFF4B22" level="3" placeholder="N" linkable="N">Level 3 Section 3 case 1.
          <CASE id="7954B7EA-D55D-11DD-BF1B-36609DFF4B2
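A minimal flattening sketch using LINQ to XML, with the element and attribute names taken from the sample above: Descendants("CASE") walks every CASE element regardless of nesting depth, so each one becomes a flat row (id, parent_id, level, text) that can then be loaded into SQL Server. The file path is a placeholder.

    using System;
    using System.Linq;
    using System.Xml.Linq;

    class CaseFlattener
    {
        static void Main()
        {
            XDocument doc = XDocument.Load(@"C:\data\notices.xml");

            // Descendants finds every CASE element, however deeply nested.
            var rows = doc.Descendants("CASE")
                .Select(c => new
                {
                    Id       = (string)c.Attribute("id"),
                    ParentId = (string)c.Attribute("parent_id"),   // null for top-level cases
                    Level    = (int)c.Attribute("level"),
                    Text     = c.Nodes().OfType<XText>().Select(t => t.Value.Trim()).FirstOrDefault()
                });

            foreach (var row in rows)
            {
                Console.WriteLine("{0} (parent {1}, level {2}): {3}",
                    row.Id, row.ParentId ?? "-", row.Level, row.Text);
            }
        }
    }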

External Lists and Infopath forms issue with Secondary Data Sources

I've followed this tutorial for creating and customising forms for external data (http://blogs.msdn.com/infopath/archive/2010/03/11/customize-an-external-list-form-in-infopath-2010.aspx). The only difference is that my secondary data source is not an XML file (I've tried a database connection and existing SharePoint lists, both internal and external). When running the design checker I get the message "Additional Data connections not supported" and my form won't publish. Any ideas?

Importing Data from Excel into SQL Server using SSIS: some datetime values appear as NULLs

I created a package in the Business Intelligence studio to import data from an Excel file into SQL Server 2005, using an Excel Source and an OLE DB Destination with a data conversion transformation before the destination. The majority of the data is copied over; however, I am having two issues:

1. In the date field, some of the values appear as NULL in SQL Server.
2. I need to change the format of the date in Excel from dd/mm/yyyy to mm/dd/yyyy before inserting into SQL Server, if possible.

I am not sure of the solution for issue 1, but I attempted using a script task for issue 2 and it did not work. Please advise on the best way to proceed. Thanks.
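A small sketch of the date handling in issue 2 (not the actual package code): parse the Excel text explicitly as dd/MM/yyyy and let the resulting DateTime flow to SQL Server as a real datetime instead of a reformatted string. Values that fail to parse come out as NULL, which is also a plausible explanation for issue 1 if some cells hold text the Excel driver cannot interpret as a date.

    using System;
    using System.Globalization;

    class ExcelDateFix
    {
        static DateTime? ParseExcelDate(string raw)
        {
            DateTime parsed;
            bool ok = DateTime.TryParseExact(
                raw,
                "dd/MM/yyyy",
                CultureInfo.InvariantCulture,
                DateTimeStyles.None,
                out parsed);
            return ok ? (DateTime?)parsed : null;
        }

        static void Main()
        {
            Console.WriteLine(ParseExcelDate("25/12/2009"));                       // parses as 25 December 2009
            Console.WriteLine(ParseExcelDate("not a date") == null ? "NULL" : ""); // unparseable input becomes NULL
        }
    }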

Help to resolve an error "The conversion of a char data type to a datetime data type resulted in an

Why am I getting an error when executing this:

select convert(datetime, '2010-09-12T18:11:48', 120)

The message is:

Msg 242, Level 16, State 3, Line 1
The conversion of a char data type to a datetime data type resulted in an out-of-range datetime value.

I don’t get the error if I remove “T” from the string, but I need it to work with “T”.
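For reference, style 120 expects a space between the date and time, while style 126 is the ISO 8601 style that expects the "T" separator. A quick check from .NET (a sketch; the connection string is a placeholder):

    using System;
    using System.Data.SqlClient;

    class ConvertStyleCheck
    {
        static void Main()
        {
            using (var connection = new SqlConnection("Server=.;Integrated Security=true"))
            using (var command = new SqlCommand(
                "SELECT CONVERT(datetime, '2010-09-12T18:11:48', 126)", connection))
            {
                connection.Open();
                Console.WriteLine(command.ExecuteScalar()); // 2010-09-12 18:11:48
            }
        }
    }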

Ajax issue: delay in getting data from web service using innerHTML, please guide

I am working on an Ajax application which will display about a million records in an HTML table. The web service returns records from the server; I build a long string by concatenating the data and tags, and then put this string into the page using innerHTML (not using the DOM, hoping for better performance). For testing I have put 6,000 records in the database (the stored procedure takes about 4 seconds to complete). While testing on my local system (database and application on the same machine), it took about 5 minutes to display the records in the page. After deploying on the web server it did not respond even after a longer wait. It looks like very poor performance. I put the records in a CSV file and its size was less than 2 MB. I can't understand why the string concatenations to build the HTML table and setting innerHTML take such a huge amount of time (if that is the issue). The requirement is to show about a million records in a web page, but the performance on just 6,000 records is disappointing. I am not sure what to do to increase performance. Kindly guide me and help me.

SharePoint Data Source for InfoPath Issue

I've set up an InfoPath form which uses web services to pull an XML data source into the form. The form works great internally, but when users on the extranet try to use it they get a "Cannot Access Data Source" error. I have tried connecting to the web service via the extranet and it works just fine, but as a data source in the InfoPath form, extranet users cannot access it. I've tried using both the intranet and extranet paths to the web service, with no luck. Any ideas?

Data form web part issue with pagination

I have a data form web part which I put inside a user control, and I have an issue with the default pagination: clicking the previous button causes a "Webpage is expired" error. I've tried copying the code from the next button but it doesn't work, and I've also tried caching my pages, still nothing. I read somewhere here that someone was able to implement the previous-button pagination from code-behind, but he still hasn't replied to my question. Does anyone else know how to implement this from code-behind? I really need help with this.