.NET Tutorials, Forums, Interview Questions And Answers
reading XML files and inserting to SQL

Posted Date: May 22, 2011    Points: 0    Category: ASP.Net

Hi there,

I was wondering if someone might be able to help me find the most efficient way to read XML files and insert/update the data in SQL tables.

This is the XML file format:


<?xml version="1.0" encoding="utf-8"?>
<table name="Machines">
  <table name="Mac_1116">
    <time>01/01/2011 00:00</time>
    <time>01/01/2011 01:00</time>
  </table>
  <table name="Mac_1357">...</table>
  <table name="Mac_1358">...</table>
</table>


The XML delivers two sets of data:

1. Information about machines: the nodes inside

<table name="Machines">...</table>

This data needs to update a "Machines" table in the database.

2. Machine raw data: each machine has a node

<table name="Mac_[id]">...</table>
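A minimal sketch of one way to do this, assuming the layout of the sample above (with the nested `<table>` elements closed for well-formedness). The actual insert/update is left as a commented hint because it needs a live connection string; the table and column names in that comment are examples only.

```csharp
using System;
using System.Linq;
using System.Xml.Linq;

// Parse the machine tables and their <time> readings from the sample layout.
var xml = @"<?xml version=""1.0"" encoding=""utf-8""?>
<table name=""Machines"">
  <table name=""Mac_1116"">
    <time>01/01/2011 00:00</time>
    <time>01/01/2011 01:00</time>
  </table>
  <table name=""Mac_1357"" />
  <table name=""Mac_1358"" />
</table>";

var doc = XDocument.Parse(xml);
var machines = doc.Root.Elements("table")
    .Select(m => new
    {
        Name = (string)m.Attribute("name"),
        Times = m.Elements("time").Select(t => t.Value).ToList()
    })
    .ToList();

foreach (var machine in machines)
{
    Console.WriteLine($"{machine.Name}: {machine.Times.Count} readings");
    // Hypothetical upsert, e.g. with SqlClient (names are examples):
    // using var cmd = new SqlCommand(
    //     "UPDATE Machines SET ... WHERE Name = @name", connection);
    // cmd.Parameters.AddWithValue("@name", machine.Name);
    // cmd.ExecuteNonQuery();
}
```

For very large feeds, XmlReader streams the file instead of loading it all into memory, and SqlBulkCopy is the usual route for bulk inserts.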


More Related Resource Links

reading values from config files in NUnit tests


One of my NUnit tests has to read in some values from config files. In my main application this process works perfectly well; however, when I run the unit test, the code that reads in the values from the config files doesn't read anything in. I've tried putting app.config in my unit test project (I even tried web.config) but nothing seems to work. Are there any special steps involved when reading from config files in an NUnit test?
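Typically the NUnit runner does not load your application's app.config; it looks for a config file named after the test assembly (or after the .nunit project, depending on how the tests are launched), sitting next to it in the build output. Assuming a test assembly called MyTests.dll (a hypothetical name), the file would be MyTests.dll.config:

```xml
<?xml version="1.0"?>
<!-- Save as MyTests.dll.config next to MyTests.dll; names are examples. -->
<configuration>
  <appSettings>
    <add key="ServerUrl" value="http://localhost/test" />
  </appSettings>
</configuration>
```

The test code then reads the value the usual way, e.g. ConfigurationManager.AppSettings["ServerUrl"]. Note that Visual Studio normally copies a class library's App.config to this name on build, so check the output folder to see which file the runner actually picks up.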

Reading data from .doc or .docx and inserting into db



A vendor is providing content which needs to be inserted into the db. The content is basically questions with options and explanations. An example is below.


Reading Excel files from 64-bit ASP.Net app

I have an ASP.Net app that is running on a 64-bit server. Part of that app reads data from Excel files and loads that data into our SQL Server database. I am using the ACE OLE driver to read the Excel files and it works great on my 32-bit development machine. When we deploy the app (from a 64-bit client machine) to our 64-bit server, I get this exception when trying to open the connection:

"System.InvalidOperationException: The 'Microsoft.ACE.OLEDB.12.0' provider is not registered on the local machine."

There are many posts about addressing this issue, so I think these are my options:

1. Compile the app for any cpu from a 32-bit machine and deploy it to the server (making the app 32-bit) - not desirable, as ideally we would like to run the app in 64-bit mode.
2. Convert the Excel file to CSV, then use more basic .NET libraries to get the data out.
3. Install Office on the server and use the Microsoft.Office.Interop.Excel library to access the data - not sure if this will work, though.
4. Purchase a conversion library.
5. Wait for the 64-bit version of Office and use the new Microsoft.ACE.OLEDB.14.0 driver. - Can I get a Beta version now?

I am looking for confirmation that my options are accurate/complete and guidance on which of these (or another option) is the most viable.

Thanks, Mike

Reading and checking .CSV File and Inserting Data to SQL server



I have a page with a file upload control where I have to upload/import a CSV file. What I want to do is check that the correct file format is uploaded; for instance, if anything other than a CSV file is uploaded, the system should give an error message. I also need to check certain fields of the CSV file: there are some mandatory fields, such as name and postcode, which must be present. How can I check that these fields are not empty? After performing these checks, the system should automatically upload the CSV file into SQL Server 2008. Any ideas or tutorials will be highly appreciated.
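A minimal sketch of the checks described above, run against an in-memory CSV string. The column names and the required-field list ("name", "postcode") are assumptions taken from the question; adapt them to the real file layout.

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Validate extension and mandatory columns; returns a list of error messages.
static List<string> ValidateCsv(string fileName, string csvText)
{
    var errors = new List<string>();

    // 1. Reject anything that is not a .csv file.
    if (!Path.GetExtension(fileName).Equals(".csv", StringComparison.OrdinalIgnoreCase))
    {
        errors.Add("Only .csv files are accepted.");
        return errors;
    }

    var lines = csvText.Split(new[] { '\r', '\n' }, StringSplitOptions.RemoveEmptyEntries);
    var headers = lines[0].Split(',');
    var required = new[] { "name", "postcode" };   // mandatory columns (assumed)

    foreach (var col in required)
    {
        int idx = Array.IndexOf(headers, col);
        if (idx < 0) { errors.Add($"Missing column: {col}"); continue; }

        // 2. Every data row must have a non-empty value in the column.
        for (int row = 1; row < lines.Length; row++)
        {
            var fields = lines[row].Split(',');
            if (idx >= fields.Length || string.IsNullOrWhiteSpace(fields[idx]))
                errors.Add($"Row {row}: '{col}' is empty.");
        }
    }
    return errors;
    // If errors is empty, bulk-load the rows, e.g. with SqlBulkCopy.
}

var ok = ValidateCsv("people.csv", "name,postcode\nAlice,AB1 2CD\nBob,EF3 4GH");
var bad = ValidateCsv("people.csv", "name,postcode\nAlice,\n");
Console.WriteLine($"ok: {ok.Count} errors, bad: {bad.Count} errors");
```

In the actual page, fileName would come from FileUpload.FileName and csvText from the uploaded stream; a full solution would also handle quoted fields containing commas.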

SQL Server 2008 r2 - Setup support files - Install error / reading from file


I'm installing SQL Server 2008 R2 on a Win2003 R2 x86 system.

After I start the setup, the "Setup Support Files" box begins installing; after a while I get this error:

"Error reading from file f:\1033_ENU_LP\x86\setup\sqlsupport_msi\PFiles\SqlServr\100\Setup\ewbfphdh\x86\Patch\rs2000.msp verify that the file exists and that you can access it."

I've tried another DVD, I've tried a mounted share on another machine... no go... I've tried an ISO file mounted on this server... same error.

Is there any way I can install these "Setup Support Files" from another location? (download)

Is there a switch so it won't install Reporting Services? (I think this file, rs2000.msp, is from Reporting Services.)





Reading large binary files



I currently have a WPF app that controls a printer. The app reads very large binary files (bitmap images, larger than 1 GB) and does some bit manipulation on them.

I am currently using a FileStream to read 100 MB worth of bytes into a buffer (using the FileStream.Read method). After that I start my method that does the bit manipulation on the image file and sends it out to the interface on a printer. When the buffer nears the end of the 100 MB length, say when there are a few hundred bytes left to read in the buffer, I call the FileStream.Read method again to read the next 100 MB worth of data from the binary file. I specify the offset into the binary file by using the FileStream.Position property.

The problem I am facing is that when it finishes reading the 100 MB buffer and goes to read the next 100 MB worth of data, the printer pauses for the amount of time it takes to read the next 100 MB, and after that there is a pause for every line it prints in the image. It is like the .NET Framework loses sync because I have to read the next 100 MB, and it never recovers. The pause is about 2-3 seconds.

Everything is working great: the bit manipulation is correct, it reads the file correctly, and the printer prints the file the way it is supposed to, except I want to avoid the pause.

Any suggestions... Can I use a back
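One common way to hide the read pause (an assumption about what the truncated question is heading toward) is double buffering: a background task reads the next chunk while the current one is processed. A sketch, with the chunk size shrunk so the demo runs on a small temp file:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

// Double buffering: while one chunk is processed, a background task reads
// the next one, so the consumer (the printer) never waits on file I/O.
const int ChunkSize = 16;                        // use ~100 MB in the real app

var path = Path.GetTempFileName();
File.WriteAllBytes(path, new byte[40]);          // stand-in for the image file

long totalProcessed = 0;
using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read))
{
    var front = new byte[ChunkSize];
    var back = new byte[ChunkSize];

    int frontLen = fs.Read(front, 0, ChunkSize);
    while (frontLen > 0)
    {
        // Kick off the next read before touching the current chunk.
        Task<int> nextRead = fs.ReadAsync(back, 0, ChunkSize);

        totalProcessed += frontLen;              // bit manipulation goes here

        int backLen = nextRead.Result;           // waits only if I/O is slower
        (front, back) = (back, front);           // swap buffers
        frontLen = backLen;
    }
}
File.Delete(path);
Console.WriteLine($"processed {totalProcessed} bytes");
```

The swap keeps both buffers allocated once, which also avoids large-object-heap churn between chunks.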

The reading of Visual FoxPro files (.dbf) does not work if the server has more than 1 GB of RAM

I am reading .dbf files using OPENROWSET, but it does not work if the files are bigger than 250 MB. I have been looking for information and doing tests. In the end I mounted my whole development environment in a virtual machine and discovered something really strange: if the RAM assigned to the virtual machine is 1 GB, the OPENROWSET query works fine no matter how big the .dbf file is (I tested with 1.5 GB files and it works fine). So my question is whether some special configuration is necessary for SQL Server 2005 to work on a server with 4 processors and 4 GB of RAM. The error message that appears when the server memory exceeds 1 GB is the following:

Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "MSDASQL" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7320, Level 16, State 2, Line 1
Cannot execute the query "SELECT d2,d3,d4,d5,d6,d7,d8,d9,d10,d10_2,d11,d12,d17,d18,d18_2,d18_3,d23,d13 FROM tadat_gigante.dbf" against OLE DB provider "MSDASQL" for linked server "(null)".

The query is the following:

    'Driver=Microsoft Visual FoxPro Driver; SourceDB=c:\; So

reading csv files


Please help me with reading a CSV file.


The columns are: name, division, address, Phone

I want to read data from the CSV file and then access it line- and column-wise.

Please direct me to code or a guide for this.

values = inputLine.Split(';');
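A minimal sketch of line-and-column access for the header shown above. Note the sample header is comma-separated; the Split(';') in the snippet would be for semicolon-delimited files. The data rows here are made-up examples.

```csharp
using System;

// In-memory stand-in for the file; in the real app use File.ReadLines(path).
var csv = "name,division,address,Phone\n" +
          "Alice,Sales,12 High St,555-0100\n" +
          "Bob,IT,9 Low Rd,555-0101";

var lines = csv.Split('\n');
var headers = lines[0].Split(',');      // column names from the first line

// Access row 1 (first data line) by column name: "division".
int col = Array.IndexOf(headers, "division");
string value = lines[1].Split(',')[col];
Console.WriteLine(value);               // prints "Sales"
```

This naive Split approach breaks if a field itself contains a comma; for real files consider the TextFieldParser class or a CSV library.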

reading input files


I am a new VB user. I have written a program in Visual Basic (Visual Studio 2005). Two of my input files are really big; they describe my network characteristics, so they do not change between runs. The other input files change from one run to another. Reading the two network inputs every time I run takes a lot of time. How can I read the big input files once, and on later runs read only the other files, which have changed?

I appreciate it if you could help me.
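Since a process cannot keep data in memory between runs, one approach (a sketch under that assumption, in C# rather than VB) is to parse the big text file once and save the result to a compact binary cache that later runs load much faster. The file names are examples, and the parse step is simplified to "one integer per line":

```csharp
using System;
using System.IO;

// Load the network data, preferring a binary cache when it is up to date.
static int[] LoadNetwork(string textPath, string cachePath)
{
    // Reuse the cache only if it is newer than the source file.
    if (File.Exists(cachePath) &&
        File.GetLastWriteTimeUtc(cachePath) >= File.GetLastWriteTimeUtc(textPath))
    {
        using var br = new BinaryReader(File.OpenRead(cachePath));
        var cached = new int[br.ReadInt32()];
        for (int i = 0; i < cached.Length; i++) cached[i] = br.ReadInt32();
        return cached;
    }

    // Slow path: parse the text file, then write the cache for next time.
    var lines = File.ReadAllLines(textPath);
    var data = Array.ConvertAll(lines, int.Parse);
    using var bw = new BinaryWriter(File.Create(cachePath));
    bw.Write(data.Length);
    foreach (var v in data) bw.Write(v);
    return data;
}

var src = Path.GetTempFileName();
var cache = src + ".bin";
File.WriteAllLines(src, new[] { "1", "2", "3" });
var first = LoadNetwork(src, cache);    // parses the text file, writes cache
var second = LoadNetwork(src, cache);   // served from the binary cache
Console.WriteLine($"{first.Length} values loaded");
```

The same pattern works from VB 2005 with BinaryReader/BinaryWriter; only the parse and the record layout need adapting to the real network format.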

Reading and Writing Files in SQL Server using T-SQL

SQL Server has never been short of ways to read from and write to files and it is always better to use the standard techniques provided by SQL Server where possible. However, most of them are really designed for reading and writing tabular data and aren't always trouble-free when used with large strings or relatively unstructured data.

For reading tabular data from a file, whether character-delimited or binary, there is nothing that replaces the hoary old Bulk Copy Program (BCP), which underlies more esoteric methods such as Bulk Insert. It is possible to read text-based delimited files with ODBC, simple files can be read and written-to using xp_cmdshell, and you will find that OSQL is wonderful for writing results to file, but occasionally I've found I need to do more than this.

Thankfully, when armed with OLE Automation and the FileSystem Object (FSO), all sorts of things are possible. The FileSystem Object was introduced into Windows to provide a single common file-system COM interface for scripting languages. It provides a number of handy services that can be accessed from T-SQL. In this article, I provide examples of stored procedures that use this interface to allow you to:
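As a taste of the pattern the article describes (this sketch is not taken from the article itself), here is the OLE Automation route to writing a text file from T-SQL; the path is an example, and 'Ole Automation Procedures' must first be enabled via sp_configure:

```sql
-- Write a line to a file through the FileSystemObject COM interface.
DECLARE @fso INT, @file INT, @hr INT;

EXEC @hr = sp_OACreate 'Scripting.FileSystemObject', @fso OUTPUT;
EXEC @hr = sp_OAMethod @fso, 'CreateTextFile', @file OUTPUT,
           'C:\Temp\demo.txt', 1;            -- example path; 1 = overwrite
EXEC @hr = sp_OAMethod @file, 'WriteLine', NULL, 'Hello from T-SQL';
EXEC @hr = sp_OAMethod @file, 'Close';

EXEC sp_OADestroy @file;
EXEC sp_OADestroy @fso;
```

In production code, each @hr should be checked (e.g. with sp_OAGetErrorInfo) before the next call.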

Clean Web.Config Files (VS 2010 and .NET 4.0 Series)

.NET 4 includes a new version of the CLR, and a new .NET 4 specific machine.config file (which is installed side-by-side with the one used by .NET 2, .NET 3 and .NET 3.5).

The new .NET 4 machine.config file now automatically registers all of the ASP.NET tag sections, handlers and modules that we've added over the years, including the functionality for:

- ASP.NET Dynamic Data
- ASP.NET Routing (which can now be used for both ASP.NET WebForms and ASP.NET MVC)
- ASP.NET Chart Control (which now ships built into ASP.NET v4)
What this means is that when you create a new "Empty ASP.NET application" project in VS 2010, you'll find that the new default application-level web.config file is now clean and simple:
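For reference, the default web.config for an empty ASP.NET 4.0 project is essentially just the following (reproduced from memory, so treat the exact contents as approximate):

```xml
<?xml version="1.0"?>
<configuration>
  <system.web>
    <compilation debug="false" targetFramework="4.0" />
  </system.web>
</configuration>
```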

Inserting new row in GridView in ASP.NET 2.0

The GridView was not designed to insert new rows, but there is a way to accomplish this with very little code. This article shows how to do that.

SharePoint document migration challenges when migrating files and folders

There are several challenges when migrating documents to Microsoft SharePoint. While these challenges can be overcome, they are a real pain if the migration source, content and file systems are not SharePoint friendly.

Combine, minify and compress JavaScript files to load ASP.NET pages faster

Websites are getting more interactive these days and most of this interactivity comes from JavaScript. People are using different JavaScript libraries and frameworks to make their websites more interactive and user friendly.

Reading and Writing Images From a Windows Mobile Database using UltraLite 10(C#)

Periodically I get a request for information on how to read and write binary data to a database running on Windows Mobile. If you search the Internet you can typically find examples that are available on Windows Desktops or allow you to read and write to a local file system. The problem is that it can take a bit of work to get this code to work on Windows Mobile accessing a database.

Ultimately you might be asking: why would I want to store an image in a database? Well, in an environment where you are synchronizing data between a local mobile database and a consolidated (central) database, this can be extremely useful. Imagine if an insurance adjuster went to an accident scene, took a picture of a damaged car, loaded it into his Windows Mobile database and then replicated that image up to the insurance headquarters for approval. All of this could be done in a very short period of time when using images in the database. Another good example might be a doctor who is waiting for a patient chart to become available. If you could store the image in a database, the chart could be sent down to the doctor's device once it became available.

For this article I am not going to get into how to synchronize the images to and from a remote and central database, as this is typically fairly straightforward when using a data synchronization technology like MobiLink.

PrintPocketCE Print Pocket Excel, Pocket Word and email files

Version 3.560 (May 15, 2009):

Important bug fix: a slight difference in how some devices create fonts was causing a few devices to have significantly longer print times for large print jobs.

Adjusted PocketJet printer paper feed commands

Other minor fixes and enhancements.

Version 3.559 (Mar 5, 2009):

Added support for Martel MCP78xx printers

Version 3.558 (Aug 20, 2008):

Added support for Martel MCP78xx printers

Version 3.557 (June 27, 2008):

Fixed error in Canon printer support

Fixed COM0 port selection

Version 3.556 (June 15, 2008):

Added support for Brother MW-260

Added support for Sato MB400

Version 3.555 (Aug 8, 2007):

Added support for Pentax RuggedJet 3 and RuggedJet 4 printers

Changed Epson TM-P60 support to maximum page width of 1200 dots

Version 3.551 (Aug 1, 2006):

Added Peripheral Nomad printer support

Version 3.550 (Mar 23, 2006):

Added Panasonic JT-H200PR printer support

Added Pocket Spectrum printer support

Fixed the WM5.0 "inverted image" problem

Conversion of text files from ANSI to UTF-8

I am reading and writing text files in ANSI format and writing HTML files in charset ISO-8859-1 (Western Europe). How can I convert these files to UTF-8?
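A sketch of the conversion: read the file as ISO-8859-1 (the Western-European charset mentioned above) and write it back out as UTF-8. On some systems Windows-1252 is the more accurate "ANSI" source encoding, so adjust the encoding name to match your files.

```csharp
using System;
using System.IO;
using System.Text;

// Re-encode a file from ISO-8859-1 to UTF-8.
static void ConvertAnsiToUtf8(string sourcePath, string targetPath)
{
    var ansi = Encoding.GetEncoding("ISO-8859-1");
    string text = File.ReadAllText(sourcePath, ansi);
    File.WriteAllText(targetPath, text, new UTF8Encoding(false)); // no BOM
}

// Demo with a temp file containing "café" as ISO-8859-1 bytes (é = 0xE9).
var src = Path.GetTempFileName();
var dst = Path.GetTempFileName();
File.WriteAllBytes(src, new byte[] { 0x63, 0x61, 0x66, 0xE9 });
ConvertAnsiToUtf8(src, dst);
string converted = File.ReadAllText(dst, Encoding.UTF8);
Console.WriteLine(converted);
File.Delete(src);
File.Delete(dst);
```

For HTML files, remember to also update any `charset=` declaration inside the file, since the bytes will no longer match a declared ISO-8859-1.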