I have a requirement to download a gazillion finance-related files from several HTTP sites and load them into SQL Server. I was able to use a Script Task and leverage the WebClient object to download the requested data to files; no sweat. Then, in a file source task within the data flow, I read the file back in and the rest is history. Is there a way I could bypass the "download to a file and re-read it" step? I'd like to simply stream the request using WebClient.DownloadData().
My question is: how do I get the resulting byte array into the data flow so that I can transform/load it wherever I like?
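To make the question concrete, here is roughly what I'm picturing inside a Script Component configured as a source; the URL, output name, and column names are just placeholders, not my real metadata:

```csharp
// Sketch only: runs inside an SSIS Script Component set up as a Source,
// so it inherits from the SSIS-generated UserComponent base class.
// Assumes the default output "Output 0" with two hypothetical columns,
// Symbol and Price, already defined in the component's metadata.
public override void CreateNewOutputRows()
{
    using (var client = new System.Net.WebClient())
    {
        // Download straight into memory -- no temp file on disk.
        byte[] data = client.DownloadData("http://example.com/prices.csv");

        using (var stream = new System.IO.MemoryStream(data))
        using (var reader = new System.IO.StreamReader(stream))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                var fields = line.Split(',');
                Output0Buffer.AddRow();          // push a row into the data flow
                Output0Buffer.Symbol = fields[0];
                Output0Buffer.Price = decimal.Parse(fields[1]);
            }
        }
    }
}
```

Is something along these lines the right way to feed the downloaded bytes directly into the pipeline?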
I'm also interested in whether this would help performance. I'm thinking it should, but could it turn into a memory hog and slow things down instead? Also, if I have to stick with the download-and-read-the-file approach, I'm concerned about babysitting the temp file downloads. Would I need to continuously clean up after myself for every download?
Thanks in advance,