I'm using a Cache Transform in a data flow to reuse a large set of lookup reference data (around 1.5M records), but I'm getting out-of-memory exceptions midway through execution. I monitored memory usage, and it stayed only just above 50% on the dev machine where I was testing the package. On the same machine, I then reverted to a Lookup component in full cache mode. The Lookup did not throw any exceptions, although its pre-execute phase took (as expected) around 15 minutes. Is there a memory issue with the Cache Transform?
Here is the exception:
Error: 0xC0047012 at Populate Cache LOOKUP_REF: A buffer failed while allocating 10479040 bytes.
Error: 0xC0047011 at Populate Cache LOOKUP_REF: The system reports 60 percent memory load. There are 3740508160 bytes of physical memory with 1478172672 bytes free. There are 2147352576 bytes of virtual memory with 216399872 bytes free. The paging file has 5713952768 bytes with 2810195968 bytes free.
Error: 0xC0208252 at Populate Cache LOOKUP_REF, Refresh REFERENCE_DOMAIN_VALUE_SK : Unable to allocate memory for a new row for the main workspace buffer. An out-of-memory condition occurred.
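For context, the byte counts in the log can be decoded into more readable units (a quick sketch; the inference that this is a 32-bit process comes from the ~2 GiB virtual memory total, which matches the 32-bit user-mode address space limit, and is not stated in the log itself):

```python
# Convert the byte figures from the SSIS error log into GiB
# to make the virtual-memory ceiling easier to spot.
GIB = 1024 ** 3

figures = {
    "physical memory total": 3740508160,
    "physical memory free":  1478172672,
    "virtual memory total":  2147352576,
    "virtual memory free":    216399872,
    "paging file total":     5713952768,
    "paging file free":      2810195968,
}

for name, value in figures.items():
    print(f"{name}: {value / GIB:.2f} GiB")

# The virtual memory total (~2.00 GiB) is far smaller than physical
# RAM plus paging file, so the process can run out of address space
# while the machine still shows only ~60% memory load.
```

This would explain why the machine reported only 60% memory load while the buffer allocation still failed: the limiting factor is the process's virtual address space, not physical RAM.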