I have a process A that gathers data and hands it over to process B for processing. The amount of data is around 650 MB or more, and the transport should be as fast as possible, yet painless (if possible). A and B are separate processes, so this scenario could be considered inter-process communication (IPC).
Here is what I've done and found so far:
1. Process A created an object holding all the information and serialized it to a file; B deserialized the file and processed the data.
   - The computational time for the serialization is OK.
   - The deserialization takes many times longer than the serialization, to the point that I consider this approach useless.
2. Skip serialization/deserialization entirely and use a flat file for the communication.
   - It's quick, with roughly the same speed for reads and writes.
   - But it lacks flexibility and is not very elegant either.
3. Use named pipes to transfer the serialized object between processes A and B. This is elegant but even slower, because of the deserialization issue in #1 and the fairly poor performance of named pipes with large amounts of data.
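The post doesn't name its language, so here is a minimal Python sketch of the flat-file handoff described in #2: process A dumps raw fixed-width numbers to a file, process B reads them straight back. The file name and the `array` of doubles are illustrative assumptions; the point is that a flat layout needs no object graph to rebuild, which is why reads cost about the same as writes.

```python
import array
import os
import tempfile

# Hypothetical payload: raw doubles standing in for the real data.
values = array.array("d", (float(i) for i in range(100_000)))

# "Process A": write the flat file (illustrative path, not from the post).
path = os.path.join(tempfile.mkdtemp(), "payload.bin")
with open(path, "wb") as f:
    values.tofile(f)

# "Process B": read the same fixed-width records back.
loaded = array.array("d")
with open(path, "rb") as f:
    loaded.fromfile(f, len(values))

assert loaded == values  # no per-object reconstruction, just a byte copy
```

The trade-off matches the post: the format is fast but rigid, because both sides must agree on the exact record layout in advance.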
- Can somebody please advise me on how best to meet these requirements?
- Why is deserialization so slow (I understand that, depending on the types involved, a lot of objects have to be created), and what better approach should one take?
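To make the second question concrete, this hedged Python sketch (again, an assumption about the language) times serializing versus deserializing a large collection of small objects with `pickle`. Deserialization has to allocate and populate every object in the graph, which is the cost the post is running into; measured times will vary by machine and serializer.

```python
import pickle
import time

# Hypothetical dataset: many small objects, standing in for the ~650 MB payload.
records = [{"id": i, "value": float(i)} for i in range(200_000)]

t0 = time.perf_counter()
blob = pickle.dumps(records, protocol=pickle.HIGHEST_PROTOCOL)
ser = time.perf_counter() - t0

t0 = time.perf_counter()
restored = pickle.loads(blob)  # must reconstruct every dict and float
de = time.perf_counter() - t0

print(f"serialize:   {ser:.3f}s")
print(f"deserialize: {de:.3f}s for {len(blob)} bytes")
assert restored == records
```

The general lesson carries over to other serializers: the byte stream is cheap to produce, but rebuilding an object per record on the receiving side dominates, which is why flat formats that avoid per-record objects read so much faster.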