$include_dir="/home/hyper-archives/boost/include"; include("$include_dir/msg-header.inc") ?>
From: Matthias Troyer (troyer_at_[hidden])
Date: 2002-09-27 11:03:52
Hi Robert,
I have looked at the serialization library in even more detail, but
there is still one issue with large data sets that I do not fully
understand. I often have large vectors of small classes.
i) To give a simple example:
struct Item {
int x;
int y;
};
std::vector<Item> v(10000000);
If I now serialize v, it seems that there will be overhead unless I
write a specialized save function for std::vector<Item>. Is that
correct? That would be very inconvenient, as I have lots of such small
classes with just a few bytes of content but many member functions.
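To make the inconvenience concrete, the specialized save function I
would have to write for each such class would presumably look roughly
like this (save_binary is just a placeholder name I made up for
whatever primitive writes a raw block of bytes, not something taken
from your interface):

// Hypothetical sketch only: 'save_binary' stands for a primitive that
// writes a raw block of bytes to the archive in a single call.
template<class Archive>
void save(Archive & ar, const std::vector<Item> & v)
{
    std::size_t n = v.size();
    ar << n;                                     // element count
    if (n != 0)
        ar.save_binary(&v[0], n * sizeof(Item)); // all elements in one shot
}

Writing and maintaining such an overload for every small struct in our
codes is exactly what I would like to avoid.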
ii) I do not always want to serialize a std::vector of a POD type by
calling operator<< for each element. Consider, for example, a
std::vector<int> v(10000000);
I need to serialize this into three different kinds of archives (a
concrete sketch of the three cases follows the list):
a) a message buffer for use with MPI. Here I want to do something like
memcpy(buffer, &v[0], v.size() * sizeof(int))
b) a file or buffer in XDR format. Here I want to use just one call to
xdr_vector instead of millions of separate calls to xdr_int.
c) output to a text file. Here the standard way of writing each number
is just perfect.
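To spell out what I mean in each case, the three archives would end up
doing something like the following (buffer, xdrs, and os stand for
whatever the respective archive holds internally; the xdr_vector call
is the standard one from <rpc/xdr.h>):

#include <cstring>     // std::memcpy
#include <ostream>
#include <vector>
#include <rpc/xdr.h>   // xdr_vector, xdr_int

// Sketch only: the three archive targets are passed in explicitly here.
void write_all(const std::vector<int> & v,
               int * buffer, XDR & xdrs, std::ostream & os)
{
    // a) MPI message buffer: one memcpy for the whole vector
    std::memcpy(buffer, &v[0], v.size() * sizeof(int));

    // b) XDR archive: a single xdr_vector call instead of v.size()
    //    separate calls to xdr_int
    xdr_vector(&xdrs,
               reinterpret_cast<char*>(const_cast<int*>(&v[0])),
               v.size(), sizeof(int),
               reinterpret_cast<xdrproc_t>(xdr_int));

    // c) text archive: writing the elements one by one is fine
    for (std::size_t i = 0; i < v.size(); ++i)
        os << v[i] << ' ';
}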
Thus, I need runtime polymorphism for the serialization of a
std::vector, which your current library does not seem to provide. You
have operator<< as virtual functions for the POD types; I additionally
need similar virtual functions for writing C-arrays of POD types.
Would it be possible to add such functions?
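To be explicit about what I am asking for, something along these lines
in the polymorphic archive base class would be enough (the class and
function names here are only illustrative, I am not assuming these are
your actual names):

#include <cstddef>

// Sketch of the requested extension; names are purely illustrative.
class polymorphic_oarchive
{
public:
    // what I understand you already have: a virtual overload per POD type
    virtual polymorphic_oarchive & operator<<(int t) = 0;
    virtual polymorphic_oarchive & operator<<(double t) = 0;

    // what I am asking for: a virtual hook per C-array of a POD type,
    // so each archive can pick memcpy, xdr_vector, or a plain loop
    virtual void save_array(const int * p, std::size_t n) = 0;
    virtual void save_array(const double * p, std::size_t n) = 0;

    virtual ~polymorphic_oarchive() {}
};

The MPI archive would then implement save_array with a single memcpy,
the XDR archive with one xdr_vector call, and the text archive simply
with a loop over operator<<, while the std::vector serialization could
dispatch to save_array and get the fast path automatically.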
All the best,
Matthias