From: Jody Hagins (jody-boost-011304_at_[hidden])
Date: 2004-04-15 18:56:59
I know that C++ IOStreams are supposed to take over the world, and
cstdio with FILE is considered taboo by some C++ aficionados. However,
for many reasons, lots of code still uses cstdio FILE I/O. One issue
that comes up from time to time is the need to manage more files than
the OS will allow a single process to have open simultaneously. I
process tons of data, and this is an issue for me. I have developed a
small library that provides on-demand caching of FILE pointers, so that
an application can "open" as many FILEs as necessary, and use them as
normal. A simple LRU eviction algorithm is used to reclaim FILEs when
"all" have been used.
I was discussing this library with another developer, and he said he has
seen several questions recently about a similar issue, and he advised me
to ask if there was interest here. This library does not seem very
"on the edge", though...
Here is a very simple example of how you can use the library (of course
there are better ways to do the following, but it is meant to be a
small, easy-to-follow example):
#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <iostream>
#include <sstream>
#include <vector>
// ... plus the header that declares wjh::file_cache and wjh::cached_fptr

int main()
{
    // Open 10,000 FILEs through one cache, keeping every handle around.
    wjh::file_cache file_cache;
    std::vector<wjh::cached_fptr> fp;
    for (int i = 0; i < 10000; ++i)
    {
        std::stringstream strm;
        strm << "FILE_" << i;
        wjh::cached_fptr fptr = file_cache.open(strm.str().c_str(), "w");
        if (!fptr)
        {
            std::cerr << strm.str() << ": " << strerror(errno) << std::endl;
            break;
        }
        fp.push_back(fptr);
    }

    // Randomly write to a particular file.
    for (int i = 0; i < 200000; ++i)
    {
        int x = rand() % fp.size();
        fprintf(fp[x], "file %d, iteration %d\n", x, i);
    }
    return 0;
}
Is it something useful to anyone on this list besides me (and the
pitiful souls who work for me and must use it)?
Is it something worth posting here?