From: vesa_karvonen (vesa_karvonen_at_[hidden])
Date: 2002-02-06 02:51:16
--- In boost_at_y..., "mfdylan" <dylan_at_m...> wrote:
[on recompilation caused by changes to a header]
> Unfortunately this is a problem whether headers are coarsely or
> finely grained.  You might have a header file that declares only
> one class, but it is a class that is a fundamental part of the
> application and hence included by almost everything.  It may just
> happen to contain one single member function that is only used in 2
> or 3 places, and requires a tweak to its definition.
[...]
The design you are describing above violates the Interface 
Segregation Principle. If you are in such a situation, you need to 
change the design.
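To make that concrete, here is a rough sketch of how the principle
applies at the header level (the class and file names are invented for
illustration): the seldom-used operation gets its own small interface
and header, so tweaking it no longer touches the header that everyone
includes.

    // core_queries.hpp -- the part of the interface almost every
    // client uses
    class core_queries
    {
    public:
        virtual int lookup(int key) const = 0;
        virtual ~core_queries() {}
    };

    // core_tuning.hpp -- the rarely used operation; only the 2 or 3
    // call sites that actually tweak it include this header
    class core_tuning
    {
    public:
        virtual void tweak(double factor) = 0;
        virtual ~core_tuning() {}
    };

    // core.hpp -- the concrete class implements both interfaces; only
    // its own source file and the tuning call sites need this header
    #include "core_queries.hpp"
    #include "core_tuning.hpp"

    class core : public core_queries, public core_tuning
    {
        // ...
    };

After such a split, changing the tuning interface recompiles a handful
of files instead of nearly the whole application.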
> Because this could cause a huge rebuild most programmers are loath
> to do it, and in my experience, tend to end up resorting to hacks to
> avoid a total rebuild (and yes, I've done it myself!).
Personally, I think that when most programmers choose not to make 
mini restructurings of large systems to avoid situations like this, 
they are reaching for a local optimum that is very, very far from the 
global optimum. In other words, they are betting on the wrong horse 
in the long run. In plain English, restructuring the large system 
would make their work more pleasant, because recompiles would 
eventually become orders of magnitude faster.
> The problem really comes 
> down to dumb make systems, as I commented on c.l.c++.m recently.  
> Make systems that work at the file-level granularity are really 
> inadequate in today's big software development environments.  
Perhaps. Have you considered the amount of data the make system would 
need to store? Have you considered the amount of time the make system 
would need to spend examining dependencies? Not that these would 
necessarily be problems, but have you actually considered them? I'd 
like to see an actual implementation of a smart make system rather 
than base important software design decisions on vapourware.
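For comparison, this is what file-level granularity looks like in a
conventional GNU make setup (file names invented; the dependency line
is the kind of thing "g++ -MM" generates):

    # core.o depends on its source file and every header it includes,
    # but only at file granularity: touch core.hpp and core.o rebuilds
    # even if the changed declaration is never used here.
    core.o: core.cpp core.hpp core_queries.hpp core_tuning.hpp
    	g++ -c core.cpp -o core.o

A declaration-level make system would instead have to record which
entities each translation unit actually uses, and re-examine that data
on every build; that is the storage and scanning cost I am asking
about.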
Personally I think that long build times in C++ are primarily caused 
by the design of C++ and secondarily caused by the design of large 
systems. Since changing C++ is too difficult, programmers need to 
design their systems better.
There are many ways to improve build times. One of the most effective 
is to use a programming language that does not use text-based 
headers.
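Within C++ itself, the usual design-level remedy is to cut down what
headers drag in. A minimal sketch of the pimpl idiom, with invented
names (a raw-pointer version, as you would write it today):

    // parser.hpp -- clients see only an opaque pointer, so changes to
    // the private implementation never force clients to recompile
    class parser
    {
    public:
        parser();
        ~parser();
        bool parse(const char* text);
    private:
        class impl;     // defined only in parser.cpp
        impl* pimpl_;
    };

    // parser.cpp
    #include "parser.hpp"

    class parser::impl
    {
    public:
        bool parse(const char* text) { /* ... */ return true; }
        // private data members can change freely here
    };

    parser::parser() : pimpl_(new impl) {}
    parser::~parser() { delete pimpl_; }
    bool parser::parse(const char* text) { return pimpl_->parse(text); }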
[...]
> IMO libraries should always provide finely grained headers along
> with wrapper headers (preferably one single one for the whole
> library) so that users can choose what works for them.
I agree. Such a design is likely to please most users.
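As a sketch of such a layout (library and file names invented): each
facility gets its own small header, plus one convenience wrapper that
simply includes them all.

    // mylib/string_algo.hpp, mylib/date_time.hpp, mylib/containers.hpp
    //   -- one finely grained header per facility

    // mylib/mylib.hpp -- wrapper for users who prefer a single include
    #include "mylib/string_algo.hpp"
    #include "mylib/date_time.hpp"
    #include "mylib/containers.hpp"

Users doing heavy development include only the fine-grained headers
they need; everyone else includes mylib.hpp (and can precompile it).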
> With modern
> precompiled header support, for a library that isn't likely to
> change much, I'd always use a single header file.  For our own
> libraries that are under constant development I'll #include as
> little as possible.  This seems to be so patently common sense I
> can't imagine there'd be so much disagreement over it.
The common sense is wrong if you happen to be continuously developing 
all of the libraries you are using. Most good large systems are 
really collections of modules that are more or less like libraries.