With the advent of C++ and object-oriented design, code has become seemingly more complex, although arguably simpler to read (compared to assembly). As a software project (specifically a C++ one) grows, the effort needed to test, update, refactor and compile/link the code increases dramatically without proper care. This was a very new problem to me personally: in academia (i.e. assignments in my undergrad), software never reaches the size I've seen throughout my co-ops (millions of lines), and although strong logical design is pushed, physical design is not. It is the physical design of the code that can help reduce some of the issues mentioned above, specifically compiling and linking times.
How long can compiling and linking really take? I've heard of some code bases requiring more than a full day while running in parallel on multiple machines. Although it was never that bad at the companies I've co-oped for, fast compilation and linking was something to strive for, especially since the development cycle in game development is very fast (i.e. the codebase may be constantly and rapidly tuned). Throughout my internships, I've encountered different, specific ways of combating these issues to help smooth out the development cycle and hopefully allow for more maintainable code. In an attempt to turn that information into a work report, I decided to do some additional work and reading on my own. Most of the information I gleaned referred to or came from the following book:
Compilation and Linking
In order to create the final executable, the preprocessor first recursively expands the headers included by each source file, and the compiler turns each resulting translation unit into an intermediate object file. These object files are finally linked together (by the linker) to create the final executable. The whole process is a bit more complicated than I've explained (there is external/internal linkage); read the book above for more in-depth info.
One of the early major problems, not so much anymore thanks to the #pragma once preprocessor directive and managed code, was including the same header more than once during the recursive inclusion of header files (which leads to redefinition errors). This issue is still of importance in the video game industry, however, since the compilers for certain platforms *cough* PS3 *cough* aren't as great as current-day PC compilers.
Header Guards & Redundant Header Guards
The time-honored way of solving this issue in C++ is to wrap each header file in a unique header guard.
These repeated inclusions aren't the result of cyclic dependencies (see below); even an acyclically structured program can pull in the same header many times along different include paths.
A further optimization is to also place redundant header guards around the #include statements themselves. The addition of redundant guards is simply a workaround for naive compilers, so that they don't keep reopening files only to discover that their contents have already been included.
The use of redundant include guards, however, is extremely unpleasant: it binds you to external defines, can hide wrong #include statements, and adds an additional two lines for every #include. These factors are what prompted my current employer to begin looking into other means of optimizing compile and link times.
How useful are redundant guards, for them to be a necessary evil? Take a look at pg. 87 of Large-Scale C++ Software Design, which includes some benchmark numbers. Essentially, without redundant header guards you run the risk of quadratic O(n^2) compile times, where n is the number of #include statements: every #include can force a naive compiler to reopen a header it has already seen, so n includes can trigger on the order of n^2 file opens. Wow.
AKA "Dependency Hell"
Time can be spent attacking either compile-time dependencies or link-time ones. One of the biggest problems I've personally seen, either left unmanaged or attacked with a zealot's faith that would impress the most religious people, is cyclic dependencies.
Easy to explain, difficult to find, and harder to remove, cyclic dependencies arise simply when you have the following relationship:
At my last co-op, I began investigating the current state of the code base's dependencies by building a dependency graph. I started with the following Perl script (modified slightly to work in our environment and to add some options), which creates a .dot file: an input file for the GraphViz tools that render directed/undirected graphs. An example of the output that can be achieved can be seen from the script being run on the TortoiseCVS codebase:
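For context, the intermediate .dot format is just a textual list of directed edges between files (the file names below are invented for illustration), which a GraphViz tool such as dot renders into an image, e.g. `dot -Tpng deps.dot -o deps.png`:

```dot
digraph deps {
    "main.cpp"   -> "engine.h";
    "engine.h"   -> "renderer.h";
    "renderer.h" -> "math.h";
}
```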
Easier than it seems... as always
Running the script on our source code produced what could best be described as a black dot. There were so many interdependencies that nothing was visible, and the final file was enormous (hundreds of MB).
- Editing the script and cutting down which files to comb through and include in the dependency chart, starting with the internal libraries. Didn't work.
- Running the script on individual components, such as AI, Core, Gameplay or Animation. Didn't work. The final graph was still too intertwined.
- Running the script on individual files and "manager files" (files that manage a specific piece of functionality, such as inventory), and only recursing to a certain depth.
The third attempt finally gave me graphs and resulting images that were a manageable size, legible and useful. The images did demonstrate many unnecessary #includes as well as some intricate circular dependencies...wonderful.
More to come.