Compilation and DVCS-3

This is the last post in the series, and perhaps the one where I am most likely to go wrong, as I have the least technical experience with and understanding of git, even though its philosophy is much simpler to grasp intuitively.

What is interesting to note is that both Mercurial and git were started at the same time (April 2005), and both came into being because BitKeeper, the tool then hosting the GNU/Linux kernel's source-code repository, stopped being available.

The best way to describe what BitKeeper did, and what hg and git do, is to borrow from that lkml article:

Unlike most tools that require a central repository to house the master copy of whatever data is under version control, BitKeeper allows a truly distributed system in which everybody owns their own master copy

One of the important reasons distributed version control systems won out over centralized ones is perhaps commit access, and the control and buy-in that go with it. A central repository is conservative by nature: somebody has to decide who may commit changes to it. Gaining that trust and commit access can and does take a long time, and a potential contributor loses the motivation to contribute back in the meantime. Free time is precious.

Cut to a DVCS and you do not need anybody's permission: you can maintain your own fork or branch and do whatever you feel like. If your code has merit and gets noticed and picked up by a few people, the chances of those changes being merged into the master repository are that much better. In the meantime you have also short-circuited the whole 'am I qualified to submit patches or not?' routine.

There may be exceptions to the rule, but those would be exceptions.
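The "no permission needed" workflow above can be sketched with a few git commands. The upstream repository here is a throwaway local stand-in (a real one would live behind a git:// or https:// URL), and all names are made up for illustration:

```shell
# Stand-in upstream repo; in reality this would be someone else's server.
upstream=$(mktemp -d)
git -C "$upstream" init -q .
git -C "$upstream" -c user.email=a@example.com -c user.name=a \
    commit -q --allow-empty -m "upstream history"

# Anyone can clone and start their own line of development,
# no commit access required:
git clone -q "$upstream" myfork
cd myfork
git checkout -q -b my-feature          # your branch, your rules
echo "my change" > hack.txt
git add hack.txt
git -c user.email=b@example.com -c user.name=b commit -q -m "my change"
# If the work gets noticed, upstream can fetch and merge it later.
```

The point is that the fork and the feature branch exist entirely on your side; upstream only gets involved if and when your changes prove their worth.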

The other real use-case is development happening in parallel, as in browser development or GNU/Linux kernel development, where each part or component is being built and worked upon separately and simultaneously. In such a scenario a central repository holding all the changes would either not function at all, or function only with a major loss of productivity.

So this is what big projects do: use a DVCS, let development go on, decide at some arbitrary time (or feature milestone) to call the hacked-together code version x.x, take a few weeks to a month to merge all the changes so everything builds cleanly and works together, and polish the release.

When a team has done its part of the deal, it can go back to developing its own branch further, so productivity is maintained.

That, in essence, is the idea, and if one looks at it, it is well attuned to the agile practices that most developers and software development companies are adopting.
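The merge-at-release-time flow described above can be sketched in miniature. The branch names and the v0.4 tag below are invented for illustration; real projects have many more teams and far messier merges:

```shell
# Toy repo standing in for a big project.
proj=$(mktemp -d); cd "$proj"
git init -q .
git config user.email team@example.com
git config user.name team
git commit -q --allow-empty -m "initial"
trunk=$(git rev-parse --abbrev-ref HEAD)   # 'master' or 'main', depending on git

# Two teams work in parallel on their own branches...
git branch team-renderer
git branch team-audio

git checkout -q team-renderer
echo renderer > renderer.txt; git add renderer.txt; git commit -qm "renderer work"

git checkout -q team-audio
echo audio > audio.txt; git add audio.txt; git commit -qm "audio work"

# ...and at some arbitrary point an integrator merges everything and tags:
git checkout -q "$trunk"
git merge -q --no-edit team-renderer team-audio   # an octopus merge
git tag v0.4
```

After the tag, each team simply carries on committing to its own branch, which is exactly the "productivity is maintained" part.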

Apart from the productivity factor there is also the social factor, something open-source projects want to use and emulate: the power to share and do your own thing. That is one of the other big reasons people go for a DVCS; they want other people to be able to use the code and enrich it.

The other difference I have experienced in the couple of projects I have seen is a certain quirkiness, either in the tools they use for development or in the way they develop (instruction-wise, I mean, though this could also be because I am not at all familiar with git workflows).

So without further ado, let us go to the first game project I came to know about that uses git. The name of the project is adonthell.

The last release of the game in some sort of playable state was Adonthell 0.3; they are now moving to 0.4, which uses low-resolution 3-D models. The 0.4 work is still in its infancy, but they have done quite a bit of work polishing the tools that will be used to make the game, which should prove useful when they actually go about re-doing or extending it.

What is possible to see right now is the hero going through the Redwyne tavern. The basic command to check out the game is :-

git clone git://

There are a couple of other repositories that the user/gamer would need to check out in order to have the whole experience.

git clone git://

and the last one in the series :-

git clone git://

Of the three, adonthell has seen the most changes and wastesedge (which, when finished, will be the 0.4 game itself) the least amount of changed code so far, with adonthell-tools in between.

One of the first things I looked for was an equivalent of the 'svn info' command. After a bit of googling I found an excellent shell script by Duane Johnson. It's a bash script, so it can simply be run inside any git repository and it will do its work. For instance, this is the output it gave for adonthell :-

/usr/local/src/adonthell$ ./
== Remote URL: origin git:// (fetch)
origin git:// (push)
== Remote Branches:
origin/HEAD -> origin/master

== Local Branches:
* master

== Configuration (.git/config)
repositoryformatversion = 0
filemode = true
bare = false
logallrefupdates = true
[remote "origin"]
fetch = +refs/heads/*:refs/remotes/origin/*
url = git://
[branch "master"]
remote = origin
merge = refs/heads/master

== Most Recent Commit
commit dcc76874864ed83acfa180330b0b29003c6a1746
Author: Kai Sterker
Date: Thu Jul 14 21:38:21 2011 +0200

CHANGED file detection during loading to use file magic instead of extension

Type 'git log' for more commits, or 'git show' for full commit details.

Similar output can be had for adonthell-tools as well. I find it pretty useful and have asked the good people at to package it for newbies and old hands alike.
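For those who would rather not carry a script around, most of the same pieces of information come from a handful of stock git commands. Shown here in a throwaway repo so the commands have something to act on; the remote URL and commit message are placeholders, not the real adonthell ones. In practice you would run only the last four commands inside your checkout:

```shell
# Throwaway demo repo with a fake remote attached.
demo=$(mktemp -d); cd "$demo"
git init -q .
git config user.email you@example.com
git config user.name you
git commit -q --allow-empty -m "demo commit"
git remote add origin git://example.invalid/adonthell.git

git remote -v               # the '== Remote URL' part (fetch and push)
git branch -a               # local and remote branches
git log -1                  # the '== Most Recent Commit' part
git config --list --local   # the contents of .git/config
```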

As always, quite a bit of data can be gleaned just by reading the output that is otherwise not obvious. One of the first interesting things I came to know is that they have been writing unit-tests; see this part of the output :-

== Remote Branches:
origin/HEAD -> origin/master

While as a casual contributor/user I may not want to run the unit-tests or look for memory leaks using valgrind, and as somebody who hasn't checked out those branches I cannot say whether the unit-tests are being constantly written and updated for the bleeding edge, it does show that the developers have the right idea in mind, keeping regressions to a minimum.

Now both adonthell and adonthell-tools can be compiled using the basic gcc toolchain we have been using. The first thing a user has to run is the shell script. It's a very simple yet nice script, as it removes some of the old generated files and re-builds them every time so the values are fresh. Here's what we find if we open the script :-

# First of all clean up the generated crud
rm -f configure config.log config.cache
rm -f config.status aclocal.m4
rm -rf libltdl
rm -f `find . -name ''`
rm -f `find . -name 'Makefile'`

# Regenerate everything
aclocal -I .
libtoolize --force --copy
automake -Wno-portability --add-missing --copy --foreign

echo "Now type './configure' to prepare Adonthell for compilation."
echo "Afterwards, 'make' will build Adonthell."

So all of this is familiar to us: it removes the old or outdated config.* files and Makefiles, regenerates the build system with aclocal, libtoolize and automake, and asks you to run configure again.

The second step is where things start becoming a bit weird. It entails :-

mkdir ../adonthell-build && cd ../adonthell-build

So it asks us to make a new directory alongside the source tree; I wondered why it needs that.

/somepath/adonthell-build$../adonthell/configure --prefix=/usr/local

This feels a bit weird: while sitting in the adonthell-build directory, we call the configure script from the adonthell directory. I was not sure at first what the --prefix=/usr/local does either. Anyway, at the end of the configure run I have config.log and config.status files as well as a Makefile in the adonthell-build directory.
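What is happening here is an out-of-source build: configure writes its output into whatever directory it is called from, so the source tree stays clean, and --prefix records where `make install` will later copy things. A minimal mock-up of the mechanism (the configure script below is a stand-in written for this sketch, not the real generated one):

```shell
# Fake source and build trees in temp dirs, purely for illustration.
src=$(mktemp -d); build=$(mktemp -d)
cat > "$src/configure" <<'EOF'
#!/bin/sh
# A real autotools configure behaves the same way: its output lands in
# the directory it is *run from* ($PWD), not next to the sources.
echo "arguments given: $*" > config.log
touch config.status Makefile
EOF
chmod +x "$src/configure"

cd "$build"
"$src/configure" --prefix=/usr/local
ls "$build"    # Makefile, config.log, config.status appear here...
ls "$src"      # ...while the source dir still only holds 'configure'
```

This is why the adonthell instructions have you create ../adonthell-build first: you can delete the whole build directory and the checkout itself stays pristine.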

Only two commands remain :-

/somepath/adonthell-build$ make

/somepath/adonthell-build$ sudo make install

I really do not get why sudo make install is needed right now, as there isn't an adonthell menu entry in either the GNOME or the KDE menu afterwards.

Anyway, to see what has been compiled or done to date, go to the test directory and run the following :-

/somepath/adonthell-build/test/$ ./worldtest -g ../../adonthell/test data

This does nothing but show off the newest 3-D model (alpha male, our hero). As can be seen, he is so dirt-poor that he literally does not have a shirt on his back and is in shorts. Hopefully he will see better days as and when he gets to fighting and earns some experience (XP) and money.

Another thing one can play with is seeing the game world. This can be done simply by :-

/somepath/adonthell-build/test/$ ./worldtest -g ../.. wastesedge

There isn't much in the game world at the moment: just a bare inn with a basement, and a lone barrel outside. Most of the effort currently seems focussed on improving the adonthell-tools, i.e. the map editor (mapedit), the modeller (modeller), the dialogue editor (dlgedit) and lastly the quest editor (questedit). The build process is the same as given for adonthell above; the only difference is substituting adonthell-tools for adonthell wherever it appears. While generating the configure script via ./ you will see a couple of warnings; please ignore those for the moment.

Running the tools themselves is interesting.

Go to the adonthell-tools-build/src/mapedit directory and run the following :-

/somepath/adonthell-tools-build/src/mapedit$./mapedit -g ../../../ -p wastesedge ../../../wastesedge/redwyne-inn.xml

This feels a bit weird; as a casual user I have never done so much traversing between directories using the ../ convention.

Anyway, the tools themselves look to be top-notch, and they seem to make heavy use of python scripts, judging by these sorts of statements they print on stdout while running :-

(mapedit:30251): Gtk-CRITICAL **: IA__gtk_recent_manager_add_full: assertion `uri != NULL' failed
*** info: scanning '../../../wastesedge/schedules/obj/' for python scripts.
- trying 'comment' ... okay
*** info: scanning '../../../wastesedge/schedules/obj/' for python scripts.
- trying 'comment' ... okay

I am so looking forward to trying it out when there are things to actually play.

Another fledgling project I came to know of which uses git is BartK's orange-engine. For those who do not know BartK, he is the founder of . His project lives on gitorious. It's also a retro RPG with 8-bit goodness.

Just as before, to check out the goodies one needs to do :-

git clone git:// orange-engine

and then, to pull new changes from the server, do this each time :-

/somepath/bartk-orange-engine$ git pull origin qtquick
From git://
* branch qtquick -> FETCH_HEAD
Already up-to-date.
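For what it's worth, `git pull origin qtquick` is just shorthand for a fetch followed by a merge. Sketched here against a local stand-in repository, since the real gitorious URL isn't something a sketch can reach:

```shell
# Local "remote" repo standing in for the gitorious one.
remote=$(mktemp -d)
git -C "$remote" init -q .
git -C "$remote" -c user.email=a@example.com -c user.name=a \
    commit -q --allow-empty -m "engine history"
git -C "$remote" branch qtquick        # the branch the article pulls

git clone -q "$remote" orange-engine
cd orange-engine
# What 'git pull origin qtquick' does under the hood:
git fetch origin qtquick               # updates .git/FETCH_HEAD
git merge --ff-only FETCH_HEAD         # a no-op here, as in the
                                       # 'Already up-to-date.' output above
```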

Compiling it requires a slightly different tool though: Nokia's Qt Creator. Thankfully the tool is in Debian's archive, although the number of licenses involved with Qt Creator both amuses and worries me, and I do wonder whether all the libraries are bundled with it or whether it uses system libraries for most of its work.

$ aptitude show qtcreator
Package: qtcreator
State: installed
Automatically installed: no
Version: 2.2.1-1
Priority: optional
Section: devel
Maintainer: Debian Qt/KDE Maintainers
Uncompressed Size: 42.5 M
Depends: libqt4-help (>= 4:4.7.1), libqt4-sql-sqlite, libc6 (>= 2.2.5),
libgcc1 (>= 1:4.1.1), libqt4-declarative (>= 4:4.7.1),
libqt4-designer (>= 4:4.7.1), libqt4-network (>= 4:4.7.1),
libqt4-script (>= 4:4.7.1), libqt4-sql (>= 4:4.7.1), libqt4-svg (>=
4:4.7.1), libqt4-xml (>= 4:4.7.1), libqtcore4 (>= 4:4.7.1),
libqtgui4 (>= 4:4.7.1), libqtwebkit4, libstdc++6 (>= 4.4.0)
Recommends: gdb, make, qt4-demos, qt4-dev-tools, qt4-doc, qtcreator-doc,
xterm | x-terminal-emulator
Suggests: cmake, git-core, kdelibs5-data, subversion
Description: lightweight integrated development environment (IDE) for Qt
Qt Creator is a new, lightweight, cross-platform integrated development
environment (IDE) designed to make development with the Qt application
framework even faster and easier.

It includes:
* An advanced C++ code editor
* Integrated GUI layout and forms designer
* Project and build management tools
* Integrated, context-sensitive help system
* Visual debugger
* Rapid code navigation tools
* Supports multiple platforms

Tags: devel::debugger, devel::ide, implemented-in::c++, interface::x11,
role::program, uitoolkit::qt, x11::application

The biggest disk-space hog is qtcreator-doc, which gets pulled in with qtcreator. I have only skimmed the documentation so far, but hope to read it at leisure some Saturday or Sunday; at first glance it seems good.

$ aptitude show qtcreator-doc
Package: qtcreator-doc
State: installed
Automatically installed: yes
Version: 2.2.1-1
Priority: optional
Section: doc
Maintainer: Debian Qt/KDE Maintainers
Uncompressed Size: 8,581 k
Suggests: qt4-dev-tools, qt4-doc
Description: documentation for Qt Creator IDE
Qt Creator is a new, lightweight, cross-platform integrated development environment (IDE) designed to make development with the Qt application framework even faster and easier.

This package contains the documentation for Qt Creator IDE.

Tags: devel::doc, role::documentation

While I have put the whole set of building instructions on the game's homepage, I have to say that although the tool is pretty, I have not seen any benefit other than being able to compile for mobile platforms as well. It does flash the things a developer needs to fix, but I do not see a way to capture or store them somewhere as a reference. There might be a way, but it is not discoverable at first glance.

Anyway, coincidentally both projects seem to be at an initial stage. Although adonthell has produced some remarkable and detailed documentation, which can be found in its wiki, I see a visible lack of contributors in both projects.

While for orange-engine I can understand it, as Bart has not really talked about or shared it with the wider community, adonthell's documentation and the effort behind the tools suggest quite a bit of passion, involvement and dedication from the developers. The developers are also pretty approachable and answer queries quickly. I do not understand, then, why they are lacking active contributors.

Is it because of the nature of the tool (git), or something else altogether? I have no idea. Adonthell does a lot of its communication via its various mailing lists and IRC (I do not see a dedicated channel, but whatever works for the developer community).

The other good thing I came to know of, besides the tools, are the two code repositories, and . Both code repositories are bare and yet clean-looking. Not so much stuff on the screen as the repository has, so it's nice, but that's part of another story perhaps 🙂

That’s all for now. 🙂


2 thoughts on "Compilation and DVCS-3"

  1. fyi, it is usual to build in a different directory than the source code, hence the cd ../adonthell-build. The --prefix=/usr/local chooses where things get installed during the "make install" step.

    Nice article.
