Install instructions for Dungeon Crawl Stone Soup (DCSS)
--------------------------------------------------------
(Last updated on 20071202 for Dungeon Crawl Stone Soup 0.3.3.)

Building Dungeon Crawl Stone Soup
---------------------------------

Crawl Stone Soup is known to compile successfully on the following
platforms as of version 0.3:

- Any Unix with a recent gcc (and g++), GNU make and libncurses,
  including Linux and Mac OS X. "Recent" is defined as gcc 3.3 or newer.
- Microsoft Windows NT/2000/XP. The game will also run on Windows 9X and
  ME. DOS binaries can also be compiled on Windows NT+.

The only supported compiler is gcc (available on almost all Unixes, as
djgpp for DOS, and as MinGW for Windows).

Other platforms are unsupported, but you should be able to build Crawl on
pretty much any operating system with a modern C++ compiler (full support
for the standard C++ library, in particular <string>, the collection
classes and <algorithm>, is necessary) and the curses library.

Optional libraries
------------------

Crawl can be built with some optional features:

* Sounds
* Regular expressions
* User Lua scripts
* Unicode characters for the map (Unix only)

Crawl Stone Soup also uses a level compiler to compile special level
definitions; to make changes to the level compiler, you'll need the flex
and bison/byacc tools (other lex/yacc implementations may also work). More
details are available below.

Sounds must be enabled by editing AppHdr.h (uncomment SOUND_PLAY_COMMAND
on Unixes or WINMM_PLAY_SOUNDS on Windows).

Regular expressions require libpcre on non-Unix operating systems. On
Unixes, the standard POSIX regular expression support is adequate for
Crawl's needs.

Stone Soup 0.3 includes Lua 5.1.2 in its source tree. Crawl uses Lua for
dungeon generation. In addition, Crawl has a (rudimentary) Lua interface
for users to run scripts which can do things such as conditionalise parts
of the .crawlrc/init.txt. Such user Lua scripts are enabled by default,
but can be turned off by removing -DCLUA_BINDINGS from your makefile.

Unicode support needs libncursesw and its header files; these are usually
available in your operating system's package-management system. Unicode
is not supported on Windows or DOS. Some systems, such as Mac OS X, may
have Unicode support available in libncurses itself (i.e., without a
separate libncursesw).

Makefile system
---------------

Crawl uses a selector makefile (source/makefile) to control which platform
it builds for. Your first step in building Crawl should be to edit
source/makefile and point it at the correct platform makefile. For
instance, if you're building for Windows, you'd use MAKEFILE=makefile.mgw
to build with MinGW (# is used for comments in makefiles).

Consult the operating-system specific sections below for detailed
information on building Crawl.

When you're done building Crawl, look at the section "Data files" for
important information on what files Crawl needs as it starts up.

Building on Unix (Linux, *BSD, Solaris, etc.)
---------------------------------------------

Security: If you have untrusted local users, we highly recommend you do
not install Crawl setgid or setuid. Just running "make install" will
install Crawl setgid games; do *not* do this unless you're sure you trust
your users. If you have untrusted users, the correct way to install a
multiplayer Crawl is using a chrooted game launcher such as dgamelaunch.

To install or not to install:

If only one user on the system (you) is going to be playing Crawl, you do
not need to use "make install". A simple "make" will build Crawl in the
source directory, where you can run it as "./crawl".
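For the common single-user case the whole process is short. As a sketch
(assuming source/makefile already selects makefile.unix and that "make" is
GNU make; use "gmake" instead where that is the GNU make binary):

  cd source
  make
  ./crawl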
A simple "make" will build Crawl in the source directory, where you can run it as "./crawl". Prerequisites: GNU gcc and g++, GNU make, libncurses or libcurses. You need the development headers for ncurses - they may not be installed by default on some Unixes. flex and bison are optional but highly recommended. Recent versions of byacc are also fine (edit your makefile appropriately). If you want to use Unicode, you need to link with a curses library that understands Unicode characters, usually named libncursesw (the development headers for libncursesw are usually in /usr/include/ncursesw.) You also need to have a UTF-8 locale installed. You can then build Crawl with support for Unicode by setting UNICODE_GLYPHS = y in makefile.unix. Building: * cd to the source directory (you can safely ignore the dolinks.sh and domake.sh scripts). * Edit makefile and make sure that MAKEFILE=makefile.unix is uncommented and all other MAKEFILE= lines are commented out. * If you want to install Crawl for multiple users, edit makefile.unix and set SAVEDIR and DATADIR to appropriate directories. This is not necessary if only one user is going to play Crawl. Also check INSTALLDIR and change it if necessary. * Edit AppHdr.h and check that SAVE_PACKAGE_CMD and LOAD_UNPACKAGE_CMD are set correctly for your system. If you do not want your saves packaged in a zip archive, it's safe to comment out SAVE_PACKAGE_CMD and LOAD_UNPACKAGE_CMD. * If you don't have (or don't want to use) flex or bison, edit makefile.unix and set DOYACC := n. If you want to use byacc instead of bison, edit makefile.unix and set YACC := byacc. On some Unixes, you may not have flex (but have some other lex), in which case you'll have to set LEX := lex in makefile.unix. * Run make to build the normal (non-wizard) Crawl. On systems such as Solaris, you may have to use gmake for GNU make. Make sure your make identifies itself as GNU Make when you do make --version. * If you're installing Crawl for multiple users, run make install. Crawl will be copied into the directory specified by INSTALLDIR. The save and data directories will be created if necessary, and the level layout (.des) and help files will be copied to the data directory. * If you do not want players to be able to script Crawl with Lua, edit makefile.unix and remove -DCLUA_BINDINGS from the CFOTHERS line. Building on Mac OS X -------------------- You can follow the Unix instructions to build Crawl (but note you still need to install Xcode to get gcc and make), or alternatively you can use Xcode. Note that the Unix instructions will build Crawl assuming that your terminal can display 16 colours. If you're planning to use Terminal.app (which supports only 8 colours), you should follow the Mac build instructions below. * Crawl has been tested with Xcode 2.4 under OS X 10.4.7 on both PPC and Intel machines, but is likely to be buildable with earlier versions. * Make sure Xcode is installed. Xcode should be available on the OS X install DVD if you haven't already installed it. * Open the Xcode project (Crawl.xcodeproj) under the "source" directory. * Hit Build in Xcode. * The default build configuration, Release, will build a ppc/i386 Universal binary suitable for play on all OS X 10.3.9 or newer systems. The other build configurations are intended for development and may not result in a binary that can be distributed to others. * You can also use makefile.osx, which will run xcodebuild from the command line. 
Building on Windows (MinGW)
---------------------------

NOTE: You cannot build Windows binaries on Windows 9x/ME using the MinGW
makefile supplied (it needs the cmd.exe shell of the Windows NT family).
If you're on 9x/ME, you can use the Cygwin build instructions, build a
binary on a Windows NT/2k/XP system (the binary will run on 9x), or build
a DOS binary.

* Install MinGW from http://www.mingw.org. The MinGW installer is best so
  you don't have to fiddle with individual packages (you can mess with the
  individual packages if you like, of course). If you want to edit the
  level compiler, also get the flex and bison packages (available from the
  GnuWin32 project on Sourceforge: http://gnuwin32.sourceforge.net/).
* Make sure you have g++ and mingw32-make in your path.
* cd to the Crawl source directory.
* Build Crawl by running:

  mingw32-make MAKEFILE=makefile.mgw install

* If you want regular expression support, you can edit AppHdr.h and
  uncomment this line:

  // #define REGEX_PCRE

  Note that there are multiple // #define REGEX_PCRE lines in AppHdr.h -
  find the one in the Windows-specific section. Also see the section below
  on obtaining the pcre library to link against.
* If you enabled REGEX_PCRE, add -lpcre to the LIB line in makefile.mgw:

  LIB = -lpcre -static -lwinmm -L$(LUASRC) -l$(LUALIB) -L$(SQLSRC) -l$(SQLLIB)

  and build Crawl to include regex support.
* If you have flex and bison, edit makefile.mgw and set DOYACC := y.
* When you're done, you should have crawl.exe under a "rel" subdirectory.

Building on Windows (Cygwin)
----------------------------

* Get Cygwin from http://www.cygwin.com/. When installing, ensure that the
  following packages are selected: gcc, g++, make, flex, bison,
  libncurses-devel. If you'd like to build from svn, install the svn
  client. You may also want to install diff and patch if you'd like to
  apply third-party patches, or create your own.
* Once Cygwin is installed, open a Cygwin bash shell (use the Start menu;
  do not double-click bash.exe in Explorer). cd to the Crawl source
  directory.
* Follow the Linux build instructions to build Crawl.

Building for DOS (djgpp)
------------------------

* Install djgpp from http://www.delorie.com/djgpp/. Don't forget to
  include C++ support when the Zip picker asks what you want. You may also
  have to download GNU make as a separate package. It's important to
  follow the install instructions carefully, because bad installs can
  produce rather confusing error messages.
* Make sure gxx and make are in your PATH.
* If you want to modify the level compiler, install the djgpp flex, bison
  and m4 packages and set DOYACC := y in makefile.dos.
* cd to the Crawl source directory.
* Build Crawl by running:

  make MAKEFILE=makefile.dos

* If you want PCRE, edit makefile.dos and change this line:

  CFLAGS := $(INCLUDES) $(CFWARN) $(CFOTHERS)

  to

  CFLAGS := $(INCLUDES) $(CFWARN) $(CFOTHERS) -DREGEX_PCRE

  and add -lpcre to the LIB line in makefile.dos, like so:

  LIB = -L$(LUASRC) -l$(LUALIB) -L$(SQLSRC) -l$(SQLLIB) -lpcre

  then build Crawl.
* When the build is done, crawl.exe should be in the source directory.
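Whichever platform you target, remember that source/makefile is only a
selector: it holds a set of MAKEFILE= lines, of which exactly one should
be uncommented (or overridden on the command line, as the MinGW and DOS
examples above do). As a sketch - the real file carries more entries and
comments - a Unix build might leave it looking like this:

  MAKEFILE=makefile.unix
  #MAKEFILE=makefile.mgw
  #MAKEFILE=makefile.dos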
Building Tiles versions
-----------------------

* For Windows builds, run make MAKEFILE=makefile_tiles.mgw
* For Linux builds, run make MAKEFILE=makefile.x11
* For Mac OS X builds, run make MAKEFILE=makefile.x11 OSX=y
* All platforms require the same prerequisites listed in the other
  sections above for building each of these platforms.
* The Linux and Mac OS X builds additionally require libpng. The Mac OS X
  version expects it to be installed via fink in /sw. If this is not the
  case, then you should edit PNG_INCLUDE and PNG_LIB in makefile.x11.

*****************************************************************************

Data files
----------

Crawl looks for several data files when starting up. They include:

* Special level and vault layout (dat/*.des) files.
* Core Lua code (dat/clua/*.lua).
* Descriptions for monsters and game features (dat/descript/*.txt).
* Monster dialogue files (dat/*.txt).

All these files are in the source tree under source/dat.

Crawl will also look for documentation files when players invoke the help
system. These files are available under the docs directory.

Your built Crawl binary must be able to find these files, or it will not
start.

If Crawl is built without an explicit DATA_DIR_PATH (this is the most
common setup), it will search for its data files under the current
directory, and if it can't find them there, one level above the current
directory. In short, it uses these search paths: ., ./dat, ./docs, ..,
../dat, ../docs.

If Crawl is built with an explicit DATA_DIR_PATH (for multiuser installs
on Unix), it will look for its startup files strictly under that
directory: $DIR, $DIR/dat, $DIR/docs.

As Crawl loads its startup files, it will convert them to a binary format
(so that future startups will be faster) and store these binary files in
the saves directory. These binary files (with extensions .db, .dsc, .idx,
.lk) can be safely deleted as long as there is no running Crawl, and they
will be regenerated the next time Crawl starts.

*****************************************************************************

The level compiler
------------------

Crawl uses a level compiler to read the level design (.des) files in the
source/dat directory. If you're using one of the standard makefiles, the
steps described in this section are performed automatically.

The level compiler source is in the source/util directory (levcomp.lpp and
levcomp.ypp). The steps involved in building the level compiler are:

* Run flex on levcomp.lpp to produce the levcomp.lex.cc lexer.
* Run bison on levcomp.ypp to produce the levcomp.tab.cc parser and
  levcomp.tab.h.
* Compile the resulting C++ source files and levcomp.cc and link the
  object files into the Crawl executable.

For convenience on systems that don't have flex/bison, pre-generated
intermediate files are provided under source/prebuilt. The makefiles
provided with the Crawl source distribution will use these pre-generated
files automatically if flex/bison is not available.

*****************************************************************************

Optional Libraries (Lua and PCRE)
---------------------------------

Lua
---

Security on multiuser systems (Unix): Enabling Lua user scripts is
unsuitable for Crawl installed setuid or setgid - we have not audited the
user scripting interface for security. If you're not planning to install
Crawl setuid/setgid (not running set[ug]id is a good idea in general), you
can enable the Lua user-script bindings safely.

As of 0.3, the Lua source is included with Crawl. The only step needed to
enable user scripts is to uncomment CLUA_BINDINGS in AppHdr.h.
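Concretely, that means finding this line in AppHdr.h:

  // #define CLUA_BINDINGS

and removing the leading // so that it reads:

  #define CLUA_BINDINGS

Rebuild Crawl afterwards for the change to take effect.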
PCRE
----

On Unixes, you're better served by the existing POSIX regular expression
support. If you want PCRE, your package management system is again your
best bet. Remember to install the development headers, not just the plain
library.

On Windows, PCRE binaries are available from
http://gnuwin32.sourceforge.net/packages/pcre.htm

On DOS you get the joy of building PCRE yourself. It's a little more
annoying than building Lua (you have to roll your own makefile), but not
by much.

Unicode (Unix only)
-------------------

Modern Unixes may support Unicode terminals (particularly xterms). If you
have a terminal that can display Unicode characters, and an ncurses
library that can handle Unicode (libncursesw, and its devel headers), you
can build Crawl to display Unicode in the map: set UNICODE_GLYPHS = y in
makefile.unix.

On Mac OS X, libncurses includes Unicode support; makefile.unix should
detect Mac OS automatically and link to libncurses when UNICODE_GLYPHS=y.

NOTE: You may have libncursesw but not its header files; check that the
headers are installed as well, or you'll get a lot of errors. Crawl
expects the ncursesw headers to be in /usr/include/ncursesw.

After compiling Crawl with Unicode support, you still need to add the line
"char_set = unicode" to your .crawlrc to tell Crawl to use Unicode. You
may also need to set the locale in your terminal (notably on Mac OS) if
the encoding does not default to UTF-8. To check this, run "locale
charmap", which should say "UTF-8". If your encoding is not UTF-8, you can
force it to UTF-8 on most systems by doing "export LC_ALL=en_US.UTF-8" or
the equivalent, depending on your language locale and what shell you're
using.

Crawl tries to use en_US.UTF-8 as its default Unicode locale. If you do
not have this locale installed, but do have some other UTF-8 locale, you
can tell Crawl to use your preferred UTF-8 locale by setting
UNICODE_LOCALE = ${foo}.UTF-8 in makefile.unix, where ${foo} is your
locale name.

You may not want to embed the locale in Crawl itself, but instead have
Crawl use the locale as set in the environment LC_* variables. In that
case, you can use UNICODE_LOCALE = . in makefile.unix, and Crawl will then
use the locale as set in your environment.

If you're playing Crawl on a remote machine, the remote Crawl must be
built with Unicode support, the remote system needs to have a UTF-8 locale
installed, *and* your local terminal (where you're running telnet/ssh)
needs to be able to decode UTF-8.
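Putting the Unix pieces together, a Unicode-enabled single-user setup
might look like the following sketch (the en_US locale is only an example;
substitute whatever UTF-8 locale you have installed):

  # in makefile.unix
  UNICODE_GLYPHS = y

  # in the shell, before starting Crawl
  locale charmap               # should print UTF-8
  export LC_ALL=en_US.UTF-8    # only needed if it does not

  # in your .crawlrc / init.txt
  char_set = unicode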