path: root/.gitignore
author: j-p-e-g <j-p-e-g@c06c8d41-db1a-0410-9941-cceddc491573> 2008-05-29 15:41:39 +0000
committer: j-p-e-g <j-p-e-g@c06c8d41-db1a-0410-9941-cceddc491573> 2008-05-29 15:41:39 +0000
commit: 871c03730f4206865df239d6698732ee801c5043 (patch)
tree: 7f7bb1f1e0f1ba49b9471962393ae183cc3da31e /.gitignore
parent: 5730d82050d02ac2fa93ba01900cf6a43c110a9d (diff)
download: crawl-ref-871c03730f4206865df239d6698732ee801c5043.tar.gz
download: crawl-ref-871c03730f4206865df239d6698732ee801c5043.zip
Implement a much more efficient version of the LOS testing used for monster patrolling.

As before, ray casting is used to verify LOS, but rays are now only cast when a blocked grid is encountered while searching the surroundings in a growing spiral. When a blocked grid is found, all beams that might possibly pass through it (tested heuristically, probably more than strictly necessary) are checked right away, and all grids behind the blocked one are marked as blocked as well; this marking can be reversed if a later beam turns out to unblock them. A beam that has already been tried is never tried a second time.

The results are stored in a (2*LOS_RANGE)^2 array and compared directly when deciding where to send a patrolling monster next. In the worst case we still need to fire beams at each of the edge grids, which probably can't be avoided entirely, and performance could obviously still be improved. Even so, the increase in speed is very noticeable, e.g. for me Crawl runs smoothly again. :)

git-svn-id: https://crawl-ref.svn.sourceforge.net/svnroot/crawl-ref/trunk@5330 c06c8d41-db1a-0410-9941-cceddc491573
Diffstat (limited to '.gitignore')
0 files changed, 0 insertions, 0 deletions