Demo from Soren Renner srenner@lycosmail.com
--------------------------------------------
[editor's note: the Python Imaging Library is required]

This demo directory is a partial snapshot of my current development
directory.  I'm interested in visual psychology, and I thought that
toy worlds (like video game worlds) might be more "intelligible" if
more aspects of visual attention were simulated in the interface. Some of
these are not feasible until head-mounted displays and eyetracking are
available (to me). Not everything in this directory is documented and
some things may not work. It's all alpha.  Everything here is open
source and copylefted with all the normal implications.

Everything runs in the testbed, 'attention.py'. Just type "python
attention.py" from a shell window. You may have to click between
windows several times before Tk gives the focus to the render
window. After it does, pressing '1' randomly switches the object of
attention (OOA). The shell window prints the frame rate and some
other information for each frame. Pressing '2' when a tree is the OOA
switches attention to one of its child objects. It's fun, if you like
that sort of thing. Some of the objects rotate on their own. The three
mouse buttons move the camera along the X, Y, and Z axes. If it runs
too slowly, you can comment out some of the objects in the 'start'
method, or simplify some of the objects, especially the trees - a
6-level tree is pretty big.  Texture mapping is slow too. I get about
2 FPS with everything drawn. I wonder how much hardware rendering
would speed it up. Probably the bottleneck would move into the Python
code.

The foliage on the tree is made of dots, not polygons. I like the
effect, especially from a distance - up close, it's a bit sparse.  If
you have any questions or comments, write me at srenner@lycosmail.com.
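One way to get that dot-foliage effect is to scatter random points
inside a spherical canopy around a branch tip. This is only a sketch of
the general technique, not the demo's actual foliage code:

```python
import math
import random

def foliage_dots(center, radius, n):
    """Scatter n dots uniformly inside a sphere around a branch tip.
    From a distance the cloud reads as a solid canopy; up close the
    individual dots become visible, hence the sparse look."""
    cx, cy, cz = center
    dots = []
    while len(dots) < n:
        # Rejection sampling: pick a point in the bounding cube and
        # keep it only if it falls inside the sphere.
        x, y, z = (random.uniform(-radius, radius) for _ in range(3))
        if x * x + y * y + z * z <= radius * radius:
            dots.append((cx + x, cy + y, cz + z))
    return dots

canopy = foliage_dots((0.0, 5.0, 0.0), 2.0, 200)
```

Each resulting point would then be projected and drawn as a single
pixel (or small dot) instead of a polygon.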
