Tutorial: Writing jReality tools

Please note the following points from the Tutorial meeting yesterday (Jan. 10):

  • Your project software should be based on a Java class which is a sub-class of util.Assignment.  Please name this class Project{TeamIdentifier}.java.  For example, if I were a singleton team, I’d name my project ProjectGunn.java.  This is a change from the convention we adopted for assignments; I’ve decided it’s worth having unique names when I’m busy testing out the results and don’t want to get confused by a run history featuring 10 applications all named Project.
  • There are at least two teams which prefer to present their results at the end of the semester (meaning either the last week of classes or the week after the last week of classes). If you would like to also present your project in February please contact me to let me know.  All the other teams are then expected to be ready to present their projects during the first week of the summer semester (the week beginning April 8).  This is a hard deadline. Please plan accordingly.
  • A sign-up sheet for appointments for team meetings next Monday and Tuesday was passed around.  If your team wasn’t represented on Thursday at the tutorial, please contact me to arrange a meeting.
  • I spent the rest of the period discussing the jReality tool system.  Although I promised that there will be no more assignments (your project is THE assignment), I’ve gone ahead and written gunn.Assignment7 for your edification.  It is not an assignment in the traditional sense, since you are not expected to write your own version.  However, I expect you to study it and become familiar with how the tool contained in it works, since most projects involve an interactive aspect which will require that you write your own tools.  The rest of this post describes this application and the jReality tool system.

The problem statement

Up until now, we have concentrated on creating scenes for jReality, learning in the process how to use the class SceneGraphComponent to build a scene graph, Appearance to control how it is rendered, a variety of geometry factories for generating content, and a variety of utility classes for performing common tasks.  We’ve also practiced using some external packages relevant to our mathematical theme: the discretegroup package for working with symmetry groups, and an animation package for designing and saving animated sequences.

One major area which we haven’t worked with in detail is the jReality tool system.  This is a subsystem which allows the user to interact with the running application using input devices.  This is not the place for an exhaustive treatment; read this Wiki entry for an introduction to the tool system.

Very short description of the tool system: To provide a layer of abstraction between the actual hardware input devices and the tool system (so that the same application can run in different environments, such as a desktop and the PORTAL), the tool system is based on the notion of an input slot, which represents some kind of input device.  Slots generate input events; tools inserted into the scene graph are notified of these events depending on the positions of a) the tool in the scene graph and b) the source of the event (also typically in the scene graph).

The basic interactive tool has some activation slots, which cause the methods activate() and deactivate() to be called; while the tool is active, events arising from its current slots cause its perform() method to be called.  The tool is provided with an instance of ToolContext, which contains information describing exactly where the scene graph has been picked by the user.  The most important parts of the ToolContext are the methods providing the current pick results (getCurrentPick() and getCurrentPicks()), along with the methods providing SceneGraphPaths that specify where the active tool is located and where the picked geometry is located (getRootToTool() and getRootToLocal()).
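The lifecycle described above can be sketched in code.  Note that this is only an illustration of the calling pattern: the stand-in types below mirror the real jReality classes (AbstractTool, ToolContext in de.jreality.scene.tool), and the class and field names here are mine, not taken from Assignment7.

```java
// Stand-in for de.jreality.scene.tool.ToolContext, for illustration only.
interface ToolContextSketch { /* real version offers getCurrentPick(), getRootToTool(), ... */ }

// Stand-in mirroring the shape of de.jreality.scene.tool.AbstractTool.
abstract class AbstractToolSketch {
    // called once when an activation slot fires (e.g. the left button is pressed)
    public void activate(ToolContextSketch tc) {}
    // called on each event from a current slot while the tool is active (e.g. dragging)
    public void perform(ToolContextSketch tc) {}
    // called when the activation slot is released
    public void deactivate(ToolContextSketch tc) {}
}

// A trivial tool that records the order in which the system calls it.
class LoggingTool extends AbstractToolSketch {
    final StringBuilder log = new StringBuilder();
    @Override public void activate(ToolContextSketch tc)   { log.append("activate;"); }
    @Override public void perform(ToolContextSketch tc)    { log.append("perform;"); }
    @Override public void deactivate(ToolContextSketch tc) { log.append("deactivate;"); }
}

public class ToolLifecycleDemo {
    public static void main(String[] args) {
        LoggingTool tool = new LoggingTool();
        ToolContextSketch tc = new ToolContextSketch() {};
        // What the tool system does for: press, two drag events, release.
        tool.activate(tc);
        tool.perform(tc);
        tool.perform(tc);
        tool.deactivate(tc);
        System.out.println(tool.log);  // activate;perform;perform;deactivate;
    }
}
```

In a real tool you would pass the activation slot to the AbstractTool constructor and register additional slots with addCurrentSlot(); the system then drives exactly this activate/perform/deactivate sequence.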

Assignment7 embodies a simple curve editor.  The user interacts with the curve via a single tool inserted into the same scene graph component containing the curve.  This virtually guarantees that it will only receive tool events involving this curve (and not, for example, the square background geometry).  There is a single InputSlot, the left button input slot.  When the user presses the left button, the tool is activated, and when the button is released, the tool is deactivated.  If the cursor is over the curve, the activate() method does the following:

  1. If an edge has been picked (this can be determined via the method PickResult.getPickType()), then the tool inserts a new vertex into the curve at the picked point of the edge, and turns off picking of edges for this geometry.
  2. If a vertex has been picked, the perform() method is called.

The perform() method only takes action when a vertex has been picked.  Since edge picking has been disabled, this should always be the case.  It then edits the list of vertices of the curve, moving the picked vertex to the new position given by the pick point.  (Note that the pick point is different from the coordinates of the picked vertex: the pick point lies on the tiny sphere which is drawn to represent the underlying vertex.)  If you want to access the actual coordinates of the picked vertex, you can use the getIndex() method of the PickResult class to obtain its index in the vertex list of the IndexedLineSet.  The code contains numerous examples of how such methods can be used, so I refrain from further “explaining”.
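The key point of the paragraph above is that the vertex is identified by its index, not by its pick coordinates.  A minimal sketch of that move step, with illustrative names of my own choosing (in the real tool the index comes from PickResult.getIndex() and the updated coordinates are written back to the IndexedLineSet):

```java
public class MoveVertexSketch {
    // Replace the coordinates of the picked vertex, identified by index.
    static void moveVertex(double[][] verts, int pickedIndex, double[] newPosition) {
        verts[pickedIndex] = newPosition.clone();
    }

    public static void main(String[] args) {
        double[][] verts = {{0, 0, 0}, {1, 0, 0}};
        // The pick point lies on the surface of the small display sphere, so
        // it is offset from the vertex itself; use the index to identify the
        // vertex, and the drag target for its new position.
        int pickedIndex = 1;                   // as returned by getIndex()
        double[] dragTarget = {1.2, 0.3, 0};   // where the user dragged to
        moveVertex(verts, pickedIndex, dragTarget);
        System.out.println(verts[1][0] + ", " + verts[1][1]);  // 1.2, 0.3
    }
}
```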

Don’t neglect to update jReality before testing out Assignment7: a small extension had to be made to the PickResult class in order to be able to locate exactly the picked segment along the curve (the method getSecondaryIndex()).
