Touch

Usage

  • The Touch Building Block makes an object responsive to touch.
  • Use it for buttons, triggers, or interactions that do not require picking up objects.
  • Combine it with the Workflow to link actions to touch input.

Settings

The Touch Building Block consists of the following components:

  • On Start
  • Input
  • Highlight

On Start

Defines the initial state of the object when the scenario starts.

  • Show Mesh: Object is visible
  • Highlighted: Object is visually highlighted
  • Enabled: Object responds to touch
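The initial state can be pictured as a small settings record. This is a purely illustrative sketch; the field names and defaults are assumptions, since the Building Block is configured visually rather than in code:

```python
from dataclasses import dataclass

@dataclass
class OnStartSettings:
    """Hypothetical mirror of the On Start properties."""
    show_mesh: bool = True     # Object is visible
    highlighted: bool = False  # Object is visually highlighted
    enabled: bool = True       # Object responds to touch

# Example: start visible, highlighted, and responsive to touch.
settings = OnStartSettings(highlighted=True)
```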

Input

Defines what the touch interaction responds to.

  • Reacts on: XR rig node that triggers the touch

Possible values:

  • xrrig_anyhand
  • xrrig_lefthand
  • xrrig_righthand
  • xrrig_head
  • xrrig_centerrig

Fingers and hand parts (hand tracking only):

  • xrrig_anyindex, xrrig_leftindex, xrrig_rightindex
  • xrrig_anymiddle, xrrig_leftmiddle, xrrig_rightmiddle
  • xrrig_anyring, xrrig_leftring, xrrig_rightring
  • xrrig_anypinky, xrrig_leftpinky, xrrig_rightpinky
  • xrrig_anypalm, xrrig_leftpalm, xrrig_rightpalm

Finger and palm inputs only work when hand tracking is enabled.
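The node values above follow a left/right/any pattern: an `xrrig_any…` value accepts both the left and right counterpart. A minimal sketch of that matching rule, assuming this naming convention holds (the helper itself is hypothetical, not part of the tool):

```python
def node_matches(reacts_on: str, touching_node: str) -> bool:
    """Check whether a touching XR rig node satisfies the 'Reacts on' setting.

    'any' variants (e.g. xrrig_anyhand) accept the left and right
    counterparts; all other values must match exactly.
    """
    if reacts_on.startswith("xrrig_any"):
        part = reacts_on.removeprefix("xrrig_any")  # e.g. "hand", "index"
        return touching_node in (f"xrrig_left{part}", f"xrrig_right{part}")
    return touching_node == reacts_on
```

For example, `node_matches("xrrig_anyhand", "xrrig_lefthand")` holds, while `node_matches("xrrig_lefthand", "xrrig_righthand")` does not.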


Highlight

Defines how the object is visually emphasized.

  • Color Override: Overrides the highlight color
  • Outline: Shows an outline
  • Fill: Fills the object with color

Actions

The following actions are available:

  • Enable
  • Disable
  • Show Highlight
  • Hide Highlight

Enable

Makes the object responsive to touch.


Disable

Makes the object no longer responsive to touch.


Show Highlight

Shows the object’s highlight.


Hide Highlight

Hides the object’s highlight.
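Taken together, the four actions toggle two independent flags: whether the object responds to touch, and whether its highlight is shown. A minimal sketch under that reading (class and method names are illustrative, not the tool's real API):

```python
class TouchBlock:
    """Hypothetical model of the four Touch actions."""

    def __init__(self) -> None:
        self.enabled = True            # responds to touch
        self.highlight_visible = False # highlight shown

    def enable(self) -> None:
        # Enable: makes the object responsive to touch.
        self.enabled = True

    def disable(self) -> None:
        # Disable: makes the object no longer responsive to touch.
        self.enabled = False

    def show_highlight(self) -> None:
        # Show Highlight: shows the object's highlight.
        self.highlight_visible = True

    def hide_highlight(self) -> None:
        # Hide Highlight: hides the object's highlight.
        self.highlight_visible = False
```

Note that the highlight actions do not affect touch responsiveness, and vice versa.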

Reactions

The following reactions are available:

  • When Touched By (node)

When Touched By (node)

This reaction is triggered when the object is touched by the specified input node.
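In a Workflow, this reaction acts as the link between a touch event and the actions that follow it. A sketch of that wiring, assuming a simple callback model (in the actual tool this is done visually; all names here are illustrative):

```python
class TouchReactions:
    """Hypothetical dispatch for the 'When Touched By (node)' reaction."""

    def __init__(self, reacts_on: str) -> None:
        self.reacts_on = reacts_on  # the configured 'Reacts on' input node
        self.handlers = []          # Workflow steps to run on a match

    def when_touched_by(self, handler) -> None:
        # Register a Workflow step for this reaction.
        self.handlers.append(handler)

    def on_touch(self, node: str) -> None:
        # Fire the reaction only when the configured node touches the object.
        if node == self.reacts_on:
            for handler in self.handlers:
                handler(node)

touch = TouchReactions("xrrig_righthand")
events = []
touch.when_touched_by(events.append)
touch.on_touch("xrrig_lefthand")   # ignored: wrong node
touch.on_touch("xrrig_righthand")  # triggers the reaction
```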
