Touch
Usage
- The Touch Building Block makes an object responsive to touch.
- Use it for buttons, triggers, or other interactions that don't require picking up objects.
- Combine it with the Workflow to link actions to touch input.
Settings
The Touch Building Block consists of the following components:
- On Start
- Input
- Highlight
On Start
Defines the initial state of the object when the scenario starts.
| Property | Description |
|---|---|
| Show Mesh | Object is visible |
| Highlighted | Object is visually highlighted |
| Enabled | Object responds to touch |
Input
Defines what the touch interaction responds to.
| Property | Description |
|---|---|
| Reacts on | XR rig node that triggers the touch |
Possible values:
- xrrig_anyhand
- xrrig_lefthand
- xrrig_righthand
- xrrig_head
- xrrig_centerrig
Fingers and hand parts (hand tracking only):
- xrrig_anyindex, xrrig_leftindex, xrrig_rightindex
- xrrig_anymiddle, xrrig_leftmiddle, xrrig_rightmiddle
- xrrig_anyring, xrrig_leftring, xrrig_rightring
- xrrig_anypinky, xrrig_leftpinky, xrrig_rightpinky
- xrrig_anypalm, xrrig_leftpalm, xrrig_rightpalm
Finger and palm inputs only work when hand tracking is enabled.
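To make the rule above concrete, here is an illustrative sketch (not the tool's actual API; the function name is hypothetical) that groups the Reacts on values listed above and checks whether a given value is usable in the current setup:

```python
# Hand and head nodes work regardless of the tracking mode.
HAND_AND_HEAD_NODES = {
    "xrrig_anyhand", "xrrig_lefthand", "xrrig_righthand",
    "xrrig_head", "xrrig_centerrig",
}

# Finger and palm nodes, built from the any/left/right x index/middle/ring/pinky/palm grid.
FINGER_AND_PALM_NODES = {
    f"xrrig_{side}{part}"
    for side in ("any", "left", "right")
    for part in ("index", "middle", "ring", "pinky", "palm")
}

def is_valid_reacts_on(node: str, hand_tracking_enabled: bool) -> bool:
    """Return True if `node` is a usable 'Reacts on' value in this setup."""
    if node in HAND_AND_HEAD_NODES:
        return True
    # Finger and palm inputs only work when hand tracking is enabled.
    return hand_tracking_enabled and node in FINGER_AND_PALM_NODES
```

For example, `xrrig_leftindex` is rejected unless hand tracking is enabled, while `xrrig_anyhand` is always accepted.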
Highlight
Defines how the object is visually emphasized.
| Property | Description |
|---|---|
| Color Override | Overrides the highlight color |
| Outline | Show an outline |
| Fill | Fill the object with color |
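The three Highlight properties can be pictured as a small settings record. This is an illustrative sketch with assumed field names and defaults, not the tool's actual data model:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HighlightSettings:
    # Overrides the highlight color; None keeps the default color.
    color_override: Optional[str] = None  # e.g. "#FFCC00" (assumed format)
    outline: bool = True                  # show an outline
    fill: bool = False                    # fill the object with color
```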
Actions
The following actions are available:
- Enable
- Disable
- Show Highlight
- Hide Highlight
Enable
Makes the object responsive to touch.
Disable
Makes the object no longer responsive to touch.
Show Highlight
Shows the object’s highlight.
Hide Highlight
Hides the object’s highlight.
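The four actions above toggle two independent pieces of state. As an illustrative sketch (assumed names, not the tool's actual API), a Touch-like object whose initial state mirrors the On Start properties could look like this:

```python
from dataclasses import dataclass

@dataclass
class Touch:
    # On Start properties
    show_mesh: bool = True     # object is visible
    highlighted: bool = False  # object is visually highlighted
    enabled: bool = True       # object responds to touch

    def enable(self) -> None:          # Enable: make responsive to touch
        self.enabled = True

    def disable(self) -> None:         # Disable: no longer responsive
        self.enabled = False

    def show_highlight(self) -> None:  # Show Highlight
        self.highlighted = True

    def hide_highlight(self) -> None:  # Hide Highlight
        self.highlighted = False
```

Note that Enable/Disable and Show Highlight/Hide Highlight act independently: a disabled object can still be highlighted, for example to draw attention before it becomes interactive.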
Reactions
The following reactions are available:
- When Touched By (node)
When Touched By (node)
This reaction is triggered when the object is touched by the specified input node.
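Putting the pieces together, the reaction fires only when the object is enabled and the touching node matches the configured Reacts on value. This is an illustrative sketch of that dispatch logic (assumed names, not the tool's actual workflow engine); the `any` variants are assumed to match both the left and right nodes:

```python
class TouchTrigger:
    def __init__(self, reacts_on: str = "xrrig_anyhand"):
        self.reacts_on = reacts_on
        self.enabled = True
        self.reactions = []  # callbacks for "When Touched By (node)"

    def when_touched_by(self, callback) -> None:
        """Register a reaction callback."""
        self.reactions.append(callback)

    def touch(self, node: str) -> None:
        """Simulate `node` touching the object; fire reactions on a match."""
        # "any" values match the corresponding left and right nodes,
        # e.g. xrrig_anyhand matches xrrig_lefthand and xrrig_righthand.
        matches = (
            node == self.reacts_on
            or self.reacts_on.replace("_any", "_left") == node
            or self.reacts_on.replace("_any", "_right") == node
        )
        if self.enabled and matches:
            for callback in self.reactions:
                callback(node)
```

For example, a trigger configured with `xrrig_anyhand` fires for `xrrig_lefthand`, but a disabled trigger fires for nothing.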