3D User Interface Input Hardware
Input Device Characteristics
Degrees of freedom (DOF)
6-DOF: 3 position values (x, y, z) & 3 orientation values (yaw, pitch, roll) (see the sketch below)
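A minimal sketch of how one 6-DOF sample might be represented in code; the field names and units are illustrative, not tied to any particular device API.

```python
# Minimal sketch of a 6-DOF sample: 3 position values and 3 orientation values.
# Field names and units are illustrative, not from any specific device API.
from dataclasses import dataclass

@dataclass
class Pose6DOF:
    x: float      # position in metres
    y: float
    z: float
    yaw: float    # orientation in degrees
    pitch: float
    roll: float

sample = Pose6DOF(x=0.10, y=1.52, z=-0.30, yaw=15.0, pitch=-5.0, roll=0.0)
```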
Input frequency
Discrete
Continuous
Combination
Sensor type
Active sensors
Passive sensors
Intended use
Locators
Valuators
Choice
Traditional Input Devices
Typically used in desktop 3D UIs
Active sensing
Can be used in more immersive settings with appropriate mappings
Types of traditional input devices
Keyboards
active sensing input device
Have discrete components (buttons)
Not conducive to immersive VR and mobile AR (since they are usually used for 3D modelling & computer games)
Not portable
2D mice and trackballs
Trackballs
Essentially an upside-down mouse
No flat surface needed to operate
Can be held
2 essential components
Continuous 2D locator
Set of discrete components
Like keyboards, not designed for immersive VR or mobile AR applications
Pen- and touch-based tablets
Produce the same kind of input as mice (2D pixel coordinates)
Stylus & one or more fingers
Smaller devices can be used in immersive VR, mobile, and AR settings
Larger devices can be used in desktop and display wall settings
Joysticks
active sensing input device
Have a combination of a continuous 2D locator and a set of discrete components
Types:
isotonic joystick
isometric joystick
Desktop 6-DOF input devices
Use isometric force (velocity control) to collect 3D position and orientation data
3Dconnexion 3D mouse
Push/pull: for translation
Twist/tilt: for orientation
3D Spatial Input Devices
Primary means of communicating user intention in 3D UIs for a variety of VR, AR, and mobile applications
Come with physical components like buttons, sliders, etc.
Example: 3D mice
User-worn 3D mice
Handheld 3D mice
Use both active and passive sensing to provide information about the user's or a physical object's position, orientation, or motion in 3D space
Using sensors to track the user
Sensing Technologies
Accurate tracking is critical for 3D UIs
Types of sensing technologies for 3D tracking
Acoustic sensing
Uses high-frequency sound emitted from source components and received by microphones (multilateration sketch below)
Acoustic Sensing Demonstration
Types
Outside-in
Inside-out
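A minimal sketch of how acoustic tracking can recover a 3D position from time-of-flight measurements to fixed ultrasonic beacons (multilateration). The beacon layout, receiver position, and noise-free measurements are made-up values for illustration; real systems also need synchronization and filtering.

```python
# Minimal sketch of acoustic (ultrasonic) position tracking via multilateration.
# Beacon positions and the "true" receiver position are made-up; a real system
# would measure time of flight per beacon and convert it with the speed of sound.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 degrees C

beacons = np.array([[0, 0, 0], [2, 0, 0], [0, 2, 0], [0, 0, 2]], dtype=float)
true_pos = np.array([0.5, 1.0, 0.4])

# Simulated measurements: time of flight -> distance to each beacon
tof = np.linalg.norm(beacons - true_pos, axis=1) / SPEED_OF_SOUND
dist = SPEED_OF_SOUND * tof

# Linearize |x - p_i|^2 = d_i^2 by subtracting the first beacon's equation,
# then solve the resulting linear system in a least-squares sense.
A = 2 * (beacons[0] - beacons[1:])
b = dist[1:]**2 - dist[0]**2 - np.sum(beacons[1:]**2, axis=1) + np.sum(beacons[0]**2)
estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
print(estimate)  # ~ [0.5, 1.0, 0.4]
```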
Bioelectric sensing
Examples
Bioelectronic medicine
Hybrid sensing
Puts more than one sensing technology together
Advantages (slide page 20)
Examples
Inertial and ultrasonic sensing
Vision and inertial sensing
Disadvantages
Makes the algorithms more complex: requires sensor fusion algorithms such as Kalman, Extended Kalman, and Unscented Kalman filters (see the sketch below)
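A minimal, illustrative scalar Kalman filter update of the kind a sensor fusion algorithm applies when combining two sensing technologies. The noise variances and readings are assumed values; real trackers use multi-dimensional Kalman, Extended, or Unscented variants.

```python
# Minimal scalar Kalman filter sketch: fuse a prediction from one sensor
# (e.g., inertial dead reckoning) with a measurement from another (e.g., ultrasonic).
# Noise variances below are assumed values for illustration only.

def kalman_update(x_pred, p_pred, z, r):
    """One measurement update: blend predicted state x_pred (variance p_pred)
    with measurement z (variance r) using the Kalman gain."""
    k = p_pred / (p_pred + r)         # Kalman gain: trust in measurement vs. prediction
    x_new = x_pred + k * (z - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new

# Example: inertial prediction says 1.20 m, ultrasonic measurement says 1.05 m
x, p = kalman_update(x_pred=1.20, p_pred=0.04, z=1.05, r=0.01)
print(x, p)  # estimate is pulled toward the less noisy ultrasonic measurement
```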
Inertial sensing/IMU Sensors
variety of inertial measurement devices
Angular rate gyroscopes
Linear accelerometers
Measure linear acceleration, including gravitational pull
Magnetometers
Track the direction of magnetic north (heading)
Better than acoustic sensing in terms of range and sampling rates
Most inertial sensors only track orientation for 3D UIs
Often need sensor fusion algorithms such as Kalman filters to improve accuracy (orientation-filter sketch below)
Advantages
No need for a set of external cameras
Full body movement can be captured
No occlusion
Can be used anywhere
Inexpensive, provide high update rates & low latency
Xsens 3D Motion Capture System with Inertial Sensors
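A minimal complementary filter sketch showing why IMU data is fused: integrated gyroscope rates are smooth but drift, while accelerometer tilt is drift-free but noisy. The blend factor, sample rate, sign convention, and readings below are assumptions for illustration.

```python
# Minimal complementary filter: estimate pitch by blending integrated gyro rate
# (smooth but drifting) with accelerometer tilt (noisy but drift-free).
# ALPHA, DT, the sign convention, and the sample stream are illustrative assumptions.
import math

ALPHA = 0.98   # weight given to the integrated gyro estimate each step
DT = 0.01      # sample period in seconds (100 Hz, assumed)

def update_pitch(pitch_deg, gyro_rate_dps, ax, ay, az):
    gyro_pitch = pitch_deg + gyro_rate_dps * DT                      # integrate angular rate
    accel_pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))  # tilt from gravity
    return ALPHA * gyro_pitch + (1 - ALPHA) * accel_pitch

pitch = 0.0
for gyro, ax, ay, az in [(5.0, -0.02, 0.00, 0.98), (4.5, -0.03, 0.00, 0.97)]:
    pitch = update_pitch(pitch, gyro, ax, ay, az)
print(round(pitch, 3))  # drifts far less than gyro integration alone
```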
Magnetic sensing
A small sensor (the receiver) determines its position and orientation relative to a magnetic source
Size of the magnetic source determines the tracking range
Good accuracy, but it degrades as the receiver moves farther from the source
Often need filtering or calibration algorithms to remove distortions
Mechanical sensing
Often a component of haptic devices and headsets
As the tracked object moves, linkages move as well to gather position and orientation data
AgileVR Wearable
Optical sensing
Uses measurements of reflected or emitted light
Examples
Single webcams
Multiple cameras
Depth cameras
360-degree cameras
Two main options
Outside-in tracking
Inside-out tracking
Positional tracking: "Outside-in" vs. "Inside-out"
Methods of optical tracking
Optical tracking with Marker
Marker-less tracking
Optical tracking with visible markers
Obtains positional information (see the marker-pose sketch below)
Challenges (slide page 18)
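A minimal sketch of marker-based optical tracking, assuming OpenCV 4.7+ with the contrib ArUco module is available. The camera intrinsics, marker size, and image path are placeholders that would come from your own calibration and capture setup.

```python
# Minimal marker-based optical tracking sketch (OpenCV >= 4.7 with the ArUco module).
# camera_matrix, dist_coeffs, MARKER_SIZE, and "frame.png" are placeholders that
# must come from your own camera calibration, printed marker, and capture setup.
import cv2
import numpy as np

MARKER_SIZE = 0.05  # marker edge length in metres (assumed)
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

# The marker's four corners in its own coordinate frame (z = 0 plane)
h = MARKER_SIZE / 2
object_points = np.array([[-h,  h, 0], [ h,  h, 0],
                          [ h, -h, 0], [-h, -h, 0]], dtype=np.float32)

frame = cv2.imread("frame.png")                  # one camera frame (placeholder path)
corners, ids, _ = detector.detectMarkers(frame)
if ids is not None:
    # solvePnP recovers the marker's 6-DOF pose relative to the camera
    ok, rvec, tvec = cv2.solvePnP(object_points, corners[0].reshape(-1, 2),
                                  camera_matrix, dist_coeffs)
    print("position (m):", tvec.ravel(), "rotation vector:", rvec.ravel())
```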
Radar sensing
radar sensors for assisted and automated driving
Uses modulated electromagnetic waves sent toward moving or static targets that scatter the transmitted radiation, with some portion of the energy redirected back toward the radar, where it is intercepted by a receiving antenna
Overview of Positional Tracking Technologies for Virtual Reality (magnetic, acoustic, inertial, optical with markers & markerless, depth map)
Tracking the Body for 3D UI
Tracking the Head, Hands, and Limbs
Can be done with both active and passive sensing
Active sensors (electromagnetic, inertial): sensors on 3D glasses & the user's hands
Passive sensors (optical)
Tracking the Fingers
Useful to have detailed tracking information, such as how the fingers are bending
Can be done with both active and passive sensing
Data gloves (active sensing)
use bend-sensing technology to track the fingers, postures, and gestures
Since they sense bend, data is given as JOINT ANGLE measurements
Advantage: 5 to 22 DOF
Useful for hand posture and gesture recognition
Sometimes need calibration on a user-by-user basis
Disadvantages (slide page 27)
Technologies
Depth camera (Passive sensing)
tracking the fingers
Passive sensing provides unobtrusive finger tracking
Tracking the Eyes
Eye trackers are used to determine where the user is looking
The device tracks the user's pupils using corneal reflections detected by a camera (pupil tracking)
Based primarily on optical sensing
Can be done with both active (device is worn) and passive sensing (device is placed in the environment; does not require the user to wear anything)
Used as an evaluation device and for interaction techniques (gaze-directed navigation, object selection)
Vive Pro Eye, VR EYE TRACKING
Can navigate menus with the eyes
Strategies for Building DIY Input Devices
Determine types of physical device components
Choose appropriate sensors
A variety of different sensors that can be used for DIY input devices
Need to house sensors in the device: consider ergonomics and placement
Need to build a physical housing for the 3D input device: Lego bricks, clay (for prototyping)
Main approach is 3D printing for the housing
Connecting the DIY Input Device to the Computer
Main approach is the microcontroller: Arduino, Raspberry Pi
Attach to an appropriate circuit board & connect to the application software
Requires some type of logic the user needs to specify for the computer to understand the device data (see the serial-reading sketch below)
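A minimal host-side sketch of that logic, assuming the microcontroller streams comma-separated sensor readings over USB serial and that the pyserial package is installed. The port name, baud rate, and message format are assumptions.

```python
# Minimal host-side sketch: read a DIY device's sensor values over USB serial.
# Assumes the microcontroller prints lines like "512,1,0" (sensor value, button
# states, ...). Port name, baud rate, and format are assumptions for this example.
import serial  # pyserial

PORT = "/dev/ttyUSB0"   # e.g. "COM3" on Windows
BAUD = 115200

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    while True:
        line = ser.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        try:
            values = [int(v) for v in line.split(",")]
        except ValueError:
            continue  # skip malformed lines
        # Map raw readings to application events here (the device-specific logic)
        print("device data:", values)
```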
Not all input mechanisms for 3D UI are spatial
Example
Speech input
Capture speech signals with acoustic sensing devices such as a single microphone or an array of microphones (see the capture sketch below)
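A minimal sketch of capturing a speech signal from a single microphone, assuming the third-party sounddevice package. The sample rate and duration are arbitrary choices; recognition itself would be a separate step.

```python
# Minimal sketch: capture a short speech signal from a single microphone.
# Assumes the third-party "sounddevice" package; 16 kHz mono is a common
# choice for speech, but the values here are just illustrative.
import sounddevice as sd

SAMPLE_RATE = 16000   # Hz
DURATION = 3          # seconds

recording = sd.rec(int(DURATION * SAMPLE_RATE), samplerate=SAMPLE_RATE,
                   channels=1, dtype="float32")
sd.wait()  # block until recording finishes
print("captured", recording.shape[0], "samples")  # hand off to a recognizer next
```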
Brain input
Uses neuronal activity of the brain to control devices and issue commands in both physical and virtual worlds
Input devices monitor brain activity (through EEG signals)
Used to translate and rotate 3D objects or move a robotic arm
The head-worn 14-channel brain-computer input device reads EEG information
What are "mappings" here, actually?
What are 3D UI techniques, actually?
LED beacons