POSITIVE TECHNOLOGY
for elderly WELL-BEING
VR
positive emotions + well-being
overview: Freeman et al. (2017)
tracking of body, head, eyes, face
non-immersive,
semi-immersive,
fully-immersive
FPV
First Person Vision
aim is to understand the user’s environment, behaviour and intent through wearable sensors, such as action cameras or smart glasses, producing artificially intelligent systems that act in first person: “it sees what the user sees and looks where the user is looking”
long-term monitoring of human activities in home or community living
better focus their attention, improve their interaction with objects or people, and accomplish ordinary tasks or specific goals
MeCam, GoPro, Google Glass, and Looxcie are already applied in the field of health care
references, overview
wearables: recover general user context
environment understanding
recognizing personal locations
challenge: understand human activities
novel pervasive system to recognize complex human daily-living activities from reading glasses integrating a 3-axis accelerometer and a first-person-view camera, developed in [Zhan et al. 2015] (sketch below)
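A minimal sketch of the accelerometer side of such a pipeline, assuming windowed 50 Hz 3-axis data; the window length, feature set, and classifier choice are illustrative assumptions, not the configuration used by Zhan et al. 2015:

```python
# Minimal sketch: windowed accelerometer features for daily-activity
# recognition, assuming 50 Hz 3-axis data. Feature set and classifier
# are illustrative only, not the pipeline of Zhan et al. 2015.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(acc, win=100, step=50):
    """acc: (n_samples, 3) array of x/y/z acceleration."""
    feats = []
    for start in range(0, len(acc) - win + 1, step):
        w = acc[start:start + win]
        mag = np.linalg.norm(w, axis=1)          # per-sample magnitude
        feats.append(np.concatenate([
            w.mean(axis=0), w.std(axis=0),       # per-axis statistics
            [mag.mean(), mag.std(), mag.max()],  # overall motion energy
        ]))
    return np.array(feats)

# toy usage with random data standing in for labelled recordings
rng = np.random.default_rng(0)
X = window_features(rng.normal(size=(1000, 3)))
y = rng.integers(0, 4, size=len(X))              # 4 hypothetical activities
clf = RandomForestClassifier(n_estimators=50).fit(X, y)
```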
assist in recognizing objects
"next-active-object prediction"
active objects
objects being manipulated by the user, providing important information about the action being performed (e.g., using the kettle to boil water)
passive objects
non-manipulated objects, providing context information such as the items in a kitchen
AASI
automated analysis of social interaction
video-surveillance, robotics, social signal processing, and human-human interaction analysis
general view on computational approaches for perceiving humans and their interactions
Tapus et al. 2019
Levasseur et al. (2010)
being with others
unfocused interaction
interacting with others
participating in a common activity
helping others
2 problems:
peculiarities of mobility and social interaction
specificity of classical indoor domestic environments
AmE
emotion-aware ambient intelligence
capture and integrate physiological signals, facial expression and body movement: the first two aim at determining the emotional state, while the third detects the level of activation of the monitored person
music and colour/light actuation
3-layer AmE architecture
sensors - dedicated devices
decision maker - powerful PC
actuator - eg alert
smiling count to estimate a measure of happiness
++ health & activity coefficients
=> well-being rating
proposes an activity, turns on a display, projects photographs known to make the user happy, or alerts the caregivers (see the sketch below)
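A minimal sketch of how such a rating and actuation step could look, assuming a smiling count per observation window and health/activity coefficients normalised to [0, 1]; the weights, thresholds, and function names are hypothetical, not taken from the described system:

```python
# Minimal sketch of the well-being rating + actuation idea, assuming a
# smiling count per observation window and normalised health/activity
# coefficients in [0, 1]; weights and thresholds are hypothetical.
def wellbeing_rating(smile_count, health_coeff, activity_coeff,
                     w_smile=0.5, w_health=0.3, w_activity=0.2,
                     max_smiles=20):
    happiness = min(smile_count / max_smiles, 1.0)   # normalise smiles
    return (w_smile * happiness
            + w_health * health_coeff
            + w_activity * activity_coeff)

def actuate(rating, low=0.3, mid=0.6):
    if rating < low:
        return "alert caregivers"
    if rating < mid:
        return "propose activity / project favourite photographs"
    return "no intervention"

rating = wellbeing_rating(smile_count=2, health_coeff=0.8, activity_coeff=0.4)
print(actuate(rating))   # -> "propose activity / project favourite photographs"
```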
AC
Affective computing
emotion interactions with and through computers
speech emotion recognition
Facial Expression recognition
body expression perception and recognition
multimodal affect recognition techniques based on physiological computing and data fusion (fusion sketch below)
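A minimal sketch of decision-level fusion, one common way to combine modalities; the modality set, emotion labels, and weights are illustrative assumptions rather than a specific published method:

```python
# Minimal sketch of decision-level fusion for multimodal affect
# recognition: per-modality class probabilities are combined with
# weights. Modality names, labels, and weights are illustrative.
import numpy as np

EMOTIONS = ["anger", "happiness", "sadness", "neutral"]

def fuse(prob_speech, prob_face, prob_physio, weights=(0.3, 0.5, 0.2)):
    """Weighted average of per-modality class probability vectors."""
    probs = np.stack([prob_speech, prob_face, prob_physio])
    fused = np.average(probs, axis=0, weights=weights)
    return EMOTIONS[int(fused.argmax())], fused

label, _ = fuse(np.array([0.1, 0.6, 0.2, 0.1]),      # speech model output
                np.array([0.2, 0.5, 0.1, 0.2]),      # face model output
                np.array([0.25, 0.25, 0.25, 0.25]))  # physiological model output
print(label)   # -> "happiness"
```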
mood induction procedures (MIPS)
environmental light changes [106]
emotion-evoking videos [107]
virtual reality [109]
music
Mammarella et al. 2007
Anttonen and Surakka, 2007
multimodal procedures [112–114]
multimodal procedures
Castillo et al. 2016
Zhang et al. 2014
FER
Face Expression Recognition
classifying into 7 fundamental emotions (sketch after the list):
Anger
Disgust
Fear
Happiness
Sadness
Surprise
Neutral
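A minimal sketch of a CNN classifier with the 7 output classes above, assuming 48x48 grayscale face crops (FER-2013-style input); the architecture is illustrative, not a specific published FER model:

```python
# Minimal sketch of a CNN classifier head for the 7 fundamental
# emotions, assuming 48x48 grayscale face crops; the architecture is
# an illustrative toy, not a published model.
import torch
import torch.nn as nn

EMOTIONS = ["anger", "disgust", "fear", "happiness",
            "sadness", "surprise", "neutral"]

class TinyFER(nn.Module):
    def __init__(self, n_classes=len(EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 48 -> 24
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 24 -> 12
        )
        self.classifier = nn.Linear(32 * 12 * 12, n_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = TinyFER()
logits = model(torch.randn(1, 1, 48, 48))        # one fake face crop
print(EMOTIONS[logits.argmax(dim=1).item()])     # predicted emotion label
```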
Emotional Healthcare System
by face and speech
Local Directional Position Pattern
by depth sensor video camera images
deep learning method for FER in older adults based on Stacked Denoising Auto-Encoder (sketch below)
validation / test on public age-expression data sets: FACES and Lifespan
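A minimal sketch of one denoising auto-encoder layer of the kind that gets stacked in such a method; the layer sizes and noise level are illustrative assumptions, not the published configuration:

```python
# Minimal sketch of a single denoising auto-encoder layer: the input is
# corrupted with Gaussian noise and the layer learns to reconstruct the
# clean vector. Sizes and noise level are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DenoisingAE(nn.Module):
    def __init__(self, in_dim=48 * 48, hidden=256, noise_std=0.1):
        super().__init__()
        self.noise_std = noise_std
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.decoder = nn.Linear(hidden, in_dim)

    def forward(self, x):
        noisy = x + self.noise_std * torch.randn_like(x)  # corrupt the input
        return self.decoder(self.encoder(noisy))          # reconstruct it

ae = DenoisingAE()
x = torch.rand(8, 48 * 48)          # batch of flattened face crops
loss = F.mse_loss(ae(x), x)         # target is the clean input
loss.backward()
# After layer-wise pre-training, the encoders are stacked and a softmax
# classifier on top is fine-tuned with the expression labels.
```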
capture longer-term affective states (mood)
Katsimerou et al. 2015
AR
display virtual layer on top of real world
enhance cultural heritage experiences
Noh et al. 2009
enrich gaming experience
improve education performance
increase quality of life of elderly
Yoo et al. 2013
handle chronic health conditions
McLaughlin et al. 2018
vision-based (device camera)
ICAs (Intelligent Cognitive Assistants)
provide monitoring functionalities to recognize people's activities and possibly trigger prompt assistance if needed
emotional and affective dimension
COACH - Mihailidis et al. 2008
assists in hand-washing via video input and audio instructions
relation between emotions and well-being
Malhotra et al.
virtual humans with facial expressions for emotion-aligned prompts
emotionally intelligent cognitive assistant
Affect Control Theory (ACT)
affectively intelligent artificial agents
AmI
Smart Home
Health Monitoring
social communication
provide companionship
recreation & entertainment
Computer Vision Technology
activity analysis and recognition
human pose recognition
Güler et al. 2018
CNN = Convolutional Neural Network
action recognition
interaction between individuals
fall detection
classification problem
(Fall vs. no-fall; see the sketch below)
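A minimal sketch of the binary formulation, assuming per-clip features have already been extracted from video (e.g. downward centroid velocity and bounding-box aspect-ratio change); feature names and classifier are illustrative, not a specific published detector:

```python
# Minimal sketch of fall detection cast as binary classification on
# toy per-clip features; feature semantics and classifier choice are
# illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# toy stand-in: rows = clips, columns = [centroid_vy, aspect_change]
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)     # 1 = fall, 0 = no-fall

clf = LogisticRegression().fit(X, y)
print(clf.predict([[1.5, 0.8]]))              # likely classified as a fall
```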
unusual activity (anomaly detection)
sleep monitoring
vision-based approaches
less intrusive
3 methods
infrared camera <- physiological parameters (breathing)
camera <- body movements
combination of both
traditional non-vision-based:
polysomnography
wearable bracelets
non-wearable strips or bands
sensor-based
food recognition
apps for food intake assessment and logging
! AUTOMATED ???
quantity estimation
based on custom or pretrained CNN
comparison with reference (e.g. thumb)
3D techniques coupled with template matching or shape reconstruction algorithms
leftover estimation (sketch below)
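A minimal sketch of reference-based quantity and leftover estimation, assuming the food and the reference object have already been segmented; the reference size and pixel counts are illustrative values:

```python
# Minimal sketch of reference-based quantity estimation: the pixel area
# of the segmented food is converted to real-world area using a
# reference object of known size in the same image (e.g. a coin or the
# user's thumb). All numeric values below are illustrative.
def food_area_cm2(food_pixels, ref_pixels, ref_area_cm2=4.9):
    """ref_area_cm2: known area of the reference (e.g. a 25 mm coin)."""
    cm2_per_pixel = ref_area_cm2 / ref_pixels
    return food_pixels * cm2_per_pixel

def leftover_ratio(before_pixels, after_pixels):
    """Fraction of the served portion still on the plate."""
    return after_pixels / before_pixels

print(food_area_cm2(food_pixels=30_000, ref_pixels=1_500))       # -> 98.0 cm^2
print(leftover_ratio(before_pixels=30_000, after_pixels=9_000))  # -> 0.3
```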
gesture recognition
based on depth sensor
RGB cameras
CNN
frame-based, spatio-temporal, and hand-detection-oriented approaches (frame-based sketch below)
Mueller et al. 2017
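A minimal sketch of the frame-based strategy: per-frame class scores (e.g. from a CNN over RGB or depth frames) are pooled over time to label the clip; gesture names and the score source are illustrative assumptions:

```python
# Minimal sketch of frame-based gesture recognition: per-frame class
# scores are averaged over the clip and the top class is returned.
# Gesture labels and the scoring model are illustrative assumptions.
import numpy as np

GESTURES = ["wave", "point", "thumbs_up", "none"]

def clip_label(frame_scores):
    """frame_scores: (n_frames, n_gestures) array of per-frame scores."""
    mean_scores = frame_scores.mean(axis=0)     # temporal average pooling
    return GESTURES[int(mean_scores.argmax())]

rng = np.random.default_rng(2)
scores = rng.random((30, len(GESTURES)))        # stand-in for CNN outputs
print(clip_label(scores))
```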
TO DO
Cross-fertilization - between psychologists and computer scientists
Data - specific databases for elderly
acceptance - elderly adopt technology more slowly than young people
Privacy - GDPR - especially for visual data
emotional / affective
social interaction
positive / cognitive
detect emotional state
automatically induce positive emotions
Projects
Stairway to elders