<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Human-Computer Interaction |</title><link>https://www.fabricionarcizo.com/tags/human-computer-interaction/</link><atom:link href="https://www.fabricionarcizo.com/tags/human-computer-interaction/index.xml" rel="self" type="application/rss+xml"/><description>Human-Computer Interaction</description><generator>HugoBlox Kit (https://hugoblox.com)</generator><language>en-us</language><lastBuildDate>Sat, 01 Jun 2024 00:00:00 +0000</lastBuildDate><image><url>https://www.fabricionarcizo.com/media/icon_hu_da05098ef60dc2e7.png</url><title>Human-Computer Interaction</title><link>https://www.fabricionarcizo.com/tags/human-computer-interaction/</link></image><item><title>Universal Hand Gesture Interaction Vocabulary for Cross-Cultural Users: Challenges and Approaches</title><link>https://www.fabricionarcizo.com/publications/munzlinger2024/</link><pubDate>Sat, 01 Jun 2024 00:00:00 +0000</pubDate><guid>https://www.fabricionarcizo.com/publications/munzlinger2024/</guid><description/></item><item><title>Using Machine Learning to Identify Communal Worldwide Hand Gestures for Virtual and Hybrid Meetings Context</title><link>https://www.fabricionarcizo.com/projects/gestsense/</link><pubDate>Sat, 25 Nov 2023 00:00:00 +0000</pubDate><guid>https://www.fabricionarcizo.com/projects/gestsense/</guid><description>&lt;p&gt;This project explores how machine learning can support a hand gesture vocabulary that promotes global standardization and inclusivity. It investigates hand gesture recognition technology that allows users to communicate and control devices using natural, intuitive hand movements without touching anything. This technology can enhance user experience, safety, hygiene, and accessibility, especially for companies with international employees.&lt;/p&gt;
&lt;p&gt;The project has three main objectives:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Investigate how users from diverse backgrounds use hand gestures in virtual and hybrid meetings and which hand gestures they prefer for specific actions.&lt;/li&gt;
&lt;li&gt;Train machine learning models to identify the most common and consistent hand gestures among cross-cultural users for controlling a given function in the interactive system or device.&lt;/li&gt;
&lt;li&gt;Propose a universal hand gesture dictionary that can support a global standardization for new collaboration products and systems that use this technology and foster understanding and well-being among users who work with international teams.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Hand gesture recognition technology has huge potential in business meetings and collaboration products, as demand for meetings and e-learning is growing worldwide due to changes in work and study modes. Many industries are adopting this innovative resource, and some applications have already been launched, including the Zoom platform and certain collaborative business cameras. However, there is still room for improvement and innovation, as there is no shared standard vocabulary for hand gestures, and some gestures may have different or offensive meanings in different cultures. Therefore, it is important to consider the cultural significance of gestures and to create a conscious, communal vocabulary that is universally understood and accepted.&lt;/p&gt;
&lt;p&gt;To create hand gesture recognition products that can be used by global users from diverse backgrounds, it is not enough to ensure a high recognition rate alone. These products also need to provide a positive user experience, avoiding any embarrassment, misunderstanding, or offense that may discourage users from using the technology. Therefore, there is a need for a standardized hand gesture vocabulary that can achieve universal understanding, inclusivity, and acceptability. By conducting cross-cultural user studies, a hand gesture vocabulary can be carefully constructed to suit the needs and preferences of users from different cultures. This can increase consumer confidence and the market potential of the products, as well as improve the state of the art in hand gesture recognition for virtual and hybrid meeting contexts.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;This project is conducted by an Industrial Ph.D. student, , who has received a grant of DKK 2.0 million from and . Elizabete works with the Video Technology team at , a leading company in collaboration business products, and studies at the
, a renowned institution for research and education in information technology.&lt;/strong&gt;&lt;/p&gt;</description></item><item><title>A Self-Employed Taxpayer Experimental Study on Trust, Power, and Tax Compliance in Eleven Countries</title><link>https://www.fabricionarcizo.com/publications/batrancea2022/</link><pubDate>Wed, 23 Nov 2022 00:00:00 +0000</pubDate><guid>https://www.fabricionarcizo.com/publications/batrancea2022/</guid><description/></item><item><title>Using Eye/Gaze Tracking (With Narrator) to Improve Reading Ease, Speed, and Comprehension</title><link>https://www.fabricionarcizo.com/supervisions/falden2022/</link><pubDate>Wed, 17 Aug 2022 00:00:00 +0000</pubDate><guid>https://www.fabricionarcizo.com/supervisions/falden2022/</guid><description>&lt;h3 id="abstract"&gt;&lt;strong&gt;Abstract&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;This bachelor's project covers eye tracking and how it can be used as an interface to Text-to-Speech narration, with the end goal of providing assistance to those who struggle with reading, such as people with reading-related learning disorders like dyslexia.&lt;/p&gt;
&lt;p&gt;Most estimates place the prevalence of reading-related learning disorders between 5% and 20%, accounting for a substantial number of people who struggle with reading. Given how integral reading is to our societies, it makes sense to develop assistance for those who struggle.&lt;/p&gt;
&lt;p&gt;The project goes into detail about how the different technologies can interface to increase the interactivity of narration software, especially during narration.&lt;/p&gt;
&lt;p&gt;The results indicate that the small sample size and bias within the data yield data of too low quality to support any conclusions regarding the hypothesis.&lt;/p&gt;
&lt;p&gt;This exact field of research has not been directly explored, though its individual aspects have been. In the end, this project attempts to unify and explore many ideas and cannot reach a definitive resolution.&lt;/p&gt;</description></item><item><title>Opportunities with Hand Gesture Technology in Mobile Gaming</title><link>https://www.fabricionarcizo.com/supervisions/christensen2021/</link><pubDate>Mon, 23 Aug 2021 00:00:00 +0000</pubDate><guid>https://www.fabricionarcizo.com/supervisions/christensen2021/</guid><description>&lt;h3 id="abstract"&gt;&lt;strong&gt;Abstract&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;During the past decade, there have been steady developments in the area of computer-vision-based hand-gesture recognition (HGR) technologies, and an expansion of the environments in which they are available. Hand-gesture input, combined with head-mounted displays, has become the principal interaction method in virtual reality games. It also shows promise in other areas, such as sign-language recognition, interactive museum exhibitions, and interactive displays in public spaces. This paper explores the possible introduction of HGR-based interaction in mobile games, based on the identification of key concepts in the literature examining the aforementioned areas. The result is a proposal of four general heuristics guiding the design and development of mobile games that use HGR as the primary interaction method.&lt;/p&gt;</description></item><item><title>Sistematização de Revisões Bibliográficas em Pesquisas da Área de IHC</title><link>https://www.fabricionarcizo.com/publications/munzlinger2012/</link><pubDate>Mon, 05 Nov 2012 00:00:00 +0000</pubDate><guid>https://www.fabricionarcizo.com/publications/munzlinger2012/</guid><description/></item></channel></rss>