{{DISPLAYTITLE:Main page}}


One of the tasks of the H2020 coordination and support action project XR4ALL was to create and develop 


* a landscape research report on interactive eXtended Reality technologies
* a research agenda for the upcoming 3-5 years and beyond.


The final versions of both reports were provided to the community in December 2020. They describe the current state and the advances in technology, applications, markets, obstacles and challenges up to that date. However, the domain of interactive and eXtended Realities is a highly dynamic and rapidly evolving research and technology area.

Therefore, XR4ALL decided to turn both documents into living documents by using a wiki. The community working on interactive and eXtended Reality technologies is invited to further complete and edit the two documents in order to keep them up to date and relevant, and to endorse them for long-term vision development.

The content can be accessed here:
*[[Landscape Research Report|'''Landscape Research Report''']]
*[[Research Agenda|'''Research Agenda''']]

If you wish to contribute and further edit the content on the wiki, provide us with a short motivation and the topic you wish to add to the wiki. <br>
Send an email by clicking [mailto:oliver.schreer@hhi.fraunhofer.de?subject=XR4ALL-WikiRegistration&body=message%20goes%20here here].

== The scope of eXtended Reality ==
Paul Milgram defined the well-known Reality-Virtuality Continuum in 1994 <ref>[1] P. Milgram, H. Takemura, A. Utsumi, and F. Kishino, "Augmented Reality: A class of displays on the reality-virtuality continuum", Proc. SPIE vol. 2351, Telemanipulator and Telepresence Technologies, pp. 2351–34, 1994.</ref>. It describes the transition between reality on the one hand and a completely digital, computer-generated environment on the other. From a technology point of view, a new umbrella term has been introduced: eXtended Reality (XR). It is the umbrella term for Virtual Reality (VR), Augmented Reality (AR) and Mixed Reality (MR), as well as all future realities such technologies might bring. XR covers the full spectrum of real and virtual environments. In Figure 1, the Reality-Virtuality Continuum is extended by this new umbrella term. As seen in the figure, a less-known term is shown, called Augmented Virtuality. It refers to an approach in which elements of reality, e.g. the user’s hand, appear in the virtual world; this is usually referred to as Mixed Reality.
<br>


[[File:Mixerealitycontinuum v2.jpg|center|thumb|alt=|Figure 1: Extended reality scheme]]


Following the most common terminology, the three major scenarios of extended reality are defined as follows.

Starting from left to right, Augmented Reality (AR) consists of augmenting the perception of the real environment with virtual elements by mixing spatially registered digital content with the real world in real time <ref>[2] Ronald T. Azuma, “A Survey of Augmented Reality”, Presence: Teleoperators and Virtual Environments, vol. 6, issue 4, pp. 355-385, 1997.</ref>. Pokémon Go and Snapchat filters are commonplace examples of this kind of technology used on smartphones or tablets. AR is also widely used in the industry sector, where workers can wear AR glasses to get support during maintenance, or for training.
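
To make the notion of spatially registered content more concrete, the following minimal Python sketch projects a virtual 3D anchor into the pixel coordinates of the current camera frame, which is what keeps an AR overlay locked to the real scene. It is purely illustrative; the intrinsics, pose and anchor values are invented for this example.

<syntaxhighlight lang="python">
import numpy as np

# Invented example values: pinhole camera intrinsics (focal lengths and
# principal point, in pixels) and a camera pose as estimated by a tracker.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)        # identity rotation: camera axes aligned with the world axes
t = np.zeros(3)      # camera located at the world origin


def project_anchor(p_world, K, R, t):
    """Project a 3D anchor point (world coordinates, metres) to pixel coordinates."""
    p_cam = R @ p_world + t      # world frame -> camera frame
    u, v, w = K @ p_cam          # pinhole projection
    return u / w, v / w          # normalise by depth


# A virtual object anchored 2 m in front of the camera, slightly to the right.
anchor = np.array([0.2, 0.0, 2.0])
print(project_anchor(anchor, K, R, t))   # pixel position where the overlay is drawn
</syntaxhighlight>

In a real AR pipeline, the rotation R and translation t would be re-estimated every frame by the tracking system, so the digital content stays registered to the real world as the device moves.
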
Augmented Virtuality (AV) consists of augmenting the perception of a virtual environment with real elements. These elements of the real world are generally captured in real time and injected into the virtual environment. Capturing the user’s body and injecting it into the virtual environment is a well-known example of AV, aimed at improving the feeling of embodiment.

Virtual Reality (VR) applications use headsets to fully immerse users in a computer-simulated reality. These headsets generate realistic images and sounds, engaging two senses to create an interactive virtual world.

Mixed Reality (MR) includes both AR and AV. It blends real and virtual worlds to create complex environments in which physical and digital elements can interact in real time. It is defined as a continuum between the real and the virtual environments but excludes both of them.
An important question to answer is how broadly the term eXtended Reality (XR) spans across technologies and application domains. XR could be considered a fusion of AR, AV, and VR technologies, but in fact it involves many more technology domains. The necessary domains range from sensing the world (image, video, sound, haptics) through processing the data to rendering. In addition, hardware is involved to sense, capture, track, register, display, and much more.

In Figure 2, a simplified schematic diagram of an eXtended Reality system is presented. On the left-hand side, the user is performing a task by using an XR application; in section 5, a complete overview of all the relevant application domains is given, covering advertisement, cultural heritage, education and training, Industry 4.0, health and medicine, security, journalism, social VR and tourism. The user interacts with the scene, and this interaction is captured with a range of input devices and sensors, which can be visual, audio, motion, and many more (see sec. 4.1 and 4.2). The acquired data serves as input for the XR hardware, where the necessary processing is performed in the render engine (see sec. 4.7): for example, the correct viewpoint is rendered or the desired interaction with the scene is triggered. In sec. 4.3 and 4.4, an overview of the major algorithms and approaches is given. However, the render engine uses not only captured data but also additional data from other sources, such as edge cloud servers (see sec. 4.8) or 3D data available on the device itself. The rendered scene is then fed back to the user, allowing them to perceive the scene. This is achieved by various means such as XR headsets, other types of displays and other sensory stimuli.
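
As a rough, hypothetical illustration of the capture-process-render loop described above, the following Python sketch shows how sensor input and additional scene data could be combined in a render engine and fed back to a display each frame. The class and function names are invented for this example and do not correspond to any specific XR runtime.

<syntaxhighlight lang="python">
from dataclasses import dataclass

# Hypothetical placeholder types; a real XR runtime would fill these from
# cameras, microphones, IMUs, controllers, etc.
@dataclass
class SensorFrame:
    images: list          # visual input
    audio: bytes          # audio input
    motion: tuple         # e.g. head pose from a tracker

@dataclass
class SceneData:
    assets: list          # 3D content from the device or an edge/cloud server


def capture_sensors() -> SensorFrame:
    """Stand-in for the input devices and sensors (sec. 4.1 and 4.2)."""
    return SensorFrame(images=[], audio=b"", motion=(0.0, 0.0, 0.0))


def fetch_additional_data() -> SceneData:
    """Stand-in for 3D data held on the device or fetched from an edge cloud server (sec. 4.8)."""
    return SceneData(assets=["virtual_object"])


def render(frame: SensorFrame, scene: SceneData) -> dict:
    """Stand-in for the render engine (sec. 4.7): render the correct viewpoint
    and trigger the interaction requested by the user."""
    return {"viewpoint": frame.motion, "drawn_assets": scene.assets}


def present(rendered: dict) -> None:
    """Stand-in for the XR headset or other display feeding the scene back to the user."""
    print("presenting", rendered)


def xr_frame() -> None:
    """One iteration of the simplified XR loop shown in Figure 2."""
    frame = capture_sensors()         # user interaction is captured
    scene = fetch_additional_data()   # additional data from device or edge cloud
    present(render(frame, scene))     # rendered scene is fed back to the user


xr_frame()
</syntaxhighlight>
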
[[File:XRSystem.jpg|center|thumb|alt=|Figure 2: Major components of an eXtended Reality system]]
The complete set of technologies and applications will be described in the following chapters.

[[File:Flag of Europe.jpg|left|thumb|150x150px]]
The XR4ALL project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 825545.
==Notes==
<references />


 