<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
		<id>https://www.old.web3d.org/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Gjkim</id>
		<title>Web3D.org - User contributions [en]</title>
		<link rel="self" type="application/atom+xml" href="https://www.old.web3d.org/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Gjkim"/>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php/Special:Contributions/Gjkim"/>
		<updated>2026-04-15T03:02:57Z</updated>
		<subtitle>User contributions</subtitle>
		<generator>MediaWiki 1.25.1</generator>

	<entry>
		<id>https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3998</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3998"/>
				<updated>2011-06-13T02:05:40Z</updated>
		
		<summary type="html">&lt;p&gt;Gjkim: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Calendar =&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
* Our next public teleconference is  PDT 17:00 Jun 1 (Wed) / KST 09:00 Jun 2 (Thu) / CEDT 02:00 Jun 2 (Thu), 2011. (with Korea Chapter)&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting: 17:00-18:00 (Pacific time)/20:00-21:00 (Eastern time)  on 1st Wednesday, which is 09:00-10:00 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting: 17:00-18:00 (Pacific time)/20:00-21:00 (Eastern time)  on 3rd Wednesday, which is 09:00-10:00 (Korea time) on 3rd Thursday.&lt;br /&gt;
&lt;br /&gt;
== Events ==&lt;br /&gt;
* [http://www.perey.com/ARStandards/third-international-ar-standards-meeting/ Third International AR Standards Meeting, June 15-17, Taichung, Taiwan]&lt;br /&gt;
* [http://web3d2011.org/ Augmented/Mixed Reality Workshop at Web3D Conference, 10:50AM-12:10PM Local Time,  June 21, Paris, France]&lt;br /&gt;
* [http://www.siggraph.org/s2011/for_attendees/birds-feather Augmented and Mixed Reality Web3D BOF at SIGGRAPH 2011, August 2011, Vancouver, Canada]&lt;br /&gt;
* SC24 Augmented and Mixed Reality Study Group Meeting  @ SC24 Plenary and Working Group Meetings, August 21 2011, Rapid City, South Dakota, USA&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality (AR) Working Group focuses on utilizing and extending X3D capabilities to support augmented reality (AR) and mixed reality (MR) applications.&lt;br /&gt;
&lt;br /&gt;
''Discussion.''  These X3D AR discussions were initially held as part of a special interest group. Now that we have determined that sufficient interest exists to modify the X3D Specification, this effort is moving forward by forming the X3D AR Working Group.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of the AR WG include:&lt;br /&gt;
* Collect requirements and describe typical use cases for using X3D in AR/MR applications&lt;br /&gt;
* Produce and propose X3D components for AR/MR scenes and applications&lt;br /&gt;
* Produce sample AR/MR applications using X3D to demonstrate how this functionality can work correctly&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
* Investigate state-of-the-art technologies related to X3D and AR/MR to collect requirements and use cases&lt;br /&gt;
** Archive and distribute collected requirements and use cases through AR WG wiki page&lt;br /&gt;
* Hold regular meetings and workshops to motivate discussions and editing of AR/MR related X3D specification proposals&lt;br /&gt;
** Regular meetings will be held through teleconferencing  and workshops will be planned through regular meetings&lt;br /&gt;
* Promote X3D in AR/MR field by developing and distributing promotional materials of AR applications based on X3D&lt;br /&gt;
** Promotional materials include sample applications, video clips, documents, images distributed on the web&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR applications - summer 2011&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - summer 2011&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification&lt;br /&gt;
** once use cases and requirements are stated, existing and new proposals for X3D functionality can be compared&lt;br /&gt;
* Define specification prose for new functionality and encodings&lt;br /&gt;
* Sample AR/MR applications with X3D&lt;br /&gt;
** these will be produced in support of each proposal&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
* Myeongwon Lee&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
Regular meetings are held twice-monthly through teleconference.&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  &lt;br /&gt;
Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
Meeting agenda and minutes are announced through the X3D WG mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= History and Background Information =&lt;br /&gt;
The Web3D Consortium formed a special interest group on AR initiatives in July 2009, which worked to help create the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
Several Web3D Consortium member projects showcase the feasibility of AR in X3D, particularly [http://www.x3dom.org X3DOM] open source produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out-of-the-box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5, which holds good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
The Web3D Consortium is also engaged with the [http://www.arstandards.org AR Standards], [http://www.opengeospatial.org OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations in applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible range of comments and reactions from the community can be considered. Feedback from this community will help X3D quickly and stably adopt new technologies, providing ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights to propose significant new technology and to consider patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
[http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf X3D Mobile Profile slideset]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop is also linked,&lt;br /&gt;
showing how Mobile, HTML5 and possibly Augmented Reality (AR)&lt;br /&gt;
components can be aligned together.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membership/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Existing Proposals =&lt;br /&gt;
&lt;br /&gt;
== Instant Reality ==&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full node documentation can be found in the [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality also exist. They describe, for example, specific nodes like the IOSensor node for retrieving the camera streams and the tracking results of the vision subsystem, the new PolygonBackground node for displaying the camera images behind the virtual objects, and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In addition, several papers on AR and MR visualization have already been published at Web3D conferences. For example, occlusions, shadows and lighting in MR scenes were discussed in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Moreover, some further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), especially section 6.4, of the following PhD thesis (by Y. Jung): [http://tuprints.ulb.tu-darmstadt.de/2489/ PDF].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The screenshots below show several issues in MR visualization.&lt;br /&gt;
From top left to bottom right: (a) real image of a room; (b) real scene augmented with a virtual character (note that the character appears to be in front of the table); (c) augmentation with additional occlusion handling (note that the character still seems to float on the floor); (d) augmentation with occlusion and shadows (applied via differential rendering).&lt;br /&gt;
&lt;br /&gt;
[[image:Kaiser140.png|600px|MR visualization]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In the following, an example for achieving occlusion effects between real and virtual objects in AR/MR scenes is shown, given that the (real) 3D object for which occlusions should be handled already exists as a 3D model (given as a Shape in this example). Here, the invisible ghosting objects (denoting real scene geometry) are simply created by rendering them ''before'' the virtual objects (by setting the Appearance node's &amp;quot;sortKey&amp;quot; field to '-1') without writing any color values to the framebuffer (via the ColorMaskMode node), thereby initially stamping out the depth buffer.&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance sortKey='-1'&amp;gt;&lt;br /&gt;
      &amp;lt;ColorMaskMode maskR='false' maskG='false' maskB='false' maskA='false'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    ...&lt;br /&gt;
  &amp;lt;/Shape&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To set the camera's image as the background we use the aforementioned PolygonBackground node. Setting its &amp;quot;fixedImageSize&amp;quot; field defines the aspect ratio of the image. Depending on how you want the background image to fit into the window, set the mode field to 'VERTICAL' or 'HORIZONTAL'.&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;PolygonBackground fixedImageSize='640,480' mode='VERTICAL'&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
      &amp;lt;PixelTexture2D DEF='tex'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
  &amp;lt;/PolygonBackground&amp;gt; &lt;br /&gt;
&lt;br /&gt;
As mentioned above, more on that can be found in the corresponding tutorials, e.g. [http://doc.instantreality.org/tutorial/marker-tracking/ here].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Korean Chapter ==&lt;br /&gt;
&lt;br /&gt;
The Korea Chapter has been keenly interested in many aspects of augmented reality standardization, including AR-based content.  This interest is especially due to the recent world-wide appearance of mobile AR services and the realization (in both academia and industry) of a definite need for exchanging service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim from Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of Gwangju Inst. of Science and Tech.   &lt;br /&gt;
* [http://dxp.korea.ac.kr/AR_standards/AR_standards.zip A zipped file containing various Korean proposals].&lt;br /&gt;
&lt;br /&gt;
   &lt;br /&gt;
* [http://dxp.korea.ac.kr/AR_standards/workshop-2011.pdf Gerry Kim's survey presented at the AR Standards Meeting in Taiwan (2011 Jun)].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also try to highlight each proposal's distinctions with regard to the others, not in a critical sense, but as a way to suggest alternatives.&lt;br /&gt;
&lt;br /&gt;
(1) Gerry Kim's proposal can be highlighted by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied using the &amp;quot;routes&amp;quot; (e.g. virtual objects' parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
- Below is an example construct, a simple extension of the &amp;quot;VisibilitySensor&amp;quot; attached to a marker.  The rough semantics would be to attach a sphere to a marker when visible.  The visibility would be determined by the browser using a particular tracker.  In this simple case, a simple marker description is given through the &amp;quot;Marker&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Scene&amp;gt;&lt;br /&gt;
 &amp;lt;Group&amp;gt;&lt;br /&gt;
  &amp;lt;Marker DEF='HIRO' enabled='true' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
  &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='true'/&amp;gt;&lt;br /&gt;
  &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
   &amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
     &amp;lt;Material/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
   &amp;lt;/Shape&amp;gt;&lt;br /&gt;
  &amp;lt;/Transform&amp;gt;&lt;br /&gt;
 &amp;lt;/Group&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt;&lt;br /&gt;
 &amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
- Different types of sensors can be newly defined, or old ones extended, to describe various AR contents.  These include proximity sensors, range sensors, etc.&lt;br /&gt;
&lt;br /&gt;
- Different physical object descriptions will be needed at the right level of abstraction (such as the &amp;quot;Marker&amp;quot; node in the above example).  These include descriptions for image patches, 3D objects, GPS locations, natural features (e.g. points, lines), etc.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle video backgrounds for video see-through AR implementations.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing the video capture or vision based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background would be routed from the &amp;quot;LiveCam&amp;quot; node and be supplied with the video image and/or camera parameters.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual viewpoint to accommodate more detailed camera parameters and to be set according to the parameters of the &amp;quot;LiveCam&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
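The fragment below sketches how these pieces might fit together. It is purely illustrative: the &amp;quot;LiveCam&amp;quot; node and all field names here are hypothetical, drawn from the proposal rather than from the X3D specification.&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;LiveCam DEF='CAM'/&amp;gt;&lt;br /&gt;
  &amp;lt;TextureBackground&amp;gt;&lt;br /&gt;
    &amp;lt;PixelTexture DEF='VIDEO'/&amp;gt;&lt;br /&gt;
  &amp;lt;/TextureBackground&amp;gt;&lt;br /&gt;
  &amp;lt;Viewpoint DEF='VIEW'/&amp;gt;&lt;br /&gt;
  &amp;lt;ROUTE fromNode='CAM' fromField='image' toNode='VIDEO' toField='set_image'/&amp;gt;&lt;br /&gt;
  &amp;lt;ROUTE fromNode='CAM' fromField='cameraParameters' toNode='VIEW' toField='set_cameraParameters'/&amp;gt;&lt;br /&gt;
&lt;br /&gt;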
[http://web3d.org/x3d/wiki/images/7/7f/20101216-MR-Web3D-SiggraphAsia-TeckTalk-GunLee.pdf Slides from Web3D Tech Talk at SIGGRAPH Asia 2010]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
- Woo proposes to use XML as meta descriptors, together with existing standards (e.g. X3D, Collada, etc.), for describing the augmentation information itself.&lt;br /&gt;
&lt;br /&gt;
- As for the context (condition) for augmentation, a clear specification following a &amp;quot;5W&amp;quot; approach is proposed: namely who, when, where, what and how.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;who&amp;quot; part specifies the owner/author of the content.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;when&amp;quot; part specifies the content creation time.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;where&amp;quot; part specifies the location of the physical object to which an augmentation is attached.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;what&amp;quot; part specifies the content to be augmented (the augmentation information).&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;how&amp;quot; part specifies the dynamic part (behavior) of the content. &lt;br /&gt;
&lt;br /&gt;
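A hypothetical descriptor following this 5W scheme (all element names here are purely illustrative, not taken from any specification) might look like:&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;Augmentation&amp;gt;&lt;br /&gt;
    &amp;lt;Who&amp;gt;content author/owner&amp;lt;/Who&amp;gt;&lt;br /&gt;
    &amp;lt;When&amp;gt;2011-06-13&amp;lt;/When&amp;gt;&lt;br /&gt;
    &amp;lt;Where&amp;gt;reference to the physical object or location&amp;lt;/Where&amp;gt;&lt;br /&gt;
    &amp;lt;What&amp;gt;link to the X3D or Collada content to be augmented&amp;lt;/What&amp;gt;&lt;br /&gt;
    &amp;lt;How&amp;gt;behavior description for the content&amp;lt;/How&amp;gt;&lt;br /&gt;
  &amp;lt;/Augmentation&amp;gt;&lt;br /&gt;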
&lt;br /&gt;
&lt;br /&gt;
== X3D Earth Working Group ==&lt;br /&gt;
&lt;br /&gt;
The X3D Earth Working Group has expanded and refined a proposal by Dr. Myeong Won Lee for a new &lt;br /&gt;
[http://www.web3d.org/membership/login/memberwiki/index.php/X3D_v3.3_Specification_Changes#GpsSensor_node GpsSensor node].&lt;br /&gt;
Due to several overlapping technical issues, the group has asked to collaborate with the Augmented Reality group on final design for this node.&lt;br /&gt;
&lt;br /&gt;
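A purely illustrative sketch of how such a node might be used follows; the field names here are hypothetical, pending the final design of the GpsSensor node.&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;GpsSensor DEF='GPS' enabled='true'/&amp;gt;&lt;br /&gt;
  &amp;lt;Transform DEF='GEOLOCATED'&amp;gt; ... &amp;lt;/Transform&amp;gt;&lt;br /&gt;
  &amp;lt;ROUTE fromNode='GPS' fromField='position_changed' toNode='GEOLOCATED' toField='set_translation'/&amp;gt;&lt;br /&gt;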
&lt;br /&gt;
= References =&lt;br /&gt;
&lt;br /&gt;
TODO&lt;br /&gt;
&lt;br /&gt;
= Participation and Liaisons =&lt;br /&gt;
&lt;br /&gt;
* TODO describe Christine Perey's group on AR Standardization&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
Of note, the Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty-free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Gjkim</name></author>	</entry>

	<entry>
		<id>https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3997</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3997"/>
				<updated>2011-06-13T01:31:41Z</updated>
		
		<summary type="html">&lt;p&gt;Gjkim: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Calendar =&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
* Our next public teleconference is  PDT 17:00 Jun 1 (Wed) / KST 09:00 Jun 2 (Thu) / CEDT 02:00 Jun 2 (Thu), 2011. (with Korea Chapter)&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting: 17:00-18:00 (Pacific time)/20:00-21:00 (Eastern time)  on 1st Wednesday, which is 09:00-10:00 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting: 17:00-18:00 (Pacific time)/20:00-21:00 (Eastern time)  on 3rd Wednesday, which is 09:00-10:00 (Korea time) on 3rd Thursday.&lt;br /&gt;
&lt;br /&gt;
== Events ==&lt;br /&gt;
* [http://www.perey.com/ARStandards/third-international-ar-standards-meeting/ Third International AR Standards Meeting, June 15-17, Taichung, Taiwan]&lt;br /&gt;
* [http://web3d2011.org/ Augmented/Mixed Reality Workshop at Web3D Conference, 10:50AM-12:10PM Local Time,  June 21, Paris, France]&lt;br /&gt;
* [http://www.siggraph.org/s2011/for_attendees/birds-feather Augmented and Mixed Reality Web3D BOF at SIGGRAPH 2011, August 2011, Vancouver, Canada]&lt;br /&gt;
* SC24 Augmented and Mixed Reality Study Group Meeting  @ SC24 Plenary and Working Group Meetings, August 21 2011, Rapid City, South Dakota, USA&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality (AR) Working Group focuses on utilizing and extending X3D capabilities to support augmented reality (AR) and mixed reality (MR) applications.&lt;br /&gt;
&lt;br /&gt;
''Discussion.''  These X3D AR discussions were initially held as part of a special interest group. Now that we have determined that sufficient interest exists to modify the X3D Specification, this effort is moving forward by forming the X3D AR Working Group.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of the AR WG include:&lt;br /&gt;
* Collect requirements and describe typical use cases for using X3D in AR/MR applications&lt;br /&gt;
* Produce and propose X3D components for AR/MR scenes and applications&lt;br /&gt;
* Produce sample AR/MR applications using X3D to demonstrate how this functionality can work correctly&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
* Investigate state-of-the-art technologies related to X3D and AR/MR to collect requirements and use cases&lt;br /&gt;
** Archive and distribute collected requirements and use cases through AR WG wiki page&lt;br /&gt;
* Hold regular meetings and workshops to motivate discussions and editing of AR/MR related X3D specification proposals&lt;br /&gt;
** Regular meetings will be held through teleconferencing  and workshops will be planned through regular meetings&lt;br /&gt;
* Promote X3D in AR/MR field by developing and distributing promotional materials of AR applications based on X3D&lt;br /&gt;
** Promotional materials include sample applications, video clips, documents, images distributed on the web&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR applications - summer 2011&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - summer 2011&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification&lt;br /&gt;
** once use cases and requirements are stated, existing and new proposals for X3D functionality can be compared&lt;br /&gt;
* Define specification prose for new functionality and encodings&lt;br /&gt;
* Sample AR/MR applications with X3D&lt;br /&gt;
** these will be produced in support of each proposal&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
* Myeongwon Lee&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
Regular meetings are held twice-monthly through teleconference.&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  &lt;br /&gt;
Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
Meeting agenda and minutes are announced through the X3D WG mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= History and Background Information =&lt;br /&gt;
The Web3D Consortium formed a special interest group on AR initiatives in July 2009, which worked to help create the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
Several Web3D Consortium member projects showcase the feasibility of AR in X3D, particularly [http://www.x3dom.org X3DOM] open source produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out-of-the-box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5, which holds good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
The Web3D Consortium is also engaged with the [http://www.arstandards.org AR Standards], [http://www.opengeospatial.org OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations in applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible range of comments and reactions from the community can be considered. Feedback from this community will help X3D quickly and stably adopt new technologies, providing ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights to propose significant new technology and to consider patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
[http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf X3D Mobile Profile slideset]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop is also linked,&lt;br /&gt;
showing how Mobile, HTML5 and possibly Augmented Reality (AR)&lt;br /&gt;
components can be aligned together.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membership/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Existing Proposals =&lt;br /&gt;
&lt;br /&gt;
== Instant Reality ==&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full node documentation can be found in the [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality also exist. They describe, for example, specific nodes like the IOSensor node for retrieving the camera streams and the tracking results of the vision subsystem, the new PolygonBackground node for displaying the camera images behind the virtual objects, and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In addition, several papers on AR and MR visualization have already been published at Web3D conferences. For example, occlusions, shadows and lighting in MR scenes were discussed in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Moreover, some further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), especially section 6.4, of the following PhD thesis (by Y. Jung): [http://tuprints.ulb.tu-darmstadt.de/2489/ PDF].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The screenshots below show several issues in MR visualization.&lt;br /&gt;
From top left to bottom right: (a) real image of a room; (b) real scene augmented with a virtual character (note that the character appears to be in front of the table); (c) augmentation with additional occlusion handling (note that the character still seems to float on the floor); (d) augmentation with occlusion and shadows (applied via differential rendering).&lt;br /&gt;
&lt;br /&gt;
[[image:Kaiser140.png|600px|MR visualization]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In the following, an example for achieving occlusion effects between real and virtual objects in AR/MR scenes is shown, given that the (real) 3D object for which occlusions should be handled already exists as a 3D model (given as a Shape in this example). Here, the invisible ghosting objects (denoting real scene geometry) are simply created by rendering them ''before'' the virtual objects (by setting the Appearance node's &amp;quot;sortKey&amp;quot; field to '-1') without writing any color values to the framebuffer (via the ColorMaskMode node), thereby initially stamping out the depth buffer.&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance sortKey='-1'&amp;gt;&lt;br /&gt;
      &amp;lt;ColorMaskMode maskR='false' maskG='false' maskB='false' maskA='false'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    ...&lt;br /&gt;
  &amp;lt;/Shape&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To set the camera's image as the background we use the aforementioned PolygonBackground node. Setting its &amp;quot;fixedImageSize&amp;quot; field defines the aspect ratio of the image. Depending on how you want the background image to fit into the window, set the mode field to 'VERTICAL' or 'HORIZONTAL'.&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;PolygonBackground fixedImageSize='640,480' mode='VERTICAL'&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
      &amp;lt;PixelTexture2D DEF='tex'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
  &amp;lt;/PolygonBackground&amp;gt; &lt;br /&gt;
&lt;br /&gt;
As mentioned above, more on that can be found in the corresponding tutorials, e.g. [http://doc.instantreality.org/tutorial/marker-tracking/ here].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Korean Chapter ==&lt;br /&gt;
&lt;br /&gt;
The Korea Chapter has been keenly interested in many aspects of augmented reality standardization, including AR-based content.  This interest is especially due to the recent world-wide appearance of mobile AR services and the realization (in both academia and industry) of a definite need for exchanging service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim from Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of Gwangju Inst. of Science and Tech.   &lt;br /&gt;
* [http://dxp.korea.ac.kr/AR_standards/AR_standards.zip A zipped file containing various Korean proposals].&lt;br /&gt;
&lt;br /&gt;
   &lt;br /&gt;
* [http://dxp.korea.ac.kr/AR_standards/ARcontents_survey_2011 Gerry Kim's survey presented at the AR Standards Meeting in Taiwan (2011 Jun)].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also highlight each proposal's distinctions with regard to the others, not as criticism but as a way to suggest alternatives.&lt;br /&gt;
&lt;br /&gt;
(1) Gerry Kim's proposal can be highlighted by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied together using &amp;quot;routes&amp;quot; (e.g. a virtual object's parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
- Below is an example construct, a simple extension of the &amp;quot;VisibilitySensor&amp;quot; attached to a marker.  The rough semantics: attach a sphere to a marker whenever the marker is visible.  The visibility would be determined by the browser using a particular tracker.  In this simple case, a simple marker description is given through the &amp;quot;Marker&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Scene&amp;gt;&lt;br /&gt;
 &amp;lt;Group&amp;gt;&lt;br /&gt;
  &amp;lt;Marker DEF='HIRO' enabled='true' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
  &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='true'/&amp;gt;&lt;br /&gt;
  &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
   &amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
     &amp;lt;Material/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
   &amp;lt;/Shape&amp;gt;&lt;br /&gt;
  &amp;lt;/Transform&amp;gt;&lt;br /&gt;
 &amp;lt;/Group&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt; &lt;br /&gt;
 &amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
- Different types of sensors can be newly defined, or old ones extended, to describe various AR content.  These include proximity sensors, range sensors, etc.&lt;br /&gt;
&lt;br /&gt;
- Different physical object descriptions will be needed at the right level of abstraction (such as the &amp;quot;Marker&amp;quot; node in the above example).  These include descriptions for image patches, 3D objects, GPS locations, natural features (e.g. points, lines), etc.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle video backgrounds for video see-through AR implementations.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing the video capture or vision-based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background would be supplied with the video image and/or camera parameters routed from the &amp;quot;LiveCam&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual viewpoint to accommodate more detailed camera parameters, set according to the parameters of the &amp;quot;LiveCam&amp;quot;.&lt;br /&gt;
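&lt;br /&gt;
The routing idea of this proposal might be sketched as follows. Note that the &amp;quot;LiveCam&amp;quot; node and all field names shown here (&amp;quot;device&amp;quot;, &amp;quot;image_changed&amp;quot;, &amp;quot;fieldOfView_changed&amp;quot;, etc.) are hypothetical illustrations of the proposal, not part of the current X3D specification:&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;LiveCam DEF='CAM' device='0'/&amp;gt;&lt;br /&gt;
 &amp;lt;TextureBackground&amp;gt;&lt;br /&gt;
  &amp;lt;MovieTexture DEF='VIDEO' containerField='backTexture'/&amp;gt;&lt;br /&gt;
 &amp;lt;/TextureBackground&amp;gt;&lt;br /&gt;
 &amp;lt;Viewpoint DEF='VIEW'/&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='CAM' fromField='image_changed' toNode='VIDEO' toField='set_image'/&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='CAM' fromField='fieldOfView_changed' toNode='VIEW' toField='set_fieldOfView'/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here the live camera feeds both the extended MovieTexture (used as video background) and the extended Viewpoint (camera parameters), keeping the virtual view consistent with the real one.&lt;br /&gt;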
&lt;br /&gt;
[http://web3d.org/x3d/wiki/images/7/7f/20101216-MR-Web3D-SiggraphAsia-TeckTalk-GunLee.pdf Slides from Web3D Tech Talk at SIGGRAPH Asia 2010]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
- Woo proposes to use XML, as meta descriptors, together with existing standards (e.g. X3D, Collada, etc.) for describing the augmentation information itself.&lt;br /&gt;
&lt;br /&gt;
- As for the context (conditions) for augmentation, a clear specification following a &amp;quot;5W&amp;quot; approach is proposed: namely who, when, where, what and how.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;who&amp;quot; part specifies the owner/author of the content.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;when&amp;quot; part specifies the content creation time.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;where&amp;quot; part specifies the location of the physical object to which an augmentation is attached.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;what&amp;quot; part specifies the content to be augmented (the augmentation information).&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;how&amp;quot; part specifies the dynamic part (behavior) of the content. &lt;br /&gt;
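&lt;br /&gt;
As a rough illustration, such a 5W meta descriptor might look like the following XML fragment. All element and attribute names here are hypothetical; the actual schema is defined in the proposal documents linked above:&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;ARContent&amp;gt;&lt;br /&gt;
  &amp;lt;Who author='author-id'/&amp;gt;&lt;br /&gt;
  &amp;lt;When created='2011-06-01'/&amp;gt;&lt;br /&gt;
  &amp;lt;Where&amp;gt;  &amp;lt;!-- location of the physical target object --&amp;gt;&lt;br /&gt;
   &amp;lt;GPS latitude='35.2281' longitude='126.8430'/&amp;gt;&lt;br /&gt;
  &amp;lt;/Where&amp;gt;&lt;br /&gt;
  &amp;lt;What&amp;gt;  &amp;lt;!-- augmentation content, e.g. an X3D or Collada resource --&amp;gt;&lt;br /&gt;
   &amp;lt;Resource url='model.x3d'/&amp;gt;&lt;br /&gt;
  &amp;lt;/What&amp;gt;&lt;br /&gt;
  &amp;lt;How behavior='behavior.x3d'/&amp;gt;  &amp;lt;!-- dynamic part of the content --&amp;gt;&lt;br /&gt;
 &amp;lt;/ARContent&amp;gt;&lt;br /&gt;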
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== X3D Earth Working Group ==&lt;br /&gt;
&lt;br /&gt;
The X3D Earth Working Group has expanded and refined a proposal by Dr. Myeong Won Lee for a new &lt;br /&gt;
[http://www.web3d.org/membership/login/memberwiki/index.php/X3D_v3.3_Specification_Changes#GpsSensor_node GpsSensor node].&lt;br /&gt;
Due to several overlapping technical issues, the group has asked to collaborate with the Augmented Reality group on the final design of this node.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
&lt;br /&gt;
TODO&lt;br /&gt;
&lt;br /&gt;
= Participation and Liaisons =&lt;br /&gt;
&lt;br /&gt;
* TODO describe Christine Perey's group on AR Standardization&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
Notably, the Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty-free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Gjkim</name></author>	</entry>

	<entry>
		<id>https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3494</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3494"/>
				<updated>2011-04-21T03:00:45Z</updated>
		
		<summary type="html">&lt;p&gt;Gjkim: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History and Background Information ==&lt;br /&gt;
The Web3D Consortium formed a special interest group on AR initiatives in July 2009, which worked to help create the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
Several Web3D Consortium member projects showcase the feasibility of AR in X3D, particularly the open-source [http://www.x3dom.org X3DOM] framework produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out of the box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5, which holds good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospatial.org OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible comment and reaction can be considered from the Community. Feedback from this community will help X3D quickly and stably adopt new technologies to provide ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile, Web3D Members retain important membership rights to propose significant new technology and to consider patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
[http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf X3D Mobile Profile slideset]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop is also linked,&lt;br /&gt;
showing how Mobile, HTML5 and possibly Augmented Reality (AR)&lt;br /&gt;
components can be aligned together.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membership/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Calendar =&lt;br /&gt;
&lt;br /&gt;
TODO list upcoming meetings, workshops and conferences here&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Third Intl. AR Standards Meeting, Jun 15-17, Taichung, Taiwan&lt;br /&gt;
&lt;br /&gt;
[http://www.perey.com/ARStandards/third-international-ar-standards-meeting/ AR Standards Meeting @ Taiwan]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality (AR) Working Group focuses on utilizing and extending X3D capabilities to support augmented reality (AR) and mixed reality (MR) applications.&lt;br /&gt;
&lt;br /&gt;
''Discussion.''  These X3D AR discussions were initially held as part of a special interest group. Now that we have determined sufficient interest exists to modify the X3D Specification, this effort is moving forward by forming the X3D AR Working Group.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases for using X3D in AR/MR applications&lt;br /&gt;
* Develop and extend functions and nodes for X3D specification required to support AR/MR applications&lt;br /&gt;
* Produce sample AR/MR applications using X3D to demonstrate how this functionality can work correctly&lt;br /&gt;
* Propose X3D Specification changes, possibly including an AR Component and/or a Mobile Profile&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
TODO these tasks should probably be rewritten as separate distinct efforts that support the Goals&lt;br /&gt;
&lt;br /&gt;
* Investigate state-of-the-art technologies related to X3D and AR/MR&lt;br /&gt;
* Develop and extend X3D specification to support AR/MR applications&lt;br /&gt;
* Promote X3D in AR/MR field by developing and demonstrating use cases&lt;br /&gt;
** videos are particularly compelling and more informative than a 3D demo because they show the use of AR/MR in the context of the real world&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR application - summer 2011&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - summer 2011&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification&lt;br /&gt;
** once use cases and requirements are stated, we can compare existing and new proposals for X3D functionality&lt;br /&gt;
* Sample AR/MR applications with X3D&lt;br /&gt;
** these will be produced in support of each proposal&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 3rd Tuesday, which is 0910-1010 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is __________ 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
= Existing Proposals =&lt;br /&gt;
&lt;br /&gt;
== Instant Reality ==&lt;br /&gt;
&lt;br /&gt;
A Mixed Reality framework developed and maintained by Fraunhofer IGD that uses X3D as its application description language, and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is available at [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality also exist. They describe, for example, specific nodes such as the IOSensor for retrieving camera streams and the tracking results of the vision subsystem, the PolygonBackground for displaying camera images behind the virtual objects, and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, some papers on AR and MR rendering have already been published at the Web3D conferences. There, occlusions, shadows and lighting in MR scenes were discussed in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Moreover, some further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), especially section 6.4, of the following PhD thesis:&lt;br /&gt;
[http://tuprints.ulb.tu-darmstadt.de/2489/ PDF].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Korean Chapter ==&lt;br /&gt;
&lt;br /&gt;
The Korea chapter has been keenly interested in many aspects of augmented reality standardization, including AR-based content.  This interest stems from the recent worldwide emergence of mobile AR services and the realization (in both academia and industry) of a definite need for exchanging service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim from Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of Gwangju Inst. of Science and Tech.   &lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also highlight each proposal's distinctions with regard to the others, not as criticism but as a way to suggest alternatives.&lt;br /&gt;
&lt;br /&gt;
(1) Gerry Kim's proposal can be highlighted by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied together using &amp;quot;routes&amp;quot; (e.g. a virtual object's parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
- Below is an example construct, a simple extension of the &amp;quot;VisibilitySensor&amp;quot; attached to a marker.  The rough semantics: attach a sphere to a marker whenever the marker is visible.  The visibility would be determined by the browser using a particular tracker.  In this simple case, a simple marker description is given through the &amp;quot;Marker&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Scene&amp;gt;&lt;br /&gt;
 &amp;lt;Group&amp;gt;&lt;br /&gt;
  &amp;lt;Marker DEF='HIRO' enabled='true' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
  &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='true'/&amp;gt;&lt;br /&gt;
  &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
   &amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
     &amp;lt;Material/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
   &amp;lt;/Shape&amp;gt;&lt;br /&gt;
  &amp;lt;/Transform&amp;gt;&lt;br /&gt;
 &amp;lt;/Group&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt; &lt;br /&gt;
 &amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
- Different types of sensors can be newly defined, or old ones extended, to describe various AR content.  These include proximity sensors, range sensors, etc.&lt;br /&gt;
&lt;br /&gt;
- Different physical object descriptions will be needed at the right level of abstraction (such as the &amp;quot;Marker&amp;quot; node in the above example).  These include descriptions for image patches, 3D objects, GPS locations, natural features (e.g. points, lines), etc.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle video backgrounds for video see-through AR implementations.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing the video capture or vision-based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background would be supplied with the video image and/or camera parameters routed from the &amp;quot;LiveCam&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual viewpoint to accommodate more detailed camera parameters, set according to the parameters of the &amp;quot;LiveCam&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
- Woo proposes to use XML, as meta descriptors, together with existing standards (e.g. X3D, Collada, etc.) for describing the augmentation information itself.&lt;br /&gt;
&lt;br /&gt;
- As for the context (conditions) for augmentation, a clear specification following a &amp;quot;5W&amp;quot; approach is proposed: namely who, when, where, what and how.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;who&amp;quot; part specifies the owner/author of the content.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;when&amp;quot; part specifies the content creation time.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;where&amp;quot; part specifies the location of the physical object to which an augmentation is attached.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;what&amp;quot; part specifies the content to be augmented (the augmentation information).&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;how&amp;quot; part specifies the dynamic part (behavior) of the content. &lt;br /&gt;
&lt;br /&gt;
* [http://dxp.korea.ac.kr/AR_standards/AR_standards.zip A zipped file containing various Korean proposals].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== X3D Earth Working Group ==&lt;br /&gt;
&lt;br /&gt;
The X3D Earth Working Group has expanded and refined a proposal by Dr. Myeong Won Lee for a new &lt;br /&gt;
[http://www.web3d.org/membership/login/memberwiki/index.php/X3D_v3.3_Specification_Changes#GpsSensor_node GpsSensor node].&lt;br /&gt;
Due to several overlapping technical issues, the group has asked to collaborate with the Augmented Reality group on the final design of this node.&lt;br /&gt;
&lt;br /&gt;
= Participants =&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
&lt;br /&gt;
TODO&lt;br /&gt;
&lt;br /&gt;
= Participation and Liaisons =&lt;br /&gt;
&lt;br /&gt;
* TODO describe Christine Perey's group on AR Standardization&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
Notably, the Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty-free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Gjkim</name></author>	</entry>

	<entry>
		<id>https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3493</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3493"/>
				<updated>2011-04-21T02:59:02Z</updated>
		
		<summary type="html">&lt;p&gt;Gjkim: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History and Background Information ==&lt;br /&gt;
The Web3D Consortium formed a special interest group on AR initiatives in July 2009, which worked to help create the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
Several Web3D Consortium member projects showcase the feasibility of AR in X3D, particularly the open-source [http://www.x3dom.org X3DOM] framework produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out of the box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5, which holds good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospatial.org OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible comment and reaction can be considered from the Community. Feedback from this community will help X3D quickly and stably adopt new technologies to provide ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile, Web3D Members retain important membership rights to propose significant new technology and to consider patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
[http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf X3D Mobile Profile slideset]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop is also linked,&lt;br /&gt;
showing how Mobile, HTML5 and possibly Augmented Reality (AR)&lt;br /&gt;
components can be aligned together.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membership/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Calendar =&lt;br /&gt;
&lt;br /&gt;
TODO list upcoming meetings, workshops and conferences here&lt;br /&gt;
&lt;br /&gt;
Third Intl. AR Standards Meeting, Jun 15-17, Taichung, Taiwan&lt;br /&gt;
&lt;br /&gt;
[http://www.perey.com/ARStandards/third-international-ar-standards-meeting/ AR Standards Meeting @ Taiwan]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality (AR) Working Group focuses on utilizing and extending X3D capabilities to support augmented reality (AR) and mixed reality (MR) applications.&lt;br /&gt;
&lt;br /&gt;
''Discussion.''  These X3D AR discussions were initially held as part of a special interest group. Now that we have determined sufficient interest exists to modify the X3D Specification, this effort is moving forward by forming the X3D AR Working Group.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases for using X3D in AR/MR applications&lt;br /&gt;
* Develop and extend functions and nodes for X3D specification required to support AR/MR applications&lt;br /&gt;
* Produce sample AR/MR applications using X3D to demonstrate how this functionality can work correctly&lt;br /&gt;
* Propose X3D Specification changes, possibly including an AR Component and/or a Mobile Profile&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
TODO these tasks should probably be rewritten as separate distinct efforts that support the Goals&lt;br /&gt;
&lt;br /&gt;
* Investigate state-of-the-art technologies related to X3D and AR/MR&lt;br /&gt;
* Develop and extend X3D specification to support AR/MR applications&lt;br /&gt;
* Promote X3D in AR/MR field by developing and demonstrating use cases&lt;br /&gt;
** videos are particularly compelling and more informative than a 3D demo because they show the use of AR/MR in the context of the real world&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR application - summer 2011&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - summer 2011&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification&lt;br /&gt;
** once use cases and requirements are stated, we can compare existing and new proposals for X3D functionality&lt;br /&gt;
* Sample AR/MR applications with X3D&lt;br /&gt;
** these will be produced in support of each proposal&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 3rd Tuesday, which is 0910-1010 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is __________ 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
= Existing Proposals =&lt;br /&gt;
&lt;br /&gt;
== Instant Reality ==&lt;br /&gt;
&lt;br /&gt;
A Mixed Reality framework developed and maintained by Fraunhofer IGD that uses X3D as its application description language, and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is available at [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality also exist. They describe, for example, specific nodes such as the IOSensor for retrieving camera streams and the tracking results of the vision subsystem, the PolygonBackground for displaying camera images behind the virtual objects, and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, some papers on AR and MR rendering have already been published at the Web3D conferences. There, occlusions, shadows and lighting in MR scenes were discussed in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Moreover, some further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), especially section 6.4, of the following PhD thesis:&lt;br /&gt;
[http://tuprints.ulb.tu-darmstadt.de/2489/ PDF].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Korean Chapter ==&lt;br /&gt;
&lt;br /&gt;
The Korea chapter has been keenly interested in many aspects of augmented reality standardization, including AR-based content.  This interest stems from the recent worldwide emergence of mobile AR services and the realization (in both academia and industry) of a definite need for exchanging service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim from Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of Gwangju Inst. of Science and Tech.   &lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also highlight each proposal's distinctions with regard to the others, not as criticism but as a way to suggest alternatives.&lt;br /&gt;
&lt;br /&gt;
(1) Gerry Kim's proposal can be highlighted by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied together using &amp;quot;routes&amp;quot; (e.g. a virtual object's parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
- Below is an example construct, a simple extension of the &amp;quot;VisibilitySensor&amp;quot; attached to a marker.  The rough semantics: attach a sphere to a marker whenever the marker is visible.  The visibility would be determined by the browser using a particular tracker.  In this simple case, a simple marker description is given through the &amp;quot;Marker&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Scene&amp;gt;&lt;br /&gt;
  &amp;lt;Group&amp;gt;&lt;br /&gt;
   &amp;lt;Marker DEF='HIRO' enabled='true' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
   &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='true'/&amp;gt;&lt;br /&gt;
   &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
    &amp;lt;Shape&amp;gt;&lt;br /&gt;
     &amp;lt;Appearance&amp;gt;&lt;br /&gt;
      &amp;lt;Material/&amp;gt;&lt;br /&gt;
     &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
     &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
    &amp;lt;/Shape&amp;gt;&lt;br /&gt;
   &amp;lt;/Transform&amp;gt;&lt;br /&gt;
  &amp;lt;/Group&amp;gt;&lt;br /&gt;
  &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt;&lt;br /&gt;
 &amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
- Different types of sensors can be newly defined, or existing ones extended, to describe various AR content.  These include proximity sensors, range sensors, etc.&lt;br /&gt;
&lt;br /&gt;
- Different physical object descriptions will be needed at the right level of abstraction (such as the &amp;quot;Marker&amp;quot; node in the example above).  These include descriptions for image patches, 3D objects, GPS locations, natural features (e.g. points, lines), etc.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal can be characterized by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle a video background for video see-through AR implementations.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing video capture or vision-based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background node would be supplied with the video image and/or camera parameters routed from the &amp;quot;LiveCam&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual viewpoint to accommodate more detailed camera parameters, set according to the parameters of the &amp;quot;LiveCam&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
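To make the routing concrete, here is a hypothetical sketch in the same style as the marker example above.  Only the &amp;quot;LiveCam&amp;quot; node name comes from the proposal; the field names and the pairing with TextureBackground and Viewpoint are illustrative assumptions, not part of any approved specification.&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;!-- Hypothetical sketch: field names are illustrative assumptions --&amp;gt;&lt;br /&gt;
 &amp;lt;LiveCam DEF='CAM'/&amp;gt;&lt;br /&gt;
 &amp;lt;TextureBackground DEF='VideoBG'/&amp;gt;&lt;br /&gt;
 &amp;lt;Viewpoint DEF='ARView'/&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='CAM' fromField='image' toNode='VideoBG' toField='backTexture'/&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='CAM' fromField='fieldOfView' toNode='ARView' toField='fieldOfView'/&amp;gt;&lt;br /&gt;
&lt;br /&gt;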
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
- Woo proposes to use XML as meta-descriptors, together with existing standards (e.g. X3D, COLLADA) for describing the augmentation information itself.&lt;br /&gt;
&lt;br /&gt;
- As for the context (condition) for augmentation, a clear &amp;quot;5W&amp;quot; specification approach is proposed: who, when, where, what, and how.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;who&amp;quot; part specifies the owner/author of the content.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;when&amp;quot; part specifies the content creation time.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;where&amp;quot; part specifies the location of the physical object to which an augmentation is attached.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;what&amp;quot; part specifies the content to be augmented (the augmentation information).&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;how&amp;quot; part specifies the dynamic part (behavior) of the content.&lt;br /&gt;
&lt;br /&gt;
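Pulling the 5W parts together, a hypothetical meta-descriptor might look like the following.  All element names here are illustrative assumptions, not taken from the proposal documents:&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;!-- Hypothetical sketch: element names are illustrative assumptions --&amp;gt;&lt;br /&gt;
 &amp;lt;Augmentation&amp;gt;&lt;br /&gt;
  &amp;lt;Who&amp;gt;content owner/author&amp;lt;/Who&amp;gt;&lt;br /&gt;
  &amp;lt;When&amp;gt;content creation time&amp;lt;/When&amp;gt;&lt;br /&gt;
  &amp;lt;Where&amp;gt;location of the target physical object&amp;lt;/Where&amp;gt;&lt;br /&gt;
  &amp;lt;What&amp;gt;reference to X3D or COLLADA content&amp;lt;/What&amp;gt;&lt;br /&gt;
  &amp;lt;How&amp;gt;behavior description&amp;lt;/How&amp;gt;&lt;br /&gt;
 &amp;lt;/Augmentation&amp;gt;&lt;br /&gt;
&lt;br /&gt;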
* [http://dxp.korea.ac.kr/AR_standards/AR_standards.zip A zip file containing the various Korean proposals].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== X3D Earth Working Group ==&lt;br /&gt;
&lt;br /&gt;
The X3D Earth Working Group has expanded and refined a proposal by Dr. Myeong Won Lee for a new &lt;br /&gt;
[http://www.web3d.org/membership/login/memberwiki/index.php/X3D_v3.3_Specification_Changes#GpsSensor_node GpsSensor node].&lt;br /&gt;
Due to several overlapping technical issues, the group has asked to collaborate with the Augmented Reality group on final design for this node.&lt;br /&gt;
&lt;br /&gt;
= Participants =&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
&lt;br /&gt;
TODO&lt;br /&gt;
&lt;br /&gt;
= Participation and Liaisons =&lt;br /&gt;
&lt;br /&gt;
* TODO describe Christine Perey's group on AR Standardization&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
Of note, the Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty-free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Gjkim</name></author>	</entry>

	<entry>
		<id>https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3492</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3492"/>
				<updated>2011-04-21T02:58:28Z</updated>
		
		<summary type="html">&lt;p&gt;Gjkim: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History and Background Information ==&lt;br /&gt;
The Web3D Consortium formed a special interest group on AR initiatives in July 2009, which worked to help create the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
Several Web3D Consortium member projects showcase the feasibility of AR in X3D, particularly the open-source [http://www.x3dom.org X3DOM] framework produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out of the box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5 and this has good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://www.opengeospatial.org OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible comment and reaction can be considered from the Community. Feedback from this community will help X3D quickly and stably adopt new technologies to provide ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
The [http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf X3D Mobile Profile slideset]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop is also linked,&lt;br /&gt;
showing how Mobile, HTML5 and possibly Augmented Reality (AR) components&lt;br /&gt;
can be aligned together.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and also join the [http://web3d.org/membership/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Calendar =&lt;br /&gt;
&lt;br /&gt;
TODO list upcoming meetings, workshops and conferences here&lt;br /&gt;
&lt;br /&gt;
Third Intl. AR Standards Meeting, Jun 15-17, Taichung, Taiwan&lt;br /&gt;
[http://www.perey.com/ARStandards/third-international-ar-standards-meeting/ AR Standards Meeting in Taiwan]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality (AR) Working Group focuses on utilizing and extending X3D capabilities to support augmented reality (AR) and mixed reality (MR) applications.&lt;br /&gt;
&lt;br /&gt;
''Discussion.''  These X3D AR discussions were initially held as part of a special interest group. Now that we have determined that sufficient interest exists to modify the X3D Specification, this effort is moving forward by forming the X3D AR Working Group.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases for using X3D in AR/MR applications&lt;br /&gt;
* Develop and extend functions and nodes for X3D specification required to support AR/MR applications&lt;br /&gt;
* Produce sample AR/MR applications using X3D to demonstrate how this functionality can work correctly&lt;br /&gt;
* Produce and propose X3D Specification changes, possibly including an AR Component and/or a Mobile Profile&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
TODO these tasks should probably be rewritten as separate distinct efforts that support the Goals&lt;br /&gt;
&lt;br /&gt;
* Investigate state-of-the-art technologies related to X3D and AR/MR&lt;br /&gt;
* Develop and extend X3D specification to support AR/MR applications&lt;br /&gt;
* Promote X3D in AR/MR field by developing and demonstrating use cases&lt;br /&gt;
** videos are particularly compelling and more informative than a 3D demo because they show the use of AR/MR in the context of the real world&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR application - summer 2011&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - summer 2011&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification&lt;br /&gt;
** once use cases and requirements are stated, can compare existing and new proposals for X3D functionality&lt;br /&gt;
* Sample AR/MR applications with X3D&lt;br /&gt;
** these will be produced in support of each proposal&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 3rd Tuesday, which is 0910-1010 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is __________ 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
= Existing Proposals =&lt;br /&gt;
&lt;br /&gt;
== Instant Reality ==&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is on [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality are also available. They describe specific nodes such as the IOSensor, for retrieving camera streams and the tracking results of the vision subsystem; the PolygonBackground, for displaying the camera images behind the virtual objects; and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, some papers on AR and MR rendering have already been published at the Web3D conferences, discussing e.g. occlusions, shadows and lighting in MR scenes in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Moreover, some further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), especially section 6.4, of the following PhD thesis:&lt;br /&gt;
[http://tuprints.ulb.tu-darmstadt.de/2489/ PDF].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Korean Chapter ==&lt;br /&gt;
&lt;br /&gt;
The Korea chapter has been keenly interested in many aspects of augmented reality standardization, including AR-based content.  This interest stems from the recent worldwide appearance of mobile AR services and the recognition, in both academia and industry, of a definite need for exchanging service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim of Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of the Gwangju Institute of Science and Technology.&lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also try to highlight each proposal's distinctions from the others, not as criticism but as a way to suggest alternatives.&lt;br /&gt;
&lt;br /&gt;
(1) Gerard Kim's proposal can be characterized by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied together using ROUTEs (e.g. a virtual object's parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
- The example below shows a construct that simply extends the &amp;quot;VisibilitySensor&amp;quot; by attaching it to a marker.  The rough semantics are to attach a sphere to the marker while the marker is visible.  Visibility would be determined by the browser using a particular tracker.  In this simple case, a marker description is given through the &amp;quot;Marker&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Scene&amp;gt;&lt;br /&gt;
  &amp;lt;Group&amp;gt;&lt;br /&gt;
   &amp;lt;Marker DEF='HIRO' enabled='true' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
   &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='true'/&amp;gt;&lt;br /&gt;
   &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
    &amp;lt;Shape&amp;gt;&lt;br /&gt;
     &amp;lt;Appearance&amp;gt;&lt;br /&gt;
      &amp;lt;Material/&amp;gt;&lt;br /&gt;
     &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
     &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
    &amp;lt;/Shape&amp;gt;&lt;br /&gt;
   &amp;lt;/Transform&amp;gt;&lt;br /&gt;
  &amp;lt;/Group&amp;gt;&lt;br /&gt;
  &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt;&lt;br /&gt;
 &amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
- Different types of sensors can be newly defined, or existing ones extended, to describe various AR content.  These include proximity sensors, range sensors, etc.&lt;br /&gt;
&lt;br /&gt;
- Different physical object descriptions will be needed at the right level of abstraction (such as the &amp;quot;Marker&amp;quot; node in the example above).  These include descriptions for image patches, 3D objects, GPS locations, natural features (e.g. points, lines), etc.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal can be characterized by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle a video background for video see-through AR implementations.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing video capture or vision-based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background node would be supplied with the video image and/or camera parameters routed from the &amp;quot;LiveCam&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual viewpoint to accommodate more detailed camera parameters, set according to the parameters of the &amp;quot;LiveCam&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
- Woo proposes to use XML as meta-descriptors, together with existing standards (e.g. X3D, COLLADA) for describing the augmentation information itself.&lt;br /&gt;
&lt;br /&gt;
- As for the context (condition) for augmentation, a clear &amp;quot;5W&amp;quot; specification approach is proposed: who, when, where, what, and how.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;who&amp;quot; part specifies the owner/author of the content.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;when&amp;quot; part specifies the content creation time.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;where&amp;quot; part specifies the location of the physical object to which an augmentation is attached.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;what&amp;quot; part specifies the content to be augmented (the augmentation information).&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;how&amp;quot; part specifies the dynamic part (behavior) of the content.&lt;br /&gt;
&lt;br /&gt;
* [http://dxp.korea.ac.kr/AR_standards/AR_standards.zip A zip file containing the various Korean proposals].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== X3D Earth Working Group ==&lt;br /&gt;
&lt;br /&gt;
The X3D Earth Working Group has expanded and refined a proposal by Dr. Myeong Won Lee for a new &lt;br /&gt;
[http://www.web3d.org/membership/login/memberwiki/index.php/X3D_v3.3_Specification_Changes#GpsSensor_node GpsSensor node].&lt;br /&gt;
Due to several overlapping technical issues, the group has asked to collaborate with the Augmented Reality group on final design for this node.&lt;br /&gt;
&lt;br /&gt;
= Participants =&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
&lt;br /&gt;
TODO&lt;br /&gt;
&lt;br /&gt;
= Participation and Liaisons =&lt;br /&gt;
&lt;br /&gt;
* TODO describe Christine Perey's group on AR Standardization&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
Of note, the Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty-free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Gjkim</name></author>	</entry>

	<entry>
		<id>https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3334</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3334"/>
				<updated>2011-04-04T07:43:36Z</updated>
		
		<summary type="html">&lt;p&gt;Gjkim: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History ==&lt;br /&gt;
The Web3D Consortium has a special interest in the recent AR initiatives and in creating the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
We have several member projects showcasing the feasibility of AR in X3D, particularly the open-source [http://www.x3dom.org X3DOM] framework produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out of the box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5 and this has good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://www.opengeospatial.org OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible comment and reaction can be considered from the Community. Feedback from this community will help X3D quickly and stably adopt new technologies to provide ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
The [http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf slideset summary]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop is also linked,&lt;br /&gt;
showing how Mobile, HTML5 and possibly Augmented Reality (AR) components&lt;br /&gt;
might be aligned together as an X3D Mobile Profile.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and also join the [http://web3d.org/membership/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality Working/Special Interest Group focuses on utilizing/extending X3D functions to support augmented and mixed reality applications.&lt;br /&gt;
&lt;br /&gt;
== Discussions ==&lt;br /&gt;
These X3D AR discussions are part of a special interest group. This effort is moving forward by forming the X3D AR Working Group. The charter, roles and responsibilities are being defined.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases of using X3D for AR/MR application.&lt;br /&gt;
* Develop and extend functions and nodes for X3D specification required to support AR/MR&lt;br /&gt;
* Produce sample AR/MR applications with X3D&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR application - May, 2011 TBD&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - May, 2011 TBD&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification - TBD&lt;br /&gt;
* Sample AR/MR applications with X3D - TBD&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 3rd Tuesday, which is 0910-1010 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is 15/16 March 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Existing Proposals ==&lt;br /&gt;
&lt;br /&gt;
=== Instant Reality ===&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is on [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality are also available. They describe specific nodes such as the IOSensor, for retrieving camera streams and the tracking results of the vision subsystem; the PolygonBackground, for displaying the camera images behind the virtual objects; and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, some papers on AR and MR rendering have already been published at the Web3D conferences, discussing e.g. occlusions, shadows and lighting in MR scenes in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Moreover, some further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), especially section 6.4, of the following PhD thesis:&lt;br /&gt;
[http://tuprints.ulb.tu-darmstadt.de/2489/ PDF].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Korean Chapter ===&lt;br /&gt;
&lt;br /&gt;
The Korea chapter has been keenly interested in many aspects of augmented reality standardization, including AR-based content.  This interest stems from the recent worldwide appearance of mobile AR services and the recognition, in both academia and industry, of a definite need for exchanging service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim of Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of the Gwangju Institute of Science and Technology.&lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also try to highlight each proposal's distinctions from the others, not as criticism but as a way to suggest alternatives.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(1) Gerard Kim's proposal can be characterized by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied together using ROUTEs (e.g. a virtual object's parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
- The example below shows a construct that simply extends the &amp;quot;VisibilitySensor&amp;quot; by attaching it to a marker.  The rough semantics are to attach a sphere to the marker while the marker is visible.  Visibility would be determined by the browser using a particular tracker.  In this simple case, a marker description is given through the &amp;quot;Marker&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Scene&amp;gt;&lt;br /&gt;
  &amp;lt;Group&amp;gt;&lt;br /&gt;
   &amp;lt;Marker DEF='HIRO' enabled='true' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
   &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='true'/&amp;gt;&lt;br /&gt;
   &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
    &amp;lt;Shape&amp;gt;&lt;br /&gt;
     &amp;lt;Appearance&amp;gt;&lt;br /&gt;
      &amp;lt;Material/&amp;gt;&lt;br /&gt;
     &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
     &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
    &amp;lt;/Shape&amp;gt;&lt;br /&gt;
   &amp;lt;/Transform&amp;gt;&lt;br /&gt;
  &amp;lt;/Group&amp;gt;&lt;br /&gt;
  &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt;&lt;br /&gt;
 &amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
- Different types of sensors can be newly defined, or existing ones extended, to describe various AR content.  These include proximity sensors, range sensors, etc.&lt;br /&gt;
&lt;br /&gt;
- Different physical object descriptions will be needed at the right level of abstraction (such as the &amp;quot;Marker&amp;quot; node in the example above).  These include descriptions for image patches, 3D objects, GPS locations, natural features (e.g. points, lines), etc.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal can be characterized by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle a video background for video see-through AR implementations.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing video capture or vision-based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background node would be supplied with the video image and/or camera parameters routed from the &amp;quot;LiveCam&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual viewpoint to accommodate more detailed camera parameters, set according to the parameters of the &amp;quot;LiveCam&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
- Woo proposes to use XML as meta-descriptors, together with existing standards (e.g. X3D, COLLADA) for describing the augmentation information itself.&lt;br /&gt;
&lt;br /&gt;
- As for the context (condition) for augmentation, a clear &amp;quot;5W&amp;quot; specification approach is proposed: who, when, where, what, and how.&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;who&amp;quot; part specifies the owner/author of the contents.&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;when&amp;quot; part specifies content creation time.&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;where&amp;quot; part specifies the location of the physical object to which an augmentation is attached.&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;what&amp;quot; part specifies the what is to be augmented content (augmentation information).&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;how&amp;quot; part specifies dynamic part (behavior) of the content. &lt;br /&gt;
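&lt;br /&gt;
A hypothetical XML meta-descriptor sketch of this 5W approach (element and attribute names are illustrative only, not from any published schema):&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Augmentation&amp;gt;&lt;br /&gt;
  &amp;lt;Who author='...'/&amp;gt;&lt;br /&gt;
  &amp;lt;When created='...'/&amp;gt;&lt;br /&gt;
  &amp;lt;Where&amp;gt;&amp;lt;!-- location of the physical target object --&amp;gt;&amp;lt;/Where&amp;gt;&lt;br /&gt;
  &amp;lt;What&amp;gt;&amp;lt;!-- augmentation content, e.g. an X3D or Collada file --&amp;gt;&amp;lt;/What&amp;gt;&lt;br /&gt;
  &amp;lt;How&amp;gt;&amp;lt;!-- dynamic behavior of the content --&amp;gt;&amp;lt;/How&amp;gt;&lt;br /&gt;
 &amp;lt;/Augmentation&amp;gt;&lt;br /&gt;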
&lt;br /&gt;
&lt;br /&gt;
* [http://dxp.korea.ac.kr/AR_standards/AR_standards.zip A zip file containing various Korean proposals].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
== Participation and Liaisons ==&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
* Of interest is that Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Gjkim</name></author>	</entry>

	<entry>
		<id>https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3332</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3332"/>
				<updated>2011-04-04T05:32:01Z</updated>
		
		<summary type="html">&lt;p&gt;Gjkim: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History ==&lt;br /&gt;
Web3D Consortium has a special interest in the recent AR initiatives and creating the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
We have several member projects showcasing the feasibility of AR in X3D, particularly the open-source [http://www.x3dom.org X3DOM] framework produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out-of-the-box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5 and this has good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospacial.org  OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible range of comments and reactions can be gathered from the community. Feedback from this community will help X3D adopt new technologies quickly and stably, providing ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
The [http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf slideset summary] from last summer's Mobile X3D ISO Workshop is also linked, showing how Mobile, HTML5, and possibly Augmented Reality (AR) components might be aligned together in an X3D Mobile Profile.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality Working/Special Interest Group focuses on utilizing/extending X3D functions to support augmented and mixed reality applications.&lt;br /&gt;
&lt;br /&gt;
== Discussions ==&lt;br /&gt;
These X3D AR discussions are part of a special interest group. The effort is moving forward through formation of the X3D AR Working Group; its charter, roles, and responsibilities are being defined.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases of using X3D for AR/MR applications.&lt;br /&gt;
* Develop and extend functions and nodes in the X3D specification required to support AR/MR.&lt;br /&gt;
* Produce sample AR/MR applications with X3D.&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR application - May, 2011 TBD&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - May, 2011 TBD&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification - TBD&lt;br /&gt;
* Sample AR/MR applications with X3D - TBD&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 3rd Tuesday, which is 0910-1010 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is 15/16 March 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Existing Proposals ==&lt;br /&gt;
&lt;br /&gt;
=== Instant Reality ===&lt;br /&gt;
&lt;br /&gt;
A Mixed Reality framework developed and maintained by Fraunhofer IGD that uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is at [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality also exist. They describe, for example, specific nodes such as the IOSensor for retrieving camera streams and the tracking results of the vision subsystem, the PolygonBackground for displaying the camera images behind the virtual objects, and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, some papers on AR and MR rendering have already been published at the Web3D conferences; these discuss, for example, occlusions, shadows, and lighting in MR scenes in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Moreover, further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), especially section 6.4, of the following PhD thesis: [http://tuprints.ulb.tu-darmstadt.de/2489/ Pdf].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Korean Chapter ===&lt;br /&gt;
&lt;br /&gt;
The Korea chapter has been keenly interested in the standardization of augmented reality in many aspects, including AR-based content.  This is especially due to the recent worldwide emergence of mobile AR services and the realization, in both academia and industry, of a definite need to exchange service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim from Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of Gwangju Inst. of Science and Tech.   &lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also try to highlight the distinctions among the proposals, not as criticism but as a way to suggest alternatives.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(1) Gerry Kim's proposal can be highlighted by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied together using &amp;quot;routes&amp;quot; (e.g. a virtual object's parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
- The following example construct is a simple extension of the &amp;quot;VisibilitySensor&amp;quot; attached to a marker.  The rough semantics are to attach a sphere to a marker while the marker is visible.  The visibility would be determined by the browser using a particular tracker.  In this simple case, a simple marker description is given through the &amp;quot;Marker&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Scene&amp;gt;&lt;br /&gt;
 &amp;lt;Group&amp;gt;&lt;br /&gt;
  &amp;lt;Marker DEF='HIRO' enabled='TRUE' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
  &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='TRUE'/&amp;gt;&lt;br /&gt;
  &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
   &amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
     &amp;lt;Material/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
   &amp;lt;/Shape&amp;gt;&lt;br /&gt;
  &amp;lt;/Transform&amp;gt;&lt;br /&gt;
 &amp;lt;/Group&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt;&lt;br /&gt;
 &amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
- Different types of sensors can be newly defined, or existing ones extended, to describe various AR contents; these include proximity sensors, range sensors, etc.&lt;br /&gt;
&lt;br /&gt;
- Different physical object descriptions will be needed at the right level of abstraction (such as the &amp;quot;Marker&amp;quot; node in the above example); these include descriptions for image patches, 3D objects, GPS locations, and natural features (e.g. points, lines).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle video backgrounds for video see-through AR implementations.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing video capture or vision-based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background would be routed from the &amp;quot;LiveCam&amp;quot; node, which supplies the video image and/or camera parameters.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual viewpoint to accommodate more detailed camera parameters, set according to the parameters of the &amp;quot;LiveCam&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
- Woo proposes to use existing standards (e.g. X3D, Collada) for describing the augmentation information itself.&lt;br /&gt;
&lt;br /&gt;
- As for the context (condition) for augmentation, a clear specification following a &amp;quot;5W&amp;quot; approach is proposed: who, when, where, what, and how.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;who&amp;quot; part specifies the owner/author of the content.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;when&amp;quot; part specifies the content creation time.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;where&amp;quot; part specifies the location of the physical object to which an augmentation is attached.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;what&amp;quot; part specifies the content to be augmented (the augmentation information).&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;how&amp;quot; part specifies the dynamic part (behavior) of the content.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
* [http://dxp.korea.ac.kr/AR_standards/AR_standards.zip A zip file containing various Korean proposals].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
== Participation and Liaisons ==&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
* Of interest is that Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Gjkim</name></author>	</entry>

	<entry>
		<id>https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3331</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3331"/>
				<updated>2011-04-04T05:31:24Z</updated>
		
		<summary type="html">&lt;p&gt;Gjkim: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History ==&lt;br /&gt;
Web3D Consortium has a special interest in the recent AR initiatives and creating the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
We have several member projects showcasing the feasibility of AR in X3D, particularly the open-source [http://www.x3dom.org X3DOM] framework produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out-of-the-box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5 and this has good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospacial.org  OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible range of comments and reactions can be gathered from the community. Feedback from this community will help X3D adopt new technologies quickly and stably, providing ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
The [http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf slideset summary] from last summer's Mobile X3D ISO Workshop is also linked, showing how Mobile, HTML5, and possibly Augmented Reality (AR) components might be aligned together in an X3D Mobile Profile.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality Working/Special Interest Group focuses on utilizing/extending X3D functions to support augmented and mixed reality applications.&lt;br /&gt;
&lt;br /&gt;
== Discussions ==&lt;br /&gt;
These X3D AR discussions are part of a special interest group. The effort is moving forward through formation of the X3D AR Working Group; its charter, roles, and responsibilities are being defined.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases of using X3D for AR/MR applications.&lt;br /&gt;
* Develop and extend functions and nodes in the X3D specification required to support AR/MR.&lt;br /&gt;
* Produce sample AR/MR applications with X3D.&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR application - May, 2011 TBD&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - May, 2011 TBD&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification - TBD&lt;br /&gt;
* Sample AR/MR applications with X3D - TBD&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 3rd Tuesday, which is 0910-1010 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is 15/16 March 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Existing Proposals ==&lt;br /&gt;
&lt;br /&gt;
=== Instant Reality ===&lt;br /&gt;
&lt;br /&gt;
A Mixed Reality framework developed and maintained by Fraunhofer IGD that uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is at [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality also exist. They describe, for example, specific nodes such as the IOSensor for retrieving camera streams and the tracking results of the vision subsystem, the PolygonBackground for displaying the camera images behind the virtual objects, and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, some papers on AR and MR rendering have already been published at the Web3D conferences; these discuss, for example, occlusions, shadows, and lighting in MR scenes in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Moreover, further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), especially section 6.4, of the following PhD thesis: [http://tuprints.ulb.tu-darmstadt.de/2489/ Pdf].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Korean Chapter ===&lt;br /&gt;
&lt;br /&gt;
The Korea chapter has been keenly interested in the standardization of augmented reality in many aspects, including AR-based content.  This is especially due to the recent worldwide emergence of mobile AR services and the realization, in both academia and industry, of a definite need to exchange service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim from Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of Gwangju Inst. of Science and Tech.   &lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also try to highlight the distinctions among the proposals, not as criticism but as a way to suggest alternatives.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(1) Gerry Kim's proposal can be highlighted by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied together using &amp;quot;routes&amp;quot; (e.g. a virtual object's parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
- The following example construct is a simple extension of the &amp;quot;VisibilitySensor&amp;quot; attached to a marker.  The rough semantics are to attach a sphere to a marker while the marker is visible.  The visibility would be determined by the browser using a particular tracker.  In this simple case, a simple marker description is given through the &amp;quot;Marker&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Scene&amp;gt;&lt;br /&gt;
 &amp;lt;Group&amp;gt;&lt;br /&gt;
  &amp;lt;Marker DEF='HIRO' enabled='TRUE' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
  &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='TRUE'/&amp;gt;&lt;br /&gt;
  &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
   &amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
     &amp;lt;Material/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
   &amp;lt;/Shape&amp;gt;&lt;br /&gt;
  &amp;lt;/Transform&amp;gt;&lt;br /&gt;
 &amp;lt;/Group&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt;&lt;br /&gt;
 &amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
- Different types of sensors can be newly defined, or existing ones extended, to describe various AR contents; these include proximity sensors, range sensors, etc.&lt;br /&gt;
&lt;br /&gt;
- Different physical object descriptions will be needed at the right level of abstraction (such as the &amp;quot;Marker&amp;quot; node in the above example); these include descriptions for image patches, 3D objects, GPS locations, and natural features (e.g. points, lines).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle video backgrounds for video see-through AR implementations.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing video capture or vision-based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background would be routed from the &amp;quot;LiveCam&amp;quot; node, which supplies the video image and/or camera parameters.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual viewpoint to accommodate more detailed camera parameters, set according to the parameters of the &amp;quot;LiveCam&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
- Woo proposes to use existing standards (e.g. X3D, Collada) for describing the augmentation information itself.&lt;br /&gt;
&lt;br /&gt;
- As for the context (condition) for augmentation, a clear specification following a &amp;quot;5W&amp;quot; approach is proposed: who, when, where, what, and how.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;who&amp;quot; part specifies the owner/author of the content.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;when&amp;quot; part specifies the content creation time.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;where&amp;quot; part specifies the location of the physical object to which an augmentation is attached.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;what&amp;quot; part specifies the content to be augmented (the augmentation information).&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;how&amp;quot; part specifies the dynamic part (behavior) of the content.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
* [http://dxp.korea.ac.kr/AR_standards/AR_standards.zip A zip file containing various Korean proposals].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
== Participation and Liaisons ==&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
* Of interest is that Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Gjkim</name></author>	</entry>

	<entry>
		<id>https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3330</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3330"/>
				<updated>2011-04-04T05:30:00Z</updated>
		
		<summary type="html">&lt;p&gt;Gjkim: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History ==&lt;br /&gt;
Web3D Consortium has a special interest in the recent AR initiatives and creating the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
We have several member projects showcasing the feasibility of AR in X3D, particularly the open-source [http://www.x3dom.org X3DOM] framework produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out-of-the-box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5 and this has good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospacial.org  OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible range of comments and reactions can be gathered from the community. Feedback from this community will help X3D adopt new technologies quickly and stably, providing ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
The [http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf slideset summary] from last summer's Mobile X3D ISO Workshop is also linked, showing how Mobile, HTML5, and possibly Augmented Reality (AR) components might be aligned together in an X3D Mobile Profile.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality Working/Special Interest Group focuses on utilizing/extending X3D functions to support augmented and mixed reality applications.&lt;br /&gt;
&lt;br /&gt;
== Discussions ==&lt;br /&gt;
These X3D AR discussions are part of a special interest group effort that is moving forward by forming the X3D AR Working Group. The charter, roles and responsibilities are being defined.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases of using X3D for AR/MR applications.&lt;br /&gt;
* Develop and extend functions and nodes in the X3D specification required to support AR/MR.&lt;br /&gt;
* Produce sample AR/MR applications with X3D.&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR applications - May 2011 (TBD)&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - May 2011 (TBD)&lt;br /&gt;
* Proposed new/extended functions and nodes for the X3D specification - TBD&lt;br /&gt;
* Sample AR/MR applications with X3D - TBD&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
17:10-18:10 (Pacific time) / 20:10-21:10 (Eastern time) on 1st Wednesday, which is 09:10-10:10 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
17:10-18:10 (Pacific time) / 20:10-21:10 (Eastern time) on 3rd Tuesday, which is 09:10-10:10 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is 15/16 March 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Existing Proposals ==&lt;br /&gt;
&lt;br /&gt;
=== Instant Reality ===&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is at [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality are also available. They describe, for example, specific nodes such as the IOSensor, for retrieving camera streams and the tracking results of the vision subsystem, and the PolygonBackground, for displaying the camera images behind the virtual objects, as well as some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, some papers on AR and MR rendering have already been published at the Web3D conferences. These discuss, for example, occlusions, shadows and lighting in MR scenes in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Moreover, some further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), and especially in section 6.4, of the following PhD thesis:&lt;br /&gt;
[http://tuprints.ulb.tu-darmstadt.de/2489/ PDF].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Korean Chapter ===&lt;br /&gt;
&lt;br /&gt;
The Korea chapter has been keenly interested in many aspects of augmented reality standardization, including AR-based content.  This is especially due to the recent worldwide appearance of mobile AR services and the realization (in both academia and industry) of a definite need for exchanging service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim of Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of Gwangju Institute of Science and Technology.&lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also try to highlight each proposal's distinctions with regard to the others, not as criticism but as a way of suggesting alternatives.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(1) Gerry Kim's proposal is characterized by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied together using &amp;quot;routes&amp;quot; (e.g. a virtual object's parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
- Below is an example construct, a simple extension of the &amp;quot;VisibilitySensor&amp;quot; attached to a marker.  The rough semantics are to attach a sphere to the marker when the marker is visible.  The visibility would be determined by the browser using a particular tracker.  In this simple case, a simple marker description is given through the &amp;quot;Marker&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Scene&amp;gt;&lt;br /&gt;
  &amp;lt;Group&amp;gt;&lt;br /&gt;
   &amp;lt;Marker DEF='HIRO' enabled='TRUE' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
   &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='TRUE'/&amp;gt;&lt;br /&gt;
   &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
    &amp;lt;Shape&amp;gt;&lt;br /&gt;
     &amp;lt;Appearance&amp;gt;&lt;br /&gt;
      &amp;lt;Material/&amp;gt;&lt;br /&gt;
     &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
     &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
    &amp;lt;/Shape&amp;gt;&lt;br /&gt;
   &amp;lt;/Transform&amp;gt;&lt;br /&gt;
  &amp;lt;/Group&amp;gt;&lt;br /&gt;
  &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt;&lt;br /&gt;
 &amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
- Different types of sensors can be newly defined, or old ones extended, to describe various AR content.  These include proximity sensors, range sensors, etc.&lt;br /&gt;
&lt;br /&gt;
- Different physical object descriptions will be needed at the right level of abstraction (such as the &amp;quot;Marker&amp;quot; node in the above example).  These include descriptions for image patches, 3D objects, GPS locations, natural features (e.g. points, lines), etc.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle a video background for video see-through AR implementations.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing the video capture or vision-based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background would be routed from the &amp;quot;LiveCam&amp;quot; node, which supplies the video image and/or camera parameters.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual viewpoint to accommodate more detailed camera parameters and to be set according to the parameters of the &amp;quot;LiveCam&amp;quot;.&lt;br /&gt;
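&lt;br /&gt;
- A rough sketch of how these pieces might connect (the &amp;quot;LiveCam&amp;quot; node comes from this proposal, but the field names used below are purely illustrative assumptions, not settled syntax):&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;LiveCam DEF='CAM'/&amp;gt;&lt;br /&gt;
 &amp;lt;TextureBackground&amp;gt;&lt;br /&gt;
  &amp;lt;MovieTexture DEF='VIDEO' containerField='backTexture'/&amp;gt;&lt;br /&gt;
 &amp;lt;/TextureBackground&amp;gt;&lt;br /&gt;
 &amp;lt;Viewpoint DEF='VIEW'/&amp;gt;&lt;br /&gt;
 &amp;lt;!-- live video frames feed the background; camera parameters drive the viewpoint --&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='CAM' fromField='image' toNode='VIDEO' toField='image'/&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='CAM' fromField='cameraParameters' toNode='VIEW' toField='cameraParameters'/&amp;gt;&lt;br /&gt;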
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
- Woo proposes to use existing standards (e.g. X3D, Collada, etc.) for describing the augmentation information itself.&lt;br /&gt;
&lt;br /&gt;
- As for the context (condition) for augmentation, a clear specification following a &amp;quot;5W&amp;quot; approach is proposed: namely who, when, where, what and how.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;who&amp;quot; part specifies the owner/author of the content.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;when&amp;quot; part specifies the content creation time.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;where&amp;quot; part specifies the location of the physical object to which an augmentation is attached.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;what&amp;quot; part specifies the content to be augmented (the augmentation information).&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;how&amp;quot; part specifies the dynamic part (behavior) of the content.&lt;br /&gt;
&lt;br /&gt;
[http://dxp.korea.ac.kr/AR_standards/AR_standards.zip Detailed proposal documents (zip)].&lt;br /&gt;
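&lt;br /&gt;
A purely illustrative sketch of how the 5W context description might be encoded (the element names here are hypothetical, since the proposal does not fix a concrete syntax):&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Augmentation&amp;gt;&lt;br /&gt;
  &amp;lt;Who&amp;gt;owner/author of the content&amp;lt;/Who&amp;gt;&lt;br /&gt;
  &amp;lt;When&amp;gt;content creation time&amp;lt;/When&amp;gt;&lt;br /&gt;
  &amp;lt;Where&amp;gt;location of the target physical object&amp;lt;/Where&amp;gt;&lt;br /&gt;
  &amp;lt;What&amp;gt;link to the augmentation content, e.g. an X3D or Collada file&amp;lt;/What&amp;gt;&lt;br /&gt;
  &amp;lt;How&amp;gt;dynamic behavior of the content&amp;lt;/How&amp;gt;&lt;br /&gt;
 &amp;lt;/Augmentation&amp;gt;&lt;br /&gt;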
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
== Participation and Liaisons ==&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
* Of interest is that Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Gjkim</name></author>	</entry>

	<entry>
		<id>https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3322</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3322"/>
				<updated>2011-04-03T01:11:15Z</updated>
		
		<summary type="html">&lt;p&gt;Gjkim: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History ==&lt;br /&gt;
The Web3D Consortium has a special interest in the recent AR initiatives and in creating the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
We have several member projects showcasing the feasibility of AR in X3D, particularly the [http://www.x3dom.org X3DOM] open-source project produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out-of-the-box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5, which holds good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.perey.com/ARStandards/ AR Standards], [http://opengeospacial.org OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations to apply and adapt X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible comments and reactions from the community can be considered. Feedback from this community will help X3D adopt new technologies quickly and stably, providing ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
[http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf Slideset summary]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop has also been linked,&lt;br /&gt;
describing how Mobile, HTML5 and possibly Augmented Reality (AR) components&lt;br /&gt;
might be aligned together as an X3D Mobile Profile.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality Working/Special Interest Group focuses on utilizing/extending X3D functions to support augmented and mixed reality applications.&lt;br /&gt;
&lt;br /&gt;
== Discussions ==&lt;br /&gt;
These X3D AR discussions are part of a special interest group effort that is moving forward by forming the X3D AR Working Group. The charter, roles and responsibilities are being defined.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases of using X3D for AR/MR applications.&lt;br /&gt;
* Develop and extend functions and nodes in the X3D specification required to support AR/MR.&lt;br /&gt;
* Produce sample AR/MR applications with X3D.&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR applications - May 2011 (TBD)&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - May 2011 (TBD)&lt;br /&gt;
* Proposed new/extended functions and nodes for the X3D specification - TBD&lt;br /&gt;
* Sample AR/MR applications with X3D - TBD&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
17:10-18:10 (Pacific time) / 20:10-21:10 (Eastern time) on 1st Wednesday, which is 09:10-10:10 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
17:10-18:10 (Pacific time) / 20:10-21:10 (Eastern time) on 3rd Tuesday, which is 09:10-10:10 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is 15/16 March 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Existing Proposals ==&lt;br /&gt;
&lt;br /&gt;
=== Instant Reality ===&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is at [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality are also available. They describe, for example, specific nodes such as the IOSensor, for retrieving camera streams and the tracking results of the vision subsystem, and the PolygonBackground, for displaying the camera images behind the virtual objects, as well as some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, some papers on AR and MR rendering have already been published at the Web3D conferences. These discuss, for example, occlusions, shadows and lighting in MR scenes in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Moreover, some further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), and especially in section 6.4, of the following PhD thesis:&lt;br /&gt;
[http://tuprints.ulb.tu-darmstadt.de/2489/ PDF].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Korean Chapter ===&lt;br /&gt;
&lt;br /&gt;
The Korea chapter has been keenly interested in many aspects of augmented reality standardization, including AR-based content.  This is especially due to the recent worldwide appearance of mobile AR services and the realization (in both academia and industry) of a definite need for exchanging service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim of Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of Gwangju Institute of Science and Technology.&lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also try to highlight each proposal's distinctions with regard to the others, not as criticism but as a way of suggesting alternatives.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(1) Gerry Kim's proposal is characterized by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied together using &amp;quot;routes&amp;quot; (e.g. a virtual object's parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
- Below is an example construct, a simple extension of the &amp;quot;VisibilitySensor&amp;quot; attached to a marker.  The rough semantics are to attach a sphere to the marker when the marker is visible.  The visibility would be determined by the browser using a particular tracker.  In this simple case, a simple marker description is given through the &amp;quot;Marker&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Scene&amp;gt;&lt;br /&gt;
  &amp;lt;Group&amp;gt;&lt;br /&gt;
   &amp;lt;Marker DEF='HIRO' enabled='TRUE' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
   &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='TRUE'/&amp;gt;&lt;br /&gt;
   &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
    &amp;lt;Shape&amp;gt;&lt;br /&gt;
     &amp;lt;Appearance&amp;gt;&lt;br /&gt;
      &amp;lt;Material/&amp;gt;&lt;br /&gt;
     &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
     &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
    &amp;lt;/Shape&amp;gt;&lt;br /&gt;
   &amp;lt;/Transform&amp;gt;&lt;br /&gt;
  &amp;lt;/Group&amp;gt;&lt;br /&gt;
  &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt;&lt;br /&gt;
 &amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
- Different types of sensors can be newly defined, or old ones extended, to describe various AR content.  These include proximity sensors, range sensors, etc.&lt;br /&gt;
&lt;br /&gt;
- Different physical object descriptions will be needed at the right level of abstraction (such as the &amp;quot;Marker&amp;quot; node in the above example).  These include descriptions for image patches, 3D objects, GPS locations, natural features (e.g. points, lines), etc.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle a video background for video see-through AR implementations.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing the video capture or vision-based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background would be routed from the &amp;quot;LiveCam&amp;quot; node, which supplies the video image and/or camera parameters.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual viewpoint to accommodate more detailed camera parameters and to be set according to the parameters of the &amp;quot;LiveCam&amp;quot;.&lt;br /&gt;
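&lt;br /&gt;
- A rough sketch of how these pieces might connect (the &amp;quot;LiveCam&amp;quot; node comes from this proposal, but the field names used below are purely illustrative assumptions, not settled syntax):&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;LiveCam DEF='CAM'/&amp;gt;&lt;br /&gt;
 &amp;lt;TextureBackground&amp;gt;&lt;br /&gt;
  &amp;lt;MovieTexture DEF='VIDEO' containerField='backTexture'/&amp;gt;&lt;br /&gt;
 &amp;lt;/TextureBackground&amp;gt;&lt;br /&gt;
 &amp;lt;Viewpoint DEF='VIEW'/&amp;gt;&lt;br /&gt;
 &amp;lt;!-- live video frames feed the background; camera parameters drive the viewpoint --&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='CAM' fromField='image' toNode='VIDEO' toField='image'/&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='CAM' fromField='cameraParameters' toNode='VIEW' toField='cameraParameters'/&amp;gt;&lt;br /&gt;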
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
- Woo proposes to use existing standards (e.g. X3D, Collada, etc.) for describing the augmentation information itself.&lt;br /&gt;
&lt;br /&gt;
- As for the context (condition) for augmentation, a clear specification following a &amp;quot;5W&amp;quot; approach is proposed: namely who, when, where, what and how.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;who&amp;quot; part specifies the owner/author of the content.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;when&amp;quot; part specifies the content creation time.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;where&amp;quot; part specifies the location of the physical object to which an augmentation is attached.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;what&amp;quot; part specifies the content to be augmented (the augmentation information).&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;how&amp;quot; part specifies the dynamic part (behavior) of the content.&lt;br /&gt;
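&lt;br /&gt;
A purely illustrative sketch of how the 5W context description might be encoded (the element names here are hypothetical, since the proposal does not fix a concrete syntax):&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Augmentation&amp;gt;&lt;br /&gt;
  &amp;lt;Who&amp;gt;owner/author of the content&amp;lt;/Who&amp;gt;&lt;br /&gt;
  &amp;lt;When&amp;gt;content creation time&amp;lt;/When&amp;gt;&lt;br /&gt;
  &amp;lt;Where&amp;gt;location of the target physical object&amp;lt;/Where&amp;gt;&lt;br /&gt;
  &amp;lt;What&amp;gt;link to the augmentation content, e.g. an X3D or Collada file&amp;lt;/What&amp;gt;&lt;br /&gt;
  &amp;lt;How&amp;gt;dynamic behavior of the content&amp;lt;/How&amp;gt;&lt;br /&gt;
 &amp;lt;/Augmentation&amp;gt;&lt;br /&gt;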
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
== Participation and Liaisons ==&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
* Of interest is that Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Gjkim</name></author>	</entry>

	<entry>
		<id>https://www.old.web3d.org/wiki/index.php?title=Talk:X3D_and_Augmented_Reality&amp;diff=3321</id>
		<title>Talk:X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php?title=Talk:X3D_and_Augmented_Reality&amp;diff=3321"/>
				<updated>2011-04-03T01:10:00Z</updated>
		
		<summary type="html">&lt;p&gt;Gjkim: New page: * After-thoughts about the four approaches (by G. J. Kim)  - While G. Kim's proposal treat physical objects sort of as special virtual objects whose pose is found by a &amp;quot;mystical&amp;quot; tracking ...&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* After-thoughts about the four approaches (by G. J. Kim)&lt;br /&gt;
&lt;br /&gt;
- While G. Kim's proposal treats physical objects essentially as special virtual objects whose pose is found by a &amp;quot;mystical&amp;quot; tracking module in whatever browser is used, Fraunhofer's proposal may be too tied to its own tracking module in describing the connection between the physical object and the virtual augmentation.  A right balance should be found in describing this aspect.&lt;br /&gt;
&lt;br /&gt;
- G. Lee's extension for LiveCam is similar to that of Fraunhofer's proposal.  Whether LiveCam is special enough to be separated out from being &amp;quot;one of&amp;quot; the IO sensors should be discussed further.&lt;br /&gt;
&lt;br /&gt;
- G. Lee's extension of background video is similar to that of Fraunhofer's proposal.&lt;br /&gt;
&lt;br /&gt;
- Woo's 5W structure is useful, but is currently not X3D-based, at least in its appearance and syntax style.&lt;/div&gt;</summary>
		<author><name>Gjkim</name></author>	</entry>

	<entry>
		<id>https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3320</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3320"/>
				<updated>2011-04-03T01:08:57Z</updated>
		
		<summary type="html">&lt;p&gt;Gjkim: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History ==&lt;br /&gt;
The Web3D Consortium has a special interest in the recent AR initiatives and in creating the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
We have several member projects showcasing the feasibility of AR in X3D, particularly the [http://www.x3dom.org X3DOM] open-source project produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out-of-the-box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5, which holds good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.perey.com/ARStandards/ AR Standards], [http://opengeospacial.org OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations to apply and adapt X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible comments and reactions from the community can be considered. Feedback from this community will help X3D adopt new technologies quickly and stably, providing ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
[http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf Slideset summary]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop has also been linked,&lt;br /&gt;
describing how Mobile, HTML5 and possibly Augmented Reality (AR) components&lt;br /&gt;
might be aligned together as an X3D Mobile Profile.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality Working/Special Interest Group focuses on utilizing/extending X3D functions to support augmented and mixed reality applications.&lt;br /&gt;
&lt;br /&gt;
== Discussions ==&lt;br /&gt;
These X3D AR discussions are part of a special interest group effort that is moving forward by forming the X3D AR Working Group. The charter, roles and responsibilities are being defined.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases of using X3D for AR/MR applications.&lt;br /&gt;
* Develop and extend functions and nodes in the X3D specification required to support AR/MR.&lt;br /&gt;
* Produce sample AR/MR applications with X3D.&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR applications - May 2011 (TBD)&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - May 2011 (TBD)&lt;br /&gt;
* Proposed new/extended functions and nodes for the X3D specification - TBD&lt;br /&gt;
* Sample AR/MR applications with X3D - TBD&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
17:10-18:10 (Pacific time) / 20:10-21:10 (Eastern time) on 1st Wednesday, which is 09:10-10:10 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
17:10-18:10 (Pacific time) / 20:10-21:10 (Eastern time) on 3rd Tuesday, which is 09:10-10:10 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is 15/16 March 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Existing Proposals ==&lt;br /&gt;
&lt;br /&gt;
=== Instant Reality ===&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is at [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality are also available. They describe, for example, specific nodes such as the IOSensor, for retrieving camera streams and the tracking results of the vision subsystem, and the PolygonBackground, for displaying the camera images behind the virtual objects, as well as some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, several papers on AR and MR rendering have already been published at the Web3D conferences, discussing e.g. occlusions, shadows and lighting in MR scenes in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), and especially in section 6.4, of the following PhD thesis:&lt;br /&gt;
[http://tuprints.ulb.tu-darmstadt.de/2489/ Pdf].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Korean Chapter ===&lt;br /&gt;
&lt;br /&gt;
The Korea chapter has been keenly interested in the standardization of augmented reality in many aspects, including AR-based content.  This interest is especially due to the recent world-wide appearance of mobile AR services and the realization (in both academia and industry) of a definite need for exchanging service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim from Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of Gwangju Inst. of Science and Tech.   &lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also try to highlight the distinctions between the proposals, not in a critical sense, but as a way to suggest alternatives.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(1) Gerry Kim's proposal can be highlighted by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied together using &amp;quot;routes&amp;quot; (e.g. a virtual object's parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
- Below is an example construct, a simple extension of the &amp;quot;VisibilitySensor&amp;quot; attached to a marker.  The rough semantics are to show a sphere attached to a marker when the marker is visible.  The visibility would be determined by the browser using a particular tracker.  In this simple case, a simple marker description is given through the &amp;quot;Marker&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Scene&amp;gt;&lt;br /&gt;
  &amp;lt;Group&amp;gt;&lt;br /&gt;
   &amp;lt;Marker DEF='HIRO' enable='TRUE' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
   &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='TRUE'/&amp;gt;&lt;br /&gt;
   &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
    &amp;lt;Shape&amp;gt;&lt;br /&gt;
     &amp;lt;Appearance&amp;gt;&lt;br /&gt;
      &amp;lt;Material/&amp;gt;&lt;br /&gt;
     &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
     &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
    &amp;lt;/Shape&amp;gt;&lt;br /&gt;
   &amp;lt;/Transform&amp;gt;&lt;br /&gt;
  &amp;lt;/Group&amp;gt;&lt;br /&gt;
  &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt;&lt;br /&gt;
 &amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
- Different types of sensors can be newly defined, or old ones extended, to describe various AR content.  These include proximity sensors, range sensors, etc.&lt;br /&gt;
&lt;br /&gt;
- Physical object descriptions will be needed at the right level of abstraction (such as the &amp;quot;Marker&amp;quot; node in the above example).  These include descriptions for image patches, 3D objects, GPS locations, natural features (e.g. points, lines), etc.&lt;br /&gt;
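&lt;br /&gt;
- The pose-routing idea above can be sketched as follows (a hypothetical fragment: the &amp;quot;Marker&amp;quot; node and its &amp;quot;position&amp;quot;/&amp;quot;orientation&amp;quot; output fields are assumptions of this proposal, not standard X3D):&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Marker DEF='HIRO' enable='TRUE' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
 &amp;lt;Transform DEF='BALL'&amp;gt; &amp;lt;!-- virtual object attached to the marker --&amp;gt; &amp;lt;/Transform&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='HIRO' fromField='position' toNode='BALL' toField='translation'/&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='HIRO' fromField='orientation' toNode='BALL' toField='rotation'/&amp;gt;&lt;br /&gt;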
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle a video background for video see-through AR implementations.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing the video capture or vision-based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background would be supplied with the video image and/or camera parameters routed from the &amp;quot;LiveCam&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual Viewpoint to accommodate more detailed camera parameters, which would be set according to the parameters of the &amp;quot;LiveCam&amp;quot;.  &lt;br /&gt;
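&lt;br /&gt;
- A hypothetical fragment illustrating this proposal (the &amp;quot;LiveCam&amp;quot; node and all of its field names are assumptions based on the description above, not standard X3D):&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;LiveCam DEF='CAM'/&amp;gt;&lt;br /&gt;
 &amp;lt;TextureBackground DEF='BG'/&amp;gt;&lt;br /&gt;
 &amp;lt;Viewpoint DEF='VIEW'/&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='CAM' fromField='image' toNode='BG' toField='texture'/&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='CAM' fromField='fieldOfView' toNode='VIEW' toField='fieldOfView'/&amp;gt;&lt;br /&gt;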
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
- Woo proposes to use existing standards (e.g. X3D, Collada) for describing the augmentation information itself.&lt;br /&gt;
&lt;br /&gt;
- As for the context (condition) for augmentation, a &amp;quot;5W&amp;quot; approach is proposed, clearly specifying who, when, where, what and how.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;who&amp;quot; part specifies the owner/author of the content.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;when&amp;quot; part specifies the content creation time.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;where&amp;quot; part specifies the location of the physical object to which an augmentation is attached.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;what&amp;quot; part specifies the content to be augmented (the augmentation information).&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;how&amp;quot; part specifies the dynamic part (behavior) of the content.  &lt;br /&gt;
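&lt;br /&gt;
- Since the 5W structure is not yet X3D-based, a hypothetical XML rendering of it (all element names below are illustrative assumptions, not part of any proposal text) might look like:&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Augmentation&amp;gt;&lt;br /&gt;
  &amp;lt;Who&amp;gt;owner/author of the content&amp;lt;/Who&amp;gt;&lt;br /&gt;
  &amp;lt;When&amp;gt;content creation time&amp;lt;/When&amp;gt;&lt;br /&gt;
  &amp;lt;Where&amp;gt;location of the physical object&amp;lt;/Where&amp;gt;&lt;br /&gt;
  &amp;lt;What&amp;gt;augmentation content (e.g. X3D, Collada)&amp;lt;/What&amp;gt;&lt;br /&gt;
  &amp;lt;How&amp;gt;behavior of the content&amp;lt;/How&amp;gt;&lt;br /&gt;
 &amp;lt;/Augmentation&amp;gt;&lt;br /&gt;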
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(4) After-thoughts about the four approaches:&lt;br /&gt;
&lt;br /&gt;
- While G. Kim's proposal treats physical objects as a kind of special virtual object whose pose is found by a &amp;quot;mystical&amp;quot; tracking module in whatever browser, Fraunhofer's proposal may be too tied to its own tracking module in describing the connection between the physical object and the virtual augmentation.  A right balance should be struck in the description of this aspect.&lt;br /&gt;
&lt;br /&gt;
- G. Lee's extension for LiveCam is similar to that in Fraunhofer's proposal.  Whether LiveCam is special enough to be separated out from &amp;quot;one of&amp;quot; the IO sensors should be discussed further.&lt;br /&gt;
&lt;br /&gt;
- G. Lee's extension of background video is similar to that of Fraunhofer's proposal.&lt;br /&gt;
&lt;br /&gt;
- Woo's 5W structure is useful, but it is currently not X3D-based, at least in its appearance or syntax style.  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
== Participation and Liaisons ==&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
* Of interest is that Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Gjkim</name></author>	</entry>

	<entry>
		<id>https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3319</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3319"/>
				<updated>2011-04-03T01:06:22Z</updated>
		
		<summary type="html">&lt;p&gt;Gjkim: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History ==&lt;br /&gt;
Web3D Consortium has a special interest in the recent AR initiatives and creating the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
We have several member projects showcasing the feasibility of AR in X3D, particularly [http://www.x3dom.org X3DOM] open source produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out-of-the-box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5 and this has good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely within the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospacial.org  OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible range of comments and reactions from the community can be considered. Feedback from this community will help X3D quickly and stably adopt new technologies to provide ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
A [http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf slideset summary]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop has also been&lt;br /&gt;
linked, showing how Mobile, HTML5 and possibly Augmented Reality (AR) components&lt;br /&gt;
might be aligned together in an X3D Mobile Profile.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and also to join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality Working/Special Interest Group focuses on utilizing/extending X3D functions to support augmented and mixed reality applications.&lt;br /&gt;
&lt;br /&gt;
== Discussions ==&lt;br /&gt;
These X3D AR discussions are part of a special interest group. The effort is moving forward by forming the X3D AR Working Group; its charter, roles and responsibilities are being defined.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases of using X3D for AR/MR application.&lt;br /&gt;
* Develop and extend functions and nodes for X3D specification required to support AR/MR&lt;br /&gt;
* Produce sample AR/MR applications with X3D&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR application - May, 2011 TBD&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - May, 2011 TBD&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification - TBD&lt;br /&gt;
* Sample AR/MR applications with X3D - TBD&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 3rd Tuesday, which is 0910-1010 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is 15/16 March 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Existing Proposals ==&lt;br /&gt;
&lt;br /&gt;
=== Instant Reality ===&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is available at [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality are also available. They describe specific nodes such as the IOSensor, which retrieves the camera streams and the tracking results of the vision subsystem, the PolygonBackground, which displays the camera images behind the virtual objects, and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, several papers on AR and MR rendering have already been published at the Web3D conferences, discussing e.g. occlusions, shadows and lighting in MR scenes in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), and especially in section 6.4, of the following PhD thesis:&lt;br /&gt;
[http://tuprints.ulb.tu-darmstadt.de/2489/ Pdf].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Korean Chapter ===&lt;br /&gt;
&lt;br /&gt;
The Korea chapter has been keenly interested in the standardization of augmented reality in many aspects, including AR-based content.  This interest is especially due to the recent world-wide appearance of mobile AR services and the realization (in both academia and industry) of a definite need for exchanging service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim from Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of Gwangju Inst. of Science and Tech.   &lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also try to highlight the distinctions between the proposals, not in a critical sense, but as a way to suggest alternatives.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(1) Gerry Kim's proposal can be highlighted by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied together using &amp;quot;routes&amp;quot; (e.g. a virtual object's parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
- Below is an example construct, a simple extension of the &amp;quot;VisibilitySensor&amp;quot; attached to a marker.  The rough semantics are to show a sphere attached to a marker when the marker is visible.  The visibility would be determined by the browser using a particular tracker.  In this simple case, a simple marker description is given through the &amp;quot;Marker&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;Scene&amp;gt;&lt;br /&gt;
 &amp;lt;Group&amp;gt;&lt;br /&gt;
  &amp;lt;Marker DEF='HIRO' enable='TRUE' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
  &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='TRUE'/&amp;gt;&lt;br /&gt;
  &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
   &amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
     &amp;lt;Material/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
   &amp;lt;/Shape&amp;gt;&lt;br /&gt;
  &amp;lt;/Transform&amp;gt;&lt;br /&gt;
 &amp;lt;/Group&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt;&lt;br /&gt;
&amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
- Different types of sensors can be newly defined, or old ones extended, to describe various AR content.  These include proximity sensors, range sensors, etc.&lt;br /&gt;
&lt;br /&gt;
- Physical object descriptions will be needed at the right level of abstraction (such as the &amp;quot;Marker&amp;quot; node in the above example).  These include descriptions for image patches, 3D objects, GPS locations, natural features (e.g. points, lines), etc.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle a video background for video see-through AR implementations.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing the video capture or vision-based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background would be supplied with the video image and/or camera parameters routed from the &amp;quot;LiveCam&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual Viewpoint to accommodate more detailed camera parameters, which would be set according to the parameters of the &amp;quot;LiveCam&amp;quot;.  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
- Woo proposes to use existing standards (e.g. X3D, Collada) for describing the augmentation information itself.&lt;br /&gt;
&lt;br /&gt;
- As for the context (condition) for augmentation, a &amp;quot;5W&amp;quot; approach is proposed, clearly specifying who, when, where, what and how.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;who&amp;quot; part specifies the owner/author of the content.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;when&amp;quot; part specifies the content creation time.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;where&amp;quot; part specifies the location of the physical object to which an augmentation is attached.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;what&amp;quot; part specifies the content to be augmented (the augmentation information).&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;how&amp;quot; part specifies the dynamic part (behavior) of the content.  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(4) After-thoughts about the four approaches:&lt;br /&gt;
&lt;br /&gt;
- While G. Kim's proposal treats physical objects as a kind of special virtual object whose pose is found by a &amp;quot;mystical&amp;quot; tracking module in whatever browser, Fraunhofer's proposal may be too tied to its own tracking module in describing the connection between the physical object and the virtual augmentation.  A right balance should be struck in the description of this aspect.&lt;br /&gt;
&lt;br /&gt;
- G. Lee's extension for LiveCam is similar to that in Fraunhofer's proposal.  Whether LiveCam is special enough to be separated out from &amp;quot;one of&amp;quot; the IO sensors should be discussed further.&lt;br /&gt;
&lt;br /&gt;
- G. Lee's extension of background video is similar to that of Fraunhofer's proposal.&lt;br /&gt;
&lt;br /&gt;
- Woo's 5W structure is useful, but it is currently not X3D-based, at least in its appearance or syntax style.  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
== Participation and Liaisons ==&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
* Of interest is that Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Gjkim</name></author>	</entry>

	<entry>
		<id>https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3318</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3318"/>
				<updated>2011-04-03T00:42:44Z</updated>
		
		<summary type="html">&lt;p&gt;Gjkim: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History ==&lt;br /&gt;
Web3D Consortium has a special interest in the recent AR initiatives and creating the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
We have several member projects showcasing the feasibility of AR in X3D, particularly [http://www.x3dom.org X3DOM] open source produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out-of-the-box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5 and this has good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely within the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospacial.org  OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible range of comments and reactions from the community can be considered. Feedback from this community will help X3D quickly and stably adopt new technologies to provide ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
A [http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf slideset summary]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop has also been&lt;br /&gt;
linked, showing how Mobile, HTML5 and possibly Augmented Reality (AR) components&lt;br /&gt;
might be aligned together in an X3D Mobile Profile.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and also to join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality Working/Special Interest Group focuses on utilizing/extending X3D functions to support augmented and mixed reality applications.&lt;br /&gt;
&lt;br /&gt;
== Discussions ==&lt;br /&gt;
These X3D AR discussions are part of a special interest group. The effort is moving forward by forming the X3D AR Working Group; its charter, roles and responsibilities are being defined.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases of using X3D for AR/MR application.&lt;br /&gt;
* Develop and extend functions and nodes for X3D specification required to support AR/MR&lt;br /&gt;
* Produce sample AR/MR applications with X3D&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR application - May, 2011 TBD&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - May, 2011 TBD&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification - TBD&lt;br /&gt;
* Sample AR/MR applications with X3D - TBD&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 3rd Tuesday, which is 0910-1010 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is 15/16 March 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Existing Proposals ==&lt;br /&gt;
&lt;br /&gt;
=== Instant Reality ===&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is available at [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality are also available. They describe specific nodes such as the IOSensor, which retrieves the camera streams and the tracking results of the vision subsystem, the PolygonBackground, which displays the camera images behind the virtual objects, and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, several papers on AR and MR rendering have already been published at the Web3D conferences, discussing e.g. occlusions, shadows and lighting in MR scenes in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), and especially in section 6.4, of the following PhD thesis:&lt;br /&gt;
[http://tuprints.ulb.tu-darmstadt.de/2489/ Pdf].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Korean Chapter ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The Korea chapter has been keenly interested in the standardization of augmented reality in many aspects, including AR-based content.  This interest is especially due to the recent world-wide appearance of mobile AR services and the realization (in both academia and industry) of a definite need for exchanging service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim from Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of Gwangju Inst. of Science and Tech.   &lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also try to highlight the distinctions between the proposals, not in a critical sense, but as a way to suggest alternatives.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(1) Gerry Kim's proposal can be highlighted by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied together using &amp;quot;routes&amp;quot; (e.g. a virtual object's parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;Scene&amp;gt;&lt;br /&gt;
    &amp;lt;Group&amp;gt;&lt;br /&gt;
      &amp;lt;Marker DEF='HIRO' enable='TRUE' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
      &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='TRUE'/&amp;gt;&lt;br /&gt;
      &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
        &amp;lt;Shape&amp;gt;&lt;br /&gt;
          &amp;lt;Appearance&amp;gt;&lt;br /&gt;
            &amp;lt;Material/&amp;gt;&lt;br /&gt;
          &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
          &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
        &amp;lt;/Shape&amp;gt;&lt;br /&gt;
      &amp;lt;/Transform&amp;gt;&lt;br /&gt;
    &amp;lt;/Group&amp;gt;&lt;br /&gt;
    &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt;&lt;br /&gt;
&amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal can be highlighted by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle video backgrounds for video see-through AR implementations.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing video capture or vision-based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background node would be supplied, via routes from the &amp;quot;LiveCam&amp;quot; node, with the video image and/or camera parameters.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual viewpoint to accommodate more detailed camera parameters and to be set according to the parameters of the &amp;quot;LiveCam&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
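As an illustrative sketch only (the &amp;quot;LiveCam&amp;quot; node and its field names here are hypothetical, not part of any published X3D specification), such a scene might look like:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;Scene&amp;gt;&lt;br /&gt;
    &amp;lt;LiveCam DEF='CAM' enabled='TRUE'/&amp;gt;&lt;br /&gt;
    &amp;lt;TextureBackground&amp;gt;&lt;br /&gt;
      &amp;lt;MovieTexture DEF='CAM_IMAGE' containerField='backTexture'/&amp;gt;&lt;br /&gt;
    &amp;lt;/TextureBackground&amp;gt;&lt;br /&gt;
    &amp;lt;Viewpoint DEF='AR_VIEW'/&amp;gt;&lt;br /&gt;
&amp;lt;ROUTE fromNode='CAM' fromField='image' toNode='CAM_IMAGE' toField='image'/&amp;gt;&lt;br /&gt;
&amp;lt;ROUTE fromNode='CAM' fromField='fieldOfView' toNode='AR_VIEW' toField='fieldOfView'/&amp;gt;&lt;br /&gt;
&amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;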
&lt;br /&gt;
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(4) Comparison to Fraunhofer's approach: &lt;br /&gt;
 &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
== Participation and Liaisons ==&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
* Of interest is that Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Gjkim</name></author>	</entry>

	<entry>
		<id>https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3317</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3317"/>
				<updated>2011-04-03T00:40:10Z</updated>
		
		<summary type="html">&lt;p&gt;Gjkim: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History ==&lt;br /&gt;
The Web3D Consortium has a special interest in recent AR initiatives and in creating the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
We have several member projects showcasing the feasibility of AR in X3D, particularly the open-source [http://www.x3dom.org X3DOM] framework produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out of the box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5 and this has good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospacial.org  OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible comment and reaction can be considered from the Community. Feedback from this community will help X3D quickly and stably adopt new technologies to provide ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
A [http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf Slideset summary]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop has also been&lt;br /&gt;
linked, describing how Mobile, HTML5, and possibly Augmented Reality (AR)&lt;br /&gt;
components might be aligned together as an X3D Mobile Profile.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality Working/Special Interest Group focuses on utilizing/extending X3D functions to support augmented and mixed reality applications.&lt;br /&gt;
&lt;br /&gt;
== Discussions ==&lt;br /&gt;
These X3D AR discussions are part of a special interest group. This effort is moving forward by forming the X3D AR Working Group. The charter, roles, and responsibilities are being defined.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases of using X3D for AR/MR applications.&lt;br /&gt;
* Develop and extend functions and nodes for X3D specification required to support AR/MR&lt;br /&gt;
* Produce sample AR/MR applications with X3D&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR application - May, 2011 TBD&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - May, 2011 TBD&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification - TBD&lt;br /&gt;
* Sample AR/MR applications with X3D - TBD&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 3rd Tuesday, which is 0910-1010 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is 15/16 March 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Existing Proposals ==&lt;br /&gt;
&lt;br /&gt;
=== Instant Reality ===&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is on [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality also exist. They describe specific nodes such as the IOSensor, for retrieving camera streams and the tracking results of the vision subsystem, and the PolygonBackground, for displaying camera images behind the virtual objects, as well as some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, some papers on AR and MR rendering have already been published at the Web3D conferences. They discuss, for example, occlusions, shadows, and lighting in MR scenes in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Some further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), especially section 6.4, of the following PhD thesis:&lt;br /&gt;
[http://tuprints.ulb.tu-darmstadt.de/2489/ Pdf].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Korean Chapter ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The Korea chapter has been keenly interested in many aspects of augmented reality standardization, including AR-based content.  This interest stems largely from the recent worldwide emergence of mobile AR services and the realization, in both academia and industry, of a definite need to exchange service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim of Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of the Gwangju Institute of Science and Technology.&lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also highlight each proposal's distinctions from the others, not as criticism but as a way of suggesting alternatives.&lt;br /&gt;
&lt;br /&gt;
(1) Gerry Kim's proposal can be highlighted by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical and virtual objects are tied together using &amp;quot;routes&amp;quot; (e.g. a virtual object's parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    &amp;lt;Group&amp;gt;&lt;br /&gt;
      &amp;lt;Marker DEF='HIRO' enabled='TRUE' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
      &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='TRUE'/&amp;gt;&lt;br /&gt;
      &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
        &amp;lt;Shape&amp;gt;&lt;br /&gt;
          &amp;lt;Appearance&amp;gt;&lt;br /&gt;
            &amp;lt;Material/&amp;gt;&lt;br /&gt;
          &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
          &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
        &amp;lt;/Shape&amp;gt;&lt;br /&gt;
      &amp;lt;/Transform&amp;gt;&lt;br /&gt;
    &amp;lt;/Group&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal can be highlighted by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle video backgrounds for video see-through AR implementations.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing video capture or vision-based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background node would be supplied, via routes from the &amp;quot;LiveCam&amp;quot; node, with the video image and/or camera parameters.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual viewpoint to accommodate more detailed camera parameters and to be set according to the parameters of the &amp;quot;LiveCam&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
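As an illustrative sketch only (the &amp;quot;LiveCam&amp;quot; node and its field names here are hypothetical, not part of any published X3D specification), such a scene might look like:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;Scene&amp;gt;&lt;br /&gt;
    &amp;lt;LiveCam DEF='CAM' enabled='TRUE'/&amp;gt;&lt;br /&gt;
    &amp;lt;TextureBackground&amp;gt;&lt;br /&gt;
      &amp;lt;MovieTexture DEF='CAM_IMAGE' containerField='backTexture'/&amp;gt;&lt;br /&gt;
    &amp;lt;/TextureBackground&amp;gt;&lt;br /&gt;
    &amp;lt;Viewpoint DEF='AR_VIEW'/&amp;gt;&lt;br /&gt;
&amp;lt;ROUTE fromNode='CAM' fromField='image' toNode='CAM_IMAGE' toField='image'/&amp;gt;&lt;br /&gt;
&amp;lt;ROUTE fromNode='CAM' fromField='fieldOfView' toNode='AR_VIEW' toField='fieldOfView'/&amp;gt;&lt;br /&gt;
&amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;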
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
(4) Comparison to Fraunhofer's approach: &lt;br /&gt;
 &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
== Participation and Liaisons ==&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
* Of interest is that Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Gjkim</name></author>	</entry>

	<entry>
		<id>https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3316</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3316"/>
				<updated>2011-04-03T00:24:59Z</updated>
		
		<summary type="html">&lt;p&gt;Gjkim: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History ==&lt;br /&gt;
The Web3D Consortium has a special interest in recent AR initiatives and in creating the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
We have several member projects showcasing the feasibility of AR in X3D, particularly the open-source [http://www.x3dom.org X3DOM] framework produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out of the box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5 and this has good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospacial.org  OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible comment and reaction can be considered from the Community. Feedback from this community will help X3D quickly and stably adopt new technologies to provide ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
A [http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf Slideset summary]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop has also been&lt;br /&gt;
linked, describing how Mobile, HTML5, and possibly Augmented Reality (AR)&lt;br /&gt;
components might be aligned together as an X3D Mobile Profile.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality Working/Special Interest Group focuses on utilizing/extending X3D functions to support augmented and mixed reality applications.&lt;br /&gt;
&lt;br /&gt;
== Discussions ==&lt;br /&gt;
These X3D AR discussions are part of a special interest group. This effort is moving forward by forming the X3D AR Working Group. The charter, roles, and responsibilities are being defined.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases of using X3D for AR/MR applications.&lt;br /&gt;
* Develop and extend functions and nodes for X3D specification required to support AR/MR&lt;br /&gt;
* Produce sample AR/MR applications with X3D&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR application - May, 2011 TBD&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - May, 2011 TBD&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification - TBD&lt;br /&gt;
* Sample AR/MR applications with X3D - TBD&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 3rd Tuesday, which is 0910-1010 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is 15/16 March 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Existing Proposals ==&lt;br /&gt;
&lt;br /&gt;
=== Instant Reality ===&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is on [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality also exist. They describe specific nodes such as the IOSensor, for retrieving camera streams and the tracking results of the vision subsystem, and the PolygonBackground, for displaying camera images behind the virtual objects, as well as some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, some papers on AR and MR rendering have already been published at the Web3D conferences. They discuss, for example, occlusions, shadows, and lighting in MR scenes in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Some further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), especially section 6.4, of the following PhD thesis:&lt;br /&gt;
[http://tuprints.ulb.tu-darmstadt.de/2489/ Pdf].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Korean Chapter ===&lt;br /&gt;
&lt;br /&gt;
The Korea chapter has been keenly interested in many aspects of augmented reality standardization, including AR-based content.  This interest stems largely from the recent worldwide emergence of mobile AR services and the realization, in both academia and industry, of a definite need to exchange service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim of Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of the Gwangju Institute of Science and Technology.&lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also highlight each proposal's distinctions from the others, not as criticism but as a way of suggesting alternatives.&lt;br /&gt;
&lt;br /&gt;
(1) &lt;br /&gt;
(2)&lt;br /&gt;
(3)&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
Will fill in something before the meeting on April 7th, 2011 (G. J. Kim).&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
== Participation and Liaisons ==&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
* Of interest is that Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Gjkim</name></author>	</entry>

	<entry>
		<id>https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3262</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://www.old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3262"/>
				<updated>2011-03-22T03:35:10Z</updated>
		
		<summary type="html">&lt;p&gt;Gjkim: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History ==&lt;br /&gt;
Web3D Consortium has a special interest in the recent AR initiatives and creating the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
We have several member projects showcasing the feasibility of AR in X3D, particularly the open-source [http://www.x3dom.org X3DOM] framework produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out of the box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5 and this has good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospacial.org  OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible comment and reaction can be considered from the Community. Feedback from this community will help X3D quickly and stably adopt new technologies to provide ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
A [http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf Slideset summary]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop has also been&lt;br /&gt;
linked, describing how Mobile, HTML5, and possibly Augmented Reality (AR)&lt;br /&gt;
components might be aligned together as an X3D Mobile Profile.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality Working/Special Interest Group focuses on utilizing/extending X3D functions to support augmented and mixed reality applications.&lt;br /&gt;
&lt;br /&gt;
== Discussions ==&lt;br /&gt;
These X3D AR discussions are part of a special interest group. This effort is moving forward by forming the X3D AR Working Group. The charter, roles, and responsibilities are being defined.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases of using X3D for AR/MR applications.&lt;br /&gt;
* Develop and extend functions and nodes for X3D specification required to support AR/MR&lt;br /&gt;
* Produce sample AR/MR applications with X3D&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR application - May, 2011 TBD&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - May, 2011 TBD&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification - TBD&lt;br /&gt;
* Sample AR/MR applications with X3D - TBD&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 3rd Tuesday, which is 0910-1010 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is 15/16 March 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Existing Proposals ==&lt;br /&gt;
&lt;br /&gt;
=== Instant Reality ===&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is on [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality also exist. They describe specific nodes such as the IOSensor, for retrieving camera streams and the tracking results of the vision subsystem, and the PolygonBackground, for displaying camera images behind the virtual objects, as well as some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, some papers on AR and MR rendering have already been published at the Web3D conferences. They discuss, for example, occlusions, shadows, and lighting in MR scenes in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Some further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), especially section 6.4, of the following PhD thesis:&lt;br /&gt;
[http://tuprints.ulb.tu-darmstadt.de/2489/ Pdf].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Korean Chapter ===&lt;br /&gt;
&lt;br /&gt;
Will fill in something before the meeting on April 7th, 2011 (G. J. Kim).&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
== Participation and Liaisons ==&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
* Of interest is that Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Gjkim</name></author>	</entry>

	</feed>